954 results for Aspect Oriented Development


Relevance:

30.00%

Publisher:

Abstract:

Studies have shown that the brand “owner” is very influential in positioning the brand, and that when the brand “owner” ceases his or her active role the brand will be perceived differently by consumers. Heider’s Balance Theory (HBT), a cognitive psychological theory, studies the triadic relationships between two persons and an entity and predicts that when a person’s original perception of the relationship is disturbed, the person restructures it into a new balanced perception. This research was therefore undertaken to: conceptualize the brand owner’s impact on consumers’ brand perception; test the applicability of both the static and dynamic predictions of Heider’s Balance Theory to the brand owner–consumer–brand (OCB) relation; construct and test a model of the OCB relation; and examine whether personality has an influence on the OCB relation. A discovery-oriented approach was taken to understand the selected market segment, the ready-to-wear and diffusion lines of international designer labels. A Chinese Brand Personality Scale and fashion-proneness and hedonic and utilitarian shopping scales were developed and validated, and 51 customers were surveyed. Both traditional and extended methods used with the Balance Theory were employed in this study. Responses to a liked brand were used to develop and test the model, while responses to a disliked brand were used for confirmation. A “what if” experimental approach was employed to test the applicability of the dynamic predictions of HBT in the OCB Model. The hypothesized OCB Model was tested and validated. Consumers were found to hold separate views of the brand and the brand owner, and their responses to contrasting ethical and non-ethical news about the brand owner differed. Personality was found to have an influence, and two personality-adapted models were tested and validated. The actual results go beyond the predictions of the Balance Theory: a dominant triple-positive balance mode, a dominant negative balance mode and a mode of extreme antipathy were found, and not all balanced modes are good for the brand. Contrary to Heider’s findings, simple liking may not necessarily lead to a unit relation in the OCB Model.

Relevance:

30.00%

Publisher:

Abstract:

Topical and transdermal formulations are promising platforms for drug delivery. A unit dose topical or transdermal drug delivery system that optimises the solubility of drugs within the vehicle, provides a novel dosage form for efficacious delivery and offers a simple manufacturing technique is therefore desirable. This study used Witepsol® H15 wax as a base for the delivery system. One aspect of the project involved determining the solubility of ibuprofen, flurbiprofen and naproxen in the wax using microscopy, Higuchi release kinetics, HyperDSC and mathematical modelling techniques. Correlations between the results obtained via these techniques were noted, with additional merits such as the provision of valuable information on drug-release kinetics and on possible interactions between the drug and excipients. A second aspect of the project involved the incorporation of additional excipients, Tween 20 (T), Carbopol® 971 (C) and menthol (M), into the wax formulation. On in vitro permeation through porcine skin, the preferred formulations were: ibuprofen (5% w/w) within Witepsol® H15 + 1% w/w T; flurbiprofen (10% w/w) within Witepsol® H15 + 1% w/w T; naproxen (5% w/w) within Witepsol® H15 + 1% w/w T + 1% w/w C; and sodium diclofenac (10% w/w) within Witepsol® H15 + 1% w/w T + 1% w/w C + 5% w/w M. Unit dose transdermal tablets containing ibuprofen and diclofenac were produced with improved flux compared to marketed products: Voltarol Emugel® demonstrated a flux of 1.68 × 10⁻³ cm/h compared to 123 × 10⁻³ cm/h for the optimised product detailed above, and Ibugel Forte® demonstrated a permeation coefficient of 7.65 × 10⁻³ cm/h compared to 8.69 × 10⁻³ cm/h for the optimised product described above.
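
The Higuchi analysis mentioned above treats the cumulative amount of drug released as proportional to the square root of time. As a rough illustrative sketch only (the release data, values and variable names below are hypothetical and not taken from the study), such a fit can be computed as follows:

```python
# Illustrative sketch: fitting the Higuchi model Q = k_H * sqrt(t)
# to hypothetical cumulative drug-release data.
import numpy as np

# Hypothetical release data: time (h) and cumulative amount released per unit area
t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0])
q = np.array([0.8, 1.1, 1.6, 2.3, 2.8, 3.2])

# Least-squares estimate of the Higuchi constant k_H (slope of Q versus sqrt(t),
# constrained through the origin, as in the classical Higuchi form)
sqrt_t = np.sqrt(t)
k_h = np.sum(sqrt_t * q) / np.sum(sqrt_t ** 2)

# Coefficient of determination as a crude goodness-of-fit check
pred = k_h * sqrt_t
r2 = 1 - np.sum((q - pred) ** 2) / np.sum((q - q.mean()) ** 2)
print(f"k_H = {k_h:.3f}, R^2 = {r2:.3f}")
```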

Relevance:

30.00%

Publisher:

Abstract:

Distortion or deprivation of vision during an early 'critical' period of visual development can result in permanent visual impairment, which indicates the need to identify and treat visually at-risk individuals early. A significant difficulty in this respect is that conventional, subjective methods of visual acuity determination are ineffective before approximately three years of age. In laboratory studies, infant visual function has been quantified precisely using objective methods based on visual evoked potentials (VEP), preferential looking (PL) and optokinetic nystagmus (OKN), but clinical assessment of infant vision has presented a particular difficulty. An initial aim of this study was to evaluate the relative clinical merits of the three techniques. Clinical derivatives were devised; the OKN method proved unsuitable, but the PL and VEP methods were evaluated in a pilot study. Most infants participating in the study had known ocular and/or neurological abnormalities, but a few normals were included for comparison. The study suggested that the PL method was the more clinically appropriate for the objective assessment of infant acuity. A study of normal visual development from birth to one year was subsequently conducted. Observations included cycloplegic refraction, ophthalmoscopy and preferential looking visual acuity assessment using horizontally and vertically oriented square-wave gratings. The aims of the work were to investigate the efficiency and sensitivity of the technique and to study possible correlates of visual development. The success rate of the PL method varied with age: 87% of newborns and 98% of infants attending follow-up successfully completed at least one acuity test. Below two months, monocular acuities were difficult to secure; infants were most testable around six months. The results produced were similar to published data using the acuity card procedure and slightly lower than, but comparable with, acuity data derived using extended PL methods. Acuity development was not impaired in infants found to have retinal haemorrhages as newborns. A significant relationship was found between newborn binocular acuity and anisometropia, but not with other refractive findings. No strong or consistent correlations between grating acuity and refraction were found at three, six or twelve months. Improvements in acuity and decreases in levels of hyperopia over the first week of life were suggestive of recovery from minor birth trauma. The refractive data were analysed separately to investigate the natural history of refraction in normal infants. Most newborns (80%) were hyperopic; significant astigmatism was found in 86% and significant anisometropia in 22%. No significant alteration in spherical equivalent refraction was noted between birth and three months; a significant reduction in hyperopia was evident by six months and this trend continued until one year. Observations on the astigmatic component of the refractive error revealed a rather erratic series of changes which would be worthy of further investigation, since a repeat refraction study suggested difficulties in obtaining stable measurements in newborns. Astigmatism tended to decrease between birth and three months, increased significantly from three to six months and decreased significantly from six to twelve months. A constant decrease in the degree of anisometropia was evident throughout the first year. These findings have implications for the correction of infantile refractive error.

Relevance:

30.00%

Publisher:

Abstract:

Following Andersen's (1986, 1991) study of untutored anglophone learners of Spanish, aspectual features have been at the centre of hypotheses on the development of past verbal morphology in language acquisition. The Primacy of Aspect Hypothesis (PAH) claims that the association of any verb category (Aktionsart) with any aspect (perfective or imperfective) constitutes the endpoint of acquisition. However, its predictions rely on the observation of a limited number of untutored learners at the early stages of their acquisition, and have yet to be confirmed in other settings. The aim of the present thesis is to evaluate the explanatory power of the PAH in respect of the acquisition of French past tenses, an area of the language which constitutes a serious stumbling block for foreign learners, even those at the highest levels of proficiency (Coppieters 1987). The present research applies the PAH to the production of 61 anglophone 'advanced learners' (as defined in Bartning 1997) in a tutored environment. In so doing, it tests concurrent explanations, including the influence of the input, the influence of chunking, and the hypothesis of cyclic development. Finally, it discusses the cotextual and contextual factors that still provoke what Andersen (1991) terms "non-native glitches" at the final stage, as predicted by the PAH. The first part of the thesis provides the theoretical background to the corpus analysis. It opens with a diachronic presentation of the French past tense system, focusing on present areas of competition and developments that emphasize the complexity of the system to be acquired. The concepts of time, grammatical aspect and lexical aspect (Aktionsart) are introduced and discussed in the second chapter, and a distinctive formal representation of the French past tenses is offered in the third chapter. The second part of the thesis is devoted to the corpus analysis. The data-gathering procedures and the choice of tasks (oral and written film narratives based on Modern Times, cloze tests and acceptability judgement tests) are described and justified in the research methodology chapter. The research design was shaped by previous studies and consequently allows comparison with them. The second chapter is devoted to the analysis of the narratives and the third to the grammatical tasks. This section closes with a summary of findings and a comparison with previous results. The conclusion addresses the initial research questions in the light of both theory and practice. It shows that the PAH fails to account for the complex phenomenon of past tense development in the acquisitional settings under study, as it adopts a local (the verb phrase) and linear (steady progression towards native usage) approach. It is thus suggested that past tense acquisition instead follows a pendular development, as learners reformulate their learning hypotheses and become increasingly able to shift from local to global cues and so to integrate the influence of cotext and context in their tense choice.

Relevance:

30.00%

Publisher:

Abstract:

This study concerns the application of a model of effective interpersonal relationships to problems arising from staff assessment at I.C.I. Ltd Corporate Laboratory between 1972 and 1974. In collaboration with academic and industrial supervisors, the study commenced with a survey of management and supervisor opinions about the effectiveness of current staff (work) relationships, with particular reference to the problem of recognising and developing creative potential. This survey emphasised a need to improve the relationships between staff in the staff assessment context. A survey of research into creativity emphasised the importance of the interpersonal environment for obtaining creative behaviour in an organisational context. A further survey of theories of how interpersonal behaviour relates to personal creativity (therapeutic psychology) provided a model of effective interpersonal behaviour (Carkhuff, 1969) that could be applied to the organisational context of staff assessment. The objective of the project was redefined as a need to improve the conditions of interpersonal behaviour in relation to certain (career development) problems arising from staff assessment practices. In order to demonstrate the application of the model of effective interpersonal behaviour, the research student recorded interviews between himself and members of staff designed to develop and operate the dimensions of the model. Different samples of staff were used to develop the 'facilitative' and the 'action-oriented' dimensions of behaviour, and then for the operation of a helping programme (based on vocational guidance tests). These interactions have been analysed according to the scales of measurement in the model, and the results are presented in case study form in this thesis. At each stage of the project, results and conclusions were presented to the sponsoring organisation (e.g. the industrial supervisor) in order to assess their (subjective) opinion of relevance to the organisation. Finally, recommendations on further actions towards general improvement of the work relationships in the laboratory were presented in a brief report to the sponsor.

Relevance:

30.00%

Publisher:

Abstract:

In analysing manufacturing systems, for either design or operational reasons, failure to account for the potentially significant dynamics could produce invalid results. There are many analysis techniques that can be used; however, simulation is unique in its ability to assess detailed, dynamic behaviour. The use of simulation to analyse manufacturing systems would therefore seem appropriate, if not essential. Many simulation software products are available, but their ease of use and scope of application vary greatly. This is illustrated at one extreme by simulators, which offer rapid but limited application, and at the other by simulation languages, which are extremely flexible but tedious to code. Given that a typical manufacturing engineer does not possess in-depth programming and simulation skills, the use of simulators over simulation languages would seem the more appropriate choice. Whilst simulators offer ease of use, their limited functionality may preclude their use in many applications. The construction of current simulators makes it difficult to amend or extend the functionality of the system to meet new challenges. Some simulators could even become obsolete as users demand modelling functionality that reflects the latest manufacturing system design and operation concepts. This thesis examines the deficiencies in current simulation tools and considers whether they can be overcome by the application of object-oriented principles. Object-oriented techniques have gained in popularity in recent years and are seen as having the potential to overcome many of the problems traditionally associated with software construction. There are a number of key concepts that are exploited in the work described in this thesis: the use of object-oriented techniques to act as a framework for abstracting engineering concepts into a simulation tool, and the ability to reuse and extend object-oriented software. It is argued that current object-oriented simulation tools are deficient and that, in designing such tools, object-oriented techniques should be used not just for the creation of individual simulation objects but for the creation of the complete software. This results in the ability to construct an easy-to-use simulator that is not limited by its initial functionality. The thesis presents the design of an object-oriented, data-driven simulator which can be freely extended. Discussion and work are focused on discrete parts manufacture. The system developed retains the ease of use typical of data-driven simulators whilst removing any limitation on its potential range of applications. Reference is made to additions made to the simulator by other developers not involved in the original software development. Particular emphasis is put on the requirements of the manufacturing engineer and the need for the engineer to carry out dynamic evaluations.
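
As a loose illustration of the design argument, not of the thesis software itself, the sketch below shows a toy object-oriented, data-driven, discrete-event core whose behaviour is extended by subclassing a resource class rather than by editing the simulator; all class names, parameters and figures are invented for the example:

```python
# Illustrative sketch of an object-oriented, data-driven discrete-event core.
import heapq

class Simulator:
    """Toy event calendar: (time, sequence, callback) tuples kept in a heap."""
    def __init__(self):
        self.clock = 0.0
        self._events = []
        self._seq = 0

    def schedule(self, delay, callback):
        heapq.heappush(self._events, (self.clock + delay, self._seq, callback))
        self._seq += 1

    def run(self, until):
        while self._events and self._events[0][0] <= until:
            self.clock, _, callback = heapq.heappop(self._events)
            callback()

class Machine:
    """Base resource: driven by data (its name and cycle time)."""
    def __init__(self, sim, name, cycle_time):
        self.sim, self.name, self.cycle_time = sim, name, cycle_time
        self.completed = 0

    def start(self, part):
        self.sim.schedule(self.cycle_time, lambda: self.finish(part))

    def finish(self, part):
        self.completed += 1
        print(f"{self.sim.clock:5.1f}  {self.name} finished part {part}")

class InspectingMachine(Machine):
    """Extension added by subclassing, not by editing the simulator core:
    the same machine, but every n-th part is inspected out as a reject."""
    def __init__(self, sim, name, cycle_time, reject_every):
        super().__init__(sim, name, cycle_time)
        self.reject_every = reject_every
        self.rejected = 0

    def finish(self, part):
        if (part + 1) % self.reject_every == 0:
            self.rejected += 1
            print(f"{self.sim.clock:5.1f}  {self.name} rejected part {part}")
        else:
            super().finish(part)

sim = Simulator()
mill = InspectingMachine(sim, "Mill", cycle_time=4.0, reject_every=3)
for i in range(5):
    sim.schedule(i * 4.0, lambda i=i: mill.start(i))  # release a part every 4 time units
sim.run(until=30.0)
print(f"completed={mill.completed}, rejected={mill.rejected}")
```

Here the event calendar and the base Machine class stay untouched while new behaviour arrives through a subclass, which is the reuse-and-extend property the thesis attributes to object-oriented construction.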

Relevance:

30.00%

Publisher:

Abstract:

Manufacturing planning and control systems are fundamental to the successful operations of a manufacturing organisation. In order to improve their business performance, companies make significant investments in planning and control systems; however, not all companies realise the benefits sought. Many companies continue to suffer from high levels of inventory, shortages, obsolete parts, poor resource utilisation and poor delivery performance. This thesis argues that the fit between the planning and control system and the manufacturing organisation is a crucial element of success. The design of appropriate control systems is, therefore, important. The different approaches to the design of manufacturing planning and control systems are investigated. It is concluded that there is no provision within these design methodologies to properly assess the impact of a proposed design on the manufacturing facility. Consequently, an understanding of how a new (or modified) planning and control system will perform in the context of the complete manufacturing system is unlikely to be gained until after the system has been implemented and is running. There are many modelling techniques available; however, discrete-event simulation is unique in its ability to model the complex dynamics inherent in manufacturing systems, of which the planning and control system is an integral component. The existing application of simulation to manufacturing control system issues is limited: although operational issues are addressed, application to the more fundamental design of control systems is rarely, if at all, considered. The lack of a suitable simulation-based modelling tool does not help matters. The requirements of a simulation tool capable of modelling a host of different planning and control systems are presented. It is argued that only through the application of object-oriented principles can these extensive requirements be achieved. This thesis reports on the development of an extensible class library called WBS/Control, which is based on object-oriented principles and discrete-event simulation. The functionality, both current and future, offered by WBS/Control means that different planning and control systems can be modelled: not only the more standard implementations but also hybrid systems and new designs. The flexibility implicit in the development of WBS/Control supports its application to both design and operational issues. WBS/Control integrates wholly with an existing manufacturing simulator to provide a more complete modelling environment.
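
A minimal sketch of that idea, under the assumption that planning and control policies are represented as interchangeable objects (this is an illustration, not a description of WBS/Control's actual class library; all names and numbers are invented), might look like:

```python
# Illustrative sketch: alternative planning and control policies as
# interchangeable classes, so push, pull and hybrid rules can be swapped
# into the same simulated facility without changing the rest of the model.
from abc import ABC, abstractmethod

class ControlPolicy(ABC):
    """Decides how many orders to release this period."""
    @abstractmethod
    def releases(self, demand: int, inventory: int, wip: int) -> int: ...

class PushPolicy(ControlPolicy):
    """MRP-like: release to cover forecast demand regardless of shop-floor state."""
    def releases(self, demand, inventory, wip):
        return max(demand - inventory, 0)

class PullPolicy(ControlPolicy):
    """Kanban-like: only release enough to top up a fixed work-in-progress cap."""
    def __init__(self, wip_cap: int):
        self.wip_cap = wip_cap
    def releases(self, demand, inventory, wip):
        return max(min(demand, self.wip_cap - wip), 0)

class HybridPolicy(ControlPolicy):
    """New behaviour added by composing existing policies rather than editing them."""
    def __init__(self, push: ControlPolicy, pull: ControlPolicy):
        self.push, self.pull = push, pull
    def releases(self, demand, inventory, wip):
        return min(self.push.releases(demand, inventory, wip),
                   self.pull.releases(demand, inventory, wip))

for policy in (PushPolicy(),
               PullPolicy(wip_cap=8),
               HybridPolicy(PushPolicy(), PullPolicy(wip_cap=8))):
    print(type(policy).__name__, policy.releases(demand=10, inventory=3, wip=5))
```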

Relevance:

30.00%

Publisher:

Abstract:

Time after time… and aspect and mood. Over the last twenty-five years, the study of the acquisition of time, aspect and, to a lesser extent, mood has enjoyed increasing popularity and a constant widening of its scope. In such a teeming field, what can be the contribution of this book? We believe that it is unique in several respects. First, this volume encompasses studies from different theoretical frameworks: functionalism vs generativism, or function-based vs form-based approaches. It also brings together various sub-fields (first and second language acquisition, child and adult acquisition, bilingualism) that tend to evolve in parallel rather than learn from each other. A further originality is that it focuses on a wide range of typologically different languages, and features less studied languages such as Korean and Bulgarian. Finally, the book gathers some well-established scholars, young researchers, and even research students, in a rich inter-generational exchange that ensures not only the survival but also the renewal and refreshment of the discipline. The book at a glance: the first part of the volume is devoted to the study of child language acquisition in monolingual, impaired and bilingual acquisition, while the second part focuses on adult learners. In this section, we provide an overview of each chapter. The first study, by Aviya Hacohen, explores the acquisition of compositional telicity in Hebrew L1. Her psycholinguistic approach contributes valuable data to refine theoretical accounts. Through an innovative methodology, she gathers information from adults and children on the influence of definiteness, number, and the mass vs countable distinction on the constitution of a telic interpretation of the verb phrase. She notices that the notion of definiteness is mastered by children as young as 10, while the mass/count distinction does not appear before 10;7. However, this does not entail an adult-like use of telicity. She therefore concludes that, beyond definiteness and noun type, pragmatics may play an important role in the derivation of Hebrew compositional telicity. For the second chapter we move from a Semitic language to a Slavic one. Milena Kuehnast focuses on the acquisition of negative imperatives in Bulgarian, a form that is grammatical only with the imperfective form of the verb. The study examines how 40 Bulgarian children distributed in two age groups (15 between 2;11 and 3;11, and 25 between 4;00 and 5;00) develop with respect to the acquisition of imperfective viewpoints and the use of imperfective morphology. It shows an evolution in the recourse to expression of force in the use of negative imperatives, as well as the influence of morphological complexity on the successful production of forms. With Yi-An Lin's study, we turn to another type of informant and another framework. Indeed, he studies the production of children suffering from Specific Language Impairment (SLI), a developmental language disorder the causes of which exclude cognitive impairment, psycho-emotional disturbance, and motor-articulatory disorders. Using the Leonard corpus in CLAN, Lin aims to test two competing accounts of SLI (the Agreement and Tense Omission Model [ATOM] and his own Phonetic Form Deficit Model [PFDM]) that conflict on the role attributed to spellout in the impairment.
Spellout is the point at which the Computational System for Human Language (CHL) passes over the most recently derived part of the derivation to the interface components, Phonetic Form (PF) and Logical Form (LF). ATOM claims that SLI sufferers have a deficit in their syntactic representation, while PFDM suggests that the problem only occurs at the spellout level. After studying the corpus from the point of view of tense/agreement marking, case marking, argument movement and auxiliary inversion, Lin finds further support for his model. Olga Gupol, Susan Rohstein and Sharon Armon-Lotem's chapter offers a welcome bridge between child language acquisition and multilingualism. Their study explores the influence of intensive exposure to L2 Hebrew on the development of L1 Russian tense and aspect morphology through an elicited narrative. Their informants are 40 Russian-Hebrew sequential bilingual children distributed in two age groups, 4;0–4;11 and 7;0–8;0. They come to the conclusion that bilingual children anchor their narratives in the perfective, like monolinguals. However, while aware of grammatical aspect, bilinguals lack the full form-function mapping and tend to overgeneralize the imperfective on the principles of simplicity (as imperfectives are the least morphologically marked forms), universality (as the imperfective covers more functions) and interference. Rafael Salaberry opens the second section, on foreign language learners. In his contribution, he reflects on the difficulty L2 learners of Spanish encounter when it comes to distinguishing between iterativity (conveyed with the use of the preterite) and habituality (expressed through the imperfect). He examines in turn the theoretical views that see, on the one hand, habituality as part of grammatical knowledge and iterativity as pragmatic knowledge, and, on the other hand, both habituality and iterativity as grammatical knowledge. He comes to the conclusion that the use of the preterite as a default past tense marker may explain the impoverished system of aspectual distinctions, not only at beginner but also at advanced levels, which may indicate that the system is differentially represented among L1 and L2 speakers. Acquiring the vast array of functions conveyed by a form is therefore no mean feat, as confirmed by the next study. Based on prototype theory, Kathleen Bardovi-Harlig's chapter focuses on the development of the progressive in L2 English. It opens with an overview of the functions of the progressive in English. Then, a review of acquisition research on the progressive in English and other languages is provided. The bulk of the chapter reports on a longitudinal study of 16 learners of L2 English and shows how their use of the progressive expands from the prototypical uses of process and continuousness to the less prototypical uses of repetition and future. The study concludes that the progressive spreads in interlanguage in accordance with prototype accounts. However, it suggests additional stages, not predicted by the Aspect Hypothesis, in the development from activities and accomplishments, at least for the meaning of repeatedness. A similar theoretical framework is adopted in the following chapter, but it deals with a lesser studied language. Hyun-Jin Kim revisits the claims of the Aspect Hypothesis in relation to the acquisition of L2 Korean by two L1 English learners.
Inspired by studies on L2 Japanese, she focuses on the emergence and spread of the past/perfective marker -ess- and the progressive -ko iss- in the interlanguage of her informants throughout their third and fourth semesters of study. The data collected through six sessions of conversational interviews and picture description tasks seem to support the Aspect Hypothesis. Indeed, learners show a strong association between past tense and accomplishments/achievements at the start and a gradual extension to other types; a limited use of the past/perfective marker with states; and an affinity of the progressive with activities/accomplishments and, later, achievements. In addition, -ko iss- moves from progressive to resultative in the specific category of Korean verbs meaning wear/carry. While the previous contributions focus on function, Evgeniya Sergeeva and Jean-Pierre Chevrot's is interested in form. The authors explore the acquisition of verbal morphology in L2 French by 30 instructed native speakers of Russian distributed across low and high proficiency levels. They use an elicitation task for verbs with different models of stem alternation and study how token frequency and base forms influence stem selection. The analysis shows that frequency affects correct production, especially among learners with high proficiency. As for substitution errors, it appears that forms with a simple structure are systematically more frequent than the target forms they replace. When a complex form serves as a substitute, it is more frequent only when it is replacing another complex form. As regards the use of base forms, the third person singular of the present – and to some extent the infinitive – plays this role in the corpus. The authors therefore conclude that the processing of surface forms can be influenced positively or negatively by the frequency of the target forms and of other competing stems, and by the proximity of the target stem to a base form. Finally, Martin Howard's contribution takes up the challenge of focusing on the poorer relation of the TAM system. On the basis of L2 French data obtained through sociolinguistic interviews, he studies the expression of futurity, the conditional and the subjunctive in three groups of university learners with classroom teaching only (two or three years of university teaching) or with a mixture of classroom teaching and naturalistic exposure (two years at university plus one year abroad). An analysis of relative frequencies leads him to suggest a continuum of use going from the futurate present to the conditional with past hypothetical conditional clauses in si, which needs to be confirmed by further studies. Acknowledgements: the present volume was inspired by the conference Acquisition of Tense – Aspect – Mood in First and Second Language held on 9th and 10th February 2008 at Aston University (Birmingham, UK), where over 40 delegates from four continents and over a dozen countries met for lively and enjoyable discussions.
This collection of papers was double peer-reviewed by an international scientific committee made up of Kathleen Bardovi-Harlig (Indiana University), Christine Bozier (Lund Universitet), Alex Housen (Vrije Universiteit Brussel), Martin Howard (University College Cork), Florence Myles (Newcastle University), Urszula Paprocka (Catholic University of Lublin), †Clive Perdue (Université Paris 8), Michel Pierrard (Vrije Universiteit Brussel), Rafael Salaberry (University of Texas at Austin), Suzanne Schlyter (Lund Universitet), Richard Towell (Salford University), and Daniel Véronique (Université d'Aix-en-Provence). We are very much indebted to that scientific committee for their insightful input at each step of the project. We are also grateful to the Association for French Language Studies for its financial support through a workshop grant, and to the Aston Modern Languages Research Foundation for funding the proofreading of the manuscript.

Relevance:

30.00%

Publisher:

Abstract:

Technological capabilities in Chinese manufacturing have been transformed in the last three decades. However, the extent to which domestic-market-oriented, state-owned enterprises (SOEs) have developed their capabilities is not clear. Six SOEs in the automotive, steel and machine tools sectors in Beijing and Tianjin have been studied since the mid-1990s to assess the capability levels attained and the role of external sources and internal efforts in developing them. Aided by government policies, the acquisition of technology and their own efforts, the case study companies appear to be broadly following the East Asian late industrialisation model. All six enterprises demonstrate competences in operating established technology, managing investment and making product and process improvements. The evidence suggests that companies without foreign joint venture (JV) collaborations have made more progress in this respect.

Relevance:

30.00%

Publisher:

Abstract:

Methodologies for understanding business processes and their information systems (IS) are often criticized either for being too imprecise and philosophical (a criticism often levelled at softer methodologies) or for being too hierarchical and mechanistic (levelled at harder methodologies). The process-oriented holonic modelling methodology combines aspects of softer and harder approaches to aid modellers in designing business processes and associated IS. The methodology uses holistic thinking and a construct known as the holon to build process descriptions into a set of models known as a holarchy. This paper describes the methodology through an action research case study based in a large design and manufacturing organization. The scientific contribution is a methodology for analysing business processes in environments that are characterized by high complexity, low volume and high variety, where there are minimal repeated learning opportunities, such as large IS development projects. The practical deliverables from the project gave IS and business process improvements for the case study company.
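
As a purely illustrative sketch (the holon construct and the holarchy come from the methodology described above, but the class, method and process names below are invented for the example), a holarchy can be pictured as a recursive composition of holons, each of which is both a whole and a part:

```python
# Illustrative sketch: a holon as a node that is simultaneously a whole and a
# part, composed into a holarchy of process descriptions.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Holon:
    name: str
    description: str = ""
    parts: list[Holon] = field(default_factory=list)

    def add(self, part: Holon) -> Holon:
        """Attach a sub-holon and return it so holarchies can be built fluently."""
        self.parts.append(part)
        return part

    def outline(self, depth: int = 0) -> str:
        """Indented textual view of the holarchy rooted at this holon."""
        lines = ["  " * depth + self.name]
        for p in self.parts:
            lines.append(p.outline(depth + 1))
        return "\n".join(lines)

# Hypothetical example holarchy
order_fulfilment = Holon("Order fulfilment")
order_fulfilment.add(Holon("Design product"))
make = order_fulfilment.add(Holon("Manufacture product"))
make.add(Holon("Plan production"))
make.add(Holon("Assemble"))
print(order_fulfilment.outline())
```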

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on part of the work for the UNIDO initiative on technology transfer for sustainable industrial development. The proposed technology transfer framework, adapted from the East Asian late industrialisers model, identifies two categories of countries requiring support to enhance their technological capabilities: (a) very late industrialisers (“low income” developing countries) and (b) slow industrialisers (countries with sizeable manufacturing sectors but limited success in gaining international competitiveness); and three technology transfer routes: (a) through trade and aid, to strengthen indigenous production for domestic markets (Route 1); (b) through FDI and contracting, to develop export-oriented firms (Route 2); and (c) through the supply chain of capital equipment and materials, to develop local subcontracting capacity (Route 3). Very late industrialisers need support to start with Route 1 in selected sectors and upgrade through imported mature technologies. Appropriate product innovations are also possible. The slow industrialisers have more scope for increased technology transfer through Routes 2 and 3.

Relevance:

30.00%

Publisher:

Abstract:

Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of the International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference, held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organization. Achieving the desired level of commitment from top management can, however, be a difficulty. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution, in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues. It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry, where it has been found that there is still some confusion over the roles of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers, which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality: the first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both the design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed.
This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers they develop an analytical model which is applicable to the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with, and can be instrumental in, moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars, who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern.
Together, however, they also argue or demonstrate the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.

Relevance:

30.00%

Publisher:

Abstract:

Aims - To develop a method that prospectively assesses adherence rates in paediatric patients with acute lymphoblastic leukaemia (ALL) who are receiving the oral thiopurine treatment 6-mercaptopurine (6-MP). Methods - A total of 19 paediatric patients with ALL who were receiving 6-MP therapy were enrolled in this study. A new objective tool (hierarchical cluster analysis of drug metabolite concentrations) was explored as a novel approach to assess non-adherence to oral thiopurines, in combination with other objective measures (the pattern of variability in 6-thioguanine nucleotide erythrocyte concentrations and 6-thiouric acid plasma levels) and the subjective measure of self-reported adherence questionnaire. Results - Parents of five ALL patients (26.3%) reported at least one aspect of non-adherence, with the majority (80%) citing “carelessness at times about taking medication” as the primary reason for non-adherence followed by “forgetting to take the medication” (60%). Of these patients, three (15.8%) were considered non-adherent to medication according to the self-reported adherence questionnaire (scored ≥ 2). Four ALL patients (21.1%) had metabolite profiles indicative of non-adherence (persistently low levels of metabolites and/or metabolite levels clustered variably with time). Out of these four patients, two (50%) admitted non-adherence to therapy. Overall, when both methods were combined, five patients (26.3%) were considered non-adherent to medication, with higher age representing a risk factor for non-adherence (P < 0.05). Conclusions - The present study explored various ways to assess adherence rates to thiopurine medication in ALL patients and highlighted the importance of combining both objective and subjective measures as a better way to assess adherence to oral thiopurines.
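
As an illustrative sketch of the clustering idea only (hypothetical concentrations and an arbitrary two-cluster cut, not the study's data, methods or thresholds), serial metabolite measurements can be grouped hierarchically and the separation between clusters inspected:

```python
# Illustrative sketch: hierarchical clustering of a patient's serial metabolite
# concentrations; measurements splitting into widely separated clusters over
# time could flag erratic drug exposure and hence possible non-adherence.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical 6-TGN erythrocyte concentrations over ten clinic visits
concentrations = np.array([310, 295, 305, 120, 115, 300, 290, 110, 305, 298])

# Agglomerative (Ward) clustering on the one-dimensional concentration values
Z = linkage(concentrations.reshape(-1, 1), method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")

cluster_means = [concentrations[labels == k].mean() for k in np.unique(labels)]
spread = max(cluster_means) - min(cluster_means)
print("cluster labels per visit:", labels)
print("between-cluster spread:", round(float(spread), 1))
# A large spread between clusters of visits would prompt closer review of adherence.
```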

Relevance:

30.00%

Publisher:

Abstract:

The publication represents a multi-dimensional and multi-faceted, in-depth assessment of the most significant determinants of the EU's development as a political, economic and legal entity, in the format emerging from the Lisbon Treaty. The book represents an important contribution to our understanding of the most profound issues in the recent process of EU integration, including the issue of maintaining its cohesion and coherence under the stress of the global challenges also faced by the European Union. The authors formulate worthwhile conclusions of high value not only for academics but also for political decision-makers, which gives the book a competitive edge over more theoretical and, hence, less practice-oriented works. The argumentation presented in the book will not be left without a reaction from academic and professional circles. I take it almost for granted that the overall setting of the argumentation presented in it, as well as the specific points made in its various chapters, will find adequate resonance in the high-profile discussion likely to emerge after the book is published.

Relevance:

30.00%

Publisher:

Abstract:

The Teallach project has adapted model-based user-interface development techniques to the systematic creation of user-interfaces for object-oriented database applications. Model-based approaches aim to provide designers with a more principled approach to user-interface development using a variety of underlying models, and tools which manipulate these models. Here we present the results of the Teallach project, describing the tools developed and the flexible design method supported. Distinctive features of the Teallach system include provision of database-specific constructs, comprehensive facilities for relating the different models, and support for a flexible design method in which models can be constructed and related by designers in different orders and in different ways, to suit their particular design rationales. The system then creates the desired user-interface as an independent, fully functional Java application, with automatically generated help facilities.