33 results for Value Systems And The Professionalization Of Hoteliers
Abstract:
Purpose - The idea that knowledge needs to be codified is central to many claims that knowledge can be managed. However, there appear to be no empirical studies in the knowledge management context that examine the process of knowledge codification. This paper therefore seeks to explore codification as a knowledge management process. Design/methodology/approach - The paper draws on findings from research conducted around a knowledge management project in a section of the UK Post Office, using a methodology of participant-observation. Data were collected through observations of project meetings, correspondence between project participants, and individual interviews. Findings - The principal findings about the nature of knowledge codification are first, that the process of knowledge codification also involves the process of defining the codes needed to codify knowledge, and second, that people who participate in the construction of these codes are able to interpret and use the codes more similarly. From this it can be seen that the ability of people to decodify codes similarly places restrictions on the transferability of knowledge between them. Research limitations/implications - The paper therefore argues that a new conceptual approach is needed for the role of knowledge codification in knowledge management that emphasizes the importance of knowledge decodification. Such an approach would start with one's ability to decodify rather than codify knowledge as a prerequisite for knowledge management. Originality/value - The paper provides a conceptual basis for explaining limitations to the management and transferability of knowledge. © Emerald Group Publishing Limited.
Abstract:
Vaccination remains a key tool in the protection against and eradication of diseases. However, the development of new safe and effective vaccines is not easy. Various live-organism-based vaccines currently licensed exhibit high efficacy; however, this benefit is associated with risk, due to the adverse reactions found with these vaccines. Therefore, in the development of vaccines, the associated risk-benefit issues need to be addressed. Sub-unit proteins offer a much safer alternative; however, their efficacy is low. The use of adjuvanted systems has proven to enhance the immunogenicity of these sub-unit vaccines through protection (i.e. preventing degradation of the antigen in vivo) and enhanced targeting of these antigens to professional antigen-presenting cells. Understanding the immunological implications of the related disease will enable validation for the design and development of potential adjuvant systems. Novel adjuvant research combines pharmaceutical analysis with detailed immunological investigation, whereby pharmaceutically designed adjuvants are driven by an increased understanding of the mechanisms of adjuvant activity, largely facilitated by the description of highly specific innate immune recognition of components usually associated with the presence of invading bacteria or viruses. The majority of pharmaceutical adjuvants currently being investigated are particulate delivery systems, such as liposome formulations. As an adjuvant, liposomes have been shown to enhance immunity against the associated disease, particularly when a cationic lipid is used within the formulation. In addition, the inclusion of components such as immunomodulators further enhances immunity. Within this review, the use and application of effective adjuvants are investigated, with particular emphasis on liposomal-based systems.
The mechanisms of adjuvant activity, analysis of complex immunological characteristics and formulation and delivery of these vaccines are considered.
Abstract:
This work presents significant developments in chaotic mixing induced through periodic boundaries and twisting flows. Three-dimensional closed and throughput domains are shown to exhibit chaotic motion under both time-periodic and time-independent boundary motions. A property is developed, originating from a signature of chaos, sensitive dependence on initial conditions, which successfully quantifies the degree of disorder within the mixing systems presented and enables comparisons of the disorder throughout ranges of operating parameters. This work omits physical experimental results but presents a significant computational investigation into chaotic systems using commercial computational fluid dynamics (CFD) techniques. Physical experiments with chaotic mixing systems are, by their very nature, difficult to extract information from beyond the recognition that disorder does, does not, or partially occurs. The initial aim of this work is to observe whether it is possible to accurately simulate previously published physical experimental results using commercial CFD techniques. This is shown to be possible for simple two-dimensional systems with time-periodic wall movements. From this, and subsequent macro- and microscopic observations of flow regimes, a simple explanation is developed for how boundary operating parameters affect the system disorder. Consider the classic two-dimensional rectangular cavity with time-periodic velocity of the upper and lower walls, causing two opposing streamline motions. The degree of disorder within the system is related to the magnitude of displacement of individual particles within these opposing streamlines. This rationale is then employed to develop and investigate more complex three-dimensional mixing systems that exhibit throughput and time independence, and which are therefore more realistic and a significant advance towards designing chaotic mixers for the process industries.
Domains inducing chaotic motion through twisting flows are also briefly considered. This work concludes by offering possible advancements to the property developed to quantify disorder and suggestions of domains and associated boundary conditions that are expected to produce chaotic mixing.
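The signature of chaos the abstract builds on, sensitive dependence on initial conditions, can be illustrated with a toy model. The sketch below is an illustration only, not the thesis's property or its CFD domains: it advects two nearby tracer particles through the alternating sine flow, a standard time-periodic mixing map on the unit torus, and estimates their average exponential separation rate per period.

```python
import math

def sine_flow_step(x, y, amp):
    """One period of the alternating sine flow on the unit torus:
    a horizontal sinusoidal shear followed by a vertical one, a
    common toy model for time-periodic chaotic mixing."""
    x = (x + amp * math.sin(2.0 * math.pi * y)) % 1.0
    y = (y + amp * math.sin(2.0 * math.pi * x)) % 1.0
    return x, y

def divergence_rate(x0, y0, amp, delta=1e-8, periods=50):
    """Finite-time estimate of the mean exponential separation rate
    of two initially close tracers (a simple proxy for sensitive
    dependence on initial conditions), using the standard trick of
    rescaling the pair back to separation delta each period."""
    xa, ya = x0, y0
    xb, yb = (x0 + delta) % 1.0, y0
    total = 0.0
    for _ in range(periods):
        xa, ya = sine_flow_step(xa, ya, amp)
        xb, yb = sine_flow_step(xb, yb, amp)
        # minimal-image separation on the periodic unit square
        dx = (xb - xa + 0.5) % 1.0 - 0.5
        dy = (yb - ya + 0.5) % 1.0 - 0.5
        sep = math.hypot(dx, dy)
        total += math.log(sep / delta)
        # renormalise: keep the pair infinitesimally close
        xb = (xa + delta * dx / sep) % 1.0
        yb = (ya + delta * dy / sep) % 1.0
    return total / periods
```

A strongly driven flow yields a markedly higher separation rate than a weakly driven one; comparing this rate across operating parameters is the kind of disorder comparison the thesis's property enables for its cavity and throughput domains.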
Abstract:
The research described in this thesis investigates three issues related to the use of expert systems (ESs) for decision making in organizations: the effectiveness of ESs when used in different roles, either to replace a human decision maker or to advise one; users' behaviour and opinions towards using an expert advisory system; and the possibility of organization-wide deployment of expert systems, together with the role of an ES at different organizational levels. The research was based on the development of expert systems within a business game environment, a simulation of a manufacturing company. This was chosen to give more control over the `experiments' than would be possible in a real organization. An expert system (EXGAME) was developed, based on a structure derived from Anthony's three levels of decision making, to manage the simulated company in the business game itself with little user intervention. On the basis of EXGAME, an expert advisory system (ADGAME) was built to help game players make better decisions in managing the game company. EXGAME and ADGAME are thus two expert systems in the same domain performing different roles; it was found that ADGAME had, in places, to be different from EXGAME, not simply an extension of it. EXGAME was tested several times against human rivals and was evaluated by measuring its performance. ADGAME was also tested by different users and was assessed by measuring the users' performance and analysing their opinions of it as a decision-making aid. The results showed that an expert system was able to replace a human at the operational level, but had difficulty at the strategic level. They also showed the success of organization-wide deployment of expert systems in this simulated company.
Abstract:
In response to increasing international competitiveness, many manufacturing businesses are rethinking their management strategies and philosophies towards achieving a computer integrated environment. The explosive growth in Advanced Manufacturing Technology (AMT) has resulted in the formation of functional "Islands of Automation" such as Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), Computer Aided Process Planning (CAPP) and Manufacturing Resources Planning (MRPII). This has produced an environment with focussed areas of excellence but poor overall efficiency, co-ordination and control. The main role of Computer Integrated Manufacturing (CIM) is to integrate these islands of automation and develop a totally integrated and controlled environment. However, the various perceptions of CIM, although developing, remain focussed on a very narrow integration scope and have consequently resulted in merely linked islands of automation with little improvement in overall co-ordination and control. The research described in this thesis develops and examines a more holistic view of CIM, based on the integration of various business elements. One particular business element, namely control, has been shown to have a multi-faceted and underpinning relationship with the CIM philosophy. This relationship impacts various CIM system design aspects, including the CIM business analysis and modelling technique, the specification of systems integration requirements, the CIM system architectural form and the degree of business redesign. The research findings show that fundamental changes to CIM system design are required; these are incorporated in a generic CIM design methodology. The effect and influence of this holistic view of CIM on a manufacturing business have been evaluated through various industrial case study applications.
Based on the evidence obtained, it has been concluded that this holistic, control based approach to CIM can provide a greatly improved means of achieving a totally integrated and controlled business environment. This generic CIM methodology will therefore make a significant contribution to the planning, modelling, design and development of future CIM systems.
Abstract:
This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order and in what quantity are the controllable or independent variables in cost expressions which are minimised. The four systems considered are referred to as (Q,R), (nQ,R,T), (M,T) and (M,R,T). With (Q,R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus stock on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ,R,T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. In (M,T), each order increases the order cover to M. Finally, in (M,R,T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q,R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models could be compared with the increases in computational costs. Since the exact model was preferable for the (Q,R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is allowed to follow a probability distribution.
All the sets of equations were programmed for a KDF 9 computer, and the computed performances of the four inventory control procedures are compared under each assumption.
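The continuous-review (Q,R) policy described above lends itself to a short simulation sketch. The following is a simplified discrete-time illustration under assumed costs, demand parameters and a constant lead time, not the thesis's exact models:

```python
import random

def simulate_qr(Q, R, days=10_000, lead_time=3,
                mean_demand=20.0, sd_demand=5.0,
                hold_cost=1.0, backorder_cost=25.0, seed=42):
    """Discrete-time sketch of the continuous-review (Q, R) policy:
    whenever the order cover (net stock plus stock on order) equals
    or falls below R, a fixed quantity Q is ordered.  Demand is
    normal truncated at zero, shortages become backorders rather
    than lost sales, and the lead time is constant.  Returns the
    average holding-plus-backorder cost per day."""
    rng = random.Random(seed)
    net_stock = R + Q        # on-hand stock; negative values are backorders
    pipeline = []            # (arrival_day, quantity) of outstanding orders
    total_cost = 0.0
    for day in range(days):
        # receive any orders due today
        net_stock += sum(q for t, q in pipeline if t == day)
        pipeline = [(t, q) for t, q in pipeline if t != day]
        # daily demand, truncated so it is never negative
        net_stock -= max(0.0, rng.gauss(mean_demand, sd_demand))
        # order cover = net stock + stock on order
        cover = net_stock + sum(q for _, q in pipeline)
        if cover <= R:
            pipeline.append((day + lead_time, Q))
        # linear holding cost on stock held, linear cost on backorders
        total_cost += (hold_cost * max(net_stock, 0.0)
                       + backorder_cost * max(-net_stock, 0.0))
    return total_cost / days
```

Comparing a reorder level that covers lead-time demand (R = 75 with mean demand 20 per day and a 3-day lead time) against R = 0 shows the cost penalty of ordering too late, the kind of trade-off the thesis's cost expressions are minimised over.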
Abstract:
In recent years, freshwater fish farmers have come under increasing pressure from the Water Authorities to control the quality of their farm effluents. This project aimed to investigate methods of treating aquacultural effluent in an efficient and cost-effective manner, and to incorporate the knowledge gained into an Expert System which could then be used in an advice service to farmers. From the results of this research it was established that sedimentation and the use of low pollution diets are the only cost effective methods of controlling the quality of fish farm effluents. Settlement has been extensively investigated and it was found that the removal of suspended solids in a settlement pond is only likely to be effective if the inlet solids concentration is in excess of 8 mg/litre. The probability of good settlement can be enhanced by keeping the ratio of length/retention time (a form of mean fluid velocity) below 4.0 metres/minute. The removal of BOD requires inlet solids concentrations in excess of 20 mg/litre to be effective, and this is seldom attained on commercial fish farms. Settlement, generally, does not remove appreciable quantities of ammonia from effluents, but algae can absorb ammonia by nutrient uptake under certain conditions. The use of low pollution, high performance diets gives pollutant yields which are low when compared with published figures obtained by many previous workers. Two Expert Systems were constructed, both of which diagnose possible causes of poor effluent quality on fish farms and suggest solutions. The first system uses knowledge gained from a literature review and the second employs the knowledge obtained from this project's experimental work. Consent details for over 100 fish farms were obtained from the public registers kept by the Water Authorities. Large variations in policy from one Authority to the next were found. These data have been compiled in a computer file for ease of comparison.
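The settlement rules of thumb reported above can be expressed as a small diagnostic function in the spirit of the project's Expert Systems. This is a hypothetical helper: the thresholds are the figures published in the abstract, but the function itself is illustrative, not the project's actual system.

```python
def settlement_diagnosis(inlet_solids_mg_l, pond_length_m, retention_time_min):
    """Rule-of-thumb diagnosis of a settlement pond using the
    thresholds reported in the abstract: suspended-solids removal
    needs inlet solids above 8 mg/litre, BOD removal needs more
    than 20 mg/litre, and the length/retention-time ratio (a form
    of mean fluid velocity) should stay below 4.0 metres/minute."""
    advice = []
    if inlet_solids_mg_l <= 8.0:
        advice.append("inlet solids too dilute for effective solids removal")
    if inlet_solids_mg_l <= 20.0:
        advice.append("BOD removal unlikely at this inlet solids concentration")
    if pond_length_m / retention_time_min >= 4.0:
        advice.append("mean fluid velocity too high; lengthen retention time")
    return advice or ["settlement pond likely to perform well"]
```

A concentrated, slow-flowing pond (e.g. 25 mg/litre inlet solids, 30 m length, 10 min retention) passes all three checks, while a dilute, fast-flowing one triggers every warning.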
Abstract:
Cationic liposomes of dimethyldioctadecylammonium bromide (DDA) incorporating the glycolipid trehalose 6,6-dibehenate (TDB) form a promising liposomal vaccine adjuvant. To be exploited as effective subunit vaccine delivery systems, the physicochemical characteristics of these liposomes were studied in detail and correlated with their effectiveness in vivo, in an attempt to elucidate key aspects controlling their efficacy. This research took the previously optimised DDA-TDB system as a foundation for a range of formulations incorporating the additional lipids 1,2-dipalmitoyl-sn-glycero-3-phosphocholine (DPPC) or 1,2-distearoyl-sn-glycero-3-phosphocholine (DSPC), by incrementally replacing the cationic content within DDA-TDB or reducing the total DDA-TDB dose upon its substitution, to ascertain the role of DDA and the effect of DDA-TDB concentration in influencing the resultant immunological performance upon delivery of the novel subunit TB vaccine, Ag85B–ESAT-6-Rv2660c (the H56 vaccine). With the aim of using the DPPC-based systems for pulmonary vaccine delivery and the DSPC systems for application via the intramuscular route, initial work focused on physicochemical characterisation of the systems, with incorporation of DPPC or DSPC displaying comparable physical stability, morphological structure and levels of antigen retention to those of DDA-TDB. Thermodynamic analysis was also conducted to detect main phase transition temperatures, and subsequent in vitro cell culture studies demonstrated a favourable reduction in cytotoxicity, stimulation of phagocytic activity and macrophage activation in response to the proposed liposomal immunoadjuvants. Immunisation of mice with the H56 vaccine via the proposed liposomal adjuvants showed that DDA was an important factor in mediating the resultant immune responses, with partial replacement or substitution of DDA-TDB stimulating Th1-type cellular immunity characterised by elevated levels of IgG2b antibodies and IFN-γ
and IL-2 cytokines, essential for providing protective efficacy against TB. Upon increased DSPC content within the formulation, either by DDA replacement or by reduction of DDA and TDB, responses were skewed towards Th2-type immunity, with reduced IgG2b antibody levels and elevated IL-5 and IL-10 cytokine production, while the resultant immunological responses were independent of liposomal zeta potential. The role of the cationic DDA lipid and the effect of DDA-TDB concentration were evident, as the proposed liposomal formulations elicited antigen-specific antibody and cellular immune responses, demonstrating the potential of cationic liposomes to be utilised as adjuvants for subunit vaccine delivery. Furthermore, the promising capability of the novel H56 vaccine candidate in eliciting protection against TB was apparent in a mouse model.
Abstract:
This paper explores the nature of the so-called ‘private equity business model’ (PEBM) and assesses its shortcomings, using the illustrative example of the role of private equity in structuring the finance, and subsequent collapse, of MG Rover, the automotive industry having been a significant destination for private equity financing. The paper outlines the nature of the PEBM. It then details how the PEBM extracts value, before stressing how this can affect workers in a portfolio business. We argue that the emergence of the PEBM changes the basis of competitive rules in organizations and the running of erstwhile going concerns, necessitating further regulation, particularly concerning how to secure wider stakeholder oversight without reducing the efficiency of PEBM concerns.
Abstract:
Purpose – Threats of extreme events, such as terrorist attacks or infrastructure breakdown, are potentially highly disruptive events for all types of organizations. This paper takes a political perspective on power in strategic decision making and examines how this influences planning for extreme events. Design/methodology/approach – A sample of 160 informants drawn from 135 organizations, which are part of the critical national infrastructure in the UK, forms the empirical basis of the paper. Most of these organizations had publicly placed business continuity and preparedness as a strategic priority. The paper adopts a qualitative approach, coding data from focus groups. Findings – In nearly all cases there is a pre-existing dominant coalition which keeps business continuity decisions off the strategic agenda. The only exceptions to this are a handful of organizations which provide continuous production, such as some utilities, where disruption to business as usual can be readily quantified. The data reveal structural and decisional elements of the exercise of power. Structurally, the dominant coalition centralizes control by ensuring that only a few functional interests participate in decision making. Research limitations/implications – Decisional elements of power emphasize the dominance of calculative rationality, where decisions are primarily made on information and arguments which can be quantified. Finally, the paper notes the recursive aspect of power relations whereby agency and structure are mutually constitutive over time. Organizational structures of control are maintained, despite the involvement of managers charged with organizational preparedness and resilience, who remain outside the dominant coalition. Originality/value – The paper constitutes a first attempt to show how planning for emergencies fits within the strategy-making process and how politically controlled this process is.
Abstract:
Focussing on the period from 1948 to 1997, this paper examines the history of rationing in the British National Health Service (NHS), with special reference to the role of hospital accounting in this context. The paper suggests that concerns regarding rationing first emerged in the 1960s and 1970s in response to the application of economic theories to the health services, and that rationing only became an issue of wider concern when the NHS increasingly came to resemble economic models of health services in the early 1990s. The paper moreover argues that, unlike in the USA, hospital accounting did not play a significant role in allocating or withholding health resources in Britain. Rudimentary information systems as well as resistance from medical professionals are identified as significant factors in this context.
Abstract:
Purpose: This research paper aims to examine the global trends in publishing in the leading marketing journals between 1964 and 2008, focusing on how public policy intervention in the assessment and funding of academic research has influenced Britain's relative productivity in the world's leading marketing journals. Design/methodology/approach: The method was an audit of contributions to the leading journals based on the authors' affiliation, country of origin and country in which they obtained their doctoral training. Findings: The results show that the proportion of leading marketing publications by authors affiliated to British universities has held steady at about 2 per cent, while the productivity of several other countries has accelerated past Britain's. However, to retain that share, Britain has increasingly depended upon importing people whose PhD is not British. This contrasts with some other European countries that are now more productive than Britain but mainly recruit locals with local PhDs. The pattern of decline in the UK is related to the impact of Britain's research assessment exercise and the continuation of relatively weak social science research training. Research limitations/implications: The analysis is limited by looking at only one academic discipline and only the top few academic journals in the field. Practical implications: The findings have implications at several levels. At a national policy level, the study questions the value of the research assessment exercises that appear to have presided over a decline in research productivity. For institutions, it questions the value of investing in developing local talent when success has come to those who buy talent internationally. Perhaps the major implication arises from Britain's academic productivity having declined while neighbouring countries have grown in international excellence.
Originality/value: At a time when the continuation of expensive university research assessments is being questioned, the research findings add value to the current debate by showing how that very process has accompanied academic decline. © Emerald Group Publishing Limited.
Abstract:
We first review the potential of Asian knowledge management and organisational learning and contrast this with Western precepts, finding that there seems to be little incentive to 'look after one's fellows' in China (and perhaps across Asia) outside of tight personal guanxi networks. This is likely to be the case in the intense production regions of China, where little time is allowed for 'organisational learning' by the staff and there is little incentive for senior managers to initiate 'knowledge management'. Thus the 'tragedy of the commons' will be enacted by individuals, township and provincial leaders upwards to top ministers: no one will care for the climate or pollution, only for their own group and their wealth creation prospects. The paper ends with a brief discussion of climate change and suggests that a practical solution would be to transfer much of the current air, sea and long-haul trucking of intercontinental freight between China and Europe (and the USA) to maglev systems. Copyright © 2011 Inderscience Enterprises Ltd.
Abstract:
Purpose - The paper develops a model of employee innovative behavior, conceptualizing it as distinct from innovation outputs and as a multi-faceted behavior rather than a simple count of ‘innovative acts’ by employees. It understands individual employee innovative behaviors as a micro-foundation of firm intrapreneurship that is embedded in and influenced by contextual factors such as managerial, organizational and cultural support for innovation. Building from a review of existing employee innovative behavior scales and theoretical considerations, we develop and validate the Innovative Behavior Inventory (IBI) and the Innovation Support Inventory (ISI). Design/methodology/approach – Two pilot studies, a third validation study in the Czech Republic and a fourth cross-cultural validation study using population-representative samples from Switzerland, Germany, Italy and the Czech Republic (N=2812 employees and 450 entrepreneurs) were conducted. Findings - Both inventories were reliable and showed factorial, criterion, convergent and discriminant validity as well as cross-cultural equivalence. Employee innovative behavior was supported as comprising idea generation, idea search, idea communication, implementation starting activities, involving others and overcoming obstacles. Managerial support was the most proximal contextual influence on innovative behavior and mediated the effect of organizational support and national culture. Originality/value - The paper advances our understanding of employee innovative behavior as a multi-faceted phenomenon and of the contextual factors influencing it. Where past research typically focuses on convenience samples within a particular country, we offer the first robust evidence that our model of employee innovative behavior generalizes across cultures and types of samples.
Our model and the IBI and ISI inventories enable researchers to build a deeper understanding of the important micro-foundation underpinning intrapreneurial behavior in organizations and allow practitioners to identify their organizations’ strengths and weaknesses related to intrapreneurship.