Abstract:
Job satisfaction is a significant predictor of organisational innovation – especially where employees (including shop-floor workers) experience variety in their jobs and work in a single-status environment. The relationship between job satisfaction and performance has long intrigued work psychologists. The idea that "happy workers are productive workers" underpins many theories of performance, leadership, reward and job design. But contrary to popular belief, the relationship between job satisfaction and performance at individual level has been shown to be relatively weak. Research investigating the link between job satisfaction and creativity (the antecedent to innovation) shows that job dissatisfaction promotes creative outcomes. The logic is that those who are dissatisfied (and have decided to stay with the organisation) are determined to change things and have little to lose in doing so (see JM George & J Zhou, 2002). We were therefore surprised to find in the course of our own research into managerial practices and employee attitudes in manufacturing organisations that job satisfaction was a highly significant predictor of product and technological innovation. These results held even when the research was conducted longitudinally, over two years, and prior innovation was controlled for. In other words, job satisfaction was a stronger predictor of innovation than any pre-existing orientation organisations had towards working innovatively. Using prior innovation as a control variable, as well as a longitudinal research design, strengthened our case against the argument that people are satisfied because they belong to a highly innovative organisation. We found that the relationship between job satisfaction and innovation was stronger still where organisations showed that they were committed to promoting job variety, especially at shop-floor level. We developed precise instruments to measure innovation, taking into account the magnitude of the innovation both in terms of the number of people involved in its implementation and how new and different it was. Using this instrument, we were able to give each organisation in our sample a "score" from one to seven for innovation in areas ranging from administration to production technology. We found that much innovation is incremental, involving relatively minor improvements, rather than major change. To achieve sustained innovation, organisations have to draw on the skills and knowledge of employees at all levels. We also measured job satisfaction at organisational level, constructing a mean "job satisfaction" score for all organisations in our sample, and drawing only on those companies whose employees tended to respond in a similar manner to the questions they were asked. We argue that where most of the workforce experience job satisfaction, employees are more likely to collaborate, to share ideas and aim for high standards because people are keen to sustain their positive feelings. Job variety and single-status arrangements further strengthen the relationship between satisfaction and performance. This makes sense; where employees experience variety, they are exposed to new and different ideas and, provided they feel positive about their jobs, are likely to be willing to try to apply these ideas to improve their jobs.
Similarly, staff working in single-status environments where hierarchical barriers are reduced are likely to feel trusted and valued by management and there is evidence (see G Jones & J George, 1998) that people work collaboratively and constructively with those they trust. Our study suggests that there is a strong business case for promoting employee job satisfaction. Managers and HR practitioners need to ensure their strategies and practices support and sustain job satisfaction among their workforces to encourage constructive, collaborative and creative working. It is more important than ever for organisations to respond rapidly to demands of the external environment. This study shows the positive association between organisational-level job satisfaction and innovation. So if a happy workforce is the key to unlocking innovation and organisations want to thrive in the global economy, it is vital that managers and HR practitioners pay close attention to employee perceptions of the work environment. In a world where the most innovative survive it could make all the difference.
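To make the analytical design described above concrete, here is a minimal sketch of an organisation-level moderated regression: innovation predicted by mean job satisfaction, controlling for prior innovation, with a satisfaction × variety interaction. All data, column names and the exact specification are illustrative assumptions, not the study's own dataset or model.

```python
# Minimal sketch (hypothetical data): innovation at time 2 regressed on mean
# job satisfaction, controlling for prior innovation, with a satisfaction x
# job-variety interaction capturing the moderation described in the abstract.
import pandas as pd
import statsmodels.formula.api as smf

# One row per organisation; scores on 1-7 scales (made-up values).
df = pd.DataFrame({
    "innovation_t2":    [4.1, 5.3, 3.2, 6.0, 2.8, 4.7, 5.1, 3.6],
    "innovation_t1":    [3.8, 4.9, 3.5, 5.1, 3.0, 4.2, 4.6, 3.3],
    "job_satisfaction": [3.9, 4.4, 3.1, 4.8, 2.9, 4.0, 4.5, 3.4],
    "job_variety":      [3.2, 4.1, 2.8, 4.6, 2.5, 3.7, 4.2, 3.0],
})

model = smf.ols(
    "innovation_t2 ~ innovation_t1 + job_satisfaction * job_variety",
    data=df,
).fit()
print(model.summary())
```

In a formula of this kind, `job_satisfaction * job_variety` expands to both main effects plus their product term; the coefficient on the product term is where the moderation effect is read off.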
Abstract:
This thesis is about the discretionary role of the line manager in inspiring the work engagement of staff and their resulting innovative behaviour, examined through the lens of Social Exchange Theory (Blau, 1964) and the Job Demands-Resources theory (Bakker, Demerouti, Nachreiner & Schaufeli, 2001). The study is focused on a large British public sector organisation undergoing a major shift in the way in which it operates as part of the public sector. It is often claimed that people do not leave organisations; they leave line managers (Kozlowski & Doherty, 1989). Despite what the literature tells us about the importance of the line manager in organisations (Purcell, 2003), the engagement literature in particular gives little consideration to such a fundamental figure in organisational life. Further, the understanding of the black box of managerial discretion and its relationship to employee- and organisation-related outcomes would benefit from greater exploration (Purcell, 2003; Gerhart, 2005; Scott, et al, 2009). The purpose of this research is to address these gaps in relation to the innovative behaviour of employees in the public sector – behaviour not typically associated with the public sector (Bhatta, 2003; McGuire, Stoner & Mylona, 2008; Hughes, Moore & Kataria, 2011). The study is a CASE Award PhD thesis, requiring both academic and practical elements to the research. It examines one case organisation, focusing on one service characterised by a high level of adoption of Strategic Human Resource Management (SHRM) activities and operating in an unusual manner for the public sector, facing private sector competition for work. The study involved a mixed-methods approach to data collection. Preliminary focus groups with 45 participants were conducted, followed by an ethnographic period of five months embedded in the service, conducting interviews and observations. This culminated in a quantitative survey delivered within the wider directorate to approximately 500 staff members. The study used aspects of the Grounded Theory approach (Glaser & Strauss, 1967) to analyse the data and developed results that highlight the importance of the line manager for engaging employees and encouraging innovative behaviour in an area characterised by SHRM and organisational change. The survey was completed on behalf of the organisation and its findings are presented in Appendix 1, in order to keep the focus of the PhD on theory development. The core finding is that line managers’ discretion surrounding the provision of job resources (in particular trust, autonomy and the implementation and interpretation of combined bundles of SHRM policies and procedures) influenced the exchange process by which employees responded with work engagement and innovative behaviour; implications for theory and practice are discussed alongside this finding. Limitations of the research are those commonly attributed to cross-sectional data collection and those surrounding the generalisability of the qualitative findings beyond the contextual factors characterising the service area. Suggestions for future research involve addressing these limitations and further exploring the discretionary role of the line manager.
Abstract:
Several host systems are available for the production of recombinant proteins, ranging from Escherichia coli to mammalian cell-lines. This article highlights the benefits of using yeast, especially for more challenging targets such as membrane proteins. On account of the wide range of molecular, genetic, and microbiological tools available, use of the well-studied model organism, Saccharomyces cerevisiae, provides many opportunities to optimize the functional yields of a target protein. Despite this wealth of resources, it is surprisingly under-used. In contrast, Pichia pastoris, a relative new-comer as a host organism, is already becoming a popular choice, particularly because of the ease with which high biomass (and hence recombinant protein) yields can be achieved. In the last few years, advances have been made in understanding how a yeast cell responds to the stress of producing a recombinant protein and how this information can be used to identify improved host strains in order to increase functional yields. Given these advantages, and their industrial importance in the production of biopharmaceuticals, I argue that S. cerevisiae and P. pastoris should be considered at an early stage in any serious strategy to produce proteins.
Abstract:
Purpose – The purpose of this paper is to investigate an underexplored aspect of outsourcing involving a mixed strategy in which parallel production is continued in-house at the same time as outsourcing occurs. Design/methodology/approach – The study applied a multiple case study approach and drew on qualitative data collected through in-depth interviews with wood product manufacturing companies. Findings – The paper posits that there should be a variety of mixed strategies between the two governance forms of “make” or “buy.” In order to address how companies should consider the extent to which they outsource, the analysis was structured around two ends of a continuum: in-house dominance or outsourcing dominance. With an in-house-dominant strategy, outsourcing complements an organization's own production to optimize capacity utilization and outsource less cost-efficient production, or is used as a tool to learn how to outsource. With an outsourcing-dominant strategy, in-house production helps maintain complementary competencies and avoids lock-in risk. Research limitations/implications – This paper takes initial steps toward an exploration of different mixed strategies. Additional research is required to understand the costs of different mixed strategies compared with insourcing and outsourcing, and to study parallel production from a supplier viewpoint. Practical implications – This paper suggests that managers should think twice before rushing to a “me too” outsourcing strategy in which in-house capacities are completely closed. It is important to take a dynamic view of outsourcing that maintains a mixed strategy as an option, particularly in situations that involve an underdeveloped supplier market and/or as a way to develop resources over the long term. Originality/value – The concept of combining both “make” and “buy” is not new. However, little if any research has focussed explicitly on exploring the variety of different types of mixed strategies that exist on the continuum between insourcing and outsourcing.
Abstract:
Integrated supplier selection and order allocation is an important decision for both designing and operating supply chains. This decision is often influenced by the concerned stakeholders – suppliers, plant operators and customers – in different tiers. As firms continue to seek competitive advantage through supply chain design and operations, they aim to create optimized supply chains. This calls for, on the one hand, consideration of multiple conflicting criteria and, on the other, consideration of uncertainties in demand and supply. Although there are studies on supplier selection that use advanced mathematical models to adopt a stochastic approach, multiple-criteria decision-making techniques or multiple stakeholder requirements separately, to the authors' knowledge there is no work that integrates these three aspects in a common framework. This paper proposes an integrated method for dealing with such problems using a combined Analytic Hierarchy Process-Quality Function Deployment (AHP-QFD) and chance-constrained optimization algorithm approach that selects appropriate suppliers and allocates orders optimally between them. The effectiveness of the proposed decision support system has been demonstrated through application and validation in the bioenergy industry.
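The abstract names two building blocks: AHP-derived supplier priorities and a chance constraint on uncertain demand. The sketch below shows one plausible way to wire them together; the comparison matrix, costs, capacities, demand distribution and the weight-penalised objective are all made-up assumptions, not the paper's AHP-QFD formulation.

```python
# Minimal sketch (assumed data): AHP priority weights folded into a
# chance-constrained order-allocation LP.
import numpy as np
from scipy.stats import norm
from scipy.optimize import linprog

# AHP step: priority weights from a pairwise comparison matrix
# (three suppliers compared on one aggregated criterion).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()  # principal-eigenvector priorities, summing to 1

# Chance constraint: demand D ~ Normal(mu, sigma); require
# P(total order >= D) >= 1 - alpha, i.e. sum(x) >= mu + z_{1-alpha} * sigma.
mu, sigma, alpha = 1000.0, 120.0, 0.05
required = mu + norm.ppf(1 - alpha) * sigma

unit_cost = np.array([10.0, 9.0, 8.5])
capacity = np.array([600.0, 500.0, 400.0])

# Cost penalised by (1 - AHP weight), so better-rated suppliers are preferred;
# one simple way to fold stakeholder preferences into the objective.
c = unit_cost * (2.0 - weights)

res = linprog(
    c,
    A_ub=[[-1.0, -1.0, -1.0]], b_ub=[-required],  # sum(x) >= required
    bounds=list(zip([0.0] * 3, capacity)),
    method="highs",
)
print("AHP weights:", np.round(weights, 3))
print("Order allocation:", np.round(res.x, 1))
```

The normal-demand reformulation of the chance constraint is the standard deterministic equivalent; in the paper's setting the QFD step would supply the criteria weights that are simply assumed here.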
Abstract:
Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes in very large proteins, and even ribosomes, to be investigated. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological function are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: The requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
Waste biomass is generated during the conservation management of semi-natural habitats, and represents an unused resource and potential bioenergy feedstock that does not compete with food production. Thermogravimetric analysis was used to characterise a representative range of biomass generated during conservation management in Wales. Of the biomass types assessed, those dominated by rush (Juncus effusus) and bracken (Pteridium aquilinum) exhibited the highest and lowest volatile compositions respectively, and were selected for bench-scale conversion via fast pyrolysis. Each biomass type was ensiled and a sub-sample of silage was washed and pressed. Demineralization of conservation biomass through washing and pressing was associated with higher oil yields following fast pyrolysis. The oil yields were within the published range established for the dedicated energy crops miscanthus and willow. In order to examine the potential, a multiple-output energy system was developed with gross power production estimates following valorisation of the press fluid, char and oil. If used in multi-fuel industrial burners, the char and oil alone would displace 3.9 × 10⁵ tonnes per year of No. 2 light oil using Welsh biomass from conservation management. Bioenergy and product development using these feedstocks could simultaneously support biodiversity management and displace fossil fuels, thereby reducing GHG emissions. Gross power generation predictions show good potential.
Abstract:
Saturation mutagenesis is a powerful tool in modern protein engineering, which permits key residues within a protein to be targeted in order to potentially enhance specific functionalities. However, the creation of large libraries using conventional saturation mutagenesis with degenerate codons (NNN or NNK/S) has inherent redundancy and consequent disparities in codon representation. Therefore, both chemical (trinucleotide phosphoramidites) and biological methods (sequential, enzymatic single-codon additions) of non-degenerate saturation mutagenesis have been developed in order to combat these issues and so improve library quality. Large libraries with multiple saturated positions can be limited by the method used to screen them. Although traditionally the screening methods of choice, cell-dependent methods such as phage display are limited by the need for transformation. A number of cell-free screening methods, such as CIS display, which link the screened phenotype with the encoded genotype, have the capability of screening libraries with up to 10¹⁴ members. This thesis describes the further development of ProxiMAX technology to reduce library codon bias and its integration with CIS display to screen the resulting library. Synthetic MAX oligonucleotides are ligated to an acceptor base sequence, amplified and digested, subsequently adding a randomised codon to the acceptor; this forms an iterative cycle in which the digested product of the previous cycle serves as the base sequence for the next. Initial use of ProxiMAX highlighted areas of the process where changes could be implemented in order to improve the codon representation in the final library. The refined process was used to construct a monomeric anti-NGF peptide library, based on two proprietary dimeric peptides (Isogenica) that bind NGF. The resulting library showed greatly improved codon representation that equated to a theoretical diversity of ~69%. The library was subsequently screened using CIS display and the discovered peptides assessed for NGF-TrkA inhibition by ELISA. Despite binding to TrkA, these peptides showed lower levels of inhibition of the NGF-TrkA interaction than the parental dimeric peptides, highlighting the importance of dimerization for inhibition of NGF-TrkA binding.
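To see the codon bias that motivates non-degenerate saturation mutagenesis, the short sketch below counts how many codons encode each amino acid under NNN and NNK degeneracy using the standard genetic code. It is an independent illustration of the redundancy problem, not part of the ProxiMAX protocol itself.

```python
# Minimal sketch: redundancy of degenerate saturation codons (NNN vs NNK)
# under the standard genetic code ('*' marks stop codons).
from collections import Counter
from itertools import product

BASES = "TCAG"
# Standard genetic code, codons ordered TTT, TTC, TTA, TTG, TCT, ...
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(c): aa for c, aa in zip(product(BASES, repeat=3), AA)}

def profile(third_position_bases: str) -> Counter:
    """Count codons per amino acid for N-N-<allowed third bases> codons."""
    codons = ("".join(pair) + b
              for pair in product(BASES, repeat=2)
              for b in third_position_bases)
    return Counter(CODON_TABLE[codon] for codon in codons)

for name, third in [("NNN", "TCAG"), ("NNK", "GT")]:
    counts = profile(third)
    residue_counts = [v for k, v in counts.items() if k != "*"]
    print(f"{name}: {sum(counts.values())} codons, {counts['*']} stop codon(s), "
          f"codons per residue range {min(residue_counts)}-{max(residue_counts)}")
```

Under NNN, leucine, serine and arginine are each encoded six times while methionine and tryptophan appear once; NNK halves the library but still leaves a three-fold spread, which is why single-codon (MAX-type) approaches give a more even representation.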