31 results for reflection in action


Relevance:

90.00%

Publisher:

Abstract:

Life, and the biochemistry of which it is ultimately composed, is built from the interactions of proteins, and the study of protein-protein interactions is fast becoming a central feature of molecular bioscience. This is as true of immunobiology as it is of other areas of the wider biological milieu. Protein-protein interactions within an immunological setting comprise both the kind familiar from other areas of biology and instantiations of protein-protein interactions special to the immune arena. Of the generic kind of protein-protein interaction, co-stimulatory receptors, such as CD28, and the interaction of accessory proteins, such as CD4 or CD8, are amongst the most prevalent and apposite of examples. The key examples of special immunological instantiations of protein-protein interactions are the binding of antigens by antibodies and the formation of peptide-MHC-TCR complexes; both are prime examples of vital molecular recognition events mediated by protein-protein interactions. In this brief review, and within the context of this burgeoning field, we examine immunological protein-protein interactions, focussing on the problematic nature of defining such interactions. © 2011 by Nova Science Publishers, Inc. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

Translation is a regular phenomenon in news production, even if this is not always explicitly indicated. It is quite common for journalists themselves to perform translations in their text production processes. Online media have added new possibilities to these processes. This paper looks at the transfer between print and online media texts from the point of view of translation. On the basis of case studies of English translations made available online by Spiegel International, the text production practice and its reflection in the linguistic structure of the translations is illustrated. The declared aim of putting English translations on the Spiegel website is to bring its 'unique voice' to English-speaking readers. This paper argues that this 'unique voice' will not be seen by readers in the actual linguistic make-up of the texts; rather, it is the text selection process that gives English-speaking readers access to a different point of view.

Relevance:

80.00%

Publisher:

Abstract:

Increasingly, the body of knowledge derived from strategy theory has been criticized as not actionable in practice, particularly under the conditions of a knowledge economy. Since strategic management is an applied discipline, this is a serious criticism. However, we argue that the theory-practice question is too simple. Accordingly, this paper expands the question by first outlining the theoretical criteria under which strategy theory is not actionable, and then outlining an alternative perspective on strategy knowledge in action, based upon a practice epistemology. The paper is in three sections. The first section explains two contextual conditions which impact upon strategy theory within a knowledge economy: environmental velocity and knowledge intensity. The impact of these contextual conditions upon the application of four different streams of strategy theory is examined. The second section suggests that the theoretical validity of these contextual conditions breaks down when we consider the knowledge artifacts, such as strategy tools and frameworks, which arise from strategy research. The third section proposes a practice epistemology for analyzing strategy knowledge in action that stands in contrast to more traditional arguments about actionable knowledge. From a practice perspective, strategy knowledge is argued to be actionable as part of the everyday activities of strategizing. © 2006 Elsevier Ltd. All rights reserved.

Relevance:

80.00%

Publisher:

Abstract:

There has been a revival of interest in economic techniques for measuring the value of a firm to its shareholders, notably through the use of economic value added (EVA). This technique, based upon the concept of economic value equating to total value, is founded upon the assumptions of classical liberal economic theory. Such techniques have been criticized both for the level of adjustment to published accounts needed to make them work and for their validity in actually measuring value in a meaningful way. This paper critiques economic value added as a means of calculating changes in shareholder value, contrasting it with more traditional techniques of measuring value added. It uses the company Severn Trent plc as an actual example in order to evaluate and contrast the techniques in action. The paper demonstrates discrepancies between the calculated results from using economic value added analysis and those reported using conventional accounting measures. It considers the merits of the respective techniques in explaining shareholder and managerial behaviour and the problems with using such techniques to address the wider stakeholder concept of value. It concludes that economic value added has merits when compared with traditional accounting measures of performance, but that it does not provide the panacea claimed by its proponents.
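
To make the contrast concrete, here is a minimal sketch of the economic value added calculation the abstract refers to, built on the basic intuition that value is created only when operating profit exceeds the cost of the capital employed. The figures are invented for illustration and are not Severn Trent plc data; real EVA analyses also apply the many adjustments to published accounts that the paper discusses.

```python
# Minimal sketch of the economic value added (EVA) calculation.
# EVA = NOPAT - (invested capital x WACC): profit after tax minus a
# charge for the capital tied up in the business. All figures invented.

def economic_value_added(nopat: float, invested_capital: float, wacc: float) -> float:
    """Return EVA given net operating profit after tax (NOPAT),
    invested capital and the weighted average cost of capital (WACC)."""
    capital_charge = invested_capital * wacc
    return nopat - capital_charge

nopat = 320.0              # NOPAT, in millions (hypothetical)
invested_capital = 4500.0  # capital employed, in millions (hypothetical)
wacc = 0.065               # weighted average cost of capital (hypothetical)

eva = economic_value_added(nopat, invested_capital, wacc)
print(f"capital charge: {invested_capital * wacc:.1f}m")  # 292.5m
print(f"EVA: {eva:.1f}m")  # 27.5m earned above the cost of capital
```

A conventional accounting measure would report the full 320.0m of operating profit here, while EVA reports only the 27.5m earned above the cost of capital; discrepancies of this kind between the two reporting approaches are what the paper examines.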

Relevance:

80.00%

Publisher:

Abstract:

This research concerns information systems and information systems development. The thesis describes an approach to information systems development called Multiview, a methodology which seeks to combine the strengths of a number of different existing approaches in a coherent manner. Many of these approaches are radically different in terms of concepts, philosophy, assumptions, methods, techniques and tools. Three case studies are described, presenting Multiview 'in action'. The first is used mainly to expose the strengths and weaknesses of an early version of the approach discussed in the thesis. Tools and techniques are described which aim to strengthen the approach. Two further case studies are presented to illustrate the use of this second version of Multiview. Multiview is not put forward as an 'ideal methodology', and the case studies expose some of the difficulties and practical problems of information systems work and of using the methodology. A more contingency-based approach to information systems development is advocated, using Multiview as a framework rather than a prescriptive tool. Each information systems project, and each use of the framework, is unique, contingent on the particular problem situation. The skills of different analysts, the backgrounds of users and the situations in which they are constrained to work always have to be taken into account in any project. The realities of the situation will cause departures from the 'ideal methodology' in order to allow for the exigencies of the real world. Multiview can therefore be said to be an approach used to explore the application area in order to develop an information system.

Relevance:

80.00%

Publisher:

Abstract:

Recent advances in our ability to watch the molecular and cellular processes of life in action, such as atomic force microscopy, optical tweezers and Förster fluorescence resonance energy transfer, raise challenges for digital signal processing (DSP) of the resulting experimental data. This article explores the unique properties of such biophysical time series that set them apart from other signals, such as the prevalence of abrupt jumps and steps, multi-modal distributions and autocorrelated noise. It exposes the problems with classical linear DSP algorithms applied to this kind of data, and describes new nonlinear and non-Gaussian algorithms that are able to extract information that is of direct relevance to biological physicists. It is argued that these new methods applied in this context typify the nascent field of biophysical DSP. Practical experimental examples are supplied.
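
To illustrate one of the abstract's claims, the sketch below shows why a classical linear filter struggles with step-like biophysical signals while a simple nonlinear alternative does not: a moving average smears an abrupt jump across the whole window, whereas a running median keeps the edge sharp. This is a generic illustration of the linear-versus-nonlinear point, not one of the specific algorithms described in the article.

```python
# Sketch: linear vs. nonlinear filtering of a step-like biophysical signal.
# A moving average (linear) blurs an abrupt jump over ~window samples;
# a running median (nonlinear) preserves the step edge. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 200
signal = np.where(np.arange(n) < n // 2, 0.0, 1.0)  # single abrupt step
noisy = signal + 0.15 * rng.standard_normal(n)

window = 21
half = window // 2
padded = np.pad(noisy, half, mode="edge")

# Linear filter: moving average ("valid" mode returns exactly n samples).
moving_avg = np.convolve(padded, np.ones(window) / window, mode="valid")

# Nonlinear filter: running median over the same window.
running_med = np.array([np.median(padded[i:i + window]) for i in range(n)])

# The largest single-sample change shows how sharply each output steps:
print("max jump, moving average :", np.abs(np.diff(moving_avg)).max())
print("max jump, running median :", np.abs(np.diff(running_med)).max())
```

Running the sketch shows the moving average's largest single-sample change is roughly the step height divided by the window length, while the running median's remains a large fraction of the true step, which is why nonlinear methods suit jump-and-step signals of this kind.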

Relevance:

80.00%

Publisher:

Abstract:

Requirements are sensitive to the context in which the system-to-be must operate. Where such context is well-understood and is static or evolves slowly, existing RE techniques can be made to work well. Increasingly, however, development projects are being challenged to build systems to operate in contexts that are volatile over short periods in ways that are imperfectly understood. Such systems need to be able to adapt to new environmental contexts dynamically, but the contextual uncertainty that demands this self-adaptive ability makes it hard to formulate, validate and manage their requirements. Different contexts may demand different requirements trade-offs. Unanticipated contexts may even lead to entirely new requirements. To help counter this uncertainty, we argue that requirements for self-adaptive systems should be run-time entities that can be reasoned over in order to understand the extent to which they are being satisfied and to support adaptation decisions that can take advantage of the systems' self-adaptive machinery. We take our inspiration from the fact that explicit, abstract representations of software architectures used to be considered design-time-only entities but computational reflection showed that architectural concerns could be represented at run-time too, helping systems to dynamically reconfigure themselves according to changing context. We propose to use analogous mechanisms to achieve requirements reflection. In this paper we discuss the ideas that support requirements reflection as a means to articulate some of the outstanding research challenges.
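
As a toy illustration of what requirements as run-time entities could look like, the sketch below represents each requirement as an object the running system can query for its current satisfaction level and use to trigger adaptation. All class names, monitors and thresholds are invented for this sketch; the paper proposes the idea of requirements reflection, not this particular API.

```python
# Sketch of a "requirement as a run-time entity": the system can query how
# well each requirement is satisfied in the current context and feed
# violations to its adaptation machinery. Names/thresholds are invented.
from dataclasses import dataclass
from typing import Callable

@dataclass
class RuntimeRequirement:
    name: str
    # Monitor maps live system state to a satisfaction level in [0, 1].
    monitor: Callable[[dict], float]
    threshold: float = 0.8

    def satisfied(self, state: dict) -> bool:
        return self.monitor(state) >= self.threshold

requirements = [
    RuntimeRequirement(
        "respond within 200 ms",
        monitor=lambda s: min(1.0, 0.2 / max(s["latency_s"], 1e-9)),
    ),
    RuntimeRequirement(
        "keep energy use low",
        monitor=lambda s: 1.0 - min(1.0, s["power_w"] / 10.0),
        threshold=0.5,
    ),
]

state = {"latency_s": 0.35, "power_w": 4.0}  # observed in a new context
for req in requirements:
    if not req.satisfied(state):
        # A self-adaptive system would hand this to its adaptation machinery,
        # e.g. to renegotiate the trade-off between the two requirements.
        print(f"requirement violated: {req.name}")
```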

Relevance:

80.00%

Publisher:

Abstract:

Increasingly, software systems are required to survive variations in their execution environment with little or no human intervention. Such systems are called "eternal software systems". In contrast to the traditional view of development and execution as separate cycles, these modern software systems should not present such a separation. Research in MDE has been primarily concerned with the use of models during the first cycle, development (i.e. during design, implementation, and deployment), and has shown excellent results. In this paper the author argues that an eternal software system must have a first-class representation of itself available to enable change. These runtime representations (or runtime models) will depend on the kind of dynamic changes that we want to make available during execution, or on the kind of analysis we want the system to support. Hence, different models can be conceived. Self-representation inevitably implies the use of reflection. In this paper the author briefly summarizes research that supports the use of runtime models, and points out different issues and research questions. © 2009 IEEE.
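
To give a flavour of the runtime-model argument, here is a minimal sketch of a causally connected self-representation: the model exposes the running component's configuration as a first-class object, and edits to the model change the component's behaviour without redeployment. The design and names are invented for illustration and are not drawn from the paper.

```python
# Sketch of a causally connected runtime model: the model is a first-class
# representation of the running component, and changes made through the
# model are reflected back into the component. Names are illustrative.

class Component:
    def __init__(self, **config):
        self._config = dict(config)

    def behave(self) -> str:
        return f"running in {self._config['mode']} mode"

class RuntimeModel:
    """First-class self-representation that stays causally connected."""
    def __init__(self, component: Component):
        self._component = component

    def view(self) -> dict:
        # Introspection: read the component's current configuration.
        return dict(self._component._config)

    def change(self, key: str, value) -> None:
        # Adaptation via the model: the running component changes too.
        self._component._config[key] = value

comp = Component(mode="high-throughput")
model = RuntimeModel(comp)
print(comp.behave())               # running in high-throughput mode
model.change("mode", "low-power")  # adapt during execution, no redeployment
print(comp.behave())               # running in low-power mode
```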

Relevance:

80.00%

Publisher:

Abstract:

Intercultural communication in the global environment frequently involves recourse to translation. This generates new phenomena which, in turn, raise new questions for translation theory and practice. This issue is concerned with the concept of the hybrid text as one of these phenomena. In this introductory chapter, a hybrid text is defined as "a text that results from a translation process. It shows features that somehow seem 'out of place'/'strange'/'unusual' for the receiving culture, i.e. the target culture". It is important, however, to differentiate between the true hybrid, which is the result of positive authorial and/or translatorial decisions, and the inadequate text which exhibits features of translationese, resulting from a lack of competence. Textual, contextual and social features of hybrid texts are postulated (see discussion paper). These are the object of critical reflection in subsequent chapters, in relation to different genres. The potential of the hybrid text for translation research is explored.

Relevance:

80.00%

Publisher:

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized in large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging 'in-cell NMR' techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins and ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular toolbox, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for 'integrative structural biology research'. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, coaxing challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance:

80.00%

Publisher:

Abstract:

Over the centuries, women have always played a significant part in translation practice, training, and theoretical reflection. In fact, translation (and interpreting) have often been characterized as a feminine occupation. This chapter looks at these three aspects predominantly from a quantitative perspective. In terms of the profession, it investigates the distribution of male and female translators and interpreters in the United Kingdom and the subject areas in which they work. For women's contribution to the academic discipline of Translation Studies, it investigates the number of female authors who have contributed publications to the discipline and asks whether female scholars focus on specific topics. Finally, it investigates leadership roles of women in professional associations. The paper concludes by reflecting on the potential significance of such studies. © 2013.

Relevance:

80.00%

Publisher:

Abstract:

Background - This review provides a worked example of 'best fit' framework synthesis using the Theoretical Domains Framework (TDF) of health psychology theories as an a priori framework in the synthesis of qualitative evidence. Framework synthesis works best with 'policy urgent' questions. Objective - The review question selected was: what are patients' experiences of prevention programmes for cardiovascular disease (CVD) and diabetes? The significance of these conditions is clear: CVD claims more deaths worldwide than any other condition, and diabetes is a risk factor for CVD and a leading cause of death. Method - A systematic review and framework synthesis were conducted. This novel method for synthesizing qualitative evidence aims to make health psychology theory accessible to implementation science and to advance the application of qualitative research findings in evidence-based healthcare. Results - Findings from 14 original studies were coded deductively into the TDF, and subsequently an inductive thematic analysis was conducted. Synthesized findings produced six themes relating to: knowledge, beliefs, cues to (in)action, social influences, role and identity, and context. A conceptual model was generated illustrating combinations of factors that produce cues to (in)action. This model demonstrated interrelationships between individual (beliefs and knowledge) and societal (social influences, role and identity, context) factors. Conclusion - Several intervention points were highlighted where factors could be manipulated to produce favourable cues to action. However, a lack of transparency in the behavioural components of published interventions needs to be corrected, and further evaluations of acceptability in relation to patient experience are required. Further work is needed to test the comprehensiveness of the TDF as an a priori framework for 'policy urgent' questions using 'best fit' framework synthesis.
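
For readers unfamiliar with the mechanics of 'best fit' framework synthesis, the sketch below caricatures its two-stage logic: extracted findings are first coded deductively against a priori framework domains (a small subset of TDF domains is shown), and anything that does not fit is set aside for the inductive thematic analysis. The findings, domain subset and keyword rules are invented for illustration; real framework synthesis relies on researcher judgement, not keyword matching.

```python
# Sketch of the deductive "best fit" coding step: assign extracted findings
# to a priori framework domains; hold back anything that does not fit for
# inductive thematic analysis. Data and rules are invented for illustration.

TDF_DOMAINS = {
    "Knowledge": ["know", "information", "understanding"],
    "Beliefs about consequences": ["risk", "benefit", "outcome"],
    "Social influences": ["family", "peer", "support"],
}

findings = [
    "Patients lacked information about their diabetes risk.",
    "Family support encouraged attendance at the programme.",
    "Participants disliked the clinic's opening hours.",
]

coded, unmatched = {}, []
for finding in findings:
    text = finding.lower()
    for domain, keywords in TDF_DOMAINS.items():
        if any(k in text for k in keywords):
            coded.setdefault(domain, []).append(finding)
            break
    else:
        unmatched.append(finding)  # goes to inductive thematic analysis

print(coded)
print("for inductive analysis:", unmatched)
```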

Relevance:

50.00%

Publisher:

Abstract:

Germany's latest attempt at unification raises again the question of German nationhood and nationality. The present study examines the links between the development of the German language and the political history of Germany, principally in the nineteenth and twentieth centuries. By examining the role of language in the establishment and exercise of political power and in the creation of national and group solidarity in Germany, the study both provides insights into the nature of language as political action and contributes to the socio-cultural history of the German language. The language-theoretical hypothesis on which the study is based sees language as a central factor in political action, and opposes the notion that language is a reflection of underlying political 'realities' which exist independently of language. Language is viewed as language-in-text which performs identifiable functions. Following Leech, five functions are distinguished, two of which (the regulative and the phatic) are regarded as central to political processes. The phatic function is tested against the role of the German language as a creator and symbol of national identity, with particular attention being paid to concepts of the 'purity' of the language. The regulative function (under which a persuasive function is also subsumed) is illustrated using the examples of German fascist discourse and selected cases from German history post-1945. In addition, the interactions between language change and socio-economic change are examined by postulating that language change is both a condition and a consequence of socio-economic change, in that socio-economic change both requires and conditions changes in the communicative environment. Finally, three politicolinguistic case studies from the eighth and ninth decades of the twentieth century are introduced in order to demonstrate specific ways in which language has been deployed in an attempt to create political realities, thus verifying the initial hypothesis of the centrality of language to the political process.

Relevance:

50.00%

Publisher:

Abstract:

This hands-on, practical guide for ESL/EFL teachers and teacher educators outlines, for those who are new to doing action research, what it is and how it works. Straightforward and reader friendly, it introduces the concepts and offers a step-by-step guide to going through an action research process, including illustrations drawn widely from international contexts. Specifically, the text addresses:

• action research and how it differs from other forms of research
• the steps involved in developing an action research project
• ways of developing a research focus
• methods of data collection
• approaches to data analysis
• making sense of action research for further classroom action.

Each chapter includes a variety of pedagogical activities:

• Pre-Reading Questions ask readers to consider what they already know about the topic
• Reflection Points invite readers to think about/discuss what they have read
• Action Points ask readers to carry out action-research tasks based on what they have read
• Classroom Voices illustrate aspects of action research from teachers internationally
• Summary Points provide a synopsis of the main points in the chapter.