Abstract:
Previous developments in the opportunism-independent theory of the firm are either restricted to special cases or are derived from the capabilities or resource-based perspective. However, a more general opportunism-independent approach can be developed, based on the work of Demsetz and Coase, which is nevertheless contractual in nature. This depends on 'direction', that is, deriving economic value by permitting one set of actors to direct the activities of another, and of non-human factors of production. Direction helps to explain not only firm boundaries and organisation, but also the existence of firms, without appealing to opportunism or moral hazard. The paper also considers the extent to which it is meaningful to speak of 'contractual' theories in the absence of opportunism, and whether this analysis can be extended beyond the employment contract to encompass ownership of assets by the firm. © The Author 2005. Published by Oxford University Press on behalf of the Cambridge Political Economy Society. All rights reserved.
Abstract:
Enterprise Risk Management (ERM) and Knowledge Management (KM) both encompass top-down and bottom-up approaches to developing and embedding risk knowledge concepts and processes in strategy, policies, risk appetite definition, the decision-making process and business processes. The capacity to transfer risk knowledge affects all stakeholders, and understanding the risk knowledge about the enterprise's value is a key requirement for identifying protection strategies for business sustainability. Various factors affect this capacity for transfer and understanding. Previous work has established that there is a difference between the influence of KM variables on Risk Control and on the perceived value of ERM. Communication among groups appears as a significant variable in improving Risk Control but only as a weak factor in improving the perceived value of ERM. However, the ERM mandate requires for its implementation a clear understanding of risk management (RM) policies, actions and results, and the use of the integral view of RM as a governance and compliance program to support the value-driven management of the organization. Furthermore, ERM implementation demands better capabilities for unifying the criteria of risk analysis and for aligning policies and protection guidelines across the organization. These capabilities can be affected by risk knowledge sharing between the RM group and the Board of Directors and other executives in the organization. This research presents an exploratory analysis of risk knowledge transfer variables used in risk management practice. A survey of risk management executives from 65 firms in various industries was undertaken and 108 responses were analyzed. Potential relationships among the variables are investigated using descriptive statistics and multivariate statistical models. 
The level of understanding of risk management policies and reports by the board is related to the quality of the flow of communication in the firm and to the perceived level of integration of the risk policy in the business processes.
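The multivariate step described above can be illustrated with a toy correlation analysis. The variable names mirror those discussed in the abstract, but the Likert-style data and the choice of plain Pearson correlations are illustrative assumptions only, not the study's actual instrument:

```python
import numpy as np

# Hypothetical Likert-scale (1-5) responses from risk executives; rows
# are respondents, columns are three of the variables named in the
# abstract. All values are invented for illustration.
responses = np.array([
    # [communication_flow, policy_integration, board_understanding]
    [4, 5, 5],
    [3, 3, 3],
    [5, 4, 5],
    [2, 2, 1],
    [4, 4, 4],
    [1, 2, 2],
])

# Pairwise Pearson correlations among the three variables.
corr = np.corrcoef(responses, rowvar=False)

# The abstract's finding: board understanding tracks communication
# quality and policy integration, so both correlations come out strong.
print(corr[2, 0], corr[2, 1])
```

With survey data of this kind, a correlation matrix is the usual first pass before fitting the multivariate models the abstract mentions.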
Abstract:
This work explores the relevance of semantic and linguistic description to translation theory and practice, and aims at a practical model for approaching texts to be translated. As literary texts (mainly poetry) are the focus of attention, so are stylistic matters; note, however, that 'style', and, to some extent, the conclusions of the work, are not limited to so-called literary texts. The study of semantic description reveals that most translation problems stem not from the cognitive (langue-related) but from the contextual (parole-related) aspects of meaning. Thus, any linguistic model that fails to account for the latter is bound to fall short. Transformational-Generative Grammar does, whereas Systemics, concerned with both the 'langue' and 'parole' (mainly stylistic and sociolinguistic) aspects of meaning, provides a useful framework for approaching texts to be translated. Two essential semantic principles for translation are that meaning is the property of a language (Firth) and the 'relativity of meaning assignments' (Tymoczko). Both imply that meaning can only be assessed correctly against the relevant socio-cultural background. Translation is seen as a restricted creation, and the translator's approach as a three-dimensional critical one. To encompass the most technical to the most literary text, and to account for variations in emphasis within any text, translation theory must be based on a typology of functions: Halliday's ideational, interpersonal and textual functions, or Bühler's symbol, signal and symptom. Function, overall and specific, will dictate aims and method, and also provide the critic with criteria to assess translation faithfulness. Translation can never be reduced to purely objective methods, however: intuitive procedures intervene in textual interpretation and analysis, in the choice of equivalents, and in the reception of a translation. 
Ultimately, translation theory and practice may perhaps constitute the touchstone for the validity of linguistic and semantic theories.
Abstract:
The research investigates the processes of adoption and implementation, by organisations, of computer aided production management systems (CAPM). It is organised around two different theoretical perspectives. The first part is informed by the Rogers model of the diffusion, adoption and implementation of innovations, and the second part by a social constructionist approach to technology. Rogers' work is critically evaluated, and a model of adoption and implementation is distilled from it and applied to a set of empirical case studies. In the light of the case study data, strengths and weaknesses of the model are identified. It is argued that the model is too rational and linear to provide an adequate explanation of adoption processes. It is useful for understanding processes of implementation but requires further development. The model is not able to adequately encompass complex computer-based technologies. However, the idea of 'reinvention' is identified as Rogers' key concept, though it needs to be conceptually extended. Both Rogers' model and the definitions of CAPM found in the production engineering literature tend to treat CAPM in objectivist terms. The problems with this view are addressed through a review of the literature on the sociology of technology, and it is argued that a social constructionist approach offers a more useful framework for understanding CAPM: its nature, adoption, implementation, and use. CAPM, it is argued, must be understood in terms of the ways in which it is constituted in discourse, as part of a 'struggle for meaning' on the part of academics, professional engineers, suppliers, and users.
Abstract:
This thesis is concerned with the investigation, by nuclear magnetic resonance spectroscopy, of the molecular interactions occurring in mixtures of benzene and cyclohexane to which either chloroform or deutero-chloroform has been added. The effect of the added polar molecule on the liquid structure has been studied using spin-lattice relaxation time, 1H chemical shift, and nuclear Overhauser effect measurements. The main purpose of the work has been to validate a model for molecular interaction involving local ordering of benzene around chloroform. A chemical method for removing dissolved oxygen from samples has been developed to encompass a number of types of sample, including quantitative mixtures, and its superiority over the conventional deoxygenation technique is shown. A set of spectrometer conditions, the use of which produces minimal variation in peak height in the steady state, is presented. To separate the general diluting effects of deutero-chloroform from its effects due to the production of local order, a series of mixtures involving carbon tetrachloride, instead of deutero-chloroform, has been used as a non-interacting reference. The effect of molecular interaction is shown to be explainable using a solvation model, whilst an approach involving 1:1 complex formation is shown not to account for the observations. It is calculated that each solvation shell, based on deutero-chloroform, contains about twelve molecules of benzene or cyclohexane. The equations produced to account for the T1 variations have been adapted to account for the 1H chemical shift variations in the same system. The shift measurements are shown to substantiate the solvent cage model with a cage capacity of twelve molecules around each chloroform molecule. Nuclear Overhauser effect data have been analysed quantitatively in a manner consistent with the solvation model. The results show that discrete shells only exist when the mole fraction of deutero-chloroform is below about 0.08.
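The reported cutoff is consistent with simple packing arithmetic: if each deutero-chloroform molecule binds a discrete cage of about twelve solvent molecules, non-overlapping cages require at least twelve solvent molecules per chloroform molecule. A minimal back-of-envelope check of that limiting mole fraction (an illustration, not the thesis's own derivation):

```python
# Back-of-envelope check of the solvation-cage picture: each CDCl3
# molecule is surrounded by ~12 benzene/cyclohexane molecules, so
# discrete (non-overlapping) cages need at least 12 solvent molecules
# per chloroform molecule.
cage_size = 12

# Limiting CDCl3 mole fraction: 1 chloroform per (1 + cage_size) molecules.
x_limit = 1 / (1 + cage_size)

print(round(x_limit, 3))  # ~0.077, consistent with the observed ~0.08
```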
Abstract:
Rotating fluidised beds offer the potential for high-intensity combustion, large turndown and an extended range of fluidising velocity due to the imposition of an artificial gravitational field. Low thermal capacity should also allow rapid response to load changes. This thesis describes investigations of the validity of these potential virtues. Experiments, at atmospheric pressure, were conducted in flow visualisation rigs and a combustor designed to accommodate a distributor of 200 mm diameter and 80 mm axial length. Ancillary experiments were conducted in a 6" diameter conventional fluidised bed. The investigations encompassed assessment of: fluidisation and elutriation; coal feed requirements; start-up and steady-state combustion using premixed propane and air; transition from propane to coal combustion; and mechanical design. Assessments were made of an elutriation model and of some effects of particle size on the combustion of premixed fuel gas and air. The findings were: a) More reliable start-up and control methods must be developed. Combustion of premixed propane and air led to severe mechanical and operating problems, and manual control of coal combustion was inadequate. b) Design criteria must encompass pressure loss, mechanical strength and high-temperature resistance. The flow characteristics of ancillaries and the distributor must be matched. c) Fluidisation of a range of particle sizes was investigated, and new correlations for minimum fluidisation and fully supported velocities are proposed. Some effects on elutriation of particle size and of the distance between the bed surface and the exhaust port have been identified. A conic distributor did not aid initial bed distribution; furthermore, airflow instability was encountered with this distributor shape, and future use of conic distributors is not recommended. Axial solids mixing was found to be poor. A coal feeder was developed which produced uniform fuel distribution throughout the bed. 
The report concludes that small scale inhibits development of mechanical design and exploration of performance. Future research requires larger combustors and automatic control.
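The thesis's own minimum-fluidisation correlations are not reproduced in the abstract. As a baseline of the kind such correlations are compared against, the classical Wen and Yu correlation can be sketched; the particle and gas properties below are illustrative values, not data from the thesis:

```python
import math

def u_mf_wen_yu(d_p, rho_p, rho_g=1.2, mu_g=1.8e-5, g=9.81):
    """Minimum fluidisation velocity (m/s) from the classical Wen & Yu
    correlation, shown here only as the standard baseline.
    d_p: particle diameter (m); rho_p: particle density (kg/m^3);
    rho_g, mu_g: gas density (kg/m^3) and viscosity (Pa.s)."""
    # Archimedes number for the particle/gas pair.
    ar = rho_g * (rho_p - rho_g) * g * d_p**3 / mu_g**2
    # Wen & Yu: Re_mf = sqrt(33.7^2 + 0.0408 * Ar) - 33.7
    re_mf = math.sqrt(33.7**2 + 0.0408 * ar) - 33.7
    return re_mf * mu_g / (rho_g * d_p)

# Example: 1 mm sand-like particles in ambient air.
print(u_mf_wen_yu(1e-3, 2600))
```

In a rotating bed the artificial gravitational field replaces g, which is one reason the conventional correlations need re-deriving, as the thesis sets out to do.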
A generic study of the theatre director (metteur en scène): the director's emergence and modern and contemporary role
Abstract:
The theatre director (metteur en scène in French) is a relatively new figure in theatre practice. It was not until the 1820s that the term 'mise en scène' gained currency, and the term 'director' was not in general use until the 1880s. The emergence and the role of the director have been considered from a variety of perspectives: the history of theatre (Allevy, Jomaron, Sarrazac, Viala, Biet and Triau); the history of directing (Chinoy and Cole, Boll, Veinstein, Roubine); semiotic approaches to directing (Whitmore, Miller, Pavis); the semiotics of performance (De Marinis); generic approaches to the mise en scène (Thomasseau, Banu); post-dramatic approaches to theatre (Lehmann); and approaches to performance process and the specifics of rehearsal methodology (Bradby and Williams, Giannachi and Luckhurst, Picon-Vallin, Styan). What the scholarly literature has not done so far is to map the parameters necessarily involved in the directing process, to incorporate an analysis of the emergence of the theatre director during the modern period, and to consider its impact on contemporary performance practice. Directing relates primarily to the making of the performance guided by a director, a single figure charged with the authority to make binding artistic decisions. Each director may have her/his own personal approaches to the process of preparation prior to a show; this is exemplified by the variety of terms now used to describe the role and function of directing, from producer to facilitator or outside eye. However, it is essential at the outset to make two observations, each of which contributes to a justification for a generic analysis (as opposed to a genetic approach). Firstly, a director does not work alone, and cooperation with others is involved at all stages of the process. Secondly, beyond individual variation, the role of the director remains twofold. 
The first is to guide the actors (meneur de jeu, directeur d'acteurs, coach); the second is to make a visual representation in the performance space (set designer, stage designer, costume designer, lighting designer, scénographe). The increasing place of scenography has led contemporary theatre directors such as Wilson, Castellucci and Fabre to produce performances in which the performance space becomes a semiotic dimension that displaces the primacy of the text. The play is not, therefore, the sole artistic vehicle for directing. This definition of directing obviously calls for a definition of what the making of the performance might be. The thesis defines the making of the performance as the activity of bringing about a social event, by at least one performer, that provides visual and/or textual meaning in a performance space. This definition enables us to evaluate four consistent parameters throughout theatre history: first, the social aspect associated with the performance event; second, the devising process, which may be based on visual and/or textual elements; third, the presence of at least one performer in the show; fourth, the performance space (which is not simply related to the theatre stage). Although the thesis focuses primarily on theatre practice, such a definition blurs the boundaries between theatre and other collaborative artistic disciplines (cinema, opera, music and dance). These parameters illustrate the possibility of undertaking a generic analysis of directing, and resonate with the historical, political and artistic dimensions considered. 
Such a generic perspective on the role of the director addresses three significant questions: a historical question: how and why did the director emerge?; a sociopolitical question: how and why was the director a catalyst for the politicisation of theatre, and subsequently a contributor to the rise of State-funded theatre policy?; and an artistic one: how and why has the director changed theatre practice and theory in the twentieth century? Directing for the theatre as an artistic activity is a historically situated phenomenon. It would seem only natural from a contemporary perspective to associate the activity of directing with the function of the director. This is relativised, however, by the question of how the performance was produced before the modern period. The thesis demonstrates that the rise of the director is a progressive and historical phenomenon (Dort) rather than a mere invention (Viala, Sarrazac). A chronological analysis of the making of the performance throughout theatre history is the most useful way to open the study. In order to understand the emergence of the director, the research methodology assesses the interconnection of the four parameters above throughout four main periods of theatre history: the beginning of the Renaissance (meneur de jeu), the classical age (actor-manager and stage designer-manager), the modern period (director) and the contemporary period (director-facilitator, performer). This allows us properly to appraise the progressive emergence of the director, as well as to make an analysis of her/his modern and contemporary role. The first chapter argues that the physical separation between the performance space and its audience, which appeared in the early fifteenth century, has been a crucial feature in the scenographic, aesthetic, political and social organisation of the performance. 
At the end of the Middle Ages, French farces which raised socio-political issues (see Bakhtin) made a clear division on a single outdoor stage (tréteau) between the actors and the spectators, while religious plays (drame liturgique, mystère) were mostly performed in various open-air, multi-space settings. As long as the performance was liturgical or religious, and therefore confined within an acceptable framework, it was allowed. At the time, the French ecclesiastical and civil authorities tried, on several occasions, to prohibit staged performances. As a result, practitioners developed non-official indoor spaces, the Théâtre de la Trinité (1398) being the first French indoor theatre recognised by scholars. This self-exclusion from the open public space involved practitioners (e.g. Les Confrères de la Passion) breaking the accepted rules, in terms of themes but also through individual input into a secular performance rather than the repetition of commonly known religious canvases. These developments heralded the authorised theatres that began to emerge from the mid-sixteenth century, which in some cases were subsidised in their construction. The construction of authorised indoor theatres, associated with the development of printing, led to a considerable increase in the production of dramatic texts for the stage. Profoundly affecting the reception of the dramatic text by the audience, the distance between the stage and the auditorium accompanied the changing relationship between practitioners and spectators. This distance gave rise to a major development of the role of the actor and of the stage designer. The second chapter looks at the significance of both the actor and the set designer in the devising process of the performance from the sixteenth century to the end of the nineteenth century. 
The actor underwent an important shift in function in this period, from the delivery of an unwritten text learned in the medieval oral tradition to the structured improvisation produced by the commedia dell'arte. In this new form of theatre, a chef de troupe or an experienced actor shaped the story, but the text existed only through the improvisation of the actors. The preparation of those performances was, moreover, centred on acting technique and the individual skills of the actor. From this point, there is clear evidence that acting began to be the subject of a number of studies in the mid-sixteenth century, and more significantly in the seventeenth century, in Italy and France. This is revealed through the implementation of a system of notes written by the playwright to the actors (stage directions) in a range of plays (Gérard de Vivier, Comédie de la Fidélité Nuptiale, 1577). The thesis also focuses on Leone de' Sommi (Quattro dialoghi, 1556 or 1565), who wrote about actors' techniques and introduced the meneur de jeu in Italy. The actor-manager (meneur de jeu), a professional actor whom scholars have compared to the director (see Strihan), trained the actors. Nothing, however, indicates that the actor-manager was directing the visual representation of the text in the performance space. From the end of the sixteenth century, the dramatic text began to dominate the process of the performance and led to an expansion of acting techniques, such as declamation. Stage designers came from outside the theatre tradition and played a decisive role in the staging of religious celebrations (e.g. Actes des Apôtres, 1536). In the sixteenth century, both the proscenium arch and the borders, incorporated in the architecture of the new indoor theatres (théâtre à l'italienne), contributed to creating all kinds of illusions on the stage, principally the revival of perspective. This chapter shows ongoing audience demands for more elaborate visual effects on the stage. 
This led, throughout the classical age, and even more so during the eighteenth century, to the stage design practitioner being granted a major role in the making of the performance (see Ciceri). The second chapter demonstrates that the guidance of the actors and the scenographic conception, which are the artistic components of the role of the director, appear to have developed independently from one another until the nineteenth century. The third chapter investigates the emergence of the director per se. The causes of this have been considered by a number of scholars, who have mainly identified two: the influence of Naturalism (illustrated by the Meiningen Company, Antoine, and Stanislavski) and the invention of electric lighting. The influence of the Naturalist movement on the emergence of the modern director in the late nineteenth century is often considered a radical factor in the history of theatre practice. Naturalism undoubtedly contributed to changes in staging, costume and lighting design, and to a more rigorous commitment to the harmonisation and visualisation of the overall production of the play. Although the art of theatre was dependent on the dramatic text, scholars (Osborne) demonstrate that the Naturalist directors of the late nineteenth century did not strictly follow the playwright's indications written in the play. On the other hand, the main characteristic of directing in Naturalism at that time depended on a comprehensive understanding of the scenography, which had to respond to the requirements of verisimilitude. Electric lighting contributed to this by allowing for the construction of a visual narrative on stage. However, it was a master technician, rather than an emergent director, who was responsible for key operational decisions over how to use this emerging technology in venues such as the new Bayreuth theatre in 1876. 
Electric lighting reflects a normal technological evolution and cannot be considered one of the main causes of the emergence of the director. Two further causes of the emergence of the director, not considered in previous studies, are the invention of cinema and the Symbolist movement (Lugné-Poe, Meyerhold). Cinema had an important technological influence on the practitioners of the Naturalist movement. In order to achieve a photographic truth on the stage (tableau, image), Naturalist directors strove to decorate the stage with the detailed elements that would be expected if the situation were happening in reality. Film production had an influence on the work of actors (Walter). The filmmaker took on a primary role in the making of the film, as the source of the script, the filming process and the editing of the film, and this role influenced the conception that theatre directors had of their own work and, in turn, the development of the theatre director. As for the Symbolist movement, the director's approach was to dematerialise the text of the playwright, trying to expose the spirit, movement, colour and rhythm of the text. The Symbolists therefore disengaged themselves from the material aspect of the production, and contributed to giving greater artistic autonomy to the role of the director. Although the emergence of the director finds its roots amongst the Naturalist practitioners (through a rigorous attempt to provide a strict visual interpretation of the text on stage), the Symbolist director heralded the modern perspective on the making of performance. The emergence of the director significantly changed theatre practice and theory. For instance, the rehearsal period became a clear work in progress, a platform for both developing practitioners' techniques and staging the show. 
This chapter explores and contrasts several practitioners' methods based on the two aspects proposed for the definition of the director (guidance of the actors and materialisation of a visual space). The fourth chapter argues that the role of the director became stronger, more prominent, and more hierarchical, through a more political and didactic approach to theatre, as exemplified by the cases of France and Germany at the end of the nineteenth century and through the First World War. This didactic perspective on theatre defines the notion of political theatre. Political theatre is often approached in the literature (Esslin, Willett) through a Marxist interpretation of the great German directors' productions (Reinhardt, Piscator, Brecht). These directors certainly had a great influence on many directors after the Second World War, such as Jean Vilar, Judith Malina, Jean-Louis Barrault, Roger Planchon, Augusto Boal, and others. This chapter demonstrates, moreover, that the director was confirmed through both ontological and educational approaches to the process of making the performance, and consequently became a central and paternal figure in the organisational and structural processes practised within her/his theatre company. In this way, the stance taken by the director influenced the State authorities in establishing theatrical policy. This is an entirely novel scholarly contribution to the study of the director. The German and French States were not indifferent to the development of political theatre. A network of public theatres was thus developed in the inter-war period, and more significantly after the Second World War. The fifth chapter shows how State theatre policies have their sources in the development of political theatre, and more specifically in the German theatre trade union movement (Volksbühne) and the great directors at the end of the nineteenth century. 
French political theatre was more influenced by playwrights and actors (Romain Rolland, Louise Michel, Louis Lumet, Émile Berny). French theatre policy was based primarily on theatre directors who decentralised their activities in France during both the inter-war period and the German occupation. After the Second World War, the government established, through directors, a strong network of public theatres. Directors became both the artistic director and the executive director of those institutionalised theatres. The institution was, however, seriously shaken by the social and political upheaval of 1968. It is the link between the State and the institution in which established directors were entangled that was challenged by the young emerging directors who, in the 1960s, rejected institutionalised responsibility in favour of the autonomy of the artist. This process is elucidated in chapter five. The final chapter defines the contemporary role of the director by contrasting the work of a number of significant young theatre practitioners of the 1960s, such as Peter Brook, Ariane Mnouchkine, The Living Theater, Jerzy Grotowski, Augusto Boal and Eugenio Barba, all of whom decided early on to detach their companies from any form of public funding. This chapter also demonstrates how they promoted new forms of performance, such as the performance of the self. First, these practitioners explored new performance spaces outside the traditional theatre building. Producing performances in a non-dedicated theatre place (warehouse, street, etc.) was a more frequent practice in the 1960s than before. However, the recent development of cybertheatre has, since the 1990s, questioned both the separation of the audience from the practitioners and the place of the director's role. Secondly, the role of the director has been multifaceted since the 1960s. 
On the one hand, those directors, despite all their different working methods, explored Western and non-Western acting techniques based on both personal input and collective creation. They challenged theatrical conventions of both the character and the process of making the performance. On the other hand, recent observations and studies distinguish the two main functions of the director, the acting coach and the scénographe, both of which have found new developments in cinema, television, and various other events. Thirdly, the contemporary director challenges the performance of the text. In this sense, Antonin Artaud was a visionary: his theatre illustrates the need to consider the totality of the text, as well as that of theatrical production. By contrasting the theories of Artaud, based on a non-dramatic form of theatre, with one of his plays (Le Jet de Sang), this chapter demonstrates how Artaud examined the process of making the performance as a performance. Live art and autobiographical performance, both taken as directing the self, reinforce this suggestion. Finally, since the 1990s, autobiographical performance, or the performance of the self, has been a growing practical and theoretical perspective in both performance studies and psychology-related studies. This relates to the premise that each individual makes a representation (through memory, interpretation, etc.) of her/his own life (performativity). This last section explores the links between the place of the director in contemporary theatre and performers in autobiographical practices. The role of the traditional actor is challenged through non-identification with the character in the play, while performers (such as Chris Burden, Ron Athey, Orlan, Franko B, Stelarc) have, likewise, explored their own story/life as a performance. The thesis demonstrates the validity of the four parameters (performer, performance space, devising process, social event) defining a generic approach to the director. 
A generic perspective on the role of the director would encompass: a historical dimension relative to the reasons for and stages of the 'emergence' of the director; a socio-political analysis concerning the relationship between the director, her/his institutionalisation, and the political realm; and the relationship between performance theory, practice and the contemporary role of the director. Such a generic approach is a new departure in theatre research and might resonate in the study of other collaborative artistic practices.
Abstract:
The work presented in this thesis concerns itself with the application of Demand Side Management (DSM) by industrial subsector as applied to the UK electricity industry. A review of the origins of DSM in the US and the relevance of experience gained to the UK electricity industry is made. Reviews are also made of the current status of the UK electricity industry, the regulatory system, and the potential role of DSM within the prevalent industry environment. A financial appraisal of DSM in respect of the distribution business of a Regional Electricity Company (REC) is also made. This financial appraisal highlights the economic viability of DSM within the context of the current UK electricity industry. The background of the work presented above is then followed by the construction of a framework detailing the necessary requirements for expanding the commercial role of DSM to encompass benefits for the supply business of a REC. The derived framework is then applied, in part, to the UK ceramics manufacturing industry, and in full to the UK sanitaryware manufacturing industry. The application of the framework to the UK sanitaryware manufacturing industry has required the undertaking of a unique first-order energy audit of every such manufacturing site in the UK. As such the audit has revealed previously unknown data on the timings and magnitude of electricity demand and consumption attributable to end-use manufacturing technologies and processes. The audit also served to reveal the disparity in the attitudes toward energy services, and thus by implication towards DSM, of manufacturers within the same Standard Industrial Classification (SIC) code. In response to this, attempt is made to identify the underlying drivers which could cause this variation in attitude. 
A novel approach to the market segmentation of the companies within the UK ceramics manufacturing sector has been utilised to classify these companies in terms of their likelihood to participate in DSM programmes through the derived Energy Services approach. The market segmentation technique, although requiring further development to progress from a research-based concept, highlights the necessity to look beyond the purely energy-based needs of manufacturing industries when considering the utilisation of the Energy Services approach to facilitate DSM programmes.
Abstract:
Despite abundant literature on human behaviour in the face of danger, much remains to be discovered. Some descriptive models of behaviour in the face of danger are reviewed in order to identify areas where documentation is lacking. It is argued that little is known about the recognition and assessment of danger, and yet these are important aspects of cognitive processes. Speculative arguments about hazard assessment are reviewed and tested against the results of previous studies. Once hypotheses are formulated, the reasons for retaining the repertory grid as the main research instrument are outlined, and the choice of data analysis techniques is described. Whilst all samples used repertory grids, the rating scales differed between samples; therefore, an analysis is performed of the way in which rating scales were used in the various samples and of some reasons why the scales were used differently. Then, individual grids are examined and compared across respondents within each sample; consensus grids are also discussed. The major results from all samples are then contrasted and compared. It was hypothesized that hazard assessment would encompass three main dimensions, i.e. 'controllability', 'severity of consequences' and 'likelihood of occurrence', which would emerge in that order. The results suggest that these dimensions are but facets of two broader dimensions, labelled 'scope of human intervention' and 'dangerousness'. It seems that these two dimensions encompass a number of more specific dimensions, some of which can be further fragmented. Thus, hazard assessment appears to be a more complex process about which much remains to be discovered. Some of the ways in which further discovery might proceed are discussed.
Abstract:
Employment generating public works (EGPW) are an important part of GoTL’s strategy to reduce unemployment, underemployment and poverty, and to contribute to social stability. The term EGPW is used in this report as a generic term to encompass labour-intensive (LI) and labour-based (LB) approaches. The distinction between these approaches is made below. SEFOPE is being supported by a number of international agencies to develop and implement employment generating public works programmes (EGPWPs). Other government ministries and agencies and NGOs offering different wage rates are also engaged in such programmes and projects. In setting wage rates for such programmes, it is necessary to take account of (a) the nature of the benefits they offer (e.g. the balance between employment creation and effective use of labour); (b) the beneficiaries to be targeted; and (c) any adverse impacts on other economic activities. The purposes of this assignment are: (a) to make recommendations on appropriate wage rates for unskilled casual employment on public works programmes, and (b) to make a broad assessment of the labour supply response to the employment opportunities created by employment intensive programmes. The latter would help in gauging the scale of such activities required.
Abstract:
The use of spreadsheets has become routine in all aspects of business with usage growing across a range of functional areas and a continuing trend towards end user spreadsheet development. However, several studies have raised concerns about the accuracy of spreadsheet models in general, and of end user developed applications in particular, raising the risk element for users. High error rates have been discovered, even though the users/developers were confident that their spreadsheets were correct. The lack of an easy to use, context-sensitive validation methodology has been highlighted as a significant contributor to the problems of accuracy. This paper describes experiences in using a practical, contingency factor-based methodology for validation of spreadsheet-based DSS. Because the end user is often both the system developer and a stakeholder, the contingency factor-based validation methodology may need to be used in more than one way. The methodology can also be extended to encompass other DSS.
Abstract:
Although slow waves of the electroencephalogram (EEG) have been associated with attentional processes, the functional significance of the alpha component in the EEG (8.1–12 Hz) remains uncertain. Conventionally, synchronisation in the alpha frequency range is taken to be a marker of cognitive inactivity, i.e. ‘cortical idling’. However, it has been suggested that alpha may index the active inhibition of sensory information during internally directed attentional tasks such as mental imagery. More recently, this idea has been amended to encompass the notion of alpha synchronisation as a means of inhibiting non-task-relevant cortical areas irrespective of the direction of attention. Here we test the adequacy of the idling hypothesis and the two inhibition hypotheses of alpha. In two experiments we investigated the relation between alpha and internally vs. externally directed attention using mental imagery vs. sensory-intake paradigms. Results from both experiments showed a clear relationship between alpha and both attentional factors and increased task demands. At various scalp sites alpha amplitudes were greater during internally directed attention and during increased load, results incompatible with alpha reflecting cortical idling and more in keeping with suggestions of active inhibition necessary for internally driven mental operations.
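The abstract does not describe the analysis pipeline, but the quantity at issue, amplitude in the 8.1–12 Hz alpha band, can be illustrated with a minimal sketch. The sampling rate, epoch length, and FFT-based band-amplitude measure below are assumptions for illustration only, not details from the study.

```python
import numpy as np

FS = 250        # sampling rate in Hz (assumed)
DURATION = 4    # epoch length in seconds (assumed)

def band_amplitude(signal, fs, low, high):
    """Mean spectral amplitude of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

# Synthetic epoch: a 10 Hz "alpha" oscillation plus low-level noise.
rng = np.random.default_rng(0)
t = np.arange(0, DURATION, 1.0 / FS)
eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

alpha = band_amplitude(eeg, FS, 8.1, 12.0)   # alpha band from the abstract
beta = band_amplitude(eeg, FS, 13.0, 30.0)   # comparison band (assumed)
```

On this synthetic epoch the alpha-band amplitude dominates the comparison band, which is the kind of contrast (alpha amplitude across conditions and scalp sites) the experiments compare.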
Abstract:
The following paper surveys the opportunities for applying QR codes in museums and exhibitions through the example of the Hungarian Museum of Environmental Protection and Water Management (Esztergom, Hungary). Besides providing interactivity in the museum for the mobile phone generation through a device and a method that they are already familiar with, it is important to explain how and why it is worthwhile to “adorn” the exhibits with these codes. In this paper we also touch upon the technical issues of how an existing mobile phone application can be incorporated into and used for the presentation of the museum.
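A typical way to "adorn" exhibits with QR codes is to encode a per-exhibit URL that a visitor's phone resolves to the exhibit's content. The base URL, exhibit identifiers, and language parameter below are hypothetical illustrations, not details from the paper; generating the actual QR image from the payload string would be delegated to any standard QR library.

```python
from urllib.parse import quote, urlencode

# Hypothetical base URL for the museum's exhibit pages (assumption).
BASE_URL = "https://example-museum.hu/exhibits"

def exhibit_qr_payload(exhibit_id: str, lang: str = "en") -> str:
    """Return the URL a visitor's phone opens after scanning the code."""
    query = urlencode({"lang": lang})
    return f"{BASE_URL}/{quote(exhibit_id)}?{query}"

# Example: a code for one exhibit, served in Hungarian.
payload = exhibit_qr_payload("steam-pump-1877", lang="hu")
```

Keeping the payload a plain URL means any QR-capable phone works, while a dedicated museum application (as discussed in the paper) can intercept the same URLs to offer richer presentation.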
Abstract:
Human Resource (HR) systems and practices generally referred to as High Performance Work Practices (HPWPs) (Huselid, 1995), sometimes termed High Commitment Work Practices or High Involvement Work Practices, have attracted much research attention in past decades. Although many conceptualizations of the construct have been proposed, there is general agreement that HPWPs encompass a bundle or set of HR practices including sophisticated staffing, intensive training and development, incentive-based compensation, performance management, initiatives aimed at increasing employee participation and involvement, job safety and security, and work design (e.g. Pfeffer, 1998). It is argued that these practices either directly or indirectly influence the extent to which employees’ knowledge, skills, abilities, and other characteristics are utilized in the organization. Research spanning nearly 20 years has provided considerable empirical evidence for relationships between HPWPs and various measures of performance including increased productivity, improved customer service, and reduced turnover (e.g. Guthrie, 2001; Belt & Giles, 2009). With the exception of a few papers (e.g., Laursen & Foss, 2003), this literature appears to lack focus on how HPWPs influence or foster innovation-related attitudes and behaviours, extra-role behaviours, and performance. This situation exists despite the vast evidence demonstrating the importance of innovation, proactivity, and creativity in its various forms to individual, group, and organizational performance outcomes. Several pertinent issues arise when considering HPWPs and their relationship to innovation and performance outcomes. At a broad level is the issue of which HPWPs are related to which innovation-related variables. Another issue not well identified in research relates to employees’ perceptions of HPWPs: does an employee actually perceive the HPWP–outcomes relationship? 
No matter how well HPWPs are designed, if they are not perceived and experienced by employees to be effective or worthwhile then their likely success in achieving positive outcomes is limited. At another level, research needs to consider the mechanisms through which HPWPs influence innovation and performance. The research question here relates to what possible mediating variables are important to the success or failure of HPWPs in impacting innovative behaviours and attitudes, and what are the potential process considerations? These questions call for theory refinement and the development of more comprehensive models of the HPWP-innovation/performance relationship that include intermediate linkages and boundary conditions (Ferris, Hochwarter, Buckley, Harrell-Cook, & Frink, 1999). While there are many calls for this type of research to be made a high priority, to date, researchers have made few inroads into answering these questions. This symposium brings together researchers from Australia, Europe, Asia and Africa to examine these various questions relating to the HPWP-innovation-performance relationship. Each paper discusses a HPWP and potential variables that can facilitate or hinder the effects of these practices on innovation- and performance-related outcomes. The first paper by Johnston and Becker explores HPWPs in relation to work design in a disaster response organization that shifts quickly from business as usual to rapid response. The researchers examine how the enactment of the organizational response is devolved to groups and individuals. Moreover, they assess motivational characteristics that exist in dual work designs (normal operations and periods of disaster activation) and the implications for innovation. The second paper by Jørgensen reports the results of an investigation into training and development practices and innovative work behaviors (IWBs) in Danish organizations. 
Research on how to design and implement training and development initiatives to support IWBs and innovation in general is surprisingly scant and often vague. This research investigates the mechanisms by which training and development initiatives influence employee behaviors associated with innovation, and provides insights into how training and development can be used effectively by firms to attract and retain valuable human capital in knowledge-intensive firms. The next two papers in this symposium consider the role of employee perceptions of HPWPs and their relationships to innovation-related variables and performance. First, Bish and Newton examine perceptions of the characteristics and awareness of occupational health and safety (OHS) practices and their relationship to individual-level adaptability and proactivity in an Australian public service organization. The authors explore the role of perceived supportive and visionary leadership and its impact on the OHS policy-adaptability/proactivity relationship. The study highlights the positive main effects of awareness and characteristics of OHS policies, and supportive and visionary leadership, on individual adaptability and proactivity. It also highlights the important moderating effects of leadership in the OHS policy-adaptability/proactivity relationship. Okhawere and Davis present a conceptual model developed for a Nigerian study in the safety-critical oil and gas industry that takes a multi-level approach to the HPWP-safety relationship. Adopting a social exchange perspective, they propose that at the organizational level, organizational climate for safety mediates the relationship between enacted HPWSs and organizational safety performance (prescribed and extra-role performance). At the individual level, the experience of HPWPs impacts on individual behaviors and attitudes in organizations, here operationalized as safety knowledge, skills and motivation, and these influence individual safety performance. 
However, these latter relationships are moderated by organizational climate for safety. A positive organizational climate for safety strengthens the relationship between individual safety behaviors and attitudes and individual-level safety performance, therefore suggesting a cross-level boundary condition. The model includes both safety performance (behaviors) and organizational-level safety outcomes, operationalized as accidents, injuries, and fatalities. The final paper of this symposium by Zhang and Liu explores leader development and the relationship between transformational leadership and employee creativity and innovation in China. The authors further develop a model that incorporates the effects of extrinsic motivation (pay for performance: PFP) and employee collectivism in the leader-employee creativity relationship. The paper’s contributions include the incorporation of a PFP effect on creativity as moderator, rather than predictor as in most studies; the exploration of the PFP effect from both fairness and strength perspectives; and the advancement of knowledge on the impact of collectivism on the leader-employee creativity link. Last, this is the first study to examine three-way interactional effects among leader-member exchange (LMX), PFP and collectivism, thus enriching our understanding of how to promote employee creativity. In conclusion, this symposium draws upon the findings of four empirical studies and one conceptual study to provide insight into how different variables facilitate or potentially hinder the influence of various HPWPs on innovation and performance. We will propose a number of questions for further consideration and discussion. The symposium will address the Conference Theme of ‘Capitalism in Question’ by highlighting how HPWPs can promote the financial health and performance of organizations while maintaining a high level of regard and respect for employees and organizational stakeholders. 
Furthermore, the focus on different countries and cultures explores the overall research question in relation to different modes or stages of development of capitalism.
Abstract:
Purpose - It is important to advance operations management (OM) knowledge while being mindful of the theoretical developments of the discipline. The purpose of this paper is to explore which theoretical perspectives have dominated the OM field. This analysis allows the authors to identify theory trends and gaps in the literature and to identify fruitful areas for future research. A reflection on theory is also practical, given that it guides research toward important questions and enlightens OM practitioners. Design/methodology/approach - The authors provide an analysis of OM theory developments in the last 30 years. The study encompasses three decades of OM publications across three OM journals and contains an analysis of over 3,000 articles so as to identify which theories, over time, have been adopted by authors in order to understand OM topics. Findings - The authors find that the majority of studies are atheoretical, empirical, and focussed upon theory testing rather than on theory development. Some theories, such as the resource-based view and contingency theory, have an enduring relevance within OM. The authors also identify theories from psychology, economics, sociology, and organizational behavior that may, in the future, have salience to explain burgeoning OM research areas such as servitization and sustainability. Research limitations/implications - The study makes a novel contribution by exploring which main theories have been adopted or developed in OM, doing so by systematically analyzing articles from the three main journals in the field (the Journal of Operations Management, Production and Operations Management, and the International Journal of Operations and Production Management), which encompass three decades of OM publications. Because the study focuses on these three journals, the authors may have missed important OM articles published elsewhere. 
Practical implications - A reflection on theories is important because theories inform how a researcher or practicing manager interprets and solves OM problems. This study allows the authors to reflect on the collective OM journey to date, to spot trends and gaps in the literature, and to identify fruitful areas for future research. Originality/value - As far as the authors are aware, there has not been an assessment of the main theoretical perspectives in OM. The research also identifies which topics are published in OM journals, and which theories are adopted to investigate them. The authors also reflect on whether the most cited papers and those winning best paper awards are theoretical. This gives the authors a richer understanding of the current state of OM research.