902 results for Role models
Abstract:
Two experiments were undertaken with 3 goals: (a) to determine whether manipulating the desirability of including empathy as part of one's gender-role identity motivates accurate mind-reading, (b) to ascertain whether target readability moderates the strength of this effect, and (c) to test whether these effects are mediated by the complexity of perceivers' inferential strategies. Participants viewed videotapes of 2 couples discussing relationship problems and attempted to infer each partner's thoughts and feelings. Both experiments demonstrated that motivation improved accuracy when male and female perceivers valued the empathy-relevant aspects of the traditional female gender role. However, as predicted, high levels of motivation facilitated the accurate reading of easy targets but not of difficult targets. Several mediational models were tested, the results of which showed that the complexity of perceivers' attributions mediated the link between motivation and mind-reading accuracy.
Abstract:
This chapter argues that creative, innovative organizations are places where there is a firm and shared belief among most members in an inspirational vision of what the organization is trying to achieve. There is a high level of interaction, discussion, constructive debate, and influence among the members of the organization as they go about their work. Trust, cooperative orientations, and a sense of interpersonal safety characterize interpersonal and intergroup relationships. Members of the organization, particularly those at the upper echelons (and there are few echelons) are consistently positive and open to members' ideas for new and improved ways of working, providing both encouragement and the resources for innovation. Creativity is heralded as key for organizational survival and success. As global economic models become the norm and competitiveness assumes an international character, leaders realize that, in order to prosper in a highly challenging environment, companies must innovate. The source of organizational innovation is unquestionably the ideas generated by individuals and teams. © 2012 Elsevier Inc. All rights reserved.
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been related to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations of the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts focus away from implementation details to the whole view of the system and its runtime changes, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
Abstract:
This paper proposes an integrative framework for the conduct of a more thorough and robust analysis regarding the linkage between Human Resource Management (HRM) and business performance. In order to provide the required basis for the proposed framework, initially, the core aspects of the main HRM models predicting business performance are analysed. The framework proposes both the principle of mediation (i.e. HRM outcomes mediate the relationship between organisational strategies and business performance) and the perspective of simultaneity of decision-making by firms with regard to the consideration of business strategies and HRM policies. In order to empirically test this framework the methodological approach of 'structural equation models' is employed. The empirical research is based on a sample of 178 organisations operating in the Greek manufacturing sector. The paper concludes that both the mediation principle and the simultaneity perspective are supported, emphasising further the positive role of HRM outcomes towards organisational performance.
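The mediation principle the framework tests (HRM outcomes mediating the link between organisational strategies and business performance) can be illustrated with a toy product-of-coefficients check. A full analysis would use structural equation models as the paper does, but the core idea reduces to three ordinary least-squares regressions; all variable names, coefficients and data below are hypothetical, invented for illustration.

```python
import numpy as np

def ols(X, y):
    """Least-squares coefficients, intercept first."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic stand-ins for the survey measures; n matches the 178 organisations.
rng = np.random.default_rng(0)
n = 178
strategy = rng.normal(size=n)                             # organisational strategy score
hrm = 0.6 * strategy + rng.normal(scale=0.5, size=n)      # HRM outcomes (mediator)
perf = 0.7 * hrm + 0.1 * strategy + rng.normal(scale=0.5, size=n)  # performance

c = ols(strategy, perf)[1]                         # total effect of strategy
a = ols(strategy, hrm)[1]                          # strategy -> mediator path
full = ols(np.column_stack([hrm, strategy]), perf)
b, c_direct = full[1], full[2]                     # mediator effect, direct effect
indirect = a * b                                   # mediated (indirect) effect
# In linear OLS the decomposition is exact: total = direct + indirect
print(round(indirect, 3), round(c - c_direct, 3))
```

A nonzero indirect effect alongside a shrunken direct effect is the pattern the mediation principle predicts; structural equation modelling estimates all three paths simultaneously rather than piecewise as here.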
Abstract:
Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organization. Achieving the desired level of commitment from top management can, however, be a difficulty. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues.
It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed. This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. 
Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected the production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with and can be instrumental in moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based.
Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern. Together, however, they also argue for, or demonstrate, the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.
Abstract:
Oxygen is a crucial molecule for cellular function. When oxygen demand exceeds supply, the oxygen sensing pathway centred on the hypoxia inducible factor (HIF) is switched on and promotes adaptation to hypoxia by up-regulating genes involved in angiogenesis, erythropoiesis and glycolysis. The regulation of HIF is tightly modulated through intricate regulatory mechanisms. Notably, its protein stability is controlled by the oxygen sensing prolyl hydroxylase domain (PHD) enzymes and its transcriptional activity is controlled by the asparaginyl hydroxylase FIH (factor inhibiting HIF-1). To probe the complexity of hypoxia-induced HIF signalling, efforts in mathematical modelling of the pathway have been underway for around a decade. In this paper, we review the existing mathematical models developed to describe and explain specific behaviours of the HIF pathway and how they have contributed new insights into our understanding of the network. Topics for modelling included the switch-like response to decreased oxygen gradient, the role of microenvironmental factors, the regulation by FIH and the temporal dynamics of the HIF response. We will also discuss the technical aspects, extent and limitations of these models. Recently, the HIF pathway has been implicated in other disease contexts such as hypoxic inflammation and cancer through crosstalk with pathways like NF-κB and mTOR. We will examine how future mathematical modelling and simulation of interlinked networks can aid in understanding HIF behaviour in complex pathophysiological situations. Ultimately this would allow the identification of new pharmacological targets in different disease settings.
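The switch-like response to falling oxygen that several of the reviewed models address can be sketched with a single steady-state expression in which PHD-mediated degradation of HIF follows Hill kinetics. All parameter values below are illustrative placeholders, not taken from any of the reviewed models.

```python
def hif_steady_state(o2, k_prod=1.0, k_deg_max=10.0, K=0.3, n=4):
    """Steady-state HIF level at a given (normalised) oxygen tension.

    PHD-mediated degradation rises with oxygen via Hill kinetics; a small
    basal turnover term keeps HIF bounded under full hypoxia. Parameters
    are illustrative only.
    """
    k_deg = k_deg_max * o2**n / (K**n + o2**n) + 0.1  # oxygen-dependent + basal
    return k_prod / k_deg                              # production / degradation

for o2 in (0.05, 0.2, 1.0):
    print(f"O2={o2:>4}: HIF={hif_steady_state(o2):.2f}")
```

With a Hill coefficient above 1, the degradation rate drops sharply once oxygen falls below the threshold K, so HIF accumulates abruptly rather than gradually, which is the switch-like behaviour the models aim to reproduce.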
Abstract:
Uncertainty can be defined as the difference between information that is represented in an executing system and the information that is both measurable and available about the system at a certain point in its life-time. A software system can be exposed to multiple sources of uncertainty produced by, for example, ambiguous requirements and unpredictable execution environments. A runtime model is a dynamic knowledge base that abstracts useful information about the system, its operational context and the extent to which the system meets its stakeholders' needs. A software system can successfully operate in multiple dynamic contexts by using runtime models that augment information available at design-time with information monitored at runtime. This chapter explores the role of runtime models as a means to cope with uncertainty. To this end, we introduce a well-suited terminology about models, runtime models and uncertainty and present a state-of-the-art summary on model-based techniques for addressing uncertainty both at development time and at runtime. Using a case study about robot systems we discuss how current techniques and the MAPE-K loop can be used together to tackle uncertainty. Furthermore, we propose possible extensions of the MAPE-K loop architecture with runtime models to further handle uncertainty at runtime. The chapter concludes by identifying key challenges and enabling technologies for using runtime models to address uncertainty, and also identifies closely related research communities that can foster ideas for resolving the challenges raised. © 2014 Springer International Publishing.
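The MAPE-K loop mentioned above (Monitor, Analyse, Plan, Execute over shared Knowledge) can be sketched minimally in code, with the runtime model playing the role of the Knowledge. The robot-battery scenario, thresholds and action names below are hypothetical, chosen only to echo the chapter's robot case study.

```python
# Minimal MAPE-K sketch: a runtime model (the shared Knowledge) is updated by
# Monitoring, consulted by Analysis and Planning, and acted on by Execution.
class RuntimeModel:
    """Dynamic knowledge base abstracting the system and its context."""
    def __init__(self):
        self.state = {"battery": 1.0, "goal": "patrol"}

def monitor(model, battery_reading):
    # Augment design-time assumptions with information observed at runtime
    model.state["battery"] = battery_reading

def analyse(model):
    # Detect a deviation between assumed and observed context (uncertainty)
    return model.state["battery"] < 0.2

def plan(model):
    # Select an adaptation when analysis flags a problem
    return "dock_and_charge" if analyse(model) else model.state["goal"]

def execute(action):
    return f"executing {action}"

model = RuntimeModel()
monitor(model, 0.15)          # a low-battery reading arrives at runtime
print(execute(plan(model)))   # -> executing dock_and_charge
```

The point of the sketch is that every phase reads from or writes to the same runtime model, which is what lets monitored information override stale design-time assumptions.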
Abstract:
This paper describes physics of nonlinear ultra-short laser pulse propagation affected by plasma created by the pulse itself. Major applications are also discussed. Nonlinear propagation of the femtosecond laser pulses in gaseous and solid transparent dielectric media is a fundamental physical phenomenon in a wide range of important applications such as laser lidars, laser micro-machining (ablation) and microfabrication, etc. These applications require very high intensity of the laser field, typically 10¹³–10¹⁵ W/cm². Such high intensity leads to significant ionisation and creation of electron-ion or electron-hole plasma. The presence of plasma results in significant multiphoton and plasma absorption and plasma defocusing. Consequently, the propagation effects appear extremely complex and result from competitive counteraction of the above listed effects and Kerr effect, diffraction and dispersion. The theoretical models used for consistent description of laser-plasma interaction during femtosecond laser pulse propagation are derived and discussed. It turns out that the strongly nonlinear effects such as self-focusing followed by pulse splitting are essential. These phenomena feature extremely complex dynamics of both the electromagnetic field and plasma density with different spatio-temporal structures evolving at the same time. Some numerical approaches capable of handling all these complications are also discussed. © 2006 American Institute of Physics
Abstract:
Individuals within the aged population show an increased susceptibility to infection, implying a decline in immune function, a phenomenon known as immunosenescence. Paradoxically, an increase in autoimmune disease, such as rheumatoid arthritis, is also associated with ageing, therefore some aspects of the immune system appear to be inappropriately active in the elderly. The above evidence suggests inappropriate control of the immune system as we age. Macrophages, and their precursors monocytes, play a key role in control of the immune system. They play an important role in host defence in the form of phagocytosis, and also link the innate and adaptive immune system via antigen presentation. Macrophages also have a reparative role, as professional phagocytes of dead and dying cells. Clearance of apoptotic cells by macrophages has also been shown to directly influence immune responses in an anti-inflammatory manner. Inappropriate control of macrophage function with regards to dead cell clearance may contribute to pathology as we age. The aims of this study were to assess the impact of lipid treatment, as a model of the aged environment, on the ability of macrophages to interact with, and respond to, apoptotic cells. Using a series of in vitro cell models, responses of macrophages (normal and lipid-loaded) to apoptotic macrophages (normal and lipid-loaded) were investigated. Monocyte recruitment to apoptotic cells, a key process in resolving inflammation, was assessed in addition to cytokine responses. Data here shows, for the first time, that apoptotic macrophages (normal and lipid-loaded) induce inflammation in human monocyte-derived macrophages, a response that could drive inflammation in age-associated pathology e.g. atherosclerosis. 
Monoclonal antibody inhibition studies suggest the classical chemokine CX3CL1 may be involved in monocyte recruitment to apoptotic macrophages, but not apoptotic foam cells, therefore differential clearance strategies may be employed following lipid-loading. CD14, an important apoptotic cell tethering receptor, was not found to have a prominent role in this process, whilst the role for ICAM-3 remains unclear. Additionally, a small pilot study using macrophages from young (<25) and mid-life (>40) donors was undertaken. Preliminary data was gathered to assess the ability of primary human monocyte-derived macrophages, from young and mid-life donors, to interact with, and respond to, apoptotic cells. MØ from mid-life individuals showed no significant differences in their ability to respond to immune modulation by apoptotic cells compared to MØ from young donors. Larger cohorts would be required to investigate whether immune modulation of MØ by apoptotic cells contributes to inflammatory pathology throughout ageing.
Abstract:
Although the existence of halogenated lipids in lower organisms has been known for many years, it is only since the 1990s that interest in their occurrence in mammalian systems has developed. Chlorinated (and other halogenated) lipids can arise from oxidation by hypohalous acids, such as HOCl, which are products of the phagocytic enzyme myeloperoxidase and are generated during inflammation. The major species of chlorinated lipids investigated to date are chlorinated sterols, fatty acid and phospholipid chlorohydrins, and α-chloro fatty aldehydes. While all of these chlorinated lipids have been shown to be produced in model systems from lipoproteins to cells subjected to oxidative stress, as yet only α-chloro fatty aldehydes, such as 2-chlorohexadecanal, have been detected in clinical samples or animal models of disease. α-Chloro fatty aldehydes and chlorohydrins have been found to have a number of potentially pro-inflammatory effects ranging from toxicity to inhibition of nitric oxide synthesis and upregulation of vascular adhesion molecules. Thus evidence is building for a role of chlorinated lipids in inflammatory disease, although much more research is required to establish the contributions of specific compounds in different disease pathologies. Preventing chlorinated lipid formation, and indeed other HOCl-induced damage, via the inhibition of myeloperoxidase is an area of growing interest and may lead in the future to anti-myeloperoxidase-based anti-inflammatory therapy. However, other chlorinated lipids, such as punaglandins, have beneficial effects that could offer novel therapies for cancer.
Abstract:
Heme-oxygenases (HOs) catalyze the conversion of heme into carbon monoxide and biliverdin. HO-1 is induced during hypoxia, ischemia/reperfusion, and inflammation, providing cytoprotection and inhibiting leukocyte migration to inflammatory sites. Although in vitro studies have suggested an additional role for HO-1 in angiogenesis, the relevance of this in vivo remains unknown. We investigated the involvement of HO-1 in angiogenesis in vitro and in vivo. Vascular endothelial growth factor (VEGF) induced prolonged HO-1 expression and activity in human endothelial cells and HO-1 inhibition abrogated VEGF-driven angiogenesis. Two murine models of angiogenesis were used: (1) angiogenesis initiated by addition of VEGF to Matrigel and (2) a lipopolysaccharide (LPS)-induced model of inflammatory angiogenesis in which angiogenesis is secondary to leukocyte invasion. Pharmacologic inhibition of HO-1 induced marked leukocytic infiltration that enhanced VEGF-induced angiogenesis. However, in the presence of an anti-CD18 monoclonal antibody (mAb) to block leukocyte migration, VEGF-induced angiogenesis was significantly inhibited by HO-1 antagonists. Furthermore, in the LPS-induced model of inflammatory angiogenesis, induction of HO-1 with cobalt protoporphyrin significantly inhibited leukocyte invasion into LPS-conditioned Matrigel and thus prevented the subsequent angiogenesis. We therefore propose that during chronic inflammation HO-1 has 2 roles: first, an anti-inflammatory action inhibiting leukocyte infiltration; and second, promotion of VEGF-driven noninflammatory angiogenesis that facilitates tissue repair.
Abstract:
This article reflects on the UK coalition government’s ‘alternative models’ agenda, specifically in terms of the adoption of new models of service delivery by arm’s-length bodies (ALBs). It provides an overview of the alternative models agenda and discusses barriers to implementation. These include practical challenges involved in the set up of alternative models, the role of sponsor departments, and the effective communication of best practice. Finally, the article highlights some issues for further discussion.
Abstract:
In the global economy, innovation is one of the most important competitive assets for companies willing to compete in international markets. As competition moves from standardised products to customised ones, depending on each specific market's needs, economies of scale are no longer the only winning strategy. Innovation requires firms to establish processes to acquire and absorb new knowledge, leading to the recent theory of Open Innovation. Knowledge sharing and acquisition happen when firms are embedded in networks with other firms, universities, institutions and many other economic actors. Several typologies of innovation and firm networks have been identified, with various geographical spans. One of the first to be modelled was the Industrial Cluster (or in Italian Distretto Industriale), which was long considered the benchmark for innovation and economic development. Other kinds of networks have been modelled since the late 1970s; Regional Innovation Systems represent one of the latest and most widespread models of innovation networks, specifically introduced to combine local networks and the global economy. This model has been exploited qualitatively since its introduction but, together with National Innovation Systems, it is among the most inspiring for policy makers, who often cite it, not always properly. The aim of this research is to set up an econometric model describing Regional Innovation Systems, making it one of the first attempts to test and enhance this theory with a quantitative approach. A dataset of 104 observations of secondary and primary data from European regions was built in order to run a multiple linear regression, testing if Regional Innovation Systems are really correlated with regional innovation and regional innovation in cooperation with foreign partners.
Furthermore, an exploratory multiple linear regression was performed to verify which variables, among those describing a Regional Innovation System, are the most significant for innovating, alone or with foreign partners. In addition, the effectiveness of present innovation policies has been tested based on the findings of the econometric model. The developed model confirmed the role of Regional Innovation Systems for creating innovation, even in cooperation with international partners: this represents one of the first quantitative confirmations of a theory previously based on qualitative models only. The results of this model also confirmed a minor influence of National Innovation Systems: comparing the analysis of existing innovation policies, at both regional and national level, with our findings revealed the need for a potentially pivotal change in the direction currently followed by policy makers. Last, while confirming the role of the presence of a learning environment in a region and the catalyst role of regional administration, this research offers a potential new perspective for the whole private sector in creating a Regional Innovation System.
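The kind of multiple linear regression described above can be sketched in a few lines of ordinary least squares. The snippet below fits a synthetic "regional innovation" outcome on three invented RIS indicators across 104 regions (matching the dataset size, but with all variable names, coefficients and data fabricated for illustration).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 104  # number of regional observations, as in the thesis dataset
# Hypothetical RIS indicators, e.g. R&D spend, university density, networking:
X = rng.normal(size=(n, 3))
# Synthetic outcome driven mainly by the first two indicators plus noise:
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(scale=0.3, size=n)

X1 = np.column_stack([np.ones(n), X])        # design matrix with intercept
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
resid = y - X1 @ beta
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print("coefficients:", beta.round(2), "R^2:", round(r2, 2))
```

Comparing the fitted coefficients against their standard errors is what lets the exploratory regression rank which RIS variables matter most; a real analysis would of course add significance tests and diagnostics for multicollinearity.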
Abstract:
The notion of a model of development and distribution of software (MDDS) is introduced and its role in the efficiency of software products is stressed. Two classical MDDS are presented and some attempts to adapt them to contemporary trends in web-based software design are described. Advantages and shortcomings of the resulting models are outlined. In conclusion, the desired features of a better MDDS for web-based solutions are given.