908 results for Actor-network Theory


Relevance: 30.00%

Abstract:

Servitization is the process by which manufacturers add services to their product offerings and even replace products with services. The capabilities necessary to develop and deliver advanced services as part of servitization are often discussed in the literature from the manufacturer's perspective, e.g., having a service-focused culture or the ability to sell solutions. Recent research has acknowledged the important role of customers and, to a lesser extent, other actors (e.g., intermediaries) in bringing about successful servitization, particularly for use-oriented and results-oriented advanced services. The objective of this study is to identify the capabilities required to successfully develop advanced services as part of servitization by considering the perspectives of manufacturers, intermediaries and customers. This study involved interviews about servitization capabilities with 33 managers in 28 large UK-based companies from these three groups. The findings suggest that there are eight broad capabilities that are important for advanced services: 1) personnel with expertise and deep technical product knowledge, 2) methodologies for improving operational processes, helping to manage risk and reduce costs, 3) the evolution from being a product-focused manufacturer to embracing a services culture, 4) developing trusting relationships with other actors in the network to support the delivery of advanced services, 5) new innovation activities focused on financing contracts (e.g., 'gain share') and technology implementation (e.g., Web-based applications), 6) customer intimacy through understanding their business challenges in order to develop suitable solutions, 7) extensive infrastructure (e.g., personnel, service centres) to deliver a local service, and 8) the ability to tailor service offerings to each customer's requirements and deliver these responsively to changing needs. The capabilities required to develop and deliver advanced services align with the need to enhance the operational performance of supplied products throughout their lifecycles, and as such require greater investment than the capabilities for base and intermediate services.

Relevance: 30.00%

Abstract:

This paper describes research findings on the roles that organizations can adopt in managing supply networks. Drawing on extensive empirical data, it is demonstrated that organizations may be said to be able to manage supply networks, provided a broad view of ‘managing’ is adopted. Applying role theory, supply network management interventions were clustered into sets of linked activities and goals that constituted supply network management roles. Six supply network management roles were identified – innovation facilitator, co-ordinator, supply policy maker and implementer, advisor, information broker and supply network structuring agent. The findings are positioned in the wider context of debates about the meaning of management, the contribution of role theory to our understanding of management, and whether inter-organizational networks can be managed.

Relevance: 30.00%

Abstract:

This chapter introduces activity theory as an approach for studying strategy as practice. Activity theory conceptualizes the ongoing construction of activity as a product of activity systems, comprising the actor, the community with which that actor interacts, and the symbolic and material tools that mediate between actors, their community and their pursuit of activity. The focus on the mediating role of tools and cultural artefacts in human activity seems especially promising for advancing the strategy-as-practice agenda, for example as a theoretical resource for the growing interest in sociomateriality and the role of tools and artefacts in (strategy) practice (for example, Balogun et al. 2014; Lanzara 2009; Nicolini 2009; Spee and Jarzabkowski 2009; Stetsenko 2005). Despite this potential, a recent review by Vaara and Whittington (2012) identified only three strategy-as-practice articles explicitly applying an activity theory lens. In the wider area of practice-based studies in organizations, activity theory has been slightly more popular (for example, Blackler 1993; 1995; Blackler, Crump and McDonald 2000; Engeström, Kerosuo and Kajamaa 2007; Groleau 2006; Holt 2008; Miettinen and Virkkunen 2005). It still lags behind its potential, however, primarily because of its origins as a social psychology theory developed in Russia, which until recently received little recognition outside the Russian context, particularly in strategy and organization theory (Miettinen, Samra-Fredericks and Yanow 2009). This chapter explores activity theory as a resource for studying strategy as practice as it is socially accomplished by individuals in interaction with their wider social group and the artefacts of interaction. In particular, activity theory's focus on actors as social individuals provides a conceptual basis for studying the core question in strategy-as-practice research: what strategy practitioners do. The chapter is structured in three parts. First, an overview of activity theory is provided. Second, activity theory is introduced as a practice-based approach to studying organizational action, and an activity system conceptual framework is developed. Third, the elements of the activity system are explained in more detail and explicitly linked to each of the core strategy-as-practice (SAP) concepts: practitioners, practices and praxis. In doing so, links are made to existing strategy-as-practice research, with brief empirical examples of topics that might be addressed using activity theory. Throughout the chapter, we introduce key authors in the development of activity theory and its use in management and adjacent disciplinary fields, as further resources for those wishing to make greater use of activity theory.

Relevance: 30.00%

Abstract:

This paper models how the structure and function of a network of firms affects their aggregate innovativeness. Each firm has the potential to innovate, either from in-house R&D or from innovation spillovers from neighboring firms. The nature of innovation spillovers depends upon network density, the commonality of knowledge between firms, and the learning capability of firms. Innovation spillovers are modeled in detail using ideas from organizational theory. Two main results emerge: (i) the marginal effect on innovativeness of spillover intensity is non-monotonic, and (ii) network density can affect innovativeness, but only when firms are heterogeneous.
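
The abstract does not reproduce the model's equations, so the following is only a minimal simulation sketch of the mechanism it describes: firms innovate in-house with some probability or catch innovations from innovating neighbours, and aggregate innovativeness is read off at the end. The function name, the network model (Erdős–Rényi) and all parameter values are illustrative assumptions, not the paper's specification.

```python
import random

def simulate(n_firms=100, density=0.1, p_inhouse=0.05,
             spillover=0.3, heterogeneous=False, steps=50, seed=1):
    """Toy model: a firm innovates in-house with prob. p_inhouse, or
    catches an innovation from each innovating neighbour with prob.
    `spillover` (scaled by learning capability). Illustrative only."""
    rng = random.Random(seed)
    # Random (Erdos-Renyi) network with the given density.
    neighbours = {i: set() for i in range(n_firms)}
    for i in range(n_firms):
        for j in range(i + 1, n_firms):
            if rng.random() < density:
                neighbours[i].add(j)
                neighbours[j].add(i)
    # Heterogeneous firms differ in learning capability.
    learn = [rng.uniform(0.5, 1.5) if heterogeneous else 1.0
             for _ in range(n_firms)]
    innovated = [False] * n_firms
    for _ in range(steps):
        snapshot = list(innovated)
        for i in range(n_firms):
            if snapshot[i]:
                continue
            if rng.random() < p_inhouse:
                innovated[i] = True
                continue
            for j in neighbours[i]:
                if snapshot[j] and rng.random() < spillover * learn[i]:
                    innovated[i] = True
                    break
    return sum(innovated) / n_firms  # aggregate innovativeness

if __name__ == "__main__":
    for s in (0.0, 0.2, 0.4, 0.8):
        print(f"spillover={s:.1f}: {simulate(spillover=s):.2f}")
```

Sweeping the spillover parameter, as in the `__main__` block, is one way to probe the kind of non-monotonic marginal effect the paper reports.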

Relevance: 30.00%

Abstract:

Like classic Signal Detection Theory (SDT), the recent optimal Binary Signal Detection Theory (BSDT) and the Neural Network Assembly Memory Model (NNAMM) built on it can successfully reproduce Receiver Operating Characteristic (ROC) curves, although the BSDT/NNAMM parameters (cue intensity and neuron threshold) and the classic SDT parameters (perception distance and response bias) are essentially different. In the present work, BSDT/NNAMM optimal likelihood and posterior probabilities are analyzed analytically and used to generate ROCs and modified (posterior) mROCs, as well as optimal overall likelihood and posterior probabilities. It is shown that, for describing basic discrimination experiments in psychophysics within the BSDT, a 'neural space' can be introduced in which sensory stimuli are represented as neural codes and decision processes are defined; that the BSDT's isobias curves can simultaneously be interpreted as universal psychometric functions satisfying the Neyman-Pearson objective; that the just noticeable difference (jnd) can be defined and interpreted as an atom of experience; and that near-neutral bias values are observers' natural choice. The uniformity (or no-priming) hypothesis, concerning the 'in-mind' distribution of false-alarm probabilities during ROC or overall-probability estimation, is introduced. The BSDT's and classic SDT's sensitivity, bias, and their ROC and decision spaces are compared.
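
BSDT's own formalism is not given in the abstract, so the sketch below shows only the classic equal-variance SDT baseline against which it is compared: an ROC is traced by sweeping the response criterion across a noise distribution N(0,1) and a signal distribution N(d',1). Everything here is textbook SDT, not BSDT/NNAMM; the sensitivity value is illustrative.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def roc_point(d_prime, criterion):
    """Classic equal-variance SDT: noise ~ N(0,1), signal ~ N(d',1).
    Returns (false-alarm rate, hit rate) for a given criterion."""
    fa = 1.0 - phi(criterion)             # P(say "yes" | noise)
    hit = 1.0 - phi(criterion - d_prime)  # P(say "yes" | signal)
    return fa, hit

if __name__ == "__main__":
    d = 1.5  # illustrative sensitivity
    for c in [x * 0.5 for x in range(-4, 9)]:
        fa, hit = roc_point(d, c)
        print(f"criterion={c:+.1f}  FA={fa:.3f}  hit={hit:.3f}")
```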

Relevance: 30.00%

Abstract:

On the basis of the convolutional (Hamming) version of the recent Neural Network Assembly Memory Model (NNAMM), optimal receiver operating characteristics (ROCs) have been derived analytically for an intact two-layer autoassociative Hopfield network. A method for explicitly taking into account a priori probabilities of alternative hypotheses about the structure of the information initiating memory-trace retrieval is introduced, together with modified ROCs (mROCs: a posteriori probabilities of correct recall vs. false-alarm probability). The comparison of empirical and calculated ROCs (or mROCs) demonstrates that they coincide quantitatively, so the intensities of cues used in the corresponding experiments may be estimated. It is found that basic ROC properties, which are among the experimental findings underpinning dual-process models of recognition memory, can be explained within our one-factor NNAMM.
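
The analytical derivation is not reproduced in the abstract, but the retrieval setting it presumes can be sketched empirically: store patterns in a Hopfield-style autoassociative memory, present cues degraded to a given intensity, and count correct recalls, from which empirical ROC points could be tabulated. This is a generic Hebbian Hopfield toy, not the authors' convolutional (Hamming) formulation; sizes and cue intensities are illustrative.

```python
import random

def train(patterns):
    """Hebbian outer-product weights for +/-1 patterns, zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, cue, steps=10):
    """Synchronous threshold updates; returns the settled state."""
    s = list(cue)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(len(s))) >= 0 else -1
             for i in range(len(s))]
    return s

def noisy(p, cue_intensity, rng):
    """Keep each bit with prob. cue_intensity, flip otherwise."""
    return [b if rng.random() < cue_intensity else -b for b in p]

if __name__ == "__main__":
    rng = random.Random(0)
    n = 64
    pats = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(3)]
    w = train(pats)
    for q in (0.6, 0.8, 0.95):  # cue intensities (illustrative)
        hits = sum(recall(w, noisy(pats[0], q, rng)) == pats[0]
                   for _ in range(20))
        print(f"cue intensity {q:.2f}: correct recall {hits}/20")
```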

Relevance: 30.00%

Abstract:

The architecture and learning algorithm of a self-learning spiking neural network for fuzzy clustering tasks are outlined. Fuzzy receptive neurons for pulse-position transformation of the input data are considered. It is proposed to treat the spiking neural network with the apparatus of classical automatic control theory, based on the Laplace transform. It is shown that synapse functioning can be easily modeled by a second-order damped response unit, while the spiking neuron soma acts as a threshold detection unit. The proposed fuzzy spiking neural network is thus an analog-digital nonlinear pulse-position dynamic system. It is demonstrated how fuzzy probabilistic and possibilistic clustering approaches can be implemented on the basis of the presented spiking neural network.
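
The control-theoretic reading is concrete enough to sketch. A second-order damped response unit has (up to gain) the transfer function 1/(1 + τs)², whose impulse response, normalized to a unit peak, is the alpha function (t/τ)·e^(1−t/τ); the soma then acts as a threshold detector on the weighted sum of synaptic responses. The time constant, threshold, spike times and weights below are illustrative assumptions, not values from the paper.

```python
import math

TAU = 5.0    # synaptic time constant (ms) -- illustrative
THETA = 1.2  # soma firing threshold -- illustrative

def alpha(t, tau=TAU):
    """Impulse response of a critically damped second-order unit:
    the alpha function, peak 1 at t = tau; Laplace transfer function
    proportional to 1/(1 + tau*s)**2."""
    return (t / tau) * math.exp(1.0 - t / tau) if t > 0 else 0.0

def soma_fires(spike_times, weights, t):
    """Threshold detection: membrane potential is the weighted sum of
    synaptic responses to incoming spikes."""
    u = sum(w * alpha(t - ts) for ts, w in zip(spike_times, weights))
    return u >= THETA, u

if __name__ == "__main__":
    spikes = [0.0, 2.0, 3.0]   # incoming spike times (ms)
    weights = [0.6, 0.5, 0.4]  # synaptic weights
    for t in range(0, 21, 2):
        fired, u = soma_fires(spikes, weights, float(t))
        print(f"t={t:2d} ms  u={u:.3f}  fired={fired}")
```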

Relevance: 30.00%

Abstract:

There have been many functional imaging studies of the brain basis of theory of mind (ToM) skills, but the findings are heterogeneous and implicate anatomical regions as far apart as the orbitofrontal cortex and the inferior parietal lobe. The functional imaging studies are reviewed to determine whether the diverse findings are due to methodological factors. The studies are considered according to the paradigm employed (e.g., stories vs. cartoons, explicit vs. implicit ToM instructions), the mental state(s) investigated, and the language demands of the tasks. Methodological variability does not seem to account for the variation in findings, although this conclusion may partly reflect the relatively small number of studies. Alternatively, several distinct brain regions may be activated during ToM reasoning, forming an integrated functional "network." The imaging findings suggest that there are several "core" regions in the network, including parts of the prefrontal cortex and superior temporal sulcus, while several more "peripheral" regions may contribute to ToM reasoning in a manner contingent on relatively minor aspects of the ToM task. © 2008 Wiley-Liss, Inc.

Relevance: 30.00%

Abstract:

Servitization involves manufacturers developing service offerings to grow revenue and profit. Advanced services, in particular, can facilitate a more service-focused organization and significantly impact customers' business processes. However, approaches to servitization are often discussed solely from the manufacturer's perspective, overlooking the role of other network actors. Adopting a multi-actor perspective, this study investigates manufacturer, intermediary and customer perspectives to identify complementary and competing capabilities within a manufacturer's downstream network, required for advanced services. Interviews were conducted with 24 senior executives in 19 UK-based manufacturers, intermediaries and customers across multiple sectors. The study identified six key business activities, within which advanced services capabilities were grouped. The unique and critical capabilities for advanced services were identified for each actor as follows: for manufacturers, the need to balance product and service innovation, developing customer-focused through-life service methodologies, and having distinct yet synergistic product and service cultures; for intermediaries, the coordination and integration of third-party products/services; for customers, co-creating innovation and having processes that support service outsourcing. The study is unique in highlighting the distinct roles of different actors in the provision of advanced services, and shows that such services can only be developed and delivered through a combination of complex, interconnected capabilities found within a network.

Relevance: 30.00%

Abstract:

Recent advances in electronic and computer technologies have led to the wide-spread deployment of wireless sensor networks (WSNs). WSNs have a wide range of applications, including military sensing and tracking, environment monitoring, and smart environments. Many WSNs have mission-critical tasks, such as military applications, so security issues in WSNs remain at the forefront of research. Compared with other wireless networks, such as ad hoc and cellular networks, security in WSNs is more complicated due to the constrained capabilities of sensor nodes and the properties of the deployment, such as large scale and hostile environments. Security issues mainly come from attacks. In general, attacks in WSNs can be classified as external or internal. In an external attack, the attacking node is not an authorized participant of the sensor network; cryptography and other security methods can prevent some external attacks. However, node compromise, the major and unique problem that leads to internal attacks, undermines all such preventive efforts. Knowing the probability of node compromise helps systems detect and defend against it. Although there are some approaches that can be used to detect and defend against node compromise, few of them can estimate its probability. Hence, we develop basic uniform, basic gradient, intelligent uniform and intelligent gradient models of node compromise distribution, using probability theory, in order to adapt to different application environments. These models allow systems to estimate the probability of node compromise, and applying them in system security designs can improve security and decrease overheads in nearly every security area. Moreover, based on these models, we design a novel secure routing algorithm to defend against the routing security issues that come from nodes that have already been compromised but have not yet been detected by the node-compromise detection mechanism. The routing paths in our algorithm detour around nodes that have already been detected as compromised or that have larger probabilities of being compromised. Simulation results show that our algorithm is effective in protecting routing paths from node compromise, whether detected or not.
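
The four distribution models are not specified in the abstract, but the routing idea, detouring around nodes detected as compromised or estimated to have a high compromise probability, can be sketched as shortest-path search over a cost that penalizes risky relays. The cost function and all names below are assumptions for illustration, not the dissertation's algorithm.

```python
import heapq

def safest_path(adj, p_comp, src, dst, detected=frozenset()):
    """Dijkstra over a cost that adds each relay's estimated compromise
    probability; nodes already detected as compromised are skipped.
    `adj` maps node -> neighbours; `p_comp` maps node -> estimated
    compromise probability (illustrative cost model)."""
    dist = {src: 0.0}
    heap = [(0.0, src, [src])]
    seen = set()
    while heap:
        d, u, path = heapq.heappop(heap)
        if u == dst:
            return d, path
        if u in seen:
            continue
        seen.add(u)
        for v in adj[u]:
            if v in detected or v in seen:
                continue
            nd = d + 1.0 + p_comp.get(v, 0.0)  # hop cost + risk penalty
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v, path + [v]))
    return float("inf"), None

if __name__ == "__main__":
    adj = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
           "D": ["B", "C"]}
    p = {"B": 0.9, "C": 0.1, "D": 0.1}  # B looks risky
    cost, path = safest_path(adj, p, "A", "D")
    print(cost, path)  # routes A -> C -> D, detouring around B
```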

Relevance: 30.00%

Abstract:

In recent years, a surprising new phenomenon has emerged in which globally-distributed online communities collaborate to create useful and sophisticated computer software. These open source software groups are composed of generally unaffiliated individuals and organizations who work in a seemingly chaotic fashion and who participate on a voluntary basis without direct financial incentive. The purpose of this research is to investigate the relationship between the social network structure of these intriguing groups and their level of output and activity, where social network structure is defined as (1) closure or connectedness within the group, (2) bridging ties which extend outside of the group, and (3) leader centrality within the group. Based on well-tested theories of social capital and centrality in teams, propositions were formulated which suggest that social network structures associated with successful open source software project communities will exhibit high levels of bridging and moderate levels of closure and leader centrality. The research setting was the SourceForge hosting organization, and a study population of 143 project communities was identified. Independent variables included measures of closure and leader centrality defined over conversational ties, along with measures of bridging defined over membership ties. Dependent variables included source code commits and software releases for community output, and software downloads and project site page views for community activity. A cross-sectional study design was used, and archival data were extracted and aggregated for the two-year period following the first release of project software. The resulting compiled variables were analyzed using multiple linear and quadratic regressions, controlling for group size and conversational volume. Contrary to theory-based expectations, the surprising results showed that successful project groups exhibited low levels of closure and that the levels of bridging and leader centrality were not important factors of success. These findings suggest that the creation and use of open source software may represent a fundamentally new socio-technical development process which disrupts the team paradigm and which triggers the need for building new theories of collaborative development. These new theories could point towards the broader application of open source methods for the creation of knowledge-based products other than software.
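
The three structural constructs have standard graph-theoretic formulations, which can be sketched with networkx: closure as the density of ties inside the group, bridging as ties crossing the group boundary, and leader centrality as the highest degree centrality among members. The measure definitions and the toy graph below are illustrative stand-ins, not the study's operationalizations or data.

```python
import networkx as nx

def structure_measures(g, members):
    """Toy versions of the three constructs: closure = internal tie
    density, bridging = count of ties leaving the group, leader
    centrality = degree centrality of the most central member."""
    closure = nx.density(g.subgraph(members))
    bridging = sum(1 for u, v in g.edges()
                   if (u in members) != (v in members))
    cent = nx.degree_centrality(g)
    leader_cent = max(cent[m] for m in members)
    return closure, bridging, leader_cent

if __name__ == "__main__":
    g = nx.Graph()
    g.add_edges_from([("a", "b"), ("b", "c"), ("a", "c"),  # group ties
                      ("c", "x"), ("b", "y")])             # bridges
    print(structure_measures(g, {"a", "b", "c"}))
```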

Relevance: 30.00%

Abstract:

This ex post facto study (N = 209) examined the relationships between employer job strategies and job retention among organizations participating in Florida welfare-to-work network programs, associating the strategies with job retention data to determine best practices. An internet-based self-report survey battery was administered to a heterogeneous sample of organizations participating in the Florida welfare-to-work network program. Hypotheses were tested through correlational and hierarchical regression analytic procedures. The partial correlation results linked each of the job retention strategies to job retention: wages, benefits, training and supervision, communication, job growth, work/life balance, and fairness and respect were all significantly related to job retention. Hierarchical regression results indicated that the training and supervision variable was the best predictor of job retention in the regression equation. The size of the organization was also a significant predictor of job retention: large organizations reported higher job retention rates than small organizations, while there was no statistical difference between types of organizations (profit-making and non-profit) in job retention. The standardized betas ranged from .26 to .41 in the regression equation. Twenty percent of the variance in job retention was explained by the combination of demographic and job retention strategy predictors, supporting the theoretical, empirical, and practical relevance of understanding the association between employer job strategies and job retention outcomes. Implications for adult education and human resource development theory, research, and practice are highlighted as possible strategic leverage points for creating conditions that facilitate the development of job strategies as a means of improving former welfare workers' job retention.
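
Hierarchical regression of the kind reported enters control variables in a first step and the strategy predictors in a second, attributing the incremental R² to the added block. A generic sketch with statsmodels on synthetic data (the study's actual variables and coefficients are not reproduced; the column names are illustrative):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 209  # matches the study's sample size; the data below are synthetic

# Step 1: demographic control; Step 2: add a strategy predictor.
org_size = rng.normal(size=n)
training = rng.normal(size=n)  # illustrative strategy predictor
retention = 0.3 * org_size + 0.4 * training + rng.normal(size=n)

x1 = sm.add_constant(np.column_stack([org_size]))
x2 = sm.add_constant(np.column_stack([org_size, training]))

m1 = sm.OLS(retention, x1).fit()
m2 = sm.OLS(retention, x2).fit()
print(f"R2 step 1: {m1.rsquared:.3f}")
print(f"R2 step 2: {m2.rsquared:.3f}  (increment = strategy block)")
```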

Relevance: 30.00%

Abstract:

This research involves the design, development, and theoretical demonstration of models resulting in integrated misbehavior resolution protocols for ad hoc networked devices. Game theory was used to analyze strategic interaction among independent devices with conflicting interests, and packet forwarding at the routing layer of autonomous ad hoc networks was investigated. Unlike existing reputation-based or payment schemes, this model is based on repeated interactions. To enforce cooperation, a community enforcement mechanism was used, whereby selfish nodes that drop packets are punished not only by the victim but by all nodes in the network. A stochastic packet forwarding game strategy was then introduced; our solution relaxes the uniform traffic demand that was pervasive in other work. To address the concerns of imperfect private monitoring in resource-aware ad hoc networks, a belief-free equilibrium scheme was developed that reduces the impact of noise on cooperation. This scheme also eliminates the need to infer the private history of other nodes and simplifies the computation of an optimal strategy; because the belief-free approach reduces node overhead and is easily tractable, it makes system operation feasible. Motivated by the versatile nature of evolutionary game theory, the assumption of a rational node is relaxed, leading to a framework for mitigating routing selfishness and misbehavior in multi-hop networks, accomplished by setting nodes to play a fixed strategy rather than independently choosing a rational one. A range of simulations showed improved cooperation between selfish nodes compared to earlier results. Cooperation among ad hoc nodes can also protect a network from malicious attacks: in the absence of a central trusted entity, many security mechanisms and privacy protections require such cooperation. Therefore, using game theory and evolutionary game theory, a mathematical framework has been developed that explores trust mechanisms to achieve security in the network. This framework is one of the first steps towards the synthesis of an integrated solution demonstrating that security depends solely on the initial trust level that nodes have for each other.
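
Community enforcement in repeated games is a standard construction: once a node is observed dropping packets, every node withholds forwarding from it, so the one-shot gain from defection is outweighed by the lost stream of future cooperation payoffs. A stylized two-node sketch under assumed payoffs, not the dissertation's exact game:

```python
def community_enforcement(rounds=20, benefit=1.0, cost=0.3,
                          defect_round=5):
    """Stylized repeated packet-forwarding game: one node defects once,
    is flagged, and is thereafter punished by the whole community
    (nobody forwards for it). Payoff values are illustrative."""
    flagged = False
    payoff_honest, payoff_defector = 0.0, 0.0
    for r in range(rounds):
        if flagged:  # community punishment: no forwarding at all
            continue
        if r == defect_round:
            payoff_defector += benefit  # saved the forwarding cost
            payoff_honest += -cost      # forwarded, got nothing back
            flagged = True              # observed -> punished by all
        else:
            payoff_honest += benefit - cost
            payoff_defector += benefit - cost
    return payoff_honest, payoff_defector

if __name__ == "__main__":
    h, d = community_enforcement()
    coop = 20 * (1.0 - 0.3)  # what sustained cooperation would pay
    print(f"honest: {h:.1f}  defector: {d:.1f}  "
          f"full cooperation: {coop:.1f}")
```

The printed comparison shows the enforcement logic: the defector's payoff beats its victim's in this run but falls far short of what sustained cooperation would have earned it.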

Relevance: 30.00%

Abstract:

Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data in the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on-the-fly, containing conflicting information, and dealing with rapidly changing contexts, while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications, and this dissertation addresses that critical challenge. It establishes an effective scheme for complex-event semantic correlation, examining epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because event detection is distributed, time delays are considered: events are no longer instantaneous but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected: a belief value is associated with the semantics and the detection of composite events, generated by a consensus among the participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. This dissertation thus advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
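
The consensus step, assigning a belief value to a detected composite event from several observers' reports, can be illustrated with Dempster's rule of combination over the two-hypothesis frame {event, no-event}. A minimal sketch under assumed mass assignments, not the dissertation's actual scheme:

```python
def dempster(m1, m2):
    """Dempster's rule on the frame {E, notE}; each mass function is a
    tuple (m(E), m(notE), m(theta)) with theta = {E, notE} standing for
    uncertainty. Assumes the conflict is below 1 (k > 0)."""
    e1, n1, t1 = m1
    e2, n2, t2 = m2
    conflict = e1 * n2 + n1 * e2
    k = 1.0 - conflict  # normalization constant
    e = (e1 * e2 + e1 * t2 + t1 * e2) / k
    n = (n1 * n2 + n1 * t2 + t1 * n2) / k
    return e, n, 1.0 - e - n

if __name__ == "__main__":
    # Three entities report the composite event with different
    # confidence (illustrative masses; the third is noisy/uncertain).
    reports = [(0.7, 0.1, 0.2), (0.6, 0.2, 0.2), (0.3, 0.3, 0.4)]
    fused = reports[0]
    for m in reports[1:]:
        fused = dempster(fused, m)
    print(f"belief(event) = {fused[0]:.3f}")  # consensus belief value
```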
