919 results for Actor-network theory


Relevance: 30.00%

Abstract:

In recent years, a surprising new phenomenon has emerged in which globally-distributed online communities collaborate to create useful and sophisticated computer software. These open source software groups are composed of generally unaffiliated individuals and organizations who work in a seemingly chaotic fashion and who participate on a voluntary basis without direct financial incentive. The purpose of this research is to investigate the relationship between the social network structure of these intriguing groups and their level of output and activity, where social network structure is defined as 1) closure or connectedness within the group, 2) bridging ties which extend outside of the group, and 3) leader centrality within the group. Based on well-tested theories of social capital and centrality in teams, propositions were formulated which suggest that social network structures associated with successful open source software project communities will exhibit high levels of bridging and moderate levels of closure and leader centrality. The research setting was the SourceForge hosting organization and a study population of 143 project communities was identified. Independent variables included measures of closure and leader centrality defined over conversational ties, along with measures of bridging defined over membership ties. Dependent variables included source code commits and software releases for community output, and software downloads and project site page views for community activity. A cross-sectional study design was used and archival data were extracted and aggregated for the two-year period following the first release of project software. The resulting compiled variables were analyzed using multiple linear and quadratic regressions, controlling for group size and conversational volume.
Contrary to theory-based expectations, the results showed that successful project groups exhibited low levels of closure and that the levels of bridging and leader centrality were not important factors of success. These findings suggest that the creation and use of open source software may represent a fundamentally new socio-technical development process which disrupts the team paradigm and which triggers the need for building new theories of collaborative development. These new theories could point towards the broader application of open source methods for the creation of knowledge-based products other than software.
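The three structural measures can be sketched on a toy tie list. The names, graph, and exact formulas below are illustrative assumptions, not the study's operationalizations:

```python
from itertools import combinations

def closure(members, ties):
    """Density of ties among group members (0 = no closure, 1 = fully connected)."""
    pairs = list(combinations(sorted(members), 2))
    present = sum(1 for a, b in pairs if (a, b) in ties or (b, a) in ties)
    return present / len(pairs) if pairs else 0.0

def leader_centrality(members, ties):
    """Degree share of the best-connected member (1 = star around one leader)."""
    degree = {m: 0 for m in members}
    for a, b in ties:
        if a in degree and b in degree:  # count internal ties only
            degree[a] += 1
            degree[b] += 1
    total = sum(degree.values())
    return max(degree.values()) / total if total else 0.0

def bridging(members, ties):
    """Number of ties that cross the group boundary."""
    return sum(1 for a, b in ties if (a in members) != (b in members))

# Hypothetical project: four members, one outside contact
members = {"ann", "bob", "cat", "dan"}
ties = {("ann", "bob"), ("bob", "cat"), ("cat", "dan"), ("ann", "eve")}
print(closure(members, ties))   # 3 of 6 possible internal pairs are tied
print(bridging(members, ties))  # 1 boundary-crossing tie
```

In the study these measures would be computed over conversational and membership ties and then regressed against output and activity variables.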

Abstract:

Recent advances in electronic and computer technologies have led to the widespread deployment of wireless sensor networks (WSNs). WSNs have a wide range of applications, including military sensing and tracking, environment monitoring, and smart environments. Many WSNs carry out mission-critical tasks, such as military applications, so security issues in WSNs remain at the forefront of research. Compared with other wireless networks, such as ad hoc and cellular networks, security in WSNs is more complicated due to the constrained capabilities of sensor nodes and the properties of the deployment, such as large scale and hostile environments. Security issues mainly arise from attacks. In general, attacks in WSNs can be classified as external or internal. In an external attack, the attacking node is not an authorized participant of the sensor network. Cryptography and other security methods can prevent some external attacks. However, node compromise, the major and unique problem that leads to internal attacks, can defeat all of these preventive efforts. Knowing the probability of node compromise helps systems detect and defend against it. Although some approaches can be used to detect and defend against node compromise, few of them can estimate its probability. Hence, we develop basic uniform, basic gradient, intelligent uniform and intelligent gradient models of node compromise distribution, using probability theory, in order to adapt to different application environments. These models allow systems to estimate the probability of node compromise. Applying these models in system security designs can improve security and decrease overheads in nearly every security area.
Moreover, based on these models, we design a novel secure routing algorithm to address the routing security issue posed by nodes that have already been compromised but have not yet been detected by the node compromise detection mechanism. The routing paths in our algorithm detour around nodes which have already been detected as compromised or which have a high probability of being compromised. Simulation results show that our algorithm effectively protects routing paths from compromised nodes, whether detected or not.
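A minimal sketch of compromise-aware routing under one plausible formulation (not the dissertation's actual algorithm): if each node carries an estimated compromise probability p, running Dijkstra over node costs −log(1 − p) finds the path most likely to avoid compromised nodes. The graph and probabilities are invented for illustration:

```python
import heapq
import math

def safest_path(graph, p_compromise, src, dst):
    """Dijkstra over node costs -log(1 - p): minimizing the summed cost
    maximizes the probability that no node on the path is compromised."""
    def cost(n):
        return -math.log(1.0 - p_compromise.get(n, 0.0))

    dist, prev = {src: cost(src)}, {}
    heap = [(dist[src], src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            break
        if d > dist.get(node, math.inf):
            continue  # stale queue entry
        for nxt in graph.get(node, ()):
            nd = d + cost(nxt)
            if nd < dist.get(nxt, math.inf):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    # Reconstruct the path from dst back to src
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

graph = {"s": ["a", "b"], "a": ["d"], "b": ["d"], "d": []}
p = {"a": 0.6, "b": 0.1}  # node "a" is likely compromised
print(safest_path(graph, p, "s", "d"))  # detours through "b"
```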

Abstract:

Purpose: The purpose of the research described in this paper is to disentangle the rhetoric from the reality in relation to supply chain management (SCM) adoption in practice. There is significant evidence of a divergence between theory and practice in the field of SCM. Research Approach: The authors’ review of the extant SCM literature highlighted a lack of replication studies in SCM, leading to the development of the concept of refined replication. The authors conducted a refined replication of the work of Sweeney et al. (2015), in which a new SCM definitional construct – the Four Fundamentals – was proposed. The work presented in this article refines the previous study but adopts the same three-phase approach: focussed interviews, a questionnaire survey, and focus groups. This article covers the second phase of the refined replication study and describes an integrated research design for a questionnaire survey to be undertaken in Britain. Findings and Originality: The article presents an integrated research design for a questionnaire survey, with emphasis on the refined replication of the previous work of Sweeney et al. (2015), carried out in Ireland, adapted to the British context. Research Impact: The authors introduce the concept of refined replication in SCM research. This allows previous research to be built upon in order to test understanding of SCM theory and its practical implementation - based on the Four Fundamentals construct - among SCM professionals in Britain. Practical Impact: The article presents an integrated research design for a questionnaire survey that may be used in similar studies.

Abstract:

The protection of cyberspace has become one of the highest security priorities of governments worldwide. The EU is not an exception in this context, given its rapidly developing cyber security policy. Since the 1990s, we could observe the creation of three broad areas of policy interest: cyber-crime, critical information infrastructures and cyber-defence. One of the main trends transversal to these areas is the importance that the private sector has come to assume within them. In particular in the area of critical information infrastructure protection, the private sector is seen as a key stakeholder, given that it currently operates most infrastructures in this area. As a result of this operative capacity, the private sector has come to be understood as the expert in network and information systems security, whose knowledge is crucial for the regulation of the field. Adopting a Regulatory Capitalism framework, complemented by insights from Network Governance, we can identify the shifting role of the private sector in this field from one of a victim in need of protection in the first phase, to a commercial actor bearing responsibility for ensuring network resilience in the second, to an active policy shaper in the third, participating in the regulation of NIS by providing technical expertise. By drawing insights from the above-mentioned frameworks, we can better understand how private actors are involved in shaping regulatory responses, as well as why they have been incorporated into these regulatory networks.

Abstract:

We discuss the interactions among the various phases of network research design in the context of our current work using Mixed Methods and SNA on networks and rural economic development. We claim that there are very intricate inter-dependencies among the various phases of network research design - from theory and formulation of research questions right through to modes of analysis and interpretation. Through examples drawn from our work we illustrate how choices about methods for Sampling and Data Collection are influenced by these interdependencies.

Abstract:

There are many sociopolitical theories to help explain why governments and actors do what they do. Securitization Theory is a process-oriented theory in international relations that focuses on how an actor defines another actor as an “existential threat,” and the resulting responses that can be taken in order to address that threat. While Securitization Theory is an acceptable method for analyzing the relationships between actors in the international system, this thesis contends that the proper examination is multi-factorial, centering on the addition of Role Theory to the analysis. Consideration of Role Theory, another international relations theory, which explains how an actor’s strategies, relationships, and perception by others are based on pre-conceptualized definitions of that actor’s identity, is essential in order to fully explain why an actor might respond to another in a particular way. Certain roles an actor may enact produce a rival relationship with other actors in the system, and it is those rival roles that elicit securitized responses. The possibility of a securitized response lessens when a role or a relationship between roles becomes ambiguous. There are clear points of role rivalry and role ambiguity between Hizb’allah and Iran, which have directly impacted, and continue to impact, how the United States (US) responds to these actors. Because of role ambiguity, the US has still not conceptualized an effective way to deal with Hizb’allah and Iran holistically across all their various areas of operation and in their various enacted roles. It would be overly simplistic to view Hizb’allah and Iran solely through one lens depending on which hemisphere or continent one is observing. The reality is likely more nuanced. Both Role Theory and Securitization Theory can help to understand and articulate those nuances.
By examining two case studies of Hizb’allah and Iran’s enactment of various roles in the Middle East and Latin America, it becomes clear where roles cause a securitized response and where the response is less securitized due to role ambiguity. Using this augmented approach of combining both theories, along with supplementing the manner in which an actor, action, or role is analyzed, will produce better methods for policy-making that can address the more ambiguous activities of Hizb’allah and Iran in these two regions.

Abstract:

Two concepts in rural economic development policy have been the focus of much research and policy action: the identification and support of clusters or networks of firms and the availability and adoption by rural businesses of Information and Communication Technologies (ICT). From a theoretical viewpoint these policies are based on two contrasting models, with clustering seen as a process of economic agglomeration, and ICT-mediated communication as a means of facilitating economic dispersion. The study’s conceptual framework is based on four interrelated elements: location, interaction, knowledge, and advantage, together with the concept of networks which is employed as an operationally and theoretically unifying concept. The research questions are developed in four successive categories: Policy, Theory, Networks, and Method. The questions are approached using a study of two contrasting groups of rural small businesses in West Cork, Ireland: (a) Speciality Foods, and (b) firms in Digital Products and Services. The study combines Social Network Analysis (SNA) with Qualitative Thematic Analysis, using data collected from semi-structured interviews with 58 owners or managers of these businesses. Data comprise relational network data on the firms’ connections to suppliers, customers, allies and competitors, together with linked qualitative data on how the firms established connections, and how tacit and codified knowledge was sourced and utilised. The research finds that the key characteristics identified in the cluster literature are evident in the sample of Speciality Food businesses, in relation to flows of tacit knowledge, social embedding, and the development of forms of social capital. In particular the research identified the presence of two distinct forms of collective social capital in this network, termed “community” and “reputation”. 
By contrast, the sample of Digital Products and Services businesses does not have the form of a cluster, but corresponds more closely to dispersive models, or “chain” structures. Much of the economic and social structure of this set of firms is best explained in terms of “project organisation”, and by the operation of an individual rather than a collective form of “reputation”. The rural setting in which these firms are located has made them service-centric, and consequently they rely on ICT-mediated communication in order to exchange tacit knowledge “at a distance”. It is this factor, rather than inputs of codified knowledge, that most strongly influences their operation and their need for the availability and adoption of high-quality communication technologies. Thus the findings have applicability in relation to theory in Economic Geography and to policy and practice in Rural Development. In addition, the research contributes to methodological questions in SNA, and to methodological questions about the combination or mixing of quantitative and qualitative methods.
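The cluster-versus-chain distinction drawn here can be illustrated with the average local clustering coefficient, which is high for agglomerated networks and near zero for chain structures. The adjacency data below are toy examples, not the study's firm networks:

```python
def avg_clustering(adj):
    """Mean local clustering coefficient over an undirected adjacency dict:
    for each node, the fraction of its neighbour pairs that are themselves
    connected. High for cluster-like networks, ~0 for chains."""
    total = 0.0
    for node, nbrs in adj.items():
        nbrs = list(nbrs)
        k = len(nbrs)
        if k < 2:
            continue  # clustering undefined for degree < 2; count as 0
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nbrs[j] in adj[nbrs[i]])
        total += 2 * links / (k * (k - 1))
    return total / len(adj)

cluster = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}  # tight triangle + spur
chain   = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}        # pure chain
print(avg_clustering(cluster) > avg_clustering(chain))  # True
```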

Abstract:

Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications, such as network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make these results non-applicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on the fly and containing conflicting information, and must deal with rapidly changing contexts while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means of converging time and is hence better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network.
The scheme taps into in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
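The belief side of such a scheme can be sketched under simplifying assumptions: a three-element frame {event occurred, did not occur, either}, Dempster's rule of combination for the consensus step, and interval overlap standing in for temporal correlation. The frame, rule choice, and values are illustrative, not taken from the dissertation:

```python
def dempster(m1, m2):
    """Combine two belief-mass assignments over the frame
    {'yes', 'no', 'either'} using Dempster's rule of combination."""
    conflict = m1["yes"] * m2["no"] + m1["no"] * m2["yes"]
    k = 1.0 - conflict  # normalization after discarding conflicting mass
    fused = {
        "yes": (m1["yes"] * m2["yes"] + m1["yes"] * m2["either"]
                + m1["either"] * m2["yes"]) / k,
        "no": (m1["no"] * m2["no"] + m1["no"] * m2["either"]
               + m1["either"] * m2["no"]) / k,
    }
    fused["either"] = 1.0 - fused["yes"] - fused["no"]
    return fused

def overlaps(e1, e2):
    """Events carry (start, end) durations; correlate only if intervals overlap."""
    return e1[0] <= e2[1] and e2[0] <= e1[1]

# Two detectors report the same composite event with different confidence
a = {"yes": 0.7, "no": 0.1, "either": 0.2}
b = {"yes": 0.6, "no": 0.2, "either": 0.2}
if overlaps((10.0, 12.0), (11.0, 14.0)):
    print(dempster(a, b)["yes"])  # consensus belief that the event occurred
```

Fusing the two reports strengthens the "yes" belief relative to either detector alone, which is the qualitative behaviour the consensus step relies on.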

Abstract:

Aberrant behavior of biological signaling pathways has been implicated in diseases such as cancers. Therapies have been developed to target proteins in these networks in the hope of curing the illness or bringing about remission. However, identifying targets for drug inhibition that exhibit a good therapeutic index has proven challenging, since signaling pathways have a large number of components and many interconnections, such as feedback, crosstalk, and divergence. Unfortunately, some characteristics of these pathways, such as redundancy, feedback, and drug resistance, reduce the efficacy of single-target therapy and necessitate the employment of more than one drug to target multiple nodes in the system. However, choosing multiple targets with a high therapeutic index poses further challenges, since the combinatorial search space can be huge. To cope with the complexity of these systems, computational tools such as ordinary differential equations have been used to successfully model some of these pathways. Regrettably, building these models requires experimentally measured initial concentrations of the components and rates of reactions, which are difficult to obtain and, in very large networks, may simply not be available. Fortunately, there exist other modeling tools, though not as powerful as ordinary differential equations, which do not need rates and initial conditions to model signaling pathways. Petri nets and graph theory are among these tools. In this thesis, we introduce a methodology based on Petri net siphon analysis and graph network centrality measures for identifying prospective targets for single and multiple drug therapies. In this methodology, potential targets are first identified in the Petri net model of a signaling pathway using siphon analysis. Then, graph-theoretic centrality measures are employed to prioritize the candidate targets.
Also, an algorithm is developed to check whether or not the candidate targets are able to disable the intended outputs in the graph model of the system. We implement structural and dynamical models of the ErbB1-Ras-MAPK pathway and use them to assess and evaluate this methodology. The identified drug targets, single and multiple, correspond to clinically relevant drugs. Overall, the results suggest that this methodology, using siphons and centrality measures, shows promise in identifying and ranking drug targets. Since the methodology uses only the structural information of the signaling pathways and does not need initial conditions or dynamical rates, it can be utilized in larger networks.
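The output-disabling check can be sketched as a reachability test on a graph model, with degree centrality as a simple stand-in prioritizer. The pathway below is a hypothetical toy, not the thesis's ErbB1-Ras-MAPK model, and this is not the actual siphon-analysis algorithm:

```python
from collections import deque

def reachable(adj, src, removed):
    """Set of nodes reachable from src by BFS, skipping inhibited nodes."""
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, ()):
            if nxt not in seen and nxt not in removed:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def disables_output(adj, src, out, targets):
    """True if inhibiting all target nodes cuts every path from input to output."""
    return out not in reachable(adj, src, set(targets))

def rank_by_degree(adj, candidates):
    """Prioritize candidate targets by out-degree (more connections first)."""
    return sorted(candidates, key=lambda n: -len(adj.get(n, ())))

# Hypothetical pathway: ligand -> receptor -> {kinase1, kinase2} -> output
adj = {"ligand": ["receptor"], "receptor": ["kinase1", "kinase2"],
       "kinase1": ["output"], "kinase2": ["output"], "output": []}
print(disables_output(adj, "ligand", "output", ["kinase1"]))             # False
print(disables_output(adj, "ligand", "output", ["kinase1", "kinase2"]))  # True
```

The redundant kinase branch shows why single-target inhibition can fail and a two-drug combination succeeds, which is the motivation for multi-target search.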

Abstract:

It has been years since the introduction of the Dynamic Network Optimization (DNO) concept, yet DNO development is still in its infancy, largely due to the lack of a breakthrough in reducing the lengthy optimization runtime. Our previous work, a distributed parallel solution, achieved a significant speed gain. To cater for the increased optimization complexity driven by the uptake of smartphones and tablets, however, this paper examines the potential areas for further improvement and presents a novel asynchronous distributed parallel design that minimizes inter-process communication. The new approach is implemented and applied to real-life projects, and the results demonstrate a speedup of 7.5 times on a 16-core distributed system, compared with 6.1 times for our previous solution, with no degradation in the optimization outcome. This is a solid step towards the realization of DNO.
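The reported figures can be put in perspective with two standard scaling metrics. The formulas are textbook definitions (parallel efficiency and the Amdahl's-law serial fraction implied by an observed speedup), not calculations from the paper itself:

```python
def parallel_efficiency(speedup, cores):
    """Fraction of ideal linear scaling actually achieved."""
    return speedup / cores

def serial_fraction(speedup, cores):
    """Amdahl's-law estimate of the non-parallelizable fraction implied
    by an observed speedup S on N cores: s = (N/S - 1) / (N - 1)."""
    return (cores / speedup - 1) / (cores - 1)

# Figures reported in the abstract: 16-core distributed system
print(parallel_efficiency(7.5, 16))  # asynchronous design, ~0.47
print(parallel_efficiency(6.1, 16))  # earlier synchronous design, ~0.38
print(serial_fraction(7.5, 16))      # implied serial share, ~7.6%
```

By this reading, the asynchronous redesign roughly halves the effective serial share compared with the earlier solution, which is consistent with its goal of minimizing inter-process communication.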

Abstract:

Shape-based registration methods are frequently encountered in the domains of computer vision, image processing and medical imaging. The registration problem is to find an optimal transformation/mapping between sets of rigid or non-rigid objects and to automatically solve for correspondences. In this paper we present a comparison of two different probabilistic methods, entropy and the growing neural gas network (GNG), as general feature-based registration algorithms. With entropy, shape modelling is performed by connecting the point sets with the highest probability of curvature information, while with GNG the point sets are connected using nearest-neighbour relationships derived from competitive Hebbian learning. In order to compare performance we use different levels of shape deformation, starting with a simple shape (2D MRI brain ventricles) and moving to more complicated shapes such as hands. Both quantitative and qualitative results are given for both data sets.
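The nearest-neighbour correspondence step that underlies GNG-style connection schemes can be sketched as follows. The 2D point sets are invented, and this is a generic illustration rather than the paper's algorithm:

```python
import math

def nearest_neighbour_pairs(src, dst):
    """For each source point, find the closest destination point; such
    correspondences drive feature-based alignment (cf. competitive
    Hebbian learning, which wires units to their nearest inputs)."""
    def d(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return [(p, min(dst, key=lambda q: d(p, q))) for p in src]

# Two slightly deformed versions of the same triangle-like shape
shape_a = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
shape_b = [(0.1, 0.1), (0.9, -0.1), (1.1, 1.0)]
for p, q in nearest_neighbour_pairs(shape_a, shape_b):
    print(p, "->", q)
```

In a full registration loop these pairings would be recomputed after each transformation estimate, as in ICP-style iteration.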

Abstract:

Expanding on the growing movement to take academic and other erudite subjugated knowledges and distill them into some graphic form, this “cartoon” is a recounting of the author’s 2014 article, “Big Data, Actionable Information, Scientific Knowledge and the Goal of Control,” Teknokultura, Vol. 11/no. 3, pp. 529-54. It is an analysis of the idea of Big Data and an argument that its power relies on its instrumentalist specificity and not its extent. Mind control research in general and optogenetics in particular are the case study. Noir seems an appropriate aesthetic for this analysis, so direct quotes from the article are illustrated by publicly available screenshots from iconic and unknown films of the 20th century. The only addition to the original article is a framing insight from the admirable activist network CrimethInc.

Abstract:

This work is a study of the Local Productive Arrangement (LPA) of garment manufacturing in the Agreste region of Pernambuco, a sector of economic and social relevance. The central aim of this research is to understand how inter-organizational relations influence the collective efficiency of the arrangement. The theoretical framework employed highlights approaches that deal with the benefits of business agglomeration for the development of firms and regions. It discusses the approach of small and medium enterprises and industrial districts (SCHMITZ, 1997), which introduces the concept of collective efficiency, arguing that the externalities described by Marshall (1996) alone are not sufficient to explain the competitive advantage of enterprises, and expanding the idea that organizations achieve competitive advantage by not acting alone. To examine the influence of relations on collective efficiency, the theory of social networks (GRANOVETTER, 1973, 1985; BURT, 1992; UZZI, 1997) was taken as the analytical perspective, in the belief that this approach provides the basis for a structural analysis of social relationships in the face of human action. In examining organizations in a social network, one should understand the reasons for establishing the relationships, their benefits, how the information flow takes place, and the density of links between the actors (POWELL; SMITH-DOERR, 1994). As for method, this study is characterized as a qualitative case study, in accordance with the proposed objectives. Also, in order to recover the historical milestones of the arrangement, a sectional approach with a longitudinal perspective is used (VIEIRA, 2004).
Primary and secondary data were used in order to understand the evolutionary process of the sector and the inter-actor relationships in the arrangement for the promotion of development; for this, content analysis and documentary analysis techniques, respectively, were employed (DELLAGNELO; SILVA, 2005). The social networks approach made it possible to understand that social relationships may extend the collective efficiency of the arrangement, and that there is therefore a need to develop policies that encourage the legalization of the informal companies in the arrangement, so that they become representative. Thus, the relations established in the LPA of garment manufacturing in the Agreste of Pernambuco need more effective mechanisms to broaden collective efficiency. As things stand, only the group of companies that are linked in some way to the supporting institutions has directly benefited. We can therefore conclude that inter-actor relations have limited the collective efficiency of the LPA, since the supporting institutions stimulate only certain groups of entrepreneurs rather than fostering external relations for all clustered companies.

Abstract:

In 2013 the European Commission launched its new green infrastructure strategy in another attempt to stop, and possibly reverse, the loss of biodiversity by 2020 by connecting habitats in the wider landscape. This means that conservation would go beyond current practices to include landscapes that are dominated by conventional agriculture, where biodiversity conservation plays a minor role at best. The green infrastructure strategy aims at bottom-up rather than top-down implementation, and suggests including local and regional stakeholders. It is therefore important to know which stakeholders influence land-use decisions concerning green infrastructure at the local and regional level. The research presented in this paper served to select stakeholders in preparation for a participatory scenario development process to analyze the consequences of different options for implementing the European green infrastructure strategy. We used a mix of qualitative and quantitative social network analysis (SNA) methods to combine actors’ attributes, especially their perceived influence, with structural and relational measures. Further, our analysis provides information on institutional backgrounds and governance settings for green infrastructure and agricultural policy. The investigation started at the regional level with key informant interviews in administrative units responsible for relevant policies and procedures, such as regional planners and representatives of federal ministries, and continued at the local level with farmers and other members of the community. The analysis revealed the importance of information flows and regulations, but also of social pressure, all of which considerably influence biodiversity governance with respect to green infrastructure and biodiversity.