12 results for Social Media Technology
in CentAUR: Central Archive, University of Reading - UK
Organisational semiotics methods to assess organisational readiness for internal use of social media
Abstract:
The paper presents organisational semiotics (OS) as an approach for identifying organisational readiness factors for the internal use of social media within information-intensive organisations (IIOs). The paper examines OS methods, such as organisational morphology, containment analysis and collateral analysis, to reveal factors of readiness within an organisation. These models also help to identify the essential patterns of activities needed for social media use within an organisation, which can provide a basis for future analysis. The findings confirmed many of the factors previously identified in the literature, while also revealing new factors through the OS methods. The factors for organisational readiness for internal use of social media include resources, organisational climate, processes, motivational readiness, benefit and organisational control factors. The organisational control factors revealed are security/privacy, policies, communication procedures, accountability and fallback.
Abstract:
The increasing use of social media, applications or platforms that allow users to interact online, ensures that this environment will provide a useful source of evidence for the forensics examiner. Current tools for the examination of digital evidence find this data problematic as they are not designed for the collection and analysis of online data. Therefore, this paper presents a framework for the forensic analysis of user interaction with social media. In particular, it presents an inter-disciplinary approach for the quantitative analysis of user engagement to identify relational and temporal dimensions of evidence relevant to an investigation. This framework enables the analysis of large data sets from which a (much smaller) group of individuals of interest can be identified. In this way, it may be used to support the identification of individuals who might be ‘instigators’ of a criminal event orchestrated via social media, or a means of potentially identifying those who might be involved in the ‘peaks’ of activity. In order to demonstrate the applicability of the framework, this paper applies it to a case study of actors posting to a social media Web site.
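The framework itself is not given in code in the abstract; as a minimal sketch of its temporal dimension only, the following assumes posts arrive as hypothetical (author, timestamp) pairs and treats a 'peak' as an hourly window whose post volume exceeds a chosen multiple of the mean, then ranks the authors active within those windows:

```python
from collections import Counter
from datetime import datetime

def activity_peaks(posts, factor=3.0):
    """Illustrative sketch (not the paper's framework): bucket a non-empty
    list of (author, timestamp) posts into hourly windows, flag windows whose
    volume is at least `factor` times the mean, and count the authors who
    posted during those peak windows."""
    hour = lambda ts: ts.replace(minute=0, second=0, microsecond=0)
    buckets = Counter(hour(ts) for _, ts in posts)
    mean = sum(buckets.values()) / len(buckets)
    peaks = {b for b, n in buckets.items() if n >= factor * mean}
    # Authors most active inside peak windows: candidate individuals of interest.
    peak_authors = Counter(a for a, ts in posts if hour(ts) in peaks)
    return peaks, peak_authors
```

In a real investigation the window size and threshold factor would be tuned to the platform and case, and the relational dimension (who mentions whom) would be analysed alongside this temporal view.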
Abstract:
The use of online data is becoming increasingly essential for the generation of insight in today’s research environment. This reflects the much wider range of data available online and the key role that social media now plays in interpersonal communication. However, the process of gaining permission to use social media data for research purposes creates a number of significant issues when considering compatibility with professional ethics guidelines. This paper critically explores the application of existing informed consent policies to social media research and compares it with the form of consent gained by the social networks themselves, which we label ‘uninformed consent’. We argue that, as currently constructed, informed consent carries assumptions about the nature of privacy that are not consistent with the way consumers behave in an online environment. On the other hand, uninformed consent relies on asymmetric relationships that are unlikely to succeed in an environment based on the co-creation of value. The paper highlights the ethical ambiguity created by current approaches to gaining customer consent, and proposes a new conceptual framework based on participative consent that allows for greater alignment between consumer privacy and ethical concerns.
Abstract:
Social networks have gained remarkable attention in the last decade. Accessing social network sites such as Twitter, Facebook, LinkedIn and Google+ through the internet and Web 2.0 technologies has become more affordable. People are becoming more interested in, and reliant on, social networks for information, news and the opinions of other users on diverse subject matters. This heavy reliance on social network sites causes them to generate massive data characterised by three computational issues, namely size, noise and dynamism. These issues often make social network data very complex to analyse manually, prompting the use of computational means of analysis. Data mining provides a wide range of techniques for detecting useful knowledge from massive datasets, such as trends, patterns and rules [44]. Data mining techniques are used for information retrieval, statistical modelling and machine learning. These techniques employ data pre-processing, data analysis and data interpretation processes in the course of data analysis. This survey discusses the data mining techniques used to mine diverse aspects of social networks over the decades, from historical techniques to up-to-date models, including our novel technique named TRCM. All the techniques covered in this survey are listed in Table 1, together with the tools employed and the names of their authors.
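The pattern-detection step described above can be illustrated in miniature. This is not the survey's TRCM technique; it is a toy frequent-co-occurrence count, assuming (hypothetically) that each post is represented as a list of its hashtags:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(posts, min_support=2):
    """Toy pattern mining: count hashtag pairs that co-occur within a post,
    keeping only pairs seen in at least `min_support` posts. Each element of
    `posts` is a list of hashtag strings."""
    counts = Counter()
    for tags in posts:
        # Deduplicate and sort so ('a', 'b') and ('b', 'a') count as one pair.
        for pair in combinations(sorted(set(tags)), 2):
            counts[pair] += 1
    return {pair: c for pair, c in counts.items() if c >= min_support}
```

Real social-network mining layers the pre-processing, analysis and interpretation stages mentioned above on top of counting primitives like this one, and must additionally cope with the size, noise and dynamism of the data.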
Abstract:
Nanoscience and technology (NST) are widely cited to be the defining technology for the 21st century. In recent years, the debate surrounding NST has become increasingly public, with much of this interest stemming from two radically opposing long-term visions of a NST-enabled future: ‘nano-optimism’ and ‘nano-pessimism’. This paper demonstrates that NST is a complex and wide-ranging discipline, the future of which is characterised by uncertainty. It argues that consideration of the present-day issues surrounding NST is essential if the public debate is to move forwards. In particular, the social constitution of an emerging technology is crucial if any meaningful discussion surrounding costs and benefits is to be realised. An exploration of the social constitution of NST raises a number of issues, of which unintended consequences and the interests of those who own and control new technologies are highlighted.
Abstract:
The use of social network sites (SNS) has become very valuable to educational institutions. Some universities have formally integrated these social media into their educational systems and are using them to improve their service delivery. The main aim of this study was to establish whether African universities have embraced this emerging technology by maintaining an official presence on SNS. A purposive sampling method was used to study 24 universities, from which data were obtained by visiting their official websites and following the official links to the most common SNS.
Abstract:
This paper reports on an exploratory study of segmentation practices of organisations with a social media presence. It investigates whether traditional segmentation approaches are still relevant in this new socio-technical environment and identifies emerging practices. The study found that social media are particularly promising in terms of targeting influencers, enabling the cost-effective delivery of personalised messages and engaging with numerous customer segments in a differentiated way. However, some problems previously identified in the segmentation literature still occur in the social media environment, such as the technical challenge of integrating databases, the preference for pragmatic rather than complex solutions and the lack of relevant analytical skills. Overall, a gap has emerged between marketing theory and practice. While segmentation is far from obsolete in the age of the social customer, it needs to adapt to reflect the characteristics of the new media.
Abstract:
We investigate variants of the dominating set problem in social networks. While randomised algorithms for solving the minimum weighted dominating set problem and the minimum alpha and alpha-rate domination problems on simple graphs are already present in the literature, we propose here a randomised algorithm for the minimum weighted alpha-rate dominating set problem which is, to the best of our knowledge, the first such algorithm. A theoretical approximation bound based on a simple randomised rounding technique is given. The algorithm is implemented in Python and applied to a UK Twitter mentions network, using a measure of individuals’ influence (Klout) as weights. We argue that the weights of vertices can be interpreted as the costs of getting those individuals on board for a campaign or a behaviour-change intervention. The minimum weighted alpha-rate dominating set problem can therefore be seen as finding a set that minimises the total cost such that each individual in the network has at least an alpha fraction of its neighbours in the chosen set. We also test our algorithm on generated graphs with several thousand vertices and edges. Our results on this real-life Twitter network and on generated graphs show that the implementation is reasonably efficient and can thus be used for real-life applications when creating social-network-based interventions, designing social media campaigns and potentially improving users’ social media experience.
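The paper's algorithm uses randomised rounding of an LP relaxation, which is not reproduced in the abstract. Purely to make the alpha-rate constraint concrete, here is a simple randomised-repair heuristic (all names hypothetical, not the authors' method): repeatedly pick an unsatisfied vertex and add its cheapest uncovered neighbour until every vertex has at least an alpha fraction of its neighbours in the set.

```python
import random

def alpha_rate_dominating_set(adj, weights, alpha, seed=0):
    """Illustrative heuristic (not the paper's LP-rounding algorithm).
    adj: dict mapping each vertex to a list of its neighbours.
    weights: dict mapping each vertex to its cost (e.g. an influence score).
    Returns a set D such that every vertex v has at least
    alpha * deg(v) neighbours in D."""
    rng = random.Random(seed)
    D = set()

    def deficit(v):
        # How many more of v's neighbours must join D.
        return alpha * len(adj[v]) - sum(1 for u in adj[v] if u in D)

    unsatisfied = [v for v in adj if deficit(v) > 0]
    while unsatisfied:
        v = rng.choice(unsatisfied)
        # A vertex with a deficit always has a neighbour outside D.
        u = min((u for u in adj[v] if u not in D), key=lambda u: weights[u])
        D.add(u)
        unsatisfied = [w for w in adj if deficit(w) > 0]
    return D
```

Under the cost interpretation in the abstract, `weights[u]` would be the cost of recruiting individual `u` to the campaign; the heuristic greedily prefers cheap recruits but, unlike the paper's algorithm, carries no approximation guarantee.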
Abstract:
The It Gets Better project has been held up as a model of successful social media activism. This article explores how narrators of It Gets Better videos make use of generic intertextuality, strategically combining the canonical narrative genres of the exemplum, the testimony, and the confession in a way that allows them to claim ‘textual authority’ and to make available multiple moral positions for themselves and their listeners. This strategy is further facilitated by the ambiguous participation frameworks associated with digital media, which make it possible for storytellers to tell different kinds of stories to different kinds of listeners at the same time, to simultaneously comfort the victims of anti-gay violence, confront its perpetrators, and elicit sympathy from ‘onlookers’. This analysis highlights the potential of new practices of online storytelling for social activism, and challenges notions that new media are contributing to the demise of common narrative traditions.