654 results for networked journalism


Relevance:

10.00%

Publisher:

Abstract:

Two-component systems capable of self-assembling into soft gel-phase materials are of considerable interest due to their tunability and versatility. This paper investigates two-component gels based on a combination of an L-lysine-based dendron and a rigid diamine spacer (1,4-diaminobenzene or 1,4-diaminocyclohexane). The networked gelator was investigated using thermal measurements, circular dichroism, NMR spectroscopy and small angle neutron scattering (SANS), giving insight into the macroscopic properties, nanostructure and molecular-scale organisation. Surprisingly, all of these techniques confirmed that irrespective of the molar ratio of the components employed, the "solid-like" gel network always consisted of a 1:1 mixture of dendron/diamine. Additionally, the gel network was able to tolerate a significant excess of diamine in the "liquid-like" phase before being disrupted. In the light of this observation, we investigated the ability of the gel network structure to evolve from mixtures of different aromatic diamines present in excess. We found that these two-component gels assembled in a component-selective manner, with the dendron preferentially recognising 1,4-diaminobenzene (>70%) when similar competitor diamines (1,2- and 1,3-diaminobenzene) were present. Furthermore, NMR relaxation measurements demonstrated that the gel based on 1,4-diaminobenzene was better able to form a selective ternary complex with pyrene than the gel based on 1,4-diaminocyclohexane, indicative of controlled and selective pi-pi interactions within a three-component assembly. As such, the results in this paper demonstrate how component-selection processes in two-component gel systems can control hierarchical self-assembly.

Relevance:

10.00%

Publisher:

Abstract:

External reflection FTIR spectroscopy and surface pressure measurements were used to compare conformational changes in the adsorbed structures of three globular proteins at the air/water interface. Of the three proteins studied, lysozyme, bovine serum albumin and β-lactoglobulin, lysozyme was unique in its behaviour. Lysozyme adsorption was slow, taking approximately 2.5 h to reach a surface pressure plateau (from a 0.07 mM solution), and led to significant structural change. The FTIR spectra revealed that lysozyme formed a highly networked adsorbed layer of unfolded protein with high antiparallel beta-sheet content and that these changes occurred rapidly (within 10 min). This non-native secondary structure is analogous to that of a 3D heat-set protein gel, suggesting that the adsorbed protein formed a highly networked interfacial layer. Albumin and β-lactoglobulin adsorbed rapidly (reaching a plateau within 10 min) and with little change to their native secondary structure.

Relevance:

10.00%

Publisher:

Abstract:

Gerry Anderson’s 1960s puppet series have hybrid identities in relation to their medial, geographical, and production histories. This chapter ranges over his science fiction series from Supercar (1961) to Joe 90 (1968), arguing that Anderson’s television science fiction in that period crossed many kinds of boundary and border. Anderson’s television series struck a compromise between his desire to make films for adults and the available market for children’s television puppet programs, and aimed to appeal to a cross-generational family audience. They were made on film, using novel effects, for a UK television production culture that still relied largely on live and videotaped production. While commissioned by British ITV companies, the programs had notable success in the USA, achieving national networked screening as well as syndication, and they were designed to be transatlantic products. The transnational hero teams and security organisations featured in the series supported this internationalism, and simultaneously negotiated between the cultural meanings of Britishness and Americanness. By discussing their means of production, the aesthetic and narrative features of the programs, their institutional contexts, and their international distribution, this chapter argues that Anderson’s series suggest ways of rethinking the boundaries of British science fiction television in the 1960s.

Relevance:

10.00%

Publisher:

Abstract:

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. This work proposes a fully decentralised algorithm (Epidemic K-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art distributed K-Means algorithms based on sampling methods. The experimental analysis confirms that the proposed algorithm is a practical and accurate distributed K-Means implementation for networked systems of very large and extreme scale.
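
The abstract does not spell out the protocol, but the core idea is that gossip averaging replaces the global reduction step of parallel K-Means. Below is a minimal sketch in Python, assuming uniform pairwise gossip and a shared random seed for the initial centroids; all names and parameters are illustrative, not the authors' implementation:

```python
# Sketch of a decentralised K-Means in the spirit of the abstract above:
# nodes never communicate globally; per-cluster sums and counts are
# averaged by gossip, and every node derives the same approximate
# centroids. All names and parameters are hypothetical.
import random

import numpy as np


def gossip_average(values, rounds=50):
    """Approximate the network-wide mean of per-node vectors by
    repeated pairwise averaging (uniform gossip)."""
    values = [v.astype(float) for v in values]
    n = len(values)
    for _ in range(rounds):
        i, j = random.sample(range(n), 2)
        mean = (values[i] + values[j]) / 2.0
        values[i], values[j] = mean.copy(), mean.copy()
    return values  # each entry converges towards the global mean


def epidemic_kmeans(node_data, k, iters=10, gossip_rounds=50):
    dim = node_data[0].shape[1]
    rng = np.random.default_rng(0)         # shared seed: same start everywhere
    centroids = rng.standard_normal((k, dim))
    for _ in range(iters):
        stats = []
        for X in node_data:                 # purely local step at every node
            dist = np.linalg.norm(X[:, None] - centroids[None], axis=2)
            label = dist.argmin(axis=1)
            s = np.zeros((k, dim + 1))      # per-cluster [sum | count]
            for c in range(k):
                s[c, :dim] = X[label == c].sum(axis=0)
                s[c, dim] = (label == c).sum()
            stats.append(s.ravel())
        # gossip replaces the global reduction of a parallel K-Means
        avg = gossip_average(stats, gossip_rounds)[0].reshape(k, dim + 1)
        counts = np.maximum(avg[:, dim:], 1e-9)
        centroids = avg[:, :dim] / counts
    return centroids
```

Because every node averages the same sufficient statistics (per-cluster sums and counts), each node's estimate approaches what a centralised reduction over the aggregated data would return, which is the approximation property the abstract claims.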

Relevance:

10.00%

Publisher:

Abstract:

Gossip (or Epidemic) protocols have emerged as a communication and computation paradigm for large-scale networked systems. These protocols are based on randomised communication, which provides probabilistic guarantees on convergence speed and accuracy. They also provide robustness, scalability, computational and communication efficiency, and high stability under disruption. This work presents a novel Gossip protocol named Symmetric Push-Sum Protocol for the computation of global aggregates (e.g., average) in decentralised and asynchronous systems. The proposed approach combines the simplicity of the push-based approach and the efficiency of the push-pull schemes. The push-pull schemes cannot be directly employed in asynchronous systems as they require synchronous paired communication operations to guarantee their accuracy. Although push schemes guarantee accuracy even with asynchronous communication, they suffer from a slower and unstable convergence. The Symmetric Push-Sum Protocol does not require synchronous communication and achieves a convergence speed similar to the push-pull schemes, while keeping the accuracy stability of the push scheme. In the experimental analysis, we focus on computing the global average as an important class of node aggregation problems. The results have confirmed that the proposed method inherits the advantages of both other schemes and outperforms well-known state-of-the-art protocols for decentralised Gossip-based aggregation.
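
The message rules of the Symmetric Push-Sum Protocol are not given in the abstract; the sketch below is therefore an assumption about the mechanism, combining the classic push-sum mass exchange with a symmetric reply from the receiver so that no synchronous pairing is needed. All names are illustrative:

```python
# Minimal push-sum sketch for decentralised averaging, with a symmetric
# reply on receipt. Each node holds a (sum, weight) pair whose ratio
# converges to the global average; total "mass" is conserved by every
# exchange, which is what guarantees accuracy.
import random


class Node:
    def __init__(self, value):
        self.s, self.w = float(value), 1.0

    def push_half(self):
        half = (self.s / 2.0, self.w / 2.0)
        self.s /= 2.0
        self.w /= 2.0
        return half

    def receive(self, s, w):
        self.s += s
        self.w += w

    def estimate(self):
        return self.s / self.w


def symmetric_push_sum(values, rounds=200):
    nodes = [Node(v) for v in values]
    for _ in range(rounds):
        a, b = random.sample(nodes, 2)
        from_a = a.push_half()
        from_b = b.push_half()   # symmetric: the target replies with its half
        b.receive(*from_a)
        a.receive(*from_b)
    return [n.estimate() for n in nodes]


print(symmetric_push_sum([10, 20, 30, 40]))  # all estimates near 25.0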

Relevance:

10.00%

Publisher:

Abstract:

Dynamic multi-user interactions in a single networked virtual environment suffer from abrupt state transition problems due to communication delays arising from network latency--an action by one user only becoming apparent to another user after the communication delay. This results in a temporal suspension of the environment for the duration of the delay--the virtual world 'hangs'--followed by an abrupt jump to make up for the time lost due to the delay so that the current state of the virtual world is displayed. These discontinuities appear unnatural and disconcerting to the users. This paper proposes a novel method of warping times associated with users to ensure that each user views a continuous version of the virtual world, such that no hangs or jumps occur despite other user interactions. Objects passed between users within the environment are parameterized, not by real time, but by a virtual local time, generated by continuously warping real time. This virtual time periodically realigns itself with real time as the virtual environment evolves. The concept of a local user dynamically warping the local time is also introduced. As a result, the users are shielded from viewing discontinuities within their virtual worlds, consequently enhancing the realism of the virtual environment.
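
The paper's actual warping function is not reproduced here, but the idea can be illustrated with a simple piecewise warp: the local virtual clock runs slow while a delayed remote action is still unknown, then runs fast until it realigns with real time, so the user sees neither a hang nor a jump. The rate constants below are assumptions for the sake of the sketch:

```python
# Illustrative local time warp (the function name and easing rates are
# assumptions; the abstract only states that virtual time is a
# continuous warp of real time that periodically realigns with it).
def virtual_time(t, delay, catch_up=2.0):
    """Map real time t to a continuous virtual time for a remote
    action that arrived `delay` seconds late.

    0 .. delay        : virtual clock runs at half speed (no hang),
    delay .. realign  : runs at `catch_up` speed until realigned,
    afterwards        : tracks real time again (no jump).
    """
    if t <= delay:
        return t / 2.0                       # fall behind smoothly
    lag = delay / 2.0                        # deficit accumulated so far
    realign = delay + lag / (catch_up - 1.0)
    if t <= realign:
        return delay / 2.0 + catch_up * (t - delay)
    return t                                 # realigned with real time
```

The function is continuous and strictly increasing, so every object trajectory parameterized by it evolves without discontinuities, which is the effect the paper describes.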

Relevance:

10.00%

Publisher:

Abstract:

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. The lack of scalable and fault tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (Epidemic K-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art sampling methods and shows that the proposed method overcomes the limitations of the sampling-based approaches for skewed cluster distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.

Relevance:

10.00%

Publisher:

Abstract:

A world of ubiquitous computing, full of networked mobile and embedded technologies, is approaching. The benefits of this technology are numerous, and act as the major driving force behind its development. These benefits are brought about, in part, by ubiquitous monitoring (UM): the continuous and widespread collection of significant amounts of data about users.

Relevance:

10.00%

Publisher:

Abstract:

In any wide-area distributed system there is a need to communicate and interact with a range of networked devices and services ranging from computer-based ones (CPU, memory and disk), to network components (hubs, routers, gateways) and specialised data sources (embedded devices, sensors, data-feeds). In order for the ensemble of underlying technologies to provide an environment suitable for virtual organisations to flourish, the resources that comprise the fabric of the Grid must be monitored in a seamless manner that abstracts away from the underlying complexity. Furthermore, as various competing Grid middleware offerings are released and evolve, an independent overarching monitoring service should act as a cornerstone that ties these systems together. GridRM is a standards-based approach that is independent of any given middleware and that can utilise legacy and emerging resource-monitoring technologies. The main objective of the project is to produce a standardised and extensible architecture that provides seamless mechanisms to interact with native monitoring agents across heterogeneous resources.
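
The abstract describes GridRM only at the architectural level. As a hypothetical sketch (none of these class or method names are the real GridRM API), the driver pattern such an architecture implies could look like this: per-technology drivers translate native agent output into one common schema behind a single gateway.

```python
# Hypothetical sketch of a middleware-independent monitoring layer in
# the style the abstract describes. All names are invented for
# illustration; this is not the actual GridRM interface.
from abc import ABC, abstractmethod


class AgentDriver(ABC):
    """Adapter for one native monitoring technology (e.g. SNMP)."""

    @abstractmethod
    def query(self, resource: str, metric: str) -> dict:
        """Return {'resource', 'metric', 'value', 'unit'} regardless
        of the underlying agent's native format."""


class SNMPDriver(AgentDriver):
    def query(self, resource, metric):
        raw = 0.42  # placeholder for a native SNMP GET
        return {"resource": resource, "metric": metric,
                "value": raw, "unit": "load"}


class Gateway:
    """Single entry point hiding which driver serves a resource."""

    def __init__(self):
        self.drivers = {}

    def register(self, prefix, driver):
        self.drivers[prefix] = driver

    def query(self, resource, metric):
        for prefix, driver in self.drivers.items():
            if resource.startswith(prefix):
                return driver.query(resource, metric)
        raise LookupError(f"no driver for {resource}")


gw = Gateway()
gw.register("snmp://", SNMPDriver())
print(gw.query("snmp://router1", "cpu_load"))
```

New agent technologies are then supported by adding a driver, leaving clients and the common schema untouched, which matches the extensibility goal stated above.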

Relevance:

10.00%

Publisher:

Abstract:

This paper seeks to synthesise the various contributions to the special issue of Long Range Planning on competence-creating subsidiaries (CCS), and identifies avenues for future research. Effective competence-creation through a network of subsidiaries requires an appropriate balance between internal and external embeddedness. There are multiple types of firm-specific advantages (FSAs) essential to achieve this. In addition, wide-bandwidth pathways are needed with collaborators, suppliers, and customers, as well as internally within the MNE. Paradoxically, there is a natural tendency for bandwidth to shrink as dispersion increases. As distances (technological, organisational, and physical) become greater, there may be decreasing returns to R&D spread. Greater resources for knowledge integration and coordination are needed as intra-MNE and inter-firm R&D cooperation becomes more intensive and extensive. MNEs need to invest in mechanisms to promote wide-bandwidth knowledge flows, without which widely dispersed and networked MNEs can suffer from internal market failures.

Relevance:

10.00%

Publisher:

Abstract:

Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities which human implants pose to crime victimisation in light of recent technological developments, and analyses how the law can deal with emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attacks, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (e.g., unlawful access, system interference) and bodily integrity provisions (e.g., battery, assault, causing bodily harm) to deal with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions on how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, between exterior and interior of the body need to be re-interpreted in light of developments in human implants. As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.

Relevance:

10.00%

Publisher:

Abstract:

This paper explores the linguistic practice of digital code plays in an online discussion forum, used by the community of English-speaking Germans living in Britain. By adopting a qualitative approach of Computer-Mediated Discourse Analysis, the article examines the ways in which these bilinguals deploy linguistic and other semiotic resources on the forum to co-construct humorous code plays. These performances occur in the context of negotiating language norms and are based on conscious manipulations of both codes, English and German. They involve play with codes at three levels: play with forms, meanings, and frames. Although, at first sight, such alternations appear to be used mainly for a comic effect, there is more to this than just humour. By mixing both codes at all levels, the participants deliberately produce aberrant German ‘polluted’ with English and, in so doing, dismantle the ideology of language purity upheld by the purist movement. The deliberate character of this type of code alternation demonstrates heightened metalinguistic awareness as well as creativity and criticality. By exploring the practice of digital code plays, the current study contributes to the growing body of research on networked multilingualism as well as to practices associated with translanguaging, poly- and metrolingualism.

Relevance:

10.00%

Publisher:

Abstract:

Epidemic protocols are a bio-inspired communication and computation paradigm for large and extreme-scale networked systems. This work investigates the expansion property of the network overlay topologies induced by epidemic protocols. An expansion quality index for overlay topologies is proposed and adopted for the design of epidemic membership protocols. A novel protocol is proposed, which explicitly aims at improving the expansion quality of the overlay topologies. The proposed protocol is tested with a global aggregation task and compared to other membership protocols. The analysis by means of simulations indicates that the expansion quality directly relates to the speed of dissemination and convergence of epidemic protocols and can be effectively used to design better protocols.
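
The abstract does not define the expansion quality index; one natural proxy, offered here purely as an assumption, is the average vertex expansion of small random node sets in the overlay graph:

```python
# A rough proxy for the "expansion quality" of an overlay (assumption:
# the paper's actual index is not defined in the abstract). For random
# small sets S, the vertex expansion |N(S) \ S| / |S| measures how
# quickly a rumour spreads beyond any region of the overlay.
import random


def expansion_quality(adj, trials=200, set_size=5):
    nodes = list(adj)
    total = 0.0
    for _ in range(trials):
        s = set(random.sample(nodes, set_size))
        boundary = {v for u in s for v in adj[u]} - s
        total += len(boundary) / len(s)
    return total / trials  # higher means faster dissemination


# Ring overlay (degree 2) vs random overlay (out-degree 4).
ring = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
rnd = {i: random.sample([j for j in range(20) if j != i], 4)
       for i in range(20)}
print(expansion_quality(ring), expansion_quality(rnd))
```

The ring scores lower than the random overlay, matching the abstract's claim that expansion quality tracks the speed of dissemination and convergence of epidemic protocols.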