890 results for Interception of communications
Abstract:
This paper addresses the effectiveness of physical-layer network coding (PNC) for throughput improvement in multi-hop multicast in random wireless ad hoc networks (WAHNs). We prove that the per-session throughput order with PNC is tightly bounded as Θ(1/(n√m R(n))) if m = O(R^{-2}(n)), where n is the total number of nodes, R(n) is the communication range, and m is the number of destinations per multicast session. We also show that the per-session throughput order with PNC is tightly bounded as Θ(1/n) when m = Ω(R^{-2}(n)). These results imply that PNC cannot improve the throughput order of multicast in random WAHNs, contrary to the intuition that PNC may improve the throughput order because it allows simultaneous signal access and combination.
Abstract:
Intelligent transport systems (ITS) have great potential for road-safety as well as non-safety applications. One of the big challenges for ITS is reliable and cost-effective vehicle communication, given the large number of vehicles, high mobility, and bursty traffic from safety and non-safety applications. In this paper, we investigate the use of dedicated short-range communications (DSRC) for coexisting safety and non-safety applications over infrastructured vehicle networks. The main objectives of this work are to improve the scalability of communications for vehicle networks, ensure QoS for safety applications, and leave as much bandwidth as possible for non-safety applications. A two-level adaptive control scheme is proposed to find an appropriate message rate and control-channel interval for safety applications. Simulation results demonstrate that this adaptive method outperforms the fixed control method under varying numbers of vehicles. © 2012 Wenyang Guan et al.
Abstract:
Definitions and perceptions of the role and styles of risk management, and performance management/strategic control systems have evolved over time, but it can be argued that risk management is primarily concerned with ensuring the achievement of strategic objectives. This paper shows the extent of overlap between a broad-based view of risk management, namely Enterprise Risk Management (ERM), and the balanced scorecard, which is a widely used strategic control system. A case study of one of the UK's largest retailers, Tesco plc, is used to show how ERM can be introduced as part of an existing strategic control system. The case demonstrates that, despite some differences in lines of communications, the strategic controls and risk controls can be used to achieve a common objective. Adoption of such an integrated approach, however, has implications for the profile of risk and the overall risk culture within an organisation.
Abstract:
Advertising and other forms of communication are often used by government bodies, non-government organisations, and other institutions to try to influence the population either to a) reduce some form of harmful behaviour (e.g. smoking, drunk-driving) or b) increase some healthier behaviour (e.g. eating healthily). It is common for these messages to be predicated on the chance of some negative event occurring if the individual does not either a) stop the harmful behaviour or b) start or increase the healthy behaviour. This design of communication goes by many names in the relevant literature but, for the purposes of this thesis, will be termed a 'threat appeal'. Despite their widespread use in the public sphere, and concerted academic interest since the 1950s, the effectiveness of threat appeals in delivering their objective remains unclear in many ways. A detailed, chronological and thematic examination of the literature uncovers two assumptions that have either been upheld despite little supporting evidence or received little attention at all: specifically, a) that threat appeal characteristics can be conflated with their intended responses, and b) that a threat appeal always and necessarily evokes a fear response in the subject. A detailed examination of these assumptions underpins this thesis. The intention is to take the equivocality of empirical results as a point of departure and deliver a novel approach aimed at reducing the confusion evident in existing work. More specifically, the present thesis frames cognitive and emotional responses to threat appeals as part of a decision about future behaviour.
To further develop theory, a conceptual framework is presented that outlines the role of anticipated and anticipatory emotions, alongside subjective probabilities, elaboration and immediate visceral emotions, resultant from manipulation of the intrinsic message characteristics of a threat appeal (namely, message direction, message frame and graphic image). In doing so, the spectrum of relevant literature is surveyed, and used to develop a theoretical model which serves to integrate key strands of theory into a coherent model. In particular, the emotional and cognitive responses to the threat appeal manipulations are hypothesised to influence behaviour intentions and expectations pertaining to future behaviour. Using data from a randomised experiment with a sample of 681 participants, the conceptual model was tested using analysis of covariance. The results for the conceptual framework were encouraging overall, and also with regard to the individual hypotheses. In particular, empirical results showed clearly that emotional responses to the intrinsic message characteristics are not restricted to fear, and that different responses to threat appeals were clearly attributed to specific intrinsic message characteristics. In addition, the inclusion of anticipated emotions alongside cognitive appraisals in the framework generated interesting results. Specifically, immediate emotions did not influence key response variables related to future behaviour, in support of questioning the assumption of the prominent role of fear in the response process that is so prevalent in existing literature. The findings, theoretical and practical implications, limitations and directions for future research are discussed.
Abstract:
In the agrifood sector, the explosive increase in information about environmental sustainability, often in uncoordinated information systems, has created a new form of ignorance ('meta-ignorance') that diminishes the effectiveness of information on decision-makers. Flows of information are governed by informal and formal social arrangements that we can collectively call Informational Institutions. In this paper, we have reviewed the recent literature on such institutions. From the perspectives of information theory and new institutional economics, current informational institutions are increasing the information entropy of communications concerning environmental sustainability and stakeholders' transaction costs of using relevant information. In our view this reduces the effectiveness of informational governance. Future research on informational governance should explicitly address these aspects.
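The entropy the authors invoke is Shannon's. As a reminder (our gloss, in generic notation, not the paper's own formalism), the entropy of a message source X with symbol probabilities p_i is:

```latex
H(X) = -\sum_{i} p_i \log_2 p_i
```

Informally, the more uncoordinated and inconsistent the sustainability signals reaching a decision-maker, the higher the entropy of the message source, and the more effort (transaction cost) is needed to extract the same amount of usable information.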
Abstract:
Communication can be seen as one of the most important tools for managing the conflicts and stress of work teams operating in environments of strong pressure, complex operations and continuous risk, the aspects that characterize a high-reliability organization. This article aims to highlight the importance of communication in high-reliability organizations, taking as its object of study accidents and incidents in civil aviation. It is a qualitative study, based on documentary analysis of investigations conducted by the Federal Aviation Administration and the Center of Investigation and Prevention of Aeronautical Accidents. The results indicate that human error accounts for 60 to 80 percent of accidents and incidents. Most of these occurrences are attributed to miscommunication between the professionals involved in air and ground operations, such as pilots, crew members, maintenance staff, and flight controllers. Inappropriate tone of voice, difficulty understanding the different accents of sender and receiver, and even difficulty perceiving red flags between the lines of verbal and non-verbal communication all contribute to failures of understanding between the people involved in an operation. As a research limitation, this study found that official agency reports lack a specific category of "interpersonal communication failures"; researchers must therefore rely on the conceptual definition of "social ability", with communication implied, to classify behaviours and communication issues accordingly. Another finding is that communication is treated only superficially in the content of air-operations courses; addressing it more thoroughly could mitigate the lack of communication skills as a social ability. Part of the research findings concerns the communication-skills content that should be developed in programs to train the professionals involved in air flight and ground operations.
It is thus hoped that this article gives appropriate prominence to the improvement of flight-operations training programs. Developing communication skills among work teams in high-reliability organizations can help mitigate stress, accidents and incidents in civil aviation. The original contribution of this article is a proposal of the main contents that should be developed in a Communication Skills Training Program specifically addressed to civil aviation operations.
Abstract:
A new parallel approach for solving a pentadiagonal linear system is presented. The parallel partition method for this system and the TW parallel partition method on a chain of P processors are introduced and discussed. The result of this algorithm is a reduced pentadiagonal linear system of order P - 2, compared with a system of order 2P - 2 for the parallel partition method. More importantly, the new method involves only half the number of communication startups of the parallel partition method (and other standard parallel methods) and is hence a far more efficient parallel algorithm.
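For context, a pentadiagonal system couples each unknown to its four nearest neighbours, so the sequential baseline is Gaussian elimination confined to the five diagonals. The sketch below is a generic illustration of that baseline, not the paper's parallel partition method; the band layout and function name are our own, and no pivoting is performed (so it assumes, for example, a diagonally dominant matrix):

```python
def penta_solve(l2, l1, d, u1, u2, rhs):
    """Solve A x = rhs for pentadiagonal A given by its five bands.

    l2[i] = A[i, i-2], l1[i] = A[i, i-1], d[i] = A[i, i],
    u1[i] = A[i, i+1], u2[i] = A[i, i+2] (out-of-range entries unused).
    Banded Gaussian elimination without pivoting: each pivot row only
    touches the next two rows, and each row couples at most two later
    unknowns, so the work is O(n) instead of O(n^3).
    """
    n = len(d)
    # Assemble a dense copy of the band for clarity (a production
    # solver would keep the five bands in place).
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        if i >= 2:
            A[i][i - 2] = l2[i]
        if i >= 1:
            A[i][i - 1] = l1[i]
        A[i][i] = d[i]
        if i + 1 < n:
            A[i][i + 1] = u1[i]
        if i + 2 < n:
            A[i][i + 2] = u2[i]
    b = [float(v) for v in rhs]
    # Forward elimination: pivot i eliminates entries in rows i+1, i+2 only.
    for i in range(n - 1):
        for j in range(i + 1, min(i + 3, n)):
            f = A[j][i] / A[i][i]
            for k in range(i, min(i + 3, n)):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    # Back substitution: row i involves at most unknowns i+1 and i+2.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = b[i]
        for k in range(i + 1, min(i + 3, n)):
            s -= A[i][k] * x[k]
        x[i] = s / A[i][i]
    return x
```

The parallel methods in the abstract split such a system across P processors and reduce the interface equations to a much smaller pentadiagonal system (of order P - 2 for the TW variant), which is where the halving of communication startups comes from.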
Abstract:
General-purpose parallel processing for solving day-to-day industrial problems has been slow to develop, partly because of the lack of suitable hardware from well-established, mainstream computer manufacturers and of suitably parallelized application software. This paper describes the parallelization of a CFD (computational fluid dynamics) flow-solution code known as ESAUNA. This code is part of SAUNA, a large CFD suite aimed at computing the flow around very complex aircraft configurations, including complete aircraft. A novel feature of the SAUNA suite is that it is designed to use either block-structured hexahedral grids, unstructured tetrahedral grids, or a hybrid combination of both grid types. ESAUNA is designed to solve the Euler equations or the Navier-Stokes equations, the latter in conjunction with various turbulence models. Two fundamental parallelization concepts are used, namely grid partitioning and encapsulation of communications. Grid partitioning is applied to both block-structured grid modules and unstructured grid modules. ESAUNA can also be coupled with other simulation codes for multidisciplinary computations, such as flow simulation around an aircraft coupled with flutter prediction for transient flight simulations.
Abstract:
Less than thirty years have passed since a group of graduate students and computer scientists working on a federal contract made the first successful connection between two computers located at remote sites. This group, known as the NWG (Network Working Group), comprised highly creative minds who, as soon as they began meeting, started talking about things like interactive graphics, cooperating processes, automation questions, email, and many other interesting possibilities. In 1968 the group's task was to design the first computer network; in October 1969 the first data exchange occurred, and by the end of that year a network of four computers was in operation. Since the invention of the telephone in 1876, no other technology has so revolutionized the field of communications as the computer network. Many people have made great contributions to the creation and development of the Internet; the computer network, far more complex than the telephone, is the work of people of many nationalities and cultures. Some years later, in 1973, the computer scientists Robert Kahn and Vinton Cerf created a more sophisticated communication protocol, the Transmission Control Protocol / Internet Protocol (TCP/IP), which is still in force in the Internet today.
Abstract:
Part 12: Collaboration Platforms
Abstract:
The Pierre Auger Cosmic Ray Observatory North site employs a large array of surface detector stations (tanks) to detect the secondary particle showers generated by ultra-high-energy cosmic rays. Because ultra-high-energy cosmic rays are rare, high reliability in tank communications is important to ensure that no valuable data are lost. The Auger North site employs a peer-to-peer paradigm, the Wireless Architecture for Hard Real-Time Embedded Networks (WAHREN), designed specifically for highly reliable message delivery over fixed networks under hard real-time deadlines. The WAHREN design includes two retransmission protocols, micro- and macro-retransmission. To understand fully how each retransmission protocol increases the reliability of communications, this analysis evaluated the system without either retransmission protocol (Case-0), with micro- and macro-retransmission individually (Micro and Macro), and with micro- and macro-retransmission combined. This thesis used a multimodal modeling methodology to show that a performance and reliability analysis of WAHREN was possible, and it provides the results of that analysis. A multimodal approach was necessary because these processes are driven by different mathematical models. The results of this analysis can be used as a framework for design decisions for the Auger North communication system.
Abstract:
This paper presents a novel algorithm for the gateway placement problem in Backbone Wireless Mesh Networks (BWMNs). Unlike existing algorithms, the new algorithm incrementally identifies gateways and assigns mesh routers to the identified gateways. It is guaranteed to find a feasible gateway placement satisfying Quality-of-Service (QoS) constraints, including a delay constraint, a relay-load constraint and a gateway-capacity constraint. Experimental results show that its performance is as good as that of the best existing algorithms for the gateway placement problem. Moreover, the new algorithm can be used for BWMNs that do not form one connected component, and it is easy to implement and use.
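As a rough illustration of the incremental idea (our hedged sketch, not the paper's algorithm: the selection heuristic, names and parameters below are invented, and the real algorithm also enforces a relay-load constraint), one can repeatedly pick a gateway and assign nearby unassigned routers to it under a hop (delay) bound and a capacity bound:

```python
from collections import deque

def greedy_gateway_placement(adj, max_hops, capacity):
    """Illustrative incremental gateway placement.

    adj: dict mapping each router to a list of neighbouring routers.
    max_hops: crude delay constraint (max hops from router to gateway).
    capacity: max number of routers (gateway included) per gateway.
    Returns a dict mapping each chosen gateway to its assigned routers.
    """
    unassigned = set(adj)
    placement = {}
    while unassigned:
        # Heuristic gateway choice: the unassigned router with most neighbours.
        g = max(unassigned, key=lambda v: len(adj[v]))
        assigned = [g]
        unassigned.discard(g)
        # BFS outward from g, assigning unassigned routers within the
        # hop bound until the gateway's capacity is exhausted.
        seen = {g}
        queue = deque([(g, 0)])
        while queue and len(assigned) < capacity:
            v, hops = queue.popleft()
            if hops == max_hops:
                continue
            for w in adj[v]:
                if w in seen:
                    continue
                seen.add(w)
                queue.append((w, hops + 1))
                if w in unassigned and len(assigned) < capacity:
                    assigned.append(w)
                    unassigned.discard(w)
        placement[g] = assigned
    return placement
```

Because the outer loop keeps opening new gateways until every router is assigned, a scheme of this shape also handles networks that do not form one connected component, the property the abstract highlights.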
Abstract:
This paper anatomises emerging developments in online community engagement in a major global industry: real estate. Economists argue that we are entering a ‘social network economy’ in which ‘complex social networks’ govern consumer choice and product value. In the light of this, organisations are shifting from thinking and behaving in the conventional ‘value chain’ model--in which exchanges between firms and customers are one-way only, from the firm to the consumer--to the ‘value ecology’ model, in which consumers and their networks become co-creators of the value of the product. This paper studies the way in which the global real estate industry is responding to this environment. This paper identifies three key areas in which online real estate ‘value ecology’ work is occurring: real estate social networks, games, and locative media / augmented reality applications. Uptake of real estate applications is, of course, user-driven: the paper not only highlights emerging innovations; it also identifies which of these innovations are actually being taken up by users, and the content contributed as a result. The paper thus provides a case study of one major industry’s shift into a web 2.0 communication model, focusing on emerging trends and issues.
Abstract:
This paper raises the question of whether comparative national models of communications research can be developed, along the lines of Hallin and Mancini's (2004) analysis of comparative media policy, or the work of Perraton and Clift (2004) on comparative national capitalisms. Taking consideration of communications research in Australia and New Zealand as its starting point, the paper considers the relevant variables in shaping an "intellectual milieu" for communications research in these countries, as compared to those of Europe, North America and Asia. Some possibly relevant variables include:
• Type of media system (e.g. how significant is public service media?);
• Political culture (e.g. are there significant left-of-centre political parties?);
• Dominant intellectual traditions;
• Level and types of research funding;
• Overall structure of the higher education system, and where communications sits within it.
In considering whether such an exercise can or should be undertaken, we can also evaluate, as Hallin and Mancini do, the significance of potentially homogenizing forces. These include globalization, new media technologies, and the rise of a global "audit culture". The paper raises these issues as questions that emerge as we consider, as Curran and Park (2000) and Thussu (2009) have proposed, what a "de-Westernized" media and communications research paradigm might look like.