863 results for Business enterprises -- Computer networks
Abstract:
In dynamic and uncertain environments, where the needs of security and information availability are difficult to balance, an access control approach based on a static policy will be suboptimal regardless of how comprehensive it is. Risk-based approaches to access control attempt to address this problem by allocating a limited budget to users, through which they pay for the exceptions they deem necessary. So far the primary focus has been on how to incorporate the notion of budget into access control, rather than on whether there is an optimal amount of budget to allocate to users. In this paper we discuss the problems that arise from a sub-optimal allocation of budget and introduce a generalised characterisation of an optimal budget allocation function that maximises an organisation's expected benefit in the presence of self-interested employees and costly audits.
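The trade-off this abstract describes can be made concrete with a toy model. Everything below (the square-root value function, the misuse rate, the audit cost, and all numbers) is an illustrative assumption, not the paper's characterisation; the sketch only shows why an interior optimum for the budget can exist.

```python
# Hypothetical sketch: expected organisational benefit as a function of the
# per-user exception budget, and a grid search for the optimum. All
# parameters and functional forms are illustrative assumptions.

def expected_benefit(budget, value_per_unit=10.0, misuse_rate=0.05,
                     misuse_cost=40.0, audit_cost_per_unit=1.5):
    """Value gained from granted exceptions (with diminishing returns),
    minus expected misuse losses, minus the cost of auditing them."""
    gained = value_per_unit * budget ** 0.5   # diminishing returns
    lost = misuse_rate * misuse_cost * budget # misuse scales with budget
    audit = audit_cost_per_unit * budget
    return gained - lost - audit

def optimal_budget(candidates):
    """Pick the candidate budget that maximises expected benefit."""
    return max(candidates, key=expected_benefit)

best = optimal_budget([b / 2 for b in range(0, 41)])  # budgets 0.0 .. 20.0
```

Under these assumed parameters the optimum is interior: a zero budget forgoes all exception value, while a very large budget loses more to misuse and audit than the exceptions are worth.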
Abstract:
Real-time networked control systems (NCSs) over data networks are being increasingly implemented on a massive scale in industrial applications. Along with this trend, wireless network technologies have been promoted for modern wireless NCSs (WNCSs). However, popular wireless network standards such as IEEE 802.11/15/16 are not designed for real-time communications. Key issues in real-time applications include limited transmission reliability and poor transmission delay performance. Considering the unique features of real-time control systems, this paper develops a conditional retransmission enabled transport protocol (CRETP) to improve the delay performance of the transmission control protocol (TCP) and also the reliability performance of the user datagram protocol (UDP) and its variants. Key features of the CRETP include a connectionless mechanism with acknowledgement (ACK), conditional retransmission and detection of ineffective data packets on the receiver side.
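The conditional-retransmission idea can be sketched in a few lines. The function names and the timing model below are assumptions for illustration, not the CRETP specification:

```python
# Illustrative sketch of CRETP-style behaviour: the sender retransmits an
# unacknowledged packet only if it could still arrive before the sample's
# deadline, and the receiver discards packets that arrive too late or are
# older than data already delivered ("ineffective" packets).

def should_retransmit(now, deadline, est_one_way_delay):
    """Retransmit only if the packet can still be useful on arrival."""
    return now + est_one_way_delay < deadline

def receiver_accept(arrival_time, deadline, latest_seq, seq):
    """Receiver-side detection of ineffective packets: reject late data
    and samples older than the newest sample already delivered."""
    return arrival_time < deadline and seq > latest_seq
```

This captures why the scheme sits between TCP and UDP: unlike TCP it never retransmits data that can no longer meet its deadline, and unlike plain UDP it does retransmit data that still can.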
Abstract:
Popular wireless networks, such as IEEE 802.11/15/16, are not designed for real-time applications. Thus, supporting real-time quality of service (QoS) in wireless real-time control is challenging. This paper adopts the widely used IEEE 802.11, focusing on its distributed coordination function (DCF), for soft-real-time control systems. The concept of the critical real-time traffic condition is introduced to characterize the marginal satisfaction of real-time requirements. Then, mathematical models are developed to describe the dynamics of DCF-based real-time control networks with periodic traffic, a unique feature of control systems. Performance indices such as throughput and packet delay are evaluated using the developed models, particularly under the critical real-time traffic condition. Finally, the proposed modelling is applied to traffic rate control for cross-layer networked control system design.
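The critical real-time traffic condition can be illustrated with a deliberately simplified saturation check; the paper's analytical DCF model additionally accounts for backoff, collisions and retransmissions, all of which this toy bound ignores:

```python
# Toy saturation check for periodic traffic on a shared medium, assuming
# ideal scheduling. Real DCF behaviour (backoff, collisions) only tightens
# this bound; the figures used here are illustrative.

def channel_utilisation(num_nodes, airtime_s, period_s):
    """Fraction of channel time consumed when each of `num_nodes`
    stations sends one packet per sampling period."""
    return num_nodes * airtime_s / period_s

def below_critical_condition(num_nodes, airtime_s, period_s):
    """True while periodic traffic has not yet saturated the channel,
    i.e. real-time requirements are satisfied with margin to spare."""
    return channel_utilisation(num_nodes, airtime_s, period_s) < 1.0
```

For example, 10 stations each needing 1 ms of airtime every 20 ms sampling period use half the channel; 30 such stations would exceed it, violating the condition.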
Abstract:
The final shape of the "Internet of Things" ubiquitous computing promises relies on a cybernetic system of inputs (in the form of sensory information), computation or decision making (based on the prefiguration of rules, contexts, and user-generated or defined metadata), and outputs (associated action from ubiquitous computing devices). My interest in this paper lies in the computational intelligences that suture these positions together, and how positioning these intelligences as autonomous agents extends the dialogue between human-users and ubiquitous computing technology. Drawing specifically on the scenarios surrounding the employment of ubiquitous computing within aged care, I argue that agency is something that cannot be traded without serious consideration of the associated ethics.
Abstract:
Purpose – The work presented in this paper aims to provide an approach to classifying web logs by personal properties of users. Design/methodology/approach – The authors describe an iterative system that begins with a small set of manually labeled terms, which are used to label queries from the log. A set of background knowledge related to these labeled queries is acquired by combining web search results on these queries. This background set is used to obtain many terms that are related to the classification task. The system then ranks each of the related terms, choosing those that best fit the personal properties of the users. These terms are then used to begin the next iteration. Findings – The authors identify the difficulties of classifying web logs by approaching this problem from a machine learning perspective. By applying the approach developed, the authors are able to show that many queries in a large query log can be classified. Research limitations/implications – Testing results in this type of classification work is difficult, as the true personal properties of web users are unknown. Evaluating the classification results by comparing classified queries to well-known age-related sites is a direction currently being explored. Practical implications – This research is background work that can be incorporated in search engines or other web-based applications, to help marketing companies and advertisers. Originality/value – This research enhances the current state of knowledge in short-text classification and query log learning.
Keywords: Classification schemes, Computer networks, Information retrieval, Man-machine systems, User interfaces
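One iteration of the bootstrapping loop described above might be sketched as follows; the simple co-occurrence scoring is a stand-in assumption for the paper's ranking against background knowledge gathered from web search results:

```python
# Hypothetical sketch of one bootstrapping iteration: seed terms label
# queries, related terms are ranked by co-occurrence with labeled queries,
# and the top-ranked terms would seed the next round.

from collections import Counter

def label_queries(queries, seed_terms):
    """Label every query that contains at least one seed term."""
    return [q for q in queries if any(t in q for t in seed_terms)]

def rank_related_terms(labeled_queries, stopwords=frozenset({"the", "for"})):
    """Rank candidate terms by how often they appear in labeled queries."""
    counts = Counter(w for q in labeled_queries for w in q.split()
                     if w not in stopwords)
    return [w for w, _ in counts.most_common()]

queries = ["pension advice", "pension age calculator", "retirement age"]
labeled = label_queries(queries, {"pension"})
ranked = rank_related_terms(labeled)
```

In a full iteration, newly ranked terms such as "age" would be added to the seed set, letting the system pick up "retirement age" in the next pass.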
Abstract:
Google, Facebook, Twitter, LinkedIn, etc. are some of the prominent large-scale digital service providers that are having a tremendous impact on societies, corporations and individuals. However, despite the rapid uptake and their obvious influence on the behavior of individuals and the business models and networks of organizations, we still lack a deeper, theory-guided understanding of the related phenomenon. We use Teece's notion of complementary assets and extend it towards 'digital complementary assets' (DCA) in an attempt to provide such a theory-guided understanding of these digital services. Building on Teece's theory, we make three contributions. First, we offer a new conceptualization of digital complementary assets in the form of digital public goods and digital public assets. Second, we differentiate three models for how organizations can engage with such digital complementary assets. Third, we find that the user base is a critical factor when considering appropriability.
Abstract:
This thesis addresses one of the fundamental issues that remains unresolved in patent law today. It is a question that strikes at the heart of what a patent is and what it is supposed to protect. That question is whether an invention must produce a physical effect or cause a physical transformation of matter to be patentable, or whether it is sufficient that an invention involves a specific practical application of an idea or principle to achieve a useful result. In short, the question is whether patent law contains a physicality requirement. Resolving this issue will determine whether only traditional mechanical, industrial and manufacturing processes are patent eligible, or whether patent eligibility extends to include purely intangible, or non-physical, products and processes. To this end, this thesis seeks to identify where the dividing line lies between patentable subject matter and the recognised categories of excluded matter, namely, fundamental principles of nature, physical phenomena, and abstract ideas. It involves determining which technological advances are worth the inconvenience monopoly protection causes the public at large, and which should remain free for all to use without restriction. This is an issue that has important ramifications for innovation in the ‘knowledge economy’ of the Information Age. Determining whether patent law contains a physicality requirement is integral to deciding whether much of the valuable innovation we are likely to witness, in what are likely to be the emerging areas of technology in the near future, will receive the same encouragement as industrial and manufacturing advances of previous times.
Abstract:
Advanced substation applications, such as synchrophasors and IEC 61850-9-2 sampled value process buses, depend upon highly accurate synchronizing signals for correct operation. The IEEE 1588 Precision Time Protocol (PTP) is the recommended means of providing precise timing for future substations. This paper presents a quantitative assessment of PTP reliability using Fault Tree Analysis. Two network topologies are proposed that use grandmaster clocks with dual network connections and take advantage of the Best Master Clock Algorithm (BMCA) from IEEE 1588. The cross-connected grandmaster topology doubles reliability, and the addition of a shared third grandmaster gives a nine-fold improvement over duplicated grandmasters. The performance of BMCA-mediated handover of the grandmaster role during contingencies in the timing system was evaluated experimentally. The 1 µs performance requirement of sampled values and synchrophasors is met, even during network or GPS antenna outages. Slave clocks are shown to synchronize to the backup grandmaster in response to degraded performance or loss of the main grandmaster. Slave disturbances are less than 350 ns provided the grandmaster reference clocks are not offset from one another. A clear understanding of PTP reliability and the factors that affect availability will encourage the adoption of PTP for substation time synchronization.
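The reliability arithmetic behind redundant grandmasters follows standard fault-tree gates. The failure probabilities below are illustrative placeholders, not the paper's measured values:

```python
# Back-of-the-envelope fault-tree gates for redundant grandmaster clocks,
# assuming independent failures. The per-grandmaster unavailability is an
# illustrative assumption.

def and_gate(*probs):
    """Failure probability when the system fails only if ALL inputs fail
    (redundant components)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """Failure probability when ANY single input failure is fatal
    (components in series)."""
    ok = 1.0
    for p in probs:
        ok *= 1.0 - p
    return 1.0 - ok

p_gm = 1e-3                           # assumed grandmaster unavailability
dual = and_gate(p_gm, p_gm)           # duplicated grandmasters
triple = and_gate(p_gm, p_gm, p_gm)   # with a shared third grandmaster
```

Under the independence assumption, each added grandmaster multiplies the timing-loss probability by its own unavailability, which is why a shared third grandmaster improves so sharply on a duplicated pair.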
Abstract:
The Internet is one of the most significant information and communication technologies to emerge at the end of the last century. It created new and effective means by which individuals and groups communicate. These advances led to marked institutional changes, most notably in the realm of commercial exchange: the Internet not only provided a high-speed communication infrastructure to business enterprises; it also opened them to a global consumer base where they could market their products and services. Commercial interests have gradually dominated Internet technology over the past several years and have been a factor in the growth of its user population and the enhancement of its infrastructure. Such commercial interests fitted comfortably within the structures of the Philippine government. As revealed in the study, state policies and programs make use of Internet technology as an enabler of commercial institutional reforms using traditional economic measures. Yet, despite efforts to maximize the Internet as an enabler of market-driven economic growth, the accrued benefits are yet to come about; the Internet is largely present only in major urban areas and accessible to a small number of social groups. The failure of the Internet's developmental capability can be traced back to the government's wholesale adoption of a commerce-centered discourse. The Internet's developmental gains (i.e. instrumental, communicative and emancipatory) and features, present since its inception, have been visibly left out in favor of its commercial value. By employing synchronic and diachronic analysis, it can be shown that the Internet can be a vital technology in promoting genuine social development in the Philippines. In general, the object is to realize a social environment geared towards a more inclusive and participatory application of Internet technology, one equally aware of the caveats and risks the technology may pose.
It is argued further that there is a need for continued social scientific research regarding the social and developmental implications of Internet technology for local-level structures, such as social sectors, specific communities and organizations. On the meta-level, the approach employed in this research can be a modest attempt at increasing the calculus of hope, especially among marginalized Filipino sectors, through the use of information and communications technologies. This emerging field of study (tentatively called Progressive Informatics) must emanate from the more enlightened social sectors, namely non-government, academic and locally-based organizations.
Abstract:
Threats against computer networks evolve very fast and require increasingly complex countermeasures. We argue that teams, or groups with a common purpose, for intrusion detection and prevention improve the measures against rapidly propagating attacks, analogous to the concept of teams solving complex tasks known from the sociology of work. Collaboration in this sense is not an easy task, especially in heterarchical environments. We propose CIMD (collaborative intrusion and malware detection), a security overlay framework that enables cooperative intrusion detection approaches. Objectives and associated interests are used to create detection groups for the exchange of security-related data. In this work, we contribute a tree-oriented data model for device representation in the scope of security. We introduce an algorithm for the formation of detection groups, show realization strategies for the system, and conduct a vulnerability analysis. We evaluate the benefit of CIMD through simulation and probabilistic analysis.
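Objective-driven group formation can be sketched as matching device attributes against a group's interest. The flat attribute dictionaries below are a simplifying assumption standing in for CIMD's tree-oriented device model:

```python
# Hypothetical sketch of detection-group formation: devices advertise
# attributes, and a group collects every device whose attributes satisfy
# the group's interest predicate.

def form_group(devices, interest):
    """Return the detection group for one interest: the ids of all
    devices matching every requested key/value pair."""
    return [d["id"] for d in devices
            if all(d.get(k) == v for k, v in interest.items())]

devices = [
    {"id": "a", "os": "linux", "service": "smtp"},
    {"id": "b", "os": "linux", "service": "http"},
    {"id": "c", "os": "windows", "service": "http"},
]
http_group = form_group(devices, {"service": "http"})
```

Devices in the same group would then exchange security-related observations, e.g. all HTTP servers sharing indicators of a web-borne attack regardless of operating system.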
Abstract:
The emergence of global computer networks and the ubiquitous availability of advanced information and communication technology (ICT) since the mid-nineties gave rise to the hope that the traditional disadvantages faced by regional economies and regional communities could be alleviated easily and swiftly. Yet, the experience of both community informatics and community development researchers and practitioners tells a different tale. Although the potential of ICT is in fact realised in some situations and locations, and does provide means to ensure sustainability in some regional communities, elsewhere it has not been taken up or has not been able to bring about the promised change for the better. Too many communities still face a centralised structure in the context of commerce, service provision or governance, and various degrees of digital divides: between connected and disconnected, between media literate and illiterate, between young and old, and between urban and rural. Many attempts to close or bridge the digital divide have been reported, with varying degrees of success (cf. Menou, 2001; Servon, 2002). Most of these accounts echo a common voice in that they report similar principles of action, and they reflect, in most cases unconsciously, practices of sociocultural animation. This article seeks to shed light on the concept of sociocultural animation, which is already commonplace in various forms in the arts, in education and professional development, youth work, sports, town planning, careers services, entrepreneurship and tourism. It starts by exploring the origins of sociocultural animation and draws parallels to the current state of research and practice. It unpacks the foundations of sociocultural animation and briefly describes its underlying principles and how they can be applied in the context of community informatics and developing regional communities with ICT. Finally, further areas of investigation are proposed.
Abstract:
CSR is increasingly an essential issue for business enterprises. It is a company and multidimensional organizational ...
Abstract:
Understanding network traffic behaviour is crucial for managing and securing computer networks. One important technique is to mine frequent patterns or association rules from analysed traffic data. On the one hand, association rule mining usually generates a huge number of patterns and rules, many of them meaningless or unwanted by users; on the other hand, it can miss necessary knowledge if it does not consider the hierarchy relationships in the network traffic data. To address these issues, this paper proposes a hybrid association rule mining method for characterizing network traffic behaviour. Rather than frequent patterns, the proposed method generates non-similar closed frequent patterns from network traffic data, which significantly reduces the number of patterns. The method also derives new attributes from the original data to discover novel knowledge according to hierarchy relationships in network traffic data and user interests. Experiments performed on real network traffic data show that the proposed method is promising and can be used in real applications. Copyright © 2013 John Wiley & Sons, Ltd.
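The closed-pattern filter the method relies on can be shown on a toy traffic log: among frequent itemsets, only those with no superset of equal support are kept. The brute-force enumeration and the example transactions below are illustrative, not the paper's algorithm:

```python
# Sketch of closed frequent pattern mining on a toy "traffic" log, where
# each transaction is the set of protocol features seen in one flow.

from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Brute-force enumeration of all itemsets meeting min_support."""
    items = sorted({i for t in transactions for i in t})
    freq = {}
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            support = sum(1 for t in transactions if set(combo) <= t)
            if support >= min_support:
                freq[frozenset(combo)] = support
    return freq

def closed_patterns(freq):
    """Keep only patterns with no frequent superset of equal support."""
    return {p: s for p, s in freq.items()
            if not any(p < q and s == sq for q, sq in freq.items())}

transactions = [{"tcp", "http"}, {"tcp", "http"}, {"tcp", "dns"}]
freq = frequent_itemsets(transactions, min_support=2)
closed = closed_patterns(freq)
```

Here {http} is dropped because {http, tcp} has the same support, while {tcp} survives with its higher support; this is how the closed-pattern representation shrinks the output without losing support information.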
Abstract:
This thesis examines the value of political connections for business groups by constructing a unique dataset that allows us to identify the form and extent of the connections. Results show that firms' membership in family-controlled business groups (South Korean chaebol) plays a key role in determining the value of political connections. Politically connected chaebol firms experience substantially larger price increases following the establishment of the connection than other firms, but the reverse is found for other (non-family-controlled) connected business groups.
Abstract:
A Remote Sensing Core Curriculum (RSCC) development project is currently underway. The project is being conducted under the auspices of the National Center for Geographic Information and Analysis (NCGIA). RSCC is an outgrowth of the NCGIA GIS Core Curriculum project and grew out of discussions begun at NCGIA Initiative 12 (I-12): 'Integration of Remote Sensing and Geographic Information Systems'. The curriculum development project focuses on providing professors, teachers and instructors at undergraduate and graduate institutions with course materials from experts in specific subject-matter areas for use in the classroom.