351 results for internet computing
Abstract:
Intelligent Transport Systems (ITS) resembles the infrastructure for ubiquitous computing in the car. It encompasses a) all kinds of sensing technologies within vehicles as well as road infrastructure, b) wireless communication protocols for the sensed information to be exchanged between vehicles (V2V) and between vehicles and infrastructure (V2I), and c) appropriate intelligent algorithms and computational technologies that process these real-time streams of information. As such, ITS can be considered a game changer: like the Internet itself, it provides the fundamental basis for new, innovative concepts and applications. The information sensed or gathered within or around the vehicle has led to a variety of context-aware in-vehicle technologies. A simple example is the Anti-lock Braking System (ABS), which releases the brakes when sensors detect that the wheels are locked. We refer to this type of context awareness as vehicle/technology awareness. V2V and V2I communication, often summarized as V2X, enables the exchange and sharing of sensed information among cars. As a result, the vehicle/technology awareness horizon of each individual car is expanded beyond its observable surroundings, paving the way to technologically enhance such already advanced systems. In this chapter, we draw attention to those application areas of sensing and V2X technologies where the human (driver), the human's behavior, and hence the psychological perspective play a more pivotal role. The focal points of our project are illustrated in Figure 1: in all areas, the vehicle first (1) gathers or senses information about the driver. Rather than limiting the use of such information to vehicle/technology awareness, we see great potential for applications in which this sensed information is then (2) fed back to the driver for increased self-awareness. In addition, by using V2V technologies, it can also be (3) passed to surrounding drivers for increased social awareness or (4) pushed even further into the cloud, where it is collected and visualized for an increased, collective urban awareness within the urban community at large, which includes all city dwellers.
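As a rough illustration of the four awareness levels above, the following Python sketch traces one sensed driver reading through the feedback, V2V, and cloud stages. Every class and function name here is a hypothetical stand-in invented for this sketch, not the chapter's implementation.

```python
# A rough illustration of the four-step flow; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class DriverReading:
    driver_id: str
    heart_rate_bpm: int   # example of sensed driver state
    speed_kmh: float

def feed_back_to_driver(r: DriverReading) -> str:
    # (2) Self-awareness: return the sensed state to the driver.
    return f"Heart rate {r.heart_rate_bpm} bpm at {r.speed_kmh} km/h"

def broadcast_v2v(r: DriverReading, nearby_vehicles: list[str]) -> dict:
    # (3) Social awareness: share the reading with surrounding drivers.
    return {vehicle: r for vehicle in nearby_vehicles}

def push_to_cloud(readings: list[DriverReading]) -> dict:
    # (4) Urban awareness: aggregate readings for city-wide visualization.
    return {"mean_speed_kmh": sum(r.speed_kmh for r in readings) / len(readings)}
```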
Abstract:
Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time, text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each one. We consider this problem in this paper. We propose an algorithm to discover all groups of users that are engaged in conversation. Our algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying some graph algorithms to clean the statistical information. We present experimental results which indicate that one can automatically determine the conversing groups in a chatroom purely on the basis of statistical analysis.
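To make the purely statistical idea concrete, here is a minimal Python sketch of one plausible reading of it: users whose posts repeatedly fall within a short time window of one another are linked, weak links are pruned (the graph-cleaning step), and connected components approximate conversing groups. The window size and threshold are illustrative assumptions, not the paper's tuned parameters.

```python
# Minimal sketch: link users whose posts co-occur within a short window,
# prune weak links (graph cleaning), then take connected components as
# candidate conversing groups. Parameter values are illustrative.
from collections import defaultdict

def conversing_groups(posts, window=10.0, min_cooccur=3):
    """posts: list of (timestamp, user) tuples, sorted by timestamp."""
    weight = defaultdict(int)
    for i, (t_i, u_i) in enumerate(posts):
        j = i + 1
        while j < len(posts) and posts[j][0] - t_i <= window:
            if posts[j][1] != u_i:
                weight[frozenset((u_i, posts[j][1]))] += 1
            j += 1
    graph = defaultdict(set)
    for pair, w in weight.items():          # keep only strong edges
        if w >= min_cooccur:
            a, b = tuple(pair)
            graph[a].add(b)
            graph[b].add(a)
    seen, groups = set(), []
    for user in graph:                      # connected components
        if user not in seen:
            stack, comp = [user], set()
            while stack:
                u = stack.pop()
                if u not in comp:
                    comp.add(u)
                    stack.extend(graph[u] - comp)
            seen |= comp
            groups.append(comp)
    return groups
```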
Abstract:
Despite the compelling case for moving towards cloud computing, the upstream oil & gas industry faces several technical challenges—most notably, a pronounced emphasis on data security, a reliance on extremely large data sets, and significant legacy investments in information technology infrastructure—that make a full migration to the public cloud difficult at present. Private and hybrid cloud solutions have consequently emerged within the industry to yield as much benefit from cloud-based technologies as possible while working within these constraints. This paper argues, however, that the move to private and hybrid clouds will very likely prove only to be a temporary stepping stone in the industry's technological evolution. By presenting evidence from other market sectors that have faced similar challenges in their journey to the cloud, we propose that enabling technologies and conditions will probably fall into place in a way that makes the public cloud a far more attractive option for the upstream oil & gas industry in the years ahead. The paper concludes with a discussion about the implications of this projected shift towards the public cloud, and calls for more of the industry's services to be offered through cloud-based “apps.”
Abstract:
The Internet is one of the most significant information and communication technologies to emerge at the end of the last century. It created new and effective means by which individuals and groups communicate. These advances led to marked institutional changes, most notably in the realm of commercial exchange: the Internet not only provided a high-speed communication infrastructure to business enterprises; it also opened them to a global consumer base where they could market their products and services. Commercial interests have gradually come to dominate Internet technology over the past several years and have been a factor in the growth of its user population and the enhancement of its infrastructure. Such commercial interests fitted comfortably within the structures of the Philippine government. As revealed in the study, state policies and programs make use of Internet technology as an enabler of commercial institutional reforms using traditional economic measures. Yet, despite efforts to maximize the Internet as an enabler of market-driven economic growth, the accrued benefits have yet to come about; the Internet is largely present only in major urban areas and accessible to a small number of social groups. The failure of the Internet's developmental capability can be traced back to the government's wholesale adoption of commercial-centered discourse. The Internet's developmental gains (i.e. instrumental, communicative and emancipatory) and features, which have been there since its inception, have visibly been left out in favor of its commercial value. By employing synchronic and diachronic analysis, it can be shown that the Internet can be a vital technology in promoting genuine social development in the Philippines. In general, the objective is to realize a social environment geared towards a more inclusive and participatory application of Internet technology, while remaining aware of the caveats and risks the technology may pose. It is argued further that there is a need for continued social scientific research regarding the social and developmental implications of Internet technology at local-level structures such as social sectors, specific communities and organizations. On the meta-level, the approach employed in this research can be a modest attempt at increasing the calculus of hope, especially among marginalized Filipino sectors, through the use of information and communications technologies. This emerging field of study, tentatively called Progressive Informatics, must emanate from the more enlightened social sectors, namely non-government, academic and locally-based organizations.
Abstract:
Internet chatrooms are common means of interaction and communication, and they carry valuable information about formal or ad-hoc formation of groups with diverse objectives. This work presents a fully automated surveillance system for data collection and analysis in Internet chatrooms. The system has two components: first, an eavesdropping tool which collects statistics on individual (chatter) and chatroom behavior, data that can be used to profile a chatroom and its chatters; and second, a computational discovery algorithm based on Singular Value Decomposition (SVD) to locate hidden communities and communication patterns within a chatroom. The eavesdropping tool is used for fine-tuning the SVD-based discovery algorithm, which can be deployed in real time and requires no semantic information processing. The evaluation of the system on real data shows that (i) statistical properties of different chatrooms vary significantly, so profiling is possible, and (ii) the SVD-based algorithm achieves up to 70-80% accuracy in discovering groups of chatters.
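A minimal sketch of how such an SVD-based grouping might look, assuming a chatter-by-time-bin activity matrix; the bin width, number of singular vectors, and sign-pattern clustering are illustrative choices, not the paper's algorithm.

```python
# Minimal sketch: build a chatter-by-time-bin activity matrix, factor it,
# and cluster chatters by the sign pattern of their leading left singular
# vectors. Bin width and k are illustrative assumptions.
import numpy as np

def svd_groups(posts, users, bin_width=30.0, k=2):
    """posts: list of (timestamp, user); users: list of user names."""
    n_bins = int(max(t for t, _ in posts) // bin_width) + 1
    index = {u: i for i, u in enumerate(users)}
    A = np.zeros((len(users), n_bins))
    for t, u in posts:
        A[index[u], int(t // bin_width)] += 1.0   # activity counts
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    groups = {}
    for u, i in index.items():
        groups.setdefault(tuple(np.sign(U[i, :k])), []).append(u)
    return list(groups.values())
```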
Abstract:
In the modern connected world, pervasive computing has become a reality. Thanks to the ubiquity of mobile computing devices and emerging cloud-based services, users stay permanently connected to their data. This introduces a slew of new security challenges, including the problems of multi-device key management and single-sign-on architectures. One solution to this problem is the use of secure side channels for authentication, including the visual channel as a proof of vicinity. However, existing approaches often assume confidentiality of the visual channel, or provide only insufficient means of mitigating a man-in-the-middle attack. In this work, we introduce QR-Auth, a two-step, 2D-barcode-based authentication scheme for mobile devices which aims specifically at key management and key sharing across devices in a pervasive environment. It requires minimal user interaction and therefore provides better usability than most existing schemes, without compromising security. We show how our approach fits into existing authorization delegation and one-time-password generation schemes, and that it is resilient to man-in-the-middle attacks.
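The paper's exact protocol is not reproduced here, but the following Python sketch shows the general shape of a two-step, challenge-bound handshake in its spirit: a fresh nonce is displayed as a QR code, and the scanning device answers with a MAC over that nonce. The HMAC construction and pre-shared key are assumptions made for illustration.

```python
# Sketch of a two-step, challenge-bound handshake in the spirit of
# QR-Auth. HMAC and the pre-shared key are illustrative assumptions;
# the paper's actual protocol may differ.
import base64, hashlib, hmac, os

SHARED_KEY = os.urandom(32)   # key already provisioned on the trusted device

def make_challenge() -> bytes:
    # Step 1: a fresh nonce, shown to the user as a QR code.
    return os.urandom(16)

def sign_challenge(nonce: bytes, key: bytes = SHARED_KEY) -> str:
    # Step 2: the scanning device answers with a MAC over the nonce,
    # binding the response to this session and thwarting simple replay.
    return base64.b64encode(hmac.new(key, nonce, hashlib.sha256).digest()).decode()

def verify(nonce: bytes, response: str, key: bytes = SHARED_KEY) -> bool:
    return hmac.compare_digest(sign_challenge(nonce, key), response)

nonce = make_challenge()
assert verify(nonce, sign_challenge(nonce))
```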
Abstract:
This book develops tools and techniques that will help urban residents gain access to urban computing. Metaphorically speaking, it is taking computing to the street by giving the general public – rather than just researchers and professionals – the power to leverage available city infrastructure and create solutions tailored to their individual needs. It brings together five chapters that are based on presentations given at the Street Computing Workshop held on 24 November 2009 in Melbourne in conjunction with the Australian Computer-Human Interaction Conference (OZCHI 2009). This book focuses on applying urban informatics, urban and community sensing and open application programming interfaces (APIs) to the public space through the delivery of online services, on demand and in real time. It then offers a case study of how the city of Singapore has harnessed the potential of an online infrastructure so that residents and visitors can access services electronically. This book was published as a special issue of the Journal of Urban Technology, 19(2), 2012.
Abstract:
The importance of language for Internet and Society: developing a language-intelligent approach
Abstract:
Pro-anorexia Internet sites aim to promote, support and discuss anorexia nervosa. Media coverage has raised concerns that sites may increase the level of eating disorders. This research examines the meaning of participation in a pro-anorexia Internet site and its relationship with disordered eating by using an interpretative phenomenological analysis of fifteen separate message ‘threads’ followed over a six-week period. Four themes were identified: (1) tips and techniques; (2) ‘ana’ v. anorexia nervosa; (3) social support; and (4) need for anorexia. Findings suggest participation was multi-purpose, providing a coping function in relation to weight loss, and the contribution of sites to increased levels of eating disorders is not inevitable.
Abstract:
Stigmergy is a biological term originally used when discussing insect or swarm behaviour; it describes a model of environment-based communication in which artefacts are separated from agents. The phenomenon is demonstrated by ants and their food foraging supported by pheromone trails, and similarly by termites in their nest-building process. What is interesting about this mechanism is that highly organized societies are formed without an apparent central management function. We see design features in Web sites that mimic stigmergic mechanisms as part of the user interface, and we have created generalizations of these patterns. Software development and Web site development techniques have evolved significantly over the past 20 years. Recent progress in this area has produced languages for modelling web applications that accommodate the nuances specific to such developments. These modelling languages provide a suitable framework for building reusable components that encapsulate our design patterns of stigmergy. We hypothesize that incorporating stigmergy as a feature separate from a site's primary function will ultimately lead to enhanced user coordination.
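A minimal sketch of one stigmergic UI pattern of the kind described, assuming a "pheromone" counter with evaporation: items are reinforced by visits and decay over time, so a ranking emerges from collective behaviour without central curation. The names and the decay constant are invented for illustration.

```python
# Sketch of a "pheromone" ranking: visits reinforce an item, evaporation
# fades unvisited items, and the ordering emerges without central
# curation. Names and the decay constant are invented for illustration.

DECAY = 0.9   # fraction of pheromone retained per time step

class StigmergicListing:
    def __init__(self, items):
        self.pheromone = {item: 0.0 for item in items}

    def visit(self, item):
        # A user interaction deposits pheromone on the artefact.
        self.pheromone[item] += 1.0

    def tick(self):
        # Evaporation: trails fade unless reinforced.
        for item in self.pheromone:
            self.pheromone[item] *= DECAY

    def ranked(self):
        return sorted(self.pheromone, key=self.pheromone.get, reverse=True)
```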
Abstract:
For any discipline to be regarded as a professional undertaking, by which its members may be treated as true "professionals" in a specific area, practitioners must clearly understand that discipline's history, the place and significance of that history in current practice, and its relevance to the technologies and artefacts available at the time. This is common in many professional disciplines, such as medicine, pharmacy, engineering and law, but not yet, this paper submits, in information technology. Based on twenty-five years of experience in developing and delivering cybersecurity courses at undergraduate and postgraduate levels, this paper proposes a rationale and a set of differing perspectives for planning and developing curricula for the delivery of appropriate courses on the history of cybersecurity or information assurance to information and communications technology (ICT) students, and thus to potential information technology professionals.
Abstract:
Every day we are confronted with social interactions with other people. Our social life is characterised by norms that manifest as attitudinal and behavioural uniformities among people. With greater awareness of our social context, we can interact more efficiently. Any theory or model of human interaction that fails to include social concepts arguably lacks a critical element. This paper identifies, from an interdisciplinary perspective, the social concepts that need to be supported by future context-aware systems. It discusses the limitations of existing context-aware systems in supporting social psychology theories related to the identification and membership of social groups. We argue that social norms are among the core modelling concepts that future context-aware systems need to capture, with a view to supporting and enhancing social interactions. A detailed summary of social psychology theory relevant to social computing is given, followed by a formal definition of the process by which each such norm is advertised and acquired. The social concepts identified in this paper could be used to simulate agent interactions imbued with social norms, or to use ICT to facilitate, assist, enhance or understand social interactions. They could also be used in modelling virtual communities, where the social awareness of a community, as well as the process of joining and exiting a community, is important.
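A minimal sketch of how the advertise/acquire cycle might be modelled, under assumed class names and a set-of-strings norm representation; this is an illustration, not the paper's formal definition.

```python
# Sketch of the advertise/acquire cycle under assumed names: a group
# advertises its norms, and an agent acquires them on joining. The
# set-of-strings norm representation is an illustrative simplification.

class SocialGroup:
    def __init__(self, name, norms):
        self.name = name
        self.norms = set(norms)       # e.g. {"silence phones"}
        self.members = set()

    def advertise(self):
        # The group's behavioural uniformities, made visible to outsiders.
        return set(self.norms)

    def join(self, agent):
        self.members.add(agent)
        agent.acquire(self.advertise())

class Agent:
    def __init__(self, name):
        self.name = name
        self.active_norms = set()

    def acquire(self, norms):
        # Membership triggers adoption of the advertised norms.
        self.active_norms |= norms

cinema = SocialGroup("cinema", {"silence phones", "stay seated"})
alice = Agent("alice")
cinema.join(alice)
assert "silence phones" in alice.active_norms
```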
Abstract:
Elliptic curve pairings are undoubtedly the most powerful known primitive in public-key cryptography. Upon their introduction just over ten years ago, the computation of pairings was far too slow for them to be considered a practical option. This prompted a vast amount of research by mathematicians and computer scientists around the globe aiming to improve the speed of this computation. From the use of modern results in algebraic and arithmetic geometry to the application of foundational number theory dating back to the days of Gauss and Euler, cryptographic pairings have since experienced a great deal of improvement. As a result, what was an extremely expensive computation that took several minutes is now a high-speed operation that takes less than a millisecond. This thesis presents a range of optimisations to the state of the art in cryptographic pairing computation. Both by extending prior techniques and by introducing several novel ideas of our own, our work has contributed to record-breaking pairing implementations.
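For context, the property that makes pairings so powerful is their bilinearity together with non-degeneracy; these are standard definitions, not results of this thesis:

```latex
% Bilinearity and non-degeneracy of a cryptographic pairing
% (standard definitions, not results of this thesis):
\[
  e \colon \mathbb{G}_1 \times \mathbb{G}_2 \to \mathbb{G}_T, \qquad
  e(aP,\, bQ) = e(P, Q)^{ab} \quad \text{for all integers } a, b,
\]
\[
  e(P, Q) \neq 1 \quad \text{for generators } P \in \mathbb{G}_1,\; Q \in \mathbb{G}_2.
\]
```

In practice the computation splits into a Miller loop followed by a final exponentiation, and optimisations of the kind surveyed here typically target both stages.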
Abstract:
The objective of this PhD research program is to investigate numerical methods for simulating variably-saturated flow and sea water intrusion in coastal aquifers in a high-performance computing environment. The work is divided into three overlapping tasks: to develop an accurate and stable finite volume discretisation and numerical solution strategy for the variably-saturated flow and salt transport equations; to implement the chosen approach in a high-performance computing environment that may have multiple GPUs or CPU cores; and to verify and test the implementation. The geological description of aquifers is often complex, with porous materials possessing highly variable properties that are best described using unstructured meshes. The finite volume method is a popular method for the solution of the conservation laws that describe sea water intrusion, and is well suited to unstructured meshes. In this work we apply a control volume-finite element (CV-FE) method to an extension of a recently proposed formulation (Kees and Miller, 2002) for variably-saturated groundwater flow. The CV-FE method evaluates fluxes at points where material properties and gradients in pressure and concentration are consistently defined, making it both suitable for heterogeneous media and mass conservative. Using the method of lines, the CV-FE discretisation gives a set of differential algebraic equations (DAEs) amenable to solution using higher-order implicit solvers. Heterogeneous computer systems, which combine computational hardware such as CPUs and GPUs, are attractive for scientific computing due to the potential advantages offered by GPUs for accelerating data-parallel operations. We present a C++ library that implements data-parallel methods on both CPUs and GPUs. The finite volume discretisation is expressed in terms of these data-parallel operations, which gives an efficient implementation of the nonlinear residual function. This makes the implicit solution of the DAE system possible on the GPU, because the inexact Newton-Krylov method used by the implicit time stepping scheme can approximate the action of a matrix on a vector using residual evaluations. We also propose preconditioning strategies that are amenable to GPU implementation, so that all computationally intensive aspects of the implicit time stepping scheme run on the GPU. Results are presented that demonstrate the efficiency and accuracy of the proposed numerical methods and formulation. The formulation offers excellent conservation of mass, and higher-order temporal integration increases both the numerical efficiency and the accuracy of the solutions. Flux limiting produces accurate, oscillation-free solutions on coarse meshes, where much finer meshes are required to obtain solutions of equivalent accuracy using upstream weighting. The computational efficiency of the software is investigated using CPUs and GPUs on a high-performance workstation. The GPU version offers considerable speedup over the CPU version, with one GPU giving a speedup factor of 3 over the eight-core CPU implementation.
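The matrix-free trick mentioned above can be illustrated in a few lines: a Newton-Krylov solver never forms the Jacobian, approximating its action on a vector by a finite difference of two residual evaluations. The Python below is a generic sketch with a stand-in residual, not the thesis's C++ library.

```python
# Generic sketch of the matrix-free Jacobian-vector product that lets a
# Newton-Krylov solver run from residual evaluations alone. The residual
# F below is an illustrative stand-in, not the thesis's C++ library.
import numpy as np

def jacobian_vector_product(F, u, v, eps=1e-7):
    """Approximate J(u) @ v by the finite difference (F(u+eps*v)-F(u))/eps."""
    return (F(u + eps * v) - F(u)) / eps

F = lambda u: u**3 - 1.0                 # stand-in residual, F(u) = 0
u = np.array([1.2, 0.8])
v = np.array([1.0, 0.0])
print(jacobian_vector_product(F, u, v))  # ~ [3 * 1.2**2, 0] = [4.32, 0]
```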