29 results for Sharing and privacy
The effective use of implicit parallelism through the use of an object-oriented programming language
Abstract:
This thesis explores the translation of well-written sequential programs, written in a subset of the Eiffel programming language without syntactic or semantic extensions, into parallelised programs for execution on a distributed architecture. The main focus is on constructing two object-oriented models: a self-contained theoretical model of concurrency, which in turn enables a simplified second model for implementing the compilation process. A set of principles is also presented that, if followed, maximises the potential level of parallelism.

Model of Concurrency. The concurrency model is designed as a straightforward target onto which sequential programs can be mapped, thus making them parallel. It aids the compilation process by providing a high level of abstraction, including a useful model of parallel behaviour that allows message interchange, locking, and synchronisation of objects to be incorporated easily. Moreover, the model is sufficient that a compiler can be, and has been, practically built.

Model of Compilation. The compilation model's structure is based upon an object-oriented view of grammar descriptions and capitalises on both a recursive-descent style of processing and abstract syntax trees to perform the parsing. A composite-object view with an attribute-grammar style of processing is used to extract sufficient semantic information for the parallelisation (i.e. code-generation) phase.

Programming Principles. The principles presented are based upon information hiding, sharing and containment of objects, and the division of methods on the basis of a command/query distinction. When followed, they maximise the level of potential parallelism within the presented concurrency model. Furthermore, these principles arise naturally from good programming practice.

Summary. In summary, this thesis shows that it is possible to compile well-written programs, written in a subset of Eiffel, into parallel programs without any syntactic additions or semantic alterations to Eiffel: no parallel primitives are added, and the parallel program is modelled to execute with semantics equivalent to the sequential version. If the programming principles are followed, a parallelised program achieves the maximum level of potential parallelisation within the concurrency model.
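The command/query division that the programming principles rely on can be sketched in Python (the thesis itself uses Eiffel; the class and method names here are invented for illustration). Queries are side-effect free and can safely be evaluated concurrently, while commands mutate state and must be serialised on their target object:

```python
class Account:
    """Hypothetical class illustrating command/query separation.

    Queries return a value and do not mutate state; commands mutate
    state and return nothing. Under a concurrency model like the one
    described above, queries on an object can run in parallel, while
    commands require exclusive access (locking) to their target.
    """

    def __init__(self, balance=0):
        self._balance = balance

    # Query: side-effect free, safe to evaluate concurrently.
    def balance(self):
        return self._balance

    # Command: mutates state, must be serialised on this object.
    def deposit(self, amount):
        self._balance += amount

    # A method that both mutated state AND returned a value would
    # couple callers to the state transition and limit parallelism,
    # so the principle avoids such mixed methods.


acct = Account(100)
acct.deposit(50)        # command
print(acct.balance())   # query -> 150
```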
Abstract:
The research is concerned with the measurement of residents' evaluations of the environmental quality of residential areas. It reflects the increased attention being given to residents' values in planning decisions affecting the residential environment. The work was undertaken in co-operation with a local authority that was in the process of revising its housing strategy, and in particular its priorities for improvement action. The study critically examines the existing evidence on environmental values and their relationship to the environment, and points to a number of methodological and conceptual deficiencies. The research strategy developed on the basis of this review was constrained by the need to keep any survey methods simple, so that they could easily be repeated, when necessary, by the sponsoring authority. A basic perception model was assumed, and a social survey carried out to measure residents' responses to different environmental conditions. The data were assumed to have only ordinal properties, necessitating the extensive use of non-parametric statistics. Residents' expressions of satisfaction with the component elements of the environment (ranging from convenience to upkeep and privacy) were successfully related to 'objective' measures of the environment. However, the survey evidence did not justify the use of the 'objective' variables as environmental standards. A method of using the social survey data directly as an aid to decision-making is discussed. Alternative models of the derivation of overall satisfaction with the environment are tested, and the values implied by the additive model are compared with residents' preferences as measured directly in the survey. Residents' overall satisfaction with the residential environment was most closely related to their satisfaction with the "Appearance" and the "Reputation" of their areas. By contrast, the most important directly measured preference was "Friendliness of area". The differences point to the need to define concepts used in social research clearly, in operational terms, and to take care in the use of values 'measured' by different methods.
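A typical non-parametric statistic for ordinal data of this kind is Spearman's rank correlation. The sketch below uses only the standard library; the satisfaction scores are invented for illustration (1 = very dissatisfied, 5 = very satisfied) and are not the survey's actual data:

```python
def ranks(values):
    """Assign average ranks, handling ties (needed for ordinal data)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average of tied rank positions
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

upkeep  = [2, 3, 3, 4, 5, 1, 4, 5]   # satisfaction with upkeep
overall = [2, 2, 3, 4, 4, 1, 5, 5]   # overall satisfaction
print(spearman(upkeep, overall))     # a positive rank correlation
```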
Abstract:
In recent years, increased interest in introducing radio frequency identification (RFID) technology into warehousing has been observed. Early adopters of RFID reported numerous benefits, including reduced shrinkage, real-time tracking and better accuracy of data collection. Alongside the academic and industrial discussion of the benefits achievable in RFID-enabled warehouses, there are reports of issues related to the adoption of RFID technology in warehousing. This paper reviews scientific reports of RFID implementation in warehouses and discusses the adoption barriers and the causes of not achieving the technology's full potential. The following adoption barriers are identified and set in the warehousing context: lack of foreseeable return on investment (ROI), unreliable performance of RFID systems, standardisation, integration with legacy systems, and privacy/security concerns. As more studies address these challenges, the realisation of RFID benefits for warehouses will become a reality.
Abstract:
How do buyer–supplier relationships affect innovation? This study suggests that the relational exchange norms of flexibility, information sharing, and solidarity (the bright side) encourage buyer innovation. However, negative (dark side) aspects of relationships with suppliers—loss of supplier objectivity, rising buyer expectations, and supplier opportunism—may accompany the bright side and subsequently reduce buyer innovation. The study reports the simultaneous effects of the bright and dark sides on innovation and the resultant effect on supplier performance as evaluated from the buyer's perspective. Using data from the travel and computer industries, regression models reveal that the bright side encourages buyer innovation. Buyers reciprocate this support by enhancing their supplier evaluations. The findings indicate that rising buyer expectations—supposedly a dark side of relational exchange—encourage innovation, while loss of supplier objectivity reduces relationship performance. These findings imply that the bright and dark sides are not mutually exclusive dimensions of good versus bad behavior.
Abstract:
The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models and supports more effective software design in terms of understanding, sharing and reusing in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. In this paper, a methodology with tool support is proposed to automatically derive ontological metadata from formal software models and semantically describe them.
Abstract:
This paper presents innovative programs that business schools can use to reduce dependence on public funds. A review of the literature shows the theoretical and empirical foundations of higher education funding dilemmas. While higher education is moving towards a global ambition, scarcity prevents governments from fully supporting programs long-term; thus, cost-sharing and cost-shifting measures must occur for higher education to sustain current programs. In this study, we examine two universities (one U.S. and one U.K.) and provide practical summaries of programs that have provided additional funds. We show that diversity of funding sources is essential for the survival of higher education institutions. Market forces require competition to reduce higher education operational costs while providing students and corporate clients an a la carte educational experience.
Abstract:
Risk management and knowledge management have so far been studied almost independently. The evolution of risk management to the holistic view of Enterprise Risk Management requires the destruction of barriers between organizational silos and the exchange and application of knowledge from different risk management areas. However, knowledge management has received little or no attention in risk management. This paper examines possible relationships between knowledge management constructs related to knowledge sharing, and two risk management concepts: perceived quality of risk control and perceived value of enterprise risk management. From a literature review, relationships with eight knowledge management variables covering people, process and technology aspects were hypothesised. A survey was administered to risk management employees in financial institutions. The results showed that the perceived quality of risk control is significantly associated with four knowledge management variables: perceived quality of risk knowledge sharing, perceived quality of communication among people, web channel functionality, and risk management information system functionality. However, the relationships of the knowledge management variables to the perceived value of enterprise risk management are not significant. We conclude that better knowledge management is associated with better risk control, but that more effort needs to be made to break down organizational silos in order to support true Enterprise Risk Management.
Abstract:
Definitions and measures of supply chain integration (SCI) are diverse. More empirical research, with clear definitions and appropriate measures, is needed. The purpose of this article is to identify dimensions and variables for SCI and to develop an integrated framework to facilitate this. A literature review of relevant academic papers in international journals in Logistics, Supply Chain Management and Operations Management for the period 1995-2009 has been undertaken. This study reveals that information integration, coordination and resource sharing, and organisational relationship linkage are the three major dimensions of SCI. The proposed framework helps integrate both upstream suppliers and downstream customers with the focal organisation. It also allows SCI to be measured using both qualitative and quantitative approaches. This study encourages researchers and practitioners to identify dimensions and variables for SCI and analyses how SCI affects overall supply chain (SC) performance in terms of efficiency and responsiveness. Although there is extensive research in the area of SCI, a comprehensive and integrated approach is missing. This study bridges that gap by developing a framework for measuring SCI, which enables any organisation to identify the critical success factors for integrating its SC, measure the degree of integration qualitatively and quantitatively, and suggest improvement measures. © 2013 Copyright Taylor and Francis Group, LLC.
Abstract:
We describe a free-space quantum cryptography system which is designed to allow continuous unattended key exchanges for periods of several days, and over ranges of a few kilometres. The system uses a four-laser faint-pulse transmission system running at a pulse rate of 10 MHz to generate the four required alternative polarization states. The receiver module similarly automatically selects a measurement basis and performs polarization measurements with four avalanche photodiodes. The controlling software can implement the full key exchange, including the sifting, error correction, and privacy amplification required to generate a secure key.
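The sifting stage of such a four-state (BB84-style) key exchange can be sketched as follows. This is an illustrative classical simulation, not the system's actual software; error correction and privacy amplification are omitted, and the basis labels are invented:

```python
import random

# Sender and receiver each choose a random basis per pulse:
# "R" = rectilinear, "D" = diagonal. With matching bases the
# receiver recovers the sender's bit; otherwise his result is
# random and the pulse is discarded during sifting.
random.seed(1)
n = 20
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("RD") for _ in range(n)]
bob_bases   = [random.choice("RD") for _ in range(n)]

bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases (not bits) and keep only the
# positions where they agree.
sifted = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
          if ab == bb]
print(len(sifted), "sifted bits out of", n)
```

On average about half the pulses survive sifting, which is why the raw pulse rate must be much higher than the final key rate.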
Abstract:
Many software engineers have found it difficult to understand, incorporate and use different formal models consistently in the process of software development, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models and supports more effective software design in terms of understanding, sharing and reusing in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. This paper proposes a framework that allows users to interconnect knowledge about formal software models and other related documents using semantic technology. We first propose a methodology, with tool support, to automatically derive ontological metadata from formal software models and semantically describe them. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. A method, with a prototype tool, is presented to enhance semantic query of software models and other artefacts. © 2014.
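As a rough illustration of what deriving ontological metadata from a model can look like (a hypothetical toy model, not the paper's Z/OZ tooling), one might flatten a model description into RDF-style subject-predicate-object triples:

```python
# Invented toy model: names, classes and predicates are assumptions
# made for this sketch only.
model = {
    "name": "BankSystem",
    "classes": {
        "Account": {"operations": ["deposit", "withdraw"]},
        "Customer": {"operations": ["open_account"]},
    },
    "associations": [("Customer", "owns", "Account")],
}

def derive_triples(model):
    """Flatten a model description into RDF-style triples."""
    triples = [(model["name"], "rdf:type", "FormalModel")]
    for cls, info in model["classes"].items():
        triples.append((cls, "partOf", model["name"]))
        for op in info["operations"]:
            triples.append((op, "operationOf", cls))
    for subj, pred, obj in model["associations"]:
        triples.append((subj, pred, obj))
    return triples

for t in derive_triples(model):
    print(t)
```

Once in triple form, the metadata can be loaded into any standard triple store and queried semantically, which is the kind of interconnection the paper's framework targets.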
Abstract:
This work looks into video quality assessment applied to the field of telecare and proposes an alternative to the more traditionally used PSNR metric, based on the requirements of such an application. We show that the Pause Intensity metric introduced in [1] is also relevant and applicable to heterogeneous networks with a wireless last hop connected to a wired TCP backbone. We demonstrate through our emulation testbed that the impairments experienced in such a network architecture are dominated by continuity-based impairments rather than artifacts such as motion drift or blockiness. We also look into the implications of using Pause Intensity as a metric in terms of the overall video latency, which is potentially problematic should the video be sent and acted upon in real time. We conclude that Pause Intensity may be used alongside the video characteristics which have been suggested as a measure of the overall video quality. © 2012 IEEE.
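For reference, the traditional PSNR metric that Pause Intensity is compared against can be computed as follows (a minimal sketch over flat lists of 8-bit pixel values; the frame data are made up):

```python
import math

def psnr(original, degraded, peak=255):
    """Peak signal-to-noise ratio (dB) between two equally sized
    frames, given as flat lists of 8-bit pixel values."""
    mse = sum((o - d) ** 2 for o, d in zip(original, degraded)) / len(original)
    if mse == 0:
        return float("inf")          # identical frames
    return 10 * math.log10(peak ** 2 / mse)

frame_a = [10, 50, 200, 255]
frame_b = [12, 48, 199, 250]
print(round(psnr(frame_a, frame_b), 2))   # -> 38.84
```

PSNR captures per-pixel distortion only, which is exactly why it misses the continuity impairments (pauses) that dominate in the network architecture studied here.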
Abstract:
Welcome to the Second International Workshop on Multimedia Communications and Networking, held in conjunction with IUCC-2012 during 25-27 June 2012 in Liverpool, UK. MultiCom-2012 is dedicated to addressing the challenges of delivering multimedia content using modern communication and networking techniques. The multimedia networking domain emerges from the integration of multimedia content, such as audio and video, with content distribution technologies. This workshop aims to cover contributions in both design and analysis aspects in the context of multimedia, wired/wireless/heterogeneous networks, and quality evaluation. It also intends to bring together researchers and practitioners from academia and industry to share their latest achievements in this field and to establish new collaborations for future developments. All papers received were peer reviewed by three members of the Technical Programme Committee and assessed on their originality, technical quality, presentation and relevance to the theme of the workshop. Based on these criteria, four papers have been accepted for presentation at the workshop and will appear in the IUCC conference proceedings. We would like to take this opportunity to thank the IUCC-2012 Organizing Committee, the TPC members of MultiCom-2012 and the authors for their support, dedicated work and contributions. Finally, we look forward to meeting you at the workshop in Liverpool.
Abstract:
With the advent of GPS-enabled smartphones, an increasing number of users are actively sharing their location through a variety of applications and services. Along with the continuing growth of Location-Based Social Networks (LBSNs), security experts have increasingly warned the public of the dangers of exposing sensitive information such as personal location data. Most importantly, in addition to the geographical coordinates of the user's location, LBSNs allow easy access to an additional set of characteristics of that location, such as the venue type or popularity. In this paper, we investigate the role of location semantics in the identification of LBSN users. We simulate a scenario in which the attacker's goal is to reveal the identity of a set of LBSN users by observing their check-in activity. We then propose to answer the following question: what types of venues does a malicious user have to monitor to maximize the probability of success? Conversely, when should a user decide whether or not to make his or her check-in to a location public? We perform our study on more than 1 million check-ins distributed over 17 urban regions of the United States. Our analysis shows that different types of venues display different discriminative power in terms of user identity, with most of the venues in the "Residence" category providing the highest re-identification success across the urban regions. Interestingly, we also find that users with a high entropy of their check-in distribution are not necessarily the hardest to identify, suggesting that it is the collective behaviour of the users' population that determines the complexity of the identification task, rather than the individual behaviour.
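The check-in entropy mentioned above is commonly computed as the Shannon entropy of a user's distribution over venues. A minimal sketch, with hypothetical users and venue types (not the paper's dataset):

```python
import math
from collections import Counter

def checkin_entropy(checkins):
    """Shannon entropy (bits) of a user's check-in distribution.
    `checkins` is a list of venue identifiers or venue types."""
    counts = Counter(checkins)
    total = len(checkins)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical users: one concentrated on two venue types, one spread
# evenly over ten.
regular  = ["Office"] * 8 + ["Residence"] * 2
explorer = ["Office", "Residence", "Bar", "Gym", "Park",
            "Cafe", "Museum", "Airport", "Hotel", "Stadium"]
print(round(checkin_entropy(regular), 3))   # -> 0.722 (low entropy)
print(round(checkin_entropy(explorer), 3))  # -> 3.322, i.e. log2(10)
```

The paper's finding is that a high value of this quantity does not by itself make a user hard to re-identify: identifiability also depends on how many other users share the same venues.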