43 results for Computer and network security
Abstract:
Increasingly, users are seen as the weak link in the chain when it comes to the security of corporate information. Should the users of computer systems act in any inappropriate or insecure manner, then they may put their employers in danger of financial losses, information degradation or litigation, and themselves in danger of dismissal or prosecution. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of inappropriate behaviours, and in so doing protecting corporate information, is the formulation and application of a formal ‘acceptable use policy’ (AUP). Whilst the AUP has attracted some academic interest, the literature has tended to be prescriptive and overly focussed on the role of the Internet, and there is relatively little empirical material that explicitly addresses the purpose, positioning or content of real acceptable use policies. The broad aim of the study reported in this paper is to fill this gap in the literature by critically examining the structure and composition of a sample of authentic policies – taken from the higher education sector – rather than simply making general prescriptions about what they ought to contain. There are two important conclusions to be drawn from this study: (1) the primary role of the AUP appears to be as a mechanism for dealing with unacceptable behaviour, rather than proactively promoting desirable and effective security behaviours, and (2) the wide variation found in the coverage and positioning of the reviewed policies is unlikely to be fostering a coherent approach to security management across the higher education sector.
Abstract:
Smart cameras allow pre-processing of video data on the camera instead of sending it to a remote server for further analysis. A network of smart cameras allows various vision tasks to be processed in a distributed fashion. While cameras may have different tasks, we concentrate on distributed tracking in smart camera networks. This application introduces several highly interesting problems. Firstly, how can conflicting goals be satisfied, such as cameras in the network trying to track objects while also trying to keep communication overhead low? Secondly, how can cameras in the network self-adapt, in response to the behavior of objects and changes in scenarios, to ensure continued efficient performance? Thirdly, how can cameras organise themselves to improve the overall network's performance and efficiency? This paper presents a simulation environment, called CamSim, that allows distributed self-adaptation and self-organisation algorithms to be tested without setting up a physical smart camera network. The simulation tool is written in Java and is hence highly portable between different operating systems. Abstracting away various computer vision and network communication problems enables a focus on implementing and testing new self-adaptation and self-organisation algorithms for cameras to use.
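To give a flavour of the kind of self-organisation algorithm such a testbed can exercise, the following Python sketch implements a toy auction-style handover between simulated cameras, in which tracking responsibility moves to whichever camera sees the object best, net of a communication penalty. The class and function names are hypothetical and do not reflect CamSim's actual API.

    # Illustrative sketch (not CamSim's API): auction-style handover in which
    # the tracking camera keeps an object unless a neighbour's bid, net of
    # communication cost, beats its own utility.
    class Camera:
        def __init__(self, cam_id, position, comm_cost=0.1):
            self.cam_id = cam_id
            self.position = position
            self.comm_cost = comm_cost  # penalty per auction message sent

        def visibility(self, obj_pos):
            # Toy utility: falls off linearly with distance from the camera.
            return max(0.0, 1.0 - 0.1 * abs(self.position - obj_pos))

    def auction_handover(owner, neighbours, obj_pos):
        own_utility = owner.visibility(obj_pos)
        bids = [(cam.visibility(obj_pos) - owner.comm_cost, cam)
                for cam in neighbours]
        best_bid, best_cam = max(bids, key=lambda b: b[0])
        return best_cam if best_bid > own_utility else owner

    cams = [Camera(i, position=10.0 * i) for i in range(4)]
    owner = cams[0]
    for obj_pos in [2.0, 12.0, 25.0]:  # an object moving through the scene
        owner = auction_handover(owner, [c for c in cams if c is not owner],
                                 obj_pos)
        print(f"object at {obj_pos}: tracked by camera {owner.cam_id}")

Because each decision compares only locally available bids, control remains fully decentralised, which is exactly the class of behaviour such a simulator is designed to study.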
Abstract:
This edited volume analyzes recent key developments in EU border management. In light of the refugee crises in the Mediterranean and the responses on the part of EU member states, this volume presents an in-depth reflection on European border practices and their political, social and economic consequences. Approaching borders as concepts in flux, the authors identify three main trends: the rise of security technologies such as the EUROSUR system, the continued externalization of EU security governance such as border mission training in third states, and the unfolding dynamics of accountability. The contributions show that internal security cooperation in Europe is far from consolidated, since both political oversight mechanisms and the definition of borders remain in flux. This edited volume makes a timely and interdisciplinary contribution to the ongoing academic and political debate on the future of open borders and legitimate security governance in Europe. It offers a valuable resource for scholars in the fields of international security and migration studies, as well as for practitioners dealing with border management mechanisms.
Abstract:
Due to the huge popularity of portable terminals based on wireless LANs and the increasing demand for multimedia services from these terminals, earlier structures and protocols are insufficient to cover the requirements of emerging networks and communications. Most research in this field seeks more efficient ways to optimize the quality of wireless LANs with respect to the requirements of multimedia services. Our work investigates the effects of modulation modes at the physical layer, retry limits at the MAC layer and packet sizes at the application layer on the quality of media packet transmission. The interrelation among these parameters, with a view to extracting a cross-layer design, is discussed as well. We show how these parameters from different layers jointly contribute to the performance of service delivery by the network. The results obtained could form a basis for independent optimization in each layer (an adaptive approach) or optimization of a set of parameters from different layers (a cross-layer approach). Our simulation model is implemented in the NS-2 simulator. Throughput and delay (latency) of packet transmission are the metrics of our assessment. © 2010 IEEE.
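To make the joint effect of these three parameters concrete, the toy link model below (a hedged sketch, not the paper's NS-2 setup; the per-mode bit error rates and MAC overhead are assumed values) computes goodput and delay for each combination of modulation mode, retry limit and packet size.

    # Toy cross-layer model: modulation sets rate and bit error rate, the MAC
    # retry limit bounds transmission attempts, and the application chooses
    # the packet size. All numbers are illustrative assumptions.
    from itertools import product

    MODES = {"BPSK": (1e6, 1e-5), "QPSK": (2e6, 1e-4), "16QAM": (4e6, 1e-3)}
    OVERHEAD_S = 200e-6  # assumed per-attempt MAC/PHY overhead

    def link_metrics(rate_bps, ber, payload_bytes, retry_limit):
        bits = 8 * payload_bytes
        per = 1.0 - (1.0 - ber) ** bits          # packet error rate
        attempts_max = retry_limit + 1
        p_deliver = 1.0 - per ** attempts_max    # delivered within the limit
        e_attempts = (1.0 - per ** attempts_max) / (1.0 - per)
        t_attempt = bits / rate_bps + OVERHEAD_S
        goodput = p_deliver * bits / (e_attempts * t_attempt)
        delay = e_attempts * t_attempt
        return goodput, delay, p_deliver

    for (mode, (rate, ber)), size, retries in product(
            MODES.items(), (256, 1024), (0, 4)):
        g, d, pd = link_metrics(rate, ber, size, retries)
        print(f"{mode:6s} size={size:5d}B retries={retries}: "
              f"goodput={g/1e6:5.2f} Mb/s delay={d*1e3:5.2f} ms "
              f"delivered={pd:.3f}")

Even this crude model shows the cross-layer trade-off: a fast, error-prone modulation mode may need smaller packets or more retries to deliver media traffic, at the cost of extra delay.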
Abstract:
In this work, we present an adaptive unequal loss protection (ULP) scheme for H.264/AVC video transmission over lossy networks. This scheme combines erasure coding, H.264/AVC error resilience techniques and importance measures in video coding. The unequal importance of the video packets is identified at the group of pictures (GOP) and H.264/AVC data partitioning levels. The presented method can adaptively assign unequal amounts of forward error correction (FEC) parity across the video packets according to the network conditions, such as the available network bandwidth, packet loss rate and average packet burst loss length. A near-optimal algorithm is developed for the FEC assignment. The simulation results show that our scheme can effectively utilize network resources such as bandwidth, while improving the quality of the video transmission. In addition, the proposed ULP strategy ensures graceful degradation of the received video quality as the packet loss rate increases. © 2010 IEEE.
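As an illustration of how unequal parity assignment can work, the sketch below greedily hands each parity packet to the data partition whose weighted residual loss drops the most, assuming independent packet losses and Reed-Solomon-style erasure decoding. This is a minimal stand-in, not the paper's near-optimal algorithm, and the partition weights are hypothetical.

    # Greedy unequal FEC allocation sketch. With RS(n, k) erasure coding a
    # block of k source packets plus `parity` parity packets fails only if
    # more than `parity` of the n packets are lost.
    from math import comb

    def block_fail_prob(k, parity, loss_rate):
        n = k + parity
        return sum(comb(n, i) * loss_rate**i * (1 - loss_rate)**(n - i)
                   for i in range(parity + 1, n + 1))

    def assign_fec(partitions, parity_budget, loss_rate):
        """partitions: list of (importance_weight, source_packet_count)."""
        parity = [0] * len(partitions)
        for _ in range(parity_budget):
            def gain(j):
                w, k = partitions[j]
                return w * (block_fail_prob(k, parity[j], loss_rate)
                            - block_fail_prob(k, parity[j] + 1, loss_rate))
            best = max(range(len(partitions)), key=gain)
            parity[best] += 1
        return parity

    # Hypothetical GOP: I-frame data weighted highest, then P, then B.
    partitions = [(10.0, 8), (3.0, 8), (1.0, 8)]
    print(assign_fec(partitions, parity_budget=12, loss_rate=0.05))

The important packets end up behind more parity, so as the loss rate rises the least important partitions fail first, which is the graceful degradation the abstract describes.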
Abstract:
The aim of this special issue is to widen the existing debates on security privatization by looking at how and why an increasing number of private actors beyond private military and/or security companies (PMSCs) have come to perform various security-related functions. While PMSCs produce security for profit, most other private sector actors make profit by selling goods and services that were originally not connected with security in the traditional understanding of the term. However, due to the continuous introduction of new legal and technical regulations by public authorities, many non-security-related private businesses nowadays have to perform at least some security functions. Little research, however, has been done thus far on either the security practices of non-security-related private businesses or their impact on security governance. This introduction explains how this special issue contributes to closing this glaring gap by 1) extending the conceptual and theoretical arguments in the existing body of literature; and 2) offering a range of original case studies on the specific roles of non-security-related private companies of all sizes, areas of business, and geographic origins.
Abstract:
Primarily targeted toward the network or MIS manager who wants to stay abreast of the latest networking technology, Enterprise Networking: Multilayer Switching and Applications offers up-to-date information relevant to the design of modern corporate networks and to the evaluation of new networking equipment. The book describes the architectures and standards of switching across the various protocol layers and also addresses issues such as multicast, quality of service, high availability and network policies that are requirements of modern switched networks.
Abstract:
The topic of bioenergy, biofuels and bioproducts remains at the top of the current political and research agenda. Identification of the optimum processing routes for biomass, in terms of efficiency, cost, environment and socio-economics, is vital as concern grows over the remaining fossil fuel resources, climate change and energy security. It is known that the only renewable way of producing conventional hydrocarbon fuels and organic chemicals is from biomass, but the problem remains of identifying the best product mix and the most efficient way of processing biomass to products. The aim is to move Europe towards a biobased economy and it is widely accepted that biorefineries are key to this development. A methodology was required for the generation and evaluation of biorefinery process chains for converting biomass into one or more valuable products, one that properly considers performance, cost, environment, socio-economics and other factors that influence the commercial viability of a process. In this thesis a methodology to achieve this objective is described. The completed methodology includes process chain generation, process modelling and subsequent analysis and comparison of results in order to evaluate alternative process routes. A modular structure was chosen to allow greater flexibility, allowing the user to generate a large number of different biorefinery configurations. The significance of the approach is that the methodology is defined and is thus rigorous and consistent, and may be readily re-examined if circumstances change. There was a requirement for consistency in structure and use, particularly for multiple analyses. It was important that analyses could be quickly and easily carried out to consider, for example, different scales, configurations and product portfolios, and so that previous outcomes could be readily reconsidered. The result of the completed methodology is the identification of the most promising biorefinery chains from those considered as part of the European Biosynergy Project.
Abstract:
For the last several years, security threats to mobile devices and platforms, including wireless networking technology, have been top security issues. A departure has occurred from the automatic anti-virus software of traditional PC defense: risk management (authentication and encryption), compliance, and disaster recovery in the face of polymorphic viruses and malware are now the primary activities within many organizations and government services alike. This chapter covers research in Turkey as a reflection of the current market – e-government started officially in 2008. It presents the situation in this emerging country and the resistance encountered while engaging with mobile and e-government interfaces. The authors contend that research is needed to understand more precisely the security threats and, most of all, the potential solutions for sustainable future intention to use m-government services. Finally, beyond the success or failure of m-government initiatives, the mechanisms related to public administration mobile technical capacity building and security issues are discussed.
Abstract:
The integration of a microprocessor and a medium-power stepper motor in one control system brings together two quite different disciplines. Various methods of interfacing are examined and the problems involved in both hardware and software manipulation are investigated. Microprocessor open-loop control of the stepper motor is considered. The possible advantages of microprocessor closed-loop control are examined and the development of a system is detailed. The system uses position feedback to initiate each motor step. Results of the dynamic response of the system are presented and its performance discussed. Applications of the static torque characteristic of the stepper motor are considered, followed by a review of methods of predicting the characteristic. This shows that accurate results are possible only when the effects of magnetic saturation are avoided or when the machine is available for magnetic circuit tests to be carried out. A new method of predicting the static torque characteristic is explained in detail. The method described uses the machine geometry and the magnetic characteristics of the iron types used in the machine. From this information the permeance of each iron component of the machine is calculated and, by using the equivalent magnetic circuit of the machine, the total torque produced is predicted. It is shown how this new method is implemented on a digital computer and how the model may be used to investigate further aspects of the stepper motor in addition to the static torque.
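As a hedged illustration of the magnetic-circuit route from permeance to torque (not the thesis's full component-by-component model), the sketch below assumes a sinusoidal air-gap permeance and a constant excitation MMF, and evaluates the static torque from the co-energy relation T = F²/2 · dP/dθ; all parameter values are assumed.

    # Static torque of a singly excited reluctance device from an assumed
    # permeance variation, via the co-energy: T = 0.5 * F^2 * dP/dtheta.
    import math

    F = 400.0                        # coil MMF in ampere-turns (assumed)
    P_MEAN, P_VAR = 2.0e-6, 0.8e-6   # permeance mean/variation in H (assumed)
    TEETH = 50                       # rotor tooth count sets the periodicity

    def permeance(theta):
        return P_MEAN + P_VAR * math.cos(TEETH * theta)

    def static_torque(theta, d=1e-6):
        # Central-difference estimate of dP/dtheta.
        dP = (permeance(theta + d) - permeance(theta - d)) / (2 * d)
        return 0.5 * F * F * dP

    for deg in range(0, 8):
        th = math.radians(deg)
        print(f"theta={deg:2d} deg  torque={static_torque(th):8.4f} N*m")

In the full model each iron component's permeance would come from the machine geometry and iron characteristics rather than an assumed cosine, but the torque computation from the equivalent circuit proceeds in the same way.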
Abstract:
We have recently proposed the framework of independent blind source separation as an advantageous approach to steganography. Amongst the several characteristics noted was a sensitivity of message reconstruction to small perturbations in the sources. This characteristic is not common in most other approaches to steganography. In this paper we discuss how this sensitivity relates to the joint diagonalisation inside the independent component approach and to the reliance on exact knowledge of secret information, and how it can be used as an additional and inherent security mechanism against malicious attempts to discover the hidden messages. The paper therefore provides an enhanced mechanism that can be used for e-document forensic analysis and can be applied to digital data media of different dimensionality. In this paper we use a low-dimensional example of biomedical time series, as might occur in the electronic patient health record, where protection of private patient information is paramount.
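A minimal numpy sketch of the underlying idea (illustrative only; the signals and mixing weights are assumptions, not the paper's exact scheme): a message is mixed into a cover time series by a secret mixing matrix, recovery with the exact key is near-perfect, and a small perturbation of the key destroys the reconstruction, which is precisely the sensitivity described above.

    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 1000)
    cover = np.sin(2 * np.pi * 5 * t)              # e.g. a biomedical trace
    message = np.sign(np.sin(2 * np.pi * 13 * t))  # hidden square wave

    A = np.array([[1.0, 0.02],     # secret key: message embedded weakly
                  [0.3, 1.0]])
    X = A @ np.vstack([cover, message])            # transmitted stego signals

    exact = np.linalg.inv(A) @ X                   # unmix with the exact key
    A_bad = A + 0.01 * rng.standard_normal(A.shape)
    wrong = np.linalg.inv(A_bad) @ X               # unmix with a wrong key

    print(f"MSE with exact key:     {np.mean((exact[1] - message)**2):.2e}")
    print(f"MSE with perturbed key: {np.mean((wrong[1] - message)**2):.2e}")

Because the message is embedded weakly relative to the cover, even a 1% error in the key lets large cover components leak into the estimate, so only the holder of the exact secret information recovers the message.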
Abstract:
The aim of this paper is to provide an overview and an analysis of recent developments and changes in the implementation of sustainability practices by food retailers. It also aims to explore whether the sustainability measurement criteria and indicators identified in the literature can be applied in practice. A literature review identified the current trends, developments and the proposed sustainability objectives, criteria and indicators. Via case study research, we collected empirical data from four retailers. This involved both qualitative and quantitative data drawn from questionnaires and in-depth interviews with logistics directors from four retailers' distribution centres. The empirical data collected from the interviews indicate similarities in some of the characteristics of distribution centres, as well as differences. However, it was difficult to make cross-company comparisons due to the absence of benchmarks or assessments of the relative importance of each sustainability criterion and indicator. This research focused only on two sustainability objectives. Further research on other sustainability objectives is therefore required. Lessons learnt from the four case studies can be taken into consideration when developing future sustainability performance rating scales. The paper provides an in-depth analysis of sustainability in the food chain, with emphasis on food retailing. Its value lies in presenting an attempt to test in practice how a number of sustainability objectives, criteria and indicators are applied in logistics-related processes, identifying the gaps and reporting the potential difficulties.
Abstract:
Visualising data for exploratory analysis is a major challenge in many applications. Visualisation allows scientists to gain insight into the structure and distribution of the data, for example finding common patterns and relationships between samples as well as variables. Typically, visualisation methods like principal component analysis and multi-dimensional scaling are employed. These methods are favoured because of their simplicity, but they cannot cope with missing data and it is difficult to incorporate prior knowledge about properties of the variable space into the analysis; this is particularly important for the high-dimensional, sparse datasets typical in geochemistry. In this paper we show how to utilise a block-structured correlation matrix using a modification of a well-known non-linear probabilistic visualisation model, the Generative Topographic Mapping (GTM), which can cope with missing data. The block structure supports direct modelling of strongly correlated variables. We show that by including prior structural information it is possible to improve both the data visualisation and the model fit. These benefits are demonstrated on artificial data as well as a real geochemical dataset used for oil exploration, where the proposed modifications improved the missing data imputation results by 3 to 13%.
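The sketch below illustrates the core intuition at a small scale (it is not the GTM modification itself): a block-structured correlation matrix encodes groups of strongly correlated variables, and a missing value is imputed by the conditional Gaussian mean given the observed variables. The block sizes and within-block correlation are assumed values.

    import numpy as np

    def block_correlation(block_sizes, within=0.8):
        """Strong within-block correlation, zero between blocks."""
        dim = sum(block_sizes)
        C = np.zeros((dim, dim))
        i = 0
        for s in block_sizes:
            C[i:i+s, i:i+s] = (1 - within) * np.eye(s) + within * np.ones((s, s))
            i += s
        return C

    rng = np.random.default_rng(1)
    C = block_correlation([3, 3, 2])
    L = np.linalg.cholesky(C)
    x = L @ rng.standard_normal(C.shape[0])   # one zero-mean sample

    miss = np.array([0, 4])                   # hide one variable per block
    obs = np.setdiff1d(np.arange(C.shape[0]), miss)
    # Conditional Gaussian mean: C_mo @ inv(C_oo) @ x_obs
    x_hat = C[np.ix_(miss, obs)] @ np.linalg.solve(C[np.ix_(obs, obs)], x[obs])
    print("true   :", np.round(x[miss], 3))
    print("imputed:", np.round(x_hat, 3))

The imputation succeeds here because each missing variable has strongly correlated block-mates among the observed ones; this is the structural prior knowledge that the modified GTM exploits.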
Abstract:
The aim of this research was to improve the quantitative support for project planning and control, principally through the use of more accurate forecasting, for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network-based risk analysis (PERT). The former, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and in practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
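As a hedged illustration of growth-curve forecasting in this spirit (a generic cubic S-curve pinned at (0,0) and (1,1), not the DHSS model itself), the sketch below fits the cumulative cost fraction observed part-way through a project and projects it to completion; the data are synthetic.

    import numpy as np

    def fit_pinned_cubic(x, y):
        """Fit y ~ x + a*x*(1-x) + b*x*(1-x)*(x-0.5); both basis terms
        vanish at x=0 and x=1, so the curve passes through (0,0) and (1,1)."""
        B = np.column_stack([x * (1 - x), x * (1 - x) * (x - 0.5)])
        coef, *_ = np.linalg.lstsq(B, y - x, rcond=None)
        return coef

    def predict(x, coef):
        a, b = coef
        return x + a * x * (1 - x) + b * x * (1 - x) * (x - 0.5)

    x_obs = np.linspace(0.05, 0.6, 12)           # 60% through the project
    true = predict(x_obs, (0.9, -0.4))           # assumed "real" profile
    y_obs = true + 0.01 * np.random.default_rng(2).standard_normal(x_obs.size)

    coef = fit_pinned_cubic(x_obs, y_obs)
    for x in (0.7, 0.8, 0.9, 1.0):
        print(f"forecast cumulative cost fraction at t/T={x:.1f}: "
              f"{predict(x, coef):.3f}")

Re-fitting as each new cost observation arrives yields a rolling forecast of the remaining spend, which is the basic mechanism a project-control aid of this kind automates.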
Abstract:
This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order, and in what quantity, are the controllable or independent variables in the cost expressions which are minimised. The four systems considered are referred to as (Q,R), (nQ,R,T), (M,T) and (M,R,T). With (Q,R) a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus on order) equals or falls below R, the re-order level. With the other three systems reviews are made only at intervals of T. With (nQ,R,T) an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, which is an integer, is chosen at the time so that the new order cover just exceeds R. In (M,T) each order increases the order cover to M. Finally, in (M,R,T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q,R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models could be compared with the increases in computational costs. Since the exact model was preferable for the (Q,R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence. However, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is allowed to be randomly distributed. All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
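For a flavour of the kind of cost expression being minimised, the sketch below implements the standard textbook approximation to the (Q,R) model with normally distributed lead-time demand and a linear backorder cost (a simpler stand-in for the exact models developed in the thesis), and grid-searches Q and R; all parameter values are illustrative.

    # Approximate annual cost of a (Q, R) policy: ordering + holding +
    # expected backorder penalty, with normal lead-time demand.
    from math import erf, exp, pi, sqrt

    def phi(z):   # standard normal pdf
        return exp(-z * z / 2) / sqrt(2 * pi)

    def Phi(z):   # standard normal cdf
        return 0.5 * (1 + erf(z / sqrt(2)))

    def qr_annual_cost(Q, R, D, K, h, p, mu_L, sigma_L):
        """D: annual demand, K: ordering cost, h: holding cost/unit/yr,
        p: backorder cost/unit, mu_L/sigma_L: lead-time demand mean/sd."""
        z = (R - mu_L) / sigma_L
        n_R = sigma_L * (phi(z) - z * (1 - Phi(z)))  # expected shortage/cycle
        return (K * D / Q                    # ordering
                + h * (Q / 2 + R - mu_L)     # cycle + safety stock holding
                + p * (D / Q) * n_R)         # expected backorder penalty

    best = min(((qr_annual_cost(Q, R, D=1200, K=50, h=2.0, p=25.0,
                                mu_L=100, sigma_L=30), Q, R)
                for Q in range(50, 401, 10) for R in range(80, 201, 5)))
    cost, Q, R = best
    print(f"grid-search optimum: Q={Q}, R={R}, expected annual cost={cost:.1f}")

The exact models in the thesis replace each of these terms with more faithful expressions (truncated demand, time-dependent backorder costs, stochastic lead times and supply quantities), which is what makes their optimisation considerably harder.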