968 results for security evaluation
Abstract:
The topic of bioenergy, biofuels and bioproducts remains at the top of the current political and research agenda. Identification of the optimum processing routes for biomass, in terms of efficiency, cost, environment and socio-economics, is vital as concern grows over the remaining fossil fuel resources, climate change and energy security. Biomass is the only renewable route to conventional hydrocarbon fuels and organic chemicals, but the problem remains of identifying the best product mix and the most efficient way of processing biomass to products. The aim is to move Europe towards a biobased economy, and it is widely accepted that biorefineries are key to this development. A methodology was required for the generation and evaluation of biorefinery process chains for converting biomass into one or more valuable products, one that properly considers performance, cost, environment, socio-economics and other factors that influence the commercial viability of a process. This thesis describes a methodology to achieve this objective. The completed methodology includes process chain generation, process modelling and subsequent analysis and comparison of results in order to evaluate alternative process routes. A modular structure was chosen to give greater flexibility and to allow the user to generate a large number of different biorefinery configurations. The significance of the approach is that the methodology is defined and is thus rigorous and consistent, and may be readily re-examined if circumstances change. Consistency in structure and use was required, particularly for multiple analyses. It was important that analyses could be carried out quickly and easily, for example to consider different scales, configurations and product portfolios, and that previous outcomes could be readily reconsidered. The result of the completed methodology is the identification of the most promising biorefinery chains from those considered as part of the European Biosynergy Project.
Abstract:
This research describes the development of a groupware system which adds security services to a Computer Supported Cooperative Work system operating over the Internet. The security services use cryptographic techniques to provide a secure access control service and an information protection service. These services are implemented as two protection layers for the groupware system, called the External Security Layer (ESL) and the Internal Security Layer (ISL) respectively. The security services are sufficiently flexible to allow the groupware system to operate in both synchronous and asynchronous modes. The groupware system developed, known as Secure Software Inspection Groupware (SecureSIG), provides security for a distributed group performing software inspection. SecureSIG extends previous work on developing flexible software inspection groupware (FlexSIG) (Sahibuddin, 1999). The SecureSIG model extends the FlexSIG model, and the prototype system was built on top of the FlexSIG prototype by integrating existing software, communication and cryptography tools and technology. Java Cryptography Extension (JCE) and Internet technology were used to build the prototype. To test the suitability and transparency of the system, an evaluation was conducted, using a questionnaire to assess user acceptability.
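As a rough illustration of the two-layer architecture described above, the following sketch wraps an encrypting inner layer with an access-controlling outer layer. It is a minimal sketch, not the SecureSIG implementation (which used Java and JCE); the class names, the ACL check and the use of the Python `cryptography` package's Fernet recipe are all assumptions for demonstration.

```python
from cryptography.fernet import Fernet

class InternalSecurityLayer:
    """Inner layer: information protection via symmetric encryption."""
    def __init__(self, key: bytes):
        self._fernet = Fernet(key)

    def protect(self, data: bytes) -> bytes:
        return self._fernet.encrypt(data)

    def unprotect(self, token: bytes) -> bytes:
        return self._fernet.decrypt(token)

class ExternalSecurityLayer:
    """Outer layer: access control in front of the inner layer."""
    def __init__(self, allowed_users: set, isl: InternalSecurityLayer):
        self._allowed = allowed_users
        self._isl = isl

    def submit(self, user: str, data: bytes) -> bytes:
        if user not in self._allowed:
            raise PermissionError(f"{user} is not authorised")
        return self._isl.protect(data)

# Hypothetical usage: only listed inspectors may submit protected comments.
isl = InternalSecurityLayer(Fernet.generate_key())
esl = ExternalSecurityLayer({"alice", "bob"}, isl)
token = esl.submit("alice", b"inspection comment")
print(isl.unprotect(token))
```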
Abstract:
The advent of personal communication systems within the last decade has depended upon the utilization of advanced digital schemes for source and channel coding and for modulation. The inherent digital nature of the communications processing has allowed the convenient incorporation of cryptographic techniques to implement security in these communications systems. There are various security requirements, of both the service provider and the mobile subscriber, which may be provided for in a personal communications system. Such security provisions include the privacy of user data, the authentication of communicating parties, the provision for data integrity, and the provision for both location confidentiality and party anonymity. This thesis investigates the private-key and public-key cryptographic techniques pertinent to the security requirements of personal communication systems, and presents an analysis of the security provisions of Second-Generation personal communication systems. Particular attention has been paid to the properties of the cryptographic protocols employed in current Second-Generation systems. It has been found that certain security-related protocols implemented in the Second-Generation systems have specific weaknesses. A theoretical evaluation of these protocols has been performed using formal analysis techniques, and certain assumptions made during the development of the systems are shown to contribute to the security weaknesses. Various attack scenarios which exploit these protocol weaknesses are presented. The Fiat-Shamir zero-knowledge cryptosystem is presented as an example of how asymmetric algorithm cryptography may be employed as part of an improved security solution. Various modifications to this cryptosystem have been evaluated, and their critical parameters are shown to be capable of being optimized to suit a particular application. The implementation of such a system using current smart card technology has been evaluated.
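For readers unfamiliar with the Fiat-Shamir scheme mentioned above, here is a minimal, textbook single-round identification sketch in Python; the toy modulus and secret are illustrative assumptions only, far too small for real use, which also requires many rounds.

```python
import random

def fiat_shamir_round(n: int, s: int, v: int) -> bool:
    """One identification round: the prover shows knowledge of s, where v = s^2 mod n."""
    r = random.randrange(1, n)               # prover's ephemeral secret
    x = pow(r, 2, n)                         # commitment sent to verifier
    e = random.randrange(2)                  # verifier's challenge bit
    y = (r * pow(s, e, n)) % n               # prover's response
    return pow(y, 2, n) == (x * pow(v, e, n)) % n   # verifier's check

# Hypothetical toy parameters (a real deployment needs a large modulus n = p*q).
n = 61 * 53          # tiny modulus, demonstration only
s = 123              # prover's secret
v = pow(s, 2, n)     # public value registered with the verifier
assert all(fiat_shamir_round(n, s, v) for _ in range(20))
print("prover accepted in all rounds")
```

Soundness rests on the verifier's check: y² = r²·s²ᵉ = x·vᵉ (mod n), which a cheating prover can satisfy for at most one of the two challenge values per round.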
Abstract:
A method is proposed to offer privacy in computer communications, using symmetric product block ciphers. The security protocol involves a cipher negotiation stage, in which two communicating parties privately select a cipher from a public cipher space. The cipher negotiation process includes an on-line cipher evaluation stage, in which the cryptographic strength of the proposed cipher is estimated. The cryptographic strength of the ciphers is measured by confusion and diffusion, and a method is proposed to describe these two properties quantitatively. For the calculation of confusion and diffusion a number of parameters are defined, such as the confusion and diffusion matrices and the marginal diffusion. These parameters involve computationally intensive calculations that are performed off-line, before any communication takes place. Once calculated, they are used to obtain estimation equations for on-line, fast evaluation of the confusion and diffusion of the negotiated cipher. A technique proposed in this thesis describes how to calculate the parameters and how to use the results for fast estimation of confusion and diffusion for any cipher instance within the defined cipher space.
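The confusion and diffusion matrices above are specific to the thesis, but the underlying idea can be illustrated with a simpler, standard avalanche measurement. The Python sketch below estimates diffusion as the mean fraction of output bits that flip when a single input bit of a hypothetical, deliberately weak toy cipher is flipped; the cipher and the trial count are assumptions for demonstration.

```python
import os

def toy_cipher(block: int, key: int) -> int:
    """Hypothetical 16-bit toy cipher (illustration only, not secure)."""
    x = (block ^ key) & 0xFFFF
    x = ((x << 5) | (x >> 11)) & 0xFFFF   # 16-bit rotation
    x ^= x >> 3                           # light mixing
    return x & 0xFFFF

def estimate_diffusion(cipher, key: int, trials: int = 10_000) -> float:
    """Mean fraction of output bits flipped per single-bit input change."""
    flipped = 0
    for _ in range(trials):
        p = int.from_bytes(os.urandom(2), "big")   # random plaintext block
        bit = 1 << (os.urandom(1)[0] % 16)         # random bit to flip
        flipped += bin(cipher(p, key) ^ cipher(p ^ bit, key)).count("1")
    return flipped / (trials * 16)

print(estimate_diffusion(toy_cipher, key=0xC3A5))  # ideal diffusion is ~0.5
```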
Abstract:
In this paper a methodology is presented for evaluating the information security of objects under attack when processed by compression methods. Two basic parameters for evaluating the information security of objects, TIME and SIZE, are chosen, and the characteristics that affect their evaluation are analyzed and estimated. A coefficient of information security of an object is proposed as the mean of the coefficients of the parameters TIME and SIZE. From the simulation experiments that were carried out, the methods with the highest coefficient of information security were determined. Assessments and conclusions for future investigations are proposed.
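A minimal sketch of how such a coefficient could be computed and used to rank methods is shown below; the method names, the [0, 1] normalization and the numbers are assumptions for illustration, not values from the paper.

```python
# Hypothetical per-method coefficients, normalized to [0, 1] where higher
# means better protection (the paper's normalization may differ).
methods = {
    "method_A": {"k_time": 0.80, "k_size": 0.60},
    "method_B": {"k_time": 0.55, "k_size": 0.90},
    "method_C": {"k_time": 0.70, "k_size": 0.65},
}

def security_coefficient(k_time: float, k_size: float) -> float:
    """Coefficient of information security: mean of the TIME and SIZE coefficients."""
    return (k_time + k_size) / 2

best = max(methods, key=lambda m: security_coefficient(**methods[m]))
print(best, security_coefficient(**methods[best]))
```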
Abstract:
The main requirements for DRM platforms, namely implementing an effective user experience and strong security measures to prevent unauthorized use of content, are discussed. A comparison of hardware-based and software-based platforms is made, showing the general inherent advantages of hardware DRM solutions. An analysis and evaluation of the main flaws of hardware platforms is conducted, pointing out possibilities for overcoming them. An overview of the existing concepts for practical realization of hardware DRM protection reveals their advantages and disadvantages, and the increasing demand for a multi-core architecture that could assure effective DRM protection without decreasing the user's freedom or introducing risks to end-system security.
Abstract:
This report examines important issues pertaining to the different ways in which the information security of file objects is affected under information attacks when compression methods are applied. Accordingly, the report analyzes the three-way relationships which may exist among a selected set of attacks, methods and objects. A methodology is thus proposed for the evaluation of information security, and a coefficient of information security is defined. With respect to this coefficient, using different criteria and methods for the evaluation and selection of alternatives, the lowest-risk methods of compression are selected.
Abstract:
Over the past two decades, the Democratic People's Republic of Korea has allegedly developed nuclear energy while suffering near collapse caused by catastrophic economic policies. This article presents an evaluation of North Korea's contemporary energy policies and suggests that despite retaining communist ideals and "Chu'che" policies, North Korea has slowly started to modernise its energy sector and recognises the necessity of engaging with the international community. While it is argued that Pyongyang's newfound concerns for sustainable development, equity and the environment are a welcome departure from its usual belligerent rhetoric and present a number of exciting engagement opportunities, the regime has not abandoned its nuclear energy programme.
Abstract:
Acknowledgements This work contributes to the ELUM (Ecosystem Land Use Modelling & Soil Carbon GHG Flux Trial) project, which was commissioned and funded by the Energy Technologies Institute (ETI). We acknowledge the E-OBS data set from the EU-FP6 project ENSEMBLES (http://ensembles-eu.metoffice.com) and the data providers in the ECA&D project (http://www.ecad.eu).
Abstract:
Acknowledgements We are grateful to Dr. Jens Strauss and the other two anonymous reviewers for their insightful comments on an earlier version of this manuscript, and we thank the members of the IBCAS Sampling Campaign Teams for their assistance with the field investigation. This work was supported by the National Basic Research Program of China on Global Change (2014CB954001 and 2015CB954201), the National Natural Science Foundation of China (31322011 and 41371213), and the Thousand Young Talents Program.
Abstract:
Increasing research has highlighted the effects of changing climates on the occurrence and prevalence of toxigenic Aspergillus species producing aflatoxins. There is concern about the toxicological effects on human health and animal productivity following acute and chronic exposure, which may affect the future ability to provide safe and sufficient food globally. Considerable research has focused on the detection of these toxins, based on the physicochemical and biochemical properties of the aflatoxin compounds, in agricultural products for human and animal consumption. As improvements in food security continue, more regulations for acceptable levels of aflatoxins have arisen globally, the most stringent in Europe. These regulations are important for developing countries, as aflatoxin occurrence is high, significantly affecting international trade and the economy. In developed countries analytical approaches have become highly sophisticated, capable of attaining results with high precision and accuracy, suitable for regulatory laboratories. Regrettably, many countries that are affected by aflatoxin contamination do not have the resources for high-tech HPLC and MS instrumentation and require more affordable, yet robust and equally accurate, alternatives that may be used by producers, processors and traders in emerging economies. It is especially important that those companies wishing to exploit the opportunities offered by lucrative but highly regulated markets in the developed world have access to analytical methods that will ensure that their exports meet their customers' quality and safety requirements.
This work evaluates the ToxiMet system as an alternative approach to UPLC–MS/MS for the detection and determination of aflatoxins relative to current European regulatory standards. Four commodities were analysed for natural aflatoxin contamination: rice grain, maize (cracked and flour), peanut paste, and dried distillers grains. For B1 and total aflatoxins determination the qualitative correlation, above or below the regulatory limit, was good for all commodities, with the exception of the dried distillers grain samples for B1, for which no calibration existed. For B1 the quantitative R² correlations were 0.92, 0.92, 0.88 (<250 μg/kg) and 0.70 for the rice, maize, peanut and dried distillers grain samples respectively, whereas for total aflatoxins the quantitative correlations were 0.92, 0.94, 0.88 and 0.91. The ToxiMet system could be used as an alternative for aflatoxin analysis under current legislation, but some consideration should be given to aflatoxin M1 regulatory levels for these commodities, considering the high levels detected in this study, especially for maize and peanuts.
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
In the past decades, social-ecological systems (SESs) worldwide have undergone dramatic transformations with often detrimental consequences for livelihoods. Although resilience thinking offers promising conceptual frameworks to understand SES transformations, empirical resilience assessments of real-world SESs are still rare because SES complexity requires integrating knowledge, theories, and approaches from different disciplines. Taking up this challenge, we empirically assess the resilience of a South African pastoral SES to drought using various methods from natural and social sciences. In the ecological subsystem, we analyze rangelands’ ability to buffer drought effects on forage provision, using soil and vegetation indicators. In the social subsystem, we assess households’ and communities’ capacities to mitigate drought effects, applying agronomic and institutional indicators and benchmarking against practices and institutions in traditional pastoral SESs. Our results indicate that a decoupling of livelihoods from livestock-generated income was initiated by government interventions in the 1930s. In the post-apartheid phase, minimum-input strategies of herd management were adopted, leading to a recovery of rangeland vegetation due to unintentionally reduced stocking densities. Because current livelihood security is mainly based on external monetary resources (pensions, child grants, and disability grants), household resilience to drought is higher than in historical phases. Our study is one of the first to use a truly multidisciplinary resilience assessment. Conflicting results from partial assessments underline that measuring narrow indicator sets may impede a deeper understanding of SES transformations. The results also imply that the resilience of contemporary, open SESs cannot be explained by an inward-looking approach because essential connections and drivers at other scales have become relevant in the globalized world. Our study thus has helped to identify pitfalls in empirical resilience assessment and to improve the conceptualization of SES dynamics.
Abstract:
We propose three research problems to explore the relations between trust and security in the setting of distributed computation. In the first problem, we study trust-based adversary detection in distributed consensus computation. The adversaries we consider behave arbitrarily, disobeying the consensus protocol. We propose a trust-based consensus algorithm with local and global trust evaluations. The algorithm can be abstracted using a two-layer structure, with the top layer running a trust-based consensus algorithm and the bottom layer as a subroutine executing a global trust update scheme. We utilize a set of pre-trusted nodes, headers, to propagate local trust opinions throughout the network. This two-layer framework is flexible in that it can easily be extended to incorporate more complicated decision rules and global trust schemes. The first problem assumes that normal nodes are homogeneous, i.e., it is guaranteed that a normal node always behaves as it is programmed. In the second and third problems, however, we assume that nodes are heterogeneous, i.e., given a task, the probability that a node generates a correct answer varies from node to node. The adversaries considered in these two problems are workers from the open crowd who either invest little effort in the tasks assigned to them or intentionally give wrong answers to questions. In the second part of the thesis, we consider a typical crowdsourcing task that aggregates input from multiple workers as a problem in information fusion. To cope with the issue of noisy and sometimes malicious input from workers, trust is used to model workers' expertise. In a multi-domain knowledge learning task, however, using scalar-valued trust to model a worker's performance is not sufficient to reflect the worker's trustworthiness in each of the domains. To address this issue, we propose a probabilistic model to jointly infer the multi-dimensional trust of workers, the multi-domain properties of questions, and the true labels of questions. Our model is very flexible and extensible to incorporate metadata associated with questions. To show that, we further propose two extended models, one of which handles input tasks with real-valued features and the other of which handles tasks with text features by incorporating topic models. Our models can effectively recover trust vectors of workers, which can be very useful for task assignment adaptive to workers' trust in the future. These results can be applied to the fusion of information from multiple data sources such as sensors, human input, machine learning results, or a hybrid of them. In the second subproblem, we address crowdsourcing with adversaries under logical constraints. We observe that questions are often not independent in real-life applications; instead, there are logical relations between them. Similarly, workers that provide answers are not independent of each other either: answers given by workers with similar attributes tend to be correlated. Therefore, we propose a novel unified graphical model consisting of two layers. The top layer encodes domain knowledge, allowing users to express logical relations using first-order logic rules, and the bottom layer encodes a traditional crowdsourcing graphical model. Our model can be seen as a generalized probabilistic soft logic framework that encodes both logical relations and probabilistic dependencies. To solve the collective inference problem efficiently, we have devised a scalable joint inference algorithm based on the alternating direction method of multipliers.
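As a rough illustration of the trust-based consensus idea in the first problem, the Python sketch below runs a consensus iteration in which each node weights its neighbours' values by a local trust score that decays for nodes reporting outliers. The update rule, the median-based trust heuristic and all numbers are assumptions for demonstration; the thesis's global trust scheme with pre-trusted headers is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10
adversary = 7
x = rng.normal(50.0, 5.0, n)        # initial values held by the nodes
trust = np.ones((n, n))             # local trust opinions, initially neutral

for step in range(50):
    x_bcast = x.copy()
    x_bcast[adversary] = rng.uniform(0.0, 1000.0)   # adversary ignores the protocol
    med = np.median(x_bcast)
    for i in range(n):
        # Decay trust in neighbours whose broadcast is far from the local median.
        trust[i] = 0.9 * trust[i] + 0.1 * (np.abs(x_bcast - med) < 10.0)
        w = trust[i] / trust[i].sum()
        x[i] = w @ x_bcast           # trust-weighted consensus update

honest = np.delete(x, adversary)
print("spread among honest nodes:", float(np.ptp(honest)))
```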
The third part of the thesis considers the problem of optimal assignment under budget constraints when workers are unreliable and sometimes malicious. In a real crowdsourcing market, each answer obtained from a worker incurs a cost. The cost is associated with both the level of trustworthiness of workers and the difficulty of tasks. Typically, access to expert-level (more trustworthy) workers is more expensive than to the average crowd, and completion of a challenging task is more costly than a click-away question. We address the optimal assignment of heterogeneous tasks to workers of varying trust levels under budget constraints. Specifically, we design a trust-aware task allocation algorithm that takes as inputs the estimated trust of workers and a pre-set budget, and outputs the optimal assignment of tasks to workers. We derive a bound on the total error probability that relates naturally to the budget, the trustworthiness of the crowd, and the costs of obtaining labels. A higher budget, more trustworthy crowds, and less costly jobs result in a lower theoretical bound. Our allocation scheme does not depend on the specific design of the trust evaluation component, and can therefore be combined with generic trust evaluation algorithms.
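A minimal sketch of budget-constrained, trust-aware assignment is shown below; the greedy value-per-cost rule, the worker pool and the chance-level baseline of 0.5 are illustrative assumptions, not the thesis's allocation algorithm or its error bound.

```python
from dataclasses import dataclass

@dataclass
class Worker:
    name: str
    trust: float   # estimated probability of answering correctly
    cost: float    # price charged per answered task

# Hypothetical worker pool, tasks and budget (illustrative numbers only).
pool = [
    Worker("expert", trust=0.95, cost=5.0),
    Worker("crowd_a", trust=0.75, cost=1.0),
    Worker("crowd_b", trust=0.65, cost=0.8),
    Worker("spammer", trust=0.50, cost=0.5),   # chance-level answers
]
tasks = [f"task_{i}" for i in range(12)]
budget = 20.0

def value_per_cost(w: Worker) -> float:
    # Reward trust above the 0.5 chance level for binary questions, per unit
    # of cost; a chance-level spammer therefore scores zero.
    return (w.trust - 0.5) / w.cost

assignment = []
for task in tasks:
    affordable = [w for w in pool if w.cost <= budget]
    if not affordable:
        break                      # budget exhausted
    best = max(affordable, key=value_per_cost)
    assignment.append((task, best.name))
    budget -= best.cost

print(assignment)
```

A real market scheme would additionally cap per-worker load and re-estimate trust as answers arrive; the greedy rule here only conveys the budget/trust/cost trade-off.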
Abstract:
Malicious users try to compromise systems using new techniques. One recent technique used by attackers is to perform complex distributed attacks, such as denial of service, and to obtain sensitive data such as password information. Compromised machines of this kind are said to be infected with malicious software termed a "bot". In this paper, we investigate the correlation of behavioural attributes, such as keylogging and packet-flooding behaviour, to detect the existence of a single bot on a compromised machine by applying (1) the Spearman's rank correlation (SRC) algorithm and (2) the Dendritic Cell Algorithm (DCA). We also compare the output results generated by these two methods for the detection of a single bot. The results show that the DCA has better performance in detecting malicious activities.
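A minimal sketch of the rank-correlation idea is shown below: given hypothetical per-interval counts of keystroke-capture events and outbound packets from a monitored host, a high Spearman correlation between the two streams is taken as evidence of coordinated bot behaviour. The data, the threshold of 0.7 and the significance level are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical per-interval event counts from a monitored machine.
keylog_events = np.array([2, 0, 1, 14, 18, 22, 3, 1, 25, 19])
packets_sent = np.array([40, 35, 38, 310, 420, 500, 52, 41, 610, 450])

rho, p_value = spearmanr(keylog_events, packets_sent)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")

# Flag coordinated keylogging and flooding behaviour (thresholds are
# illustrative assumptions, not tuned values).
if rho > 0.7 and p_value < 0.05:
    print("Correlated behaviour: possible bot on this host")
```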