927 results for security network
Abstract:
For over 150 years Australia has exported bulk, undifferentiated commodities such as wool, wheat, meat and sugar to the UK and, more recently, to Japan, Korea, and the Middle East. It is estimated that, each year, Australia's farming system feeds a domestic population of some 22 million people while exporting enough food to feed another 40 million. With the Australian population expected to double in the next 40 years, and with the world's population anticipated to grow from its present level of 7 billion to some 9 billion over the same period, there are strong incentives for an expansion of food production in Australia. Neoliberal settings are encouraging this expansion at the same time as they are facilitating the importation of foods, higher levels of foreign direct investment and the commoditisation of resources (such as water). Yet expansion in food production, particularly in an era of climate change, will continue to compromise the environment. After discussing Australia's neoliberal framework and its relation to farming, this paper outlines how Australia is attempting to address the issue of food security. It argues that the productivist farming approaches favoured by both industry and government are proving incapable of bringing about long-term production outcomes that will guarantee national food security.
Abstract:
Biodiesel, produced from renewable feedstock, represents a more sustainable source of energy and will therefore play a significant role in providing the energy requirements for transportation in the near future. Chemically, all biodiesels are fatty acid methyl esters (FAME), produced from raw vegetable oil and animal fat. However, clear differences in chemical structure are apparent from one feedstock to the next in terms of chain length, degree of unsaturation, number of double bonds and double bond configuration, all of which determine the fuel properties of biodiesel. In this study, prediction models were developed to estimate the kinematic viscosity of biodiesel using an Artificial Neural Network (ANN) modelling technique. While developing the model, 27 parameters based on the chemical composition commonly found in biodiesel were used as the input variables, and kinematic viscosity of biodiesel was used as the output variable. The data needed to develop and simulate the network were collected from more than 120 published peer-reviewed papers. The Neural Networks Toolbox of MATLAB R2012a was used to train, validate and simulate the ANN model on a personal computer. The network architecture and learning algorithm were optimised by trial and error to obtain the best prediction of kinematic viscosity. The predictive performance of the model was determined by calculating the coefficient of determination (R²), root mean squared (RMS) error and maximum average error percentage (MAEP) between predicted and experimental results. The study found high predictive accuracy of the ANN in predicting the fuel properties of biodiesel and has demonstrated the ability of the ANN model to find a meaningful relationship between biodiesel chemical composition and fuel properties. Therefore, the model developed in this study can be a useful tool for accurately predicting biodiesel fuel properties instead of undertaking costly and time-consuming experimental tests.
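As a rough illustration of the modelling approach described above (the authors used MATLAB's Neural Networks Toolbox; this sketch instead uses scikit-learn, with synthetic data and 27 placeholder composition features standing in for the real dataset):

```python
# A minimal sketch, not the authors' model: a feed-forward ANN mapping biodiesel
# composition descriptors to kinematic viscosity, trained on synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(120, 27))        # 27 hypothetical composition descriptors
y = 3.5 + 1.2 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(0, 0.05, 120)  # toy viscosity (mm^2/s)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(25,), activation="tanh",
                     solver="lbfgs", max_iter=5000, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("R^2 :", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```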
Abstract:
“Supermax” prisons, conceived by the United States in the early 1980s, are typically reserved for convicted political criminals such as terrorists and spies and for other inmates who are considered to pose a serious ongoing threat to the wider community, to the security of correctional institutions, or to the safety of other inmates. Prisoners are usually restricted to their cells for up to twenty-three hours a day and typically have minimal contact with other inmates and correctional staff. Not only does the Federal Bureau of Prisons operate one of these facilities, but almost every state has either a supermax wing or a stand-alone supermax prison. The Globalization of Supermax Prisons examines why nine advanced industrialized countries have adopted the supermax prototype, paying particular attention to the economic, social, and political processes that have affected each state. Featuring essays that look at the U.S.-run prisons of Abu Ghraib and Guantánamo, this collection seeks to determine if the American model is the basis for the establishment of these facilities and considers such issues as the support or opposition to the building of a supermax and why opposition efforts failed; the allegation of human rights abuses within these prisons; and the extent to which the decision to build a supermax was influenced by developments in the United States. Additionally, contributors address such domestic matters as the role of crime rates, media sensationalism, and terrorism in each country’s decision to build a supermax prison.
Abstract:
In Responsibility to Protect and Women, Peace and Security: Aligning the Protection Agendas, editors Davies, Nwokora, Stamnes and Teitt address the intersections of the Responsibility to Protect (R2P) principle and the Women, Peace, and Security (WPS) agenda. Widespread or systematic sexual or gender-based violence is a war crime, a crime against humanity and an act of genocide, all of which are clearly addressed in the R2P principle. The protection of those at risk of widespread sexual violence is therefore not only relevant to the WPS agenda, but a fundamental sovereign obligation for all states as part of their commitment to R2P. Contributions from policy-makers and academics consider both the merits and the utility of aligning the protection agendas of R2P and WPS. Ultimately, a number of actionable recommendations are made concerning a unification of the agendas to best support the global empowerment of women and the prevention of mass atrocities.
Abstract:
This research introduces a general methodology for creating a Coloured Petri Net (CPN) model of a security protocol. Standard or user-defined security properties of the created CPN model are then identified. After adding an attacker model to the protocol model, the security property is verified using the state space method. This approach is applied to analyse a number of trusted computing protocols. The results show the applicability of the proposed method for analysing both standard and user-defined properties.
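The abstract gives no implementation details; the hypothetical sketch below only illustrates the general pattern of adding an attacker to a protocol model and exhaustively checking a secrecy property over the reachable state space (the actual work uses CPN models, not Python):

```python
# A toy transition system for a deliberately flawed protocol, explored breadth-first
# to find a state where the attacker has learned the session secret.
from collections import deque

SECRET = "k_session"

def initial_state():
    # (alice_step, bob_step, messages on the network, attacker knowledge)
    return (0, 0, frozenset(), frozenset())

def successors(state):
    a, b, net, know = state
    succ = []
    if a == 0:                        # Alice sends the session key in the clear (the flaw)
        succ.append((1, b, net | {SECRET}, know))
    if b == 0 and SECRET in net:      # Bob receives it
        succ.append((a, 1, net, know))
    if net - know:                    # the attacker observes anything on the network
        succ.append((a, b, net, know | net))
    return succ

def violates_secrecy(state):
    return SECRET in state[3]         # attacker knows the secret

def explore():
    seen, queue = set(), deque([initial_state()])
    while queue:
        s = queue.popleft()
        if s in seen:
            continue
        seen.add(s)
        if violates_secrecy(s):
            return s                  # counterexample found
        queue.extend(successors(s))
    return None

print("secrecy violated in state:", explore())
```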
Abstract:
This paper proposes a new distributed coordination approach to load leveling using Energy Storage Units (ESUs) in a Low Voltage (LV) network. The proposed distributed control strategy is based on a consensus algorithm that shares the required active power among the ESUs in proportion to their ratings. To show the effectiveness of the proposed approach, a typical radial LV network is simulated as a case study.
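As a hedged sketch of how rating-proportional power sharing can emerge from a consensus algorithm (the topology, ratings, demands and step size below are invented for illustration and are not taken from the paper):

```python
# Average consensus run on locally measured demand and on ratings; once both have
# converged, every ESU knows the network-wide ratio and sets its output accordingly.
import numpy as np

ratings = np.array([5.0, 10.0, 3.0, 7.0])        # ESU power ratings (kW), assumed values
demand = np.array([2.0, 6.0, 1.0, 3.0])          # locally measured demand at each bus (kW)

# Communication graph between ESU controllers (a simple feeder-like path).
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
eps = 0.3                                        # step size, < 2 / max Laplacian eigenvalue

d, r = demand.copy(), ratings.copy()
for _ in range(300):
    d_next, r_next = d.copy(), r.copy()
    for i, nbrs in neighbours.items():
        d_next[i] += eps * sum(d[j] - d[i] for j in nbrs)
        r_next[i] += eps * sum(r[j] - r[i] for j in nbrs)
    d, r = d_next, r_next

# Every unit now holds the network-average demand and rating, so each can pick its
# set-point in proportion to its own rating without a central controller.
utilisation = d / r                              # == total demand / total rating everywhere
setpoints = utilisation * ratings
print("per-unit utilisation:", utilisation.round(3))
print("ESU set-points (kW):", setpoints.round(2), "| total:", setpoints.sum().round(2))
```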
Abstract:
Voltage rise and drop are the main power quality challenges in Low Voltage (LV) networks with Renewable Energy (RE) generators. This paper proposes a new voltage support strategy based on the coordination of multiple Distribution Static Synchronous Compensators (DSTATCOMs) using a consensus algorithm. The study focuses on LV networks with PV as the RE source for customers. The proposed approach is applied to a typical residential LV network and its advantages are shown in comparison with other voltage control strategies.
Abstract:
The operation of Autonomous Underwater Vehicles (AUVs) within underwater sensor network fields provides an opportunity to reuse the network infrastructure for long baseline localisation of the AUV. Computationally efficient localisation can be accomplished using off-the-shelf hardware that is comparatively inexpensive and which could already be deployed in the environment for monitoring purposes. This paper describes the development of a particle filter based localisation system which is implemented onboard an AUV in real time using ranging information obtained from an ad-hoc underwater sensor network. An experimental demonstration of this approach was conducted in a lake, with results presented illustrating network communication and localisation performance.
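A minimal, illustrative particle filter measurement update using ranges to fixed network nodes might look as follows; the node layout, noise level and single-update structure are assumptions, not the paper's implementation:

```python
# 2-D particle filter update fusing range measurements from fixed sensor-network nodes.
import numpy as np

rng = np.random.default_rng(1)
nodes = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])  # anchors (m)
true_pos = np.array([42.0, 67.0])
range_std = 1.5                                   # acoustic ranging noise (m), assumed

# Initialise particles uniformly over the survey area.
N = 2000
particles = rng.uniform(0.0, 100.0, size=(N, 2))
weights = np.full(N, 1.0 / N)

# One measurement update using simulated ranges to each node.
ranges = np.linalg.norm(nodes - true_pos, axis=1) + rng.normal(0, range_std, len(nodes))
for node, r in zip(nodes, ranges):
    predicted = np.linalg.norm(particles - node, axis=1)
    weights *= np.exp(-0.5 * ((r - predicted) / range_std) ** 2)
weights /= weights.sum()

# Resample and report the estimate.
idx = rng.choice(N, size=N, p=weights)
particles = particles[idx]
print("estimated position:", particles.mean(axis=0), "true:", true_pos)
```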
Abstract:
Many large-scale GNSS CORS networks have been deployed around the world to support various commercial and scientific applications. To make use of these networks for real-time kinematic positioning services, one of the major challenges is ambiguity resolution (AR) over long inter-station baselines in the presence of considerable atmospheric biases. Usually, the widelane ambiguities are fixed first, followed by determination of the narrowlane ambiguity integers based on the ionosphere-free model in which the widelane integers are introduced as known quantities. This paper seeks to improve AR performance over long baselines through efficient procedures for improved float solutions and ambiguity fixing. The contribution is threefold: (1) instead of using ionosphere-free measurements, absolute and/or relative ionospheric constraints are introduced in an ionosphere-constrained model to enhance the model strength, resulting in better float solutions; (2) realistic widelane ambiguity precision is estimated by capturing the multipath effects due to the observation complexity, improving the reliability of widelane AR; (3) for narrowlane AR, partial AR is applied to a subset of ambiguities selected in order of increasing elevation. For fixing a scalar ambiguity, a rounding method with controllable error probability is proposed. The established ionosphere-constrained model can be solved efficiently with a sequential Kalman filter. It can be either reduced to special cases simply by adjusting the variances of the ionospheric constraints, or extended with more parameters and constraints. The presented methodology is tested over seven baselines of around 100 km from the US CORS network. The results show that the new widelane AR scheme achieves a 99.4% successful fixing rate with a 0.6% failure rate, while the new rounding method for narrowlane AR achieves a fixing rate of 89% with a failure rate of 0.8%. In summary, AR reliability can be efficiently improved with rigorous control of the probability of incorrectly fixed ambiguities.
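The paper's exact rounding procedure is not reproduced here, but the general idea of fixing an ambiguity only when the rounding error probability is acceptably small can be sketched with the standard success-rate bound for integer rounding (threshold and values below are illustrative):

```python
# Fix a single float ambiguity only if the implied failure probability of integer
# rounding, 1 - (2*Phi(1/(2*sigma)) - 1), is below a chosen tolerance.
from math import erf, sqrt

def phi(x):                        # standard normal CDF
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def try_fix(float_amb, sigma, max_fail_prob=0.01):
    success = 2.0 * phi(1.0 / (2.0 * sigma)) - 1.0
    if 1.0 - success <= max_fail_prob:
        return round(float_amb)    # fix to the nearest integer
    return None                    # keep the float solution

print(try_fix(5.08, 0.10))         # precise float solution -> fixed to 5
print(try_fix(5.40, 0.35))         # too noisy -> None (not fixed)
```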
Abstract:
Detecting anomalies in online social networks is a significant task, as it helps reveal useful and interesting information about user behavior on the network. This paper proposes a rule-based hybrid method using graph theory, Fuzzy clustering and Fuzzy rules for modeling the user relationships inherent in online social networks and for identifying anomalies. Fuzzy C-Means clustering is used to cluster the data, and a Fuzzy inference engine is used to generate rules based on cluster behavior. The proposed method achieves improved accuracy in identifying anomalies compared with existing methods.
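A compact Fuzzy C-Means implementation operating on invented per-user behaviour features gives a flavour of the clustering stage (the features, data and cluster count are placeholders; the graph-based modelling and rule-generation stages are not shown):

```python
# Minimal Fuzzy C-Means: alternate between weighted centre updates and membership updates.
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                    # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]   # membership-weighted cluster centres
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / dist ** (2.0 / (m - 1.0))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Toy per-user features: [posts per day, fraction of posts with links, new friends per day]
X = np.array([[3, 0.10, 1], [4, 0.20, 2], [2, 0.10, 1],
              [50, 0.90, 40], [3, 0.15, 2]], dtype=float)
centers, U = fuzzy_c_means(X, c=2)
print("membership matrix:\n", U.round(2))
# In the full method, memberships like these would drive fuzzy rule generation.
```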
Abstract:
Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed during situations where the mission should be aborted due to mechanical or other failure. This article presents a pulse-coupled neural network (PCNN) to assist with vegetation classification in a vision-based landing site detection system for an unmanned aircraft. We propose a heterogeneous computing architecture and an OpenCL implementation of a PCNN feature generator. Its performance is compared across OpenCL kernels designed for CPU, GPU, and FPGA platforms. This comparison examines the compute times required for network convergence under a variety of images to determine the feasibility of real-time feature detection.
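For readers unfamiliar with PCNNs, the following NumPy sketch shows one common form of the iteration that such a feature generator computes; the parameter values are assumptions and the paper's OpenCL kernels are not reproduced:

```python
# One common PCNN formulation: feeding and linking inputs, internal activity,
# pulsing against a dynamic threshold; firing counts per iteration form a signature.
import numpy as np

def neighbour_sum(A, K):
    """Correlate image A with a 3x3 kernel K using zero padding."""
    out = np.zeros_like(A)
    pad = np.pad(A, 1)
    for i in range(3):
        for j in range(3):
            out += K[i, j] * pad[i:i + A.shape[0], j:j + A.shape[1]]
    return out

def pcnn_signature(S, steps=10, beta=0.2, aF=0.1, aL=1.0, aE=1.0, VF=0.5, VL=0.2, VE=20.0):
    F = np.zeros_like(S); L = np.zeros_like(S)
    E = np.ones_like(S);  Y = np.zeros_like(S)
    K = np.array([[0.5, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.5]])  # linking weights
    fired = []
    for _ in range(steps):
        W = neighbour_sum(Y, K)               # pulses from neighbouring neurons
        F = np.exp(-aF) * F + VF * W + S      # feeding input (driven by the image)
        L = np.exp(-aL) * L + VL * W          # linking input
        U = F * (1.0 + beta * L)              # internal activity
        Y = (U > E).astype(float)             # neurons pulse when activity exceeds threshold
        E = np.exp(-aE) * E + VE * Y          # dynamic threshold rises after a pulse
        fired.append(int(Y.sum()))
    return fired

patch = np.random.default_rng(0).random((32, 32))   # stand-in for an aerial image patch
print(pcnn_signature(patch))
```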
Abstract:
Although there are many approaches for developing secure programs, they are not necessarily helpful for evaluating the security of a pre-existing program. Software metrics promise an easy way of comparing the relative security of two programs or assessing the security impact of modifications to an existing one. Most studies in this area focus on high-level source code, but this approach fails to take compiler-specific code generation into account. In this work we describe a set of object-oriented Java bytecode security metrics which are capable of assessing the security of a compiled program from the point of view of potential information flow. These metrics can be used to compare the security of programs, or to assess the effect of program modifications on security, using a tool we have developed that automatically measures the security of a given Java bytecode program in terms of the accessibility of distinguished ‘classified’ attributes.
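The paper's actual metric definitions are not reproduced in the abstract; the toy example below merely illustrates the general flavour of an accessibility-based metric over attributes marked as classified (the data model and the metric itself are hypothetical):

```python
# Hypothetical illustration: the fraction of 'classified' attributes that are
# reachable through non-private access in a toy model of a compiled class.
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    access: str          # "public", "protected", "package", or "private"
    classified: bool

def classified_accessibility(attrs):
    classified = [a for a in attrs if a.classified]
    if not classified:
        return 0.0
    exposed = [a for a in classified if a.access != "private"]
    return len(exposed) / len(classified)

attrs = [Attribute("pin", "private", True),
         Attribute("balance", "protected", True),
         Attribute("logo", "public", False)]
print(classified_accessibility(attrs))   # 0.5: one of two classified attributes is exposed
```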
Abstract:
Cryptosystems based on the hardness of lattice problems have recently acquired much importance due to their average-case to worst-case equivalence, their conjectured resistance to quantum cryptanalysis, their ease of implementation and increasing practicality, and, lately, their promising potential as a platform for constructing advanced functionalities. In this work, we construct “Fuzzy” Identity Based Encryption from the hardness of the Learning With Errors (LWE) problem. We note that for our parameters, the underlying lattice problems (such as gapSVP or SIVP) are assumed to be hard to approximate within subexponential factors for adversaries running in subexponential time. We give CPA and CCA secure variants of our construction, for small and large universes of attributes. All our constructions are secure against selective-identity attacks in the standard model. Our construction is made possible by observing certain special properties that secret sharing schemes need to satisfy in order to be useful for Fuzzy IBE. We also discuss some obstacles towards realizing lattice-based attribute-based encryption (ABE).
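The construction itself is too involved to sketch here, but a toy Regev-style LWE encryption of a single bit makes the underlying hardness assumption concrete (this is not the paper's Fuzzy IBE scheme, and the parameters are far too small to be secure):

```python
# Toy Regev-style LWE bit encryption: the public key hides the secret behind
# noisy linear equations, and decryption tolerates the small accumulated error.
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 16, 64, 3329                          # toy parameters, insecure by design

# Key generation: secret s, public key (A, b = A s + e mod q) with small error e.
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-1, 2, m)                      # error in {-1, 0, 1}
b = (A @ s + e) % q

def encrypt(bit):
    r = rng.integers(0, 2, m)                   # random binary combination of samples
    c1 = (r @ A) % q
    c2 = (r @ b + bit * (q // 2)) % q
    return c1, c2

def decrypt(c1, c2):
    v = (c2 - c1 @ s) % q
    return int(min(v, q - v) > q // 4)          # closer to q/2 -> 1, closer to 0 -> 0

for bit in (0, 1):
    print(bit, "->", decrypt(*encrypt(bit)))
```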
Abstract:
For the past several decades, cryptographers have consistently provided us with stronger and more capable primitives and protocols that have found many applications in security systems in everyday life. One of the central tenets of cryptographic design is that, whereas a system’s architecture ought to be public and open to scrutiny, the keys on which it depends — long, utterly random, unique strings of bits — will be perfectly preserved by their owner, and yet nominally inaccessible to foes.
Abstract:
For TREC Crowdsourcing 2011 (Stage 2) we propose a network-based approach for assigning an indicative measure of worker trustworthiness in crowdsourced labelling tasks. Workers, the gold standard, and worker/gold-standard agreements are modelled as a network. For the purpose of worker trustworthiness assignment, a variant of the PageRank algorithm, named TurkRank, is used to adaptively combine evidence that suggests worker trustworthiness, i.e., agreement with other trustworthy co-workers and agreement with the gold standard. A single parameter controls the importance of co-worker agreement versus gold standard agreement. The TurkRank score calculated for each worker is then incorporated into a worker-weighted mean label aggregation.
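A hedged approximation of the idea, not the exact TurkRank formulation: a PageRank-style iteration over an invented worker agreement matrix, with a single parameter mixing co-worker agreement against gold-standard agreement:

```python
# PageRank-style trust propagation where the "teleport" distribution is each
# worker's agreement with the gold standard; lam trades co-worker vs. gold evidence.
import numpy as np

# Hypothetical pairwise agreement rates between 4 workers.
W = np.array([[0.0, 0.9, 0.8, 0.2],
              [0.9, 0.0, 0.7, 0.3],
              [0.8, 0.7, 0.0, 0.1],
              [0.2, 0.3, 0.1, 0.0]])
gold = np.array([0.95, 0.90, 0.85, 0.30])       # agreement with gold-standard labels

P = W / W.sum(axis=1, keepdims=True)            # row-normalise co-worker agreement
g = gold / gold.sum()
lam = 0.5                                        # weight of co-worker vs. gold evidence

rank = np.full(len(gold), 1.0 / len(gold))
for _ in range(100):
    rank = lam * rank @ P + (1.0 - lam) * g     # power iteration with gold "teleport"
print("trust scores:", (rank / rank.sum()).round(3))

# These scores could then weight each worker's labels in a weighted-mean aggregation.
```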