992 results for public integrity verification


Relevance: 20.00%

Publisher:

Abstract:

The coal industry in Queensland operates in a very complex regulatory environment with a matrix of Federal and State laws covering the environment, health and safety, taxation and royalties, tenure, and development approvals. In 2012 the Queensland Government recognised the validity of certain industry concerns and passed two Acts: the Environmental Protection (Greentape Reduction) Amendment Act 2012 (the Greentape Act) and the Mines Legislation (Streamlining) Amendment Act 2012 (the Streamlining Act). Other changes are foreshadowed in relation to overlapping tenure and in the development of common resources legislation. Accordingly, a great deal of change has already occurred or is on the horizon. This article focuses upon these regulatory changes and foreshadows other areas requiring consideration. It commences with a consideration of the changes that have already occurred, examines those regulatory amendments that are on the drawing board, and concludes with suggestions as to further interventions and amendments that have the potential to enhance the efficiency and effectiveness of the legislative framework in which coal mining is conducted.


Corporate scandals are as old as the corporate form itself. Consider, for example, the controversies surrounding the role of one of the first modern corporations, the British East India Company, in the Bengal famine of 1770 and in the Chinese opium trade. Yet it is the increasing scale and scope of unethical acts carried out by individuals in the name, and interests, of corporations that continue to be concerning. Recent revelations surrounding the extent of bribery and covert surveillance used by News Corporation journalists in its British operations continue to shock the world and undermine confidence in that organisation and in journalists in general. Yet despite the systemic nature of many of these unethical activities, corporate leaders generally plead ignorance when transgressions come to light. During the inquiry into the News Corporation scandal, Rupert Murdoch, the CEO and chairman, rejected the assertion that he was ultimately 'responsible for this whole fiasco' (House of Commons, 2011, Q.230). Instead, like many corporate leaders before him, Murdoch placed the blame on employees within the newspaper. His response poses an increasingly important question: do corporate leaders bear responsibility for the conduct of individuals within a corporation and, if so, why?


Iris-based identity verification is highly reliable, but it can also be subject to attacks. Pupil dilation or constriction stimulated by the application of drugs is an example of a sample presentation security attack which can lead to higher false rejection rates. Suspects on a watch list can potentially circumvent an iris-based system using such methods. This paper investigates a new approach using multiple parts of the iris (instances) and multiple iris samples in a sequential decision fusion framework that can yield robust performance. Results are presented and compared with the standard full-iris approach for a number of iris degradations. An advantage of the proposed fusion scheme is that the trade-off between detection errors can be controlled by setting parameters such as the number of instances and the number of samples used in the system. The system can then be operated to match security threat levels. It is shown that for optimal values of these parameters, the fused system also has a lower total error rate.
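The sequential decision fusion described in the abstract can be pictured as a running-evidence test: each iris instance or sample contributes a match score, and a decision is issued as soon as the accumulated evidence crosses an accept or reject threshold. The abstract does not give the exact fusion rule, so the function below, its thresholds and its score convention (positive scores favour a match) are illustrative assumptions only.

```python
def sequential_fusion(scores, accept=2.0, reject=-2.0):
    """Sequential score fusion sketch: accumulate per-instance match
    scores and stop as soon as the running total crosses either the
    accept or the reject threshold.

    Returns a tuple ('accept' | 'reject' | 'undecided', samples_used).
    """
    total = 0.0
    for i, s in enumerate(scores, start=1):
        total += s
        if total >= accept:
            return "accept", i      # enough evidence of a genuine match
        if total <= reject:
            return "reject", i      # enough evidence of an impostor
    return "undecided", len(scores)  # evidence exhausted without a decision
```

Widening the gap between `accept` and `reject` lowers both error rates at the cost of consuming more instances and samples, which mirrors the trade-off the paper controls through the number of instances and samples used.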


Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than the two-step process of encrypting the message for confidentiality and, in a separate pass, generating a Message Authentication Code (MAC) for integrity protection. AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, analysing the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. 
The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third classification is based on whether the sequence that is used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure from this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state. 
This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model, namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
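A toy version of the direct-injection matrix model can illustrate why a purely linear accumulation is forgeable. The state-update matrix and injection vector below are made up for illustration; the thesis's actual models (for SSS, NLSv2 and SOBER-128) are more elaborate and pass the injected message through a nonlinear filter.

```python
def mac_direct_injection(message_bits, A, inject):
    """Toy instance of the direct-injection MAC model over GF(2):
    for each message bit m_t, the register state updates as
        state <- A*state XOR m_t*inject,
    starting from the all-zero state; the final state is the tag.
    A is an n x n 0/1 matrix, inject a length-n 0/1 vector."""
    n = len(inject)
    s = [0] * n
    for m in message_bits:
        s = [(sum(A[i][j] * s[j] for j in range(n)) + m * inject[i]) % 2
             for i in range(n)]
    return s
```

Because the map from message to tag is linear over GF(2) when the initial state is known, the tag of the XOR of two messages is the XOR of their tags, so collisions (and hence forgeries) are trivial to construct. This is precisely the weakness that accumulating through a nonlinear filter, or keeping the message or initial state unknown to the attacker, is meant to block.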


The decentralisation reform in Indonesia has mandated the Central Government to transfer some functions and responsibilities to local governments, including the transfer of human resources, assets and budgets. Local governments became giant asset holders almost overnight, and most were ill prepared to handle this transformation. Assets were transferred without analysing local government need, ability or capability to manage them, and no local government was provided with an asset management framework. Therefore, the aim of this research is to develop a Public Asset Management Framework for provincial governments in Indonesia, especially for infrastructure and real property assets. This framework will enable provincial governments to develop integrated asset management procedures throughout the asset lifecycle. Achieving the research aim means answering the following three research questions: 1) How do provincial governments in Indonesia currently manage their public assets? 2) What factors influence the provincial governments in managing these public assets? 3) How can a Public Asset Management Framework be developed that is specific to the Indonesian provincial governments' situation? This research applied a case study approach: following a literature review, data from document retrieval, interviews and observations were collated. Data was collected in June 2009 (preliminary data collection) and from January to July 2010 in the major eastern Indonesian provinces. Once the public asset management framework was developed, a focus group was used to verify it. Results are threefold and indicate, first, that Indonesian provincial governments need to improve the effectiveness and efficiency of their current practice of public asset management in order to improve public service quality. 
The second result shows that the five major concerns that influence local government public asset management processes are: asset identification and inventory systems; public asset holding; asset guidance and legal arrangements; asset management efficiency and effectiveness; and human resources and their organisational arrangements. The framework was then applied to assets already transferred to local governments, and so included a system of asset identification and a needs analysis to classify the importance of these assets to local governments and to their functions and responsibilities in delivering public services. Assets that support local government functions and responsibilities will then be managed using suitable asset lifecycle processes. Those categorised as surplus assets should be disposed of. Additionally, functions and responsibilities that do not need an asset solution should be performed directly by local governments. These processes must be measured using performance measurement indicators. All these stages should be guided and regulated with sufficient laws and regulations. Constant improvement of the quality and quantity of human resources plays an important role in successful public asset management processes. This research focuses on developing countries and contributes toward the knowledge of a Public Asset Management Framework at local government level, particularly in Indonesia. The framework provides local governments with a foundation to improve their effectiveness and efficiency in managing public assets, which could lead to improved public service quality. This framework will ensure that the best decisions are made throughout asset ownership and provide a better asset lifecycle process, leading to selection of the most appropriate asset, improving its acquisition and delivery process, optimising asset performance, and providing an appropriate disposal program.


The design-build (DB) delivery system is an effective means of delivering a green construction project, and selecting an appropriate contractor is critical to project success. Moreover, the delivery of green buildings requires specific design, construction and operation and maintenance considerations not generally encountered in the procurement of conventional buildings. Specifying clear sustainability requirements to potential contractors is particularly important in achieving sustainable project goals. However, many client/owners either do not explicitly specify sustainability requirements or do so in a prescriptive manner during the project procurement process. This paper investigates the current state-of-the-art procurement process used in specifying the sustainability requirements of the public sector in the USA construction market by means of a robust content analysis of 40 design-build requests for proposals (RFPs). The results of the content analysis indicate that the sustainability requirement is one of the most important dimensions in the best-value evaluation of DB contractors. Client/owners predominantly specify the LEED certification levels (e.g. LEED Certified, Silver, Gold, and Platinum) for a particular facility, and include the sustainability requirements as selection criteria (with specific importance weightings) for contractor evaluation. Additionally, larger projects tend to allocate higher importance weightings to sustainability requirements. This study provides public DB client/owners with a number of practical implications for selecting appropriate design-builders for sustainable DB projects.


Background and purpose: The purpose of the work presented in this paper was to determine whether patient positioning and delivery errors could be detected using electronic portal images of intensity modulated radiotherapy (IMRT). Patients and methods: We carried out a series of controlled experiments delivering an IMRT beam to a humanoid phantom using both the dynamic and multiple static field methods of delivery. The beams were imaged, the images calibrated to remove the IMRT fluence variation, and then compared with calibrated images of the reference beams without any delivery or position errors. The first set of experiments involved translating the position of the phantom both laterally and in a superior/inferior direction by distances of 1, 2, 5 and 10 mm. The phantom was also rotated by 1° and 2°. For the second set of measurements the phantom position was kept fixed and delivery errors were introduced into the beam. The delivery errors took the form of leaf position and segment intensity errors. Results: The method was able to detect shifts in the phantom position of 1 mm, leaf position errors of 2 mm, and dosimetry errors of 10% on a single segment of a 15-segment step-and-shoot IMRT delivery (significantly less than 1% of the total dose). Conclusions: The results of this work have shown that the method of imaging the IMRT beam and calibrating the images to remove the intensity modulations could be a useful tool in verifying both the patient position and the delivery of the beam.
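The calibration step, removing the known intensity modulation so that only patient anatomy remains, amounts to a pixel-wise division of the acquired portal image by the reference modulation map. The sketch below is a minimal illustration of that idea only, not the authors' actual calibration procedure, which must also account for detector response and scatter.

```python
def remove_modulation(image, modulation, eps=1e-6):
    """Divide a portal image of an IMRT beam, pixel by pixel, by the
    known intensity-modulation map, leaving an equivalent open-field
    image that shows only the transmission through the patient.
    Both arguments are 2-D lists of the same shape; eps guards
    against division by zero under fully blocked pixels."""
    return [[pix / max(mod, eps) for pix, mod in zip(img_row, mod_row)]
            for img_row, mod_row in zip(image, modulation)]
```

The resulting open-field-equivalent image can then be compared against a reference image of the anatomy, so that position errors are not masked by the deliberate fluence variation of the IMRT field.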


Property is an important factor of production that all businesses require in order to function. Nourse (1990) noted that "some businesses are real estate, all businesses use real estate". In recent years, the management of property assets has become the focus of many organisations, including non-real-estate businesses. Good asset management is concerned with the effective utilisation of a property owner's assets. It is the management process of ensuring that the portfolio of properties held meets the overall requirements of the users. In short, it is the process of identifying the user's requirements and rationalising property holdings to best match those requirements, followed by monitoring and ongoing review of practice. In Malaysia, federal agencies and local authorities are among the largest property asset owners. Recently the federal government released a Total Asset Management Manual (TAMM), which is at the preliminary stage of implementation. This thesis will study international practices in the management of public sector assets and assess the effectiveness of TAMM. This research will focus on current international practices for the effective management of public sector property assets. The current application in Malaysia will be highlighted to determine awareness and understanding of current practices relative to the recently released TAMM. This is exploratory research relying on a combination of qualitative and quantitative approaches: the qualitative approach focuses on international practices and their application to the management of public sector property assets, while a questionnaire survey of Malaysian public property asset managers and users forms the quantitative approach, gauging collective opinion on the current practices of TAMM and its implementation.


Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions. 
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
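One common way to train such a quantizer is to place thresholds at equiprobable quantiles of a training set of feature values, so that each bin is equally likely; the sketch below assumes this scheme purely for illustration (the dissertation evaluates quantizer training more generally). Note how the learned thresholds themselves encode information about the feature distribution, which is exactly the information-leakage concern raised above.

```python
def train_thresholds(training_values, bits=2):
    """Learn equiprobable quantization thresholds from training data:
    with 2**bits quantization levels, thresholds sit at the empirical
    quantiles, so each bin captures an equal share of the training
    distribution."""
    levels = 2 ** bits
    v = sorted(training_values)
    return [v[int(len(v) * k / levels)] for k in range(1, levels)]

def quantize(x, thresholds):
    """Map a real-valued feature to its bin index; the index is then
    binary (or Gray) encoded to contribute bits to the hash."""
    return sum(1 for t in thresholds if x >= t)
```

A feature value near a threshold can flip bins under small input perturbations, which is why the choice and training of thresholds affects robustness (accuracy) as well as security.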


Examining the evolution of British and Australian policing, this comparative review of the literature considers the historical underpinnings of policing in these two countries and the impact of community legitimacy derived from the early concepts of policing by consent. Using the August 2011 disorder in Britain as a lens, this paper considers whether, in striving to maintain community confidence, undue emphasis is placed on the police's public image at the expense of community safety. Examining the path of policing reform, the impact of bureaucracy on policing and the evolving debate surrounding police performance, this review suggests that, while largely delivering on the ideal of an ethical and strong police force, a preoccupation with self-image may in fact result in tarnishing the very thing British and Australian police forces strive to achieve – their standing with the public. This paper advocates for a more realistic goal of gaining public respect rather than affection in order to achieve the difficult balance between maintaining trust and respect as an approachable, ethical entity providing firm, confident policing in this ever-evolving, modern society.


Purpose: Electronic Portal Imaging Devices (EPIDs) are available with most linear accelerators (Antonuk, 2002), the current technology being amorphous silicon flat panel imagers. EPIDs are currently used routinely in patient positioning before radiotherapy treatments. There has been increasing interest in using EPID technology for dosimetric verification of radiotherapy treatments (van Elmpt, 2008). A straightforward technique involves using the EPID panel to measure the fluence exiting the patient during a treatment, which is then compared to a prediction of the fluence based on the treatment plan. However, a number of significant limitations exist in this method, resulting in its limited proliferation in the clinical environment. In this paper, we aim to present a technique for simulating IMRT fields using Monte Carlo methods to predict the dose in an EPID, which can then be compared to the measured dose in the EPID. Materials: Measurements were made using an iView GT flat panel a-Si EPID mounted on an Elekta Synergy linear accelerator. The images from the EPID were acquired using the XIS software (Heimann Imaging Systems). Monte Carlo simulations were performed using the BEAMnrc and DOSXYZnrc user codes. The IMRT fields to be delivered were taken from the treatment planning system in DICOM-RT format and converted into BEAMnrc and DOSXYZnrc input files using an in-house application (Crowe, 2009). Additionally, all image processing and analysis was performed using another in-house application written in the Interactive Data Language (IDL) (ITT Visual Information Systems). Comparison between the measured and Monte Carlo EPID images was performed using a gamma analysis (Low, 1998) incorporating dose and distance-to-agreement criteria. Results: The fluence maps recorded by the EPID were found to provide good agreement between measured and simulated data. 
Figure 1 shows an example of measured and simulated IMRT dose images and profiles in the x and y directions. References: D. A. Low et al., "A technique for the quantitative evaluation of dose distributions", Med Phys, 25(5), May 1998. S. Crowe, T. Kairn, A. Fielding, "The development of a Monte Carlo system to verify radiotherapy treatment dose calculations", Radiotherapy & Oncology, 92(Suppl. 1), August 2009, S71.
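The gamma analysis of Low et al. (1998) combines a dose-difference criterion with a distance-to-agreement (DTA) criterion; a point passes where gamma <= 1. A minimal 1-D version using a brute-force search over the reference profile might look like the sketch below; the clinical implementation operates on 2-D or 3-D distributions with interpolation, so this is an illustration of the metric only.

```python
import math

def gamma_1d(ref, meas, spacing=1.0, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma evaluation: for each measured point, search the
    reference profile for the point minimising the combined
    dose-difference / distance-to-agreement metric.

    Doses are assumed normalised so dose_tol is a fraction (e.g. 0.03
    for 3%); spacing and dist_tol are in the same unit (e.g. mm).
    Returns one gamma value per measured point; gamma <= 1 passes."""
    gammas = []
    for i, d_m in enumerate(meas):
        best = float("inf")
        for j, d_r in enumerate(ref):
            dd = (d_m - d_r) / dose_tol            # normalised dose difference
            dta = (i - j) * spacing / dist_tol     # normalised distance
            best = min(best, math.hypot(dd, dta))
        gammas.append(best)
    return gammas
```

Identical profiles give gamma = 0 everywhere, while a point whose dose deviates by exactly the dose tolerance (with no spatial shift) scores gamma = 1, sitting right on the pass/fail boundary.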


Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are spatial-resolution independent and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant. 
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.


We have taken a new method of calibrating portal images of IMRT beams and used this to measure patient set-up accuracy and delivery errors, such as leaf errors and segment intensity errors during treatment. A calibration technique was used to remove the intensity modulations from the images leaving equivalent open field images that show patient anatomy that can be used for verification of the patient position. The images of the treatment beam can also be used to verify the delivery of the beam in terms of multileaf collimator leaf position and dosimetric errors. A series of controlled experiments delivering an IMRT anterior beam to the head and neck of a humanoid phantom were undertaken. A 2mm translation in the position of the phantom could be detected. With intentional introduction of delivery errors into the beam this method allowed us to detect leaf positioning errors of 2mm and variation in monitor units of 1%. The method was then applied to the case of a patient who received IMRT treatment to the larynx and cervical nodes. The anterior IMRT beam was imaged during four fractions and the images calibrated and investigated for the characteristic signs of patient position error and delivery error that were shown in the control experiments. No significant errors were seen. The method of imaging the IMRT beam and calibrating the images to remove the intensity modulations can be a useful tool in verifying both the patient position and the delivery of the beam.


The natural disasters that frequently hit Indonesia include floods, severe droughts, tsunamis, earthquakes, volcanic eruptions, landslides, windstorms and forest fires. The impact of these natural disasters is severe, affecting the quality of life of the community through the breakdown of the public assets that are one source of public service delivery. This paper aims to emphasise the importance of natural disaster risk information for public asset management in the Indonesian Central Government, particularly at the asset planning stage, where the asset decision is made as the gateway into the whole public asset management process. A case study in the Ministry of Finance of Indonesia, as the central government public asset manager, and in five line ministries/governmental agencies, as public asset users, was used as the approach to achieve the research objective. The case study employed three data collection techniques, i.e. interviews, observations and document archival, which were analysed using a content analysis approach. The results of the study indicate that Indonesia's geographical position exposes many public infrastructure assets to high vulnerability to natural disasters. Information on natural disaster trends and predictions to identify and measure the risks is available; however, such information is not utilised and integrated in the process of public infrastructure asset planning as the gateway to the whole public asset management process. Therefore, in order to accommodate and incorporate this natural disaster risk information into public asset management processes, particularly public asset planning, a public asset performance measurement framework should be adopted and applied in the process as one source for making infrastructure asset planning decisions. 
Findings from this study provide useful input for the Ministry of Finance as public asset manager, scholars and private asset management practitioners in Indonesia to establish natural disaster risk awareness in public infrastructure asset management processes.