995 results for Binary cyclic code


Abstract:

This paper presents and discusses the use of different codes for the numerical simulation of a radial-inflow turbine. A radial-inflow turbine test case was selected from the published literature [1], and the commercial codes Fluent and CFX were used to perform steady-state numerical simulations. An in-house compressible-flow simulation code, Eilmer3 [2], was also adapted to make it suitable for turbomachinery simulations, and preliminary results are presented and discussed. The code itself and its adaptation, comprising the addition of terms for the rotating frame of reference, programmable boundary conditions for periodic boundaries and a mixing-plane interface between the rotating and non-rotating blocks, are also discussed. Several cases of different orders of geometric complexity were considered and the results were compared across the different codes. The agreement between these results and published data is also discussed.
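The mixing-plane interface mentioned above can be illustrated with a minimal sketch; the array layout, function name and banding strategy here are hypothetical illustrations, not Eilmer3's actual implementation. The idea is that flow properties on the upstream faces are circumferentially averaged within radial bands, and the averaged states are then imposed on the downstream (rotating) block.

```python
import numpy as np

def mixing_plane_average(upstream_faces, n_radial_bands):
    """Circumferentially average flow properties at a mixing-plane
    interface, band by band in the radial direction.

    upstream_faces: array of shape (n_faces, 4) holding
    (radius, density, velocity, pressure) per face (hypothetical layout).
    Returns one averaged state per radial band; these would then be
    imposed as the inflow condition on the downstream block.
    """
    radii = upstream_faces[:, 0]
    bands = np.linspace(radii.min(), radii.max(), n_radial_bands + 1)
    averaged = []
    for lo, hi in zip(bands[:-1], bands[1:]):
        mask = (radii >= lo) & (radii <= hi)
        # circumferential average of (density, velocity, pressure)
        averaged.append(upstream_faces[mask, 1:].mean(axis=0))
    return np.array(averaged)
```

Averaging in this way removes circumferential non-uniformity at the interface, which is what allows a steady-state simulation across rotating and stationary rows.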

Abstract:

Refactoring is a common approach to producing better quality software. Its impact on many software quality properties, including reusability, maintainability and performance, has been studied and measured extensively. However, its impact on the information security of programs has received relatively little attention. In this work, we assess the impact of a number of the most common code-level refactoring rules on data security, using security metrics that are capable of measuring security from the viewpoint of potential information flow. The metrics are calculated for a given Java program using a static analysis tool we have developed to automatically analyse compiled Java bytecode. We ran our Java code analyser on various programs which were refactored according to each rule. New values of the metrics for the refactored programs then confirmed that the code changes had a measurable effect on information security.
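As an illustration of the kind of measurement involved, consider a hypothetical, much-simplified stand-in for the information-flow metrics used in the study: it operates on a toy list of fields rather than compiled Java bytecode, and the metric and names are illustrative assumptions only.

```python
def attribute_exposure(fields):
    """Toy information-flow style metric (hypothetical simplification):
    the fraction of a class's fields that are not private, i.e. are
    directly readable from outside and hence potential sources of
    information flow. Lower values suggest less exposed state.

    fields: list of (name, visibility) pairs.
    """
    if not fields:
        return 0.0
    exposed = sum(1 for _, visibility in fields if visibility != "private")
    return exposed / len(fields)

# Before an "encapsulate field" refactoring: one public field.
before = [("balance", "public"), ("owner", "private")]
# After: the field is made private and accessed via a getter.
after = [("balance", "private"), ("owner", "private")]
```

Computing such a metric before and after a refactoring, as the study does with its bytecode analyser, turns "the refactoring affected security" into a measurable difference between two numbers.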

Abstract:

This article suggests that the issue of proportionality in anti-doping sanctions has been inconsistently dealt with by the Court of Arbitration for Sport (CAS). Given CAS’s pre-eminent role in interpreting and applying the World Anti-Doping Code under the anti-doping policies of its signatories, an inconsistent approach to the application of the proportionality principle will cause difficulties for domestic anti-doping tribunals seeking guidance as to the appropriateness of their doping sanctions.

Abstract:

An optical system which performs the multiplication of binary numbers is described and proof-of-principle experiments are performed. The simultaneous generation of all partial products, optical regrouping of bit products, and optical carry look-ahead addition are novel features of the proposed scheme which takes advantage of the parallel operations capability of optical computers. The proposed processor uses liquid crystal light valves (LCLVs). By space-sharing the LCLVs one such system could function as an array of multipliers. Together with the optical carry look-ahead adders described, this would constitute an optical matrix-vector multiplier.
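The arithmetic behind the scheme can be sketched in software (this illustrates only the logic, not the optical implementation): every partial-product bit is the AND of one bit pair and can be formed simultaneously, and every carry in the look-ahead adder is computed directly from generate/propagate signals rather than rippling.

```python
def cla_add(a_bits, b_bits):
    """Carry look-ahead addition of two equal-length little-endian bit
    lists. Each carry c_{i+1} is evaluated directly from the generate
    (g) and propagate (p) signals, mirroring the parallel evaluation
    an optical processor could perform."""
    n = len(a_bits)
    g = [a & b for a, b in zip(a_bits, b_bits)]   # generate
    p = [a ^ b for a, b in zip(a_bits, b_bits)]   # propagate
    carry = [0] * (n + 1)
    for i in range(n):
        # c_{i+1} = g_i OR (p_i AND g_{i-1}) OR (p_i AND p_{i-1} AND g_{i-2}) ...
        c = 0
        for j in range(i + 1):
            term = g[j]
            for k in range(j + 1, i + 1):
                term &= p[k]
            c |= term
        carry[i + 1] = c
    return [p[i] ^ carry[i] for i in range(n)] + [carry[n]]

def multiply(a_bits, b_bits):
    """Binary multiplication: all partial products are formed at once
    (AND of every bit pair), then summed with carry look-ahead adds."""
    width = len(a_bits) + len(b_bits)
    total = [0] * width
    for shift, b in enumerate(b_bits):
        partial = [0] * shift + [a & b for a in a_bits]
        partial += [0] * (width - len(partial))
        total = cla_add(total, partial)[:width]
    return total
```

In the proposed processor the AND operations and the sum-of-products carry expressions map onto parallel optical operations; here they are simply evaluated in loops.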

Abstract:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, but non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed for deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant; as a result, the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are not based on cryptographic methods, but instead on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions.
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While the existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility. The method is also designed to produce features more suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded, and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
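The generic pipeline described above (feature extraction, linear randomization, threshold quantization, binary encoding) can be sketched as follows. The projection matrix, hash length and zero thresholds are illustrative assumptions, not the dissertation's actual parameters.

```python
import numpy as np

def robust_hash(features, proj, thresholds):
    """Sketch of the generic robust-hash pipeline (all names
    hypothetical): a linear random projection provides the
    randomization and compression stage, then each projected value is
    binarised against a learnt quantization threshold."""
    v = proj @ features                        # linear randomization
    return (v > thresholds).astype(np.uint8)   # quantization/encoding

rng = np.random.default_rng(0)
proj = rng.standard_normal((16, 64))   # 64-dim features -> 16-bit hash
thresholds = np.zeros(16)              # e.g. learnt as per-bit medians

x = rng.standard_normal(64)
h1 = robust_hash(x, proj, thresholds)
# A small perturbation of the input (a "minor change") should yield
# a hash at small Hamming distance from the original.
h2 = robust_hash(x + 0.01 * rng.standard_normal(64), proj, thresholds)
hamming = int((h1 != h2).sum())
```

The sketch also makes the two weaknesses discussed above concrete: the projection is linear (so partial inversion is possible), and the thresholds themselves carry information about the feature distribution.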

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. 
Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained either via DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant.
Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
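The gamma evaluation used for dose comparison can be illustrated with a 1-D sketch. The full algorithms are those of [2]; the 3%/3 mm tolerances below are common clinical criteria used here only as example defaults, and the simple exhaustive search is for clarity, not efficiency.

```python
import numpy as np

def gamma_index(ref, eval_dose, spacing, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma evaluation sketch. For each reference point, search
    all evaluated points for the minimum combined dose-difference /
    distance-to-agreement metric.

    dose_tol is fractional (e.g. 0.03 = 3% of the reference maximum),
    dist_tol and spacing are in mm. Points with gamma <= 1 pass.
    """
    x = np.arange(len(ref)) * spacing
    ref_max = ref.max()
    gamma = np.empty(len(ref))
    for i, (xi, di) in enumerate(zip(x, ref)):
        dd = (eval_dose - di) / (dose_tol * ref_max)  # dose term
        dx = (x - xi) / dist_tol                      # distance term
        gamma[i] = np.sqrt(dd**2 + dx**2).min()
    return gamma
```

Identical distributions give gamma of zero everywhere; a distribution within tolerance everywhere gives gamma at or below one, which is the usual pass criterion.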

Abstract:

User interfaces for source code editing are a crucial component in any software development environment, and in many editors visual annotations (overlaid on the textual source code) are used to provide important contextual information to the programmer. This paper focuses on the real-time programming activity of ‘cyberphysical’ programming, and considers the type of visual annotations which may be helpful in this programming context.

Abstract:

The first version of the Standard PREanalytical Code (SPREC) was developed in 2009 by the International Society for Biological and Environmental Repositories (ISBER) Biospecimen Science Working Group to facilitate documentation and communication of the most important preanalytical quality parameters of different types of biospecimens used for research. The same Working Group has now updated the SPREC to version 2.0, presented here, so that it contains more options to accommodate recent technological developments. Existing elements have been fine-tuned. An interface to the Biospecimen Reporting for Improved Study Quality (BRISQ) has been defined, and informatics solutions for SPREC implementation have been developed. A glossary with SPREC-related definitions has also been added.
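A SPREC code is a short hyphen-delimited string, so supporting it in an informatics system largely reduces to parsing and labelling its elements. The sketch below is illustrative only: the element labels and the example code value are assumptions, and the SPREC v2.0 tables remain the authoritative definitions.

```python
# Hypothetical element labels for a fluid-sample code; consult the
# SPREC v2.0 tables for the authoritative element definitions.
FLUID_ELEMENTS = [
    "sample type",
    "primary container",
    "pre-centrifugation delay",
    "first centrifugation",
    "second centrifugation",
    "post-centrifugation delay",
    "long-term storage",
]

def parse_sprec(code):
    """Split a hyphen-delimited SPREC code into labelled elements.
    Raises ValueError if the code does not have the expected number
    of parts."""
    parts = code.split("-")
    if len(parts) != len(FLUID_ELEMENTS):
        raise ValueError(
            f"expected {len(FLUID_ELEMENTS)} elements, got {len(parts)}")
    return dict(zip(FLUID_ELEMENTS, parts))
```

Because each element is a fixed-vocabulary token, validation against the published SPREC tables can be layered on top of this parsing step.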

Abstract:

When crest-fixed thin steel roof cladding systems are subjected to wind uplift, local pull-through or pull-out failures occur prematurely at their screwed connections. During high wind events such as storms and cyclones these localised failures then lead to severe damage to buildings and their contents. In recent times, the use of thin steel battens/purlins has increased considerably. This has made the pull-out failures more critical in the design of steel cladding systems. Recent research has developed a design formula for the static pull-out strength of steel cladding systems. However, the effects of fluctuating wind uplift loading that occurs during high wind events are not known. Therefore a series of constant amplitude cyclic tests has been undertaken on connections between steel battens made of different thicknesses and steel grades, and screw fasteners with varying diameter and pitch. This paper presents the details of these cyclic tests and the results.

Abstract:

Introduction Stretching of tissue stimulates angiogenesis, but increased motion at a fracture site hinders revascularisation. In vitro studies have indicated that mechanical stimuli promote angiogenic responses in endothelial cells, but can either inhibit or enhance responses when applied directly to angiogenesis assays. We anticipated that cyclic tension applied during endothelial network assembly would increase vascular structure formation up to a certain threshold. Methods Fibroblast/HUVEC co-cultures were subjected to cyclic equibiaxial strain (1 Hz; 6 h/day; 7 days) using the Flexcell FX-4000T system and limiting rings for simultaneous application of multiple strain magnitudes (0–13%). Cells were labelled using anti-PECAM-1, and image analysis provided measures of endothelial network length and numbers of junctions. Results Cyclic stretching had no significant effect on the total length of endothelial networks (P > 0.2) but resulted in a strain-dependent decrease in branching and localised alignments of endothelial structures, which were in turn aligned with the supporting fibroblastic construct. Conclusion The organisation of endothelial networks under cyclic strain is dominated by structural adaptation to the supporting construct. It may be that, in fracture healing, the formation and integrity of the granulation tissue and callus are ultimately critical in revascularisation and its failure under severe strain conditions.

Abstract:

The three-volume Final Report of the Wood inquiry into NSW Police (Royal Commission Into the New South Wales Police Service, 'Final Report, Vol I: Corruption; Vol II: Reform; Vol III: Appendices', May 1997) was publicly released on 15 May 1997, to much media fanfare. The Sydney Morning Herald (SMH) devoted an 8-page special report on 1 May to the pending release of the Inquiry Report, headed The Police Purge. On the day of the public release of the Report, the SMH five-page 'Special Report' under the banner The Police Verdict was headlined Wood, Carr Split on Drugs. The Australian led with Call for Drug Law Revamp, Force Overhaul to Fight Corruption, Wood Attacks Culture of Greed, and the Daily Telegraph front page 'Final Verdict' was True Blue Strategy for an Honest Police Force...

Abstract:

In this paper, we propose a steganalysis method that is able to identify the locations of stego-bearing pixels in a binary image. To do this, the proposed method calculates the residual between a given stego image and its estimated cover image, and then computes the local entropy difference between these two versions of the image. Finally, the mean of the residual and the mean of the local entropy difference are computed across multiple stego images. From these two means, the locations of stego-bearing pixels can be identified. The empirical results demonstrate that the proposed method can identify the stego-bearing locations with near-perfect accuracy when sufficient stego images are supplied. Hence, the proposed method can be used to reveal which pixels in a binary image have been used to carry the secret message.
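The locating procedure described above can be sketched as follows. The window size, the decision threshold and the simple thresholding rule are hypothetical parameters for illustration; the paper's exact decision rule may differ.

```python
import numpy as np

def local_entropy(img, k=3):
    """Entropy of the k x k neighbourhood around each pixel of a
    binary image (simple sliding-window sketch; img is an int array
    of 0s and 1s)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    H = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            p = padded[i:i + k, j:j + k].mean()  # fraction of 1-pixels
            for q in (p, 1 - p):
                if q > 0:
                    H[i, j] -= q * np.log2(q)
    return H

def locate_stego_pixels(stegos, covers_est, thresh=0.5):
    """Sketch of the locating method: average the residual and the
    local-entropy difference over many stego/estimated-cover pairs,
    then flag pixels whose mean residual exceeds a (hypothetical)
    threshold. Returns the flagged mask plus both mean maps."""
    resid = np.mean([np.abs(s - c) for s, c in zip(stegos, covers_est)],
                    axis=0)
    ent = np.mean([local_entropy(s) - local_entropy(c)
                   for s, c in zip(stegos, covers_est)], axis=0)
    return resid > thresh, resid, ent
```

Averaging over many stego images is what makes the method work: embedding noise at non-stego pixels averages out, while consistently modified pixels accumulate a large mean residual.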