243 results for Verification


Relevance:

10.00%

Publisher:

Abstract:

Voltage drop and rise at network peak and off-peak periods, along with voltage unbalance, are the major power quality problems in low voltage distribution networks. Utilities usually try adjusting the transformer tap changers as a solution for voltage drop, and distributing the loads equally among phases as a solution for voltage unbalance. On the other hand, ever increasing energy demand, together with the need for cost reduction and higher reliability, is driving modern power systems towards Distributed Generation (DG) units, whether in the form of small rooftop photovoltaic cells (PV), Plug-in Electric Vehicles (PEVs) or Micro Grids (MGs). Rooftop PVs, typically with power levels ranging from 1-5 kW and installed by householders, are gaining popularity due to their financial benefits for the householders. PEVs will also soon emerge in residential distribution networks; they behave as large residential loads while charging, and later generations are expected to support the network as small DG units that transfer the energy stored in their batteries back into the grid. Furthermore, the MG, a cluster of loads and several DG units such as diesel generators, PVs, fuel cells and batteries, has recently been introduced to distribution networks. Voltage unbalance in the network can increase due to uncertainties in the connection points of PVs and PEVs, their nominal capacities and their times of operation. It is therefore of high interest to investigate voltage unbalance in these networks as a result of MG, PV and PEV integration into low voltage networks. In addition, the network might experience non-standard voltage drop due to high penetration of PEVs charging at night, or non-standard voltage rise due to high penetration of PVs and PEVs generating electricity back into the grid during off-peak periods.
In this thesis, a voltage unbalance sensitivity analysis and stochastic evaluation is carried out for householder-installed PVs against their installation point, nominal capacity and penetration level as sources of uncertainty. A similar analysis is carried out for PEV penetration in the network in two operating modes: grid-to-vehicle and vehicle-to-grid. Conventional methods for improving voltage unbalance within these networks are then discussed. This is followed by new, efficient methods for improving the voltage profile at network peak and off-peak periods and for reducing voltage unbalance. In addition, voltage unbalance reduction is investigated for MGs, and new improvement methods are proposed and applied to the MG test bed planned to be established at Queensland University of Technology (QUT). MATLAB and PSCAD/EMTDC simulation software are used to verify the analyses and the proposals.
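The central quantity in the unbalance analyses above can be illustrated with a short sketch. The following is a minimal Python example (not from the thesis) of the Voltage Unbalance Factor, |V2|/|V1|, computed via the Fortescue symmetrical-component transform; the phase voltages are invented illustrative values:

```python
import cmath

def voltage_unbalance_factor(va, vb, vc):
    """VUF (%) = 100 * |V2| / |V1|, from the Fortescue transform
    of the three complex phase voltages."""
    a = cmath.exp(2j * cmath.pi / 3)        # 120-degree rotation operator
    v1 = (va + a * vb + a**2 * vc) / 3      # positive-sequence component
    v2 = (va + a**2 * vb + a * vc) / 3      # negative-sequence component
    return 100.0 * abs(v2) / abs(v1)

# A perfectly balanced three-phase set gives ~0% unbalance.
balanced = (230,
            230 * cmath.exp(-2j * cmath.pi / 3),
            230 * cmath.exp(2j * cmath.pi / 3))

# Unequal phase magnitudes (e.g. a rooftop PV raising one phase,
# a PEV loading another) introduce measurable unbalance.
unbalanced = (240,
              230 * cmath.exp(-2j * cmath.pi / 3),
              225 * cmath.exp(2j * cmath.pi / 3))
```

For the unbalanced set above the VUF is roughly 2%, which is the order of magnitude at which standards typically start to matter.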


The thesis presented in this paper is that the land fraud committed by Matthew Perrin in Queensland and inflicted upon Roger Mildenhall in Western Australia demonstrates the need for urgent procedural reform to the conveyancing process. Should this not occur, calls to reform the substantive principles of the Torrens system will be heard throughout the jurisdictions that adopt title by registration, particularly in those places where immediate indefeasibility is still the norm. This paper closely examines the factual matrix behind both of these frauds and asks what steps should have been taken to prevent them from occurring. With 2012 bringing Australian legislation embedding a national e-conveyancing system and a new Land Transfer Act for New Zealand, we ask what legislative measures should be introduced to minimise the potential for such fraud. In undertaking this study, we reflect on whether the activities of Perrin and the criminals responsible for stealing Mildenhall's land would have succeeded under the present system for automated registration utilised in New Zealand.


Food modelling systems such as the Core Foods and the Australian Guide to Healthy Eating are frequently used as nutritional assessment tools for menus in ‘well’ groups (such as boarding schools, prisons and mental health facilities), with the draft Foundation and Total Diets (FATD) the latest revision. The aim of this paper is to apply the FATD to an assessment of food provision in a long-stay, ‘well’ group setting to determine its usefulness as a tool. A detailed menu review was conducted in a 1000-bed male prison, including verification of all recipes. Full diet histories were collected from 106 prisoners, covering foods consumed from the menu and self-funded snacks. Both the menu and diet histories were analysed according to core foods, with recipes used to assist in the quantification of mixed dishes. Average core food intakes were compared with the Foundation Diet recommendations (FDRs) for males. Results showed that the standard menu provided sufficient quantity for 8 of the 13 FDRs; however, it was low in nuts, legumes and refined cereals, and marginally low in fruits and orange vegetables. The average prisoner diet achieved 9 of the 13 FDRs, notably with margarines and oils at less than half, and legumes at one seventh, of the recommended amounts. Overall, although the menu and prisoner diets could easily be assessed using the FDRs, neither was consistent with the recommendations. In long-stay settings, other Nutrient Reference Values not modelled in the FATD need consideration, in particular the Suggested Dietary Targets, and professional judgement is required in interpretation.


This paper investigates the effects of limited speech data in the context of speaker verification using a probabilistic linear discriminant analysis (PLDA) approach. Being able to reduce the length of required speech data is important to the development of automatic speaker verification systems in real-world applications. When sufficient speech is available, previous research has shown that heavy-tailed PLDA (HTPLDA) modeling of speakers in the i-vector space provides state-of-the-art performance; however, the robustness of HTPLDA to limited speech resources in development, enrolment and verification is an important issue that has not yet been investigated. In this paper, we analyze speaker verification performance with regard to the duration of utterances used for speaker evaluation (enrolment and verification) and for score normalization and PLDA modeling during development. Two different approaches to total-variability representation are analyzed within the PLDA approach to show improved performance in short-utterance mismatched evaluation conditions and in conditions for which insufficient speech resources are available for adequate system development. The results presented within this paper, using the NIST 2008 Speaker Recognition Evaluation dataset, suggest that the HTPLDA system continues to achieve better performance than Gaussian PLDA (GPLDA) as evaluation utterance lengths are decreased. We also highlight the importance of matching durations for score normalization and PLDA modeling to the expected evaluation conditions. Finally, we found that a pooled total-variability approach to PLDA modeling can achieve better performance than the traditional concatenated total-variability approach for short utterances in mismatched evaluation conditions and in conditions for which insufficient speech resources are available for adequate system development.
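The score normalization step whose duration-matching the abstract emphasises can be sketched in a few lines. This is a generic z-norm example, not the paper's exact system; the cohort scores are invented:

```python
import statistics

def z_norm(raw_score, cohort_scores):
    """Z-normalisation: scale a raw verification score by the mean and
    standard deviation of scores obtained against an impostor cohort,
    so a single decision threshold can be shared across speakers.
    Matching cohort utterance durations to the expected evaluation
    durations matters, as the paper above reports."""
    mu = statistics.mean(cohort_scores)
    sigma = statistics.stdev(cohort_scores)
    return (raw_score - mu) / sigma
```

A raw score of 3.0 against a cohort scoring [1.0, 2.0, 3.0] normalises to 1.0, i.e. one cohort standard deviation above the impostor mean.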


Sandwich shells have recently emerged as aesthetically pleasing, efficient and economical structural systems with a number of applications. They combine the advantages of sandwich layer technology with those of shell action. With different materials and thicknesses used in the sandwich layers, their performance characteristics largely remain unquantified, and there are at present no guidelines for their design. This paper provides verification, through finite element modeling and testing, of the application of this technology to dome-styled dwellings, with research currently being conducted into further applications to roofing and floor structures.


Client puzzles are cryptographic problems that are neither easy nor hard to solve. Most puzzles are based on either number-theoretic or hash inversion problems. Hash-based puzzles are very efficient but have so far been shown secure only in the random oracle model; number-theoretic puzzles, while secure in the standard model, tend to be inefficient. In this paper, we solve the problem of constructing cryptographic puzzles that are secure in the standard model and are very efficient. We present an efficient number-theoretic puzzle that satisfies the puzzle security definition of Chen et al. (ASIACRYPT 2009). To prove the security of our puzzle, we introduce a new variant of the interval discrete logarithm assumption which may be of independent interest, and show this new problem to be hard under reasonable assumptions. Our experimental results show that, for a 512-bit modulus, the solution verification time of our proposed puzzle can be up to 50x and 89x faster than the Karame-Capkun puzzle and Rivest et al.'s time-lock puzzle, respectively. In particular, the solution verification time of our puzzle is only 1.4x slower than that of Chen et al.'s efficient hash-based puzzle.
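The paper's own construction is number-theoretic, but the hash-based puzzles it benchmarks against are easy to sketch. The following is a generic partial hash-inversion puzzle (not the scheme of any of the cited papers; the difficulty parameter is illustrative): solving costs roughly 2^k hash evaluations, while verification costs a single hash, which is the efficiency asymmetry client puzzles rely on.

```python
import hashlib
from itertools import count

def _leading_zero_bits(digest: bytes) -> int:
    """Number of leading zero bits in a byte string."""
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(8 * len(digest))
    return len(bits) - len(bits.lstrip("0"))

def solve(nonce: bytes, k: int) -> int:
    """Client: brute-force an x so that SHA-256(nonce || x) starts
    with at least k zero bits (expected work ~2^k hashes)."""
    for x in count():
        digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
        if _leading_zero_bits(digest) >= k:
            return x

def verify(nonce: bytes, k: int, x: int) -> bool:
    """Server: verification is one hash, regardless of k."""
    digest = hashlib.sha256(nonce + x.to_bytes(8, "big")).digest()
    return _leading_zero_bits(digest) >= k
```

As the abstract notes, this style of puzzle is efficient but its security argument lives in the random oracle model, which is precisely the gap the paper's standard-model puzzle addresses.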


Automated feature extraction and correspondence determination is an extremely important problem in the face recognition community as it often forms the foundation of the normalisation and database construction phases of many recognition and verification systems. This paper presents a completely automatic feature extraction system based upon a modified volume descriptor. These features form a stable descriptor for faces and are utilised in a reversible jump Markov chain Monte Carlo correspondence algorithm to automatically determine correspondences which exist between faces. The developed system is invariant to changes in pose and occlusion and results indicate that it is also robust to minor face deformations which may be present with variations in expression.


The Monte Carlo DICOM Tool-Kit (MCDTK) is a software suite designed for treatment plan dose verification, using the BEAMnrc and DOSXYZnrc Monte Carlo codes. MCDTK converts DICOM-format treatment plan information into Monte Carlo input files and compares the results of Monte Carlo treatment simulations with conventional treatment planning dose calculations. In this study, a treatment is planned using a commercial treatment planning system, delivered to a pelvis phantom containing ten thermoluminescent dosimeters, and simulated using BEAMnrc and DOSXYZnrc with inputs derived from MCDTK. The dosimetric accuracy of the Monte Carlo data is then evaluated via comparisons with the dose distribution obtained from the treatment planning system as well as the in-phantom point dose measurements. The simulated beam arrangement produced by MCDTK is found to be in geometric agreement with the planned treatment. An isodose display generated from the Monte Carlo data by MCDTK shows general agreement with the isodose display obtained from the treatment planning system, except for small regions around density heterogeneities in the phantom, where the pencil-beam dose calculation performed by the treatment planning system is likely to be less accurate. All point dose measurements agree with the Monte Carlo data obtained using MCDTK, within confidence limits, and all except one of these point dose measurements show closer agreement with the Monte Carlo data than with the doses calculated by the treatment planning system. This study provides a simple demonstration of the geometric and dosimetric accuracy of Monte Carlo simulations based on information from MCDTK.
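The point-dose comparison logic described above can be sketched as follows. This is a generic check, not MCDTK code, and the 3% confidence limit is an assumed placeholder rather than the study's actual value:

```python
def percent_difference(measured, simulated):
    """Signed percent difference of a simulated dose relative to a
    measured (e.g. TLD) point dose."""
    return 100.0 * (simulated - measured) / measured

def within_confidence(measured, simulated, limit_percent=3.0):
    """True if the Monte Carlo dose at a measurement point agrees with
    the measurement within the stated confidence limit.
    limit_percent=3.0 is illustrative only."""
    return abs(percent_difference(measured, simulated)) <= limit_percent
```

For example, a simulated 2.04 Gy against a measured 2.00 Gy is a 2% difference and passes a 3% limit, while 2.20 Gy (10%) does not.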


As one of the first institutional repositories in Australia and the first in the world to have an institution-wide deposit mandate, QUT ePrints has great ‘brand recognition’ within the University (Queensland University of Technology) and beyond. The repository is managed by the library but, over the years, the Library’s repository team has worked closely with other departments (especially the Office of Research and IT Services) to ensure that QUT ePrints was embedded into the business processes and systems our academics use regularly. For example, the repository is the source of the publication information which displays on each academic’s Staff Profile page. The repository pulls in citation data from Scopus and Web of Science and displays the data in the publications records. Researchers can monitor their citations at a glance via the repository ‘View’ which displays all their publications. A trend in recent years has been to populate institutional repositories with publication details imported from the University’s research information system (RIS). The main advantage of the RIS to Repository workflow is that it requires little input from the academics as the publication details are often imported into the RIS from publisher databases. Sadly, this is also its main disadvantage. Generally, only the metadata is imported from the RIS and the lack of engagement by the academics results in very low proportions of records with open access full-texts. Consequently, while we could see the value of integrating the two systems, we were determined to make the repository the entry point for publication data. In 2011, the University funded a project to convert a number of paper-based processes into web-based workflows. This included a workflow to replace the paper forms academics used to complete to report new publications (which were later used by the data entry staff to input the details into the RIS). 
Publication details and full-text files are uploaded to the repository (by the academics or their nominees). Each night, the repository (QUT ePrints) pushes the metadata for new publications into a holding table. The data is checked by Office of Research staff the next day and then ‘imported’ into the RIS. Publication details (including the repository URLs) are pushed from the RIS to the Staff Profiles system. Previously, academics were required to supply the Office of Research with photocopies of their publications (for verification/auditing purposes). The repository is now the source of verification information. Library staff verify the accuracy of the publication details and, where applicable, the peer review status of the work. The verification metadata is included in the information passed to the Office of Research. The RIS at QUT comprises two separate systems built on an Oracle database: a proprietary product (ResearchMaster) plus a locally produced system known as RAD (Research Activity Database). The repository platform is EPrints, which is built on a MySQL database. This partly explains why the data is passed from one system to the other via a holding table. The new workflow went live in early April 2012. Tests of the technical integration have all been successful. At the end of the first 12 months, the impact of the new workflow on the proportion of full-texts deposited will be evaluated.
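The nightly push into a holding table can be sketched with a toy example. The schema, table and column names below are entirely hypothetical (QUT's actual EPrints/RIS tables are not public), and SQLite stands in for the MySQL and Oracle databases involved:

```python
import sqlite3

# Hypothetical stand-in for the repository database and the holding
# table that Office of Research staff check before import into the RIS.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE eprints (id INTEGER PRIMARY KEY, title TEXT,
                verified INTEGER, exported INTEGER DEFAULT 0)""")
conn.execute("""CREATE TABLE holding (eprint_id INTEGER, title TEXT,
                verified INTEGER)""")
conn.executemany(
    "INSERT INTO eprints (title, verified, exported) VALUES (?, ?, 0)",
    [("Paper A", 1), ("Paper B", 0)])

def nightly_push(conn):
    """Copy metadata (including the verification flag) for records not
    yet exported into the holding table, then mark them exported.
    Returns the number of records pushed."""
    rows = conn.execute(
        "SELECT id, title, verified FROM eprints WHERE exported = 0"
    ).fetchall()
    conn.executemany("INSERT INTO holding VALUES (?, ?, ?)", rows)
    conn.execute("UPDATE eprints SET exported = 1 WHERE exported = 0")
    return len(rows)
```

Running the job twice pushes each record exactly once, which is the property a repository-to-RIS bridge needs.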


The Lockyer Valley in southeast Queensland, Australia, hosts an economically significant alluvial aquifer system which has been impacted by prolonged drought conditions (~1997 to ~2009). Throughout this time, the system was under continued groundwater extraction, resulting in severe aquifer depletion. By 2008, much of the aquifer was at <30% of storage, although some relief came with rains in early 2009. However, between December 2010 and January 2011, most of southeast Queensland experienced unprecedented flooding, which generated significant aquifer recharge. In order to understand the spatial and temporal controls of groundwater recharge in the alluvium, a detailed 3D lithological property model of gravels, sands and clays was developed using the 3D geological modelling software GOCAD. The spatial distribution of recharge throughout the catchment was assessed using hydrograph data from about 400 groundwater observation wells screened at the base of the alluvium. Water levels from these bores were integrated into the catchment-wide 3D geological model, which highlights the complexity of the recharge mechanisms. To support this analysis, groundwater tracers (e.g. major and minor ions, stable isotopes, 3H and 14C) were used as independent verification. The use of these complementary methods has allowed the identification of zones where alluvial recharge primarily occurs from stream water during episodic flood events. However, the study also demonstrates that in some sections of the alluvium, rainfall recharge and discharge from the underlying basement into the alluvium are the primary recharge mechanisms. This is indicated by the absence of any response to the flood, as well as the old radiocarbon ages and distinct basement water chemistry signatures observed at these locations.
Within the 3D geological model, integration of water chemistry and time-series displays of water level surfaces before and after the flood suggests that the spatial variations of the flood response in the alluvium are primarily controlled by the valley morphology and lithological variations within the alluvium. The integration of time-series of groundwater level surfaces in the 3D geological model also enables the quantification of the volumetric change of groundwater stored in the unconfined sections of this alluvial aquifer during drought and following flood events. The 3D representation and analysis of hydraulic and recharge information has considerable advantages over the traditional 2D approach. For example, while many studies focus on singular aspects of catchment dynamics and groundwater-surface water interactions, the 3D approach is capable of integrating multiple types of information (topography, geological, hydraulic, water chemistry and spatial) into a single representation which provides valuable insights into the major factors controlling aquifer processes.
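The volumetric storage change quantified from successive water-level surfaces reduces to a simple water-balance calculation. The sketch below assumes a uniform grid and a single specific yield value, both invented for illustration (the study's model works per-cell on the GOCAD grid):

```python
def storage_change(head_before, head_after, cell_area, specific_yield):
    """Change in stored groundwater volume (m^3) in an unconfined
    aquifer between two water-level surfaces sampled on the same grid:
    dV = sum over cells of (dh * cell_area * Sy)."""
    return sum((after - before) * cell_area * specific_yield
               for before, after in zip(head_before, head_after))

# Three 50 m x 50 m cells; heads rise by 2 m, 1 m and 0 m after a
# flood; specific yield of 0.1 is a nominal value for sandy alluvium.
delta_v = storage_change([10.0, 10.0, 10.0],
                         [12.0, 11.0, 10.0],
                         cell_area=2500.0,
                         specific_yield=0.1)
```

Here the flood adds roughly 750 m^3 to storage across the three cells; applied per-cell over a full model grid, the same arithmetic yields the drought and flood storage changes described above.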


New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet-based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The "error budget" is set by the performance requirements of each application. The "expenditure" of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a "bottom up" component testing approach combined with "top down" system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance was verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
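The error-budget bookkeeping described above is simple arithmetic and can be sketched directly. The per-device error values below are invented placeholders, not measurements from the paper; only the 1 µs overall budget comes from the sampled value application requirement:

```python
def budget_ok(component_errors_ns, budget_ns=1000):
    """'Bottom up' check: worst-case sum of per-device time errors
    along the synchronisation path (grandmaster, each transparent
    clock, slave) against the application's error budget.
    Returns (within_budget, remaining_ns)."""
    spent = sum(component_errors_ns)
    return spent <= budget_ns, budget_ns - spent

# Illustrative path: grandmaster 200 ns, three transparent clocks at
# 50 ns each, slave clock 300 ns, against the 1 us (1000 ns) budget.
ok, remaining = budget_ok([200, 50, 50, 50, 300])
```

A 'top down' system verification test then confirms end-to-end that the assembled system actually stays within the same budget.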


Background: Food Frequency Questionnaires (FFQs) are commonly used in epidemiologic studies to assess long-term nutritional exposure. Because of wide variations in dietary habits in different countries, a FFQ must be developed to suit the specific population. Sri Lanka is undergoing a nutritional transition, and diet-related chronic diseases are emerging as an important health problem. Currently, no FFQ has been developed for Sri Lankan adults. In this study, we developed a FFQ to assess the regular dietary intake of Sri Lankan adults.

Methods: A nationally representative sample of 600 adults was selected by a multi-stage random cluster sampling technique, and dietary intake was assessed by random 24-h dietary recall. Nutrient analysis of the FFQ required the selection of foods, the development of recipes and their application to cooked foods to develop a nutrient database. We constructed a comprehensive food list with units of measurement. A stepwise regression method was used to identify foods contributing a cumulative 90% of variance to total energy and macronutrients. In addition, a series of photographs were included.

Results: We obtained dietary data from 482 participants, and 312 different food items were recorded. Nutritionists grouped similar food items, which resulted in a total of 178 items. After performing stepwise multiple regression, 93 foods explained 90% of the variance for total energy intake, carbohydrates, protein, total fat and dietary fibre. Finally, 90 food items and 12 photographs were selected.

Conclusion: We developed a FFQ and the related nutrient composition database for Sri Lankan adults. Culturally specific dietary tools are central to capturing the role of diet in risk for chronic disease in Sri Lanka. The next step will involve the verification of FFQ reproducibility and validity.
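The food-selection step in the Methods can be illustrated with a greedy forward-selection sketch, a simplified stand-in for the stepwise regression procedure used in the study. The data here are synthetic, and the implementation is illustrative rather than the authors' code:

```python
import numpy as np

def forward_select(X, y, target_r2=0.90):
    """Greedy forward selection: repeatedly add the food (column of X,
    intake per participant) that most improves R^2 for the response y
    (e.g. total energy intake), until cumulative R^2 reaches target_r2.
    Returns (chosen column indices, achieved R^2)."""
    n, p = X.shape
    chosen, r2 = [], 0.0
    while r2 < target_r2 and len(chosen) < p:
        best = None
        for j in range(p):
            if j in chosen:
                continue
            A = np.column_stack([X[:, chosen + [j]], np.ones(n)])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            cand_r2 = 1.0 - resid.var() / y.var()
            if best is None or cand_r2 > best[1]:
                best = (j, cand_r2)
        chosen.append(best[0])
        r2 = best[1]
    return chosen, r2

# Synthetic check: 100 participants, 5 foods; food 1 drives energy.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 1] + 0.1 * rng.normal(size=100)
```

On this synthetic data the procedure picks the truly contributing food first and stops once 90% of the variance is explained, mirroring how the 93-food subset was identified.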


This paper investigates the use of mel-frequency delta-phase (MFDP) features in comparison to, and in fusion with, traditional mel-frequency cepstral coefficient (MFCC) features within joint factor analysis (JFA) speaker verification. MFCC features, commonly used in speaker recognition systems, are derived purely from the magnitude spectrum, with the phase spectrum completely discarded. In this paper, we investigate whether features derived from the phase spectrum can provide additional speaker discriminant information to the traditional MFCC approach in a JFA-based speaker verification system. Results are presented which compare MFCC-only, MFDP-only and score fusion of the two approaches within a JFA speaker verification approach. Based upon the results presented using the NIST 2008 Speaker Recognition Evaluation (SRE) dataset, we believe that, while MFDP features alone cannot compete with MFCC features, MFDP can provide complementary information that results in improved speaker verification performance when the two approaches are combined in score fusion, particularly in the case of shorter utterances.
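Score-level fusion of the two sub-systems can be sketched as a weighted sum. The weight here is an invented placeholder; in practice it would be tuned on development data, and the paper does not state its fusion weights:

```python
def fuse(mfcc_score, mfdp_score, alpha=0.7):
    """Linear score-level fusion of an MFCC-based and an MFDP-based
    verification score. alpha weights the (stronger) MFCC stream;
    the value 0.7 is illustrative only."""
    return alpha * mfcc_score + (1.0 - alpha) * mfdp_score
```

The fused score is then thresholded exactly as a single-system score would be; the complementary phase information shifts borderline trials, which is where the abstract reports the gains on short utterances.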


The use of the Trusted Platform Module (TPM) is becoming increasingly popular in many security systems. To access objects protected by the TPM (such as cryptographic keys), several cryptographic protocols, such as the Object Specific Authorization Protocol (OSAP), can be used. Given the sensitivity and importance of the objects protected by the TPM, the security of this protocol is vital. Formal methods allow a precise and complete analysis of cryptographic protocols, such that their security properties can be asserted with high assurance. Unfortunately, formal verification of these protocols is limited, despite the abundance of formal tools that one can use. In this paper, we demonstrate the use of Coloured Petri Nets (CPN), a type of formal technique, to formally model the OSAP. Using this model, we then verify the authentication property of this protocol using the state space analysis technique. The results of the analysis demonstrate that, as reported by Chen and Ryan, the authentication property of OSAP can be violated.
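The session-secret derivation at the heart of OSAP can be sketched as follows. This is a simplified illustration in the style of the TPM 1.2 specification (an HMAC-SHA1 over the OSAP session nonces, keyed with the entity's authorisation value); consult the TPM specification for the authoritative message formats:

```python
import hashlib
import hmac

def osap_shared_secret(entity_auth: bytes,
                       nonce_even_osap: bytes,
                       nonce_odd_osap: bytes) -> bytes:
    """Sketch of the OSAP shared session secret: HMAC-SHA1 keyed with
    the entity's authorisation value over the two session nonces.
    Both the TPM and the caller can derive it, so neither sends the
    authorisation value on the wire. Simplified for illustration."""
    return hmac.new(entity_auth,
                    nonce_even_osap + nonce_odd_osap,
                    hashlib.sha1).digest()
```

Both parties deriving the same 20-byte secret from the same nonces is exactly the authentication assumption that the CPN state space analysis above shows can be violated under the Chen and Ryan attack.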


Computational Fluid Dynamics (CFD) simulations are widely used in mechanical engineering. Although achieving a high level of confidence in numerical modelling is of crucial importance in the field of turbomachinery, verification and validation of CFD simulations are very tricky, especially for the complex flows encountered in radial turbines. Comprehensive studies of radial machines are available in the literature. Unfortunately, none of them includes enough detailed geometric data to be properly reproduced, and so they cannot be considered for academic research and validation purposes. As a consequence, design improvements of such configurations are difficult. Moreover, it seems that well-developed analyses of radial turbines are used in commercial software but are not available in the open literature, especially at high pressure ratios. It is the purpose of this paper to provide a fully open set of data to reproduce the exact geometry of the high pressure ratio single-stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multipurpose Small Power Unit. First, preliminary one-dimensional meanline design and analysis are performed using the commercial software RITAL from Concepts-NREC in order to establish a complete reference test case available for turbomachinery code validation. The proposed design of the existing turbine is then carefully and successfully checked against the geometrical and experimental data partially published in the literature. Then, three-dimensional Reynolds-Averaged Navier-Stokes simulations are conducted by means of the Axcent/PushButton CFD software. The effect of the tip clearance gap is investigated in detail for a wide range of operating conditions. The results confirm that the 3D geometry is correctly reproduced. They also reveal that the turbine is shocked, although it was designed for high-subsonic flow, and highlight the importance of the diffuser.
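A flavour of the one-dimensional meanline analysis can be given with the standard spouting-velocity estimate for a radial-inflow turbine. The numbers below (inlet temperature, pressure ratio, tip speed) are nominal assumptions for illustration, not the T-100 turbine's actual data, and air properties are taken as ideal:

```python
import math

def spouting_velocity(t0_in, pressure_ratio, gamma=1.4, cp=1004.5):
    """Isentropic 'spouting' velocity c0 (m/s) for a given inlet
    stagnation temperature (K) and total-to-static pressure ratio:
    c0 = sqrt(2 * cp * T0 * (1 - PR^(-(gamma-1)/gamma))).
    gamma and cp are nominal values for air."""
    return math.sqrt(2.0 * cp * t0_in *
                     (1.0 - pressure_ratio ** (-(gamma - 1.0) / gamma)))

def velocity_ratio(u_tip, c0):
    """Blade speed ratio U/c0; well-designed radial-inflow turbines
    typically peak in efficiency near U/c0 ~ 0.7."""
    return u_tip / c0

# Nominal operating point: 1000 K inlet, pressure ratio 5, 600 m/s tip.
c0 = spouting_velocity(1000.0, 5.0)
nu = velocity_ratio(600.0, c0)
```

Checks like this against published meanline correlations are how the reproduced geometry and the RITAL preliminary design can be cross-validated before committing to full 3D RANS simulations.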