412 results for Requirements elicitation techniques


Relevance: 20.00%

Abstract:

This paper investigates advanced channel compensation techniques for improving i-vector speaker verification performance in the presence of high intersession variability, using the NIST 2008 and 2010 SRE corpora. The performance of four channel compensation techniques is investigated: (a) weighted maximum margin criterion (WMMC), (b) source-normalized WMMC (SN-WMMC), (c) weighted linear discriminant analysis (WLDA), and (d) source-normalized WLDA (SN-WLDA). We show that, by extracting the discriminatory information between pairs of speakers as well as capturing the source variation information in the development i-vector space, the SN-WLDA-based cosine similarity scoring (CSS) i-vector system provides over 20% improvement in EER for NIST 2008 interview and microphone verification and over 10% improvement in EER for NIST 2008 telephone verification, compared to the SN-LDA-based CSS i-vector system. Further, score-level fusion techniques are analyzed to combine the best channel compensation approaches, providing over 8% improvement in DCF over the best single approach (SN-WLDA) for the NIST 2008 interview/telephone enrolment-verification condition. Finally, we demonstrate that the improvements found in the context of CSS also generalize to state-of-the-art GPLDA, with up to 14% relative improvement in EER for NIST SRE 2010 interview and microphone verification and over 7% relative improvement in EER for NIST SRE 2010 telephone verification.
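
As a rough illustration of the cosine similarity scoring back-end referred to above, the sketch below scores a pair of channel-compensated i-vectors. The dimensions, projection matrix and decision threshold are hypothetical placeholders, and this is not the authors' SN-WLDA implementation.

import numpy as np

def css_score(w_enrol, w_test, projection=None):
    # Cosine similarity score between two i-vectors, optionally after applying a
    # channel-compensation projection (e.g. an LDA/WLDA transform estimated on
    # development data).
    if projection is not None:
        w_enrol = projection @ w_enrol
        w_test = projection @ w_test
    return float(np.dot(w_enrol, w_test) /
                 (np.linalg.norm(w_enrol) * np.linalg.norm(w_test)))

# Hypothetical 400-dimensional i-vectors and a stand-in compensation transform.
rng = np.random.default_rng(0)
w_enrol, w_test = rng.normal(size=400), rng.normal(size=400)
A = rng.normal(size=(200, 400))                # placeholder for a trained (SN-)WLDA projection
accept = css_score(w_enrol, w_test, A) > 0.5   # threshold would be tuned on development data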

Relevance: 20.00%

Abstract:

Ascorbic acid or vitamin C is involved in a number of biochemical pathways that are important to exercise metabolism and the health of exercising individuals. This review reports the results of studies investigating the requirement for vitamin C with exercise on the basis of dietary vitamin C intakes, the response to supplementation and alterations in plasma, serum, and leukocyte ascorbic acid concentration following both acute exercise and regular training. The possible physiological significance of changes in ascorbic acid with exercise is also addressed. Exercise generally causes a transient increase in circulating ascorbic acid in the hours following exercise, but a decline below pre-exercise levels occurs in the days after prolonged exercise. These changes could be associated with increased exercise-induced oxidative stress. On the basis of alterations in the concentration of ascorbic acid within the blood, it remains unclear if regular exercise increases the metabolism of vitamin C. However, the similar dietary intakes and responses to supplementation between athletes and nonathletes suggest that regular exercise does not increase the requirement for vitamin C in athletes. Two novel hypotheses are put forward to explain recent findings of attenuated levels of cortisol postexercise following supplementation with high doses of vitamin C.

Relevance: 20.00%

Abstract:

A number of mathematical models investigating certain aspects of the complicated process of wound healing have been reported in the literature in recent years. However, effective numerical methods, and supporting error analysis, for the fractional equations which describe the process of wound healing are still limited. In this paper, we consider the numerical simulation of a fractional mathematical model of epidermal wound healing (FMM-EWH), which is based on coupled advection-diffusion equations for cell and chemical concentration in a polar coordinate system. The space fractional derivatives are defined in the left and right Riemann-Liouville sense. The fractional orders in the advection and diffusion terms belong to the intervals (0, 1) and (1, 2], respectively. Several numerical techniques are used. Firstly, the coupled advection-diffusion equations are decoupled into a single space-fractional advection-diffusion equation in a polar coordinate system. Secondly, we propose a new implicit difference method for simulating this equation, using the equivalence of the Riemann-Liouville and Grünwald-Letnikov fractional derivative definitions. Thirdly, the stability and convergence of the method are discussed. Finally, some numerical results are given to demonstrate the theoretical analysis.
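
As a simplified illustration of the discretisation strategy (not the decoupled polar-coordinate scheme developed in the paper), the sketch below builds the shifted Grünwald-Letnikov weights and performs one implicit time step for a one-dimensional space-fractional diffusion equation, with zero Dirichlet values assumed outside the grid.

import numpy as np

def gl_weights(alpha, n):
    # Grünwald-Letnikov coefficients g_k = (-1)^k * binom(alpha, k), k = 0..n,
    # computed with the standard recurrence g_k = g_{k-1} * (k - 1 - alpha) / k.
    g = np.empty(n + 1)
    g[0] = 1.0
    for k in range(1, n + 1):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    return g

def implicit_step(u, alpha, kappa, dx, dt):
    # One implicit (backward Euler) step of u_t = kappa * d^alpha u / dx^alpha,
    # left-sided derivative, shifted Grünwald-Letnikov approximation.
    n = len(u)
    g = gl_weights(alpha, n)
    mu = kappa * dt / dx**alpha
    A = np.eye(n)
    for i in range(n):
        for k in range(i + 2):    # shifted sum: D^alpha u_i ~ (1/dx^alpha) sum_k g_k u_{i-k+1}
            j = i - k + 1
            if 0 <= j < n:
                A[i, j] -= mu * g[k]
    return np.linalg.solve(A, u)

# Hypothetical parameters: alpha in (1, 2], as for the diffusion term of the model.
x = np.linspace(0.0, 1.0, 101)
u0 = np.exp(-100.0 * (x - 0.5) ** 2)    # illustrative initial concentration profile
u1 = implicit_step(u0, alpha=1.8, kappa=0.1, dx=x[1] - x[0], dt=1e-3)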

Relevance: 20.00%

Abstract:

This chapter provides researchers with a guide to some of the types of dating techniques that can be used in geomorphological investigations and to issues that need to be addressed when using geochronological data, specifically issues relating to accuracy and precision. The chapter also introduces the 'types' of dating methods that are commonly used in geomorphological studies, including sidereal, isotopic, radiogenic, and chemical dating methods.
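
As background for the isotopic and radiogenic methods mentioned, the classical radiometric age equation (standard textbook material rather than a result of the chapter) relates elapsed time to the measured daughter-to-parent ratio:

t = \frac{1}{\lambda} \ln\left(1 + \frac{D}{P}\right)

where P is the number of parent atoms remaining, D the number of radiogenic daughter atoms produced, and \lambda the decay constant of the parent isotope.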

Relevance: 20.00%

Abstract:

Restoring old buildings to conform to current building policies and standards is a great challenge for engineers and architects. The restoration of Brisbane City Hall, a heritage building listed by the State of Queensland in Australia, developed an innovative approach to upgrading the building using a method called 'concrete overlay', following the guidelines of both the International Council on Monuments and Sites and the Burra Charter of Australia. Concrete overlay is a new method of structural strengthening in which new reinforcement is drilled in and new concrete is placed on top of the existing structure, akin to a bone transplant or bone grafting in a human being. This method is commonly used for newer bridges that have suffered load stresses. However, it had never been used on heritage buildings, which were built under different conditions and standards. The compatibility of the method is currently being monitored. Most modern historic buildings are rapidly deteriorating and require immediate intervention in order to be saved. As most of these heritage buildings are at an advanced stage of deterioration, significant attempts are being made and several innovations are being applied to upgrade these structures to conform to current building requirements. To date, the knowledge and literature regarding 'concrete cancer' in relation to rehabilitating reinforced concrete heritage structures is significantly lacking. It is hoped that the concrete overlay method and the case study of the Brisbane City Hall restoration will contribute to the development of restoration techniques and policies for modern heritage buildings.

Relevance: 20.00%

Abstract:

1. Autonomous acoustic recorders are widely available and can provide a highly efficient method of species monitoring, especially when coupled with software to automate data processing. However, the adoption of these techniques is restricted by a lack of direct comparisons with existing manual field surveys.
2. We assessed the performance of autonomous methods by comparing manual and automated examination of acoustic recordings with a field-listening survey, using commercially available autonomous recorders and custom call detection and classification software. We compared the detection capability, time requirements, areal coverage and weather condition bias of these three methods using an established call monitoring programme for a nocturnal bird, the little spotted kiwi (Apteryx owenii).
3. The autonomous recorder methods had very high precision (>98%) and required <3% of the time needed for the field survey. They were less sensitive, with visual spectrogram inspection recovering 80% of the total calls detected and automated call detection 40%, although this recall increased with signal strength. The areal coverage of the spectrogram inspection and automatic detection methods was 85% and 42% of that of the field survey, respectively. The methods using autonomous recorders were more adversely affected by wind and did not show the positive association between ground moisture and call rates that was apparent from the field counts. However, all methods produced the same results for the most important conservation information from the survey: the annual change in calling activity.
4. Autonomous monitoring techniques are subject to different biases than manual surveys and so can yield different ecological conclusions if sampling is not adjusted accordingly. Nevertheless, the sensitivity, robustness and high accuracy of automated acoustic methods demonstrate that they offer a suitable and extremely efficient alternative to field observer point counts for species monitoring.
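
The precision and recall figures in point 3 follow the standard definitions; the short sketch below (with purely illustrative counts, not data from the study) shows how they are computed when automated detections are scored against a manually annotated reference.

def detection_metrics(true_positives, false_positives, false_negatives):
    # Precision and recall of automated call detections scored against a manual reference.
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# Illustrative counts only: 400 correct detections, 8 false alarms and 600 missed
# reference calls give precision ~0.98 and recall 0.40.
precision, recall = detection_metrics(true_positives=400, false_positives=8,
                                      false_negatives=600)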

Relevance: 20.00%

Abstract:

The design-build (DB) delivery system is an effective means of delivering a green construction project, and selecting an appropriate contractor is critical to project success. Moreover, the delivery of green buildings requires specific design, construction, and operation and maintenance considerations not generally encountered in the procurement of conventional buildings. Specifying clear sustainability requirements to potential contractors is particularly important in achieving sustainable project goals. However, many client/owners either do not explicitly specify sustainability requirements or do so in a prescriptive manner during the project procurement process. This paper investigates the current state-of-the-art procurement process used in specifying the sustainability requirements of the public sector in the USA construction market by means of a robust content analysis of 40 design-build requests for proposals (RFPs). The results of the content analysis indicate that sustainability is one of the most important dimensions in the best-value evaluation of DB contractors. Client/owners predominantly specify LEED certification levels (e.g. LEED Certified, Silver, Gold, and Platinum) for a particular facility, and include the sustainability requirements as selection criteria (with specific importance weightings) for contractor evaluation. Additionally, larger projects tend to allocate higher importance weightings to sustainability requirements. This study provides public DB client/owners with a number of practical implications for selecting appropriate design-builders for sustainable DB projects.

Relevance: 20.00%

Abstract:

Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Sockets Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables secure outsourcing of cryptographic tasks, in particular partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavors. We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme that has this property and a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
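
To make the notion of a computational puzzle concrete, the sketch below shows a generic repeated-squaring (time-lock style) puzzle: the issuer, who knows the factorisation of n, can create and check puzzles with a single modular exponentiation, while the solver must perform t sequential squarings. This is a textbook-style illustration with toy parameters, not the provably secure modular-exponentiation scheme proposed in the thesis.

# Generic repeated-squaring (time-lock style) puzzle; illustration only, with toy primes.
p, q = 10007, 10009                      # real deployments would use a large RSA modulus
n, phi = p * q, (p - 1) * (q - 1)

def create_puzzle(x, t):
    # Issuer side: knowing phi(n), x^(2^t) mod n costs a single modular exponentiation.
    e = pow(2, t, phi)
    return pow(x, e, n)

def solve_puzzle(x, t):
    # Solver side: without phi(n), t sequential modular squarings are required.
    y = x % n
    for _ in range(t):
        y = (y * y) % n
    return y

x, t = 123456789, 100000                 # t controls the amount of work demanded
assert solve_puzzle(x, t) == create_puzzle(x, t)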

Relevance: 20.00%

Abstract:

The feral pig, Sus scrofa, is a widespread and abundant invasive species in Australia. Feral pigs pose a significant threat to the environment, agricultural industry, and human health, and in far north Queensland they endanger World Heritage values of the Wet Tropics. Historical records document the first introduction of domestic pigs into Australia via European settlers in 1788 and subsequent introductions from Asia from 1827 onwards. Since this time, domestic pigs have been accidentally and deliberately released into the wild and significant feral pig populations have become established, resulting in the declaration of this species as a class 2 pest in Queensland. The overall objective of this study was to assess the population genetic structure of feral pigs in far north Queensland, in particular to enable delineation of demographically independent management units. The identification of ecologically meaningful management units using molecular techniques can assist in targeting feral pig control to bring about effective long-term management. Molecular genetic analysis was undertaken on 434 feral pigs from 35 localities between Tully and Innisfail. Seven polymorphic and unlinked microsatellite loci were screened, and fixation indices (FST and analogues) and Bayesian clustering methods were used to identify population structure and management units in the study area. The hyper-variable mitochondrial control region (D-loop) of 35 feral pigs was also sequenced to identify pig ancestry. Three management units were identified in the study at a scale of 25 to 35 km. Even with the strong pattern of genetic structure identified in the study area, some evidence of long-distance dispersal and/or translocation was found, as a small number of individuals exhibited ancestry from a management unit other than the one in which they were sampled. Overall, gene flow in the study area was found to be influenced by environmental features such as topography and land use, but no distinct or obvious natural or anthropogenic geographic barriers were identified. Furthermore, strong evidence was found for non-random mating between pigs of European and Asian breeds, indicating that feral pig ancestry influences their population genetic structure. Phylogenetic analysis revealed two distinct mitochondrial DNA clades, representing Asian domestic pig breeds and European breeds. A significant finding was that pigs of Asian origin living in Innisfail and south Tully were not mating randomly with European breed pigs populating the nearby Mission Beach area. Feral pig control should be implemented in each of the management units identified in this study. The control should be coordinated across properties within each management unit to prevent re-colonisation from adjacent localities. The adjacent rainforest and National Park Estates, as well as the rainforest-crop boundary, should be included in a simultaneous control operation for greater success.
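
For reference, the fixation index mentioned above is, in Wright's classical form (the study estimates FST and related analogues from the microsatellite data):

F_{ST} = \frac{H_T - H_S}{H_T}

where H_T is the expected heterozygosity of the pooled population and H_S the mean expected heterozygosity within subpopulations; values near zero indicate little differentiation, while larger values indicate restricted gene flow between putative management units.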

Relevance: 20.00%

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the 'gold standard' for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to an MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims.

Methods: (1a) Plan importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field.

(1b) Dose simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient.

(2) Dose comparison: TPS dose calculations can be obtained either by a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are spatial-resolution independent and able to interpolate for comparisons.

Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.

Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.

Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
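
To indicate how the dose comparison metrics above operate, the sketch below gives a brute-force global gamma evaluation between two dose grids defined on the same voxel lattice. The 3%/3 mm criteria and the test data are assumptions for illustration; the implementation described in the abstract is resolution independent and interpolating, which this simplified version is not.

import numpy as np

def gamma_index(dose_ref, dose_eval, voxel_mm, dose_tol=0.03, dist_tol_mm=3.0):
    # Brute-force global gamma for two dose grids on the same regular lattice.
    # dose_tol is a fraction of the reference maximum (global normalisation);
    # dist_tol_mm is the distance-to-agreement criterion.
    shape = dose_ref.shape
    axes = [np.arange(s) * v for s, v in zip(shape, voxel_mm)]
    coords = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1).reshape(-1, len(shape))
    d_eval = dose_eval.reshape(-1)
    dose_norm = dose_tol * dose_ref.max()
    gamma = np.empty(dose_ref.size)
    for i, (r, d) in enumerate(zip(coords, dose_ref.reshape(-1))):
        dist2 = np.sum((coords - r) ** 2, axis=1) / dist_tol_mm**2
        diff2 = (d_eval - d) ** 2 / dose_norm**2
        gamma[i] = np.sqrt(np.min(dist2 + diff2))
    return gamma.reshape(shape)          # points with gamma <= 1 pass the criterion

# Illustrative 2D grids with 2 mm voxels (synthetic data, not a clinical plan).
rng = np.random.default_rng(1)
ref = rng.random((40, 40))
ev = ref + 0.01 * rng.normal(size=ref.shape)
pass_rate = np.mean(gamma_index(ref, ev, voxel_mm=(2.0, 2.0)) <= 1.0)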

Relevance: 20.00%

Abstract:

This paper addresses the question of how interim financial reporting regulation varies across the Asia-Pacific region. Using a content analysis method, the study investigates the relevant regulations in eight selected countries in the Asia-Pacific region which differ in a number of country-level attributes. We find that the regulations in the region show considerable variation in terms of the form of regulatory enforcement, reporting lag, audit requirements, and reporting form. By providing the first in-depth review of the nature of differences in interim financial reporting in key countries in the Asia-Pacific region, the findings of this study will be of interest to investors, regulators and researchers in their quest for international “convergence” in financial reporting practices.

Relevance: 20.00%

Abstract:

Quantitative market data has traditionally been used throughout marketing and business as a tool to inform and direct design decisions. However, in our changing economic climate, businesses need to innovate and create products their customers will love. Deep customer insight methods move beyond simply questioning customers and aim to provoke true emotional responses in order to reveal new opportunities that go beyond functional product requirements. This paper explores traditional market research methods and compares them to methods used to gain deep customer insights. The study reports on a collaborative research project with seven small-to-medium enterprises and four multinational organisations. Firms were introduced to a design-led innovation approach and were taught the different methods for gaining deep customer insights. Interviews were conducted to understand the experience and outcomes of pre-existing research methods and deep customer insight approaches. The findings indicate that deep customer insights are unlikely to be revealed through traditional market research techniques. The theoretical outcome of this study is a complementary methods matrix, providing guidance on appropriate research methods in accordance with a project's timeline.

Relevance: 20.00%

Abstract:

The increased adoption of business process management approaches, tools and practices has led organizations to accumulate large collections of business process models. These collections can easily include hundreds to thousands of models, especially in the context of multinational corporations or as a result of organizational mergers and acquisitions. A concrete problem is thus how to maintain these large repositories in such a way that their complexity does not hamper their practical usefulness as a means to describe and communicate business operations. This paper proposes a technique to automatically infer suitable names for business process models and fragments thereof. This technique is useful for model abstraction scenarios, for instance when user-specific views of a repository are required, or as part of a refactoring initiative aimed at simplifying the repository's complexity. The technique is grounded in an adaptation of the theory of meaning to the realm of business process models. We implemented the technique in a prototype tool and conducted an extensive evaluation using three process model collections from practice and a case study involving process modelers with different levels of experience.

Relevance: 20.00%

Abstract:

With the increasing popularity and adoption of building information modeling (BIM), the amount of digital information available about a building is overwhelming. Enormous challenges remain, however, in identifying meaningful and required information from a complex BIM model to support a particular construction management (CM) task. Detailed specifications of the information required by different construction domains, together with expressive and easy-to-use BIM reasoning mechanisms, are seen as an important means of addressing these challenges. This paper analyzes some of the characteristics and requirements of component-specific construction knowledge in relation to current work practice and BIM-based applications. It is argued that domain ontologies and information extraction approaches, such as queries, could bring much-needed support for knowledge sharing and the integration of information between design, construction and facility management.