286 results for correctness verification


Relevance: 10.00%

Abstract:

A method for prediction of the radiation pattern of N strongly coupled antennas with mismatched sources is presented. The method facilitates fast and accurate design of compact arrays. The prediction is based on the measured N-port S-parameters of the coupled antennas and the N active element patterns measured in a 50 Ω environment. By introducing equivalent power sources, the radiation pattern under excitation by sources with arbitrary impedances and various decoupling and matching networks (DMN) can be accurately predicted without the need for additional measurements. Two experiments were carried out for verification: pattern prediction for parasitic antennas with different loads and for antennas with a DMN. The difference between measured and predicted patterns was within 1 to 2 dB.
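
As a rough illustration of this kind of prediction, the sketch below superposes measured 50 Ω active element patterns with port excitations recomputed for mismatched sources from the measured S-parameters. The wave formulation, function name and array shapes are assumptions made for illustration, not the paper's exact equivalent-power-source derivation.

```python
import numpy as np

def predict_pattern(S, active_patterns, source_waves, source_impedances, Z0=50.0):
    """Superpose measured 50-ohm active element patterns with excitation
    weights recomputed for mismatched sources.

    S                 : (N, N) measured S-parameter matrix of the coupled array
    active_patterns   : (N, K) complex active element patterns sampled at K angles
    source_waves      : (N,) equivalent source wave vector b_s (assumed known)
    source_impedances : (N,) complex source impedances
    """
    N = S.shape[0]
    # Reflection coefficients of the (possibly mismatched) sources.
    gamma_s = np.diag((source_impedances - Z0) / (source_impedances + Z0))
    # The incident port waves a solve  a = b_s + Gamma_s @ S @ a.
    a = np.linalg.solve(np.eye(N) - gamma_s @ S, source_waves)
    # Far field = superposition of active element patterns weighted by a.
    return active_patterns.T @ a
```

Solving the linear system (I − Γ_s S) a = b_s accounts for the mutual coupling, which is why no additional pattern measurements are needed once the 50 Ω active element patterns and the S-parameters are known.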

Relevance: 10.00%

Abstract:

A significant amount of speech is typically required for speaker verification system development and evaluation, especially in the presence of large intersession variability. This paper introduces source- and utterance-duration-normalized linear discriminant analysis (SUN-LDA) approaches to compensate for session variability in short-utterance i-vector speaker verification systems. Two variations of SUN-LDA are proposed in which normalization techniques are used to capture source variation from both short and full-length development i-vectors, one based upon pooling (SUN-LDA-pooled) and the other on concatenation (SUN-LDA-concat) across the duration- and source-dependent session variation. Both the SUN-LDA-pooled and SUN-LDA-concat techniques are shown to improve on traditional LDA on the NIST 08 truncated 10sec-10sec evaluation conditions, with the largest gains from SUN-LDA-concat: a relative improvement in EER of 8% for mismatched conditions and over 3% for matched conditions over traditional LDA approaches.
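
As a hedged sketch of one plausible reading of the pooled variant, the code below estimates a standard LDA projection from full-length and truncated development i-vectors stacked into a single training set, so that duration-related session variation contributes to the within-class scatter. The function names, dimensions and pooling strategy are illustrative assumptions, not the paper's exact SUN-LDA formulation.

```python
import numpy as np

def lda_projection(ivectors, speaker_labels, n_dims):
    """Standard LDA projection estimated from labelled i-vectors."""
    classes = np.unique(speaker_labels)
    mu = ivectors.mean(axis=0)
    d = ivectors.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in classes:
        X = ivectors[speaker_labels == c]
        mu_c = X.mean(axis=0)
        Sb += len(X) * np.outer(mu_c - mu, mu_c - mu)   # between-class scatter
        Sw += (X - mu_c).T @ (X - mu_c)                  # within-class scatter
    # Leading eigenvectors of Sw^{-1} Sb span the LDA subspace.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1][:n_dims]
    return eigvecs[:, order].real

def sun_lda_pooled(full_iv, full_lbl, short_iv, short_lbl, n_dims=200):
    """'Pooled' variant (illustrative): estimate the scatter matrices from
    full-length and truncated development i-vectors stacked together, so that
    duration/source variation enters the within-class scatter."""
    pooled = np.vstack([full_iv, short_iv])
    labels = np.concatenate([full_lbl, short_lbl])
    return lda_projection(pooled, labels, n_dims)
```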

Relevance: 10.00%

Abstract:

This research developed and scientifically validated a new ultrasound transmission computed tomography system for the quantitative assessment of a polymer gel dosimeter, including verification of the dose response of the ultrasonic parameters attenuation, velocity and broadband ultrasound attenuation (BUA). This work was the first to investigate and report ultrasound frequency-dependent attenuation in a gel dosimeter, demonstrating a dose dependence.
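
A minimal sketch of how frequency-dependent attenuation and BUA can be estimated from through-transmission measurements is shown below, assuming a log-spectral-ratio insertion-loss estimate and a linear fit over an illustrative frequency band. The band limits, signal names and calibration details are assumptions, not the system described in the abstract.

```python
import numpy as np

def attenuation_and_bua(sig_sample, sig_reference, fs, f_lo=0.3e6, f_hi=0.7e6):
    """Frequency-dependent attenuation (log-spectral ratio, in dB) and BUA
    (slope of a linear fit of attenuation versus frequency over a band).

    sig_sample, sig_reference : through-transmission pulses with and without
                                the gel sample in the beam path
    fs                        : sampling frequency in Hz
    f_lo, f_hi                : fit band in Hz (illustrative values only)
    """
    n = len(sig_sample)
    eps = 1e-12
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    A_s = np.abs(np.fft.rfft(sig_sample)) + eps
    A_r = np.abs(np.fft.rfft(sig_reference)) + eps
    atten_db = 20.0 * np.log10(A_r / A_s)          # insertion-loss spectrum
    band = (freqs >= f_lo) & (freqs <= f_hi)
    # BUA is the slope of attenuation vs. frequency, usually quoted in dB/MHz.
    slope, _ = np.polyfit(freqs[band] / 1e6, atten_db[band], 1)
    return atten_db, slope
```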

Relevance: 10.00%

Abstract:

Introduction. The purpose of this chapter is to address the question raised in the chapter title. Specifically, how can models of motor control help us understand low back pain (LBP)? There are several classes of models that have been used in the past for studying spinal loading, stability, and risk of injury (see Reeves and Cholewicki (2003) for a review of past modeling approaches), but for the purpose of this chapter we will focus primarily on models used to assess motor control and its effect on spine behavior. This chapter consists of four sections. The first section discusses why a shift in modeling approaches is needed to study motor control issues. We will argue that the current approach for studying the spine system is limited and not well suited for assessing motor control issues related to spine function and dysfunction. The second section will explore how models can be used to gain insight into how the central nervous system (CNS) controls the spine. This segues nicely into the next section, which will address how models of motor control can be used in the diagnosis and treatment of LBP. Finally, the last section will deal with the issue of model verification and validity. This issue is important since modeling accuracy is critical for obtaining useful insight into the behavior of the system being studied. This chapter is not intended to be a critical review of the literature, but instead to capture some of the discussion raised during the 2009 Spinal Control Symposium, with some elaboration on certain issues. Readers interested in more details are referred to the cited publications.

Relevance: 10.00%

Abstract:

Application of 'advanced analysis' methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A research project has been conducted with the aim of developing concentrated plasticity methods suitable for practical advanced analysis of steel frame structures comprising non-compact sections. A primary objective was to produce a comprehensive range of new distributed plasticity analytical benchmark solutions for verification of the concentrated plasticity methods. A distributed plasticity model was developed using shell finite elements to explicitly account for the effects of gradual yielding and spread of plasticity, initial geometric imperfections, residual stresses and local buckling deformations. The model was verified by comparison with large-scale steel frame test results and a variety of existing analytical benchmark solutions. This paper presents a description of the distributed plasticity model and details of the verification study.

Relevance: 10.00%

Abstract:

Application of 'advanced analysis' methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A research project has been conducted with the aim of developing concentrated plasticity methods suitable for practical advanced analysis of steel frame structures comprising non-compact sections. A series of large-scale tests was performed in order to provide experimental results for verification of the new analytical models. Each of the test frames comprised non-compact sections and exhibited significant local buckling behaviour prior to failure. This paper presents details of the test program, including the test specimens, set-up and instrumentation, procedure, and results.

Relevance: 10.00%

Abstract:

Lyngbya majuscula is a cyanobacterium (blue-green algae) occurring naturally in tropical and subtropical coastal areas worldwide. Deception Bay, in Northern Moreton Bay, Queensland, has a history of Lyngbya blooms and forms a case study for this investigation. The South East Queensland (SEQ) Healthy Waterways Partnership, a collaboration between government, industry, research and the community, was formed to address issues affecting the health of the river catchments and waterways of South East Queensland. The Partnership coordinated the Lyngbya Research and Management Program (2005-2007), which culminated in a Coastal Algal Blooms (CAB) Action Plan for harmful and nuisance algal blooms such as Lyngbya majuscula. This first phase of the project was predominantly of a scientific nature and also facilitated the collection of additional data to better understand Lyngbya blooms. The second phase of the project, the SEQ Healthy Waterways Strategy 2007-2012, is now underway to implement the CAB Action Plan and as such is more management-focussed. As part of the first phase of the project, a Science model for the initiation of a Lyngbya bloom was built using Bayesian Networks (BNs). The structure of the Science Bayesian Network was built by the Lyngbya Science Working Group (LSWG), which was drawn from diverse disciplines. The BN was then quantified with annual data and expert knowledge. Scenario testing confirmed the expected temporal nature of bloom initiation, and it was recommended that the next version of the BN be extended to take this into account. Elicitation for this BN thus occurred at three levels: design, quantification and verification. The first level involved construction of the conceptual model itself, definition of the nodes within the model and identification of sources of information to quantify the nodes. The second level included elicitation of expert opinion and representation of this information in a form suitable for inclusion in the BN. The third and final level concerned the specification of scenarios used to verify the model. The second phase of the project provides the opportunity to update the network with the newly collected detailed data obtained during the previous phase. Specifically, the temporal nature of Lyngbya blooms is of interest: management efforts need to be directed to the periods when the Bay is most vulnerable to bloom initiation. To model the temporal aspects of Lyngbya we are using Object-Oriented Bayesian Networks (OOBNs) to create 'time slices' for each of the periods of interest during the summer. OOBNs provide a framework to simplify knowledge representation and facilitate reuse of nodes and network fragments. An OOBN is more hierarchical than a traditional BN, with any sub-network able to contain other sub-networks. Connectivity between OOBNs is an important feature and allows information flow between the time slices. This study demonstrates more sophisticated use of expert information within Bayesian networks, combining expert knowledge with data (categorized using expert-defined thresholds) within an expert-defined model structure. Based on the results from the verification process, the experts are able to target areas requiring greater precision and those exhibiting temporal behaviour. The time slices incorporate the data for each time period for each of the temporal nodes (instead of using the annual data from the previous static Science BN) and include lag effects so that the effect of one time slice flows to the next.
We demonstrate a concurrent steady increase in the probability of initiation of a Lyngbya bloom and conclude that the inclusion of temporal aspects in the BN model is consistent with the perceptions of Lyngbya behaviour held by the stakeholders. This extended model provides a more accurate representation of the increased risk of algal blooms in the summer months and shows that the opinions elicited to inform a static BN can be readily extended to a dynamic OOBN, providing more comprehensive information for decision makers.
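
The sketch below illustrates, in simplified form, how a lag effect can carry bloom-initiation probability from one summer time slice to the next. The conditional probability values and the single "driver" node are invented placeholders, not the elicited sub-networks of the actual OOBN.

```python
import numpy as np

# Illustrative-only CPT: P(bloom initiation in slice t | driver level, bloom in t-1).
# Rows: driver level (low, medium, high); columns: bloom in previous slice (no, yes).
CPT_INIT = np.array([
    [0.05, 0.15],
    [0.20, 0.40],
    [0.45, 0.70],
])

def propagate_time_slices(driver_probs_per_slice, p_bloom_start=0.05):
    """Propagate bloom-initiation probability across summer time slices.

    driver_probs_per_slice : list of length-3 arrays, P(driver = low/med/high)
                             for each slice (stand-in for a quantified sub-network)
    Returns P(bloom initiation) for each slice, with the lag effect carried
    forward from one slice to the next.
    """
    p_prev = p_bloom_start
    out = []
    for driver_probs in driver_probs_per_slice:
        # Marginalise over driver level and the previous slice's bloom state.
        p_bloom = sum(
            driver_probs[d] * ((1 - p_prev) * CPT_INIT[d, 0] + p_prev * CPT_INIT[d, 1])
            for d in range(3)
        )
        out.append(p_bloom)
        p_prev = p_bloom
    return out

# Example: three summer slices with increasingly favourable (hypothetical) drivers.
print(propagate_time_slices([np.array([0.6, 0.3, 0.1]),
                             np.array([0.3, 0.4, 0.3]),
                             np.array([0.1, 0.4, 0.5])]))
```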

Relevance: 10.00%

Abstract:

Process-Aware Information Systems (PAISs) support executions of operational processes that involve people, resources, and software applications on the basis of process models. Process models describe vast, often infinite, collections of process instances, i.e., workflows supported by the systems. With the increasing adoption of PAISs, large process model repositories have emerged in companies and public organizations. These repositories constitute significant information resources. Accurate and efficient retrieval of process models and/or process instances from such repositories is of interest for multiple reasons, e.g., searching for similar models/instances, filtering, reuse, standardization, process compliance checking, verification of formal properties, etc. This paper proposes a technique for indexing process models that relies on their alternative representations, called untanglings. We show the use of untanglings for retrieval of process models based on process instances that they specify, via a solution to the total executability problem. Experiments with industrial process models confirm that the proposed retrieval approach is up to three orders of magnitude faster than the state of the art.
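
The untangling representation itself is beyond a short sketch, but the following hedged example illustrates the general idea of indexing models for instance-based retrieval: an inverted index over activity labels filters candidate models for a queried instance before any exact executability check. The names and data structures are illustrative assumptions, not the paper's technique.

```python
from collections import defaultdict

def build_activity_index(models):
    """models: dict model_id -> set of activity labels appearing in the model.
    Returns an inverted index: activity label -> set of model ids."""
    index = defaultdict(set)
    for model_id, activities in models.items():
        for act in activities:
            index[act].add(model_id)
    return index

def candidate_models(index, instance, all_model_ids):
    """Retrieve models whose activity sets cover the queried process instance.
    A full executability/replay check would still be run on the candidates."""
    candidates = set(all_model_ids)
    for act in instance:
        candidates &= index.get(act, set())
        if not candidates:
            break
    return candidates

# Usage (hypothetical models and instance)
models = {"claims_v1": {"register", "assess", "pay"},
          "claims_v2": {"register", "assess", "reject", "pay"}}
idx = build_activity_index(models)
print(candidate_models(idx, ["register", "reject", "pay"], models.keys()))  # {'claims_v2'}
```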

Relevance: 10.00%

Abstract:

Twitter and other social media have become increasingly important tools for maintaining the relationships between fans and their idols across a range of activities, from politics and the arts to celebrity and sports culture. Twitter, Inc. itself has initiated several strategic approaches, especially to entertainment and sporting organisations; late in 2012, for example, a Twitter, Inc. delegation toured Australia in order to develop formal relationships with a number of key sporting bodies covering popular sports such as Australian Rules Football, A-League football (soccer), and V8 touring car racing, as well as to strengthen its connections with key Australian broadcasters and news organisations (Jackson & Christensen, 2012). Similarly, there has been a concerted effort between Twitter Germany and the German Bundesliga clubs and football association to coordinate the presence of German football on Twitter ahead of the 2012–2013 season: the Twitter accounts of almost all first-division teams now bear the official Twitter verification mark, and a system of ‘official’ hashtags for tweeting about individual games (combining the abbreviations of the two teams, e.g. #H96FCB) has also been instituted (Twitter auf Deutsch, 2012).

Relevance: 10.00%

Abstract:

Diagnostics of rolling element bearings involves a combination of different techniques of signal enhancement and analysis. The most common procedure comprises a first step of order tracking and synchronous averaging, able to remove the undesired components, synchronous with the shaft harmonics, from the signal, and a final step of envelope analysis to obtain the squared envelope spectrum. This indicator has been studied thoroughly, and statistically based criteria have been obtained in order to identify damaged bearings. The statistical thresholds are valid only if all the deterministic components in the signal have been removed. Unfortunately, in various industrial applications characterized by heterogeneous vibration sources, the first step of synchronous averaging is not sufficient to completely eliminate the deterministic components, and an additional step of pre-whitening is needed before the envelope analysis. Different techniques have been proposed in the past with this aim: the most widespread are linear prediction filters and spectral kurtosis. Recently, a new technique for pre-whitening has been proposed, based on cepstral analysis: the so-called cepstrum pre-whitening. Owing to its low computational requirements and its simplicity, it seems a good candidate to perform the intermediate pre-whitening step in an automatic damage recognition algorithm. In this paper, the effectiveness of the new technique is tested on data measured on a full-scale industrial bearing test-rig able to reproduce harsh operating conditions. A benchmark comparison with the traditional pre-whitening techniques is made, as a final step, to verify the potential of cepstrum pre-whitening.
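
For reference, cepstrum pre-whitening has a very compact implementation: the magnitude spectrum of the signal is set to (approximately) unity while the phase is kept, after which the squared envelope spectrum can be computed as usual. The sketch below assumes a residual signal that has already been order tracked and synchronously averaged; the variable names and epsilon guard are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def cepstrum_prewhitening(x):
    """Cepstrum pre-whitening: flatten the magnitude spectrum while keeping the
    phase, which suppresses residual deterministic (discrete-frequency) content."""
    X = np.fft.fft(x)
    eps = 1e-12                      # guard against division by zero
    return np.real(np.fft.ifft(X / (np.abs(X) + eps)))

def squared_envelope_spectrum(x, fs):
    """Squared envelope spectrum of a (pre-whitened) vibration signal."""
    env_sq = np.abs(hilbert(x)) ** 2
    env_sq -= env_sq.mean()          # remove the DC component before the FFT
    ses = np.abs(np.fft.rfft(env_sq)) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    return freqs, ses

# Typical chain: order tracking / synchronous averaging -> CPW -> envelope analysis:
# freqs, ses = squared_envelope_spectrum(cepstrum_prewhitening(residual_signal), fs)
```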

Relevance: 10.00%

Abstract:

Product Lifecycle Management (PLM) systems are widely used in the manufacturing industry. A core feature of such systems is to provide support for versioning of product data. As workflow functionality is increasingly used in PLM systems, the possibility emerges that the versioning transitions for product objects, as encapsulated in process models, do not comply with the valid version control policies mandated in the objects’ actual lifecycles. In this paper we propose a solution to tackle the (non-)compliance issues between processes and object version control policies. We formally define the notion of compliance between these two artifacts in product lifecycle management and then develop a compliance checking method which employs a well-established workflow analysis technique. This forms the basis of a tool which offers automated support for the proposed approach. By applying the approach to a collection of real-life specifications in a major PLM system, we demonstrate the practical applicability of our solution to the field.
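
A strongly simplified sketch of the compliance notion is given below: every versioning transition that a process model can apply to an object must be permitted by the object's lifecycle policy. The paper's method additionally employs a workflow analysis technique to account for the behaviour of the process model; the set-based check, state names and function signature here are illustrative assumptions only.

```python
def check_version_compliance(process_transitions, policy_transitions):
    """Check that every versioning transition a process model can perform on an
    object is also permitted by the object's version-control policy.

    process_transitions : set of (from_state, to_state) pairs extracted from
                          the tasks of the process model
    policy_transitions  : set of (from_state, to_state) pairs allowed by the
                          object lifecycle
    Returns (compliant, violations)."""
    violations = process_transitions - policy_transitions
    return (not violations), violations

# Usage (hypothetical lifecycle and process)
policy = {("draft", "review"), ("review", "released"), ("review", "draft")}
process = {("draft", "review"), ("review", "released"), ("draft", "released")}
ok, bad = check_version_compliance(process, policy)
print(ok, bad)   # False {('draft', 'released')}
```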

Relevance: 10.00%

Abstract:

Material yielding is typically modeled either by plastic zone or plastic hinge methods in the context of geometric and material nonlinear finite element methods. In fire analysis of steel structures the plastic zone method is widely used, but it requires considerably more computational effort. The objective of this paper is to develop a nonlinear material model allowing for the interaction of axial force and bending moment, which relies on the plastic hinge method to achieve numerical efficiency and reduce computational effort. The biggest advantage of the plastic-hinge approach is its computational efficiency and the ease of verifying the axial force–moment interaction yield criterion for beam–column members against design code formulae. Further, the method is reliable and robust when used in the analysis of practical, large structures. In order to allow for the effect of catenary action, axial thermal expansion is considered in the axial restraint equations. The yield function incorporated in the stiffness formulation, which allows for both axial force and bending moment effects, is more accurate and rational for predicting the behaviour of frames under fire. In the present fire analysis, the mechanical properties at elevated temperatures follow mainly Eurocode 3 [Design of steel structures, Part 1.2: Structural fire design. European Committee for Standardisation; 2003]. An example of a tension member under a steady-state heating condition is modeled to verify the proposed spring formulation and to compare with results by others. The behaviour of a heated member in a highly redundant structure is also studied with the present approach.
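
As a hedged illustration of a plastic-hinge yield check under fire conditions, the sketch below combines a simple linear axial force–moment interaction criterion with the Eurocode 3 Part 1-2 yield-strength reduction factor. The interaction formula, the tabulated reduction factors and all names are indicative only and should be verified against the code and the paper's actual yield function.

```python
# Indicative values of the Eurocode 3 Part 1-2 yield-strength reduction
# factor k_y,theta (verify against the code before any real use).
KY_THETA = {20: 1.00, 100: 1.00, 200: 1.00, 300: 1.00, 400: 1.00,
            500: 0.78, 600: 0.47, 700: 0.23, 800: 0.11,
            900: 0.06, 1000: 0.04, 1100: 0.02, 1200: 0.00}

def ky(theta):
    """Linear interpolation of the yield-strength reduction factor."""
    temps = sorted(KY_THETA)
    for t0, t1 in zip(temps, temps[1:]):
        if t0 <= theta <= t1:
            w = (theta - t0) / (t1 - t0)
            return KY_THETA[t0] + w * (KY_THETA[t1] - KY_THETA[t0])
    raise ValueError("temperature outside tabulated range")

def hinge_has_yielded(N, M, Np20, Mp20, theta):
    """Simple linear axial force-bending moment interaction check for a
    plastic hinge at steel temperature theta (degrees C). Np20 and Mp20 are
    the squash load and plastic moment at ambient temperature."""
    k = ky(theta)
    phi = abs(N) / (k * Np20) + abs(M) / (k * Mp20)   # illustrative yield surface
    return phi >= 1.0, phi
```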

Relevance: 10.00%

Abstract:

Object classification is plagued by the issue of session variation. Session variation describes any variation that makes one instance of an object look different to another, for instance due to pose or illumination variation. Recent work in the challenging task of face verification has shown that session variability modelling provides a mechanism to overcome some of these limitations. However, for computer vision purposes, it has only been applied in the limited setting of face verification. In this paper we propose a local region based intersession variability (ISV) modelling approach, and apply it to challenging real-world data. We propose a region based session variability modelling approach so that local session variations can be modelled, termed Local ISV. We then demonstrate the efficacy of this technique on a challenging real-world fish image database which includes images taken underwater, providing significant real-world session variations. This Local ISV approach provides a relative performance improvement of, on average, 23% on the challenging MOBIO, Multi-PIE and SCface face databases. It also provides a relative performance improvement of 35% on our challenging fish image dataset.
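
The paper's ISV modelling is GMM-based; as a much simpler, hedged illustration of the same idea of estimating and removing session variation, the sketch below learns a linear "session" subspace from within-identity differences and projects it out of the features before comparison. All names and the linear formulation are assumptions, not the Local ISV algorithm itself.

```python
import numpy as np

def estimate_session_subspace(features, identities, n_components):
    """Estimate a linear 'session' subspace from within-identity variation:
    directions along which samples of the same object differ across sessions."""
    diffs = []
    for ident in np.unique(identities):
        X = features[identities == ident]
        diffs.append(X - X.mean(axis=0))        # within-identity differences
    D = np.vstack(diffs)
    # Principal directions of within-identity (session) variation.
    _, _, Vt = np.linalg.svd(D, full_matrices=False)
    return Vt[:n_components].T                  # (dim, n_components)

def remove_session_variation(features, U):
    """Project out the estimated session subspace before comparison/scoring."""
    return features - (features @ U) @ U.T
```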

Relevance: 10.00%

Abstract:

We present two unconditionally secure protocols for private set disjointness tests. To provide intuition for our protocols, we give a naive example that applies Sylvester matrices. Unfortunately, this simple construction is insecure, as it reveals information about the intersection cardinality; more specifically, it discloses its lower bound. By using Lagrange interpolation, we provide a protocol for the honest-but-curious case that does not reveal any additional information. Finally, we describe a protocol that is secure against malicious adversaries. In this protocol, a verification test is applied to detect misbehaving participants. Both protocols require O(1) rounds of communication. Our protocols are more efficient than previous protocols in terms of communication and computation overhead. Unlike previous protocols, whose security relies on computational assumptions, our protocols provide information-theoretic security. To our knowledge, our protocols are the first to have been designed without a generic secure function evaluation. More importantly, they are the most efficient protocols for private disjointness tests in the malicious adversary case.
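
As a hedged sketch of the underlying algebraic idea, without any of the secret sharing or verification machinery that makes the actual protocols private and robust: two sets over a prime field are disjoint exactly when no element of one set is a root of the polynomial whose roots are the elements of the other, and a random blinding factor hides the magnitude of the product of evaluations. The field size, names and blinding step are illustrative assumptions.

```python
import random

P = 2**61 - 1          # a large prime; elements of both sets live in Z_P

def poly_from_set(A):
    """Coefficients (low to high degree) of f_A(x) = prod (x - a) mod P."""
    coeffs = [1]
    for a in A:
        coeffs = [(c2 - a * c1) % P
                  for c1, c2 in zip(coeffs + [0], [0] + coeffs)]
    return coeffs

def eval_poly(coeffs, x):
    """Horner evaluation of a polynomial given coefficients low to high."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def disjointness_test(A, B):
    """Sets are disjoint iff no element of B is a root of f_A, i.e. the blinded
    product r * prod_b f_A(b) is non-zero (r hides the product's value)."""
    fA = poly_from_set(A)
    prod = 1
    for b in B:
        prod = (prod * eval_poly(fA, b)) % P
    r = random.randrange(1, P)
    return (r * prod) % P != 0

print(disjointness_test({3, 7, 11}, {2, 5}))   # True
print(disjointness_test({3, 7, 11}, {7, 5}))   # False
```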

Relevance: 10.00%

Abstract:

Aiming at large-scale numerical simulation of particle-reinforced materials, the concept of the local Eshelby matrix has been introduced into the computational model of the eigenstrain boundary integral equation (BIE) to solve the problem of interactions among particles. The local Eshelby matrix can be considered an extension of the concepts of the Eshelby tensor and the equivalent inclusion in numerical form. Taking the subdomain boundary element method as the reference, three-dimensional stress analyses are carried out for some ellipsoidal particles in full space with the proposed computational model. Through the numerical examples, not only the correctness and feasibility but also the high efficiency of the present model and the corresponding solution procedure are verified, showing its potential for large-scale numerical simulation of particle-reinforced materials.