996 results for verification algorithm
Abstract:
Behavioral models capture operational principles of real-world or designed systems. Formally, each behavioral model defines the state space of a system, i.e., its states and the principles of state transitions. Such a model is the basis for analysis of the system’s properties. In practice, state spaces of systems are immense, which results in huge computational complexity for their analysis. Behavioral models are typically described as executable graphs, whose execution semantics encodes a state space. The structure theory of behavioral models studies the relations between the structure of a model and the properties of its state space. In this article, we use the connectivity property of graphs to achieve an efficient and extensive discovery of the compositional structure of behavioral models; behavioral models are decomposed stepwise into components with clear structural characteristics and inter-component relations. At each decomposition step, the discovered compositional structure of a model is used for reasoning on properties of the whole state space of the system. The approach is exemplified by means of a concrete behavioral model and verification criterion. That is, we analyze workflow nets, a well-established tool for modeling the behavior of distributed systems, with respect to the soundness property, a basic correctness property of workflow nets. Stepwise verification allows the detection of violations of the soundness property by inspecting small portions of a model, thereby considerably reducing the amount of work needed to perform soundness checks. Besides formal results, we also report on findings from applying our approach to an industry model collection.
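As a point of reference for the stepwise approach described above, the sketch below shows the naive alternative: checking necessary conditions of soundness by explicitly exploring the state space (reachability graph) of a small workflow net. The net encoding and helper names are illustrative, not taken from the article, and the check covers only two necessary conditions (the final marking is reachable and no transition is dead); full soundness additionally requires the option to complete from every reachable marking.

```python
# Naive baseline: explore the reachability graph of a workflow net and test
# two necessary conditions of soundness.  Illustrative sketch only.
from collections import deque

def reachable(places, transitions, initial, limit=100_000):
    """BFS over markings; transitions maps name -> (input places, output places)."""
    idx = {p: i for i, p in enumerate(places)}
    seen, queue, fired = {initial}, deque([initial]), set()
    while queue and len(seen) < limit:
        m = queue.popleft()
        for t, (pre, post) in transitions.items():
            if all(m[idx[p]] >= 1 for p in pre):          # transition enabled
                nxt = list(m)
                for p in pre:  nxt[idx[p]] -= 1
                for p in post: nxt[idx[p]] += 1
                nxt = tuple(nxt)
                fired.add(t)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen, fired

# A tiny workflow net: source i, sink o, two alternative paths between them.
places = ("i", "p1", "o")
transitions = {"t1": ({"i"}, {"p1"}), "t2": ({"p1"}, {"o"}), "t3": ({"i"}, {"o"})}
initial, final = (1, 0, 0), (0, 0, 1)
markings, fired = reachable(places, transitions, initial)
print(final in markings and fired == set(transitions))   # necessary conditions hold
```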
Abstract:
Formal representations of business processes are used for analysis of process behavior. Workflow nets are a widely used formalism for describing the behavior of business processes. Structure theory of processes investigates the relation between the structure of a model and its behavior. In this paper, we propose to employ the connectivity property of workflow nets as the basis for their structural analysis. In particular, we show how soundness verification can be organized using the biconnected components of a workflow net. This allows for efficient identification and localization of flaws in the behavior of workflow nets and for supporting process analysts with diagnostic information.
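A minimal sketch of the structural step this abstract describes: treating the workflow net's flow relation as an undirected graph and decomposing it into biconnected components, whose borders (articulation points) localise where a soundness flaw can occur. It uses networkx on a small illustrative net; the component-wise soundness check itself is omitted.

```python
# Decompose the (undirected) graph of a workflow net into biconnected
# components, which can then be analysed for soundness in isolation.
import networkx as nx

# Flow relation of a small workflow net: place/transition nodes and arcs.
arcs = [("i", "t1"), ("t1", "p1"), ("p1", "t2"), ("t2", "p2"),
        ("p2", "t3"), ("t3", "o"), ("p1", "t4"), ("t4", "p2")]

G = nx.Graph(arcs)                        # connectivity ignores arc direction
components = list(nx.biconnected_components(G))
cut_vertices = set(nx.articulation_points(G))

print("biconnected components:", components)
print("articulation points (candidate decomposition borders):", cut_vertices)
```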
Abstract:
The Fluid–Structure Interaction (FSI) problem is significant in science and engineering and poses challenges for computational mechanics. The coupled Finite Element and Smoothed Particle Hydrodynamics (FE-SPH) model is a robust technique for simulating FSI problems. However, two important steps in the coupled FE-SPH model, neighbor searching and contact searching, are extremely time-consuming. The Point-In-Box (PIB) searching algorithm was developed by Swegle to improve searching efficiency; however, its efficiency can be significantly affected by the distribution of points (nodes in FEM and particles in SPH). In this paper, in order to improve searching efficiency, a novel Striped-PIB (S-PIB) searching algorithm is proposed to overcome the point-distribution shortcoming of the PIB algorithm, and the two time-consuming steps of neighbor searching and contact searching are integrated into a single searching step. The accuracy and efficiency of the newly developed searching algorithm are studied through efficiency tests and FSI problems. It is found that the newly developed model can significantly improve computational efficiency, and it is believed to be a powerful tool for FSI analysis.
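The sketch below illustrates the kind of box-based neighbour lookup that PIB-style algorithms accelerate: points are binned into cells of the search radius so that only the surrounding block of cells is examined. It is a generic 2-D cell-list search, not Swegle's PIB or the proposed S-PIB, and the point count and radius are arbitrary.

```python
# Generic cell-binning neighbour search: bin points into boxes of size
# "radius" and only test points from the 3x3 block of neighbouring boxes.
import numpy as np
from collections import defaultdict

def neighbour_search(points, radius):
    """Return pairs (i, j) with |points[i] - points[j]| <= radius."""
    cells = defaultdict(list)
    keys = np.floor(points / radius).astype(int)
    for i, k in enumerate(map(tuple, keys)):
        cells[k].append(i)

    pairs = []
    offsets = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    for i, k in enumerate(map(tuple, keys)):
        for off in offsets:                       # only nearby boxes are searched
            for j in cells.get((k[0] + off[0], k[1] + off[1]), []):
                if j > i and np.linalg.norm(points[i] - points[j]) <= radius:
                    pairs.append((i, j))
    return pairs

pts = np.random.rand(500, 2)                      # SPH particles / FE nodes
print(len(neighbour_search(pts, 0.05)), "interacting pairs")
```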
Abstract:
Cancers of the brain and central nervous system account for 1.6% of new cancers and 1.8% of cancer deaths globally. The highest rates of all developed nations are observed in Australia and New Zealand. There are known complexities associated with dose measurement of very small radiation fields. Here, 3D dosimetric verification of treatments for small intracranial tumours using gel dosimetry was investigated.
Abstract:
Background: Heatwaves can cause excess deaths ranging from tens to thousands within a couple of weeks in a local area. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality figure under ‘normal’ conditions from the historical daily mortality records. Calculating the excess mortality is a scientific challenge because of the stochastic temporal pattern of the daily mortality data, which is characterised by (a) long-term changes in mean levels (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series data in the field of signal processing; however, it has not been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data. Methods: Special R functions were developed to implement the HHT algorithm to decompose the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series. Results: The Brisbane (Queensland, Australia) and Chicago (United States) daily mortality time series data were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event. To exemplify potential applications, the HHT decomposition results were used as input data for a subsequent regression analysis, using the Brisbane data, to investigate the association between excess mortality and different risk factors. Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
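A minimal sketch of the excess-mortality calculation described above, in Python rather than R and on synthetic data. A centred moving average stands in for the HHT-derived trend component; the paper itself obtains the trend via Hilbert-Huang decomposition of the observed series.

```python
# Excess mortality = observed deaths minus a baseline trend, summed over the
# event days.  A moving average is a crude stand-in for the HHT trend.
import numpy as np

def excess_deaths(daily_deaths, event_days, window=31):
    """Sum of (observed - trend) over the days of the event."""
    kernel = np.ones(window) / window
    trend = np.convolve(daily_deaths, kernel, mode="same")   # baseline component
    non_trend = daily_deaths - trend
    return non_trend[event_days].sum()

# Synthetic example: ~40 deaths/day plus a 5-day heatwave-like spike.
rng = np.random.default_rng(0)
deaths = rng.poisson(40, 365).astype(float)
deaths[200:205] += 12
print(round(excess_deaths(deaths, np.arange(200, 205)), 1))
```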
Abstract:
We determine the affine equivalence classes of the eight-variable, degree-three homogeneous bent functions using a new algorithm. Our algorithm applies to general bent functions and can systematically determine their automorphism groups. We provide a partial verification of the enumeration of eight-variable, degree-three homogeneous bent functions obtained by Meng et al. and determine the affine equivalence classes of these functions.
Abstract:
The results of comprehensive experimental studies of the operation, stability, and plasma parameters of low-frequency (0.46 MHz) inductively coupled plasmas sustained by an internal oscillating rf current are reported. The rf plasma is generated using a custom-designed configuration of the internal rf coil that comprises two perpendicular sets of eight currents in each direction. Various diagnostic tools, such as magnetic probes, optical emission spectroscopy, and an rf-compensated Langmuir probe, were used to investigate the electromagnetic, optical, and global properties of the argon plasma over wide ranges of applied rf power and gas feedstock pressure. It is found that the uniformity of the electromagnetic field inside the plasma reactor is improved compared with conventional inductively coupled plasma sources with an external flat coil configuration. A reasonable agreement between the experimental data and the computed electromagnetic field topography inside the chamber is reported. The Langmuir probe measurements reveal that the spatial profiles of the electron density, the effective electron temperature, the plasma potential, and the electron energy distribution/probability functions feature a high degree of radial and axial uniformity and a weak azimuthal dependence, which is consistent with earlier theoretical predictions. As the input rf power increases, the azimuthal dependence of the global plasma parameters vanishes. The obtained results demonstrate that introducing internal oscillating rf currents noticeably improves the uniformity of the electromagnetic field topography, the rf power deposition, and the plasma density in the reactor.
Abstract:
The Common Scrambling Algorithm Stream Cipher (CSA-SC) is a shift-register-based stream cipher designed to encrypt digital video broadcasts. CSA-SC produces a pseudo-random binary sequence that is used to mask the contents of the transmission. In this paper, we analyse the initialisation process of the CSA-SC keystream generator and demonstrate weaknesses which lead to state convergence, slid pairs and shifted keystreams. As a result, the cipher may be vulnerable to distinguishing attacks, time-memory-data trade-off attacks or slide attacks.
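To illustrate what a "slid pair" with "shifted keystreams" means, the toy sketch below uses a small Fibonacci LFSR, not the CSA-SC generator: two register loadings, one of which is a clocked-forward copy of the other, produce keystreams that are exact shifts of each other, which is the kind of relation exploited by slide attacks.

```python
# Toy illustration of slid pairs / shifted keystreams using a small Fibonacci
# LFSR.  The register size and tap positions are arbitrary, unrelated to CSA-SC.

def step(state, taps):
    """One LFSR clock: feedback bit is the XOR of the tapped positions."""
    fb = 0
    for t in taps:
        fb ^= state[t]
    return [fb] + state[:-1]

def keystream(state, taps, n):
    out, s = [], list(state)
    for _ in range(n):
        out.append(s[-1])          # output the last stage, then clock
        s = step(s, taps)
    return out

taps = [0, 2, 3]                   # illustrative tap positions
state_a = [1, 0, 0, 1, 0, 1, 1, 0]
state_b = step(step(step(state_a, taps), taps), taps)   # "slid" by 3 clocks

ks_a = keystream(state_a, taps, 40)
ks_b = keystream(state_b, taps, 40)
print(ks_a[3:40] == ks_b[:37])     # True: the keystreams are shifted copies
```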
Abstract:
Organisations are constantly seeking new ways to improve operational efficiencies. This research study investigates a novel way to identify potential efficiency gains in business operations by observing how they were carried out in the past and then exploring better ways of executing them, taking into account trade-offs between time, cost and resource utilisation. This paper demonstrates how these trade-offs can be incorporated in the assessment of alternative process execution scenarios by making use of a cost environment. A genetic algorithm-based approach is proposed to explore and assess alternative process execution scenarios, where the objective function is represented by a comprehensive cost structure that captures different process dimensions. Experiments conducted with different variants of the genetic algorithm evaluate the approach's feasibility. The findings demonstrate that a genetic algorithm-based approach is able to use cost reduction as a way to identify improved execution scenarios in terms of reduced case durations and increased resource utilisation. The ultimate aim is to use the cost-related insights gained from such improved scenarios to put forward recommendations for reducing process-related cost within organisations.
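A minimal sketch of the idea above, under assumed names and weights: candidate execution scenarios (here, task-to-resource assignments) are evolved by a simple genetic algorithm whose objective is a cost function trading off total processing time against resource-load imbalance. The encoding, operators and cost weights are illustrative, not the paper's configuration.

```python
# Simple GA over task-to-resource assignments, scored by a composite cost.
import random

random.seed(1)
N_TASKS, N_RESOURCES = 10, 3
durations = [[random.uniform(1, 5) for _ in range(N_RESOURCES)] for _ in range(N_TASKS)]

def cost(assignment):
    """Weighted sum: total processing time + imbalance of resource load."""
    loads = [0.0] * N_RESOURCES
    for task, res in enumerate(assignment):
        loads[res] += durations[task][res]
    return sum(loads) + 2.0 * (max(loads) - min(loads))

def evolve(pop_size=40, generations=100, mutation=0.1):
    pop = [[random.randrange(N_RESOURCES) for _ in range(N_TASKS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[: pop_size // 2]                    # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_TASKS)
            child = a[:cut] + b[cut:]                     # one-point crossover
            if random.random() < mutation:
                child[random.randrange(N_TASKS)] = random.randrange(N_RESOURCES)
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
print("best assignment:", best, "cost:", round(cost(best), 2))
```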
Abstract:
This paper proposes a combination of source-normalized weighted linear discriminant analysis (SN-WLDA) and short utterance variance (SUV) PLDA modelling to improve short utterance PLDA speaker verification. As short-utterance i-vectors vary with the speaker, session variations and the phonetic content of the utterance (utterance variation), a combined approach of SN-WLDA projection and SUV PLDA modelling is used to compensate for the session and utterance variations. Experimental studies have found that the combined SN-WLDA and SUV PLDA modelling approach shows an improvement over the baseline system (WCCN[LDA]-projected Gaussian PLDA (GPLDA)), as this approach effectively compensates for the session and utterance variations.
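A minimal sketch of the projection-then-scoring pipeline on synthetic data: standard LDA and a cosine score stand in for the SN-WLDA projection and the PLDA model, so this shows only the shape of the approach, not the proposed technique itself.

```python
# Project synthetic "i-vectors" with LDA, then score pairs with cosine
# similarity as a crude PLDA stand-in.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
n_spk, per_spk, dim = 20, 10, 100
means = rng.normal(size=(n_spk, dim))
X = np.vstack([m + 0.5 * rng.normal(size=(per_spk, dim)) for m in means])
y = np.repeat(np.arange(n_spk), per_spk)

lda = LinearDiscriminantAnalysis(n_components=n_spk - 1).fit(X, y)  # session compensation

def score(iv_enrol, iv_test):
    """Cosine score between two projected vectors."""
    a, b = lda.transform([iv_enrol])[0], lda.transform([iv_test])[0]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(round(score(X[0], X[1]), 3), round(score(X[0], X[-1]), 3))  # same vs different speaker
```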
Abstract:
A Monte Carlo model of an Elekta iViewGT amorphous silicon electronic portal imaging device (a-Si EPID) has been validated for pre-treatment verification of clinical IMRT treatment plans. The simulations used the BEAMnrc and DOSXYZnrc Monte Carlo codes to predict the response of the iViewGT a-Si EPID model. The predicted EPID images were compared to measured images obtained by delivering a photon beam from an Elekta Synergy linac to the Elekta iViewGT a-Si EPID. The a-Si EPID was used with no additional build-up material. Frame-averaged EPID images were acquired and processed using in-house software. The agreement between the predicted and measured images was analyzed using the gamma analysis technique with acceptance criteria of 3%/3 mm. The results show that the predicted EPID images for four clinical IMRT treatment plans agree well with the measured EPID signal: three prostate IMRT plans had an average gamma pass rate of more than 95.0%, and a spinal IMRT plan had an average gamma pass rate of 94.3%. During this work a routine MLC calibration was performed and one of the IMRT treatments was re-measured with the EPID; a change in the gamma pass rate for one field was observed. This motivated a series of experiments to investigate the sensitivity of the method by introducing delivery errors (MLC position and dosimetric overshoot) into the simulated EPID images. The method was found to be sensitive to 1 mm leaf position errors and 10% overshoot errors.
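A simplified, brute-force sketch of the gamma comparison used above, with global 3%/3 mm criteria on toy 2-D dose planes. Clinical implementations add interpolation, dose thresholds and calibrated pixel spacing that are omitted here.

```python
# Global 2-D gamma analysis: each reference pixel passes if some evaluated
# pixel is close enough in both distance (DTA) and dose.
import numpy as np

def gamma_pass_rate(ref, eval_, spacing=1.0, dose_tol=0.03, dta=3.0):
    """Fraction of reference pixels with gamma <= 1."""
    ny, nx = ref.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    norm_dose = dose_tol * ref.max()                  # global dose normalisation
    passed = 0
    for iy in range(ny):
        for ix in range(nx):
            dist2 = ((yy - iy) ** 2 + (xx - ix) ** 2) * spacing ** 2
            dose2 = (eval_ - ref[iy, ix]) ** 2
            gamma2 = dist2 / dta ** 2 + dose2 / norm_dose ** 2
            passed += gamma2.min() <= 1.0
    return passed / (ny * nx)

ref = np.outer(np.hanning(40), np.hanning(40))        # toy predicted dose plane
meas = ref * 1.02                                     # measured plane, 2% higher
print(f"gamma pass rate: {gamma_pass_rate(ref, meas):.1%}")
```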
Abstract:
Live migration of multiple Virtual Machines (VMs) has become an integral management activity in data centers for power saving, load balancing and system maintenance. While state-of-the-art live migration techniques focus on improving the migration performance of an independent single VM, little has been investigated for the case of live migration of multiple interacting VMs. Live migration is mostly influenced by network bandwidth, and arbitrarily migrating a VM that has data inter-dependencies with other VMs may increase bandwidth consumption and adversely affect the performance of subsequent migrations. In this paper, we propose a Random Key Genetic Algorithm (RKGA) that efficiently schedules the migration of a given set of VMs, accounting for both inter-VM dependencies and the data center communication network. The experimental results show that the RKGA can schedule the migration of multiple VMs with significantly shorter total migration time and total downtime compared to a heuristic algorithm.
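A minimal random-key sketch of the scheduling idea above: each chromosome is a vector of floats whose argsort decodes to a migration order, and a toy fitness combines the sum of per-VM completion times with a penalty for scheduling communicating VM pairs far apart. VM sizes, bandwidth, dependencies and weights are all illustrative, not the paper's model.

```python
# Random-key encoding for VM migration scheduling: keys -> argsort -> order.
import numpy as np

rng = np.random.default_rng(2)
mem_gb = rng.uniform(2, 16, size=8)             # memory to copy per VM
bandwidth = 1.0                                 # GB/s available for migration
dependencies = [(0, 3), (2, 5), (4, 6)]         # communicating VM pairs

def decode(keys):
    return np.argsort(keys)                     # migration order of the VMs

def fitness(keys):
    order = decode(keys)
    position = np.argsort(order)                # position of each VM in the order
    completion = mem_gb[order].cumsum() / bandwidth
    penalty = sum(abs(int(position[a]) - int(position[b])) for a, b in dependencies)
    return completion.sum() + 5.0 * penalty     # shorter VMs first, keep pairs close

pop = rng.random((30, 8))                       # population of random-key vectors
for _ in range(200):                            # evolve: keep best half, perturb it
    pop = pop[np.argsort([fitness(k) for k in pop])]
    pop[15:] = pop[:15] + 0.1 * rng.normal(size=(15, 8))
print("migration order:", decode(pop[0]), "fitness:", round(fitness(pop[0]), 2))
```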
Abstract:
For robots operating in outdoor environments, a number of factors, including weather, time of day, rough terrain, high speeds, and hardware limitations, make vision-based simultaneous localization and mapping with current techniques infeasible, owing to image blur and/or underexposure, especially on smaller platforms and low-cost hardware. In this paper, we present novel visual place-recognition and odometry techniques that address the challenges posed by low lighting, perceptual change, and low-cost cameras. Our primary contribution is a novel two-step algorithm that combines fast low-resolution whole-image matching with a higher-resolution patch-verification step, as well as image saliency methods that simultaneously improve performance and decrease computing time. The algorithms are demonstrated using consumer cameras mounted on a small vehicle in a mixed urban and vegetated environment and on a car traversing highway and suburban streets, at different times of day and night and in various weather conditions. The algorithms achieve reliable mapping over the course of a day, both when incrementally incorporating new visual scenes from different times of day into an existing map, and when using a static map comprising visual scenes captured at only one point in time. Using the two-step place-recognition process, we demonstrate for the first time single-image, error-free place recognition at recall rates above 50% across a day-night dataset without prior training or utilization of image sequences. This place-recognition performance enables topologically correct mapping across day-night cycles.
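A simplified sketch of the two-step matching idea on synthetic images: a shortlist is built by comparing heavily downsampled, normalised whole images, and the best candidate is then verified by comparing a handful of higher-resolution patches. Image sizes, thresholds and the patch count are illustrative and much cruder than the published pipeline.

```python
# Step 1: low-resolution whole-image matching.  Step 2: patch verification.
import numpy as np

def downsample(img, f=8):
    h, w = (img.shape[0] // f) * f, (img.shape[1] // f) * f
    return img[:h, :w].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def whole_image_distance(a, b):
    a, b = downsample(a), downsample(b)
    a = (a - a.mean()) / (a.std() + 1e-6)        # crude illumination normalisation
    b = (b - b.mean()) / (b.std() + 1e-6)
    return np.abs(a - b).mean()                  # sum-of-absolute-differences style

def patch_verify(a, b, size=32, n=5, thresh=0.15):
    rng = np.random.default_rng(0)
    ok = 0
    for _ in range(n):                           # compare n random patches
        y = rng.integers(0, a.shape[0] - size)
        x = rng.integers(0, a.shape[1] - size)
        ok += np.abs(a[y:y+size, x:x+size] - b[y:y+size, x:x+size]).mean() < thresh
    return ok >= n - 1

# Query against a small "map" of reference images (synthetic here).
rng = np.random.default_rng(1)
database = [rng.random((240, 320)) for _ in range(20)]
query = database[7] + 0.05 * rng.random((240, 320))          # revisited place
best = min(range(len(database)), key=lambda i: whole_image_distance(query, database[i]))
print(best, patch_verify(query, database[best]))
```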
Abstract:
The work presented in this report aims to implement a cost-effective offline mission path planner for aerial inspection of large linear infrastructure. Like most real-world optimisation problems, mission path planning involves a number of objectives which ideally should be minimised simultaneously. Understandably, the objectives of a practical optimisation problem conflict with each other, and minimising one of them necessarily makes it impossible to minimise the others. This leads to the need to find a set of optimal solutions for the problem; once such a set of available options is produced, the mission planning problem is reduced to a decision-making problem for the mission specialists, who will choose the solution that best fits the requirements of the mission. The goal of this work is therefore to develop a Multi-Objective optimisation tool able to provide the mission specialists with a set of optimal solutions for the inspection task, amongst which the final trajectory will be chosen, given the environment data, the mission requirements and the definition of the objectives to minimise. All the possible optimal solutions of a Multi-Objective optimisation problem are said to form the Pareto-optimal front of the problem. For any of the Pareto-optimal solutions, it is impossible to improve one objective without worsening at least another one. Amongst a set of Pareto-optimal solutions, no solution is absolutely better than another, and the final choice must be a trade-off between the objectives of the problem. Multi-Objective Evolutionary Algorithms (MOEAs) are recognised as a convenient method for exploring the Pareto-optimal front of Multi-Objective optimisation problems. Their efficiency is due to their parallel architecture, which allows several optimal solutions to be found in a single run.
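A minimal sketch of the Pareto-dominance idea the report builds on: given candidate trajectories scored on two objectives to be minimised (a path length and a hypothetical risk measure), only the non-dominated solutions are kept as the set offered to the mission specialists. The candidate data is synthetic.

```python
# Extract the Pareto-optimal front from a set of candidate solutions scored
# on two objectives to minimise.
import random

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

random.seed(3)
candidates = [(random.uniform(10, 50), random.uniform(0, 1)) for _ in range(100)]
front = sorted(pareto_front(candidates))
print(len(front), "non-dominated options out of", len(candidates))
```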