946 results for RM (rate monotonic) algorithm


Relevance:

20.00%

Publisher:

Abstract:

The work presented in this report aims to implement a cost-effective offline mission path planner for aerial inspection of large linear infrastructures. Like most real-world optimisation problems, mission path planning involves a number of objectives which ideally should be minimised simultaneously. In practice, the objectives of an optimisation problem conflict with each other, and minimising one of them necessarily makes it impossible to minimise the others. This leads to the need to find a set of optimal solutions for the problem; once such a set of available options is produced, the mission planning problem is reduced to a decision-making problem for the mission specialists, who choose the solution that best fits the requirements of the mission. The goal of this work is therefore to develop a Multi-Objective optimisation tool able to provide the mission specialists with a set of optimal solutions for the inspection task, amongst which the final trajectory will be chosen, given the environment data, the mission requirements and the definition of the objectives to minimise. The possible optimal solutions of a Multi-Objective optimisation problem are said to form its Pareto-optimal front. For any Pareto-optimal solution, it is impossible to improve one objective without worsening at least one other. Amongst a set of Pareto-optimal solutions, no solution is absolutely better than another, and the final choice must be a trade-off between the objectives of the problem. Multi-Objective Evolutionary Algorithms (MOEAs) are recognised as a convenient method for exploring the Pareto-optimal front of Multi-Objective optimisation problems. Their efficiency is due to their parallel architecture, which allows several optimal solutions to be found at a time.
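As an editorial illustration of the Pareto-dominance relation the abstract describes (a generic minimal sketch, not the report's actual planner; the objective values below are hypothetical), in Python:

```python
def dominates(a, b):
    """True if objective vector a dominates b under minimisation:
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical (path length, turn cost) pairs for candidate trajectories.
candidates = [(10.0, 4.0), (12.0, 3.0), (11.0, 5.0), (9.0, 6.0)]
front = pareto_front(candidates)  # (11.0, 5.0) is dominated by (10.0, 4.0)
```

Among the surviving front members no vector is better in both objectives at once, which is exactly why the final pick is left to the mission specialists.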


In this paper we propose a novel scheme for carrying out speaker diarization in an iterative manner. We aim to show that the information obtained through the first pass of speaker diarization can be reused to refine and improve the original diarization results. We call this technique speaker rediarization and demonstrate the practical application of our rediarization algorithm using a large archive of two-speaker telephone conversation recordings. We use the NIST 2008 SRE summed telephone corpora for evaluating our speaker rediarization system. This corpus contains recurring speaker identities across independent recording sessions that need to be linked across the entire corpus. We show that our speaker rediarization scheme can take advantage of inter-session speaker information, linked in the initial diarization pass, to achieve a 30% relative improvement over the original diarization error rate (DER) after only two iterations of rediarization.


'Approximate Bayesian Computation' (ABC) represents a powerful methodology for the analysis of complex stochastic systems for which the likelihood of the observed data under an arbitrary set of input parameters may be entirely intractable – the latter condition rendering useless the standard machinery of tractable likelihood-based, Bayesian statistical inference [e.g. conventional Markov chain Monte Carlo (MCMC) simulation]. In this paper, we demonstrate the potential of ABC for astronomical model analysis by application to a case study in the morphological transformation of high-redshift galaxies. To this end, we develop, first, a stochastic model for the competing processes of merging and secular evolution in the early Universe, and secondly, through an ABC-based comparison against the observed demographics of massive (Mgal > 10^11 M⊙) galaxies (at 1.5 < z < 3) in the Cosmic Assembly Near-IR Deep Extragalactic Legacy Survey (CANDELS)/Extended Groth Strip (EGS) data set, we derive posterior probability densities for the key parameters of this model. The 'Sequential Monte Carlo' implementation of ABC exhibited herein, featuring both a self-generating target sequence and a self-refining MCMC kernel, is amongst the most efficient of contemporary approaches to this important statistical algorithm. We also highlight, through our chosen case study, the value of careful summary statistic selection, and demonstrate two modern strategies for assessment and optimization in this regard. Ultimately, our ABC analysis of the high-redshift morphological mix returns tight constraints on the evolving merger rate in the early Universe and favours major merging (with disc survival or rapid reformation) over secular evolution as the mechanism most responsible for building up the first generation of bulges in early-type discs.
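For readers unfamiliar with ABC, the idea can be sketched in its simplest (rejection) variant, of which the Sequential Monte Carlo implementation above is a far more efficient refinement. The toy model, prior, and tolerance below are assumptions for illustration, not the paper's:

```python
import random

def abc_rejection(observed_stat, simulate, prior_sample, tol, n_draws=20000):
    """Plain rejection ABC: draw parameters from the prior, simulate data,
    and keep draws whose summary statistic lands within `tol` of the
    observed one. The accepted draws approximate the posterior."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed_stat) <= tol:
            accepted.append(theta)
    return accepted

random.seed(1)
# Toy problem: infer the mean of a Gaussian from a noisy summary statistic.
post = abc_rejection(observed_stat=2.0,
                     simulate=lambda th: random.gauss(th, 0.5),
                     prior_sample=lambda: random.uniform(-5, 5),
                     tol=0.2)
```

No likelihood is ever evaluated; closeness of simulated summaries stands in for it, which is what makes the approach viable for the intractable models the abstract targets.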


A computationally efficient sequential Monte Carlo algorithm is proposed for the sequential design of experiments for the collection of block data described by mixed effects models. The difficulty in applying a sequential Monte Carlo algorithm in such settings is the need to evaluate the observed data likelihood, which is typically intractable for all but linear Gaussian models. To overcome this difficulty, we propose to estimate the likelihood unbiasedly, and to perform inference and make decisions based on an exact-approximate algorithm. Two estimators are proposed: one using quasi-Monte Carlo methods and one using the Laplace approximation with importance sampling. Both of these approaches can be computationally expensive, so we propose exploiting parallel computational architectures to ensure designs can be derived in a timely manner. We also extend our approach to allow for model uncertainty. This research is motivated by important pharmacological studies related to the treatment of critically ill patients.
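The core idea of replacing an intractable mixed-effects likelihood with an unbiased estimate can be sketched on a toy Gaussian random-intercept model. This uses plain Monte Carlo averaging over the random effect, not the paper's quasi-Monte Carlo or Laplace-with-importance-sampling estimators, and all numbers are hypothetical:

```python
import math, random

def mc_likelihood(y, sigma_b, sigma_e, n_mc=5000, rng=None):
    """Unbiased Monte Carlo estimate of the marginal likelihood of one
    block of observations y under a random-intercept model
        y_j = b + e_j,  b ~ N(0, sigma_b^2),  e_j ~ N(0, sigma_e^2),
    obtained by averaging the conditional likelihood over draws of b."""
    rng = rng or random.Random(0)
    total = 0.0
    for _ in range(n_mc):
        b = rng.gauss(0.0, sigma_b)
        # Log-likelihood of the block conditional on this draw of b.
        ll = sum(-0.5 * math.log(2 * math.pi * sigma_e**2)
                 - (yj - b)**2 / (2 * sigma_e**2) for yj in y)
        total += math.exp(ll)
    return total / n_mc

y = [0.4, -0.1, 0.3]           # one hypothetical block of observations
est = mc_likelihood(y, sigma_b=1.0, sigma_e=0.5)
```

Because the average is unbiased, plugging it into an otherwise exact sampler yields the "exact-approximate" behaviour the abstract refers to; for this Gaussian toy the exact marginal likelihood is available in closed form, which is what makes the sketch checkable.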


Graphyne is an allotrope of graphene. The mechanical properties of graphynes (α-, β-, γ- and 6,6,12-graphynes) under uniaxial tensile deformation at different temperatures and strain rates are studied using molecular dynamics simulations. It is found that graphynes are more sensitive to temperature changes than graphene in terms of fracture strength and Young's modulus. The temperature sensitivity of the different graphynes is proportional to the percentage of acetylenic linkages in their structures, with α-graphyne (having 100% acetylenic linkages) being the most sensitive to temperature. For a given graphyne, temperature exerts a more pronounced effect on the Young's modulus than on the fracture strength, in contrast to the behaviour of graphene. The mechanical properties of graphynes are also sensitive to strain rate, in particular at higher temperatures.


Network Real-Time Kinematic (NRTK) is a technology that can provide positioning services with centimeter-level accuracy in real time, enabled by a network of Continuously Operating Reference Stations (CORS). The location-oriented CORS placement problem is an important problem in the design of an NRTK, as it directly affects not only the installation and operational cost of the NRTK but also the quality of the positioning services it provides. This paper presents a Memetic Algorithm (MA) for the location-oriented CORS placement problem, which hybridizes the powerful explorative search capacity of a genetic algorithm with the efficient and effective exploitative search capacity of local optimization. Experimental results show that the MA outperforms existing approaches. In this paper we also conduct an empirical study of the scalability of the MA, the effectiveness of the hybridization technique, and the selection of the crossover operator in the MA.
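The hybridisation the abstract describes, global genetic-style search with a local-optimisation step applied to each offspring, can be sketched as a generic memetic-algorithm skeleton. The toy objective and operators below are hypothetical stand-ins, not the paper's CORS placement formulation:

```python
import random

def hill_climb(x, fitness):
    """Local search: greedy coordinate descent with a shrinking step size."""
    x, step = list(x), 1.0
    while step > 1e-3:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                cand = x[:]
                cand[i] += d
                if fitness(cand) < fitness(x):
                    x, improved = cand, True
        if not improved:
            step /= 2
    return x

def memetic_minimise(fitness, init, mutate, local_search,
                     pop_size=20, generations=40, rng=None):
    """Memetic skeleton: selection + mutation for exploration, with a
    local-search refinement of every individual for exploitation."""
    rng = rng or random.Random(42)
    pop = [local_search(init(rng), fitness) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # truncation selection
        children = [local_search(mutate(rng.choice(parents), rng), fitness)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=fitness)

# Hypothetical separable objective standing in for a placement cost.
fitness = lambda x: sum((xi - 3.0) ** 2 for xi in x)
best = memetic_minimise(fitness,
                        init=lambda r: [r.uniform(-10, 10) for _ in range(4)],
                        mutate=lambda x, r: [xi + r.gauss(0, 1) for xi in x],
                        local_search=hill_climb)
```

The design point is the division of labour: mutation jumps between basins of attraction while the local search polishes each candidate, which is the explorative/exploitative split the abstract credits for the MA's performance.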


This study evaluated physiological tolerance times when wearing explosive and chemical (>35 kg) personal protective equipment (PPE) in simulated environmental extremes across a range of differing work intensities. Twelve healthy males undertook nine trials involving walking on a treadmill at 2.5, 4 and 5.5 km·h-1 in the following environmental conditions: 21, 30 and 37 °C wet bulb globe temperature (WBGT). Participants exercised for 60 min or until volitional fatigue, core temperature reaching 39 °C, or heart rate exceeding 90% of maximum. Tolerance time, core temperature, skin temperature, mean body temperature, heart rate and body mass loss were measured. Exercise time was reduced in the higher WBGT environments (WBGT37


Extracting frequent subtrees from tree-structured data has important applications in Web mining. In this paper, we introduce a novel canonical form for rooted labelled unordered trees called the balanced-optimal-search canonical form (BOCF), which handles the isomorphism problem efficiently. Using BOCF, we define a tree-structure-guided enumeration approach that systematically enumerates only the valid subtrees. Finally, we present the balanced optimal search tree miner (BOSTER) algorithm, based on BOCF and the proposed enumeration approach, for finding frequent induced subtrees in a database of labelled rooted unordered trees. Experiments on real datasets compare the efficiency of BOSTER with two state-of-the-art algorithms for mining induced unordered subtrees, HybridTreeMiner and UNI3. The results are encouraging.
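The abstract does not define BOCF itself, but the general idea behind canonical forms for rooted labelled unordered trees, which BOCF refines, can be sketched: encode each subtree recursively and sort sibling encodings, so that two trees receive the same code exactly when they are isomorphic. A minimal illustration (hypothetical labels):

```python
def canonical(label, children):
    """Canonical string encoding of a rooted labelled unordered tree:
    child encodings are sorted, so sibling order no longer matters and
    isomorphic unordered trees get identical strings."""
    return label + "(" + ",".join(sorted(canonical(*c) for c in children)) + ")"

# Two different orderings of the same unordered tree  A(B, C(D)).
t1 = ("A", [("B", []), ("C", [("D", [])])])
t2 = ("A", [("C", [("D", [])]), ("B", [])])
```

With such a form, a miner can count each unordered subtree once instead of once per ordering, which is the duplicate-elimination role canonical forms play in algorithms like BOSTER.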


This paper presents an algorithm for mining unordered embedded subtrees using the balanced-optimal-search canonical form (BOCF). A tree-structure-guided enumeration approach is defined using BOCF for systematically enumerating only the valid subtrees. Based on this canonical form and enumeration technique, the balanced optimal search embedded subtree mining algorithm (BEST) is introduced for mining embedded subtrees from a database of labelled rooted unordered trees. Extensive experiments on both synthetic and real datasets demonstrate the efficiency of BEST over two state-of-the-art algorithms for mining embedded unordered subtrees, SLEUTH and U3.


The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, their threshold determination is still not well understood. Currently, the threshold is determined either empirically or with the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a well-founded way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modelling procedure and an approximation procedure. The modelling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modelling and approximation errors are analysed with simulated data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modelling error, making the fixed failure rate threshold determination method feasible for real-time applications.
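The "easy-to-calculate" IB success rate the approximation procedure substitutes is, in the standard formulation, a closed-form product over the conditional standard deviations of the (decorrelated) ambiguities. A sketch of that formula with hypothetical values; the input standard deviations would in practice come from the LDL^T factorisation of the ambiguity covariance matrix:

```python
import math

def ib_success_rate(cond_stds):
    """Integer bootstrapping success rate from the conditional standard
    deviations sigma_i of the sequentially rounded ambiguities:
        P_IB = prod_i (2 * Phi(1 / (2 * sigma_i)) - 1),
    where Phi is the standard normal CDF (computed here via erf)."""
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    p = 1.0
    for s in cond_stds:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

# Hypothetical conditional std devs of three well-resolved ambiguities (cycles).
rate = ib_success_rate([0.05, 0.08, 0.10])
```

Being a simple product of normal-CDF terms, this rate is cheap enough to evaluate every epoch, which is what makes the threshold function method viable in real time.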


Purpose: To investigate longitudinal changes of subbasal nerve plexus (SNP) morphology and its relationship with conventional measures of neuropathy in individuals with diabetes.

Methods: A cohort of 147 individuals with type 1 diabetes and 60 age-balanced controls underwent detailed assessment of clinical and metabolic factors, neurologic deficits, quantitative sensory testing, nerve conduction studies and corneal confocal microscopy at baseline and four subsequent annual visits. The SNP parameters included corneal nerve fiber density (CNFD), branch density (CNBD) and fiber length (CNFL), quantified using a fully automated algorithm. Linear mixed models were fitted to examine changes in corneal nerve parameters over time.

Results: At baseline, 27% of the participants had mild diabetic neuropathy. All SNP parameters were significantly lower in the neuropathy group than in controls (P<0.05). Overall, 89% of participants examined at baseline also completed the final visit. There was no clinically significant change in health and metabolic parameters or neuropathy measures from baseline to the final visit. The linear mixed model revealed a significant linear decline of CNFD (annual change rate, -0.9 nerves/mm2, P=0.01) in the neuropathy group compared to controls, which was associated with age (β=-0.06, P=0.04) and duration of diabetes (β=-0.08, P=0.03). In the neuropathy group, absolute changes of CNBD and CNFL showed moderate correlations with peroneal conduction velocity and cold sensation threshold, respectively (rs, 0.38 and 0.40, P<0.05).

Conclusion: This study demonstrates dynamic small-fiber damage at the SNP, providing justification for our ongoing efforts to establish corneal nerve morphology as an appropriate adjunct to conventional measures of diabetic peripheral neuropathy (DPN).