973 results for Short Loadlength, Fast Algorithms


Relevance:

20.00%

Publisher:

Abstract:

We develop a fast Poisson preconditioner for the efficient numerical solution of a class of two-sided nonlinear space fractional diffusion equations in one and two dimensions using the method of lines. Using the shifted Grünwald finite difference formulas to approximate the two-sided (i.e. the left and right Riemann-Liouville) fractional derivatives, the resulting semi-discrete nonlinear systems have dense Jacobian matrices owing to the non-local property of fractional derivatives. We employ a modern initial value problem solver utilising backward differentiation formulas and Jacobian-free Newton-Krylov methods to solve these systems. For efficient performance of the Jacobian-free Newton-Krylov method it is essential to apply an effective preconditioner to accelerate the convergence of the linear iterative solver. The key contribution of our work is to generalise the fast Poisson preconditioner, widely used for integer-order diffusion equations, so that it applies to the two-sided space fractional diffusion equation. A number of numerical experiments are presented to demonstrate the effectiveness of the preconditioner and the overall solution strategy.
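The shifted Grünwald weights g_k = (-1)^k · C(α, k) that appear in such discretisations satisfy a simple recurrence; a minimal sketch (the function name and interface are illustrative, not from the paper):

```python
# Minimal sketch of the shifted Grünwald weights g_k = (-1)^k * C(alpha, k),
# computed by the standard recurrence g_k = g_{k-1} * (k - 1 - alpha) / k.
def grunwald_weights(alpha, n):
    """Return the first n shifted Grünwald weights for fractional order alpha."""
    g = [1.0]
    for k in range(1, n):
        g.append(g[-1] * (k - 1 - alpha) / k)
    return g
```

A quick sanity check: for α = 2 the weights reduce to the familiar 1, -2, 1 second-difference stencil, with all later weights zero.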

Relevance:

20.00%

Publisher:

Abstract:

Data structures such as k-D trees and hierarchical k-means trees perform very well in approximate k nearest neighbour matching, but are only marginally more effective than linear search when performing exact matching in high-dimensional image descriptor data. This paper presents several improvements to linear search that allow it to outperform existing methods and recommends two approaches to exact matching. The first method reduces the number of operations by evaluating the distance measure in order of significance of the query dimensions and terminating when the partial distance exceeds the search threshold. This method does not require preprocessing and significantly outperforms existing methods. The second method improves query speed further by presorting the data using a data structure called d-D sort. The order information is used as a priority queue to reduce the time taken to find the exact match and to restrict the range of data searched. Construction of the d-D sort structure is very simple to implement, does not require any parameter tuning, and requires significantly less time than the best-performing tree structure, and data can be added to the structure relatively efficiently.
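A sketch of the first idea only (all names are illustrative, not from the paper): accumulate the squared distance one dimension at a time and stop as soon as the running sum exceeds the best full distance seen so far. The paper's ordering of dimensions by significance is omitted for brevity.

```python
import math

def nn_partial_distance(query, data):
    """Exact nearest neighbour via linear search with early termination."""
    best_idx, best_d2 = -1, math.inf
    for i, point in enumerate(data):
        d2 = 0.0
        for q, p in zip(query, point):
            d2 += (q - p) ** 2
            if d2 >= best_d2:
                break  # partial distance already exceeds the best candidate
        else:
            best_idx, best_d2 = i, d2  # full distance computed and smaller
    return best_idx, best_d2
```

The early exit preserves exactness: a point is only skipped once its partial distance already rules it out as the nearest neighbour.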

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the implementation of the first portable, embedded data acquisition unit (BabelFuse) that is able to acquire and timestamp generic sensor data and trigger General Purpose I/O (GPIO) events against a microsecond-accurate wirelessly-distributed ‘global’ clock. A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM) where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment especially if non-deterministic communication hardware (such as IEEE-802.11-based wireless) and inaccurate clock synchronisation protocols are used. The issue of differing timebases makes correlation of data difficult and prevents the units from reliably performing synchronised operations or manoeuvres. By utilising hardware-assisted timestamping, clock synchronisation protocols based on industry standards and firmware designed to minimise indeterminism, an embedded data acquisition unit capable of microsecond-level clock synchronisation is presented.

Relevance:

20.00%

Publisher:

Abstract:

The main aim of this paper is to describe an adaptive re-planning algorithm based on an RRT and game theory that produces an efficient, collision-free, obstacle-adaptive Mission Path Planner for Search and Rescue (SAR) missions. This will provide UAV autopilots and flight computers with the capability to autonomously avoid static obstacles and No Fly Zones (NFZs) through dynamic adaptive path re-planning. The methods and algorithms produce optimal collision-free paths and can be integrated into a decision aid tool and UAV autopilots.
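The underlying primitive can be illustrated with a generic textbook RRT in 2D (this is not the paper's adaptive re-planner; no-fly zones are simplified to circles and collision checks are node-only for brevity):

```python
import math
import random

# Illustrative minimal 2D RRT: grow a tree from `start` toward random samples
# in a 10x10 workspace, rejecting nodes that land inside circular no-fly
# zones given as (cx, cy, radius) tuples.
def rrt(start, goal, obstacles, step=0.5, iters=5000, seed=1):
    random.seed(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        # Goal biasing: sample the goal 20% of the time.
        sample = goal if random.random() < 0.2 else (
            random.uniform(0.0, 10.0), random.uniform(0.0, 10.0))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0.0:
            continue
        new = (nx + step * (sample[0] - nx) / d,
               ny + step * (sample[1] - ny) / d)
        if any(math.dist(new, (cx, cy)) <= r for cx, cy, r in obstacles):
            continue  # node lands in a no-fly zone
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < step:  # close enough: recover the path
            path, k = [goal], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None  # no path found within the iteration budget
```

An adaptive re-planner would re-run this search whenever a new obstacle or NFZ is detected, seeding the tree from the vehicle's current position.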

Relevance:

20.00%

Publisher:

Abstract:

The study of destination brand performance measurement has only emerged in earnest as a field in the tourism literature since 2007. The concept of consumer-based brand equity (CBBE) is gaining favour from services marketing researchers as an alternative to the traditional 'net-present-value of future earnings' method of measuring brand equity. The perceptions-based CBBE model also appears suitable for examining destination brand performance, where a financial brand equity valuation on a destination marketing organisation's (DMO) balance sheet is largely irrelevant. This is the first study to test and compare the model in both short- and long-haul markets. The paper reports the results of tests of a CBBE model for Australia in a traditional short-haul market (New Zealand) and an emerging long-haul market (Chile). The data from both samples indicated destination brand salience, brand image, and brand value are positively related to purchase intent for Australia in these two disparate markets.

Relevance:

20.00%

Publisher:

Abstract:

Damage assessment (damage detection, localisation and quantification) in structures and appropriate retrofitting will enable the safe and efficient function of the structures. In this context, many Vibration Based Damage Identification Techniques (VBDIT) have emerged with potential for accurate damage assessment. VBDITs have achieved significant research interest in recent years, mainly due to their non-destructive nature and ability to assess inaccessible and invisible damage locations. Damage Index (DI) methods are also vibration based, but they are not based on the structural model. DI methods are fast and inexpensive compared to the model-based methods and have the ability to automate the damage detection process. DI methods analyse the change in vibration response of the structure between two states so that the damage can be identified. Extensive research has been carried out to apply the DI method to assess damage in steel structures. Comparatively, there has been very little research interest in the use of DI methods to assess damage in Reinforced Concrete (RC) structures, due to the complexity of simulating the predominant damage type, the flexural crack. Flexural cracks in RC beams distribute non-linearly and propagate in all directions. Secondary cracks extend more rapidly along the longitudinal and transverse directions of an RC structure than existing cracks propagate in the depth direction, due to the stress distribution caused by the tensile reinforcement. Simplified damage simulation techniques (such as reductions in the modulus or section depth, or use of rotational spring elements) that have been extensively used in research on steel structures cannot be applied to simulate flexural cracks in RC elements. This highlights a significant gap in knowledge and, as a consequence, VBDITs have not been successfully applied to damage assessment in RC structures.
This research addresses the above gap in knowledge by developing and applying a modal strain energy based DI method to assess damage in RC flexural members. Firstly, this research evaluated different damage simulation techniques and recommended an appropriate technique to simulate the post-cracking behaviour of RC structures. The ABAQUS finite element package was used throughout the study with properly validated material models. The damaged plasticity model was recommended as the method which can correctly simulate the post-cracking behaviour of RC structures and was used in the rest of this study. Four different forms of Modal Strain Energy based Damage Indices (MSEDIs) were proposed to improve the damage assessment capability by minimising the number and intensity of false alarms. The developed MSEDIs were then used to automate the damage detection process by incorporating programmable algorithms. The developed algorithms have the ability to identify common issues associated with the vibration properties, such as mode shifting and phase change. To minimise the effect of noise on the DI calculation process, this research proposed a sequential curve-fitting technique. Finally, a statistics-based damage assessment scheme was proposed to enhance the reliability of the damage assessment results. The proposed techniques were applied to locate damage in RC beams and a slab-on-girder bridge model to demonstrate their accuracy and efficiency. The outcomes of this research will make a significant contribution to the technical knowledge of VBDIT and will enhance the accuracy of damage assessment in RC structures. The application of the research findings to RC flexural members will enable their safe and efficient performance.
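The flavour of a curvature-based damage index can be sketched as follows (illustrative only; this is not one of the four MSEDI forms proposed in the thesis): each element's modal strain energy is approximated by the squared second difference (curvature) of a mode shape, and the damaged share is compared with the healthy share.

```python
import math

def damage_index(phi_healthy, phi_damaged):
    """Ratio of each element's share of modal strain energy, damaged vs healthy."""
    def elem_energy(phi):
        # Squared second difference as a proxy for element strain energy.
        return [(phi[i - 1] - 2.0 * phi[i] + phi[i + 1]) ** 2
                for i in range(1, len(phi) - 1)]
    eh, ed = elem_energy(phi_healthy), elem_energy(phi_damaged)
    th, td = sum(eh), sum(ed)
    # A ratio above 1 flags elements whose share of strain energy has grown.
    return [(d / td) / (h / th) if h > 0.0 else math.inf
            for h, d in zip(eh, ed)]
```

Perturbing a sine-shaped mode at mid-span, for instance, makes the index peak at the mid-span element, which is the localisation behaviour such indices rely on.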

Relevance:

20.00%

Publisher:

Abstract:

This chapter contains sections titled: Introduction; ICZM and sustainable development of coastal zone; International legal framework for ICZM; Implementation of international legal obligations in domestic arena; Concluding remarks; References.

Relevance:

20.00%

Publisher:

Abstract:

The Water Catchment: fast forward to the past comprises two parts: a creative piece and an exegesis. The methodology is Creative Practice as Research, a process of critical reflection in which I observe how researching the exegesis (in my case, analysing how the social reality of the era in which an author writes affects their writing of the protagonist's journey) in turn shapes how I write the hero's pathway in the creative piece. The genre in which the protagonist's journey is charted and represented is dystopian young adult fiction; hence my creative piece, The Water Catchment, is a novel manuscript for a dystopian young adult fantasy. It is a speculative novel set in a possible future and poses (and answers) the question: what might happen if water becomes the most powerful commodity on earth? There are two communities, called 'worlds' to create a barrier and difference where physical ones are not in evidence. A battle ensues over unfair conditions and access to water. In the end the protagonist, Caitlyn, takes over leadership, heralding a new era of co-operation and water management between the two worlds. The exegesis examines how the hero's pathway, the journey towards knowledge and resolution, is best explored in young adult literature through dystopian narratives. I explore how the dystopian worlds of Ursula Le Guin's first and last books of The Earthsea Quartet are foundational, and lay this examination over an analysis of both the hero's pathway within the novels and the social contexts outside them. Dystopian narratives constitute a liberating space for the adolescent protagonist between the reliance on adults in childhood and the world of adults. In young adult literature such narratives provide fertile ground to explore those aspects informing an adolescent's future.

Relevance:

20.00%

Publisher:

Abstract:

The most powerful known primitive in public-key cryptography is undoubtedly elliptic curve pairings. When they were introduced just over ten years ago, the computation of pairings was far too slow for them to be considered a practical option. This resulted in a vast amount of research from many mathematicians and computer scientists around the globe aiming to improve this computation speed. From the use of modern results in algebraic and arithmetic geometry to the application of foundational number theory that dates back to the days of Gauss and Euler, cryptographic pairings have since experienced a great deal of improvement. As a result, what was an extremely expensive computation that took several minutes is now a high-speed operation that takes less than a millisecond. This thesis presents a range of optimisations to the state of the art in cryptographic pairing computation. Both by extending prior techniques and by introducing several novel ideas of our own, our work has contributed to record-breaking pairing implementations.

Relevance:

20.00%

Publisher:

Abstract:

The assessment of skin temperature (Tsk) in athletic therapy and sports medicine research is an extremely important physiological outcome measure. Various methods of recording Tsk, including thermistors, thermocouples and thermochrons, are currently being used for research purposes. These techniques are constrained by their wires limiting the freedom of the subject, slow response times, and/or sensors falling off. Furthermore, as these products typically are directly attached to the skin and cover the measurement site, their validity may be questionable. This manuscript addresses the use and potential benefits of thermal imaging (TI) in sports medicine research. Non-contact infrared TI offers a quick, non-invasive, portable and athlete-friendly method of assessing Tsk. TI is a useful Tsk diagnostic tool that has the potential to be an integral part of sports medicine research in the future. Furthermore, as the technique is non-contact, it has several advantages over existing methods of recording skin temperature.

Relevance:

20.00%

Publisher:

Abstract:

Magnesium and its alloys have shown great potential for effective hydrogen storage due to their advantages of high volumetric/gravimetric hydrogen storage capacity and low cost. However, the use of these materials in fuel cells for automotive applications at the present time is limited by high hydrogenation temperatures and sluggish sorption kinetics. This paper presents recent results on the design and development of magnesium-based nanocomposites, demonstrating the catalytic effects of carbon nanotubes and transition metals on hydrogen adsorption in these materials. The results are promising for the application of magnesium materials for hydrogen storage, with significantly reduced absorption temperatures and enhanced absorption/desorption kinetics. High-level Density Functional Theory calculations support the analysis of the hydrogenation mechanisms by revealing the detailed atomic and molecular interactions that underpin the catalytic roles of incorporated carbon and titanium, providing clear guidance for further design and development of such materials with better hydrogen storage properties.

Relevance:

20.00%

Publisher:

Abstract:

The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables. Thus pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely. This can be difficult to achieve in complex applications. In this paper we propose to take advantage of multiple central processing units (CPUs) that are readily available on most standard desktop computers. Here the likelihood is estimated independently on the multiple CPUs, with the ultimate estimate of the likelihood being the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but the variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
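The variance-reduction idea can be demonstrated with a stand-in estimator (true likelihood value 1.0 plus Gaussian noise, not a real particle-filter likelihood; the parallel CPU dispatch is replaced by a sequential loop for clarity, and all names are illustrative):

```python
import random
import statistics

def likelihood_estimate(rng):
    """Stand-in unbiased likelihood estimator: true value 1.0, variance 0.25."""
    return 1.0 + rng.gauss(0.0, 0.5)

def averaged_estimate(rng, k):
    """One estimate per (simulated) CPU, then the plain average."""
    return sum(likelihood_estimate(rng) for _ in range(k)) / k

def compare(n=20000, k=4, seed=0):
    """Empirical mean of the averaged estimator, and both variances."""
    rng = random.Random(seed)
    single = [likelihood_estimate(rng) for _ in range(n)]
    avgk = [averaged_estimate(rng, k) for _ in range(n)]
    return (statistics.mean(avgk),
            statistics.variance(single),
            statistics.variance(avgk))
```

Averaging K independent unbiased estimates keeps the estimator centred on the true value while cutting its variance by roughly a factor of K, which is exactly the property the pseudo-marginal scheme exploits.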

Relevance:

20.00%

Publisher:

Abstract:

Stereo visual odometry has received little investigation in high altitude applications due to the generally poor performance of rigid stereo rigs at extremely small baseline-to-depth ratios. Without additional sensing, metric scale is considered lost and odometry is seen as effective only for monocular perspectives. This paper presents a novel modification to stereo based visual odometry that allows accurate, metric pose estimation from high altitudes, even in the presence of poor calibration and without additional sensor inputs. By relaxing the (typically fixed) stereo transform during bundle adjustment and reducing the dependence on the fixed geometry for triangulation, metrically scaled visual odometry can be obtained in situations where high altitude and structural deformation from vibration would cause traditional algorithms to fail. This is achieved through the use of a novel constrained bundle adjustment routine and accurately scaled pose initializer. We present visual odometry results demonstrating the technique on a short-baseline stereo pair inside a fixed-wing UAV flying at significant height (~30-100m).

Relevance:

20.00%

Publisher:

Abstract:

Background: Illumina's Infinium SNP BeadChips are extensively used in both small and large-scale genetic studies. A fundamental step in any analysis is the processing of raw allele A and allele B intensities from each SNP into genotype calls (AA, AB, BB). Various algorithms which make use of different statistical models are available for this task. We compare four methods (GenCall, Illuminus, GenoSNP and CRLMM) on data where the true genotypes are known in advance and data from a recently published genome-wide association study.

Results: In general, differences in accuracy are relatively small between the methods evaluated, although CRLMM and GenoSNP were found to consistently outperform GenCall. The performance of Illuminus is heavily dependent on sample size, with lower no-call rates and improved accuracy as the number of samples available increases. For X chromosome SNPs, methods with sex-dependent models (Illuminus, CRLMM) perform better than methods which ignore gender information (GenCall, GenoSNP). We observe that CRLMM and GenoSNP are more accurate at calling SNPs with low minor allele frequency than GenCall or Illuminus. The sample quality metrics from each of the four methods were found to have a high level of agreement at flagging samples with unusual signal characteristics.

Conclusions: CRLMM, GenoSNP and GenCall can be applied with confidence in studies of any size, as their performance was shown to be invariant to the number of samples available. Illuminus on the other hand requires a larger number of samples to achieve comparable levels of accuracy and its use in smaller studies (50 or fewer individuals) is not recommended.