Abstract:
In this paper, an enriched radial point interpolation method (e-RPIM) is developed for the determination of crack tip fields. In e-RPIM, the conventional RBF interpolation is augmented with suitable trigonometric basis functions to reflect the properties of the stresses at the crack tip. The performance of the enriched RBF meshfree shape functions is first investigated by fitting different surfaces. The surface fitting results show that, compared with the conventional RBF shape function, the enriched RBF shape function has: (1) similar accuracy in fitting a polynomial surface; (2) much better accuracy in fitting a trigonometric surface; and (3) similar interpolation stability, with no increase in the condition number of the RBF interpolation matrix. The enriched RBF shape function therefore retains all the advantages of the conventional RBF shape function while accurately reflecting the properties of the stresses at the crack tip. The system of equations for the crack analysis is then derived based on the enriched RBF meshfree shape function and the meshfree weak form. Several linear fracture mechanics problems are simulated using this newly developed e-RPIM method. The results demonstrate that the present e-RPIM is accurate and stable, and has good potential as a practical simulation tool for fracture mechanics problems.
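The enrichment idea can be sketched in a few lines: a conventional RBF interpolant is built by collocation, and the trigonometric terms are simply appended as extra basis columns. The following is an illustrative 1-D sketch (Gaussian RBFs, sin/cos enrichment, invented sample points), not the paper's 2-D crack-tip formulation.

```python
# Minimal 1-D radial basis function (RBF) interpolation sketch, with the
# basis enriched by trigonometric terms (illustrative only).
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def enriched_basis(x, centres, c=1.0):
    # Gaussian RBFs centred at the nodes, plus sin/cos enrichment terms.
    phi = [math.exp(-c * (x - xc) ** 2) for xc in centres]
    return phi + [math.sin(x), math.cos(x)]

def fit(xs, ys):
    # Square collocation system: as many basis functions as samples, so
    # two extra sample points cover the two trigonometric terms.
    centres = xs[:-2]
    A = [enriched_basis(x, centres) for x in xs]
    return centres, solve(A, ys)

def evaluate(x, centres, w):
    return sum(wi * bi for wi, bi in zip(w, enriched_basis(x, centres)))

xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
ys = [math.sin(x) for x in xs]          # a trigonometric target surface
centres, w = fit(xs, ys)
```

Because the enriched basis contains sin(x) itself, the interpolant reproduces this trigonometric target essentially exactly, which mirrors the paper's surface-fitting observation (2).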
Abstract:
The coral reefs around the world may be likened to canaries down the mineshaft of global warming. These sensitive plant-like animals have evolved for life in tropical seas. Their needs are quite specific: not too cold, not too hot. A rise of as little as one degree Celsius is enough to cause some bleaching of these colourful jewels of the sea. Many climate models indicate we can expect sea temperature increases of between two and six degrees Celsius. Research, such as that detailed in a 2004 report by the University of Queensland’s Centre for Marine Studies, indicates that by the year 2050 most of the world’s major reef systems will be dead. Many of us have heard this kind of information, but it remains difficult to comprehend. It is almost impossible to imagine the death of the Great Barrier Reef. Some six to nine thousand years old and visible from space, it is the world’s largest structure created by living organisms. Yet while it is hard to believe, this gentle, sensitive giant is at grave risk because it cannot adapt quickly enough to the changes in its environment. This cluster of fluffy felt brain coral sculptures is connected in real time to temperature data collected by monitoring stations within the Great Barrier Reef, which form part of the Australian Institute of Marine Science’s Great Barrier Reef Ocean Observing System. These corals display illumination patterns showing changes in sea temperature at Heron Reef, one of the 2,900 reefs that comprise the Great Barrier Reef. Their spectrum of colour ranges from cool hues, through warm tones, to bright white when temperatures exceed those that tropical corals are able to tolerate over sustained periods. The Flower Animals also blush in colour and make sound when people come within close proximity. In a reef, fishes and other creatures generate significant amounts of sound.
These cacophonies are considered an indicator of reef health, and are used by reef fish to determine where they can best live and forage.
Abstract:
The main objective of this paper is to detail the development of a feasible hardware design based on Evolutionary Algorithms (EAs) to determine flight path planning for Unmanned Aerial Vehicles (UAVs) navigating terrain with obstacle boundaries. The design architecture includes hardware implementations of the Light Detection And Ranging (LiDAR) terrain and EA population memories, as well as of the EA search and evaluation algorithms used in the optimisation stage of path planning. A synthesisable Very-high-speed integrated circuit Hardware Description Language (VHDL) implementation of the design was developed for realisation on a Field Programmable Gate Array (FPGA) platform. Simulation results show significant speedup compared with an equivalent software implementation written in C++, suggesting that the present approach is well suited to UAV real-time path planning applications.
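As a point of reference for what the software baseline of such an EA looks like, here is a minimal genetic-algorithm path planner over a small occupancy grid. The grid size, obstacle layout, fitness weights and operators are illustrative assumptions, not the paper's hardware design.

```python
# Toy genetic algorithm for grid path planning (illustrative sketch).
import random

random.seed(1)
COLS, ROWS = 8, 8
OBSTACLES = {(3, r) for r in range(2, 7)}   # a wall with a gap at the top

def fitness(path):
    # path[i] is the row chosen in column i; lower cost is better.
    cost = 0.0
    for x, y in enumerate(path):
        if (x, y) in OBSTACLES:
            cost += 100.0                          # heavy obstacle penalty
        if x > 0:
            cost += 1 + abs(path[x] - path[x - 1]) # step plus climb cost
    return cost

def evolve(generations=200, pop_size=30):
    pop = [[random.randrange(ROWS) for _ in range(COLS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]             # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, COLS)
            child = a[:cut] + b[cut:]              # one-point crossover
            if random.random() < 0.3:              # mutation
                child[random.randrange(COLS)] = random.randrange(ROWS)
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
```

The hardware version described above stores the terrain and the population in on-chip memories and pipelines the fitness evaluation, which is where the reported speedup over the C++ equivalent comes from.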
Abstract:
The Queensland University of Technology badges itself as “a university for the real world”. For the last decade the Law Faculty has aimed to provide its students with a ‘real world’ degree, that is, a practical law degree. This has seen skills such as research, advocacy and negotiation incorporated into the undergraduate degree under a University Teaching & Learning grant, a project that gained international recognition and praise. In 2007–2008 the Law Faculty undertook another curriculum review of its undergraduate law degree. As a result of the two-year review, QUT’s undergraduate law degree has fewer core units, a focus on first-year student transition, scaffolding of law graduate capabilities throughout the degree, work integrated learning and transition to the workplace. The revised degree commenced implementation in 2009. This paper focuses on the “real world” approach to the degree achieved through the first-year programme, embedding and scaffolding law graduate capabilities through authentic and valid assessment and work integrated learning.
Abstract:
Drink driving causes more fatal crashes than any other single factor on Australian roads, with a third of crashes having alcohol as a contributing factor. In recent years there has been a plateau in the number of drink drivers apprehended by random breath testing (RBT), and around 12% of the general population in self-report surveys admit to drinking and driving. There is limited information about the first-offender group, particularly the subgroup of these offenders who admit to prior drink driving, the offence therefore being the “first time caught”. This research focuses on the differences between those who report drink driving prior to apprehension for the offence and those who do not. Methods: 201 first-time drink driving offenders were interviewed at the time of their court appearance. Information was collected on socio-demographic variables, driving behaviour, method of apprehension, offence information, alcohol use and self-reported previous drink driving. Results: 78% of respondents reported that they had driven over the legal alcohol limit in the 6 months prior to the offence. Analyses revealed that those offenders who had driven over the limit previously without being caught were more likely to be younger and to have an issue with risky drinking. When all variables were taken into account in a multivariate model using logistic regression, only risky drinking emerged as significantly related to past drink driving: high-risk drinkers were 4.8 times more likely to report having driven over the limit without being apprehended in the previous 6 months. Conclusion: The majority of first offenders are “first time apprehended” rather than “first time drink drivers”. Understanding the differences between these groups may alter the focus of educational or rehabilitation countermeasures. This research is part of a larger project aiming to target first-time apprehended offenders for tailored intervention.
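For readers unfamiliar with the statistic, the “4.8 times more likely” figure is an odds ratio from the logistic regression. A minimal sketch of the same quantity computed directly from a 2x2 table follows; the counts below are hypothetical, not the study's data.

```python
# Odds ratio from a 2x2 contingency table (hypothetical counts).
def odds_ratio(a, b, c, d):
    """OR for a table [[a, b], [c, d]]:
    rows = exposed / unexposed, columns = outcome yes / no."""
    return (a / b) / (c / d)

# Hypothetical: of 70 high-risk drinkers, 60 reported prior drink driving;
# of 131 other offenders, 90 did (totals chosen to match the sample of 201).
or_ = odds_ratio(60, 10, 90, 41)
```

An odds ratio above 1 indicates the exposure (risky drinking) is associated with higher odds of the outcome (self-reported prior drink driving).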
Abstract:
Stereo vision is a method of depth perception, in which depth information is inferred from two (or more) images of a scene, taken from different perspectives. Applications of stereo vision include aerial photogrammetry, autonomous vehicle guidance, robotics, industrial automation and stereomicroscopy. A key issue in stereo vision is that of image matching, or identifying corresponding points in a stereo pair. The difference in the positions of corresponding points in image coordinates is termed the parallax or disparity. When the orientation of the two cameras is known, corresponding points may be projected back to find the location of the original object point in world coordinates. Matching techniques are typically categorised according to the nature of the matching primitives they use and the matching strategy they employ. This report provides a detailed taxonomy of image matching techniques, including area based, transform based, feature based, phase based, hybrid, relaxation based, dynamic programming and object space methods. A number of area based matching metrics as well as the rank and census transforms were implemented, in order to investigate their suitability for a real-time stereo sensor for mining automation applications. The requirements of this sensor were speed, robustness, and the ability to produce a dense depth map. The Sum of Absolute Differences matching metric was the least computationally expensive; however, this metric was the most sensitive to radiometric distortion. Metrics such as the Zero Mean Sum of Absolute Differences and Normalised Cross Correlation were the most robust to this type of distortion but introduced additional computational complexity. The rank and census transforms were found to be robust to radiometric distortion, in addition to having low computational complexity. They are therefore prime candidates for a matching algorithm for a stereo sensor for real-time mining applications. 
A number of issues came to light during this investigation which may merit further work. These include devising a means to evaluate and compare disparity results of different matching algorithms, and finding a method of assigning a level of confidence to a match. Another issue of interest is the possibility of statistically combining the results of different matching algorithms, in order to improve robustness.
Abstract:
The Australian beach is now accepted as a significant part of Australian national culture and identity. However, Huntsman (2001) and Booth (2001) both believe that the beach is dying: “intellectuals have failed to apply to the beach the attention they have lavished on the bush…” (Huntsman 2001, 218). Yet the beach remains a prominent image in contemporary literature and film; authors such as Tim Winton and Robert Drewe frequently set their stories in and around the coast. Although the beach was initially considered a space of myth (Fiske, Hodge, and Turner 1987), Meaghan Morris labelled it ‘ordinary’ (1998), and as recently as 2001, in the wake of the Sydney Olympic Games, Bonner, McKee, and Mackay termed it ‘tacky’ and ‘familiar’. The beach, it appears, defies easy categorisation. In fact, I believe the beach is more than merely mythic or ordinary, or a combination of the two; it is an imaginative space, seamlessly shifting its metaphorical meanings depending on readings of the texts. My studies examine the beach through five common beach myths; this paper explores the myth of the beach as an egalitarian space. Contemporary Australian national texts no longer conform to these mythical representations (in fact, was the beach ever a space of equality?), instead creating new definitions for a beach space that continually shifts in meaning. Recent texts such as Tim Winton’s Breath (2008) and Stephen Orr’s Time’s Long Ruin (2010) lay a more complex metaphorical meaning upon the beach space. This paper explores the beach as a space of egalitarianism in conjunction with recent Australian fiction and films in order to discover how the contemporary beach is represented.
Abstract:
The Internet presents a constantly evolving frontier for criminology and policing, especially in relation to online predators: paedophiles operating within the Internet for safer access to children, child pornography and networking opportunities with other online predators. The goals of this qualitative study are to undertake behavioural research, identifying personality types and archetypes of online predators and comparing and contrasting them with behavioural profiles and other psychological research on offline paedophiles and sex offenders; to gather intelligence on the technological utilisation of online predators; and to conduct observational research on the social structures of online predator communities. These goals were achieved through the covert monitoring and logging of public activity within four Internet Relay Chat (IRC) chatrooms themed around child sexual abuse and located on the Undernet network. Five days of monitoring were conducted on these four chatrooms, from Wednesday 1 to Sunday 5 April 2009; the raw data were collated and analysed. The analysis identified four personality types (the gentleman predator, the sadist, the businessman and the pretender) and eight archetypes (the groomers, dealers, negotiators, roleplayers, networkers, chat requestors, posters and travellers). The characteristics and traits of these personality types and archetypes, extracted from the literature on offline paedophiles and sex offenders, are detailed and contrasted against the online sexual predators identified within the chatrooms, revealing many similarities and interesting differences, particularly for the businessman and pretender personality types.
These personality types and archetypes were illustrated by selecting users who displayed the appropriate characteristics and tracking them through the four chatrooms, revealing intelligence data on the use of proxy servers (especially via the Tor software) and other security strategies such as Undernet’s host-masking service. Name and age changes, used as a potential sexual grooming tactic, were also revealed through the use of Analyst’s Notebook software, and ISP information indicated that many online predators were not using any safety mechanism, relying instead on the anonymity of the Internet. The activities of these online predators were analysed, especially with regard to child sexual grooming and the ‘posting’ of child pornography. This revealed some of the methods by which online predators use new Internet technologies, such as instant messengers, webcams and microphones, to sexually groom and abuse children, as well as to store and disseminate illegal materials on image-sharing websites and peer-to-peer software such as Gigatribe. The social structures of the chatrooms were also analysed, and the community functions and characteristics of each chatroom explored. The findings of this research indicate several opportunities for further research, and recommendations are given on policy, prevention and response strategies with regard to online predators.
Abstract:
Real-time kinematic (RTK) GPS techniques have been extensively developed for applications including surveying, structural monitoring, and machine automation. Limitations of the existing RTK techniques that hinder their application for geodynamics purposes are twofold: (1) the achievable RTK accuracy is on the level of a few centimeters, and the uncertainty of the vertical component is 1.5–2 times worse than that of the horizontal components; and (2) the RTK position uncertainty grows in proportion to the base-to-rover distance. The key limiting factor behind these problems is the significant effect of residual tropospheric errors on the positioning solutions, especially on the highly correlated height component. This paper develops a geometry-specified troposphere decorrelation strategy to achieve subcentimeter kinematic positioning accuracy in all three components. The key is to set up a relative zenith tropospheric delay (RZTD) parameter to absorb the residual tropospheric effects, and to solve the resulting model as an ill-posed problem using the regularization method. In order to compute a reasonable regularization parameter and obtain an optimal regularized solution, the covariance matrix of the positional parameters estimated without the RZTD parameter, which is characterized by the observation geometry, is used to replace the quadratic matrix of their “true” values. As a result, the regularization parameter is computed adaptively as the observation geometry varies. The experimental results show that the new method can efficiently alleviate the model’s ill-conditioning and stabilize the solution from a single data epoch. Compared with the conventional least squares method, the new method improves the long-range RTK solution precision from several centimeters to the subcentimeter level in all components; more significantly, the precision of the height component is even higher.
Several geosciences applications that require subcentimeter real-time solutions can benefit greatly from the proposed approach, such as real-time monitoring of earthquakes and large dams, high-precision GPS leveling, and refinement of the vertical datum. In addition, the high-resolution RZTD solutions can contribute to the effective recovery of tropospheric slant path delays in order to establish a 4-D troposphere tomography.
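The regularization step described above can be sketched with a toy ill-posed least-squares problem. This minimal example uses a fixed Tikhonov parameter lambda, whereas the paper computes it adaptively from the observation geometry; the matrices below are invented for illustration.

```python
# Tikhonov-regularised least squares, min ||Ax - b||^2 + lam * ||x||^2,
# on a deliberately ill-conditioned toy problem (illustrative only).
def solve2(M, v):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

def regularised_ls(A, b, lam):
    # Regularised normal equations: (A^T A + lam * I) x = A^T b.
    AtA = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(2)]
           for i in range(2)]
    Atb = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(2)]
    AtA[0][0] += lam
    AtA[1][1] += lam
    return solve2(AtA, Atb)

# Nearly collinear columns make the problem ill-posed: tiny noise in b
# produces a wildly different unregularised solution.
A = [[1.0, 1.0], [1.0, 1.0001], [1.0, 0.9999]]
b_noisy = [2.0, 2.001, 1.999]

x0 = regularised_ls(A, b_noisy, 0.0)     # ordinary least squares
x1 = regularised_ls(A, b_noisy, 0.01)    # regularised solution
```

With lambda = 0 the millimetre-level noise is amplified into a solution far from the noise-free answer [1, 1]; a small lambda pulls the estimate back, which is the stabilising role the RZTD regularization plays in the RTK model.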
Abstract:
This paper presents a framework for performing real-time recursive estimation of landmarks’ visual appearance. Imaging data in its original high-dimensional space is probabilistically mapped to a compressed low-dimensional space through the definition of likelihood functions. The likelihoods are subsequently fused with prior information using a Bayesian update. This process produces a probabilistic estimate of the low-dimensional representation of the landmark’s visual appearance. The overall filtering provides information complementary to the conventional position estimates, which is used to enhance data association. In addition to robotic observations, the filter integrates human observations into the appearance estimates. The appearance tracks computed by the filter allow landmark classification. The set of labels involved in the classification task is thought of as an observation space where human observations are made by selecting a label. The low-dimensional appearance estimates returned by the filter allow for low-cost communication in low-bandwidth sensor networks. Deployment of the filter in such a network is demonstrated in an outdoor mapping application involving a human operator, a ground vehicle and an air vehicle.
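The Bayesian fusion step above amounts to multiplying a prior belief by an observation likelihood and renormalising. A minimal discrete sketch follows; the labels and likelihood values are invented for illustration, not the paper's model.

```python
# Discrete recursive Bayesian update over appearance labels (sketch).
def bayes_update(belief, likelihood):
    """Fuse a prior belief with an observation likelihood and renormalise."""
    posterior = {c: belief[c] * likelihood[c] for c in belief}
    norm = sum(posterior.values())
    return {c: p / norm for c, p in posterior.items()}

labels = ["tree", "rock", "vehicle"]
belief = {c: 1.0 / len(labels) for c in labels}       # uniform prior

# A robot's compressed image observation weakly favours "tree" ...
belief = bayes_update(belief, {"tree": 0.6, "rock": 0.3, "vehicle": 0.1})
# ... and a human operator's label observation strongly confirms it.
belief = bayes_update(belief, {"tree": 0.9, "rock": 0.05, "vehicle": 0.05})
```

Because the belief is just a short vector of probabilities, transmitting it is cheap, which is the low-bandwidth communication property noted above.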
A modified inverse integer Cholesky decorrelation method and its performance on ambiguity resolution
Abstract:
One of the research focuses in the integer least squares problem is the decorrelation technique, used to reduce the number of integer parameter search candidates and improve the efficiency of the integer parameter search method. It remains a challenging issue for determining carrier phase ambiguities and plays a critical role in the future of high-precision GNSS positioning. Currently, three main decorrelation techniques are employed: integer Gaussian decorrelation, the Lenstra–Lenstra–Lovász (LLL) algorithm and the inverse integer Cholesky decorrelation (IICD) method. Although the performance of these three state-of-the-art methods has been proved and demonstrated, there is still potential for further improvement. To measure the performance of decorrelation techniques, the condition number is usually used as the criterion. Additionally, the number of grid points in the search space can be used directly as a performance measure, since it denotes the size of the search space; however, a smaller initial volume of the search ellipsoid does not always imply a smaller number of candidates. This research proposes a modified inverse integer Cholesky decorrelation (MIICD) method which improves the decorrelation performance over the other three techniques. The decorrelation performance of these methods was evaluated based on the condition number of the decorrelation matrix, the number of search candidates and the initial volume of the search space. Additionally, the success rate of the decorrelated ambiguities was calculated for all methods to investigate ambiguity validation performance. The performance of the different decorrelation methods was tested and compared using both simulated and real data. The simulation scenarios employ an isotropic probabilistic model with predetermined eigenvalues and without any geometry or weighting-system constraints.
The MIICD method outperformed the other three methods, with conditioning improvements over the LAMBDA method of 78.33% and 81.67% without and with the eigenvalue constraint, respectively. The real-data scenarios involve both a single-constellation case and a dual-constellation case. Experimental results demonstrate that, compared with LAMBDA, the MIICD method can significantly reduce the condition number, by 78.65% and 97.78% in the single-constellation and dual-constellation cases respectively. It also shows improvements in the number of search candidate points of 98.92% and 100% in the single-constellation and dual-constellation cases.
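The decorrelation idea and the condition-number criterion above can be sketched on a 2x2 example: a single integer Gaussian decorrelation step applied to a highly correlated ambiguity covariance matrix. The numbers are toy values, not the paper's data.

```python
# One integer Gaussian decorrelation step on a 2x2 covariance matrix,
# measured by the condition-number criterion (toy numbers).
import math

def cond2(Q):
    """Condition number of a symmetric positive-definite 2x2 matrix."""
    t = Q[0][0] + Q[1][1]                     # trace
    d = Q[0][0] * Q[1][1] - Q[0][1] ** 2      # determinant
    s = math.sqrt(t * t - 4 * d)
    return (t + s) / (t - s)                  # ratio of the two eigenvalues

def gauss_decorrelate(Q):
    """One step with Z = [[1, -mu], [0, 1]], mu integer; Q' = Z Q Z^T."""
    mu = round(Q[0][1] / Q[1][1])
    q11 = Q[0][0] - 2 * mu * Q[0][1] + mu * mu * Q[1][1]
    q12 = Q[0][1] - mu * Q[1][1]
    return [[q11, q12], [q12, Q[1][1]]]

Q = [[6.29, 5.98], [5.98, 6.29]]   # highly correlated ambiguities
Qd = gauss_decorrelate(Q)
```

Because the transformation Z is integer-valued with determinant 1, the integer nature of the ambiguities and the search-space volume (the determinant) are preserved while the condition number drops sharply.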
Abstract:
Community engagement with time-poor and seemingly apathetic citizens continues to challenge local governments. Capturing the attention of a digitally literate community who are technologically and socially savvy adds a new quality to this challenge. Community engagement is resource and time intensive, yet local governments have to manage on continually tightening budgets. The benefits of assisting citizens to take ownership of making their community and city a better place to live, in collaboration with planners and local governments, are well established. This study investigates a new collaborative form of civic participation and engagement for urban planning that employs in-place digital augmentation, enhancing people’s experience of physical spaces with digital technologies that are directly accessible within that space, in particular through interaction with mobile phones and public displays. The study developed and deployed a system called Discussions in Space (DIS) in conjunction with a major urban planning project in Brisbane. Planners used the system to ask local residents planning-related questions via a public screen, and passers-by sent responses via SMS or Twitter onto the screen for others to read and reflect on, encouraging in-situ, real-time civic discourse. The low barrier to entry proved successful in engaging a wide range of residents who are generally not heard because of a lack of time or interest. The system also reflected positively on the local government for reaching out in this way. The challenges and implications of the short-texted and ephemeral nature of this medium were evaluated in two focus groups with urban planners. The paper concludes with an analysis of the planners’ feedback, evaluating the merits of the data generated by the system for better engaging with Australia’s new digital locals.
Abstract:
In total, 782 Escherichia coli strains originating from various host sources have been analyzed in this study by using a highly discriminatory single-nucleotide polymorphism (SNP) approach. A set of eight SNPs, with a discrimination value (Simpson's index of diversity [D]) of 0.96, was determined using the Minimum SNPs software, based on sequences of housekeeping genes from the E. coli multilocus sequence typing (MLST) database. Allele-specific real-time PCR was used to screen 114 E. coli isolates from various fecal sources in Southeast Queensland (SEQ). The combined analysis of both the MLST database and SEQ E. coli isolates using eight high-D SNPs resolved the isolates into 74 SNP profiles. The data obtained suggest that SNP typing is a promising approach for the discrimination of host-specific groups and allows for the identification of human-specific E. coli in environmental samples. However, a more diverse E. coli collection is required to determine animal- and environment-specific E. coli SNP profiles due to the abundance of human E. coli strains (56%) in the MLST database.
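The discrimination value quoted above is Simpson's index of diversity, D = 1 - Σ n_i(n_i - 1) / (N(N - 1)), where n_i is the number of isolates sharing profile i and N is the total. A minimal sketch of the computation follows; the profile counts are invented for illustration, not the study's data.

```python
# Simpson's index of diversity for a typing scheme (illustrative counts).
def simpsons_d(counts):
    """D = 1 - sum(n_i * (n_i - 1)) / (N * (N - 1)); higher D means
    two randomly picked isolates are more likely to have different profiles."""
    n = sum(counts)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# e.g. 20 isolates falling into six profiles of sizes 5, 5, 4, 3, 2 and 1
d = simpsons_d([5, 5, 4, 3, 2, 1])
```

A D of 0.96, as reported for the eight-SNP set, means two isolates drawn at random are resolved into different SNP profiles 96% of the time.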