216 results for degeneration thesis


Relevance: 10.00%

Publisher:

Abstract:

The central aim of the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface and the comparison of the model's behaviour with experimental observations. A series of five papers has been presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of the droplet on the leaf surface is important for understanding how a droplet of water, pesticide, or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially, a laser scanner is used to capture the surface characteristics of two types of leaves in the form of a large scattered data set. After the identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach for leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed using numerical experimentation. The hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces, including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface.
Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from or comes to a standstill on the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
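The abstract does not give the fitting equations, but the RBF half of a hybrid scheme like the one described can be sketched in a few lines. The snippet below is an illustrative sketch only, assuming a Gaussian kernel and synthetic "leaf" heights over jittered sample positions; the thesis's actual CT-RBF construction on a triangulated scan is more involved.

```python
import numpy as np

def rbf_fit(points, values, eps=3.0):
    """Solve for Gaussian RBF weights that interpolate scattered data."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)        # kernel matrix over sample points
    return np.linalg.solve(phi, values)  # interpolation weights

def rbf_eval(points, weights, query, eps=3.0):
    """Evaluate the fitted RBF surface at query points."""
    d = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=-1)
    return np.exp(-(eps * d) ** 2) @ weights

# Jittered-grid stand-in for laser-scanned "leaf" heights over (x, y)
rng = np.random.default_rng(0)
g = np.linspace(0.0, 1.0, 5)
xy = np.stack(np.meshgrid(g, g), axis=-1).reshape(-1, 2)
xy += rng.uniform(-0.05, 0.05, xy.shape)
z = np.sin(np.pi * xy[:, 0]) * np.cos(np.pi * xy[:, 1])

w = rbf_fit(xy, z)
z_hat = rbf_eval(xy, w, xy)
# An interpolant must reproduce the data at the sample points
print(np.max(np.abs(z_hat - z)) < 1e-6)
```

A Gaussian kernel gives the continuously turning normal the abstract emphasises, at the cost of a dense linear solve whose conditioning depends on the kernel width `eps`.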

Relevance: 10.00%

Publisher:

Abstract:

The World Wide Web has become a medium for people to share information. People use Web-based collaborative tools such as question answering (QA) portals, blogs/forums, email and instant messaging to acquire information and to form online-based communities. In an online QA portal, a user asks a question and other users can provide answers based on their knowledge, with the question usually being answered by many users. It can become overwhelming and time- and resource-consuming for a user to read all of the answers provided for a given question. Thus, there exists a need for a mechanism to rank the provided answers so users can focus on reading only good quality answers. The majority of online QA systems use user feedback to rank users' answers, and the user who asked the question can decide on the best answer. Other users who did not participate in answering the question can also vote to determine the best answer. However, ranking the best answer via this collaborative method is time consuming and requires the ongoing involvement of users to provide the needed feedback. The objective of this research is to discover a way to automatically recommend the best answer, as part of a ranked list of answers for a posted question, without the need for user feedback. The proposed approach combines both a non-content-based reputation method and a content-based method to solve the problem of recommending the best answer to the user who posted the question. The non-content method assigns a score to each user which reflects the user's reputation level in using the QA portal system. Each user is assigned two types of non-content-based reputation scores: a local reputation score and a global reputation score. The local reputation score plays an important role in deciding the reputation level of a user for the category in which the question is asked. The global reputation score indicates the prestige of a user across all of the categories in the QA system.
Due to the possibility of user cheating, such as awarding the best answer to a friend regardless of the answer quality, a content-based method for determining the quality of a given answer is proposed alongside the non-content-based reputation method. Answers for a question from different users are compared with an ideal (or expert) answer using traditional Information Retrieval and Natural Language Processing techniques. Each answer provided for a question is assigned a content score according to how well it matches the ideal answer. To evaluate the performance of the proposed methods, each recommended best answer is compared with the best answer determined by one of the most popular link analysis methods, Hyperlink-Induced Topic Search (HITS). The proposed methods are able to yield high accuracy, as shown by Kendall and Spearman correlation scores. The reputation method outperforms the HITS method in terms of recommending the best answer. The inclusion of the reputation score alongside the content score improves the overall performance, which is measured through the use of Top-n match scores.

Relevance: 10.00%

Publisher:

Abstract:

Within the Australian wet tropics bioregion, only 900 000 hectares of once continuous rainforest habitat between Townsville and Cooktown now remain. On the Atherton Tableland, only 4% of the rainforest that once occurred there remains today, with remnant vegetation now forming a matrix of rainforest dispersed within agricultural land (sugarcane, banana, orchard crops, townships and pastoral land). Some biologists have suggested that remnants often support both faunal and floral communities that differ significantly from those of the remaining continuous forest. Australian tropical forests possess a relatively high diversity of native small mammal species, particularly rodents, which, unlike larger mammalian and avian frugivores elsewhere, have been shown to be resilient to the effects of fragmentation, patch isolation and reduction in patch size. While small mammals often become the dominant mammalian frugivores in terms of their relative abundance, the relationship that exists between habitat diversity and structure, and the impacts of small mammal foraging within fragmented habitat patches in Australia, is still poorly understood. The current study investigated the relationship between the foraging behaviour and demography of two small mammal species, Rattus fuscipes and Melomys cervinipes, and food resources in fragmented rainforest sites. Population densities of both species were strongly related to the overall density of seed resources in all rainforest fragments. The distribution of both mammal species, however, was found to be independent of the distribution of seed resources. Seed utilisation trials indicated that M. cervinipes and R. fuscipes had less impact on seed resources (extent of seed harvesting) than did other rainforest frugivores.
Experimental feeding trials demonstrated that in 85% of fruit species tested, rodent feeding increased seed germination by a factor of 3.5, suggesting that in Australian tropical rainforest remnants, small mammals may play a significant role in enhancing the germination of large-seeded fruits. This study has emphasised the role of small mammals in tropical rainforest systems in north-eastern Australia, in particular the role that they play within isolated forest fragments where larger frugivorous species may be absent.

Relevance: 10.00%

Publisher:

Abstract:

Financial processes may possess long memory and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects the jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still a matter of debate. These difficulties pose challenges for detecting memory and for modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series on stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA) that can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of the standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider the rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of the MF-DFA. An interesting finding is that short memory is detected for stock prices of the American Stock Exchange (AMEX), while long memory is found to be present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series of the five states of Australia are also found to possess long memory. For these electricity price series, heavy tails are also pronounced in their probability densities. The second part of the thesis develops models to represent short-memory and long-memory financial processes as detected in Part I.
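A minimal sketch of the standard (q = 2) DFA that MF-DFA generalises, applied to synthetic white noise as a short-memory benchmark; the series length, scales and first-order detrending below are illustrative assumptions, not the thesis's settings.

```python
import numpy as np

def dfa(x, scales):
    """Standard (second-moment, q = 2) detrended fluctuation analysis."""
    y = np.cumsum(x - np.mean(x))       # integrated profile of the series
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            # remove the local linear trend from each window
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    # scaling exponent: slope of log F(s) against log s
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(42)
white = rng.standard_normal(8192)       # uncorrelated, short-memory input
alpha = dfa(white, [16, 32, 64, 128, 256])
print(round(alpha, 2))  # close to 0.5 for uncorrelated noise
```

An exponent near 0.5 indicates no memory, values approaching 1 indicate long-range dependence; MF-DFA repeats this computation over a range of moment orders q and with higher-order detrending polynomials.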
These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for models of this type is performed via least squares, and the models are applied to the stock prices in the AMEX, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of the Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. Equations of this type are used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method. The models are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence established by MF-DFA. The third part of the thesis provides an application of the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX). The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of data sets and then provide cross-validation to verify discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market.
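Fisher's discriminant for two classes can be sketched as follows. The two synthetic "market" clusters and their two feature dimensions are hypothetical stand-ins for the MF-DFA and AR(∞)-model parameters used in the classification; none of the numbers come from the thesis.

```python
import numpy as np

def fisher_direction(X0, X1):
    """Fisher's linear discriminant: direction maximising between-class
    separation relative to within-class scatter."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # pooled within-class scatter matrix
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

# Two synthetic "markets" described by two hypothetical scaling features
rng = np.random.default_rng(1)
A = rng.normal([0.45, 1.0], 0.05, size=(50, 2))
B = rng.normal([0.65, 1.4], 0.05, size=(50, 2))

w = fisher_direction(A, B)
# classify by projecting onto w and thresholding at the midpoint
threshold = ((A @ w).mean() + (B @ w).mean()) / 2
pred_B = (B @ w) > threshold
print(pred_B.mean())  # fraction of class-B points on the correct side
```

In practice the discriminant direction would be fitted on training folds and the accuracy checked by cross-validation, as the abstract describes.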
The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia. Comparisons with the results obtained from the R/S analysis, the periodogram method and MF-DFA are provided. The results from fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on second-order moments, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
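The heavy tails contributed by a stable noise input can be illustrated with the Chambers-Mallows-Stuck sampler for symmetric α-stable variates, a standard simulation method (the abstract does not say how the thesis simulates its noise). The stability index 1.5 and the sample size are arbitrary choices for illustration.

```python
import numpy as np

def stable_symmetric(alpha, size, rng):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable noise
    (valid for 0 < alpha < 2, alpha != 1; alpha = 2 is Gaussian)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
    w = rng.exponential(1.0, size)                # unit exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1 - alpha) / alpha))

rng = np.random.default_rng(7)
gauss = rng.standard_normal(100_000)
stable = stable_symmetric(1.5, 100_000, rng)

# Heavy tails: extreme excursions beyond 5 standard units are rare for
# Gaussian noise but common for alpha = 1.5 stable noise
print((np.abs(gauss) > 5).mean(), (np.abs(stable) > 5).mean())
```

Because the tail of an α-stable density decays like a power law |x|^(−α) rather than exponentially, sample paths driven by such noise exhibit the jumps that the abstract attributes to electricity spot prices.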

Relevance: 10.00%

Publisher:

Abstract:

The Tide Lords series of fantasy novels set out to examine the issue of immortality. Its purpose was to look at the desirability of immortality, specifically why people actively seek it. It was meant to examine the practicality of immortality, specifically: having got there, what does one do to pass the time with eternity to fill? I also wished to examine the notion of true immortality: immortals who could not be killed. What I did not anticipate when embarking upon this series, and what did not become apparent until after the series had been sold to two major publishing houses in Australia and the US, was the strength of the immortality tropes. This series was intended to fly in the face of these tropes, but confronted with the reality of such a work, the Australian publishers baulked at the ideas presented, requesting that the series be rewritten with the tropes taken into consideration. They wanted immortals who could die, mortals who wanted to be immortal, and a hero with a sense of humour. This exegesis aims to explore where these tropes originated. It also discusses how I negotiated my way around the tropes and was eventually able to please the publishers by appearing to adhere to the tropes while still staying true to the story I wanted to tell. As such, this discussion is, in part, an analysis of how an author negotiates the tensions of writing within a genre while trying to innovate within it.

Relevance: 10.00%

Publisher:

Abstract:

The establishment of corporate objectives regarding economic, environmental, social, and ethical responsibilities, to inform business practice, has been gaining credibility in the business sector since the early 1990s. This is witnessed through (i) the formation of international forums for sustainable and accountable development, (ii) the emergence of standards, systems, and frameworks to provide common ground for regulatory and corporate dialogue, and (iii) the significant quantum of relevant popular and academic literature in a diverse range of disciplines. How, then, has this move towards greater corporate responsibility become evident in the provision of major urban infrastructure projects? The gap identified, in both academic literature and industry practice, is a structured and auditable link between corporate intent and project outcomes. Limited literature has been discovered which makes a link between corporate responsibility, project performance indicators (or critical success factors) and major infrastructure provision. This search revealed that a comprehensive mapping framework, from an organisation’s corporate objectives through to intended, anticipated and actual outcomes and impacts, has not yet been developed for the delivery of such projects. The research problem thus explored is ‘the need to better identify, map and account for the outcomes, impacts and risks associated with economic, environmental, social and ethical outcomes and impacts which arise from major economic infrastructure projects, both now, and into the future’. The methodology being used to undertake this research is based on Checkland’s soft systems methodology, engaging in action research on three collaborative case studies. A key outcome of this research is a value-mapping framework applicable to Australian public sector agencies.
This is a decision-making methodology which will enable project teams responsible for delivering major projects to better identify and align project objectives and impacts with stated corporate objectives.