863 results for degeneration thesis
Abstract:
Background/Aims: In an investigation of the functional impact of amblyopia on children, the fine motor skills, perceived self-esteem and eye movements of amblyopic children were compared with those of age-matched controls. The influence of amblyogenic condition or treatment factors that might predict any decrement in outcome measures was investigated. The relationship between indirect measures of eye movements that are used clinically and eye movement characteristics recorded during reading was examined, and the relevance of proficiency in fine motor skills to performance on standardised educational tests was explored in a sub-group of the control children. Methods: Children with amblyopia (n=82; age 8.2 ± 1.3 years) from differing causes (infantile esotropia n=17, acquired strabismus n=28, anisometropia n=15, mixed n=13 and deprivation n=9), and a control group of children (n=106; age 9.5 ± 1.2 years), participated in this study. Measures of visual function included monocular logMAR visual acuity (VA) and stereopsis assessed with the Randot Preschool Stereoacuity test, while fine motor skills were measured using the Visual-Motor Control (VMC) and Upper Limb Speed and Dexterity (ULSD) subtests of the Bruininks-Oseretsky Test of Motor Proficiency. Perceived self-esteem was assessed with the Harter Self-Perception Profile for Children for children at grade 3 school level and above, and with the Pictorial Scale of Perceived Competence and Acceptance for Young Children for those in younger grades (preschool to grade 2). A clinical measure of eye movements was made with the Developmental Eye Movement (DEM) test for children aged eight years and above. For appropriate case-control comparison, the results from amblyopic children were compared with age-matched sub-samples drawn from the group of children with normal vision who completed the tests. Eye movements during reading for comprehension were recorded with the Visagraph infra-red recording system, and results of standardised tests of educational performance were also obtained for a sub-set of the control group. Results: Amblyopic children (n=82; age 8.2 ± 1.7 years) performed significantly more poorly than age-matched control children (n=37; age 8.3 ± 1.3 years) on 9 of 16 fine motor skills sub-items and on the overall age-standardised scores for both VMC and ULSD items (p<0.05); differences were most evident on timed manual dexterity tasks. The underlying aetiology of amblyopia and the level of stereoacuity significantly affected fine motor skill performance on both items. However, when examined in a multiple regression model that took into account the inter-correlation between visual characteristics, poorer fine motor skills performance was associated only with strabismus (F(1,75) = 5.428; p = 0.022), and not with the level of stereoacuity, refractive error or visual acuity in either eye. Amblyopic children at grade 3 school level and above (n=47; age 9.2 ± 1.3 years), particularly those with acquired strabismus, had significantly lower social acceptance scores than age-matched control children (n=52; age 9.4 ± 0.5 years) (F(5,93) = 3.14; p = 0.012). However, the scores of the amblyopic children were not significantly different from controls in other areas related to self-esteem, including scholastic competence, physical appearance, athletic competence, behavioural conduct and global self-worth.
A lower social acceptance score was independently associated with a history of treatment with patching, but not with a history of strabismus or wearing glasses. Amblyopic children from preschool to grade 2 school level (n=29; age 6.6 ± 0.6 years) had similar self-perception scores to their age-matched peers (n=20; age 6.4 ± 0.5 years). There were no significant differences between the amblyopic (n=39; age 9.1 ± 0.9 years) and age-matched control (n=42; age 9.3 ± 0.38 years) groups for any of the DEM outcome measures (Vertical Time, Horizontal Time, Number of Errors and Ratio (Horizontal Time/Vertical Time)). Performance on the DEM did not relate significantly to measures of VA in either eye, level of binocular function, history of strabismus or refractive error. The DEM outcome measures Horizontal Time and Vertical Time were significantly correlated with reading rates measured by the Visagraph, for both reading for comprehension and naming numbers (r>0.5). Moderate correlations were also seen between the DEM Ratio and word reading rates recorded by the Visagraph (r=0.37). In children with normal vision, academic scores in mathematics, spelling and reading were associated with measures of fine motor skills; the strongest effect sizes were seen with the timed manual dexterity domain, Upper Limb Speed and Dexterity. Conclusions: Amblyopia may have a negative impact on a child's fine motor skills, and an older child's sense of acceptance by their peers may be influenced by treatment that includes eye patching. Clinical measures of eye movements were not affected in amblyopic children. A number of the DEM outcome measures are associated with objective recordings of reading rates, supporting the test's clinical use for identifying children with slower reading rates. In children with normal vision, proficiency on clinical measures of fine motor skills is associated with outcomes on standardised measures of educational performance; scores on timed manual dexterity tasks had the strongest association with educational performance. Collectively, the results of this study indicate that, in addition to the reduction in visual acuity and binocular function that defines the condition, amblyopes have functional impairments in the childhood developmental skills that underlie proficiency in everyday activities. The study supports strategies aimed at early identification and remediation of amblyopia and of the co-morbidities that arise from abnormal visual neurodevelopment.
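The multiple regression analysis summarised above can be illustrated with a minimal sketch; the variable names, data file and model specification below are hypothetical stand-ins rather than the study's actual data or model.

```python
# A hedged sketch of the kind of multiple regression described above: an
# age-standardised fine motor score regressed on inter-correlated visual
# characteristics. Variable and file names are hypothetical, not the study's.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("amblyopia_outcomes.csv")  # hypothetical data file

# Strabismus as a binary factor; acuity and stereoacuity as covariates.
model = smf.ols(
    "vmc_standard_score ~ strabismus + logmar_va_amblyopic_eye + log_stereoacuity",
    data=df,
).fit()
print(model.summary())  # per-predictor statistics and p-values
```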
Abstract:
3D motion capture is a medium that records motion, typically human motion, converting it into a form that can be represented digitally. It is a fast-evolving field, and recent inertial technology may provide new artistic possibilities for its use in live performance. Although not often used in this context, motion capture has a combination of attributes that can provide unique forms of collaboration with the performance arts. The inertial motion capture suit used for this study has orientation sensors placed at strategic points on the body to map body motion. Its portability, real-time performance, ease of use, and immunity from the line-of-sight problems inherent in optical systems suggest it would work well as a live performance technology. Many animation techniques can be used in real time. This research examines a broad cross-section of these techniques using four practice-led cases to assess the suitability of inertial motion capture for live performance. Although each case explores different visual possibilities, all make use of the performativity of the medium, using either an improvisational format or interactivity among stage, audience and screen that would be difficult to emulate any other way. A real-time environment cannot reproduce the depth and sophistication of the animation people have come to expect through other media, which takes many hours to render. In time, the combination of what can be produced in real time and the tools available in a 3D environment will no doubt create its own tree of aesthetic directions in live performance. The case studies also examine the potential for interactivity that this technology offers.
Abstract:
The central aim of the research undertaken in this PhD thesis is the development of a model for simulating water droplet movement on a leaf surface, and the comparison of the model's behaviour with experimental observations. A series of five papers is presented to explain systematically the way in which this droplet modelling work has been realised. Knowing the path of a droplet on the leaf surface is important for understanding how a droplet of water, pesticide or nutrient will be absorbed through the leaf surface. An important aspect of the research is the generation of a leaf surface representation that acts as the foundation of the droplet model. Initially, a laser scanner is used to capture the surface characteristics of two types of leaves in the form of a large scattered data set. After identification of the leaf surface boundary, a set of internal points is chosen over which a triangulation of the surface is constructed. We present a novel hybrid approach to leaf surface fitting on this triangulation that combines Clough-Tocher (CT) and radial basis function (RBF) methods to achieve a surface with a continuously turning normal. The accuracy of the hybrid technique is assessed by numerical experimentation, and the hybrid CT-RBF method is shown to give good representations of Frangipani and Anthurium leaves. Such leaf models facilitate an understanding of plant development and permit the modelling of the interaction of plants with their environment. The motion of a droplet traversing this virtual leaf surface is affected by various forces, including gravity, friction and resistance between the surface and the droplet. The innovation of our model is the use of thin-film theory in the context of droplet movement to determine the thickness of the droplet as it moves on the surface. Experimental verification shows that the droplet model captures reality quite well and produces realistic droplet motion on the leaf surface. Most importantly, we observed that the simulated droplet motion follows the contours of the surface and spreads as a thin film. In the future, the model may be applied to determine the path of a droplet of pesticide along a leaf surface before it falls from, or comes to a standstill on, the surface. It will also be used to study the paths of many droplets of water or pesticide moving and colliding on the surface.
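The surface-fitting step lends itself to a short sketch. The snippet below uses SciPy's off-the-shelf Clough-Tocher and RBF interpolators on synthetic scattered points; it illustrates the two ingredients named above, not the thesis's actual hybrid blending scheme.

```python
# A minimal sketch, using SciPy's built-in interpolators rather than the
# thesis's hybrid CT-RBF scheme, of fitting a smooth surface to scattered
# (x, y, z) leaf-scan points. The data here are synthetic stand-ins.
import numpy as np
from scipy.interpolate import CloughTocher2DInterpolator, RBFInterpolator

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(500, 2))            # scanned (x, y) sites
z = np.sin(3 * pts[:, 0]) * np.cos(2 * pts[:, 1])     # stand-in leaf heights

ct = CloughTocher2DInterpolator(pts, z)               # piecewise-cubic CT fit
rbf = RBFInterpolator(pts, z, kernel="thin_plate_spline")  # global RBF fit

query = np.array([[0.4, 0.6]])                        # a point on the surface
print(ct(query), rbf(query))                          # evaluate both fits
```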
Abstract:
The World Wide Web has become a medium for people to share information. People use Web-based collaborative tools such as question answering (QA) portals, blogs/forums, email and instant messaging to acquire information and to form online communities. In an online QA portal, a user asks a question and other users provide answers based on their knowledge, with a question usually being answered by many users. It can become overwhelming, and time- and resource-consuming, for a user to read all of the answers provided for a given question. Thus, there is a need for a mechanism to rank the provided answers so that users can focus on reading only good quality answers. The majority of online QA systems use user feedback to rank answers, and the user who asked the question decides on the best answer. Other users who did not participate in answering the question can also vote to determine the best answer. However, ranking the best answer via this collaborative method is time-consuming and requires ongoing, continuous involvement of users to provide the needed feedback. The objective of this research is to discover a way to recommend the best answer automatically, as part of a ranked list of answers for a posted question, without the need for user feedback. The proposed approach combines a non-content-based reputation method with a content-based method to solve the problem of recommending the best answer to the user who posted the question. The non-content method assigns a score to each user which reflects the user's reputation level in using the QA portal system. Each user is assigned two types of non-content-based reputation score: a local reputation score and a global reputation score. The local reputation score plays an important role in deciding the reputation level of a user for the category in which the question is asked, while the global reputation score indicates the prestige of a user across all of the categories in the QA system. Because of the possibility of user cheating, such as awarding the best answer to a friend regardless of the answer's quality, a content-based method for determining the quality of a given answer is proposed alongside the non-content-based reputation method. Answers to a question from different users are compared with an ideal (or expert) answer using traditional Information Retrieval and Natural Language Processing techniques, and each answer is assigned a content score according to how well it matches the ideal answer. To evaluate the performance of the proposed methods, each recommended best answer is compared with the best answer determined by one of the most popular link analysis methods, Hyperlink-Induced Topic Search (HITS). The proposed methods yield high accuracy, as shown by Kendall and Spearman rank correlation scores. The reputation method outperforms the HITS method in recommending the best answer, and the inclusion of the reputation score with the content score improves overall performance, as measured by Top-n match scores.
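The content-scoring idea described above can be sketched briefly. The snippet below compares candidate answers against an ideal answer using TF-IDF cosine similarity and blends the result with a per-user reputation score; the texts, reputation values and the 50/50 weighting are illustrative assumptions, not the thesis's actual formulation.

```python
# A hedged sketch: score each answer by TF-IDF cosine similarity to an ideal
# (expert) answer, blend with a hypothetical reputation score, and recommend
# the highest-scoring answer. All values below are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

ideal = "Restart the service and check the log file for errors."
answers = [
    "Try restarting the service, then inspect its logs for errors.",
    "I had the same problem once.",
]
reputation = [0.6, 0.9]  # hypothetical blended local/global reputation scores

vec = TfidfVectorizer().fit([ideal] + answers)
content = cosine_similarity(vec.transform(answers), vec.transform([ideal])).ravel()

# Simple equal weighting of content and reputation; the thesis's weighting
# scheme may differ.
combined = [0.5 * c + 0.5 * r for c, r in zip(content, reputation)]
best = max(range(len(answers)), key=combined.__getitem__)
print(f"recommended best answer: #{best}")
```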
Abstract:
Within the Australian wet tropics bioregion, only 900,000 hectares of the once continuous rainforest habitat between Townsville and Cooktown now remain, while on the Atherton Tableland only 4% of the rainforest that once occurred there remains today, with remnant vegetation now forming a matrix of rainforest dispersed within agricultural land (sugarcane, banana and orchard crops, townships and pastoral land). Some biologists have suggested that remnants often support both faunal and floral communities that differ significantly from those of the remaining continuous forest. Australian tropical forests possess a relatively high diversity of native small mammal species, particularly rodents, which, unlike larger mammalian and avian frugivores elsewhere, have been shown to be resilient to the effects of fragmentation, patch isolation and reduction in patch size. While small mammals often become the dominant mammalian frugivores in terms of their relative abundance, the relationship between habitat diversity and structure and the impacts of small mammal foraging within fragmented habitat patches in Australia is still poorly understood. The current study investigated the relationship between the foraging behaviour and demography of two small mammal species, Rattus fuscipes and Melomys cervinipes, and food resources in fragmented rainforest sites. Population densities of both species were strongly related to the overall density of seed resources in all rainforest fragments; the distribution of both mammal species, however, was found to be independent of the distribution of seed resources. Seed utilisation trials indicated that M. cervinipes and R. fuscipes had less impact on seed resources (extent of seed harvesting) than did other rainforest frugivores. Experimental feeding trials demonstrated that in 85% of the fruit species tested, rodent feeding increased seed germination by a factor of 3.5, suggesting that in Australian tropical rainforest remnants, small mammals may play a significant role in enhancing germination of large-seeded fruits. This study has emphasised the role of small mammals in tropical rainforest systems in north-eastern Australia, in particular the role that they play within isolated forest fragments where larger frugivorous species may be absent.
Abstract:
Financial processes may possess long memory, and their probability densities may display heavy tails. Many models have been developed to deal with this tail behaviour, which reflects jumps in the sample paths. On the other hand, the presence of long memory, which contradicts the efficient market hypothesis, is still an issue for further debate. These difficulties present challenges for the problems of memory detection and of modelling the co-presence of long memory and heavy tails. This PhD project aims to respond to these challenges. The first part aims to detect memory in a large number of financial time series of stock prices and exchange rates using their scaling properties. Since financial time series often exhibit stochastic trends, a common form of nonstationarity, strong trends in the data can lead to false detection of memory. We take advantage of a technique known as multifractal detrended fluctuation analysis (MF-DFA), which can systematically eliminate trends of different orders. This method is based on the identification of the scaling of the q-th-order moments and is a generalisation of standard detrended fluctuation analysis (DFA), which uses only the second moment, that is, q = 2. We also consider rescaled range (R/S) analysis and the periodogram method to detect memory in financial time series and compare their results with those of MF-DFA. An interesting finding is that short memory is detected for stock prices on the American Stock Exchange (AMEX), while long memory is found to be present in the time series of two exchange rates, namely the French franc and the Deutsche mark. Electricity price series for the five states of Australia are also found to possess long memory, and heavy tails are pronounced in their probability densities. The second part of the thesis develops models to represent the short-memory and long-memory financial processes detected in Part I. These models take the form of continuous-time AR(∞)-type equations whose kernel is the Laplace transform of a finite Borel measure. By imposing appropriate conditions on this measure, short memory or long memory in the dynamics of the solution will result. A specific form of the models, which has a good MA(∞)-type representation, is presented for the short-memory case. Parameter estimation for this type of model is performed via least squares, and the models are applied to the AMEX stock prices, which were established in Part I to possess short memory. By selecting the kernel in the continuous-time AR(∞)-type equations to have the form of a Riemann-Liouville fractional derivative, we obtain a fractional stochastic differential equation driven by Brownian motion. This type of equation is used to represent financial processes with long memory, whose dynamics are described by the fractional derivative in the equation. These models are estimated via quasi-likelihood, namely via a continuous-time version of the Gauss-Whittle method, and are applied to the exchange rates and the electricity prices of Part I with the aim of confirming their possible long-range dependence as established by MF-DFA. The third part of the thesis applies the results established in Parts I and II to characterise and classify financial markets. We pay attention to the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), the NASDAQ Stock Exchange (NASDAQ) and the Toronto Stock Exchange (TSX).
The parameters from MF-DFA and those of the short-memory AR(∞)-type models are employed in this classification. We propose the Fisher discriminant algorithm to find a classifier in the two- and three-dimensional spaces of the data sets, and then use cross-validation to verify the discriminant accuracies. This classification is useful for understanding and predicting the behaviour of different processes within the same market. The fourth part of the thesis investigates the heavy-tailed behaviour of financial processes which may also possess long memory. We consider fractional stochastic differential equations driven by stable noise to model financial processes such as electricity prices. The long memory of electricity prices is represented by a fractional derivative, while the stable noise input models their non-Gaussianity via the tails of their probability density. A method using the empirical densities and MF-DFA is provided to estimate all the parameters of the model and to simulate sample paths of the equation. The method is then applied to analyse daily spot prices for five states of Australia, and comparisons with the results obtained from R/S analysis, the periodogram method and MF-DFA are provided. The results from the fractional SDEs agree with those from MF-DFA, which are based on multifractal scaling, while those from the periodograms, which are based on the second-order moment, seem to underestimate the long-memory dynamics of the process. This highlights the need for, and usefulness of, fractal methods in modelling non-Gaussian financial processes with long memory.
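The detrending idea behind DFA, of which MF-DFA is the multifractal generalisation, can be sketched compactly. The following is a minimal illustration of standard DFA (the q = 2 case) on synthetic data, not the thesis's implementation.

```python
# A minimal sketch of standard detrended fluctuation analysis (DFA), the
# q = 2 special case of MF-DFA. The scaling exponent of the fluctuation
# function F(s) estimates the Hurst exponent H; H > 0.5 suggests long memory.
# The input here is synthetic white noise, for which H should be close to 0.5.
import numpy as np

def dfa_exponent(x, scales):
    y = np.cumsum(x - np.mean(x))                    # integrated profile
    fluct = []
    for s in scales:
        n = len(y) // s
        segments = y[: n * s].reshape(n, s)
        t = np.arange(s)
        # Remove a linear (first-order) trend from each segment, then take
        # the root-mean-square of the residuals.
        rms = [
            np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
            for seg in segments
        ]
        fluct.append(np.mean(rms))
    # The slope of log F(s) versus log s estimates the scaling exponent.
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

x = np.random.default_rng(1).standard_normal(4096)
print(dfa_exponent(x, scales=[16, 32, 64, 128, 256]))  # ≈ 0.5 for white noise
```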
Abstract:
The Tide Lords series of fantasy novels set out to examine the issue of immortality. Its purpose was to look at the desirability of immortality, specifically why people actively seek it, and to examine the practicality of immortality: having got there, what does one do to pass the time with eternity to fill? I also wished to examine the notion of true immortality: immortals who could not be killed. What I did not anticipate when embarking upon this series, and what did not become apparent until after the series had been sold to two major publishing houses in Australia and the US, was the strength of the immortality tropes. The series was intended to fly in the face of these tropes, but, confronted with the reality of such a work, the Australian publishers baulked at the ideas presented and requested that the series be re-written with the tropes taken into consideration. They wanted immortals who could die, mortals who wanted to be immortal, and a hero with a sense of humour. This exegesis explores where these tropes originated. It also discusses how I negotiated around the tropes and was eventually able to please the publishers by appearing to adhere to them, while still staying true to the story I wanted to tell. As such, this discussion is, in part, an analysis of how an author negotiates the tensions of writing within a genre while trying to innovate within it.
Abstract:
The establishment of corporate objectives regarding economic, environmental, social and ethical responsibilities, to inform business practice, has been gaining credibility in the business sector since the early 1990s. This is witnessed through (i) the formation of international forums for sustainable and accountable development, (ii) the emergence of standards, systems and frameworks that provide common ground for regulatory and corporate dialogue, and (iii) the significant quantum of relevant popular and academic literature across a diverse range of disciplines. How, then, has this move towards greater corporate responsibility become evident in the provision of major urban infrastructure projects? The gap identified, in both the academic literature and industry practice, is a structured and auditable link between corporate intent and project outcomes. Little literature has been discovered that makes a link between corporate responsibility, project performance indicators (or critical success factors) and major infrastructure provision. This search revealed that a comprehensive mapping framework, from an organisation's corporate objectives through to intended, anticipated and actual outcomes and impacts, has not yet been developed for the delivery of such projects. The research problem thus explored is 'the need to better identify, map and account for the outcomes, impacts and risks associated with economic, environmental, social and ethical outcomes and impacts which arise from major economic infrastructure projects, both now and into the future'. The methodology used to undertake this research is based on Checkland's soft systems methodology, engaging in action research on three collaborative case studies. A key outcome of this research is a value-mapping framework applicable to Australian public sector agencies: a decision-making methodology that will enable project teams responsible for delivering major projects to better identify and align project objectives and impacts with stated corporate objectives.