82 results for tracing


Relevance: 10.00%

Abstract:

This paper draws on a study of gender and politics in the Australian parliament to contribute to methodological debates in feminist political science. The paper begins by outlining the different dimensions of feminist political science methodology that have been identified in the literature. According to this literature, five key principles can be seen to constitute feminist approaches to political science: a focus on gender, a deconstruction of the public/private divide, giving voice to women, using research as a basis for transformation, and using reflexivity to critique researcher positionality. The next part of the paper focuses more specifically on reflexivity, tracing arguments about its definition, its usefulness and the criticisms it has attracted from researchers. Following this, I explore how my background as a member of the Australian House of Representatives from 1987 to 1996 provided an important academic resource in my doctoral study of gender and politics in the national parliament. Through this process I highlight the value of a reflexive approach to research.

Relevance: 10.00%

Abstract:

Security-critical communications devices must be evaluated to the highest possible standards before they can be deployed. This process includes tracing potential information flow through the device's electronic circuitry, for each of the device's operating modes. Increasingly, however, security functionality is being entrusted to embedded software running on microprocessors within such devices, so new strategies are needed for integrating information flow analyses of embedded program code with hardware analyses. Here we show how standard compiler principles can augment high-integrity security evaluations to allow seamless tracing of information flow through both the hardware and software of embedded systems. This is done by unifying input/output statements in embedded program execution paths with the hardware pins they access, and by associating significant software states with corresponding operating modes of the surrounding electronic circuitry.
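The unification described above (tying input/output statements in execution paths to the hardware pins they access) can be sketched as a simple taint-propagation pass over one execution path. The statement format, pin names and toy program below are illustrative assumptions, not the paper's actual analysis.

```python
# Sketch: unify I/O statements in an embedded execution path with the hardware
# pins they access, then propagate information-flow "taint" from input pins to
# output pins. Statement format, pin names and the toy path are illustrative.

from dataclasses import dataclass

@dataclass
class Stmt:
    kind: str        # "read_pin", "assign" or "write_pin"
    target: str      # variable or pin written
    sources: tuple   # variables or pins read

# A toy execution path for one operating mode of the device.
path = [
    Stmt("read_pin",  "key",     ("PIN_KEYPAD",)),  # key <- keypad pin
    Stmt("assign",    "buf",     ("key",)),         # buf <- f(key)
    Stmt("assign",    "led",     ("status",)),      # led <- g(status)
    Stmt("write_pin", "PIN_TX",  ("buf",)),         # TX pin <- buf
    Stmt("write_pin", "PIN_LED", ("led",)),         # LED pin <- led
]

def trace_flows(path, input_pins):
    """Return the output pins reachable from the given input pins."""
    tainted = set(input_pins)
    flows = set()
    for s in path:
        if any(src in tainted for src in s.sources):
            if s.kind == "write_pin":
                flows.add(s.target)    # information reaches this hardware pin
            else:
                tainted.add(s.target)  # taint propagates through software state
    return flows

print(trace_flows(path, {"PIN_KEYPAD"}))  # keypad data flows to the TX pin only
```

Running the pass once per operating mode, with that mode's execution paths and pin set, would mirror the per-mode hardware analysis the abstract describes.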

Relevance: 10.00%

Abstract:

This book addresses current debates about globalization and culture by tracing the emergence of Australia as a significant exporter of television to the world market. The authors investigate why Australian programs have found international popularity. The book describes the Australian industry and the international television marketplace. It also examines the impact of Australian programs on the television cultures of the importing countries. The authors outline policy implications and speculate on future directions of Australian television.

Relevance: 10.00%

Abstract:

Aberrations affect image quality of the eye away from the line of sight as well as along it. High amounts of lower order aberrations are found in the peripheral visual field, and higher order aberrations change away from the centre of the visual field. Peripheral resolution is poorer than that in central vision, but peripheral vision is important for movement and detection tasks (for example, driving) which are adversely affected by poor peripheral image quality. Any physiological process or intervention that affects axial image quality will affect peripheral image quality as well. The aim of this study was to investigate the effects of accommodation, myopia, age, and the refractive interventions of orthokeratology, laser in situ keratomileusis and intraocular lens implantation on the peripheral aberrations of the eye. This is the first systematic investigation of peripheral aberrations in a variety of subject groups. Peripheral aberrations can be measured either by rotating a measuring instrument relative to the eye or by rotating the eye relative to the instrument. I used the latter as it is much easier to implement. To rule out effects of eye rotation on peripheral aberrations, I investigated the effects of eye rotation on axial and peripheral cycloplegic refraction using an open-field autorefractor. For axial refraction, the subjects fixated at a target straight ahead while their heads were rotated by ±30° with a compensatory eye rotation to view the target. For peripheral refraction, the subjects rotated their eyes to fixate on targets out to ±34° along the horizontal visual field, followed by measurements in which they rotated their heads such that the eyes stayed in the primary position relative to the head while fixating on the peripheral targets. Oblique viewing did not affect axial or peripheral refraction.
Therefore it is not critical, within the range of viewing angles studied, whether axial and peripheral refractions are measured with rotation of the eye relative to the instrument or rotation of the instrument relative to the eye. Peripheral aberrations were measured using a commercial Hartmann-Shack aberrometer. A number of hardware and software changes were made. The 1.4 mm range-limiting aperture was replaced by a larger aperture (2.5 mm) to ensure that all the light from peripheral parts of the pupil reached the instrument detector even when aberrations were high, such as those that occur in peripheral vision. The power of the super luminescent diode source was increased to improve detection of spots passing through the peripheral pupil. A beam splitter was placed between the subjects and the aberrometer, through which they viewed an array of targets on a wall or projected on a screen in a 6-row × 7-column matrix of points covering a visual field of 42° × 32°. In peripheral vision, the pupil of the eye appears elliptical rather than circular; data were analysed off-line using custom software to determine peripheral aberrations. All analyses in the study were conducted for 5.0 mm pupils. The influence of accommodation on peripheral aberrations was investigated in young emmetropic subjects by presenting fixation targets at 25 cm and 3 m (4.0 D and 0.3 D accommodative demands, respectively). An increase in accommodation did not affect the patterns of any aberrations across the field, but there was an overall negative shift in spherical aberration across the visual field of 0.10 ± 0.01 µm. Subsequent studies were conducted with the targets at a 1.2 m distance. Young emmetropes, young myopes and older emmetropes exhibited similar patterns of astigmatism and coma across the visual field. However, the rate of change of coma across the field was higher in young myopes than in young emmetropes, and was highest of the three groups in older emmetropes.
Spherical aberration showed an overall decrease in myopes and an overall increase in older emmetropes across the field, as compared to young emmetropes. Orthokeratology, spherical IOL implantation and LASIK altered peripheral higher order aberrations considerably, especially spherical aberration. Spherical IOL implantation resulted in an overall increase in spherical aberration across the field. Orthokeratology and LASIK reversed the direction of change in coma across the field. Orthokeratology corrected peripheral relative hypermetropia by correcting myopia in the central visual field. Theoretical ray tracing demonstrated that the changes in aberrations due to orthokeratology and LASIK can be explained by the induced changes in the radius of curvature and asphericity of the cornea. This investigation has shown that peripheral aberrations can be measured with reasonable accuracy with eye rotation relative to the instrument. Peripheral aberrations are affected by accommodation, myopia, age, orthokeratology, spherical intraocular lens implantation and laser in situ keratomileusis. These factors affect the magnitudes and patterns of most aberrations considerably (especially coma and spherical aberration) across the studied visual field. The changes in aberrations across the field may influence peripheral detection and motion perception. However, further research is required to investigate how the changes in aberrations influence peripheral detection and motion perception, and consequently peripheral vision task performance.
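As a rough illustration of the kind of theoretical ray tracing mentioned above, the sketch below traces a marginal ray through a conic anterior corneal surface and compares its axial crossing with the paraxial focus, showing how a flatter, more oblate (post-LASIK-like) shape changes longitudinal spherical aberration. The radii, asphericities and refractive index are illustrative textbook-style values, not the thesis's measured data.

```python
import numpy as np

def axial_crossing(y, R, Q, n1=1.0, n2=1.3375):
    """Trace a ray parallel to the axis at height y (mm) through a conic
    refracting surface (apical radius R, asphericity Q) and return the
    axial distance (mm) at which the refracted ray crosses the axis."""
    # Sag from the conic's implicit form (1+Q)z^2 - 2Rz + y^2 = 0
    z = y**2 / (R + np.sqrt(R**2 - (1 + Q) * y**2))
    slope = y / (R - (1 + Q) * z)               # dz/dy at the hit point
    norm = np.hypot(slope, 1.0)
    n = np.array([slope, -1.0]) / norm          # unit normal facing the ray
    d = np.array([0.0, 1.0])                    # incident direction (y, z)
    cos_i = -d @ n
    sin_t = (n1 / n2) * np.sqrt(1.0 - cos_i**2)
    cos_t = np.sqrt(1.0 - sin_t**2)
    t = (n1 / n2) * d + ((n1 / n2) * cos_i - cos_t) * n   # vector Snell's law
    return z + t[1] * (-y / t[0])               # z where the ray meets y = 0

def lsa(R, Q, y=2.5, n2=1.3375):
    """Longitudinal spherical aberration: paraxial focus minus marginal focus."""
    return n2 * R / (n2 - 1.0) - axial_crossing(y, R, Q, n2=n2)

lsa_normal = lsa(R=7.8, Q=-0.2)    # prolate, normal-like cornea
lsa_oblate = lsa(R=8.6, Q=+0.3)    # flatter, oblate, post-LASIK-like cornea
print(f"LSA normal-like: {lsa_normal:.2f} mm, post-LASIK-like: {lsa_oblate:.2f} mm")
```

The oblate shape yields a larger spherical aberration at the same ray height, consistent with the direction of change the abstract attributes to the altered corneal radius and asphericity.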

Relevance: 10.00%

Abstract:

Although comparison phakometry has been used by a number of studies to measure posterior corneal shape, these studies have not calculated the size of the posterior corneal zones of reflection they assessed. This paper develops paraxial equations for calculating posterior corneal zones of reflection, based on standard keratometry equations and equivalent mirror theory. For targets used in previous studies, posterior corneal reflection zone sizes were calculated using paraxial equations and using exact ray tracing, assuming spherical and aspheric corneal surfaces. Paraxial methods and exact ray tracing methods give similar estimates for reflection zone sizes less than 2 mm, but for larger zone sizes ray tracing methods should be used.
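A minimal sketch of the paraxial estimate, assuming the posterior cornea is represented by an equivalent convex mirror seen through the anterior cornea. The 5.8 mm equivalent radius and the target geometry are illustrative assumptions, not values from the paper.

```python
# Sketch of the paraxial zone-of-reflection estimate for the posterior cornea,
# modelled as an equivalent convex mirror. Radius and target geometry are
# illustrative assumptions.

def reflection_zone_radius(target_radius_mm, target_dist_mm, r_eq_mm):
    """Paraxial zone-of-reflection radius for a ring target.

    A convex mirror of radius r has focal length f = r/2; a target of
    height h at distance d forms a virtual image of height h*f/(d + f),
    and the zone of reflection is approximately the image height.
    """
    f = r_eq_mm / 2.0
    return target_radius_mm * f / (target_dist_mm + f)

# A 50 mm-radius ring 100 mm from the eye, reflected in the equivalent mirror:
zone = reflection_zone_radius(50.0, 100.0, 5.8)
print(f"zone radius ≈ {zone:.2f} mm")  # ≈ 1.41 mm, i.e. a ~2.8 mm zone diameter
```

For zone sizes on this scale the paraxial figure is close to the exact one; per the paper's finding, zones much above 2 mm call for exact ray tracing instead.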

Relevance: 10.00%

Abstract:

Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers attend to only certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal: firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering.
This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
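In its simplest form, the importance-guided refinement could be sketched as proportional allocation of each refinement pass's ray budget across regions. The region scores below stand in for the fuzzy-logic model's output and are purely illustrative.

```python
# Sketch: importance-based allocation of ray samples in progressive rendering.
# Regions with higher visual importance receive proportionally more of each
# refinement pass's ray budget. Scores and budget are illustrative.

def allocate_samples(importance, budget):
    """Split an integer ray budget across regions proportionally to importance."""
    total = sum(importance.values())
    shares = {r: budget * w / total for r, w in importance.items()}
    alloc = {r: int(s) for r, s in shares.items()}
    leftover = budget - sum(alloc.values())
    # Hand the rounding remainder to the largest fractional shares.
    for r in sorted(shares, key=lambda r: shares[r] - alloc[r], reverse=True)[:leftover]:
        alloc[r] += 1
    return alloc

# Region scores as a fuzzy-logic importance model might produce them (0..1):
importance = {"face": 0.9, "foliage": 0.4, "sky": 0.1}
alloc = allocate_samples(importance, budget=1000)
print(alloc)
```

Iterating this allocation pass after pass refines important regions quickly while low-importance regions converge slowly, which is the trade the abstract describes.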

Relevance: 10.00%

Abstract:

The word “queer” is a slippery one; its etymology is uncertain, and academic and popular usage attributes conflicting meanings to the word. By the mid-nineteenth century, “queer” was used as a pejorative term for a (male) homosexual. This negative connotation continues when it becomes a term for homophobic abuse. In recent years, “queer” has taken on additional uses: as an all-encompassing term for culturally marginalised sexualities – gay, lesbian, trans, bi, and intersex (“GLBTI”) – and as a theoretical strategy which deconstructs the binary oppositions that govern identity formation. Tracing its history, the Oxford English Dictionary notes that the earliest references to “queer” may have appeared in the sixteenth century. These early examples of queer carried negative connotations such as “vulgar,” “bad,” “worthless,” “strange,” or “odd,” and such associations continued until the mid-twentieth century. The early nineteenth century, and perhaps earlier, employed “queer” as a verb, meaning “to put out of order,” “to spoil,” “to interfere with.” The adjectival form also began to emerge during this time to refer to a person’s condition as being “not normal,” “out of sorts,” or to cause a person “to feel queer,” meaning “to disconcert, perturb, unsettle.” According to Eve Sedgwick (1993), “the word ‘queer’ itself means across – it comes from the Indo-European root – twerkw, which also yields the German quer (traverse), Latin torquere (to twist), English athwart . . . it is relational and strange.” Despite the gaps in the lineage and the changes in usage, meaning and grammatical form, “queer” as a political and theoretical strategy has benefited from its diverse origins. It refuses to settle comfortably into a single classification, preferring instead to traverse several categories that would otherwise attempt to stabilise notions of chromosomal sex, gender and sexuality.

Relevance: 10.00%

Abstract:

The homeless have been subject to considerable scrutiny, historically and within current social, political and public discourse. The aetiology of homelessness has been the focus of a large body of economic, sociological, historical and political investigation. Importantly, efforts to conceptualise, explain and measure the phenomenon of homelessness and homeless people have occurred largely within the context of defining “the problem of the homeless” and the generation of solutions to the ‘problem’. There has been little consideration of how and why homelessness has come to be seen, or understood, as a problem, or how this can change across time and/or place. This alternative stream of research has focused on tracing and analysing the relationship between how people experiencing homelessness have become a matter of government concern and the manner in which homelessness itself has been problematised. With this in mind, this study has analysed the discourses - political, social and economic rationalities and knowledges - which have provided the conditions of possibility for the identification of the homeless and homelessness as a problem needing to be governed, and the means for translating these discourses into the applied domain. The aim of this thesis has been to contribute to current knowledge by developing a genealogy of the conditions and rationalities that have underpinned the problematisation of homelessness and the homeless. The outcome of this analysis has been to open up the opportunity to consider alternative governmental possibilities arising from the exposure of the way in which contemporary problematisations and responses have been influenced by the past. An understanding of this process creates an ability to appreciate the intended and unintended consequences for the future direction of public policy and contemporary research.

Relevance: 10.00%

Abstract:

Campylobacter jejuni, followed by Campylobacter coli, contributes substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large-scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high-resolution C. jejuni and C. coli genotyping schemes that are convenient for high-throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphism) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP and binary typing, alone or in combination, are effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enable data comparisons with other MLST-based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates.
Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long-term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) based curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short-term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates. In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive hierarchical resolving assays using nucleic acids) approach to genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method.
Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) “Minimum SNPs” and ii) the new ‘HRMtype’ software packages. Species-specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The association of clonal complexes (CC) with each isolate by ‘Minim typing’ and by SNP + binary typing was used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, ‘Minim typing’ is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to meet the resolution of, sequence-based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform that is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high-throughput genotyping of these common food-borne pathogens.
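One conventional way to quantify the claim that the markers' resolving powers are additive is Simpson's index of diversity over the partition of isolates into types: combining markers refines the partition and raises the index. The sketch below applies it to invented marker profiles, not the study's data.

```python
# Sketch: Simpson's index of diversity as a measure of a typing scheme's
# resolving power, applied to hypothetical (invented) marker profiles.

from collections import Counter

def simpsons_diversity(types):
    """Simpson's index of diversity: D = 1 - sum n_i(n_i - 1) / (N(N - 1))."""
    counts = Counter(types).values()
    n = len(types)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical profiles for 8 isolates: SNP-based type and flaA HRM type.
snp_type  = ["ST-48", "ST-48", "ST-48", "ST-524", "ST-524", "ST-21", "ST-21", "ST-21"]
flaA_type = ["f1",    "f2",    "f1",    "f3",     "f3",     "f4",    "f5",    "f4"]
combined  = list(zip(snp_type, flaA_type))

print(f"SNP alone: D = {simpsons_diversity(snp_type):.3f}")   # coarser partition
print(f"combined:  D = {simpsons_diversity(combined):.3f}")   # finer partition
```

Because the combined profile can only split types, never merge them, the combined index is always at least as high as either marker's alone.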

Relevance: 10.00%

Abstract:

Current research and practice related to the first year experience (FYE) of commencing higher education students are still mainly piecemeal rather than institution-wide, with institutions struggling to achieve cross-institutional integration, coordination and coherence of FYE policy and practice. Drawing on a decade of FYE-related research, including an ALTC Senior Fellowship, and evidence from a large Australian metropolitan university, this paper explores how one institution has addressed that issue by tracing the evolution and maturation of strategies that ultimately conceptualize the FYE as “everybody's business.” It is argued that, when first generation co-curricular and second generation curricular approaches are integrated and implemented through an intentionally designed curriculum by seamless partnerships of academic and professional staff in a whole-of-institution transformation, we have a third generation approach, labelled here as transition pedagogy. It is suggested that transition pedagogy provides the optimal vehicle for dealing with increasingly diverse commencing student cohorts by facilitating a sense of engagement, support and belonging. What is presented here is an example of transition pedagogy in action.

Relevance: 10.00%

Abstract:

Distributed Denial of Service (DDoS) attacks have become one of the biggest threats to resources on the Internet. The purpose of these attacks is to prevent servers from providing services to legitimate users; such attacks are also used to consume media bandwidth. Current intrusion detection systems can detect attacks but cannot prevent them or trace the location of intruders. Some schemes prevent attacks by simply discarding attack packets, which saves the victim from the attack, but network bandwidth is still wasted. In our opinion, DDoS requires a distributed solution to avoid this wastage of resources. This paper presents a system that helps not only in detecting such attacks but also in tracing and blocking the multiple intruders (to save bandwidth as well) using intelligent software agents. The system gives a dynamic response and can be integrated with existing network defence systems without disturbing the existing Internet model. We have implemented an agent-based network monitoring system in this regard.
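A monitoring agent of the kind described could, at its simplest, track per-source packet rates in a sliding window and block sources that exceed a threshold. The window size, threshold and traffic trace below are illustrative assumptions, not the paper's actual agent logic.

```python
# Sketch: per-source sliding-window rate monitoring, flagging and blocking
# sources whose packet rate exceeds a threshold. Parameters are illustrative.

from collections import defaultdict, deque

class MonitorAgent:
    def __init__(self, window_s=1.0, max_pkts=100):
        self.window_s = window_s
        self.max_pkts = max_pkts
        self.history = defaultdict(deque)   # src -> recent packet timestamps
        self.blocked = set()

    def observe(self, src, t):
        """Record one packet; return True if src should be (or is) blocked."""
        if src in self.blocked:
            return True
        q = self.history[src]
        q.append(t)
        while q and t - q[0] > self.window_s:   # slide the window forward
            q.popleft()
        if len(q) > self.max_pkts:              # rate exceeds the threshold
            self.blocked.add(src)               # e.g. push a filter upstream
            return True
        return False

agent = MonitorAgent(window_s=1.0, max_pkts=100)
for i in range(150):                      # burst: 150 packets in 0.15 s
    agent.observe("10.0.0.9", i * 0.001)
agent.observe("10.0.0.7", 0.05)           # a single legitimate packet
print(agent.blocked)
```

In a distributed deployment, each agent would run this logic near its own network segment and share block decisions with its peers, so attack traffic is dropped before it consumes downstream bandwidth.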

Relevance: 10.00%

Abstract:

Interactional competence has emerged as a focal point for language testing researchers in recent years. In spoken communication involving two or more interlocutors, the co-construction of discourse is central to successful interaction. The acknowledgement of co-construction has led to concern over the impact of the interlocutor and the separability of performances in speaking tests involving interaction. The purpose of this article is to review recent studies of direct relevance to the construct of interactional competence and its operationalisation by raters in the context of second language speaking tests. The review begins by tracing the emergence of interaction as a criterion in speaking tests from a theoretical perspective, and then focuses on research salient to interactional effectiveness that has been carried out in the context of language testing interviews and group and paired speaking tests.

Relevance: 10.00%

Abstract:

The rapid growth of mobile telephone use, satellite services, and now the wireless Internet and WLANs are generating tremendous changes in telecommunication and networking. As indoor wireless communications become more prevalent, modeling indoor radio wave propagation in populated environments is a topic of significant interest. Wireless MIMO communication exploits phenomena such as multipath propagation to increase data throughput and range, or reduce bit error rates, rather than attempting to eliminate effects of multipath propagation as traditional SISO communication systems seek to do. The MIMO approach can yield significant gains for both link and network capacities, with no additional transmitting power or bandwidth consumption when compared to conventional single-array diversity methods. When MIMO and OFDM systems are combined and deployed in a suitable rich scattering environment such as indoors, a significant capacity gain can be observed due to the assurance of multipath propagation. Channel variations can occur as a result of movement of personnel, industrial machinery, vehicles and other equipment moving within the indoor environment. The time-varying effects on the propagation channel in populated indoor environments depend on the different pedestrian traffic conditions and the particular type of environment considered. A systematic measurement campaign to study pedestrian movement effects in indoor MIMO-OFDM channels has not yet been fully undertaken. Measuring channel variations caused by the relative positioning of pedestrians is essential in the study of indoor MIMO-OFDM broadband wireless networks. Theoretically, due to high multipath scattering, an increase in MIMO-OFDM channel capacity is expected when pedestrians are present. 
However, measurements indicate that some reductions in channel capacity could be observed as the number of pedestrians approaches 10, due to a reduction in multipath conditions as more human bodies absorb the wireless signals. This dissertation presents a systematic characterization of the effects of pedestrians in indoor MIMO-OFDM channels. Measurement results, using the MIMO-OFDM channel sounder developed at the CSIRO ICT Centre, have been validated by a customized Geometric Optics-based ray tracing simulation. Based on measured and simulated MIMO-OFDM channel capacity and MIMO-OFDM capacity dynamic range, an improved deterministic model for MIMO-OFDM channels in indoor populated environments is presented. The model can be used for the design and analysis of future WLANs to be deployed in indoor environments. The results obtained show that, under both the Fixed SNR and Fixed Tx deterministic conditions, the channel capacity dynamic range rose with the number of pedestrians as well as with the number of antenna combinations. In random scenarios with 10 pedestrians, an increment in channel capacity of up to 0.89 bits/sec/Hz in Fixed SNR and up to 1.52 bits/sec/Hz in Fixed Tx has been recorded compared to the one-pedestrian scenario. In addition, a maximum increase in average channel capacity of 49% has been measured when 4 antenna elements are used, compared with 2 antenna elements. The highest measured average capacity, 11.75 bits/sec/Hz, corresponds to the 4x4 array with 10 pedestrians moving randomly. Additionally, the spread between the highest and lowest values of the dynamic range is larger for Fixed Tx (predicted 5.5 bits/sec/Hz, measured 1.5 bits/sec/Hz) than for the Fixed SNR criterion (predicted 1.5 bits/sec/Hz, measured 0.7 bits/sec/Hz). This has been confirmed by both measurements and simulations ranging from 1 to 5, 7 and 10 pedestrians.
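The bits/sec/Hz figures quoted above rest on the standard MIMO channel capacity formula C = log2 det(I + (SNR/Nt) H H^H). The sketch below evaluates it for i.i.d. Rayleigh channels as a stand-in for the measured indoor channels, illustrating the 2x2 versus 4x4 gain at a fixed SNR; the 20 dB SNR and realisation count are illustrative choices.

```python
# Sketch: fixed-SNR MIMO capacity, C = log2 det(I + (SNR/Nt) H H^H), averaged
# over i.i.d. Rayleigh channel realisations (a stand-in for measured channels).

import numpy as np

def mimo_capacity(H, snr_linear):
    """Shannon capacity (bits/s/Hz) with equal power per transmit antenna."""
    nr, nt = H.shape
    det = np.linalg.det(np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T))
    return float(np.real(np.log2(det)))

rng = np.random.default_rng(0)
snr = 10 ** (20 / 10)                     # 20 dB fixed SNR
means = {}
for nt in (2, 4):                         # 2x2 vs 4x4 arrays
    caps = [mimo_capacity((rng.standard_normal((nt, nt)) +
                           1j * rng.standard_normal((nt, nt))) / np.sqrt(2), snr)
            for _ in range(500)]
    means[nt] = float(np.mean(caps))
    print(f"{nt}x{nt}: mean capacity ≈ {means[nt]:.2f} bits/sec/Hz")
```

Doubling the array size roughly doubles mean capacity in a rich-scattering channel, which is the multiplexing gain the dissertation's measurements quantify for populated indoor environments.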

Relevance: 10.00%

Abstract:

A foundational introduction to current issues in postcolonial research on education and youth studies.

Relevance: 10.00%

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for a lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence no “gold standard” test is currently available to assess the tear film integrity. Therefore, improving techniques for the assessment of tear film quality is of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or bowl is projected onto the anterior cornea and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well-structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film.
However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing the tear film dynamics. A set of novel routines was purposely developed to quantify the changes of the reflected pattern and to extract a time series estimate of the TFSQ from the video recording. The routine extracts from each frame of the video recording a maximized area of analysis, in which a metric of the TFSQ is calculated. Initially, two metrics based on Gabor filter and Gaussian gradient-based techniques were used to quantify the consistency of the pattern’s local orientation as a metric of TFSQ. These metrics have helped to demonstrate the applicability of HSV to assess the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare-eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on the TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation, while LSI appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. Receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome.
The LSI technique gave the best results under both natural blinking conditions and suppressed blinking conditions, closely followed by HSV. The DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique, identified during the former clinical study, was a lack of sensitivity to quantify the build-up/formation phase of the tear film cycle. For that reason an extra metric based on image transformation and block processing was proposed. In this metric, the area of analysis is transformed from Cartesian to polar coordinates, converting the concentric-circle pattern into a quasi-straight-line image from which a block statistics value is extracted. This metric has shown better sensitivity under low pattern disturbance and has improved the performance of the ROC curves. Additionally, a theoretical study based on ray-tracing techniques and topographical models of the tear film was undertaken to fully comprehend the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations have helped to provide some understanding of the tear film dynamics; for instance, the model extracted for the build-up phase has provided some insight into the dynamics of this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series have been reported in this thesis. Over the years, different functions have been used to model the time series and to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques for modeling the tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria.
Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques has been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV method has shown good performance in a broad range of conditions (i.e., contact lens wear, normal and dry eye subjects). As a result, this technique could be a useful clinical tool for assessing tear film surface quality in the future.
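The polar-transform block-statistics metric described above might be sketched as follows. The synthetic ring image, sampling grid and block statistic (per-row standard deviation along the angular direction) are illustrative assumptions rather than the thesis's exact implementation.

```python
# Sketch: resample an annular Placido-like pattern into polar coordinates
# (rings become quasi-straight lines), then summarise local disturbance with a
# simple block statistic. Image, grid and statistic are illustrative.

import numpy as np
from scipy.ndimage import map_coordinates

def to_polar(img, center, n_r=64, n_theta=180, r_max=90):
    """Resample an image to polar coordinates around `center` (row, col)."""
    cy, cx = center
    r = np.linspace(1, r_max, n_r)
    th = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    R, T = np.meshgrid(r, th, indexing="ij")
    coords = np.array([cy + R * np.sin(T), cx + R * np.cos(T)])
    return map_coordinates(img, coords, order=1)   # rows = radii, cols = angles

def block_metric(polar, block=16):
    """Mean per-row std within blocks: low for clean rings, high when disturbed."""
    h, w = polar.shape
    blocks = [polar[i:i + block, j:j + block]
              for i in range(0, h - block + 1, block)
              for j in range(0, w - block + 1, block)]
    return float(np.mean([b.std(axis=1).mean() for b in blocks]))

# Synthetic Placido-like image: clean concentric rings and a disturbed copy.
yy, xx = np.mgrid[0:200, 0:200]
rings = 0.5 * (1 + np.sin(np.hypot(yy - 100, xx - 100)))
noisy = rings + 0.3 * np.random.default_rng(1).standard_normal(rings.shape)

clean_q = block_metric(to_polar(rings, (100, 100)))
bad_q = block_metric(to_polar(noisy, (100, 100)))
print(f"TFSQ metric, clean: {clean_q:.3f}  disturbed: {bad_q:.3f}")
```

After the transform, each row of the polar image follows one ring at a fixed radius, so a clean tear film yields near-constant rows and a small statistic, while surface irregularities raise it.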