968 results for informative counting


Relevance:

10.00%

Publisher:

Abstract:

President’s Report Hello fellow AITPM members, A few weeks ago we saw another example of all levels of Government pulling together in real time to try to deal with a major transport incident: this time it was container loads of ammonium nitrate falling off the Pacific Adventurer during Cyclone Hamish, and the associated major oil spill due to the piercing of its hull, off Moreton Bay in southern Queensland. The oil spill was extensive, affecting beaches and estuaries from Moreton Island north to the Sunshine Coast, a coastal stretch of at least 60 km. We saw the Queensland Government, Brisbane, Moreton Bay and Sunshine Coast Regional Council crews deployed quickly, once the gravity of the situation was realised, to clean up toxic oil on beaches and prevent extensive upstream contamination. Environmental agencies, public and private, were quick to respond to help affected wildlife. The Navy’s HMAS Yarra and another minesweeper were deployed to search for the containers in the coastal area, in an effort to have them salvaged before the ammonium nitrate could leach into and harm the marine habitat, which would have had a substantial impact not only on that environment but also on the fishing industry. (All of this occurred during the final fortnight before a State election.) While this could be branded as a maritime problem, the road transport and logistics system was crucial to the cleanup. Private vehicular ferries were enlisted to transport plant and equipment from Brisbane to Moreton Island. The plant itself, such as graders, was drawn from road building and maintenance inventory. Hundreds of Councils’ staff were released from other activities to undertake the cleanup. While it will take some time for us to know the long-term impacts of this incident, it seems difficult to fault the “grassroots” government crews and their private counterparts, such as Island tourism staff, in the initial cleanup effort. From a traffic planning and management perspective, we should also remember that this sort of incident has happened on road and rail corridors in the past, albeit on lesser scales. It underlines that we do need to continue to protect communities, commercial interests, and the environment through rigorous heavy vehicle management, planning and management of dangerous goods routes (including rail corridors through urban areas), and carefully considered incident and disaster recovery plans and protocols. I’d like to close by reminding everyone again that AITPM’s flagship event, the 2009 AITPM National Conference, Traffic Beyond Tomorrow, is being held in Adelaide from 5 to 7 August. SA Branch President Paul Morris informs me that we have had over 50 paper submissions to date, from which a very balanced and informative programme of sessions has been prepared. www.aitpm.com has all of the details about how to register, sponsor a booth or session, and so on. Best regards all, Jon Bunker

Relevance:

10.00%

Publisher:

Abstract:

Understanding users' capabilities, needs and expectations is key to the domain of Inclusive Design. Much of the work in the field could be informed and further strengthened by clear, valid and representative data covering the full range of people's capabilities. This article reviews existing data sets and identifies the challenges inherent in measuring capability in a manner that is informative for work in Inclusive Design. The need for a design-relevant capability data set is identified and consideration is given to a variety of capability construct operationalisation issues including questions associated with self-report and performance measures, sampling and the appropriate granularity of measures. The need for further experimental work is identified and a programme of research designed to culminate in the design of a valid and reliable capability survey is described.

Relevance:

10.00%

Publisher:

Abstract:

This paper describes an initiative in the Faculty of Health at the Queensland University of Technology, Australia, where a short writing task was introduced to first year undergraduates in four courses: Public Health, Nursing, Social Work and Human Services, and Human Movement Studies. Over 1,000 students were involved in the trial. The task was assessed using an adaptation of the MASUS Procedure (Measuring the Academic Skills of University Students) (Webb & Bonanno, 1994). Feedback to the students, including MASUS scores, then enabled them to be directed to developmental workshops targeting their academic literacy needs. Students who achieved below the benchmark score were required to attend academic writing workshops in order to obtain the same summative 10% that was awarded to those who had achieved above the benchmark score. The trial was very informative in terms of determining task appropriateness and timing, student feedback, student use of support, and student perceptions of the task and follow-up workshops. What we learned from the trial will be presented with a view to further refinement of this initiative.

Relevance:

10.00%

Publisher:

Abstract:

Concern regarding the health effects of indoor air quality has grown in recent years, due to the increased prevalence of many diseases, as well as the fact that many people now spend most of their time indoors. While numerous studies have reported on the dynamics of aerosols indoors, the dynamics of bioaerosols in indoor environments are still poorly understood and very few studies have focused on fungal spore dynamics in indoor environments. Consequently, this work investigated the dynamics of fungal spores in indoor air, including fungal spore release and deposition, as well as investigating the mechanisms involved in the fungal spore fragmentation process. In relation to the investigation of fungal spore dynamics, it was found that the deposition rates of the bioaerosols (fungal propagules) were in the same range as the deposition rates of nonbiological particles and that they were a function of their aerodynamic diameters. It was also found that fungal particle deposition rates increased with increasing ventilation rates. These results (which are reported for the first time) are important for developing an understanding of the dynamics of fungal spores in the air. In relation to the process of fungal spore fragmentation, important information was generated concerning the airborne dynamics of the spores, as well as the part/s of the fungi which undergo fragmentation. The results obtained from these investigations into the dynamics of fungal propagules in indoor air significantly advance knowledge about the fate of fungal propagules in indoor air, as well as their deposition in the respiratory tract. The need to develop an advanced, real-time method for monitoring bioaerosols has become increasingly important in recent years, particularly as a result of the increased threat from biological weapons and bioterrorism. However, to date, the Ultraviolet Aerodynamic Particle Sizer (UVAPS, Model 3312, TSI, St Paul, MN) is the only commercially available instrument capable of monitoring and measuring viable airborne micro-organisms in real-time. Therefore (for the first time), this work also investigated the ability of the UVAPS to measure and characterise fungal spores in indoor air. The UVAPS was found to be sufficiently sensitive for detecting and measuring fungal propagules. Based on fungal spore size distributions, together with fluorescent percentages and intensities, it was also found to be capable of discriminating between two fungal spore species, under controlled laboratory conditions. In the field, however, it would not be possible to use the UVAPS to differentiate between different fungal spore species because the different micro-organisms present in the air may not only vary in age, but may have also been subjected to different environmental conditions. In addition, while the real-time UVAPS was found to be a good tool for the investigation of fungal particles under controlled conditions, it was not found to be selective for bioaerosols only (as per design specifications). In conclusion, the UVAPS is not recommended for use in the direct measurement of airborne viable bioaerosols in the field, including fungal particles, and further investigations into the nature of the micro-organisms, the UVAPS itself and/or its use in conjunction with other conventional biosamplers, are necessary in order to obtain more realistic results. 
Overall, the results obtained from this work on airborne fungal particle dynamics will contribute towards improving the detection capabilities of the UVAPS, so that it is capable of selectively monitoring and measuring bioaerosols, for which it was originally designed. This work will assist in finding and/or improving other technologies capable of the real-time monitoring of bioaerosols. The knowledge obtained from this work will also be of benefit in various other bioaerosol applications, such as understanding the transport of bioaerosols indoors.
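Deposition rates of the kind discussed above are conventionally estimated from indoor concentration decay curves using a well-mixed box model. The short sketch below illustrates that calculation; the function name and the numbers are invented for illustration and are not drawn from the thesis.

```python
import numpy as np

def deposition_rate(times_h, conc, air_exchange_per_h):
    """Estimate a particle deposition loss-rate coefficient k (1/h)
    from an indoor concentration decay curve, using the well-mixed
    box model dC/dt = -(a + k) * C: the slope of ln(C) versus time
    gives the total loss rate (a + k); subtracting the measured air
    exchange rate a leaves the deposition rate k."""
    slope, _intercept = np.polyfit(times_h, np.log(conc), 1)
    return -slope - air_exchange_per_h

# Synthetic decay with air exchange a = 1.0 1/h and deposition k = 0.4 1/h
t = np.linspace(0, 3, 30)                   # hours
c = 1000 * np.exp(-1.4 * t)                 # concentration, arbitrary units
print(round(deposition_rate(t, c, air_exchange_per_h=1.0), 2))  # ~0.4
```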

Relevance:

10.00%

Publisher:

Abstract:

There is considerable evidence that working memory impairment is a common feature of schizophrenia. The present study assessed working memory and executive function in 54 participants with schizophrenia, and a group of 54 normal controls matched to the patients on age, gender and estimated premorbid IQ, using traditional and newer measures of executive function and two dual tasks—Telephone Search with Counting and the Memory Span and Tracking Task. Results indicated that participants with schizophrenia were significantly impaired on all standardised measures of executive function with the exception of a composite measure of the Trail Making Test. Results for the dual task measures demonstrated that while the participants with schizophrenia were unimpaired on immediate digit span recall over a 2-min period, they recalled fewer digit strings and performed more poorly on a tracking task (box-crossing task) compared with controls. In addition, participants with schizophrenia performed more poorly on the tracking task when they were required to simultaneously recall digit strings than when they performed this task alone. Contrary to expectation, results of the telephone search task under dual conditions were not significantly different between groups. These results may reflect the insufficient complexity of the tone-counting task as an interference task. Overall, the present study showed that participants with schizophrenia appear to have a restricted impairment of their working memory system that is evident in tasks in which the visuospatial sketchpad slave system requires central executive control.

Relevance:

10.00%

Publisher:

Abstract:

Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers attend to only certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis will present a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions by their visual importance. Efficiency gains are therefore reaped, without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal: firstly, the design of an appropriate region-based model of visual importance; and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray-tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, this concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains reaped from using this method of progressive rendering. This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency purposes, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
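The core resource-allocation step of such an importance-based renderer can be sketched briefly. The following is a hypothetical illustration, not the thesis's fuzzy-logic model: it assumes per-region importance scores have already been computed from spatial feature differences, and simply distributes a fixed ray budget in proportion to them (the function name and the uniform floor parameter are invented).

```python
import numpy as np

def allocate_samples(importance, total_samples, floor_frac=0.1):
    """Distribute a progressive ray-tracing sample budget over image
    regions in proportion to their visual importance.

    importance    : 2-D array of per-region importance scores (>= 0)
    total_samples : overall sample budget for this refinement pass
    floor_frac    : fraction of the budget spread uniformly, so that
                    low-importance regions are still refined a little
    """
    n_regions = importance.size
    uniform = floor_frac * total_samples / n_regions
    weights = importance.ravel() / importance.sum()
    samples = uniform + (1.0 - floor_frac) * total_samples * weights
    return np.floor(samples).astype(int).reshape(importance.shape)

# Example: a 4x4 grid of regions; important regions get more rays.
importance = np.random.default_rng(0).random((4, 4))
print(allocate_samples(importance, total_samples=100_000))
```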

Relevance:

10.00%

Publisher:

Abstract:

The recently proposed data-driven background dataset refinement technique provides a means of selecting an informative background for support vector machine (SVM)-based speaker verification systems. This paper investigates the characteristics of the impostor examples in such highly informative background datasets. Data-driven dataset refinement individually evaluates the suitability of candidate impostor examples for the SVM background prior to selecting the highest-ranking examples as a refined background dataset. Further, the characteristics of the refined dataset were analysed to investigate the desired traits of an informative SVM background. The most informative examples of the refined dataset were found to consist of large amounts of active speech and distinctive language characteristics. The data-driven refinement technique was shown to filter the set of candidate impostor examples to produce a more dispersed representation of the impostor population in the SVM kernel space, thereby reducing the number of redundant and less informative examples in the background dataset. Furthermore, data-driven refinement was shown to provide performance gains when applied to the difficult task of refining a small candidate dataset that was mismatched to the evaluation conditions.
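The refinement loop itself reduces to ranking and truncation once a per-example suitability score is defined. The sketch below shows only that skeleton; the paper's actual suitability criterion is not reproduced here, and the function name and data shapes are illustrative.

```python
import numpy as np

def refine_background(candidates, suitability, keep=500):
    """Data-driven background refinement skeleton: rank candidate
    impostor examples by a per-example suitability score and keep
    the highest-ranking ones as the refined SVM background.

    candidates  : (n, d) array, one supervector per impostor example
    suitability : length-n array scoring each candidate's usefulness
                  (the scoring function is system-specific)
    """
    order = np.argsort(suitability)[::-1]   # most suitable first
    return candidates[order[:keep]]

# Example with random stand-in data:
rng = np.random.default_rng(0)
cands = rng.normal(size=(5000, 512))        # candidate supervectors
scores = rng.random(5000)                   # stand-in suitability scores
print(refine_background(cands, scores).shape)   # (500, 512)
```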

Relevance:

10.00%

Publisher:

Abstract:

Background: A complete explanation of the mechanisms by which Pb2+ exerts toxic effects on the developing central nervous system remains unknown. Glutamate is critical to the developing brain, acting through various subtypes of ionotropic or metabotropic glutamate receptors (mGluRs). Ionotropic N-methyl-D-aspartate receptors have been considered a principal target in lead-induced neurotoxicity, and the relationship between mGluR3/mGluR7 and synaptic plasticity has been verified by many recent studies. The present study aimed to examine the role of mGluR3/mGluR7 in lead-induced neurotoxicity. Methods: Twenty-four adult female rats were randomly assigned to a control condition or to 0.2% lead acetate during gestation and lactation. Blood lead and hippocampal lead levels of pups were analyzed at weaning to evaluate the actual lead content at the end of the exposure. Impairments of the pups' short-term and long-term memory were assessed using the Morris water maze, and hippocampal ultrastructural alterations were detected by electron microscopy. The impact of lead exposure on mGluR3 and mGluR7 mRNA expression in hippocampal tissue of pups was investigated by quantitative real-time polymerase chain reaction, and its potential role in lead neurotoxicity is discussed. Results: Lead levels of blood and hippocampi in the lead-exposed rats were significantly higher than those in the controls (P < 0.001). In the Morris water maze tests, controls had shorter goal latencies and swimming distances than lead-exposed rats (P = 0.001 and P < 0.001 by repeated-measures analysis of variance). Neuronal ultrastructural alterations were observed on transmission electron microscopy, and real-time polymerase chain reaction showed that exposure to 0.2% lead acetate did not substantially change gene expression of mGluR3 and mGluR7 mRNA compared with controls. Conclusion: Exposure to lead before and after birth can damage the short-term and long-term memory of young rats and the hippocampal ultrastructure. However, the current study does not provide evidence that the expression of rat hippocampal mGluR3 and mGluR7 can be altered by systemic administration of lead during gestation and lactation, a finding that is informative for the field of lead-induced developmental neurotoxicity in suggesting that it may not be worthwhile to include mGluR3 and mGluR7 in future studies.

Relevance:

10.00%

Publisher:

Abstract:

Campylobacter jejuni, followed by Campylobacter coli, contribute substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large-scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high-resolution C. jejuni and C. coli genotyping schemes that are convenient for high-throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphisms) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP + binary typing, alone or in combination, is effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enables data comparisons with other MLST-based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates. Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long-term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) based curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short-term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step, closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated with the epidemiological background of the isolates.
In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive hierarchical resolving assays using nucleic acids) approach for genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method. Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) “Minimum SNPs” and ii) the new ’HRMtype’ software packages. Species-specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The clonal complexes (CCs) assigned to each isolate by ‘Minim typing’ and by SNP + binary typing were used to compare the two MLST interrogation schemes, and the CCs linked with each C. jejuni isolate were consistent for both methods. Thus, ‘Minim typing’ is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to meet the resolution of, sequence-based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform, amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high-throughput genotyping of these common food-borne pathogens.
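The in silico selection of “Nucleating SNPs” hinges on scoring how well a candidate SNP set resolves the isolates in the MLST database; resolving power is commonly quantified with Simpson's index of diversity. The greedy sketch below illustrates that idea only; it is not the published “Minimum SNPs” or ’HRMtype’ software, and the data are toy values.

```python
import numpy as np

def simpsons_index(assignments):
    """Simpson's index of diversity D for a partition of isolates:
    the probability that two isolates drawn at random (without
    replacement) belong to different groups."""
    _, counts = np.unique(assignments, return_counts=True)
    n = counts.sum()
    return 1.0 - (counts * (counts - 1)).sum() / (n * (n - 1))

def greedy_informative_snps(snp_matrix, k):
    """Greedily choose k SNP columns whose combined genotype profiles
    give the highest Simpson's D over the isolates (rows)."""
    chosen = []
    for _ in range(k):
        best_j, best_d = None, -1.0
        for j in range(snp_matrix.shape[1]):
            if j in chosen:
                continue
            profiles = [tuple(row) for row in snp_matrix[:, chosen + [j]]]
            codes = {p: i for i, p in enumerate(set(profiles))}
            d = simpsons_index(np.array([codes[p] for p in profiles]))
            if d > best_d:
                best_j, best_d = j, d
        chosen.append(best_j)
    return chosen

# Toy example: six isolates typed at five biallelic SNP positions.
snps = np.array([[0, 1, 0, 0, 1],
                 [0, 1, 1, 0, 1],
                 [1, 0, 0, 1, 0],
                 [1, 0, 1, 1, 0],
                 [0, 0, 0, 0, 1],
                 [1, 1, 1, 0, 0]])
print(greedy_informative_snps(snps, k=2))
```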

Relevance:

10.00%

Publisher:

Abstract:

This paper is a report of a study to explore what constitutes nurse-patient interactions and to ascertain patients' perceptions of these interactions. BACKGROUND: Nurses maintain patient integrity through caring practices. When patients feel disempowered or that their integrity is threatened they are more likely to make a complaint. When nurses develop a meaningful relationship with patients they recognize and address their concerns. It is increasingly identified in the literature that bureaucratic demands, including increased workloads and reduced staffing levels, result in situations where the development of a 'close' relationship is limited. METHOD: Data collection took two forms: twelve 4-hour observation periods of nurse-patient interactions in one cubicle (of four patients) in a medical and a surgical ward concurrently over a 4-week period; and questionnaires from inpatients of the two wards who were discharged during the 4-week data collection period in 2005. FINDINGS: Observation data showed that nurse-patient interactions were mostly friendly and informative. Opportunities to develop closeness were limited. Patients were mostly satisfied with interactions. The major source of dissatisfaction was when patients perceived that nurses were not readily available to respond to specific requests. Comparison of the observation and survey data indicated that patients still felt 'cared for' even when practices did not culminate in a 'connected' relationship. CONCLUSION: The findings suggest that patients believe that caring is demonstrated when nurses respond to specific requests. Patient satisfaction with the service is more likely to be improved if nurses can readily adapt their work to accommodate patients' requests or, alternatively, communicate why these requests cannot be immediately addressed.

Relevance:

10.00%

Publisher:

Abstract:

Expert knowledge is valuable in many modelling endeavours, particularly where data are not extensive or sufficiently robust. In Bayesian statistics, expert opinion may be formulated as informative priors, to provide an honest reflection of the current state of knowledge, before updating this with new information. Technology is increasingly being exploited to help support the process of eliciting such information. This paper reviews the benefits that have been gained from utilizing technology in this way. These benefits can be structured within a six-step elicitation design framework proposed recently (Low Choy et al., 2009). We assume that the purpose of elicitation is to formulate a Bayesian statistical prior, either to provide a standalone expert-defined model, or for updating new data within a Bayesian analysis. We also assume that the model has been pre-specified before selecting the software. In this case, technology has the most to offer in targeting what experts know (E2), eliciting and encoding expert opinions (E4), enhancing accuracy (E5), and providing an effective and efficient protocol (E6). Benefits include:
- providing an environment with familiar nuances (to make the expert comfortable) where experts can explore their knowledge from various perspectives (E2);
- automating tedious or repetitive tasks, thereby minimizing calculation errors, as well as encouraging interaction between elicitors and experts (E5);
- cognitive gains from educating users, enabling instant feedback (E2, E4-E5), and providing alternative methods of communicating assessments and feedback information, since experts think and learn differently; and
- ensuring a repeatable and transparent protocol is used (E6).
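As a concrete illustration of encoding an expert opinion (step E4) as an informative prior, the sketch below fits a Beta prior to two elicited quantiles for a proportion and then performs a conjugate update with new data. It is a generic example rather than any specific elicitation tool, and the pairing of the two shape parameters (treating the expert's central value as the prior mean) is a deliberate simplification.

```python
from scipy import stats
from scipy.optimize import brentq

def beta_prior_from_quantiles(central, q90):
    """Fit a Beta(a, b) prior so that its mean equals the expert's
    central value (a simplification: median ~ mean) and its 90th
    percentile matches the expert's 'almost surely below' value."""
    def gap(a):
        b = a * (1 - central) / central      # mean a/(a+b) = central
        return stats.beta.ppf(0.9, a, b) - q90
    a = brentq(gap, 0.5, 500)                # solve for the 90th percentile
    return a, a * (1 - central) / central

# Expert: "most likely around 0.2, almost surely below 0.4"
a, b = beta_prior_from_quantiles(0.2, 0.4)
# Conjugate Bayesian update with new data: 7 successes in 40 trials
posterior = stats.beta(a + 7, b + 33)
print(round(posterior.mean(), 3))
```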

Relevance:

10.00%

Publisher:

Abstract:

Safety interventions (e.g., median barriers, photo enforcement) and road features (e.g., median type and width) can influence crash severity, crash frequency, or both. Both dimensions—crash frequency and crash severity—are needed to obtain a full accounting of road safety. Extensive literature and common sense both dictate that crashes are not created equal, with fatalities costing society more than 1,000 times the cost of property damage crashes on average. Despite this glaring disparity, the profession has not unanimously embraced or successfully defended a nonarbitrary severity weighting approach for analyzing safety data and conducting safety analyses. It is argued here that the two dimensions (frequency and severity) are made available by intelligently and reliably weighting crash frequencies and converting all crashes to property-damage-only crash equivalents (PDOEs) by using comprehensive societal unit crash costs. This approach is analogous to calculating axle load equivalents in the prediction of pavement damage: for instance, a 40,000-lb truck causes 4,025 times more stress than does a 4,000-lb car and so simply counting axles is not sufficient. Calculating PDOEs using unit crash costs is the most defensible and nonarbitrary weighting scheme, allows for the simple incorporation of severity and frequency, and leads to crash models that are sensitive to factors that affect crash severity. Moreover, using PDOEs diminishes the errors introduced by underreporting of less severe crashes—an added benefit of the PDOE analysis approach. The method is illustrated with rural road segment data from South Korea (which in practice would develop PDOEs with Korean crash cost data).
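The PDOE conversion itself is a simple cost-ratio weighting, as the sketch below shows. The unit costs are hypothetical placeholders; an actual analysis would substitute jurisdiction-specific comprehensive crash costs (such as the Korean figures used in the paper).

```python
# Hypothetical comprehensive unit crash costs (per crash, by severity);
# real analyses substitute jurisdiction-specific figures.
UNIT_COST = {
    "fatal": 4_000_000,
    "injury": 100_000,
    "pdo": 4_000,        # property damage only
}

def pdoe(crash_counts):
    """Convert severity-specific crash counts into property-damage-only
    equivalents by weighting each severity with its unit cost relative
    to the cost of a PDO crash."""
    return sum(count * UNIT_COST[severity] / UNIT_COST["pdo"]
               for severity, count in crash_counts.items())

# A segment with 1 fatal, 5 injury and 20 PDO crashes:
print(pdoe({"fatal": 1, "injury": 5, "pdo": 20}))  # 1000 + 125 + 20 = 1145.0
```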

Relevance:

10.00%

Publisher:

Abstract:

Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade, many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted-for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted-for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption, and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, with exploration of additional dispersion functions, the use of an independent data set, and an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov Chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in the mean structure, while the remainder included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criterion (DIC) statistics. The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences on expected crash counts are likely to be different from factors that might help to explain unaccounted-for variation in crashes across sites.
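The paper's Gibbs-sampling estimation is not reproduced here, but the central idea of letting the dispersion parameter vary with covariates can be sketched with a maximum-likelihood stand-in: a negative binomial likelihood in which the mean and the dispersion are each log-linear in (possibly different) covariates. All variable names and simulated values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def nb_negloglik(params, X, Z, y):
    """Negative log-likelihood of a negative binomial crash-count model
    with mean mu = exp(X @ beta) and a covariate-dependent dispersion
    parameter alpha = exp(Z @ gamma), so Var(y) = mu + alpha * mu**2."""
    k = X.shape[1]
    beta, gamma = params[:k], params[k:]
    mu = np.exp(X @ beta)
    r = 1.0 / np.exp(Z @ gamma)             # NB "size" parameter, 1/alpha
    ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
          + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))
    return -ll.sum()

# Simulated data: traffic flow drives the mean; a geometric factor
# drives the dispersion.
rng = np.random.default_rng(1)
n = 500
flow = rng.uniform(1, 10, n)
geom = rng.uniform(0, 1, n)
X = np.column_stack([np.ones(n), np.log(flow)])
Z = np.column_stack([np.ones(n), geom])
mu = np.exp(0.2 + 0.8 * np.log(flow))
alpha = np.exp(-1.0 + 1.5 * geom)
lam = rng.gamma(shape=1.0 / alpha, scale=alpha * mu)  # gamma-Poisson mixture
y = rng.poisson(lam)
fit = minimize(nb_negloglik, np.zeros(4), args=(X, Z, y), method="BFGS")
print(fit.x)   # estimates of (beta0, beta1, gamma0, gamma1)
```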

Relevance:

10.00%

Publisher:

Abstract:

SUMMARY: 1. The factors affecting the informative function of financial statements in family firms. 2. The function, objectives and informational expectations of family firms' external reporting. 3. The features of “familism” in financial statement presentations. 4. Towards a new financial reporting model for family firms: critical reflections and directions for future research.

Relevance:

10.00%

Publisher:

Abstract:

The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices are storing data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that the progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify these slightly for higher density storage. Alternatively, three dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high speed memory. There are two ways in which data may be recorded in a three dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates, due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but with few commercial products presently available. Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two dimensional pages of data are recorded into a photorefractive crystal, as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam, and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). Firstly, the experimental methods for storing the two dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask containing a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing.
Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is thermal fixing, in which the photorefractive medium is heated to temperatures above 150°C; this process forms an ionic grating in the medium. This ionic grating is insensitive to the readout beam and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to a situation where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered. The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the time at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could find application in image scrambling or cryptography for optical information storage. A two dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process.
To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that the recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with stripes of smaller widths. As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm, for accurate and reliable pattern storage.
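The oscillation-counting temperature measurement described earlier admits a very short implementation. The sketch below assumes the temperature change per intensity oscillation has already been calibrated from the crystal's thickness, its thermo-optic birefringence coefficient and the probe wavelength; the function name and numbers are illustrative.

```python
import numpy as np
from scipy.signal import find_peaks

def temperature_change(intensity, dT_per_cycle):
    """Deduce the temperature change of a birefringent crystal from the
    transmitted-intensity trace recorded while its temperature drifts.
    Each complete intensity oscillation corresponds to a fixed,
    pre-calibrated temperature step dT_per_cycle."""
    peaks, _ = find_peaks(intensity)
    complete_cycles = max(len(peaks) - 1, 0)   # cycles between maxima
    return complete_cycles * dT_per_cycle

# Synthetic trace with 8 intensity maxima (7 complete cycles) and a
# calibrated step of 1.2 degrees C per cycle:
t = np.linspace(0.0, 1.0, 2000)
trace = 0.5 * (1.0 - np.cos(2.0 * np.pi * 8.0 * t))
print(temperature_change(trace, dT_per_cycle=1.2))   # 7 * 1.2 = 8.4
```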