909 results for SHORT-CONTACT TIMES
Abstract:
A crucial process of chlamydial development is the differentiation of the replicative reticulate body (RB) into the infectious elementary body (EB). We present experimental evidence supporting a contact-dependent hypothesis for the trigger of this differentiation. We recorded live imaging of Chlamydia trachomatis-infected McCoy cells at key times during development and tracked the temporospatial trajectories of individual chlamydial particles. We found that particle movement is related to development. Early to mid-developmental stages involved slight wobbling of RBs. The average speed of particles increased sharply at 24 h postinfection (after the estimated onset of RB-to-EB differentiation). We also investigated a penicillin-supplemented culture containing EBs, RBs, and aberrantly enlarged, stressed chlamydiae. The near-immobility of the enlarged particles is consistent with their continued tethering to the chlamydial inclusion membrane (CIM). We found a significantly negative, nonlinear association between the speed and the size/type of particles, providing further support for the hypothesis that particles become untethered near the onset of RB-to-EB differentiation. This study establishes the relationship between the motion properties of the chlamydiae and developmental stages, whereby wobbling RBs gradually lose contact with the CIM, and RB detachment from the CIM coincides with the onset of late differentiation.
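The speed analysis described above can be illustrated with a minimal sketch: given tracked particle positions over time, compute each particle's mean frame-to-frame speed so that near-immobile ("wobbling") and mobile particles can be compared. This is not the authors' image-analysis pipeline; the sample format (time in seconds, x and y in micrometres) and the example coordinates are assumptions for illustration only.

```python
# Minimal sketch (not the study's pipeline): mean frame-to-frame speed of a
# tracked particle from (time_s, x_um, y_um) samples. Units are assumed.
import math

def mean_speed(track):
    """Mean frame-to-frame speed (um/s) for one trajectory,
    given (time_s, x_um, y_um) samples in temporal order."""
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    return sum(speeds) / len(speeds) if speeds else 0.0

# Example: a nearly immobile ("wobbling") particle versus a mobile one.
wobbling = [(0, 1.00, 1.00), (1, 1.02, 0.99), (2, 1.01, 1.01)]
mobile = [(0, 1.00, 1.00), (1, 1.60, 1.40), (2, 2.30, 1.90)]
print(mean_speed(wobbling), mean_speed(mobile))
```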
Abstract:
In the 1930s and 1940s, Australian women writers published novels, poems, and short stories that pushed the boundaries of their national literary culture. From their position in the Pacific, they entered into a dialogue with a European modernism that they reworked to invigorate their own writing and to make cross-continental connections. My interest in the work of Australian women prose writers of this period stems from an appreciation of the extent of their engagement with interwar modernism (an engagement that is generally under-acknowledged) and the realization that there are commonalities of approach with the ways in which contemporaneous Chinese authors negotiated this transnational cultural traffic. China and Australia, it has been argued, share an imaginative and literal association of many centuries, and this psychic history produces a situation in which ‘Australians feel drawn towards China: they cannot leave it alone.’1 Equally, Chinese exploration of the great southern land began in the fifteenth century, prior to European contact. In recent times, the intensity of Australia’s cultural and commercial connections with Asia has led to a repositioning of the Australian sense of regionalism in general and, in particular, has activated yet another stage in the history of its relationship with China. In this context, the association of Australian and Chinese writing is instructive because the commonalities of approach and areas of interest between certain authors indicate that Australian writers were not alone in either the content or style of their response to European modernism. This recognition, in turn, advances discussions of modernism in Australia and reveals an alternative way of looking at the world from the Pacific Rim through literature. The intent is to examine selected Australian and Chinese authors who are part of this continuous history and whose writing demonstrates common thematic and stylistic features via the vector of modernism. I focus on the 1930s and 1940s because these are the decades in which Australia and China experienced wide-ranging conflict in the Pacific, and it is significant that war, both forthcoming and actual, features as an ominous soundtrack in the writing of Chinese and Australian women. I argue that, given the immensity of cultural difference between Australia and China, there is an especially interesting juncture in the ways in which the authors interrogate modernist practices and the challenge of modernism. The process in which writing from the Pacific Rim jointly negotiates the twin desires of engaging with European literary form and representing one’s own culture may be seen as what Jessica Berman identifies as a geomodernism, one of the ‘new possible geographies’ of modernism.2 My discussion centres on the work of the Australian women, to which the Chinese material serves as a point of reference, albeit a critical one. The Chinese writing examined here is restricted to authors who wrote at least some material in English and whose work is available in translation.
Abstract:
Presbyopia affects individuals from the age of 45 years onwards, resulting in difficulty in accurately focusing on near objects. There are many optical corrections available, including spectacles and contact lenses, that are designed to enable presbyopes to see clearly at both far and near distances. However, presbyopic vision corrections also disturb aspects of visual function under certain circumstances. The impact of these changes on activities of daily living, such as driving, is, however, poorly understood. Therefore, the aim of this study was to determine which aspects of driving performance might be affected by wearing different types of presbyopic vision corrections. In order to achieve this aim, three experiments were undertaken. The first experiment involved administration of a questionnaire to compare the subjective driving difficulties experienced when wearing a range of common presbyopic contact lens and spectacle corrections. The questionnaire was developed and piloted, and included a series of items regarding difficulties experienced while driving under day and night-time conditions. Two hundred and fifty-five presbyopic patients responded to the questionnaire and were categorised into five groups: those wearing no vision correction for driving (n = 50), bifocal spectacles (BIF, n = 54), progressive addition lens spectacles (PAL, n = 50), monovision (MV, n = 53) and multifocal contact lenses (MTF CL, n = 48). Overall, ratings of satisfaction during daytime driving were relatively high for all correction types. However, MV and MTF CL wearers were significantly less satisfied with aspects of their vision during night-time than daytime driving, particularly with regard to disturbances from glare and haloes. Progressive addition lens wearers noticed more distortion of peripheral vision, BIF wearers reported more difficulties with tasks requiring changes in focus, and those who wore no vision correction for driving reported problems with intermediate and near tasks. Overall, the mean level of satisfaction for daytime driving was quite high for all of the groups (over 80%), with BIF wearers being the least satisfied with their vision for driving. Conversely, at night, MTF CL wearers expressed the least satisfaction. Eye and head movements have attracted increasing interest in driving research because they provide a means of understanding how the driver responds to visual stimuli in traffic. Previous studies have found that wearing PAL can affect eye and head movement performance, resulting in slower eye movement velocities and longer times to stabilize gaze for fixation. These changes in eye and head movement patterns may have implications for driving safety, given that the visual tasks for driving include a range of dynamic search tasks. Therefore, the second study was designed to investigate the influence of different presbyopic corrections on driving-related eye and head movements under standardized laboratory-based conditions. Twenty presbyopes (mean age: 56.1 ± 5.7 years) who had no experience of wearing presbyopic vision corrections, apart from single vision reading spectacles, were recruited. Each participant wore five different types of vision correction: single vision distance lenses (SV), PAL, BIF, MV and MTF CL.
For each visual condition, participants were required to view videotape recordings of traffic scenes, track a reference vehicle and identify a series of peripherally presented targets while their eye and head movements were recorded using the faceLAB® eye and head tracking system. Digital numerical display panels were also included as near visual stimuli (simulating the visual displays of a vehicle speedometer and radio). The results demonstrated that the path length of eye movements while viewing and responding to driving-related traffic scenes was significantly longer when wearing BIF and PAL than MV and MTF CL. The path length of head movements was greater with SV, BIF and PAL than with MV and MTF CL. Target recognition was less accurate when the near stimulus was located at eccentricities inferiorly and to the left, rather than directly below the primary position of gaze, regardless of vision correction type. The third experiment aimed to investigate the real-world driving performance of presbyopes wearing different vision corrections, measured on a closed-road circuit at night. Eye movements were recorded using the ASL Mobile Eye eye-tracking system (as the faceLAB® system proved to be impractical for use outside of the laboratory). Eleven participants (mean age: 57.25 ± 5.78 years) were fitted with four types of prescribed vision corrections (SV, PAL, MV and MTF CL). The measures of driving performance on the closed-road circuit included distance to sign recognition, near target recognition, peripheral light-emitting-diode (LED) recognition, low contrast road hazard recognition and avoidance, recognition of all the road signs, time to complete the course, and driving behaviours such as braking, accelerating and cornering. The results demonstrated that driving performance at night was most affected by MTF CL compared with PAL, resulting in shorter distances to read signs, slower driving speeds, and longer times spent fixating road signs. Monovision resulted in worse performance on the distance to read signs task compared with SV and PAL. The SV condition resulted in significantly more errors in interpreting information from in-vehicle devices, despite participants spending more time fixating on these devices. Progressive addition lenses were ranked as the most preferred vision correction, while MTF CL were the least preferred vision correction for night-time driving. This thesis addressed the research question of how presbyopic vision corrections affect driving performance, and the results of the three experiments demonstrated that the different types of presbyopic vision corrections (e.g. BIF, PAL, MV and MTF CL) can affect driving performance in different ways. Distance-related driving tasks showed reduced performance with MV and MTF CL, while tasks that involved viewing in-vehicle devices were significantly hampered by wearing SV corrections. Wearing spectacles such as SV, BIF and PAL induced greater eye and head movements in the simulated driving condition; however, this did not directly translate to impaired performance on the closed-road circuit tasks. These findings are important for understanding the influence of presbyopic vision corrections on vision under real-world driving conditions.
They will also assist eye care practitioners to understand and convey to patients the potential driving difficulties associated with wearing certain types of presbyopic vision corrections, and to support the process of matching patients with optical corrections that meet their visual needs.
Abstract:
Purpose: The aim was to construct and advise on the use of a cost-per-wear model based on contact lens replacement frequency, to form an equitable basis for cost comparison. Methods: The annual cost of professional fees, contact lenses and solutions when wearing daily, two-weekly and monthly replacement contact lenses is determined in the context of the Australian market for spherical, toric and multifocal prescription types. This annual cost is divided by the number of times lenses are worn per year, resulting in a ‘cost-per-wear’. The model is presented graphically as the cost-per-wear versus the number of times lenses are worn each week for daily replacement and reusable (two-weekly and monthly replacement) lenses. Results: The cost-per-wear for two-weekly and monthly replacement spherical lenses is almost identical but decreases with increasing frequency of wear. The cost-per-wear of daily replacement spherical lenses is lower than for reusable spherical lenses when worn from one to four days per week but higher when worn six or seven days per week. The point at which the cost-per-wear is virtually the same for all three spherical lens replacement frequencies (approximately AUD$3.00) is five days of lens wear per week. A similar but upwardly displaced (higher cost) pattern is observed for toric lenses, with the cross-over point occurring between three and four days of wear per week (AUD$4.80). Multifocal lenses have the highest price, with cross-over points for daily versus two-weekly replacement lenses at between four and five days of wear per week (AUD$5.00) and for daily versus monthly replacement lenses at three days per week (AUD$5.50). Conclusions: This cost-per-wear model can be used to assist practitioners and patients in making an informed decision in relation to the cost of contact lens wear as one of many considerations that must be taken into account when deciding on the most suitable lens replacement modality.
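As a rough illustration of the arithmetic behind the model described above, the sketch below divides annual cost by the number of wears per year for a daily-replacement lens (whose lens cost scales with wearing frequency) and a reusable lens (whose lens and solution costs are fixed per year). All dollar figures are invented placeholders, not the Australian market prices used in the study, so only the cross-over pattern, not the values, should be read from the output.

```python
# Illustrative cost-per-wear sketch; prices are placeholders (AUD), not the
# study's figures. Cost per wear = annual cost / wears per year.
WEEKS_PER_YEAR = 52

def cost_per_wear(annual_fixed_cost, cost_per_wearing_day, wears_per_week):
    """Cost per wear for a given wearing frequency."""
    wears_per_year = wears_per_week * WEEKS_PER_YEAR
    annual_cost = annual_fixed_cost + cost_per_wearing_day * wears_per_year
    return annual_cost / wears_per_year

# Daily replacement: professional fees are fixed; each wearing day consumes
# one pair of lenses, so lens cost scales with wearing frequency.
# Reusable (monthly) replacement: fees, an annual lens supply and solutions
# are all fixed regardless of how often the lenses are worn.
for days in range(1, 8):
    daily_cpw = cost_per_wear(annual_fixed_cost=150,
                              cost_per_wearing_day=2.0, wears_per_week=days)
    reusable_cpw = cost_per_wear(annual_fixed_cost=150 + 300 + 120,
                                 cost_per_wearing_day=0.0, wears_per_week=days)
    print(f"{days} d/week: daily ${daily_cpw:.2f}, monthly ${reusable_cpw:.2f}")
```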
Abstract:
Purpose: The aim was to determine world-wide patterns of fitting contact lenses for the correction of presbyopia. Methods: Up to 1,000 survey forms were sent to contact lens fitters in each of 38 countries between January and March every year over five consecutive years (2005 to 2009). Practitioners were asked to record data relating to the first 10 contact lens fittings or refittings performed after receiving the survey form. Results: Data were received relating to 16,680 presbyopic (age 45 years or older) and 84,202 pre-presbyopic (15 to 44 years) contact lens wearers. Females are over-represented in presbyopic versus pre-presbyopic groups, possibly reflecting a stronger desire for the cosmetic benefits of contact lenses among older women. The extent to which multifocal and monovision lenses are prescribed for presbyopes varies considerably among nations, ranging from 79 per cent of all soft lenses in Portugal to zero in Singapore. There appears to be significant under-prescribing of contact lenses for the correction of presbyopia, although for those who do receive such corrections, three times more multifocal lenses are fitted compared with monovision fittings. Presbyopic corrections are most frequently prescribed for full-time wear and monthly replacement. Conclusions: Despite apparent improvements in multifocal design and an increase in available multifocal options in recent years, practitioners are still under-prescribing with respect to the provision of appropriate contact lenses for the correction of presbyopia. Training of contact lens practitioners in presbyopic contact lens fitting should be accelerated and clinical and laboratory research in this field should be intensified to enhance the prospects of meeting the needs of presbyopic contact lens wearers more fully.
Abstract:
Objectives: To measure tear film surface quality (TFSQ) using dynamic high-speed videokeratoscopy during short-term (8 hours) use of rigid and soft contact lenses. Methods: A group of fourteen subjects wore three different types of contact lenses on three different non-consecutive days (order randomized) in one eye only. Subjects were screened to exclude those with dry eye. The lenses included a hard PMMA lens, an RGP lens (Boston XO) and a soft silicone hydrogel lens. Three 30-second high-speed videokeratoscopy recordings were taken with the contact lenses in situ, in the morning and again after 8 hours of contact lens wear, in both normal and suppressed blinking conditions. Recordings were also made on a baseline day with no contact lens wear. Results: The presence of a contact lens in the eye had a significant effect on the mean TFSQ in both natural and suppressed blinking conditions (p=0.001 and p=0.01 respectively, repeated measures ANOVA). TFSQ was worse with all the lenses compared to no lens in the eye in the afternoon, during both normal and suppressed blinking conditions (all p<0.05). In natural blinking conditions, the mean TFSQ for the PMMA and RGP lenses was significantly worse than on the baseline day (no lens) for both morning and afternoon measures (p<0.05). Conclusions: This study shows that both rigid and soft contact lenses adversely affect TFSQ in both natural and suppressed blinking conditions. No significant differences were found between the lens types and materials. Keywords: Tear film surface quality, rigid contact lens, soft contact lens, dynamic high-speed videokeratoscopy
Time dependency of molecular rate estimates and systematic overestimation of recent divergence times
Abstract:
Studies of molecular evolutionary rates have yielded a wide range of rate estimates for various genes and taxa. Recent studies based on population-level and pedigree data have produced remarkably high estimates of mutation rate, which strongly contrast with substitution rates inferred in phylogenetic (species-level) studies. Using Bayesian analysis with a relaxed-clock model, we estimated rates for three groups of mitochondrial data: avian protein-coding genes, primate protein-coding genes, and primate d-loop sequences. In all three cases, we found a measurable transition between the high, short-term (<1–2 Myr) mutation rate and the low, long-term substitution rate. The relationship between the age of the calibration and the rate of change can be described by a vertically translated exponential decay curve, which may be used for correcting molecular date estimates. The phylogenetic substitution rates in mitochondria are approximately 0.5% per million years for avian protein-coding sequences and 1.5% per million years for primate protein-coding and d-loop sequences. Further analyses showed that purifying selection offers the most convincing explanation for the observed relationship between the estimated rate and the depth of the calibration. We rule out the possibility that it is a spurious result arising from sequence errors, and find it unlikely that the apparent decline in rates over time is caused by mutational saturation. Using a rate curve estimated from the d-loop data, several dates for last common ancestors were calculated: modern humans and Neandertals (354 ka; 222–705 ka), Neandertals (108 ka; 70–156 ka), and modern humans (76 ka; 47–110 ka). If the rate curve for a particular taxonomic group can be accurately estimated, it can be a useful tool for correcting divergence date estimates by taking the rate decay into account. Our results show that it is invalid to extrapolate molecular rates of change across different evolutionary timescales, which has important consequences for studies of populations, domestication, conservation genetics, and human evolution.
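The rate curve described above can be sketched as follows. The code assumes the general functional form of a vertically translated exponential decay, r(t) = k + a·exp(−b·t), where k is the long-term (phylogenetic) substitution rate and k + a the short-term (pedigree/population) rate, together with a simple distance = r(t)·t relation for recovering a corrected age. The parameter values and the correction procedure are illustrative assumptions, not the study's fitted curve or dating method.

```python
# Sketch of a vertically translated exponential decay rate curve and a
# toy date correction; parameters are assumed, not the fitted values.
import math

def rate(t_myr, k=0.015, a=0.10, b=3.0):
    """Apparent rate (substitutions/site/Myr) for a calibration of age t_myr."""
    return k + a * math.exp(-b * t_myr)

def corrected_age(distance, lo=1e-6, hi=50.0, iters=100):
    """Solve distance = rate(t) * t for t by bisection (distance per site)."""
    f = lambda t: rate(t) * t - distance
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# A small distance interpreted with the long-term rate alone suggests a much
# older date than the rate-curve-corrected estimate for recent divergences.
d = 0.003
print("constant-rate age (Myr):", d / 0.015)
print("rate-curve age (Myr):   ", corrected_age(d))
```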
Abstract:
Background: The genus Rattus is highly speciose and has a complex taxonomy that is not fully resolved. As shown previously, there are two major groups within the genus, an Asian and an Australo-Papuan group. This study focuses on the Australo-Papuan group and particularly on the Australian rats. There are uncertainties regarding the number of species within the group and the relationships among them. We analysed 16 mitochondrial genomes, including seven novel genomes from six species, to help elucidate the evolutionary history of the Australian rats. We also demonstrate, from a larger dataset, the usefulness of short regions of the mitochondrial genome in identifying these rats at the species level. Results: Analyses of 16 mitochondrial genomes representing species sampled from the Australo-Papuan and Asian clades of Rattus indicate divergence of these two groups ~2.7 million years ago (Mya). Subsequent diversification of at least 4 lineages within the Australo-Papuan clade was rapid and occurred over the period from ~0.9–1.7 Mya, a finding that explains the difficulty in resolving some relationships within this clade. Phylogenetic analyses of our 126-taxon, but shorter sequence (1,952 nucleotides long), Rattus database generally give well-supported species clades. Conclusions: Our whole mitochondrial genome analyses are concordant with a taxonomic division that places the native Australian rats into the Rattus fuscipes species group. We suggest the following order of divergence of the Australian species. R. fuscipes is the oldest lineage among the Australian rats and is not part of a New Guinean radiation. R. lutreolus is also within this Australian clade and is shallower than R. tunneyi, while the R. sordidus group is the shallowest lineage in the clade. The divergences within the R. sordidus and R. leucopus lineages, occurring about half a million years ago, support the hypothesis of more recent interchanges of rats between Australia and New Guinea. We report that, while problematic for inference of deeper divergences, the analysis of shorter mitochondrial sequences is very useful for species identification in rats.
Abstract:
Efficient management of domestic wastewater is a primary requirement for human well-being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now considered the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series of reports to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research undertaken was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis being on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation of research brings together significant work undertaken locally and overseas. It focuses mainly on septic tanks, in keeping with the primary objectives of the project. This report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be used widely due to their simplicity and low cost. Generally, the treatment performance of septic tanks can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality. The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantages of multi-chamber over single-chamber septic tanks are an issue that needs to be resolved in view of the conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for treatment of wastewater and the disinfection of effluent prior to disposal is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number of these systems do not perform to stipulated standards and quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. The other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems. These include intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices.
In recent years the use of biofilters has attracted research interest, particularly the use of peat. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies. This is an issue that needs further investigation and, as such, biofilters can still be considered to be in the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferable soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area. This is due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent. It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This essentially is the main issue of concern, due to the unreliability of the effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances, surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. However, despite all this, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern rests. Greywater requires similar consideration. Surface irrigation of greywater is currently being permitted in a number of local authority jurisdictions in Queensland. Considering the fact that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease.
This is an issue of concern, as greywater can be considered a weak to medium-strength sewage: it contains primary pollutants such as BOD material and nutrients and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subjected to stringent guidelines. Under these circumstances, the surface application of any wastewater requires careful consideration. The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions. As such, their applicability is location specific. Also, the design of systems based solely on evapotranspiration is questionable. In order to ensure more reliability, the systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Due to the formation of the clogging mat, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics. Secondly, the mechanisms of clogging mat formation have been found to be influenced by various physical, chemical and biological processes. Biological clogging is the most common process taking place and occurs due to bacterial growth or its by-products reducing the soil pore diameters. Biological clogging is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process. This is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate. This in fact is the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention. Research conclusions with regard to short-duration rest intervals are contradictory.
It has been claimed that the intermittent rest periods would result in aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to completely decompose the clogging mat, and that the intermediate by-products that form as a result of aerobic processes would in fact lead to even more severe clogging. It has been further recommended that the rest periods should be much longer, in the range of about six months. This entails the provision of a second and alternating seepage bed. The other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that would eventuate after clogging mat formation; improved geometry such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. The use of physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface has been shown to be only of short-term benefit. Another issue of importance is the degree of pretreatment that should be provided to the effluent prior to subsurface application and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat. It has also been found that the nature of the suspended solids is an important factor. The finer particles from extended aeration systems, compared with those from septic tanks, will penetrate deeper into the soil and hence ultimately cause a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies. It has also been shown that effluent quality may be a factor in the case of highly permeable soils, but this may not be the case with fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by the surface ponding of effluent or the seepage of contaminants into the groundwater, can be very serious, as failure can lead to environmental and public health impacts. Significant microbial contamination of surface and groundwater has been attributed to septic tank effluent. There are a number of documented instances of septic tank related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A because no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms. The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Therefore, based on subsurface conditions, this essentially entails a maximum allowable concentration of septic tanks in a given area.
Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed or if there is surface runoff of effluent as a result of system failure. Soils have a finite capacity for the removal of phosphorus. Once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. Not only must the system design be based on subsurface conditions, but the density of these systems in a given area is also a critical issue. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site would determine the overall capability classification for that site, which would also dictate the type of effluent disposal method to be adopted.
Abstract:
The 2012 OLT National Teaching Fellowship described in this report mapped and analysed the complex and competing internal and external agencies impacting on whole-of-curriculum design in contemporary higher education in Australia, particularly in degrees in Education with an emphasis on initial teacher education. The Fellowship was conducted at a time of both heightened public and political scrutiny of teacher education and the imposition of new nationally consistent accreditation processes. This scrutiny culminated in a call by the previous Federal Government for TEQSA (Tertiary Education Quality Standards Agency) to conduct a comprehensive review of teacher education beginning in 2014, and in the incoming Government announcing that it would establish a short-term ministerial advisory group to report on the “priority issue of improving teacher quality” (Pyne, 2013).
Abstract:
Objectives To characterize toric contact lens prescribing worldwide. Methods Up to 1,000 survey forms were sent to contact lens fitters in up to 39 countries between January and March every year for 5 consecutive years (2007–2011). Practitioners were asked to record data relating to the first 10 contact lens fits or refits performed after receiving the survey form. Only data for toric and spherical soft lens fits were analyzed. Survey data collected since 1996 were also analyzed for 7 nations to assess toric lens fitting trends since that time. Results Data were collected in relation to 21,150 toric fits (25%) and 62,150 spherical fits (75%). Toric prescribing ranged from 6% of lenses in Russia to 48% in Portugal. Compared with spherical fittings, toric fittings can be characterized as follows: older age (29.8 ± 11.4 years vs. 27.6 ± 10.8 years for spherical lenses); men are overrepresented (38% vs. 34%); greater proportion of new fits (39% vs. 32%); use of silicone hydrogel lenses (49% vs. 39%); and lower proportion of daily disposable lenses (14% vs. 28%). There has been a continuous increase in toric lens prescribing between 1996 and 2011. The proportion of toric lens fits was positively related to the gross domestic product at purchasing power parity per capita for year 2011 (r2 = 0.21; P=0.004). Conclusions At the present time, in the majority of countries surveyed, toric soft contact lens prescribing falls short of that required to correct clinically significant astigmatism (≥0.75 diopters) in all lens wearers.
Abstract:
Contact lenses are a successful and popular means to correct refractive error and are worn by just under 700,000 Australians1 and approximately 125 million people worldwide. The most serious complication of contact lens wear is microbial keratitis, a potentially sight-threatening corneal infection most often caused by bacteria. Gram-negative bacteria, in particular Pseudomonas species, account for the majority of severe bacterial infections. Pathogens such as fungi or amoebae, which feature less often, are associated with significant morbidity. These unusual pathogens have come into the spotlight in recent times with an apparent association with specific lens cleaning solutions...
Abstract:
Importance: Approximately one-third of patients with peripheral artery disease experience intermittent claudication, with consequent loss of quality of life. Objective: To determine the efficacy of ramipril for improving walking ability, patient-perceived walking performance, and quality of life in patients with claudication. Design, Setting, and Patients: Randomized, double-blind, placebo-controlled trial among 212 patients with peripheral artery disease (mean age, 65.5 [SD, 6.2] years), initiated in May 2008, completed in August 2011, and conducted at 3 hospitals in Australia. Intervention: Patients were randomized to receive 10 mg/d of ramipril (n = 106) or matching placebo (n = 106) for 24 weeks. Main Outcome Measures: Maximum and pain-free walking times were recorded during a standard treadmill test. The Walking Impairment Questionnaire (WIQ) and Short-Form 36 Health Survey (SF-36) were used to assess walking ability and quality of life, respectively. Results: At 6 months, relative to placebo, ramipril was associated with a 75-second (95% CI, 60-89 seconds) increase in mean pain-free walking time (P < .001) and a 255-second (95% CI, 215-295 seconds) increase in maximum walking time (P < .001). Relative to placebo, ramipril improved the WIQ median distance score by 13.8 (Hodges-Lehmann 95% CI, 12.2-15.5), speed score by 13.3 (95% CI, 11.9-15.2), and stair climbing score by 25.2 (95% CI, 25.1-29.4) (P < .001 for all). The overall SF-36 median Physical Component Summary score improved by 8.2 (Hodges-Lehmann 95% CI, 3.6-11.4; P = .02) in the ramipril group relative to placebo. Ramipril did not affect the overall SF-36 median Mental Component Summary score. Conclusions and Relevance: Among patients with intermittent claudication, 24-week treatment with ramipril resulted in significant increases in pain-free and maximum treadmill walking times compared with placebo. This was associated with a significant increase in the physical functioning component of the SF-36 score. Trial Registration: clinicaltrials.gov Identifier: NCT00681226
Abstract:
Purpose: Multifocal contact lenses (MCLs) have been available for decades. A review of the literature suggests that while, historically, these lenses have been partially successful, they have struggled to compete with monovision (MV). More recent publications suggest that there has been an improvement in the performance of these lenses. This study set out to investigate whether the apparent improved lens performance reported in the literature is reflected in clinical practice. Methods: Data collected over the last 5 years via the International Contact Lens Prescribing Survey Consortium were reviewed for patients over the age of 45 years. The published reports of clinical trials were reviewed to assess lens performance over the time period. Results: The data review covered 16,680 presbyopic lens fits in 38 countries. Of these, 29% were fitted with MCLs, 8% with MV and 63% with single vision (SV) lenses. A previous survey conducted in Australia during 1988-89 reported that 9% of presbyopes were fitted with MCLs, 29% with MV and 63% with SV. The results from our survey for Australia alone were 28% MCL (13% MV), versus 9% MCL (29% MV) previously, suggesting an increase in the usage of MCLs from 1988-89 to 2010. A review of the literature indicates that the reported visual acuities with MCLs in comparison to MV have remained equivalent over this time period, yet preference has switched from MV to MCLs. Conclusions: There is evidence that currently more MCLs than MV are being fitted to presbyopes, compared with 1988-89. This increased use is likely due to the improved visual performance of these lenses, which is not demonstrated with acuity measures but is reported by wearers, suggesting that patient-based subjective ratings are currently the best way to measure visual performance.
Abstract:
Insulated rail joints are designed in a similar way to butt-jointed steel structural systems, the difference being a purpose-made gap between the main rail members to maintain electrical insulation for the proper functioning of the track circuitry at all times of train operation. When loaded wheels pass the gap, they induce an impact loading, with the corresponding strains in the railhead edges significantly exceeding the plastic limit. This leads to metal flow across the gap, thereby increasing the risk of short-circuiting and impeding the proper functioning of the signalling and broken-rail identification circuitries, of which the joints are a critical part. The performance of insulated rail joints under the passage of wheel loading is complex due to the presence of a number of interacting components and hence is not well understood. This paper presents a dynamic wheel-rail contact-impact modelling method for the determination of the impact loading; a brief description of a field experiment to capture strain signatures for validating the predicted impact loading is also presented. The process and the results of the characterisation of the materials from virgin, in-service and damaged insulated rail joints using the neutron diffraction method are also discussed.
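For readers unfamiliar with how an impact load at a joint gap can be estimated, the sketch below integrates a generic single-degree-of-freedom wheel-on-Hertzian-spring model. This is not the contact-impact model developed in the paper; the wheel mass, static load, Hertzian constant and initial downward velocity are assumed round numbers, included only to show the kind of calculation involved.

```python
# Generic SDOF wheel-drop impact sketch (illustrative only, not the paper's
# wheel-rail contact-impact model). A wheel of unsprung mass M carrying a
# static load W meets the rail through a nonlinear Hertzian contact spring
# F = C_H * delta**1.5; an initial downward velocity V0 represents the small
# drop across the joint gap. Semi-implicit Euler integration tracks the
# contact force and reports its peak as the impact load.
M = 600.0      # unsprung wheel mass (kg), assumed
W = 100e3      # static wheel load (N), assumed
C_H = 1.0e11   # Hertzian contact constant (N/m^1.5), assumed
V0 = 0.05      # extra downward velocity at the gap (m/s), assumed
DT = 1.0e-6    # time step (s)

def peak_impact_force(duration=0.02):
    z, v = 0.0, V0            # rail penetration (m), downward velocity (m/s)
    peak, t = 0.0, 0.0
    while t < duration:
        f_contact = C_H * max(z, 0.0) ** 1.5  # Hertzian force, zero if separated
        a = (W - f_contact) / M               # net downward acceleration
        v += a * DT
        z += v * DT
        peak = max(peak, f_contact)
        t += DT
    return peak

if __name__ == "__main__":
    p = peak_impact_force()
    print(f"static load {W/1e3:.0f} kN, peak contact force {p/1e3:.0f} kN")
```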