323 results for PHOTODETACHMENT THRESHOLD
Abstract:
This dissertation is primarily an applied statistical modelling investigation, motivated by a case study comprising real data and real questions. Theoretical questions on modelling and computation of normalization constants arose from pursuit of these data analytic questions. The essence of the thesis can be described as follows. Consider binary data observed on a two-dimensional lattice. A common problem with such data is the ambiguity of recorded zeroes: these may represent a zero response given some threshold (presence), or that the threshold has not been triggered (absence). Suppose that the researcher wishes to estimate the effects of covariates on the binary responses, whilst taking into account underlying spatial variation, which is itself of some interest. This situation arises in many contexts, and the dingo, cypress and toad case studies described in the motivation chapter are examples of this. Two main approaches to modelling and inference are investigated in this thesis. The first is frequentist and based on generalized linear models, with spatial variation modelled by using a block structure or by smoothing the residuals spatially. The EM algorithm can be used to obtain point estimates, coupled with bootstrapping or asymptotic MLE estimates for standard errors. The second approach is Bayesian and based on a three- or four-tier hierarchical model, comprising a logistic regression with covariates for the data layer, a binary Markov random field (MRF) for the underlying spatial process, and suitable priors for the parameters in these main models. The three-parameter autologistic model is a particular MRF of interest. Markov chain Monte Carlo (MCMC) methods comprising hybrid Metropolis/Gibbs samplers are suitable for computation in this situation. Model performance can be gauged by MCMC diagnostics. Model choice can be assessed by incorporating another tier in the modelling hierarchy. This requires evaluation of a normalization constant, a notoriously difficult problem. The difficulty of estimating the normalization constant for the MRF can be overcome by using a path integral approach, although this is a highly computationally intensive method. Different methods of estimating ratios of normalization constants (NCs) are investigated, including importance sampling Monte Carlo (ISMC), dependent Monte Carlo based on MCMC simulations (MCMC), and reverse logistic regression (RLR). I develop an idea present, though not fully developed, in the literature, and propose the integrated mean canonical statistic (IMCS) method for estimating log NC ratios for binary MRFs. The IMCS method falls within the framework of the newly identified path sampling methods of Gelman & Meng (1998) and outperforms ISMC, MCMC and RLR. It also does not rely on simplifying assumptions, such as ignoring spatio-temporal dependence in the process. A thorough investigation is made of the application of IMCS to the three-parameter autologistic model. This work introduces background computations required for the full implementation of the four-tier model in Chapter 7. Two different extensions of the three-tier model to a four-tier version are investigated. The first extension incorporates temporal dependence in the underlying spatio-temporal process. The second extension allows the successes and failures in the data layer to depend on time. The MCMC computational method is extended to incorporate the extra layer.
A major contribution of the thesis is the development of a fully Bayesian approach to inference for these hierarchical models for the first time.
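The path-sampling idea described above, integrating the mean canonical statistic of an MRF along a path between parameter values, can be sketched in a few lines of Python. The minimal example below uses a one-parameter autologistic (Ising-like) model on a small lattice with a plain Gibbs sampler; the thesis works with a three-parameter autologistic model and more elaborate MCMC, so the lattice size, parameter grid and run lengths here are illustrative assumptions only.

import numpy as np

# Sketch only: path-sampling estimate of a log normalization-constant (NC) ratio for a
# one-parameter autologistic model p(z) proportional to exp(beta * S(z)), where S(z) is
# the number of (1,1) nearest-neighbour pairs. Uses the identity
# log c(beta1) - log c(beta0) = integral over beta of E_beta[S(Z)].
rng = np.random.default_rng(0)

def neighbour_sum(z, i, j):
    # Sum of the four nearest neighbours of site (i, j), with free boundaries.
    n, m = z.shape
    s = 0.0
    if i > 0:
        s += z[i - 1, j]
    if i < n - 1:
        s += z[i + 1, j]
    if j > 0:
        s += z[i, j - 1]
    if j < m - 1:
        s += z[i, j + 1]
    return s

def gibbs_sweep(z, beta):
    # One full Gibbs sweep; the conditional log-odds of z[i, j] = 1 is beta * neighbour sum.
    n, m = z.shape
    for i in range(n):
        for j in range(m):
            p = 1.0 / (1.0 + np.exp(-beta * neighbour_sum(z, i, j)))
            z[i, j] = 1.0 if rng.random() < p else 0.0
    return z

def canonical_stat(z):
    # Number of (1,1) nearest-neighbour pairs: the canonical statistic for beta.
    return float(np.sum(z[:-1, :] * z[1:, :]) + np.sum(z[:, :-1] * z[:, 1:]))

def mean_canonical_stat(beta, shape=(20, 20), burn=100, draws=100):
    # Monte Carlo estimate of E_beta[S(Z)] from Gibbs draws (illustrative run lengths).
    z = (rng.random(shape) < 0.5).astype(float)
    for _ in range(burn):
        gibbs_sweep(z, beta)
    return float(np.mean([canonical_stat(gibbs_sweep(z, beta)) for _ in range(draws)]))

def log_nc_ratio(beta0, beta1, n_grid=11):
    # Trapezoidal quadrature of the mean canonical statistic along the path beta0 -> beta1.
    grid = np.linspace(beta0, beta1, n_grid)
    means = np.array([mean_canonical_stat(b) for b in grid])
    return float(np.sum(0.5 * (means[1:] + means[:-1]) * np.diff(grid)))

print("estimated log NC ratio:", log_nc_ratio(0.0, 0.4))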
Abstract:
Greenhouse gas emissions from a well-established, unfertilized tropical grass-legume pasture were monitored over two consecutive years using high-resolution automatic sampling. Nitrous oxide emissions were highest during the summer months and were highly episodic, related more to the size and distribution of rain events than to water-filled pore space (WFPS) alone. Mean annual emissions were significantly higher during 2008 (5.7 ± 1.0 g N2O-N/ha/day) than 2007 (3.9 ± 0.4 g N2O-N/ha/day), despite nearly 500 mm less rain. Mean CO2 emission (28.2 ± 1.5 kg CO2-C/ha/day) was not significantly different (P < 0.01) between measurement years, emissions being highly dependent on temperature. A negative correlation between CO2 and WFPS at >70% indicated a threshold for soil conditions favouring denitrification. The use of automatic chambers for high-resolution greenhouse gas sampling can greatly reduce emission estimation errors associated with temperature and WFPS changes.
Abstract:
This study was designed to derive central and peripheral oxygen transmissibility (Dk/t) thresholds for soft contact lenses to avoid hypoxia-induced corneal swelling (increased corneal thickness) during open eye wear. Central and peripheral corneal thicknesses were measured in a masked and randomized fashion for the left eye of each of seven subjects before and after 3 h of afternoon wear of five conventional hydrogel and silicone hydrogel contact lens types offering a range of Dk/t from 2.4 units to 115.3 units. Curve fitting for plots of change in corneal thickness versus central and peripheral Dk/t found threshold values of 19.8 and 32.6 units to avoid corneal swelling during open eye contact lens wear for a typical wearer. Although some conventional hydrogel soft lenses are able to achieve this criterion for either central or peripheral lens areas (depending on lens power), in general, no conventional hydrogel soft lenses meet both the central and peripheral thresholds. Silicone hydrogel contact lenses typically meet both the central and peripheral thresholds and use of these lenses therefore avoids swelling in all regions of the cornea. © 2009 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater 92B: 361–365, 2010
Abstract:
Purpose / Design/methodology/approach: The acknowledgement of state significance in relation to development projects can result in special treatment by regulatory authorities, particularly in terms of environmental compliance and certain economic and other government support measures. However, defining just what constitutes a “significant project”, or a project of “state significance”, varies considerably between Australian states. In terms of establishing threshold levels, in Queensland there is even less clarity. Despite this lack of definition, the implications of “state significance” can nevertheless be considerable. For example, in Queensland if the Coordinator-General declares a project to be a “significant project” under the State Development and Public Works Organisation Act 1971, the environmental impact assessment process may become more streamlined, potentially circumventing certain provisions under the Integrated Planning Act 1997. If the project is not large enough to be so deemed, an extractive resource under the State Planning Policy 2/07 - Protection of Extractive Resources 2007 may be considered to be of State or regional significance and subsequently designated as a “Key Resource Area”. As a consequence, such a project is afforded some measure of resource protection but remains subject to the normal assessment process under the Integrated Development Assessment System, as well as the usual requirements of the vegetation management codes and other regulations. Findings (Originality/value) / Research limitations/implications: This paper explores the various meanings of “state significance” in Queensland and the ramifications for development projects in that state. It argues for a streamlining of the assessment process in order to avoid or minimise constraints acting on the state’s development. In so doing, it questions the existence of a strategic threat to the delivery of an already over-stretched infrastructure program.
Abstract:
Epoxy nanocomposites with multiwall carbon nanotube (MWCNT) filler up to 0.3 wt% were prepared by shear mixing, and good dispersion of the MWCNTs in the epoxy was successfully achieved. The electrical behaviour was characterized by measurements of the alternating current (ac) and direct current (dc) conductivities at room temperature. Typical percolation behaviour was observed at a low percolation threshold of 0.055%. Frequency-independent ac conductivity was observed at low frequencies but not at high frequencies. An equivalent circuit model was used to predict the impedance response in these nanocomposites.
Abstract:
Background: Early and persistent exposure to socioeconomic disadvantage impairs children’s health and wellbeing. However, it is unclear at what age health inequalities emerge or whether these relationships vary across ages and outcomes. We address these issues using cross-sectional Australian population data on the physical and developmental health of children at ages 0-1, 2-3, 4-5 and 6-7 years. Methods: 10 physical and developmental health outcomes were assessed in 2004 and 2006 for two cohorts each comprising around 5000 children. Socioeconomic position was measured as a composite of parental education, occupation and household income. Results: Lower socioeconomic position was associated with increased odds for poor outcomes. For physical health outcomes and socio-emotional competence, associations were similar across age groups and were consistent with either threshold effects (for poor general health, special healthcare needs and socio-emotional competence) or gradient effects (for illness with wheeze, sleep problems and injury). For socio-emotional difficulties, communication, vocabulary and emergent literacy, stronger socioeconomic associations were observed. The patterns were linear or accelerated and varied across ages. Conclusions: From very early childhood, social disadvantage was associated with poorer outcomes across most measures of physical and developmental health and showed no evidence of either strengthening or attenuating at older compared to younger ages. Findings confirm the importance of early childhood as a key focus for health promotion and prevention efforts.
Abstract:
It is possible to estimate the depth of focus (DOF) of the eye directly from wavefront measurements using various retinal image quality metrics (IQMs). In such methods, DOF is defined as the range of defocus error that degrades the retinal image quality calculated from the IQM to a certain level of its maximum value. Although different retinal image quality metrics are used, two arbitrary threshold levels have currently been adopted: 50% and 80%. There has been limited study of the relationship between these threshold levels and the actual measured DOF. We measured the subjective DOF in a group of 17 normal subjects, and used the through-focus augmented visual Strehl ratio based on the optical transfer function (VSOTF), derived from their wavefront aberrations, as the IQM. For each subject, a VSOTF threshold level was derived that would match the subjectively measured DOF. Significant correlation was found between the subject's estimated threshold level and the higher-order aberration (HOA) RMS (Pearson's r = 0.88, p < 0.001). This linear correlation can be used to estimate the threshold level for each individual subject, subsequently leading to a method for estimating an individual's DOF from a single measurement of their wavefront aberrations.
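As a rough illustration of the threshold-based DOF definition used in this abstract, the sketch below computes DOF as the defocus range over which a through-focus image-quality curve stays above a chosen fraction of its peak. The Gaussian curve and the 50% level in the example are placeholders, not the study's VSOTF data or its per-subject threshold levels.

import numpy as np

# Sketch only: DOF as the range of defocus over which an image-quality metric (IQM)
# stays above threshold_fraction * max(metric).
def depth_of_focus(defocus_d, metric, threshold_fraction):
    cutoff = threshold_fraction * np.max(metric)
    above = defocus_d[metric >= cutoff]
    return float(above.max() - above.min()) if above.size else 0.0

# Illustrative through-focus curve (a Gaussian in defocus), not real VSOTF data.
defocus = np.linspace(-2.0, 2.0, 401)   # defocus in dioptres
curve = np.exp(-(defocus / 0.6) ** 2)   # stand-in for the through-focus IQM
print(depth_of_focus(defocus, curve, 0.5))   # DOF at the conventional 50% level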
Abstract:
In 2008 the Australian government decided to remove white blood cells from all blood products. This policy of universal leucodepletion was a change to the existing policy of supplying leucodepleted products to high-risk patients only. The decision was made without strong information about the cost-effectiveness of universal leucodepletion. The aims of this policy analysis are to generate cost-effectiveness data about universal leucodepletion, and to add to our understanding of the role of evidence and the political reality of healthcare decision-making in Australia. The cost-effectiveness analysis revealed that universal leucodepletion costs $398,943 to save one year of life. This exceeds the normal maximum threshold for Australia. We discuss this result within the context of how policy decisions are made about blood, and how it relates to the theory and process of policy making. We conclude that the absence of a strong voice for cost-effectiveness was an important omission in this decision.
Abstract:
To date, biodegradable networks and particularly their kinetic chain lengths have been characterized by analysis of their degradation products in solution. We characterize the network itself by NMR analysis in the solvent-swollen state under magic angle spinning conditions. The networks were prepared by photoinitiated cross-linking of poly(dl-lactide)-dimethacrylate macromers (5 kg/mol) in the presence of an unreactive diluent. Using diffusion filtering and 2D correlation spectroscopy techniques, all network components are identified. By quantification of network-bound photoinitiator fragments, an average kinetic chain length of 9 ± 2 methacrylate units is determined. The PDLLA macromer solution was also used with a dye to prepare computer-designed structures by stereolithography. For these network structures, the average kinetic chain length is 24 ± 4 methacrylate units. In all cases the calculated molecular weights of the polymethacrylate chains after degradation are at most 8.8 kg/mol, which is far below the threshold for renal clearance. Upon incubation in phosphate buffered saline at 37 °C, the networks show a mass loss profile over time similar to that of linear high-molecular-weight PDLLA (HMW PDLLA). The mechanical properties are preserved longer for the PDLLA networks than for HMW PDLLA: the initial tensile strength of 47 ± 2 MPa does not decrease significantly for the first 15 weeks, while HMW PDLLA lost 85 ± 5% of its strength within 5 weeks. The physical properties, kinetic chain length, and degradation profile of these photo-cross-linked PDLLA networks make them well suited for orthopedic applications and use in (bone) tissue engineering.
Abstract:
A total of 214 rainwater samples from 82 tanks were collected in urban Southeast Queensland (SEQ), Australia, and analysed for zoonotic bacterial and protozoan pathogens using real-time binary PCR and quantitative PCR (qPCR). Quantitative Microbial Risk Assessment (QMRA) analysis was used to quantify the risk of infection associated with exposure to potential pathogens from potable and non-potable uses of roof-harvested rainwater. Of the 214 samples tested, 10.7%, 9.8%, 5.6%, and 0.4% were positive for the Salmonella invA, Giardia lamblia β-giardin, Legionella pneumophila mip, and Campylobacter jejuni mapA genes, respectively. Cryptosporidium parvum could not be detected. The estimated numbers of viable Salmonella spp., G. lamblia β-giardin, and L. pneumophila genes ranged from 1.6 × 10¹ to 9.5 × 10¹ cells, 1.4 × 10⁻¹ to 9.0 × 10⁻¹ cysts, and 1.5 × 10¹ to 4.3 × 10¹ per 1000 mL of water, respectively. Six risk scenarios were considered from exposure to Salmonella spp., G. lamblia and L. pneumophila. For Salmonella spp. and G. lamblia, these scenarios were: (1) liquid ingestion due to drinking of rainwater on a daily basis, (2) accidental liquid ingestion due to garden hosing twice a week, (3) aerosol ingestion due to showering on a daily basis, and (4) aerosol ingestion due to hosing twice a week. For L. pneumophila, these scenarios were: (5) aerosol inhalation due to showering on a daily basis, and (6) aerosol inhalation due to hosing twice a week. The risk of infection from Salmonella spp., G. lamblia, and L. pneumophila associated with the use of rainwater for showering and garden hosing was calculated to be well below the threshold value of one extra infection per 10,000 persons per year in urban SEQ. However, the risk of infection from ingesting Salmonella spp. and G. lamblia via drinking exceeds this threshold value, and indicates that if undisinfected rainwater were ingested by drinking, then the gastrointestinal diseases salmonellosis and giardiasis would be expected to range from 5.0 × 10⁰ to 2.8 × 10¹ (salmonellosis) and 1.0 × 10¹ to 6.4 × 10¹ (giardiasis) cases per 10,000 persons per year, respectively. Since this health risk seems higher than that expected from the reported incidences of gastroenteritis, the assumptions used to estimate these infection risks are critically examined. Nonetheless, it would seem prudent to disinfect rainwater for potable use.
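The QMRA logic summarised above can be sketched as a dose-response calculation compared against the one-extra-infection-per-10,000-persons-per-year benchmark. The exponential dose-response form, the parameter r, and the exposure values in the example are generic placeholders; the models and inputs used in the study may differ.

import math

# Sketch only: single-exposure infection probability from an exponential dose-response
# model, scaled up to an annual probability assuming independent exposure events.
def per_event_infection_prob(concentration_per_L, volume_L, r):
    dose = concentration_per_L * volume_L
    return 1.0 - math.exp(-r * dose)       # P(infection) = 1 - exp(-r * dose)

def annual_infection_prob(p_per_event, events_per_year):
    return 1.0 - (1.0 - p_per_event) ** events_per_year

# Placeholder numbers (not from the study): hosing twice a week, ~1 mL accidental ingestion.
p_event = per_event_infection_prob(concentration_per_L=0.02, volume_L=0.001, r=0.05)
p_year = annual_infection_prob(p_event, events_per_year=104)
print(p_year, "versus the 1e-4 per-person-per-year benchmark")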
Abstract:
Background: A bundled approach to central venous catheter care is currently being promoted as an effective way of preventing catheter-related bloodstream infection (CR-BSI). Consumables used in the bundled approach are relatively inexpensive, which may lead to the conclusion that the bundle is cost-effective. However, this fails to consider the nontrivial costs of the monitoring and education activities required to implement the bundle, or that alternative strategies are available to prevent CR-BSI. We evaluated the cost-effectiveness of a bundle to prevent CR-BSI in Australian intensive care patients. Methods and Findings: A Markov decision model was used to evaluate the cost-effectiveness of the bundle relative to remaining with current practice (a non-bundled approach to catheter care and uncoated catheters), or use of antimicrobial catheters. We assumed the bundle reduced the relative risk of CR-BSI to 0.34. Given uncertainty about the cost of the bundle, threshold analyses were used to determine the maximum cost at which the bundle remained cost-effective relative to the other approaches to infection control. Sensitivity analyses explored how this threshold alters under different assumptions about the economic value placed on bed-days and health benefits gained by preventing infection. If clinicians are prepared to use antimicrobial catheters, the bundle is cost-effective if national 18-month implementation costs are below $1.1 million. If antimicrobial catheters are not an option, the bundle must cost less than $4.3 million. If decision makers are only interested in obtaining cash savings for the unit, and place no economic value on either the bed-days or the health benefits gained through preventing infection, these cost thresholds are reduced by two-thirds. Conclusions: A catheter care bundle has the potential to be cost-effective in the Australian intensive care setting. Rather than anticipating cash savings from this intervention, decision makers must be prepared to invest resources in infection control to see efficiency improvements.
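A threshold analysis of the kind described can be reduced to a simple rearrangement: given the health benefit gained, the other incremental costs or savings, and a willingness-to-pay per unit of benefit, the largest implementation cost at which the intervention stays cost-effective follows directly. All figures in the example call are hypothetical, not values from the study.

# Sketch only: maximum implementation cost at which an intervention remains cost-effective.
# Solves (implementation_cost + other_incremental_cost) / incremental_benefit <= WTP
# for the largest admissible implementation_cost.
def max_implementation_cost(incremental_benefit, other_incremental_cost, willingness_to_pay):
    return willingness_to_pay * incremental_benefit - other_incremental_cost

# Hypothetical example: 100 units of health benefit, $500,000 in net savings elsewhere
# (entered as a negative cost), willingness-to-pay of $50,000 per unit of benefit.
print(max_implementation_cost(100.0, -500_000.0, 50_000.0))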
Abstract:
Aim: Australian residential aged care does not have a system of quality assessment related to clinical outcomes, or comprehensive quality benchmarking. The Residential Care Quality Assessment was developed to fill this gap, and this paper discusses the process by which preliminary benchmarks representing high and low quality were developed for it. Methods: Data were collected from all residents (n = 498) of nine facilities. Numerator-denominator analysis of clinical outcomes occurred at the facility level, with rank-ordered results circulated to an expert panel. The panel identified threshold scores to indicate excellent and questionable care quality, and refined these through a Delphi process. Results: Clinical outcomes varied both within and between facilities; agreed thresholds for excellent and poor outcomes were finalised after three Delphi rounds. Conclusion: Use of the Residential Care Quality Assessment provides a concrete means of monitoring care quality and allows benchmarking across facilities; its regular use could contribute to improved care outcomes within residential aged care in Australia.
Abstract:
This is the second part of a paper that explores a range of magico-religious experiences, such as immaterial voices and visions, in terms of local cultural, moral and socio-political circumstances in an Aboriginal town in rural Queensland. This part of the paper explores the political and cultural symbolism and meaning of suicide. It charts the saliency of suicide amongst two groups of kin and cohorts, and the social meaningfulness and problematic of the voices and visions in relation to suicide, to identity and family forms, and to funerals and a heavy-drinking lifestyle. I argue that voices and visions are used to reinterpret social experience and to establish meaning, and that, tragically, suicide evokes connectivity rather than anomie and here cannot be understood merely as an individualistic act or evidence of individual pathology. Rather, it is about transformation and crossing a threshold to join an enduring domain of Aboriginality. In this life world, where family is the highest social value and where a relational view of persons holds sway, the individualistic practice of psychiatric and other helping professions is a considerable problem.
Abstract:
Background: A number of studies have examined the relationship between high ambient temperature and mortality. Recently, concern has arisen about whether this relationship is modified by socio-demographic factors. However, data for this type of study are relatively scarce in subtropical/tropical regions where people are well accustomed to warm temperatures. Objective: To investigate whether the relationship between daily mean temperature and daily all-cause mortality is modified by age, gender and socio-economic status (SES) in Brisbane, Australia. Methods: We obtained daily mean temperature and all-cause mortality data for Brisbane, Australia during 1996–2004. A generalised additive model was fitted to assess the percentage increase in all deaths with every one-degree increment above the threshold temperature. Different age, gender and SES groups were included in the model as categorical variables and their modification effects were estimated separately. Results: A total of 53,316 non-external deaths were included during the study period. There was a clear increasing trend in the harmful effect of high temperature on mortality with age. The effect estimate among women was more than 20 times that among men. We did not find an SES effect on the percent increase associated with temperature. Conclusions: The effects of high temperature on all deaths were modified by age and gender but not by SES in Brisbane, Australia.
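The "percentage increase in all deaths with every one-degree increment above the threshold temperature" is typically read off a log-linear model term. The sketch below shows the usual conversion; the hockey-stick form, the threshold and the coefficient are illustrative assumptions, not the fitted model from the study.

import numpy as np

# Sketch only: heat effects are often modelled through the excess of temperature over a
# threshold, with deaths on the log scale, so a coefficient beta translates into a
# percentage increase per degree of (exp(beta) - 1) * 100.
def excess_temperature(temp_c, threshold_c):
    return np.maximum(0.0, temp_c - threshold_c)

def percent_increase_per_degree(beta):
    return (np.exp(beta) - 1.0) * 100.0

print(excess_temperature(np.array([24.0, 28.0, 31.0]), 27.0))   # term is zero below the threshold
print(percent_increase_per_degree(0.03))   # e.g. beta = 0.03 implies roughly a 3% increase per degree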