323 results for PHOTODETACHMENT THRESHOLD
Abstract:
The hexagonal resonator characteristics of an individual ZnO nanonail's head were investigated via spatially resolved cathodoluminescence (CL) at room temperature. The positions of most of the distinct CL peaks in the visible range matched well those of whispering gallery modes (WGMs) of a hexagonal dielectric cavity once the birefringence and dispersion of the refractive indices were taken into account. The broad and weak peaks for TE polarization in the long-wavelength range were consistent with refractive-index values below the threshold for total internal reflection. CL peaks that did not match WGMs were identified as either triangular quasi-WGM or Fabry–Pérot resonance modes.
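For context, a commonly used plane-wave model for hexagonal WGM cavities (stated here with the usual notation as background, not quoted from this abstract) relates the mode wavelengths to the cavity size:

\[
\lambda_N \;=\; \frac{3\sqrt{3}\,R\,n(\lambda)}{N + \dfrac{6}{\pi}\arctan\!\left(\beta\sqrt{3\,n(\lambda)^2 - 4}\right)},
\]

where R is the hexagon radius, N the interference order, n(λ) the dispersive (and birefringent) refractive index for the given polarization, and β equals n for one polarization and 1/n for the orthogonal one (the assignment depends on the TE/TM convention relative to the c-axis). Light circulating in a hexagonal WGM strikes each facet at 60°, so total internal reflection requires n ≥ 1/sin 60° = 2/√3 ≈ 1.15; index values below this threshold are consistent with broad, weak peaks such as the TE peaks mentioned above.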
Bending and bundling of metal-free vertically aligned ZnO nanowires due to electrostatic interaction
Abstract:
Bending and bundling were observed in vertically aligned arrays of ZnO nanowires with flat (0001) top surfaces, synthesized using a vapor-phase method without metal catalysts. Sufficient evidence was found to exclude electron-beam bombardment during scanning electron microscopy as a cause of the bending and bundling. We attribute the bending and bundling to electrostatic interactions between the charged (0001) polar surfaces, and we also discuss the threshold surface charge densities for bending and bundling based on a simple cantilever-bending model. Some growth features were indicative of electrostatic interactions operating during growth.
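For intuition, a back-of-the-envelope version of such a cantilever estimate might look like the sketch below; the force model (charged top facets treated as point charges) and every numerical value are assumptions for illustration, not the paper's parameters.

```python
import numpy as np

# Illustrative cantilever-bending estimate of the threshold surface charge
# density for nanowire bundling (all parameters assumed, not from the paper).
E = 140e9            # Young's modulus of ZnO along the c-axis, Pa
L = 5e-6             # nanowire length, m (assumed)
d = 100e-9           # nanowire diameter, m (assumed)
s = 500e-9           # spacing between neighbouring wires, m (assumed)
eps0 = 8.854e-12     # vacuum permittivity, F/m

I = np.pi * d**4 / 64        # second moment of area, circular cross-section
A = np.pi * (d / 2)**2       # area of the charged (0001) top facet

# Treat the charged top facets as point charges q = sigma*A a distance s
# apart; two wires bundle when each tip deflects by s/2 under the Coulomb
# attraction. Cantilever tip deflection under tip load F: delta = F L^3/(3 E I).
F_req = 3 * E * I * (s / 2) / L**3                      # force to close the gap
sigma = np.sqrt(F_req * 4 * np.pi * eps0 * s**2) / A    # threshold density
print(f"threshold surface charge density ~ {sigma:.2e} C/m^2")
```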
Abstract:
In automatic facial expression recognition, an increasing number of techniques that exploit the temporal nature of facial expressions has been proposed in the literature. As all facial expressions evolve over time, it is crucially important for a classifier to be capable of modelling their dynamics. We establish that the sparse representation (SR) classifier is a suitable candidate for this purpose, and subsequently propose a framework through which expression dynamics can be efficiently incorporated into its current formulation. We additionally show that for the SR method to be applied effectively, a certain threshold on image dimensionality must be enforced (unlike in face recognition problems). Thirdly, we determine that recognition rates may be significantly influenced by the size of the projection matrix Φ. To demonstrate these points, a battery of experiments was conducted on the CK+ dataset for the recognition of the seven prototypic expressions (anger, contempt, disgust, fear, happiness, sadness and surprise), and comparisons were made between the proposed temporal-SR framework, the static-SR framework and a state-of-the-art support vector machine.
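For readers unfamiliar with SR classification, a minimal sketch of the basic (static) SR decision rule follows; the l1 solver, the random projection standing in for Φ, and all sizes are illustrative assumptions, not the authors' temporal-SR formulation.

```python
import numpy as np
from sklearn.linear_model import Lasso

def src_classify(A, labels, y, alpha=0.01):
    """Sparse-representation classification sketch.

    A      : (d, n) dictionary whose columns are training samples
    labels : (n,) class label of each column
    y      : (d,) test sample
    Returns the label whose class-restricted reconstruction residual is smallest.
    """
    # l1-regularised least squares stands in for the l1-minimisation step
    coef = Lasso(alpha=alpha, max_iter=5000).fit(A, y).coef_
    residuals = {}
    for c in np.unique(labels):
        x_c = np.where(labels == c, coef, 0.0)   # keep only class-c coefficients
        residuals[c] = np.linalg.norm(y - A @ x_c)
    return min(residuals, key=residuals.get)

# toy usage: 2 classes, random projection to 50 dimensions (the role of Phi)
rng = np.random.default_rng(0)
Phi = rng.normal(size=(50, 4096)) / np.sqrt(50)   # projection matrix (assumed)
train = rng.normal(size=(4096, 40)); labels = np.repeat([0, 1], 20)
test = train[:, 3] + 0.05 * rng.normal(size=4096)
print(src_classify(Phi @ train, labels, Phi @ test))
```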
Abstract:
While purporting to enhance Australia's sustainability, the federal government's Population Strategy rejects any assessment of the factors limiting future population growth, thus avoiding urgent threshold issues such as resource depletion and environmental destruction. A more forward-thinking, whole-system perspective would assess critical biophysical limits and incorporate them into governance processes with suitable prioritisation. It would encourage communities to examine their individual and collective responsibilities in the context of these limits in order to optimise outcomes as equitably as possible; and it would employ both a resource-based examination of minimum population requirements and an impact-based assessment of maximum thresholds. This carrying-capacity approach to planning could help guide society towards a more sustainable future.
Abstract:
The impact of climate change on the health of vulnerable groups such as the elderly has been of increasing concern. However, to date there has been no meta-analysis of the literature on the effects of temperature fluctuations on mortality amongst the elderly. We synthesised risk estimates of the overall impact of daily mean temperature on elderly mortality across different continents. A comprehensive literature search was conducted using MEDLINE and PubMed to identify papers published up to December 2010. Selection criteria covering suitable temperature indicators, endpoints, study designs and identification of thresholds were applied. A two-stage Bayesian hierarchical model was used to summarise the percent increase in mortality, with 95% confidence intervals, per 1°C temperature increase (or decrease) on hot (or cold) days, with lagged effects also measured. Fifteen studies met the eligibility criteria, and almost 13 million elderly deaths were included in this meta-analysis. In total, there was a 2-5% increase in all-cause mortality per 1°C increment during hot temperature intervals, and a 1-2% increase per 1°C decrease during cold temperature intervals. Lags of up to 9 days in exposure to cold temperature intervals were substantially associated with all-cause mortality, but no substantial lagged effects were observed for hot intervals. Thus, both hot and cold temperatures substantially increased mortality among the elderly, with the magnitude of heat-related effects appearing larger than that of cold effects in a global context.
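To make the pooling step concrete, here is a minimal frequentist stand-in (DerSimonian-Laird random-effects pooling) for the paper's two-stage Bayesian hierarchical model; the study estimates in the usage line are made up.

```python
import numpy as np

def random_effects_pool(est, se):
    """DerSimonian-Laird random-effects pooling of per-study estimates.

    est : per-study percent increase in mortality per 1 deg C
    se  : corresponding standard errors
    Returns the pooled estimate and a 95% confidence interval.
    """
    est, se = np.asarray(est, float), np.asarray(se, float)
    w = 1.0 / se**2                          # fixed-effect weights
    mu_fe = np.sum(w * est) / np.sum(w)
    Q = np.sum(w * (est - mu_fe)**2)         # Cochran's Q heterogeneity statistic
    df = len(est) - 1
    C = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / C)            # between-study variance
    w_re = 1.0 / (se**2 + tau2)              # random-effects weights
    mu = np.sum(w_re * est) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    return mu, (mu - 1.96 * se_mu, mu + 1.96 * se_mu)

# toy usage: pooled % increase per 1 deg C on hot days from three studies
print(random_effects_pool([2.1, 4.8, 3.0], [0.6, 1.1, 0.8]))
```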
Abstract:
BACKGROUND: The relationship between temperature and mortality has been explored for decades, and many temperature indicators have been applied separately. However, few data are available on how different temperature indicators affect different mortality categories, particularly in a typical subtropical climate. OBJECTIVE: To assess the associations between various temperature indicators and different mortality categories in Brisbane, Australia during 1996-2004. METHODS: We applied two methods to assess the threshold and temperature indicator for each age and death group: in the first, mean temperature and the threshold assessed from all-cause mortality were used for all mortality categories; in the second, the specific temperature indicator and threshold for each mortality category were identified separately by minimisation of the Akaike Information Criterion (AIC). We used a polynomial distributed lag non-linear model to estimate the change in mortality per one degree of temperature increase (or decrease) above (or below) the threshold on current days, as well as the lagged effects, under both methods. RESULTS: The AIC was minimised when mean temperature was used for all non-external deaths and deaths at 75-84 years; when minimum temperature was used for deaths at 0-64 years, 65-74 years and ≥ 85 years, and for deaths from respiratory diseases; and when maximum temperature was used for deaths from cardiovascular diseases. The effect estimates using the category-specific temperature indicators were similar to those using mean temperature, for both current-day and lag effects. CONCLUSION: Different age groups and death categories were sensitive to different temperature indicators. However, the effect estimates from the category-specific indicators did not differ significantly from those based on mean temperature.
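An illustrative reduction of the AIC-based selection step: fit a simple Poisson "hockey-stick" model over a grid of candidate thresholds and keep the best-scoring one. The actual analysis additionally includes distributed lags, seasonality and other covariates; everything below is an assumption for demonstration.

```python
import numpy as np
import statsmodels.api as sm

def best_threshold(deaths, temp, candidates):
    """Pick the mortality threshold that minimises AIC (illustrative sketch).

    Fits log E[deaths] = b0 + b1 * max(T - t, 0) for each candidate
    threshold t and returns the (AIC, t) pair with the lowest AIC.
    """
    best = None
    for t in candidates:
        X = sm.add_constant(np.maximum(temp - t, 0.0))
        fit = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
        if best is None or fit.aic < best[0]:
            best = (fit.aic, t)
    return best

# toy usage with a simulated daily series (true threshold at 26 deg C)
rng = np.random.default_rng(1)
temp = 20 + 6 * np.sin(np.linspace(0, 12 * np.pi, 3000)) + rng.normal(0, 2, 3000)
deaths = rng.poisson(np.exp(3.0 + 0.04 * np.maximum(temp - 26, 0)))
print(best_threshold(deaths, temp, candidates=np.arange(22, 30, 0.5)))
```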
Developing a model of embedding academic numeracy in university programs: a case study from nursing
Abstract:
This is a study of the academic numeracy of nursing students; it develops a theoretical model for the design and delivery of university courses in academic numeracy. The following objectives are addressed: 1. To investigate nursing students' current knowledge of academic numeracy; 2. To investigate how nursing students' knowledge and skills in academic numeracy can be enhanced using a developmental psychology framework; and 3. To utilise the data derived from meeting objectives 1 and 2 to develop a theoretical model for embedding academic numeracy in university programs. The study draws on Valsiner's Human Development Theory (Valsiner, 1997, 2007). It is a quasi-experimental intervention case study (Faltis, 1997) that takes a multimethod approach, using pre- and post-tests, observation notes and semi-structured teaching sessions to document a series of microgenetic studies of student numeracy. Each microgenetic study is centred on the lived experience of students becoming more numerate; the method for this section is based on Vygotsky's double stimulation (Valsiner, 2000a, 2007). Data collection includes interviews on students' past experience with mathematics, their present feelings and experiences, and how these are transformed. The findings provide evidence that the course developed for nursing students, underpinned by an appropriate framework, does improve academic numeracy. More specifically, students improved their content knowledge of, and confidence in, mathematics in areas directly related to their degree. The study used Valsiner's microgenetic approach to development to trace the course as it was being taught and two students' personal academic numeracy journeys. It highlighted particularly troublesome concepts, then outlined the scaffolding and pathways used to develop understanding. This approach to academic numeracy development was summarised into a four-faceted model at the university, program, course and individual levels. The model can be applied to similar contexts. Thus the thesis advances both theory and practice in this under-researched and under-theorised area.
Abstract:
This thesis examines the role of mobile telephony in rural communities in Papua New Guinea (PNG). It is a threshold study which reports on research conducted in the earliest stages of mobile phone adoption in these areas. It explores the ways in which this new technology changes people’s lives, social structures and relationships. The research focuses on non-urban communities, which previously had little or no access to modern communication technologies, but which are in some cases still using traditional forms of communication such as drums. It has found that the introduction of mobile telecommunications has generally been viewed positively, although several negative concerns have been strongly felt. Specific benefits related to enhanced communication with relatives and friends living away from home villages, and use of the technology in time-critical emergencies or crises. Difficulties have arisen with respect to the cost of owning and operating a handset, as well as financial and logistical challenges when recharging handset batteries, particularly in areas with no mains electricity supply. Perceived damaging effects of mobile phone access related to sex, crime and pornography. The changes taking place are described through a social lens, by foregrounding the perceptions of villagers. The perspectives of key informants, such as telecommunication company managers, are also discussed. Employing the technique of triangulation (using different methods and sources) has helped to validate the findings of the research project. The sources constantly overlap and agree on the main themes, such as those outlined above. PNG is a developing country which performs poorly on a wide range of development indicators. A large majority of the people live outside of the major towns and cities. It is therefore worthwhile investigating the introduction of mobile phone technology in rural areas. These areas often have poor access to services, including transport, health, education and banking. Until 2007, communities in such regions fell outside of mobile phone coverage areas. In the case of all ten villages discussed in this thesis, there has never been any landline telephone infrastructure available. Therefore, this research on mobile phones is in effect documenting the first ever access to any kind of phone in these communities. This research makes a unique contribution to knowledge about the role of communication in PNG, and has implications for policy, practice and theory. In the policy arena, the thesis aids understanding of the impact which communication sector competition and regulation can have on rural and relatively isolated communities. There are three practical problems which have emerged from the research: cost, battery recharging difficulties and breakage are all major obstacles to uptake and use of mobile telephony in rural communities. Efforts to reduce usage costs, enable easier recharging, and design more robust handsets would allow for increased utilisation of mobile phones for a range of purposes. With respect to the realm of theory, this research sits amongst the most recent scholarship in the mobile phone field, located within the broader communication theory area. It recommends cautionary reading of any literature which suggests that mobile phones will reduce poverty and increase incomes in poor, rural communities in developing countries. 
Nonetheless, the present research adds weight to mobile phone studies which suggest that the primary advantages of mobile phones in such settings are the satisfaction of communication in and of itself, and social interaction among loved ones.
Abstract:
Hypertrophic scars arise when there is an overproduction of collagen during wound healing. They are often associated with poor regulation of the rate of programmed cell death (apoptosis) of the cells synthesizing the collagen, or with an exuberant inflammatory response that prolongs collagen production and increases wound contraction. Severe contractures, such as those that occur after a deep burn, can cause loss of function, especially if the wound is over a joint such as the elbow or knee. Recently, we have developed a morphoelastic mathematical model for dermal repair that incorporates the chemical, cellular and mechanical aspects of dermal wound healing. Using this model, we examine pathological scarring in dermal repair, first assuming a smaller-than-usual apoptotic rate for myofibroblasts and then considering a prolonged inflammatory response, in an attempt to determine an optimal intervention strategy to promote normal repair or terminate the fibrotic scarring response. Our model predicts that in both cases it is best to apply the intervention strategy early in the wound healing response. Further, the earlier an intervention is made, the less aggressive the intervention required. Finally, if intervention is conducted late in healing, a significant intervention is required; however, there is a threshold concentration of the drug or therapy applied, above which minimal further improvement to wound repair is obtained.
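The timing argument can be illustrated with a deliberately crude two-variable ODE caricature; this is not the morphoelastic model, and every rate constant and the form of the intervention below are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def scar_toy(t_intervene, boost, k_a=0.05, t_end=100.0):
    """Toy caricature: myofibroblasts M decay at apoptotic rate k_a, which an
    intervention multiplies by `boost` from time t_intervene onward; collagen
    C accumulates in proportion to M and turns over slowly.
    """
    def rhs(t, y):
        M, C = y
        rate = k_a * (boost if t >= t_intervene else 1.0)
        return [-rate * M, 0.1 * M - 0.01 * C]
    sol = solve_ivp(rhs, (0.0, t_end), [1.0, 0.0])
    return sol.y[1, -1]   # final collagen level as a crude scarring proxy

# with a fixed intervention strength, delaying the intervention leaves more
# final collagen, mirroring the prediction above
for t_i in (5, 20, 50):
    print(t_i, round(scar_toy(t_i, boost=3.0), 3))
```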
Abstract:
The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs through early error detection. This is just as true from a software engineering point of view, where models facilitate stakeholder communication and software system design. Research has investigated several proposed measures for business process models, mostly from a correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet design decisions usually have to build on thresholds that can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice to determine thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
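The basic idea of a ROC-based cut-off search can be sketched as follows; this is a plain Youden-index version, not the paper's adaptation, and the usage data are simulated.

```python
import numpy as np
from sklearn.metrics import roc_curve

def derive_threshold(measure, has_error):
    """Derive an error-predicting threshold for one structural measure.

    Picks the measure value that maximises Youden's J = TPR - FPR on the
    ROC curve of `measure` against the binary error flag.
    """
    fpr, tpr, thresholds = roc_curve(has_error, measure)
    return thresholds[np.argmax(tpr - fpr)]

# toy usage: larger models are more likely to contain an error
rng = np.random.default_rng(2)
size = rng.integers(5, 120, 2000)                      # e.g. number of nodes
has_error = rng.random(2000) < 1 / (1 + np.exp(-(size - 50) / 10))
print("size threshold:", derive_threshold(size, has_error))
```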
Abstract:
The relationship between weather and mortality has been observed for centuries. Recently, as climate change continues, studies of temperature-related mortality have become a popular topic. Most previous studies found that exposure to hot or cold temperatures affects mortality. This study addresses three research questions: 1. What is the overall effect of daily mean temperature variation on elderly mortality in the published literature, assessed using a meta-analysis approach? 2. Does the association between temperature and mortality differ with age, sex or socio-economic status in Brisbane? 3. How does the magnitude of the lag effects of daily mean temperature on mortality vary by age and cause-of-death group in Brisbane? In the meta-analysis, there was a 1-2% increase in all-cause mortality for a 1ºC decrease during cold temperature intervals and a 2-5% increase for a 1ºC increment during hot temperature intervals among the elderly. Lags of up to 9 days in exposure to cold temperature intervals were statistically significantly associated with all-cause mortality, but no significant lag effects were observed for hot temperature intervals. In Brisbane, the harmful effect of high temperature (over 24ºC) on mortality appeared to be greater among the elderly than among other age groups. The effect estimate among women was greater than among men. However, no evidence was found that socio-economic status modified the temperature-mortality relationship. The results also show longer lag effects on cold days and shorter lag effects on hot days. For 3-day hot effects associated with a 1°C increase above the threshold, the highest percent increase in mortality occurred among people aged 85 years or over (5.4% (95% CI: 1.4%, 9.5%)) compared with all age groups (3.2% (95% CI: 0.9%, 5.6%)). The effect estimate for cardiovascular deaths was slightly higher than that for all-cause mortality. For overall 21-day cold effects associated with a 1°C decrease below the threshold, the percent increases in mortality for people aged 85 years or over and for cardiovascular diseases were 3.9% (95% CI: 1.9%, 6.0%) and 3.4% (95% CI: 0.9%, 6.0%), respectively, compared with all age groups (2.0% (95% CI: 0.7%, 3.3%)). Little research of this kind has been conducted in the Southern Hemisphere. This PhD research contributes to the quantitative assessment of the overall impact, effect modification and lag effects of temperature variation on mortality in Australia, and the findings may inform the development and implementation of public health policies to reduce and prevent temperature-related health problems.
Abstract:
This thesis presents the outcomes of a comprehensive research study undertaken to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The knowledge created is expected to contribute to a greater understanding of urban stormwater quality and thereby enhance the design of stormwater quality treatment systems. The research study was based on selected urban catchments in Gold Coast, Australia. The research methodology included field investigations, laboratory testing, computer modelling and data analysis. Both univariate and multivariate data analysis techniques were used to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The rainfall characteristics investigated were average rainfall intensity and rainfall duration, whilst the catchment characteristics were land use, impervious area percentage, urban form and pervious area location. The catchment-scale data for the analysis were obtained from four residential catchments, including rainfall-runoff records, drainage network data, stormwater quality data, and land use and land cover data. Pollutant build-up samples were collected from twelve road surfaces in residential, commercial and industrial land use areas. The relationships between rainfall characteristics, catchment characteristics and urban stormwater quality were investigated first for residential catchments and then extended to other land uses. Based on the influence rainfall characteristics exert on urban stormwater quality, rainfall events can be classified into three types: high average intensity-short duration (Type 1), high average intensity-long duration (Type 2) and low average intensity-long duration (Type 3), as sketched in the classification example after this abstract. This provides an innovative alternative to conventional modelling, which does not commonly relate stormwater quality to rainfall characteristics. Additionally, it was found that the threshold intensity for pollutant wash-off from urban catchments is much lower than that for rural catchments. High average intensity-short duration rainfall events are cumulatively responsible for generating a major fraction of the annual pollutant load compared with the other rainfall event types. Additionally, rainfall events of less than 1-year ARI, such as 6-month ARI, should be considered in treatment design as they generate a significant fraction of the annual runoff volume and, by implication, a significant fraction of the pollutant load. This implies that stormwater treatment designs based on larger rainfall events would not be feasible in terms of cost-effectiveness, treatment performance and possible savings in the land area needed. It also suggests that simulating long-term continuous rainfall events for stormwater treatment design may not be necessary and that event-based simulations would be adequate. The investigations into the relationship between catchment characteristics and urban stormwater quality found that, in addition to conventional catchment characteristics such as land use and impervious area percentage, other characteristics such as urban form and pervious area location also play important roles in influencing urban stormwater quality. These outcomes point to the fact that the conventional modelling approach in the design of stormwater quality treatment systems, which is commonly based on land use and impervious area percentage alone, would be inadequate.
It was also noted that small, uniformly urbanised areas within a larger mixed catchment produce relatively lower variations in stormwater quality and, as expected, lower runoff volume, with the opposite being the case for large mixed-use urbanised catchments. Therefore, a decentralised approach to water quality treatment would be more effective than an "end-of-pipe" approach. The investigation of pollutant build-up across different land uses showed that build-up characteristics vary even within the same land use. Therefore, the conventional approach to stormwater quality modelling, which is based solely on land use, may prove inappropriate. Industrial land use showed relatively higher variability in maximum pollutant build-up, build-up rate and particle size distribution than the other two land uses, whereas commercial and residential land uses showed relatively higher variations in nutrient and organic carbon build-up. Additionally, particle size distribution showed relatively higher variability across all three land uses than the other build-up parameters. This high variability in particle size distribution illustrates the dissimilarities in the fine and coarse particle size fractions even within the same land use, and hence the variations in stormwater quality arising from pollutants adsorbing to different particle sizes.
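The three rainfall event types lend themselves to a simple rule-based classification, sketched below; the intensity and duration cut-offs are invented placeholders, since the thesis derives the actual boundaries from the monitored Gold Coast catchment data.

```python
from dataclasses import dataclass

@dataclass
class RainfallEvent:
    avg_intensity_mm_h: float   # average rainfall intensity over the event
    duration_h: float           # event duration

def classify(event, intensity_cut=20.0, duration_cut=2.0):
    """Assign an event to Type 1/2/3 (cut-off values are assumptions)."""
    high_i = event.avg_intensity_mm_h >= intensity_cut
    long_d = event.duration_h >= duration_cut
    if high_i and not long_d:
        return "Type 1: high average intensity, short duration"
    if high_i and long_d:
        return "Type 2: high average intensity, long duration"
    if not high_i and long_d:
        return "Type 3: low average intensity, long duration"
    return "unclassified (low intensity, short duration)"

print(classify(RainfallEvent(avg_intensity_mm_h=35.0, duration_h=1.0)))  # Type 1
```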
Abstract:
This paper presents a road survey conducted as part of a workshop held by the Texas Department of Transportation (TxDOT) to evaluate and improve the maintenance practices of the Texas highway system. Directors of maintenance from six peer states (California, Kansas, Georgia, Missouri, North Carolina, and Washington) were invited to this 3-day workshop. One of the important parts of the workshop was a Maintenance Test Section Survey (MTSS) to evaluate a number of pre-selected one-mile roadway sections. The workshop schedule allowed half a day for the field survey, during which 34 sections were evaluated. Each evaluator was given a booklet and asked to rate the selected road sections. The goals of the MTSS were to: 1. Assess the threshold level at which maintenance activities are required as perceived by the evaluators from the peer states; 2. Assess the threshold level at which maintenance activities are required as perceived by evaluators from other TxDOT districts; and 3. Perform a pilot evaluation of the MTSS concept. This paper summarizes the information obtained from the survey and discusses the major findings based on a statistical analysis of the data and comments from the survey participants.
Abstract:
PURPOSE. To assess whether binocular vision offers any advantages over monocular vision under blur conditions. METHODS. We measured the effect of defocus, induced by positive lenses, on the pattern-reversal visual evoked potential (VEP) and on visual acuity (VA). Monocular (dominant eye) and binocular VEPs were recorded from thirteen volunteers (average age: 28±5 years; average spherical equivalent: -0.25±0.73 D) for defocus up to 2.00 D induced by positively powered lenses. VEPs were elicited using reversing 10 arcmin checks at a rate of 4 reversals/second. The stimulus subtended a circular field of 7 degrees with 100% contrast and a mean luminance of 30 cd/m2. VA was measured under the same conditions using ETDRS charts. All measurements were performed at a 1 m viewing distance with best spectacle sphero-cylindrical correction and natural pupils. RESULTS. With binocular stimulation, amplitudes of the P100 component of the VEP were greater, and implicit times shorter, in all cases than with monocular stimulation. The mean binocular enhancement ratio in P100 amplitude was 2.1 in focus, increasing linearly with defocus to 3.1 at +2.00 D. Mean peak latency was 2.9 ms shorter in focus with binocular than with monocular stimulation, with the difference increasing with defocus to 8.8 ms at +2.00 D. As with the VEP amplitude, VA was always better with binocular than with monocular vision, with the difference being greater for higher retinal blur. CONCLUSIONS. Both the subjective and the electrophysiological results show that binocular vision ameliorates the effect of defocus. The increased binocular facilitation observed with retinal blur may be due to the activation of a larger population of neurons at close-to-threshold detection under binocular stimulation.
Abstract:
The design of pre-contoured fracture fixation implants (plates and nails) that correctly fit the anatomy of a patient relies on 3D models of long bones with accurate geometric representation. 3D data are usually available from computed tomography (CT) scans of human cadavers, which generally represent the over-60 age group. Thus, despite the fact that half of the seriously injured population is aged 30 or below, virtually no data exist from these younger age groups to inform the design of implants that optimally fit patients from these groups. Hence, relevant bone data from these age groups are required. The current gold standard for acquiring such data, CT, involves ionising radiation and cannot be used to scan healthy human volunteers. Magnetic resonance imaging (MRI) has been shown to be a potential alternative in previous studies conducted using small bones (tarsal bones) and parts of long bones. However, in order to use MRI effectively for 3D reconstruction of human long bones, further validation using whole long bones and appropriate reference standards is required. Accurate reconstruction of 3D models from CT or MRI data sets requires an accurate image segmentation method. Currently available sophisticated segmentation methods involve complex programming and mathematics that researchers are typically not trained to perform; an accurate but relatively simple segmentation method is therefore required for segmenting CT and MRI data. Furthermore, some limitations of 1.5T MRI, such as very long scanning times and poor contrast in articular regions, can potentially be reduced by using higher-field 3T MRI. However, the signal-to-noise ratio (SNR) gain at the bone-soft tissue interface needs to be quantified, and this is not reported in the literature. Because MRI scanning of long bones involves very long scanning times, the acquired images are prone to motion artefacts due to random movements of the subject's limbs. One artefact observed is the step artefact, believed to arise from random movements of the volunteer during a scan; this must be corrected before the models can be used for implant design. As the first aim, this study investigated two segmentation methods, intensity thresholding and Canny edge detection, as accurate but simple methods for segmenting MRI and CT data. The second aim was to investigate the usability of MRI as a radiation-free imaging alternative to CT for reconstructing 3D models of long bones. The third aim was to use 3T MRI to address the poor articular contrast and long scanning times of current MRI. The fourth and final aim was to minimise the step artefact using 3D modelling techniques. The segmentation methods were investigated using CT scans of five ovine femora. Single-level thresholding was performed using a visually selected threshold level to segment the complete femur. For multilevel thresholding, multiple threshold levels calculated from the threshold selection method were used for the proximal, diaphyseal and distal regions of the femur. Canny edge detection was applied by delineating the outer and inner contours of 2D images and then combining them to generate the 3D model. Models generated from these methods were compared to a reference standard generated from mechanical contact scans of the denuded bone. The second aim was achieved using CT and MRI scans of five ovine femora, segmented using the multilevel threshold method.
A surface geometric comparison was then conducted between the CT-based, MRI-based and reference models. To quantitatively compare 1.5T images with 3T MRI images, the right lower limbs of five healthy volunteers were scanned using scanners from the same manufacturer. The images, obtained using identical protocols, were compared by means of the SNR and the contrast-to-noise ratio (CNR) of muscle, bone marrow and bone. In order to correct the step artefact in the final 3D models, the step was simulated in five ovine femora scanned with a 3T MRI scanner and corrected using an alignment method based on the iterative closest point (ICP) algorithm. The present study demonstrated that the multilevel threshold approach, in combination with the threshold selection method, can generate 3D models of long bones with an average deviation of 0.18 mm; the corresponding figure for the single-threshold method was 0.24 mm, and the difference in accuracy between the two methods was statistically significant. In comparison, the Canny edge detection method generated an average deviation of 0.20 mm. MRI-based models exhibited an average deviation of 0.23 mm compared with 0.18 mm for CT-based models; the differences were not statistically significant. 3T MRI improved the contrast at the bone-muscle interfaces of most anatomical regions of the femora and tibiae, potentially reducing the inaccuracies conferred by the poor contrast of the articular regions. Using the robust ICP algorithm to align the 3D surfaces, the step artefact caused by the volunteer moving the leg was corrected, yielding errors of 0.32 ± 0.02 mm when compared with the reference standard. The study concludes that magnetic resonance imaging, together with simple multilevel threshold segmentation, is able to produce 3D models of long bones with accurate geometric representation and is therefore a potential alternative to the current gold standard, CT imaging.
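As a rough illustration of how region-wise (multilevel) threshold segmentation followed by surface extraction might be scripted, here is a sketch using scikit-image; the per-region Otsu rule, the equal-thirds split into proximal/diaphyseal/distal regions, and all names are assumptions standing in for the thesis's threshold selection method.

```python
import numpy as np
from skimage import filters, measure

def multilevel_threshold_models(volume, n_regions=3):
    """Sketch of multilevel threshold segmentation of a long-bone scan.

    Splits the volume into thirds along the long axis (proximal, diaphyseal,
    distal), picks a per-region threshold with Otsu's method, and extracts a
    triangulated surface per region with marching cubes.
    """
    meshes = []
    bounds = np.linspace(0, volume.shape[0], n_regions + 1, dtype=int)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        region = volume[lo:hi]
        t = filters.threshold_otsu(region)           # per-region threshold level
        mask = (region > t).astype(np.uint8)
        # marching cubes turns the binary mask into a surface mesh
        verts, faces, _, _ = measure.marching_cubes(mask, level=0.5)
        verts[:, 0] += lo                             # shift back into volume frame
        meshes.append((verts, faces))
    return meshes
```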