Abstract:
Sing & Grow is a short-term early intervention music therapy program for at-risk families. Sing & Grow uses music to strengthen parent-child relationships by increasing positive parent-child interactions, assisting parents to bond with their children, and extending the repertoire of parents' skills in relating to their child through interactive music-making. Both the Australian and New Zealand governments are looking for evidence-based research to highlight the effectiveness of funded programs in early childhood. As a government-funded program, independent evaluation is a requirement of the delivery of the service. This paper explains the process involved in setting up and managing this large-scale evaluation, from engaging the evaluators and designing the project to the data gathering stage. It describes the various challenges encountered and concludes that a highly collaborative and communicative partnership between researchers and clinicians is essential to ensure data can be gathered with minimal disturbance to clinical music therapy practice.
Abstract:
An experimental laboratory investigation was carried out to assess the structural adequacy of a disused PHO Class Flat Bottom Rail Wagon (FRW) for a single-lane low-volume road bridge application as per the design provisions of the Australian Bridge Design Standard AS 5100 (2004). The investigation also encompassed a review of the risk associated with pre-existing damage incurred by wagons during their service life on rail. The main objective of the laboratory testing of the FRW was to physically measure its performance under the same applied traffic loading it would be required to resist as a road bridge deck. To achieve this, a full-width (5.2 m), single-lane, single-span (approximately 10 m), simply supported bridge would need to be constructed and tested in a structural laboratory. However, the clear spacing available between the columns of the loading portal frame in the laboratory was insufficient to accommodate the 5.2 m wide bridge deck, excluding the clearance normally considered necessary in structural testing. Therefore, only half of the full-scale bridge deck (a single FRW of width 2.6 m) could be accommodated and tested, with the continuity of the bridge deck in the lateral direction applied as boundary constraints along the full length of the FRW at six selected locations. To the best of the author's knowledge, this represents a novel approach not yet reported in the literature on bridge deck testing. The test was carried out under two loadings provided in AS 5100 (2004): a stationary W80 wheel load and a moving M1600 axle load. As the bridge investigated in the study is a single-lane, single-span, low-volume road bridge, the risk posed by pre-existing damage and the potential for high-cycle fatigue failure were assessed as minimal, and hence the bridge deck was not tested structurally for fatigue/fracture.
The high axle load requirements were instead addressed through investigation of the serviceability and ultimate limit state requirements. The testing regime adopted involved extensive recording of strains and deflections at several critical locations of the FRW. Three locations of the W80 point load and two locations of the M1600 axle load were considered for the serviceability testing; the FRW was also tested under the ultimate load dictated by the M1600. The outcomes of the experimental investigation demonstrate that the FRW is structurally adequate to resist the traffic loadings prescribed in AS 5100 (2004). As the loading was applied directly onto the FRW, the laboratory testing is assessed as being significantly conservative. The FRW bridge deck in the field would resist only the load transferred by the running platform, where, depending on the design, composite action might exist; the share of the loading that the FRW must resist would therefore be smaller than in the system tested in the laboratory. On this basis, a demonstration bridge is under construction at the time of writing this thesis, and future research will involve field testing to assess its performance.
Abstract:
The most common human cancers are malignant neoplasms of the skin. Incidence of cutaneous melanoma is rising especially steeply, with minimal progress in non-surgical treatment of advanced disease. Despite significant effort to identify independent predictors of melanoma outcome, no accepted histopathological, molecular or immunohistochemical marker defines subsets of this neoplasm. Accordingly, though melanoma is thought to present with different 'taxonomic' forms, these are considered part of a continuous spectrum rather than discrete entities. Here we report the discovery of a subset of melanomas identified by mathematical analysis of gene expression in a series of samples. Remarkably, many genes underlying the classification of this subset are differentially regulated in invasive melanomas that form primitive tubular networks in vitro, a feature of some highly aggressive metastatic melanomas. Global transcript analysis can identify unrecognized subtypes of cutaneous melanoma and predict experimentally verifiable phenotypic characteristics that may be of importance to disease progression.
Abstract:
We consider the problem of how to construct robust designs for Poisson regression models. An analytical expression is derived for robust designs for first-order Poisson regression models where uncertainty exists in the prior parameter estimates. Given certain constraints in the methodology, it may be necessary to extend the robust designs for implementation in practical experiments. With these extensions, our methodology constructs designs which perform similarly, in terms of estimation, to current techniques, and offers the solution in a more timely manner. We further apply this analytic result to cases where uncertainty exists in the linear predictor. The application of this methodology to practical design problems such as screening experiments is explored. Given the minimal prior knowledge that is usually available when conducting such experiments, it is recommended to derive designs robust across a variety of systems. However, incorporating such uncertainty into the design process can be a computationally intense exercise. Hence, our analytic approach is explored as an alternative.
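The analytical robust-design expression itself is not reproduced in this abstract, but the underlying optimality criterion can be illustrated numerically. The sketch below, under hypothetical prior point estimates and a hypothetical design region, finds a locally D-optimal two-point design for a first-order Poisson model E[y] = exp(β0 + β1·x) by maximising the determinant of the Fisher information over a candidate grid; it is a minimal illustration of the design problem, not the paper's method.

```python
import numpy as np
from itertools import combinations

def d_criterion(x1, x2, beta0, beta1):
    # Determinant of the Fisher information for a two-point,
    # equal-weight design under E[y] = exp(beta0 + beta1 * x).
    M = np.zeros((2, 2))
    for x in (x1, x2):
        lam = np.exp(beta0 + beta1 * x)   # Poisson mean at x
        f = np.array([1.0, x])            # regressors (intercept, x)
        M += 0.5 * lam * np.outer(f, f)
    return np.linalg.det(M)

beta0, beta1 = 0.0, 1.0                   # hypothetical prior point estimates
grid = np.linspace(0.0, 5.0, 251)         # hypothetical candidate design points
best = max(combinations(grid, 2),
           key=lambda p: d_criterion(p[0], p[1], beta0, beta1))
print(float(best[0]), float(best[1]))
```

The grid search recovers the known structure of locally D-optimal designs for this model: one support point at the upper end of the design region and the other separated from it by 2/|β1|.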
Abstract:
Conceptual modeling continues to be an important means for graphically capturing the requirements of an information system. Observations of modeling practice suggest that modelers often use multiple modeling grammars in combination to articulate various aspects of real-world domains. We extend an ontological theory of representation to suggest why and how users employ multiple conceptual modeling grammars in combination. We provide an empirical test of the extended theory using survey data and structured interviews about the use of traditional and structured analysis grammars within an automated tool environment. We find that users of the analyzed tool combine grammars to overcome the ontological incompleteness that exists in each grammar, and that they select their starting grammar only from a predicted subset of grammars. The qualitative data provide insights into why some of the predicted deficiencies manifest in practice differently than predicted.
Abstract:
This thesis investigates profiling and differentiating customers through the use of statistical data mining techniques. The business application of our work centres on examining individuals' seldom studied yet critical consumption behaviour over an extensive time period within the context of the wireless telecommunication industry; consumption behaviour (as opposed to purchasing behaviour) is behaviour that has been performed so frequently that it becomes habitual and involves minimal intention or decision making. Key variables investigated are the activity initiation timestamp and cell tower location, as well as the activity type and usage quantity (e.g., voice call with duration in seconds); the research focuses on customers' spatial and temporal usage behaviour. The main methodological emphasis is on the development of clustering models based on Gaussian mixture models (GMMs), which are fitted using the recently developed variational Bayesian (VB) method. VB is an efficient deterministic alternative to the popular but computationally demanding Markov chain Monte Carlo (MCMC) methods. The standard VB-GMM algorithm is extended by allowing component splitting, such that it is robust to initial parameter choices and can automatically and efficiently determine the number of components. The new algorithm we propose allows more effective modelling of individuals' highly heterogeneous and spiky spatial usage behaviour, or more generally human mobility patterns; the term spiky describes data patterns with large areas of low probability mixed with small areas of high probability. Customers are then characterised and segmented based on the fitted GMM, which corresponds to how each of them uses the products/services spatially in their daily lives; this essentially reflects their likely lifestyle and occupational traits. 
Other significant research contributions include fitting GMMs using VB to circular data (i.e., the temporal usage behaviour) and developing clustering algorithms suitable for high-dimensional data based on the use of VB-GMM.
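The VB-GMM algorithm with component splitting described above is not reproduced here, but the mixture-model machinery it builds on can be sketched with standard EM for a one-dimensional GMM. Everything below (the function name, the synthetic two-cluster data, the initialisation strategy) is a hypothetical illustration, not the thesis's algorithm:

```python
import numpy as np

def em_gmm_1d(x, k, iters=200):
    # Standard EM for a 1-D Gaussian mixture (not the VB variant):
    # alternates soft assignment (E-step) with parameter updates (M-step).
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))  # spread-out initial means
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[n, j] proportional to
        # pi_j * N(x_n | mu_j, sigma_j^2), computed in log space.
        d = (x[:, None] - mu) / sigma
        logp = -0.5 * d**2 - np.log(sigma) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)    # numerical stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations.
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu)**2).sum(axis=0) / nk)
    return pi, mu, sigma

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(10, 1, 300)])
pi, mu, sigma = em_gmm_1d(x, 2)
print(np.sort(mu))   # means near the true cluster centres 0 and 10
```

The VB approach replaces these point-estimate updates with updates to posterior distributions over the parameters, which is what allows component pruning and the splitting heuristic described in the thesis.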
Abstract:
In many applications, e.g., bioinformatics, web access traces, and system utilisation logs, the data is naturally in the form of sequences. There is great interest in analysing sequential data to find its inherent characteristics and the relationships within it. Sequential association rule mining is one method used to analyse such data. As conventional sequential association rule mining very often generates a huge number of association rules, many of which are redundant, it is desirable to find a way to eliminate those unnecessary rules. Because of the complexity and temporally ordered characteristics of sequential data, current research on sequential association rule mining is limited. Although several sequential association rule prediction models using either sequence constraints or temporal constraints have been proposed, none of them considers the redundancy problem in rule mining. The main contribution of this research is a non-redundant association rule mining method based on closed frequent sequences and minimal sequential generators. We also define non-redundant sequential rules as sequential rules with minimal antecedents but maximal consequents. A new algorithm called CSGM (closed sequential and generator mining) for generating closed sequences and minimal sequential generators is introduced. Experiments compare the performance of generating non-redundant sequential rules against that of generating full sequential rules, and evaluate CSGM against other closed sequential pattern mining and generator mining algorithms. We also use the generated non-redundant sequential rules for query expansion in order to improve recommendations for infrequently purchased products.
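CSGM itself is not reproduced here, but the notion of a closed sequence that underpins it can be shown on a toy example: a frequent sequence is closed when no proper supersequence has the same support, so reporting only closed sequences discards redundant patterns without losing support information. The sequences and supports below are hypothetical:

```python
def is_subseq(a, b):
    # True if sequence a is a (not necessarily contiguous) subsequence of b.
    it = iter(b)
    return all(item in it for item in a)

def closed_sequences(support):
    # Keep only sequences for which no proper supersequence has the
    # same support; together with minimal generators, these form the
    # basis for non-redundant sequential rules.
    return {s: v for s, v in support.items()
            if not any(s != t and v == w and is_subseq(s, t)
                       for t, w in support.items())}

# Hypothetical frequent sequences with their supports.
support = {('a',): 3, ('b',): 3, ('a', 'b'): 3, ('a', 'b', 'c'): 2}
print(closed_sequences(support))
# ('a',) and ('b',) are absorbed by ('a', 'b'), which has the same support
```

This naive quadratic scan is only for illustration; dedicated algorithms such as CSGM avoid enumerating all frequent sequences in the first place.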
Abstract:
This study examined the lifetime and 4-week prevalence of postcoital dysphoria (PCD) and its relationship with psychological distress and reports of past sexual abuse. Amongst 222 female university students, 32.9% reported having ever experienced PCD while 10% reported experiencing PCD in the previous four weeks. Multiple regression analyses revealed support for the hypothesis that lifetime and 4-week prevalence of PCD would be positively correlated with psychological distress. Lifetime prevalence of PCD, but not 4-week prevalence, correlated with reports of childhood sexual abuse. These factors explained only minimal variance in PCD prevalence, prompting further research into this significantly under-investigated sexual difficulty.
Abstract:
A new approach to pattern recognition using invariant parameters based on higher order spectra is presented. In particular, invariant parameters derived from the bispectrum are used to classify one-dimensional shapes. The bispectrum, which is translation invariant, is integrated along straight lines passing through the origin in bifrequency space. The phase of the integrated bispectrum is shown to be scale and amplification invariant, as well. A minimal set of these invariants is selected as the feature vector for pattern classification, and a minimum distance classifier using a statistical distance measure is used to classify test patterns. The classification technique is shown to distinguish two similar, but different bolts given their one-dimensional profiles. Pattern recognition using higher order spectral invariants is fast, suited for parallel implementation, and has high immunity to additive Gaussian noise. Simulation results show very high classification accuracy, even for low signal-to-noise ratios.
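The invariance properties claimed above can be checked numerically. The sketch below integrates the bispectrum B(f1, f2) = X(f1)·X(f2)·X*(f1+f2) along a line f2 = a·f1 through the origin of bifrequency space and takes the phase of the sum; the test signal is a hypothetical asymmetric pulse, not one of the bolt profiles from the paper. A circular shift multiplies X(f) by a linear phase that cancels exactly in the bispectrum, and positive amplitude scaling multiplies the sum by a positive real factor, so the phase is unchanged in both cases:

```python
import numpy as np

def integrated_bispectrum_phase(x, a=1):
    # Integrate B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2)) along the
    # line f2 = a * f1 in bifrequency space; return the phase of the sum.
    X = np.fft.fft(x)
    n = len(x)
    total = 0.0 + 0.0j
    for f1 in range(1, n // 2):
        f2 = a * f1
        if f1 + f2 >= n // 2:
            break
        total += X[f1] * X[f2] * np.conj(X[f1 + f2])
    return np.angle(total)

# A hypothetical asymmetric one-dimensional "shape".
x = np.zeros(128)
x[20:30] = np.arange(10, dtype=float)

p0 = integrated_bispectrum_phase(x)
p_shift = integrated_bispectrum_phase(np.roll(x, 17))   # translation
p_scale = integrated_bispectrum_phase(2.5 * x)          # amplification
print(p0, p_shift, p_scale)   # all three phases agree
```

A feature vector for classification would collect these phases over several slopes a, as the paper describes.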
Abstract:
Taxes are an important component of investing that is commonly overlooked both in the literature and in practice. For example, many understand that taxes reduce an investment's return, but less well understood is the risk-sharing nature of taxes, which also reduces the investment's risk. This thesis examines how taxes affect the optimal asset allocation and asset location decisions in an Australian environment. It advances the model of Horan & Al Zaman (2008), improving the method by which the present value of tax liabilities is calculated, by using an after-tax risk-free discount rate and incorporating any new or reduced tax liabilities generated into its expected risk and return estimates. The asset allocation problem is examined for a range of scenarios using Australian parameters, including different risk aversion levels, personal marginal tax rates, investment horizons, borrowing premiums, high- and low-inflation environments, and different starting cost bases. The findings support the Horan & Al Zaman (2008) conclusion that equities should be held in the taxable account. In fact, these findings are strengthened, with most of the efficient frontier maximising equity holdings in the taxable account instead of only half. Furthermore, the findings transfer to the Australian case, where it is found that taxed Australian investors should always invest in equities first through the taxable account before investing in super, whereas untaxed Australian investors should invest in equities first through superannuation. With borrowing allowed in the taxable account (no borrowing premium), taxed Australian investors should hold 100% of the superannuation account in the risk-free asset, while using leverage in the taxable account to achieve the desired risk-return profile. Introducing a borrowing premium decreases the likelihood of holding 100% of super in the risk-free asset for taxable investors. 
The findings also suggest that the higher the marginal tax rate, the higher the borrowing premium required to overcome this effect. Finally, as the investor's marginal tax rate increases, the overall allocation to equities should increase: because taxation increases the sharing of both risk and return, the investor must take on more equity exposure to achieve the same risk/return level as at a lower tax rate. The investment horizon has minimal impact on the optimal allocation decision in the absence of factors such as mean reversion and human capital.
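The risk-sharing property of taxes mentioned above has a simple arithmetic core: under a stylised full loss-offset assumption (not the thesis's Australian tax model), a marginal rate t scales every realised return by (1 − t), shrinking volatility as well as expected return. The rate and return parameters below are hypothetical:

```python
import numpy as np

# Under full loss offset, tax scales every realised return by (1 - t),
# so both the mean return AND its standard deviation shrink by (1 - t).
rng = np.random.default_rng(1)
pre_tax = rng.normal(0.08, 0.15, 200_000)   # hypothetical equity returns
t = 0.30                                    # hypothetical marginal tax rate
after_tax = pre_tax * (1 - t)
print(after_tax.mean(), after_tax.std())    # approx 0.056 and 0.105
```

This is why, holding the risk/return target fixed, a higher-taxed investor must hold more equity: each unit of equity exposure delivers less risk as well as less return after tax.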
Abstract:
In Viet Nam, standards of nursing care fail to meet international competency standards. This increases risks to patient safety (e.g., hospital-acquired infection); consequently, the Ministry of Health identified the need to strengthen nurse education in Viet Nam. This paper presents experiences of a piloted clinical teaching model developed in Ha Noi to strengthen nurse-led institutional capacity for in-service education and clinical teaching. Historically, 90% of nursing education was conducted by physicians, and professional development in hospitals for nurses was limited. There was minimal communication between hospitals and nursing schools about expectations of students, assessment, and the quality of the learning experience. As a result, when students came to the clinical sites, no-one understood how to plan their learning objectives or utilise teaching and learning approaches appropriate to their level. Student learning outcomes were therefore variable. Students focussed on procedures and techniques and “learning how to do” rather than learning how to plan, implement and evaluate patient care. This project is part of a multi-component capacity building program designed to improve nurse education in Viet Nam. The project was funded jointly by Queensland University of Technology (QUT) and the Australian Agency for International Development. Its aim was to develop a collaborative, clinically based model of teaching to create an environment that encourages evidence-based, student-centred clinical learning. Accordingly, the strategies introduced promoted clinical teaching of competency-based nursing practice utilising the regionally endorsed nurse core competency standards. Thirty nurse teachers from Viet Duc University Hospital and Hanoi Medical College participated in the program. These nurses and nurse teachers undertook face-to-face education in three workshops and completed three assessment items. 
Assessment was applied: participants integrated the concepts learned in each workshop and completed assessment tasks related to planning, implementing and evaluating teaching in the clinical area. Twenty of these participants were then selected to undertake a two-week study tour in Brisbane, Australia, where the clinical teaching model was refined and an action plan developed for integration into both organisations, with possible implementation across Viet Nam. Participants on the study tour also experienced clinical teaching and learning at QUT by attending classes held at the university, and visited selected hospitals to observe clinical teaching in those settings. The effectiveness of the project was measured throughout the implementation phase and in follow-up visits to the clinical site. To date, changes have been noted at both the individual and organisational levels. Significant planning is also underway on incorporating the clinical teaching model across the organisation and on how it may be implemented in other regions. Two participants have also been involved in disseminating aspects of this approach to clinical teaching in Ho Chi Minh City, with plans for more in-depth dissemination throughout the country.
Abstract:
Introduction Buildings, which account for approximately half of all annual energy use and greenhouse gas emissions, are an important target area for any strategy addressing climate change. Whilst new commercial buildings increasingly address sustainability considerations, incorporating green technology in the refurbishment of older buildings is technically, financially and socially challenging. This research explores the expectations and experiences of commercial office building tenants whose building was undergoing green refurbishment. Methodology Semi-structured in-depth interviews were conducted with seven residents and neighbours of a large case-study building undergoing green refurbishment in Melbourne, Australia. Built in 1979, the 7,008 m² ‘B’ grade building consists of 11 upper levels of office accommodation, ground floor retail, and a basement area leased as a licensed restaurant. After refurbishment, which included the installation of chilled water pumps, solar water heating, waterless urinals, insulation, disabled toilets, and automatic dimming lights, it was expected that the environmental performance of the building would move from a zero-star ABGR (Australian Building Greenhouse Rating) rating to 3.5 stars, with a 40% reduction in water consumption and a 20% reduction in energy consumption. Interviews were transcribed, and responses were analysed using a thematic approach, identifying categories, themes and patterns. Results Commercial property tenants are on a journey to sustainability: they are interested and willing to engage in discussions about sustainability initiatives, but the process, costs and benefits need to be clear. 
Critically, whilst sustainability was an essential and non-negotiable criterion in building selection for government and larger corporate tenants, it was not yet a core business value for smaller organisations: whilst they could see it as an emerging issue, they wanted detailed cost-benefit analyses and pay-back calculations for proposed technologies and, ideally, wished they could trial the technology first-hand in some way. Although extremely interested in learning more, most participants reported relatively minimal knowledge of specific sustainability features, designs or products. In discussions about different sustainable technologies (e.g., waterless urinals, green-rated carpets), participants frequently commented that they knew little about the technology, had not heard of it, or were not sure exactly how it worked. Whilst participants viewed sustainable commercial buildings as the future, they had varied expectations about the fate of existing older buildings; most felt that these would have to be retrofitted at some point to meet market expectations and predicted the emergence of a ‘non-sustainability discount’ for residing in a building without sustainable features. Discussion This research offers a starting point for understanding the difficulty of integrating green technology into older commercial buildings. Tenants currently have a limited understanding of the technology and of potential building performance outcomes, which could ultimately impede the implementation of sustainable initiatives in older buildings. Whilst the commercial property market is interested in learning about sustainability in the built environment, the findings highlight the importance of developing a strong business case, communication plan and transition plan for implementing sustainability retrofits in existing commercial buildings.
Practical improvements to simultaneous computation of multi-view geometry and radial lens distortion
Abstract:
This paper discusses practical issues related to the use of the division model for lens distortion in multi-view geometry computation. A data normalisation strategy is presented, which has been absent from previous discussions on the topic. The convergence properties of the Rectangular Quadric Eigenvalue Problem solution for computing division model distortion are examined. It is shown that the existing method can require more than 1000 iterations when dealing with severe distortion. A method is presented for accelerating convergence to less than 10 iterations for any amount of distortion. The new method is shown to produce equivalent or better results than the existing method with up to two orders of magnitude reduction in iterations. Through detailed simulation it is found that the number of data points used to compute geometry and lens distortion has a strong influence on convergence speed and solution accuracy. It is recommended that more than the minimal number of data points be used when computing geometry using a robust estimator such as RANSAC. Adding two to four extra samples improves the convergence rate and accuracy sufficiently to compensate for the increased number of samples required by the RANSAC process.
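The Rectangular Quadric Eigenvalue Problem solver and acceleration scheme discussed above are not reproduced here, but the division model itself is compact enough to sketch. The code below implements the single-parameter division model x_u = c + (x_d − c)/(1 + λ·r_d²) together with its closed-form inverse (from the quadratic λ·r_u·r_d² − r_d + r_u = 0); the distortion coefficient, centre and sample points are hypothetical:

```python
import numpy as np

def undistort_division(pts, lam, centre=(0.0, 0.0)):
    # Single-parameter division model:
    # x_u = c + (x_d - c) / (1 + lam * r_d^2), with r_d = |x_d - c|.
    p = np.asarray(pts, float) - centre
    r2 = (p ** 2).sum(axis=1, keepdims=True)
    return p / (1.0 + lam * r2) + centre

def distort_division(pts, lam, centre=(0.0, 0.0)):
    # Closed-form inverse: solve lam * r_u * r_d^2 - r_d + r_u = 0 for r_d,
    # taking the root that tends to r_u as lam -> 0.
    # Valid for points away from the centre (the centre is a fixed point).
    p = np.asarray(pts, float) - centre
    r_u = np.sqrt((p ** 2).sum(axis=1, keepdims=True))
    r_d = (1.0 - np.sqrt(1.0 - 4.0 * lam * r_u ** 2)) / (2.0 * lam * r_u)
    return p * (r_d / r_u) + centre

lam = -1e-6                      # hypothetical barrel-distortion coefficient
pts = np.array([[120.0, 80.0], [-300.0, 210.0]])
round_trip = distort_division(undistort_division(pts, lam), lam)
print(np.abs(round_trip - pts).max())   # close to zero
```

Because only one radial coefficient appears, each distorted point constrains λ together with the multi-view geometry, which is what makes the simultaneous estimation discussed in the paper tractable.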
Abstract:
Background: Few studies have specifically investigated the functional effects of uncorrected astigmatism on measures of reading fluency. This information is important to provide evidence for the development of clinical guidelines for the correction of astigmatism. Methods: Participants included 30 visually normal, young adults (mean age 21.7 ± 3.4 years). Distance and near visual acuity and reading fluency were assessed with optimal spectacle correction (baseline) and for two levels of astigmatism, 1.00 DC and 2.00 DC, at two axes (90° and 180°) to induce both against-the-rule (ATR) and with-the-rule (WTR) astigmatism. Reading and eye movement fluency were assessed using standardized clinical measures, including the test of Discrete Reading Rate (DRR) and the Developmental Eye Movement (DEM) test, and by recording eye movement patterns with the Visagraph III during reading for comprehension. Results: Both distance and near acuity were significantly decreased compared to baseline for all of the astigmatic lens conditions (p < 0.001). Reading speed with the DRR for N16 print size was significantly reduced for the 2.00 DC ATR condition (a reduction of 10%), while for smaller text sizes reading speed was reduced by up to 24% for the 1.00 DC ATR and 2.00 DC conditions in both axis directions (p < 0.05). For the DEM, sub-test completion speeds were significantly impaired, with the 2.00 DC condition affecting both vertical and horizontal times and the 1.00 DC ATR condition affecting only horizontal times (p < 0.05). Visagraph reading eye movements were not significantly affected by the induced astigmatism. Conclusions: Induced astigmatism impaired performance on selected tests of reading fluency, with ATR astigmatism having significantly greater effects on performance than did WTR, even for relatively small amounts of astigmatic blur of 1.00 DC. These findings have implications for the minimal prescribing criteria for astigmatic refractive errors.
Abstract:
The need for accessible housing in Australia is acute. Both government and the community service sector recognise the importance of well-designed accessible housing to optimise the integration of older people and people with disability, to encourage prudent use of scarce health and community services, and to enhance the liveability of our cities. In 2010, the housing industry negotiated with the Australian Government and community representatives to adopt a nationally consistent voluntary code (Livable Housing Design) and a strategy to provide a minimal level of accessibility in all new housing by 2020. Evidence from the implementation of such programs in the United Kingdom and the USA, however, calls into question whether this aspirational goal can be achieved through voluntary codes. Minimal demand at the point of new sale and problems in producing housing to the required standards have raised questions about the application of program principles under a voluntary code. In addressing the latter issue, this paper presents early findings from the analysis of qualitative interviews conducted with developers, builders and designers in various housing contexts. It identifies their “logics in use” in the production of housing in response to Livable Housing Design’s voluntary code and indicates factors that are likely to assist or impede attainment of the 2020 aspirational goal.