891 results for potential models
Abstract:
This study investigates the short-run dynamics and long-run equilibrium relationship between residential electricity demand and factors influencing demand - per capita income, price of electricity, price of kerosene oil and price of liquefied petroleum gas - using annual data for Sri Lanka for the period 1960-2007. The study uses unit root, cointegration and error-correction models. The long-run demand elasticities with respect to income, own price and the price of kerosene oil (a substitute) were estimated to be 0.78, -0.62 and 0.14 respectively. The short-run elasticities for the same variables were estimated to be 0.32, -0.16 and 0.10 respectively. Liquefied petroleum (LP) gas is a substitute for electricity only in the short run, with an elasticity of 0.09. The main findings of the paper support the following: (1) increasing the price of electricity is not the most effective tool to reduce electricity consumption; (2) existing subsidies on electricity consumption can be removed without reducing government revenue; (3) the long-run income elasticity of demand shows that any future increase in household incomes is likely to significantly increase the demand for electricity; and (4) any power generation plans which consider only current per capita consumption and population growth should be revised taking into account potential future income increases in order to avoid power shortages in the country.
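For readers unfamiliar with the approach, a generic two-stage error-correction specification of the kind the study describes can be sketched as follows; the notation is illustrative, not the authors' exact model:

```latex
% Minimal sketch of an Engle-Granger style error-correction model (ECM);
% variable names are illustrative, not the study's exact specification.
\begin{align}
  % Long-run (cointegrating) relationship in logs: the beta_i are the
  % long-run elasticities (e.g. income elasticity ~ 0.78).
  \ln E_t &= \beta_0 + \beta_1 \ln Y_t + \beta_2 \ln P^{elec}_t
           + \beta_3 \ln P^{kero}_t + u_t \\
  % Short-run dynamics: first differences plus the lagged equilibrium
  % error; the alpha_i are short-run elasticities (e.g. income ~ 0.32)
  % and gamma < 0 is the speed of adjustment back to equilibrium.
  \Delta \ln E_t &= \alpha_0 + \alpha_1 \Delta \ln Y_t
           + \alpha_2 \Delta \ln P^{elec}_t
           + \alpha_3 \Delta \ln P^{kero}_t
           + \gamma\, \hat{u}_{t-1} + \varepsilon_t
\end{align}
```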
Abstract:
The androgen receptor (AR) is a ligand-activated transcription factor of the nuclear receptor superfamily that plays a critical role in male physiology and pathology. Activated by binding of the native androgens testosterone and 5α-dihydrotestosterone, the AR regulates transcription of genes involved in the development and maintenance of the male phenotype and male reproductive function, as well as of other tissues such as bone and muscle. Deregulation of AR signaling can cause a diverse range of clinical conditions, including the X-linked androgen insensitivity syndrome, a form of motor neuron disease known as Kennedy's disease, and male infertility. In addition, there is now compelling evidence that the AR is involved in all stages of prostate tumorigenesis, including initiation, progression, and treatment resistance. To better understand the role of AR signaling in the pathogenesis of these conditions, it is important to have a comprehensive understanding of the key determinants of AR structure and function. Binding of androgens to the AR induces receptor dimerization, facilitating DNA binding and the recruitment of cofactors and transcriptional machinery to regulate expression of target genes. Various models of dimerization have been described for the AR, the best characterized being DNA-binding domain-mediated dimerization, which is essential for the AR to bind DNA and regulate transcription. Additional AR interactions with the potential to contribute to receptor dimerization include the intermolecular interaction between the AR amino-terminal domain and ligand-binding domain, known as the N-terminal/C-terminal interaction, and ligand-binding domain dimerization. In this review, we discuss each form of dimerization utilized by the AR to achieve transcriptional competence and highlight that dimerization through multiple domains is necessary for optimal AR signaling.
Abstract:
This paper explores the potential therapeutic role of the naturally occurring sugar heparan sulfate (HS) in the augmentation of bone repair. Scaffolds comprising fibrin glue loaded with 5 µg of embryonically derived HS were assessed, firstly as a release reservoir, and secondly as a scaffold to stimulate bone regeneration in a critical-size rat cranial defect. We show that HS-loaded scaffolds have a uniform distribution of HS, which was readily released with a typical burst phase, quickly followed by a prolonged delivery lasting several days. Importantly, the released HS contributed to improved wound healing over a 3-month period as determined by microcomputed tomography (µCT) scanning, histology, histomorphometry, and PCR for osteogenic markers. In all cases, only minimal healing was observed after 1 and 3 months in the absence of HS. In contrast, marked healing was observed by 3 months following HS treatment, with nearly full closure of the defect site. PCR analysis showed significant increases in the gene expression of the osteogenic markers Runx2, alkaline phosphatase, and osteopontin in the heparan sulfate group compared with controls. These results further emphasize the important role HS plays in augmenting wound healing, and its successful delivery in a hydrogel provides a novel alternative to autologous bone graft and growth factor-based therapies.
Abstract:
The evaluation of satisfaction levels related to performance is an important aspect of increasing market share, improving profitability and enlarging opportunities for repeat business, and can lead to the identification of areas to be improved, more harmonious working relationships and conflict avoidance. In the construction industry, this can also result in improved project quality, enhanced reputation and increased competitiveness. Many conceptual models have been developed to measure satisfaction levels - typically to gauge client satisfaction, customer satisfaction and home buyer satisfaction - but limited empirical research has been carried out, especially into the satisfaction of construction contractors. To address this gap, this paper provides a unique conceptual model, or framework, for contractor satisfaction based on attributes identified through interviews with practitioners in Malaysia. In addition to progressing research on this topic and being of potential benefit to Malaysian contractors, it is anticipated that the framework will also be useful for other parties - clients, designers, subcontractors and suppliers - in enhancing the quality of products and/or services generally.
Abstract:
Machine vision represents a particularly attractive solution for sensing and detecting potential collision-course targets due to the relatively low cost, size, weight, and power requirements of the sensors involved (as opposed to radar). This paper describes the development and evaluation of a vision-based collision detection algorithm suitable for fixed-wing aerial robotics. The system was evaluated using highly realistic vision data of the moments leading up to a collision. Based on the collected data, our detection approaches were able to detect targets at distances ranging from 400 m to about 900 m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to an advance warning of between 8 and 10 seconds ahead of impact, which approaches the 12.5-second response time recommended for human pilots. We make use of the enormous potential of graphics processing units to achieve processing rates of 30 Hz (for images of size 1024-by-768). Currently, integration into the final platform is under way.
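As a rough illustration of how detection range translates into warning time, the arithmetic is simply time-to-impact = range / closing speed. The closing speeds below are assumptions chosen for the example, not figures from the paper:

```python
# Minimal sketch of the range-to-warning-time arithmetic behind such figures.
# The closing speeds used here are illustrative assumptions only.

def warning_time_s(detection_range_m: float, closing_speed_mps: float) -> float:
    """Advance warning in seconds for a head-on closing geometry."""
    return detection_range_m / closing_speed_mps

# e.g. 400 m detected at a 50 m/s closing speed, 900 m at 90 m/s:
print(warning_time_s(400.0, 50.0))  # 8.0 s
print(warning_time_s(900.0, 90.0))  # 10.0 s
```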
Abstract:
To navigate successfully in a novel environment, a robot needs to be able to Simultaneously Localize And Map (SLAM) its surroundings. The most successful solutions to this problem so far have involved probabilistic algorithms, but there has been much promising work on systems based on the workings of the part of the rodent brain known as the hippocampus. In this paper we present a biologically plausible system called RatSLAM that uses competitive attractor networks to carry out SLAM in a probabilistic manner. The system can effectively perform parameter self-calibration and SLAM in one dimension. Tests in two-dimensional environments revealed the inability of the RatSLAM system to maintain multiple pose hypotheses in the face of ambiguous visual input. These results support recent rat experiments suggesting that current competitive attractor models are not a complete solution to the hippocampal modelling problem.
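For readers unfamiliar with competitive attractor networks, the following is a minimal sketch of the underlying idea, with parameters and weights that are illustrative assumptions rather than RatSLAM's: local excitation and broader inhibition on a ring of pose cells cause activity to settle into a single stable packet representing the current pose estimate.

```python
# Minimal 1D competitive attractor network sketch (illustrative parameters).
import numpy as np

N = 100                                   # pose cells arranged on a ring
act = np.zeros(N)
act[N // 2] = 1.0                         # initial activity packet at cell 50

# Mexican-hat recurrent weights: local excitation, broader inhibition.
d = np.minimum(np.arange(N), N - np.arange(N))           # ring distances
w = np.exp(-d**2 / 10.0) - 0.3 * np.exp(-d**2 / 200.0)   # excite - inhibit

for _ in range(50):
    # Each cell sums distance-weighted input from all cells (circular).
    act = np.array([np.dot(np.roll(w, i), act) for i in range(N)])
    act = np.maximum(act, 0.0)            # rectify negative activity
    act /= act.sum()                      # global normalization = competition

print(int(np.argmax(act)))                # a stable packet persists near cell 50
```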
Abstract:
Globally, the main contributors to morbidity and mortality are chronic diseases, including cardiovascular disease and diabetes. Chronic diseases are costly and partially avoidable, with around sixty percent of deaths and nearly fifty percent of the global disease burden attributable to these conditions. By 2020, chronic illnesses will likely be the leading cause of disability worldwide. Existing health care systems, both national and international, that focus on acute episodic health conditions cannot address the worldwide transition to chronic illness; nor are they appropriate for the ongoing care and management of those already afflicted with chronic diseases. International and Australian strategic planning documents articulate similar elements for managing chronic disease, including the need to align sectoral policies for health, form partnerships and engage communities in decision-making. The Australian National Chronic Disease Strategy focuses on four core areas for managing chronic disease: prevention across the continuum, early detection and treatment, integrated and coordinated care, and self-management. Such a comprehensive approach incorporates the entire population continuum, from the 'healthy', to those with risk factors, through to people suffering from chronic conditions and their sequelae. This chapter examines a comprehensive approach to the prevention, management and care of populations with non-communicable (chronic) diseases and communicable diseases. It analyses models of care in the context of need, service delivery options and the potential for prevention and early intervention in chronic and communicable diseases. Addressing chronic disease requires integrated approaches that incorporate interventions targeted at both individuals and populations, and that emphasise the shared risk factors of different conditions. Communicable diseases are a common and significant contributor to ill health throughout the world. In many countries, this impact has been minimised by the combined efforts of preventative health measures and improved treatment of infectious diseases. However, in underdeveloped nations, communicable diseases continue to contribute significantly to the burden of disease. The aim of this chapter is to outline the impact that chronic and communicable diseases have on the health of the community, the public health strategies that are used to reduce the burden of those diseases, and the old and emerging risks to public health from infectious diseases.
Abstract:
This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are 'hybrid' in nature, in that they are a composition of components whose individual properties may be easily described, but where the performance of the model or algorithm as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms. The goal of this thesis is to analyse the computational aspects of hybrid models and hybrid algorithms in the Bayesian context. The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate to both the quality of the model fit to data and the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described and the model is compared to the Normal case based on goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature. The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches are considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions. For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as a set of examples of hybrid algorithms. The statistical literature shows that statistical efficiency is often the only criterion for an efficient algorithm. In this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual components contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced by combining these components in a single algorithm. The second approach to considering computational aspects of hybrid algorithms involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, importance sampling based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampler, and explores a fundamental problem that occurs in applying importance sampling to a high-dimensional problem. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. Additionally, the exponential growth of the asymptotic variance with the dimension is demonstrated, and it is illustrated that the optimal covariance matrix for the importance function can be estimated in a special case.
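The high-dimensional instability of importance sampling mentioned above is easy to reproduce. The following is a minimal sketch, with an assumed Gaussian target and proposal rather than the thesis's setting, showing the effective sample size of the normalized importance weights collapsing as the dimension grows:

```python
# Minimal demo of importance-sampling degeneracy in high dimensions.
# Target N(0, I), proposal N(0, 1.2^2 I): both are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def effective_sample_size(dim: int, n: int = 100_000) -> float:
    """ESS of self-normalized importance weights for the toy problem."""
    x = rng.normal(scale=1.2, size=(n, dim))
    # log weight = log target - log proposal (constants cancel after
    # self-normalization, so they are omitted here).
    log_w = -0.5 * (x**2).sum(axis=1) + 0.5 * ((x / 1.2)**2).sum(axis=1)
    w = np.exp(log_w - log_w.max())       # subtract max for stability
    w /= w.sum()
    return 1.0 / np.sum(w**2)             # ~n for even weights, ~1 when degenerate

for dim in (1, 10, 50, 100):
    print(f"dim={dim:4d}  ESS={effective_sample_size(dim):10.1f}")
```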
Abstract:
Digital production and distribution technologies may create new opportunities for filmmaking in Australia. A culture of new approaches to filmmaking is emerging, driven by 'next generation' filmmakers who are willing to consider new business models: from online web series to short films produced for mobile phones. At the same time, cultural representation itself is being transformed within an interactive, social-media-driven environment. Yet there is very little research into next generation filmmaking. The aim of this paper is to scope and discuss three key aspects of next generation filmmaking, namely: digital trends in film distribution and marketing; the processes and strategies of 'next generation' filmmakers; and case studies of viable next generation business models and filmmaking practices. We conclude with a brief examination of the implications for media and cultural policy, which suggests the future possibility of a rapprochement between creative industries discourse and cultural policy.
Abstract:
Purpose – One of the critical issues for change management, particularly in relation to the implementation of new technologies, is the existence of prior knowledge and established mental models which may hinder change efforts. Understanding unlearning, and how it might assist during organizational change, is a way to address this resistance. The purpose of this paper is to present research designed to identify specific factors that facilitate unlearning.
Design/methodology/approach – Drawing together issues identified as potential influencers of unlearning, a survey questionnaire was developed and administered in an Australian corporation undergoing large-scale change due to the implementation of an enterprise information system. The results were analyzed to identify specific factors that impact on unlearning.
Findings – The findings identify factors that hinder or help the unlearning process during times of change, including understanding of the need for change, the level of organizational support and training, assessment of the change, positive experience and informal support, the organization's history of change, individuals' prior outlooks, and individuals' feelings and expectations.
Research limitations/implications – The use of only one organization does not allow for comparisons between organizations of different sizes, cultures or industries; extension of this research is therefore recommended.
Practical implications – For practitioners, this paper identifies specific elements, at the level of both individuals and the organization, that need to be considered for optimal unlearning during times of change.
Originality/value – Previous literature on unlearning has been predominantly conceptual and theoretical. These empirical findings serve to further an earlier model, based on qualitative research, of potential influencers of unlearning.
Abstract:
In this thesis, the issue of incorporating uncertainty into environmental modelling informed by imagery is explored by considering uncertainty in deterministic modelling, measurement uncertainty and uncertainty in image composition. Incorporating uncertainty in deterministic modelling is extended for use with imagery using the Bayesian melding approach. In the application presented, slope steepness is shown to be the main contributor to total uncertainty in the Revised Universal Soil Loss Equation. A spatial sampling procedure is also proposed to assist in implementing Bayesian melding, given the increased data size of models informed by imagery. Measurement error models are another approach to incorporating uncertainty when data are informed by imagery. These models for measurement uncertainty, considered in a Bayesian conditional independence framework, are applied to ecological data generated from imagery and are shown to be appropriate and useful in certain situations. Measurement uncertainty is also considered in the context of change detection when two images are not co-registered. An approach for detecting change between two successive images is proposed that is unaffected by registration. The procedure uses the Kolmogorov-Smirnov test on homogeneous segments of an image to detect change, with the homogeneous segments determined using a Bayesian mixture model of pixel values. Using the mixture model to segment an image also allows for uncertainty in the composition of an image. The thesis concludes by comparing several different Bayesian image segmentation approaches that allow for uncertainty regarding the allocation of pixels to different ground components. Each segmentation approach is applied to a data set of chlorophyll values and shown to have different benefits and drawbacks depending on the aims of the analysis.
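A minimal sketch of the registration-free change-detection idea follows; the segment values are simulated for illustration and are not the thesis's data. Because the test compares whole distributions of values within a homogeneous segment rather than pixel-to-pixel differences, it does not require the two images to be co-registered.

```python
# Illustrative two-sample Kolmogorov-Smirnov change test on one segment.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Hypothetical pixel values from the same homogeneous ground segment at
# two acquisition times (e.g. a chlorophyll index).
segment_t1 = rng.normal(loc=0.40, scale=0.05, size=500)
segment_t2 = rng.normal(loc=0.55, scale=0.05, size=480)

stat, p_value = ks_2samp(segment_t1, segment_t2)
print(f"KS statistic = {stat:.3f}, p = {p_value:.2g}")
if p_value < 0.05:
    print("distributions differ -> flag segment as changed")
```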
Abstract:
Ecologically sustainable development has become a major feature of legal systems at the international, national and local levels throughout the world. In Australia, governments have responded to environmental crises by enacting legislation imposing obligations and restrictions over privately owned land. Whilst these obligations and restrictions may well be necessary to achieve sustainability, the approach to the management of information concerning these instruments is problematic. For example, management of information concerning obligations and restrictions in Queensland is fragmented, with some instruments registered or recorded on the land title register, some on external registers, and some information available only in the legislation itself. This approach is used in most Australian jurisdictions. This fragmented approach has led to two separate but interconnected problems. First, the Torrens system is no longer meeting its goal of providing a complete and accurate picture of title. Second, this uncoordinated approach to the management of land titles, and of obligations and restrictions on land use, has created a barrier to the sustainable management of natural resources, because compliance with environmental laws is impaired in the absence of easily accessible and accurate information. These problems demonstrate a clear need for reform in this area. To determine how information concerning these obligations and restrictions may be most effectively managed, this thesis will apply a comparative methodology and consider three case studies, each of which utilises a different model for the management of this information. These jurisdictions will be assessed according to a set of guidelines for comparison to identify which features of their systems provide for effective management of information concerning obligations and restrictions on title and use. Based on this comparison, this thesis will devise a series of recommendations for an effective system for the management of information concerning obligations and restrictions on land title and use, taking into account any potential legal issues and barriers to implementation. This series of recommendations for reform will be supplemented by suggested draft legislative provisions.
Abstract:
In November 2006, the Australian Research Council Centre of Excellence for Creative Industries and Innovation (CCi), in conjunction with the Queensland University of Technology, hosted the CCau Industry Forum, a research-focused industry engagement event. The event was run by the CCi ccClinic and CC + OCL Research projects, and aimed to evaluate understanding of, and attitudes towards, copyright, OCL and CC in Australia. The Forum focused on the government, education and creative industries sectors. Unlocking the Potential Through Creative Commons: An Industry Engagement and Action Agenda evaluates and responds to the outcomes of this Forum and presents a strategy for continued research into Creative Commons in Australia.
Abstract:
The thrust towards constructivist learning and critical thinking in the National Curricular Framework (2005) of India implies shifts in pedagogical practices. In this context, drawing on grounded theory, focus group interviews were conducted with 40 preservice teachers to ascertain the contextual situation and the likely outcomes of applying critical literacy across the curriculum. The central themes that emerged in the discussions were: teacher-centredness versus learner-centredness, and conformity versus autonomy in teaching and learning. The paper argues that within the present Indian context there is scope for changes to pedagogy and learning styles, but that these must be adequately contextualised.
Abstract:
A pragmatic method is proposed for assessing the accuracy and precision of a given processing pipeline for converting computed tomography (CT) image data of bones into representative three-dimensional (3D) models of bone shapes. The method is based on coprocessing a control object with known geometry, which enables the assessment of the quality of the resulting 3D models. At three stages of the conversion process, distance measurements were obtained and statistically evaluated. For this study, 31 CT datasets were processed. The final 3D model of the control object showed an average deviation from reference values of −1.07±0.52 mm standard deviation (SD) for edge distances and −0.647±0.43 mm SD for parallel side distances of the control object. Coprocessing a reference object enables the assessment of the accuracy and precision of a given processing pipeline for creating CT-based 3D bone models and is suitable for detecting most systematic or human errors when processing a CT scan. Typical errors are about the same size as the scan resolution.
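The accuracy/precision summary used here is simply the signed deviation of measured control-object distances from their known reference values, reported as mean ± SD. A minimal sketch with hypothetical measurements (not the study's data):

```python
# Illustrative mean-deviation +/- SD summary for a control-object distance.
import numpy as np

reference_mm = 25.00                      # hypothetical known edge length
measured_mm = np.array([23.8, 24.1, 23.9, 24.0, 23.7])  # hypothetical values

deviation = measured_mm - reference_mm    # signed error per measurement
print(f"mean deviation = {deviation.mean():.2f} mm "
      f"+/- {deviation.std(ddof=1):.2f} mm SD")
```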