892 results for Partial Credit Model
Abstract:
The β1AR has two binding sites that can be activated to cause cardiostimulation. The first, termed β1HAR (high-affinity site of the β1AR), is activated by noradrenaline and adrenaline and is blocked by relatively low concentrations of β-blockers, including carvedilol (Kaumann and Molenaar, 2008). The other, termed β1LAR (low-affinity site of the β1AR), has lower affinity for noradrenaline and adrenaline and is activated by some β-blockers, including CGP12177 and pindolol, at higher concentrations than those required to block the receptor (Kaumann and Molenaar, 2008). (-)-CGP12177 is a non-conventional partial agonist that causes modest and transient increases of contractile force in human atrial trabeculae (Kaumann and Molenaar, 2008). These effects are markedly increased and maintained by inhibition of phosphodiesterase PDE3. The stimulant effects of (-)-CGP12177 at human β1ARs were verified with recombinant receptors (Kaumann and Molenaar, 2008). However, a recent report proposed that the positive inotropic effects of CGP12177 are mediated through β3ARs in human right atrium (Skeberdis et al., 2008). This proposal is not consistent with the failure of the β3AR blocker LY748,337 (1 μM) to block the inotropic effects of (-)-CGP12177 or its increases in L-type Ca2+ current (ICa-L) (Christ et al., 2010). On the other hand, the increases in inotropic effects and ICa-L produced by (-)-CGP12177 were blocked by 1-10 μM (-)-bupranolol (Christ et al., 2010). Chronic infusion of (-)-CGP12177 (10 mg/kg per 24 hours) for four weeks in an aortic-constriction mouse model of heart failure caused an increase in left ventricular wall thickness, fibrosis and inflammation-related left ventricular gene expression levels.
Christ T et al. (2010) Br J Pharmacol, in press.
Kaumann A and Molenaar P (2008) Pharmacol Ther 118, 303-336.
Skeberdis VA et al. (2008) J Clin Invest 118, 3219-3227.
Abstract:
Given the present worldwide epidemic of obesity, it is pertinent to ask how effective exercise could be in helping people to lose weight or to prevent weight gain. There is a widely held belief that exercise is futile for weight reduction because any energy expended in exercise is automatically compensated for by a corresponding increase in energy intake (EI). On this view, exercise elevates the intensity of hunger and drives food consumption. This “commonsense” view appears to originate in an energy-balance model of appetite control, which stipulates that energy expended will drive EI as a consequence of the regulation of energy balance. However, it is very clear that EI (food consumption or eating) is not just a biological matter. Eating does not occur solely to rectify some internal need state. Indeed, an examination of the relation between exercise and appetite control has shown a very weak coupling; most studies have demonstrated that food intake does not immediately rise after exercise, even after very high energy expenditure (EE).[1] The processes of exercise-induced EE and food consumption do not appear to be tightly linked. After exercise, there is only slow and partial compensation for the energy expended. Therefore, exercise can be very useful in helping to bring about weight loss and is even more important in preventing weight gain or weight regain. This editorial explores this issue.
Abstract:
Osteoporotic spinal fractures are a major concern in ageing Western societies. This study develops a multi-scale finite element (FE) model of the osteoporotic lumbar vertebral body to study the mechanics of vertebral compression fracture at both the apparent (whole vertebral body) and micro-structural (internal trabecular bone core) levels. Model predictions were verified against experimental data and found to provide a reasonably good representation of the mechanics of the osteoporotic vertebral body. This novel modelling methodology will allow detailed investigation of how trabecular bone loss in osteoporosis affects vertebral stiffness and strength in the lumbar spine.
Abstract:
Purpose – In recent years, knowledge-based urban development (KBUD) has been introduced as a new strategic development approach for the regeneration of industrial cities. It aims to create a knowledge city consisting of planning strategies, IT networks and infrastructures, achieved through supporting the continuous creation, sharing, evaluation, renewal and updating of knowledge. Improving urban amenities and ecosystem services by creating a sustainable urban environment is one of the fundamental components of KBUD. In this context, environmental assessment plays an important role in steering the urban environment and economic development towards sustainability. The purpose of this paper is to present the role of assessment tools in the environmental decision-making process of knowledge cities. Design/methodology/approach – The paper proposes a new assessment tool, structured as a template for a decision support system, which enables evaluation of the possible environmental impacts in existing and future urban contexts. The paper presents the methodology of the proposed model, named ‘ASSURE’, which consists of four main phases. Originality/value – The proposed model provides useful guidance for evaluating urban development and its environmental impacts in order to achieve sustainable knowledge-based urban futures. Practical implications – The proposed model offers an innovative approach to securing the resilience and function of urban natural systems against environmental changes while maintaining the economic development of cities.
Abstract:
Uninhabited aerial vehicles (UAVs) are a cutting-edge technology that is at the forefront of aviation/aerospace research and development worldwide. Many consider their current military and defence applications as just a token of their enormous potential. Unlocking and fully exploiting this potential will see UAVs in a multitude of civilian applications and routinely operating alongside piloted aircraft. The key to realising the full potential of UAVs lies in addressing a host of regulatory, public relations, and technological challenges never encountered before. Aircraft collision avoidance is considered to be one of the most important issues to be addressed, given its safety-critical nature. The collision avoidance problem can be roughly organised into three areas: 1) Sense; 2) Detect; and 3) Avoid. Sensing is concerned with obtaining accurate and reliable information about other aircraft in the air; detection involves identifying potential collision threats based on available information; avoidance deals with the formulation and execution of appropriate manoeuvres to maintain safe separation. This thesis tackles the detection aspect of collision avoidance, via the development of a target detection algorithm that is capable of real-time operation onboard a UAV platform. One of the key challenges of the detection problem is the need to provide early warning. This translates to detecting potential threats whilst they are still far away, when their presence is likely to be obscured and hidden by noise. Another important consideration is the choice of sensors to capture target information, which has implications for the design and practical implementation of the detection algorithm. The main contributions of the thesis are: 1) the proposal of a dim target detection algorithm combining image morphology and hidden Markov model (HMM) filtering approaches; 2) the novel use of relative entropy rate (RER) concepts for HMM filter design; 3) the characterisation of algorithm detection performance based on simulated data as well as real in-flight target image data; and 4) the demonstration of the proposed algorithm's capacity for real-time target detection. We also consider the extension of HMM filtering techniques and the application of RER concepts for target heading angle estimation. In this thesis we propose a computer-vision based detection solution, due to the commercial-off-the-shelf (COTS) availability of camera hardware and the hardware's relatively low cost, power, and size requirements. The proposed target detection algorithm adopts a two-stage processing paradigm that begins with an image enhancement pre-processing stage followed by a track-before-detect (TBD) temporal processing stage that has been shown to be effective in dim target detection. We compare the performance of two candidate morphological filters for the image pre-processing stage, and propose a multiple hidden Markov model (MHMM) filter for the TBD temporal processing stage. The role of the morphological pre-processing stage is to exploit the spatial features of potential collision threats, while the MHMM filter serves to exploit the temporal characteristics or dynamics. The problem of optimising our proposed MHMM filter has been examined in detail. Our investigation has produced a novel design process for the MHMM filter that exploits information theory and entropy related concepts. The filter design process is posed as a mini-max optimisation problem based on a joint RER cost criterion.
We provide proof that this joint RER cost criterion provides a bound on the conditional mean estimate (CME) performance of our MHMM filter, and this in turn establishes a strong theoretical basis connecting our filter design process to filter performance. Through this connection we can intelligently compare and optimise candidate filter models at the design stage, rather than having to resort to time-consuming Monte Carlo simulations to gauge the relative performance of candidate designs. Moreover, the underlying entropy concepts are not constrained to any particular model type. This suggests that the RER concepts established here may be generalised to provide a useful design criterion for multiple model filtering approaches outside the class of HMM filters. In this thesis we also evaluate the performance of our proposed target detection algorithm under realistic operation conditions, and give consideration to the practical deployment of the detection algorithm onboard a UAV platform. Two fixed-wing UAVs were engaged to recreate various collision-course scenarios to capture highly realistic vision (from an onboard camera perspective) of the moments leading up to a collision. Based on this collected data, our proposed detection approach was able to detect targets out to distances ranging from about 400 m to 900 m. These distances (with some assumptions about closing speeds and aircraft trajectories) translate to an advanced warning ahead of impact that approaches the 12.5 second response time recommended for human pilots. Furthermore, readily available graphics processing unit (GPU) based hardware is exploited for its parallel computing capabilities to demonstrate the practical feasibility of the proposed target detection algorithm. A prototype hardware-in-the-loop system has been found to be capable of achieving data processing rates sufficient for real-time operation. There is also scope for further improvement in performance through code optimisations. Overall, our proposed image-based target detection algorithm offers UAVs a cost-effective real-time target detection capability that is a step forward in addressing the collision avoidance issue that is currently one of the most significant obstacles preventing widespread civilian applications of uninhabited aircraft. We also highlight that the algorithm development process has led to the discovery of a powerful multiple HMM filtering approach and a novel RER-based multiple filter design process. The utility of our multiple HMM filtering approach and RER concepts, however, extends beyond the target detection problem. This is demonstrated by our application of HMM filters and RER concepts to a heading angle estimation problem.
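The following is a minimal illustrative sketch, not the thesis implementation, of the two-stage "enhance then track-before-detect" idea described above: a morphological close-minus-open filter emphasises small, dim features, and a simple per-pixel two-state hidden Markov model accumulates evidence over frames. The parameters, the single-HMM simplification (the thesis proposes a multiple-HMM filter designed via RER), and the synthetic imagery are all assumptions for illustration only.

```python
# Sketch of morphological enhancement + per-pixel HMM track-before-detect.
# All probabilities, sizes and thresholds are illustrative assumptions.
import numpy as np
from scipy import ndimage


def enhance(frame, se_size=3):
    """Close-minus-open morphological filter to emphasise point-like targets."""
    closed = ndimage.grey_closing(frame, size=se_size)
    opened = ndimage.grey_opening(frame, size=se_size)
    return closed - opened


def hmm_tbd(frames, p_stay=0.95, thresh=0.9):
    """Per-pixel two-state HMM forward filter (state 0 = background, 1 = target)."""
    A = np.array([[p_stay, 1 - p_stay],          # assumed transition matrix
                  [1 - p_stay, p_stay]])
    belief = np.full(frames[0].shape + (2,), 0.5)  # uniform prior per pixel
    for frame in frames:
        e = enhance(frame.astype(float))
        e = (e - e.min()) / (e.max() - e.min() + 1e-9)  # normalise to [0, 1]
        like = np.stack([1.0 - e, e], axis=-1)          # crude emission likelihoods
        pred = belief @ A                               # predict step
        belief = pred * like                            # update step
        belief /= belief.sum(axis=-1, keepdims=True)
    return belief[..., 1] > thresh                      # detection mask


# Usage with synthetic frames containing one faint, stationary target in noise;
# with these toy settings some false alarms are to be expected.
rng = np.random.default_rng(0)
frames = []
for _ in range(20):
    img = rng.normal(0.0, 1.0, (64, 64))
    img[30, 20] += 2.5            # dim target (illustrative boost over noise)
    frames.append(img)
print("detections:", np.argwhere(hmm_tbd(frames)))
```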
Abstract:
Authorised users (insiders) are behind the majority of security incidents with high financial impacts. Because authorisation is the process of controlling users’ access to resources, improving authorisation techniques may mitigate the insider threat. Current approaches to authorisation suffer from the assumption that users will not (or cannot) depart from the expected behaviour implicit in the authorisation policy. In reality, however, users can and do depart from the canonical behaviour. This paper argues that the conflict of interest between insiders and authorisation mechanisms is analogous to the subset of problems formally studied in the field of game theory. It proposes a game-theoretic authorisation model that ensures a user’s potential misuse of a resource is explicitly considered when making an authorisation decision. The resulting authorisation model is dynamic in the sense that its access decisions vary according to changes in the explicit factors that influence the cost of misuse for both the authorisation mechanism and the insider.
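As a hedged sketch of the kind of cost-based, game-style decision the abstract describes (not the paper's actual model), the snippet below grants access only when the mechanism's expected payoff of granting exceeds that of denying, given an estimated probability of misuse. The probabilities, benefits and costs are hypothetical placeholders.

```python
# Cost-based authorisation decision: grant iff expected payoff of granting
# beats denying. All numeric inputs below are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class AccessRequest:
    p_misuse: float        # estimated probability the insider misuses the resource
    benefit_legit: float   # organisational benefit of legitimate use
    cost_misuse: float     # damage if the resource is misused
    cost_denial: float     # productivity cost of wrongly denying access


def authorise(req: AccessRequest) -> bool:
    expected_grant = ((1 - req.p_misuse) * req.benefit_legit
                      - req.p_misuse * req.cost_misuse)
    expected_deny = -(1 - req.p_misuse) * req.cost_denial
    return expected_grant > expected_deny


# The decision is dynamic: the same user may be denied once the estimated
# misuse probability (e.g. inferred from recent behaviour) rises.
print(authorise(AccessRequest(p_misuse=0.05, benefit_legit=10, cost_misuse=100, cost_denial=2)))  # True
print(authorise(AccessRequest(p_misuse=0.40, benefit_legit=10, cost_misuse=100, cost_denial=2)))  # False
```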
Abstract:
This paper presents the results of a structural equation model (SEM) for describing and quantifying the fundamental factors that affect contract disputes between owners and contractors in the construction industry. Through this example, the potential impact of SEM analysis in construction engineering and management research is illustrated. The purpose of the specific model developed in this research is to explain how and why contract-related construction problems occur. This study builds upon earlier work in which a disputes potential index was developed and the likelihood of construction disputes was modeled using logistic regression. In that earlier study, questionnaires completed for 159 construction projects measured both qualitative and quantitative aspects of contract disputes, management ability, financial planning, risk allocation, and project scope definition for both owners and contractors. The SEM approach offers several advantages over the previously employed logistic regression methodology. The final set of structural equations provides insight into the interaction of the variables that was not apparent in the original logistic regression modeling methodology.
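For readers unfamiliar with SEM, the sketch below shows how such a model could be specified in Python using the third-party semopy package and lavaan-style syntax. The latent factor, indicator names and synthetic data are illustrative assumptions, not the study's actual questionnaire items or measurement model.

```python
# Hypothetical SEM: three project-management indicators load on a latent
# "dispute potential" factor, which predicts observed dispute severity.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(1)
n = 159                                    # same sample size as the study
latent = rng.normal(size=n)                # unobserved dispute potential
df = pd.DataFrame({
    "mgmt_ability":     -0.7 * latent + rng.normal(scale=0.5, size=n),
    "risk_allocation":  -0.6 * latent + rng.normal(scale=0.5, size=n),
    "scope_definition": -0.5 * latent + rng.normal(scale=0.5, size=n),
    "dispute_severity":  0.8 * latent + rng.normal(scale=0.5, size=n),
})

desc = """
DisputePotential =~ mgmt_ability + risk_allocation + scope_definition
dispute_severity ~ DisputePotential
"""
model = semopy.Model(desc)
model.fit(df)
print(model.inspect())                     # loadings, path coefficient, variances
```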
Abstract:
There has been a worldwide trend to increase axle loads and train speeds. This means that railway track degradation will be accelerated, and track maintenance costs will increase significantly. There is a need to investigate the consequences of increasing traffic load. The aim of the research is to develop a model for the analysis of physical degradation of railway tracks in response to changes in traffic parameters, especially increased axle loads and train speeds. This research has developed an integrated track degradation model (ITDM) by integrating several models into a comprehensive framework. Mechanistic relationships for track degradation have been used wherever possible in each of the models contained in ITDM. This overcomes the deficiency of traditional statistical track models, which rely heavily on historical degradation data that is generally not available in many railway systems. In addition, statistical models lack the flexibility to incorporate future changes in traffic patterns or maintenance practices. The research starts with reviewing railway track related studies both in Australia and overseas to develop a comprehensive understanding of track performance under various traffic conditions. Existing railway related models are then examined for their suitability for track degradation analysis for Australian situations. The ITDM model is subsequently developed by modifying suitable existing models, and developing new models where necessary. The ITDM model contains four interrelated submodels for rails, sleepers, ballast and subgrade, and track modulus. The rail submodel is for rail wear analysis and is developed from a theoretical concept. The sleeper submodel is for timber sleeper damage prediction; it is developed by modifying and extending an existing model developed elsewhere, and also incorporates an analysis of the likelihood of concrete sleeper cracking. The ballast and subgrade submodel evolved from a concept developed in the USA, with substantial modifications and improvements. The track modulus submodel is developed from a conceptual method, with corrections for more global track conditions. The integration of these submodels into one comprehensive package has enabled the interaction between individual track components to be taken into account. This is done by calculating the wheel load distribution over time and updating track conditions periodically in the process of track degradation simulation. A Windows-based computer program associated with ITDM has also been developed. The program enables the user to carry out analysis of degradation of individual track components and to investigate the interrelationships between these track components and their deterioration. The successful implementation of this research has provided essential information for prediction of increased maintenance as a consequence of railway track degradation. The model, having been presented at various conferences and seminars, has attracted wide interest. It is anticipated that the model will be put into practical use among Australian railways, enabling track maintenance planning to be optimised and potentially saving Australian railway systems millions of dollars in operating costs.
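The following is a minimal structural sketch, with invented placeholder relationships, of how an integrated degradation model of this kind can couple its submodels: each period distributes the wheel load, lets each submodel degrade its component, then updates track modulus so the next period sees the changed track condition. None of the rates or formulas are from ITDM; they only illustrate the feedback structure.

```python
# Toy coupled track-degradation loop with four interacting submodels.
# All coefficients and update rules are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class TrackState:
    rail_wear_mm: float = 0.0
    sleeper_damage: float = 0.0      # 0 (new) .. 1 (failed)
    ballast_settlement_mm: float = 0.0
    track_modulus: float = 30.0      # assumed initial stiffness value


def simulate(axle_load_t: float, speed_kmh: float, mgt_per_period: float, periods: int) -> TrackState:
    s = TrackState()
    for _ in range(periods):
        # Wheel load distribution: speed and current stiffness modify the load (toy rule).
        dynamic_factor = 1.0 + 0.005 * speed_kmh
        effective_load = axle_load_t * dynamic_factor * (s.track_modulus / 30.0) ** 0.2

        # Submodel updates (placeholder mechanistic-style relationships).
        s.rail_wear_mm += 1e-3 * effective_load * mgt_per_period
        s.sleeper_damage = min(1.0, s.sleeper_damage + 1e-4 * effective_load * mgt_per_period)
        s.ballast_settlement_mm += 0.02 * effective_load * mgt_per_period ** 0.5

        # Interaction: settlement and sleeper damage soften the track for the next period.
        s.track_modulus = 30.0 / (1.0 + 0.01 * s.ballast_settlement_mm + 0.5 * s.sleeper_damage)
    return s


# Compare two traffic scenarios, e.g. a heavier axle load at the same speed.
print(simulate(axle_load_t=25, speed_kmh=80, mgt_per_period=5, periods=12))
print(simulate(axle_load_t=30, speed_kmh=80, mgt_per_period=5, periods=12))
```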
Abstract:
Speaker verification is the process of verifying the identity of a person by analysing their speech. There are several important applications for automatic speaker verification (ASV) technology, including suspect identification, tracking terrorists and detecting a person’s presence at a remote location in the surveillance domain, as well as person authentication for phone banking and credit card transactions in the private sector. Telephones and telephony networks provide a natural medium for these applications. The aim of this work is to improve the usefulness of ASV technology for practical applications in the presence of adverse conditions. In a telephony environment, background noise, handset mismatch, channel distortions, room acoustics and restrictions on the available testing and training data are common sources of errors for ASV systems. Two research themes were pursued to overcome these adverse conditions: modelling mismatch and modelling uncertainty. To address the performance degradation incurred through mismatched conditions, it was proposed to model this mismatch directly. Feature mapping was evaluated for combating handset mismatch and was extended through the use of a blind clustering algorithm to remove the need for accurate handset labels for the training data. Mismatch modelling was then generalised by explicitly modelling the session conditions as a constrained offset of the speaker model means. This session variability modelling approach enabled the modelling of arbitrary sources of mismatch, including handset type, and halved the error rates in many cases. Methods to model the uncertainty in speaker model estimates and verification scores were developed to address the difficulties of limited training and testing data. The Bayes factor was introduced to account for the uncertainty of the speaker model estimates in testing by applying Bayesian theory to the verification criterion, with improved performance in matched conditions. Modelling the uncertainty in the verification score itself met with significant success. Estimating a confidence interval for the "true" verification score enabled an order of magnitude reduction in the average quantity of speech required to make a confident verification decision based on a threshold. The confidence measures developed in this work may also have significant applications for forensic speaker verification tasks.
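A hedged sketch of the "confidence interval on the verification score" idea follows: per-frame log-likelihood-ratio scores are accumulated sequentially, a confidence interval for the mean score is computed, and a decision is made as soon as the whole interval lies on one side of the threshold. The score stream, interval estimator and threshold below are illustrative assumptions, not the thesis's method.

```python
# Early verification decision from a running confidence interval on the mean
# per-frame score. All numeric settings are illustrative assumptions.
import math
import numpy as np


def early_decision(frame_scores, threshold=0.0, z=2.576, min_frames=50):
    """Return (decision, frames_used); decision is None if never confident."""
    running = []
    for i, s in enumerate(frame_scores, start=1):
        running.append(s)
        if i < min_frames:
            continue
        mean = np.mean(running)
        half_width = z * np.std(running, ddof=1) / math.sqrt(i)
        if mean - half_width > threshold:
            return True, i           # confidently accept
        if mean + half_width < threshold:
            return False, i          # confidently reject
    return None, len(running)        # undecided: use all available speech


# Synthetic target trial: frame scores drawn above the threshold on average,
# so a confident accept typically arrives long before all frames are used.
rng = np.random.default_rng(2)
scores = rng.normal(loc=0.3, scale=1.0, size=5000)
print(early_decision(scores))
```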