78 results for exponential decay


Relevance: 10.00%

Abstract:

Phase-type distributions represent the time to absorption for a finite state Markov chain in continuous time, generalising the exponential distribution and providing a flexible and useful modelling tool. We present a new reversible jump Markov chain Monte Carlo scheme for performing a fully Bayesian analysis of the popular Coxian subclass of phase-type models; the convenient Coxian representation involves fewer parameters than a more general phase-type model. The key novelty of our approach is that we model covariate dependence in the mean whilst using the Coxian phase-type model as a very general residual distribution. Such incorporation of covariates into the model has not previously been attempted in the Bayesian literature. A further novelty is that we also propose a reversible jump scheme for investigating structural changes to the model brought about by the introduction of Erlang phases. Our approach addresses more questions of inference than previous Bayesian treatments of this model and is automatic in nature. We analyse an example dataset comprising lengths of hospital stays of a sample of patients collected from two Australian hospitals to produce a model for a patient's expected length of stay which incorporates the effects of several covariates. This leads to interesting conclusions about what contributes to length of hospital stay with implications for hospital planning. We compare our results with an alternative classical analysis of these data.
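
As an illustration of the Coxian subclass described above, the following is a minimal Python sketch (not from the paper): it simulates times to absorption for a Coxian phase-type distribution and evaluates the exact density via the matrix exponential of the subgenerator. The rates `lam`, continuation probabilities `p` and all numerical values are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

def simulate_coxian(lam, p, size, rng):
    """Draw times to absorption from a Coxian phase-type distribution.

    lam[i] : exponential sojourn rate of phase i
    p[i]   : probability of continuing to phase i+1 rather than absorbing
             (the last entry is ignored; the final phase always absorbs)
    """
    m = len(lam)
    times = np.empty(size)
    for k in range(size):
        t, i = 0.0, 0
        while True:
            t += rng.exponential(1.0 / lam[i])
            if i == m - 1 or rng.random() > p[i]:
                break              # absorption
            i += 1                 # continue to the next phase
        times[k] = t
    return times

def coxian_density(t, lam, p):
    """Exact density f(t) = alpha @ expm(S t) @ s0 for the same Coxian model."""
    m = len(lam)
    S = np.zeros((m, m))
    for i in range(m):
        S[i, i] = -lam[i]
        if i < m - 1:
            S[i, i + 1] = lam[i] * p[i]
    s0 = -S @ np.ones(m)           # exit rates into the absorbing state
    alpha = np.eye(m)[0]           # the chain starts in phase 1
    return float(alpha @ expm(S * t) @ s0)

rng = np.random.default_rng(0)
lam = np.array([1.5, 0.8, 0.4])    # illustrative sojourn rates
p = np.array([0.7, 0.5, 0.0])      # illustrative continuation probabilities
draws = simulate_coxian(lam, p, 20_000, rng)
print(f"simulated mean time to absorption: {draws.mean():.3f}")
print(f"density at t = 1.0: {coxian_density(1.0, lam, p):.3f}")
```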

Relevance: 10.00%

Abstract:

One of the oldest problems in philosophy concerns the relationship between free will and moral responsibility. If we adopt the position that we lack free will, in the absolute sense—as have most philosophers who have addressed this issue—how can we truly be held accountable for what we do? This paper will contend that the most significant and interesting challenge to the long-standing status-quo on the matter comes not from philosophy, jurisprudence, or even physics, but rather from psychology. By examining this debate through the lens of contemporary behaviour disorders, such as ADHD, it will be argued that notions of free will, along with its correlate, moral responsibility, are being eroded through the logic of psychology which is steadily reconfiguring large swathes of familiar human conduct as pathology. The intention of the paper is not only to raise some concerns over the exponential growth of behaviour disorders, but also, and more significantly, to flag the ongoing relevance of philosophy for prying open contemporary educational problems in new and interesting ways.

Relevance: 10.00%

Abstract:

The population Monte Carlo algorithm is an iterative importance sampling scheme for solving static problems. We examine the population Monte Carlo algorithm in a simplified setting, a single step of the general algorithm, and study a fundamental problem that occurs in applying importance sampling to high-dimensional problems. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. We demonstrate the exponential growth of the asymptotic variance with the dimension and show that the optimal covariance matrix for the importance function can be estimated in special cases.
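
The dimension effect described in this abstract can be reproduced with a toy calculation. In the sketch below (an illustration, not the paper's analysis) the target is a standard normal in d dimensions and the importance function is a normal with a slightly inflated, assumed scale `sigma`; the relative variance of the importance weights, which drives the asymptotic variance of the estimate, grows geometrically with d.

```python
import numpy as np

rng = np.random.default_rng(1)

def relative_weight_variance(d, sigma, n=200_000):
    """Relative variance of importance weights when the target is N(0, I_d)
    and the importance function is N(0, sigma^2 I_d). Theory gives
    (sigma^2 / sqrt(2 sigma^2 - 1))^d - 1, i.e. exponential growth in d."""
    x = sigma * rng.standard_normal((n, d))
    log_w = (-0.5 * np.sum(x ** 2, axis=1)
             + 0.5 * np.sum((x / sigma) ** 2, axis=1)
             + d * np.log(sigma))
    w = np.exp(log_w)
    return w.var() / w.mean() ** 2

sigma = 1.2                                          # assumed proposal scale
per_dim = sigma ** 2 / np.sqrt(2 * sigma ** 2 - 1)   # per-coordinate E[w^2]
for d in (1, 5, 10, 20, 40):
    print(f"d = {d:2d}: simulated {relative_weight_variance(d, sigma):7.2f}, "
          f"theory {per_dim ** d - 1:7.2f}")
```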

Relevance: 10.00%

Abstract:

Zoonotic infections are a growing threat to global health. Chlamydia pneumoniae is a major human pathogen that is widespread in human populations, causing acute respiratory disease, and has been associated with chronic disease. C. pneumoniae was first identified solely in human populations; however, its host range now includes other mammals, marsupials, amphibians, and reptiles. Australian koalas (Phascolarctos cinereus) are widely infected with two species of Chlamydia, C. pecorum and C. pneumoniae. Transmission of C. pneumoniae between animals and humans has not been reported; however, two other chlamydial species, C. psittaci and C. abortus, are known zoonotic pathogens. We have sequenced the 1,241,024-bp chromosome and a 7.5-kb cryptic chlamydial plasmid of the koala strain of C. pneumoniae (LPCoLN) using the whole-genome shotgun method. Comparative genomic analysis, including pseudogene and single-nucleotide polymorphism (SNP) distribution, and phylogenetic analysis of conserved genes and SNPs against the human isolates of C. pneumoniae show that the LPCoLN isolate is basal to human isolates. Thus, we propose based on compelling genomic and phylogenetic evidence that humans were originally infected zoonotically by an animal isolate(s) of C. pneumoniae which adapted to humans primarily through the processes of gene decay and plasmid loss, to the point where the animal reservoir is no longer required for transmission.

Relevance: 10.00%

Abstract:

This paper will investigate the suitability of existing performance measures under the assumption of a clearly defined benchmark. A range of measures is examined, including the Sortino Ratio, the Sharpe Selection Ratio (SSR), the Student's t-test and a decay rate measure. A simulation study is used to assess the power and bias of these measures based on variations in sample size and mean performance of two simulated funds. The Sortino Ratio is found to be the superior performance measure, exhibiting more power and less bias than the SSR when the distribution of excess returns is skewed.
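
As a rough illustration of the two headline measures (not the paper's simulation design), the sketch below computes a Sortino Ratio and a Sharpe Selection Ratio for a simulated fund whose excess returns over an assumed benchmark are positively skewed. The definitions used are the common textbook forms; the return series are purely illustrative.

```python
import numpy as np

def sortino_ratio(returns, target=0.0):
    """Mean excess return over the target divided by the downside deviation
    (root mean square of the shortfalls below the target)."""
    excess = np.asarray(returns) - target
    downside = np.sqrt(np.mean(np.minimum(excess, 0.0) ** 2))
    return excess.mean() / downside

def sharpe_selection_ratio(returns, benchmark):
    """Mean excess return over the benchmark divided by the standard deviation
    of those excess returns."""
    excess = np.asarray(returns) - np.asarray(benchmark)
    return excess.mean() / excess.std(ddof=1)

rng = np.random.default_rng(2)
benchmark = rng.normal(0.004, 0.02, size=120)          # illustrative monthly benchmark returns
fund = benchmark + rng.gumbel(0.001, 0.01, size=120)   # skewed excess returns for the fund
print(f"Sortino Ratio: {sortino_ratio(fund - benchmark):.2f}")
print(f"Sharpe Selection Ratio: {sharpe_selection_ratio(fund, benchmark):.2f}")
```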

Relevance: 10.00%

Abstract:

In situ near-IR transmittance measurements have been used to characterize the density of trapped electrons in dye-sensitized solar cells (DSCs). Measurements have been made under a range of experimental conditions, including during open-circuit photovoltage decay and during recording of the IV characteristic. The optical cross section of electrons at 940 nm was determined by relating the IR absorbance to the density of trapped electrons measured by charge extraction. The value, σn = 5.4 × 10⁻¹⁸ cm², was used to compare the trapped electron densities in illuminated DSCs under open- and short-circuit conditions in order to quantify the difference in the electron quasi-Fermi level, ΔnEF. It was found that ΔnEF for the cells studied was 250 meV over a wide range of illumination intensities. IR transmittance measurements have also been used to quantify shifts in conduction band energy associated with dye adsorption.
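
To make the reported quantities concrete, the sketch below converts a near-IR absorbance into a trapped-electron density using the quoted cross section and then estimates a quasi-Fermi-level difference. This is an illustrative reconstruction only: the decadic Beer-Lambert form, the exponential trap-distribution model with characteristic temperature `T0`, the film thickness and the absorbance values are assumptions, not the paper's data or method.

```python
import numpy as np

K_B = 8.617e-5      # Boltzmann constant, eV/K
SIGMA = 5.4e-18     # optical cross section of trapped electrons at 940 nm, cm^2 (from the abstract)

def trapped_density(absorbance, path_cm):
    """Decadic Beer-Lambert: A = sigma * n * d / ln(10), so n = A ln(10) / (sigma d)."""
    return absorbance * np.log(10.0) / (SIGMA * path_cm)

def quasi_fermi_shift(n_oc, n_sc, T0=1100.0):
    """Quasi-Fermi-level difference implied by the trapped-electron densities at open
    and short circuit, assuming an exponential trap distribution n_t ~ exp(E_F / (k_B T0));
    T0 is an assumed characteristic temperature."""
    return K_B * T0 * np.log(n_oc / n_sc)

# Illustrative numbers only (not the paper's measurements)
n_oc = trapped_density(absorbance=0.020, path_cm=1.0e-3)   # ~10 um mesoporous film
n_sc = trapped_density(absorbance=0.004, path_cm=1.0e-3)
print(f"n_oc = {n_oc:.2e} cm^-3, n_sc = {n_sc:.2e} cm^-3, "
      f"dEF = {1e3 * quasi_fermi_shift(n_oc, n_sc):.0f} meV")
```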

Relevance: 10.00%

Abstract:

Objective: To investigate the acute effects of isolated eccentric and concentric calf muscle exercise on Achilles tendon sagittal thickness.
Design: Within-subject, counterbalanced, mixed design.
Setting: Institutional.
Participants: 11 healthy, recreationally active male adults.
Interventions: Participants performed an exercise protocol involving isolated eccentric loading of the Achilles tendon of a single limb and isolated concentric loading of the contralateral limb, both with the addition of 20% bodyweight.
Main outcome measurements: Sagittal sonograms were acquired prior to, immediately following, and 3, 6, 12 and 24 h after exercise. Tendon thickness was measured 2 cm proximal to the superior aspect of the calcaneus.
Results: Both loading conditions resulted in an immediate decrease in normalised Achilles tendon thickness. Eccentric loading induced a significantly greater decrease than concentric loading despite a similar impulse (−0.21 vs −0.05, p<0.05). Post-exercise, eccentrically loaded tendons recovered exponentially, with a recovery time constant of 2.5 h. The same exponential function did not adequately model changes in tendon thickness resulting from concentric loading; even so, recovery pathways subsequent to the 3 h time point were comparable. Regardless of the exercise protocol, full tendon thickness recovery was not observed until 24 h.
Conclusions: Eccentric loading invokes a greater reduction in Achilles tendon thickness immediately after exercise, but the tendon appears to recover fully in a similar time frame to concentric loading.
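
The exponential recovery reported in the Results can be written as T(t) = T∞ − ΔT·exp(−t/τ). The short sketch below fits this model at the study's measurement times using SciPy; the thickness values are hypothetical (chosen only to be consistent with the reported −0.21 initial drop and 2.5 h time constant), not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, T_inf, dT, tau):
    """Mono-exponential recovery of normalised tendon thickness: T(t) = T_inf - dT * exp(-t/tau)."""
    return T_inf - dT * np.exp(-t / tau)

t = np.array([0.0, 3.0, 6.0, 12.0, 24.0])              # measurement times after exercise (h)
thickness = np.array([0.79, 0.93, 0.98, 0.99, 1.00])   # hypothetical normalised thickness values

params, _ = curve_fit(recovery, t, thickness, p0=(1.0, 0.2, 2.5))
T_inf, dT, tau = params
print(f"fitted recovery time constant tau = {tau:.1f} h")
```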

Relevance: 10.00%

Abstract:

This article examines the growing phenomenon of online dating and intimacy in the 21st century. The exponential rise of communications technologies, which is both reflective and constitutive of an increasingly networked and globalized society, has the potential to significantly influence the nature of intimacy in everyday life. Yet, to date, there has been a minimal response by sociologists seeking to describe and understand this influence. In this article, we present some of the key findings of our research on online dating in Australia, in order to foster a debate about the sociological impacts on intimacy in the postmodern world. Based on a web audit of more than 60 online dating sites and in-depth interviews with 23 users of online dating services, we argue that recent global trends are influencing the uptake of online technologies for the purposes of forming intimate relations. Further, some of the mediating effects of these technologies – in particular, hypercommunication – may have specific implications for the nature of intimacy in the global era.

Relevance: 10.00%

Abstract:

The technological environment in which contemporary small and medium-sized enterprises (SMEs) operate can only be described as dynamic. The exponential rate of technological change, characterised by perceived increases in the benefits associated with various technologies, shortening product life cycles and changing standards, provides for the SME a complex and challenging operational context. The primary aim of this research was to concentrate on those SMEs that had already adopted technology in order to identify their needs for the new mobile data technologies (MDT), the mobile Internet. The research design utilised a mixed approach whereby both qualitative and quantitative data were collected to address the question. Overall, the needs of these SMEs for MDT can be conceptualised into three areas where the technology will assist business practices: communication, eCommerce and security.

Relevance: 10.00%

Abstract:

The technological environment in which contemporary small and medium-sized enterprises (SMEs) operate can only be described as dynamic. The exponential rate of technological change, characterised by perceived increases in the benefits associated with various technologies, shortening product life cycles and changing standards, provides the SME with a complex and challenging operational context. The primary aim of this research was to identify the needs of SMEs in regional areas for mobile data technologies (MDT). In this study a distinction was drawn between those respondents who were full-adopters of technology, those who were partial-adopters and those who were non-adopters, and these three segments articulated different needs and requirements for MDT. Overall, the needs of regional SMEs for MDT can be conceptualised into three areas where the technology will assist business practices: communication, e-commerce and security.

Relevance: 10.00%

Abstract:

The seemingly exponential nature of technological change provides SMEs with a complex and challenging operational context. The development of infrastructures capable of supporting the wireless application protocol (WAP) and associated 'wireless' applications represents the latest generation of technological innovation, with potential appeal to SMEs and end-users alike. This paper aims to understand the mobile data technology needs of SMEs in a regional setting. The research was especially concerned with perceived needs across three market segments: non-adopters, partial-adopters and full-adopters of new technology. The research was exploratory in nature, as the phenomenon under scrutiny is relatively new and its uses unclear; thus focus groups were conducted with each of the segments. The paper provides insights for business, industry and academics.

Relevance: 10.00%

Abstract:

The technological environment in which contemporary small- and medium-sized enterprises (SMEs) operate can only be described as dynamic. The exponential rate of technological change, characterised by perceived increases in the benefits associated with various technologies, shortening product life cycles and changing standards, provides for the SME a complex and challenging operational context. The primary aim of this research was to identify the needs of SMEs in regional areas for mobile data technologies (MDT). In this study a distinction was drawn between those respondents who were full-adopters of technology, those who were partial-adopters, and those who were non-adopters, and these three segments articulated different needs and requirements for MDT. Overall, the needs of regional SMEs for MDT can be conceptualised into three areas where the technology will assist business practices: communication, e-commerce and security.

Relevance: 10.00%

Abstract:

The technological environment in which contemporary small and medium-sized enterprises (SMEs) operate can only be described as dynamic. The seemingly exponential nature of technological change, characterised by perceived increases in the benefits associated with various technologies, shortening product life cycles and changing standards, provides for the small and medium-sized enterprise a complex and challenging operational context. The development of infrastructures capable of supporting the Wireless Application Protocol (WAP) and associated 'wireless' applications represents the latest generation of technological innovation with potential appeal to SMEs and end-users alike. The primary aim of this research was to understand the mobile data technology needs of SMEs in a regional setting. The research was especially concerned with perceived needs across three market segments: non-adopters of new technology, partial-adopters of new technology and full-adopters of new technology. Working with an industry partner, focus groups were conducted with each of these segments, with the discussions focused on the use of the latest WAP products and services. Some of the results are presented in this paper.

Relevance: 10.00%

Abstract:

This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature, in that they are compositions of components whose individual properties may be easily described, but the performance of the model or algorithm as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms. The goal of this thesis is to analyse the computational aspects of hybrid models and hybrid algorithms in the Bayesian context.

The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate to both the quality of model fit to data and the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described and the model is compared to the normal case on the basis of goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures, which have not been adequately considered in the literature.

The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches will be considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions. For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as a set of examples of hybrid algorithms. The statistical literature often treats statistical efficiency as the only criterion for an efficient algorithm; in this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual algorithms contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced by the process of combining these components into a single algorithm.

The second approach to the computational aspects of hybrid algorithms involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, importance-sampling-based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general algorithm, and explores a fundamental problem that occurs in applying importance sampling to a high-dimensional problem. The precision of the computed estimate from the simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. Additionally, the exponential growth of the asymptotic variance with the dimension is demonstrated, and it is illustrated that the optimal covariance matrix for the importance function can be estimated in a special case.
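
To illustrate the parsimony argument for the t-mixture (a toy demonstration, not the thesis's MCMC analysis), the sketch below generates skewed, heavy-tailed data from a two-component Student-t mixture and shows how many components a normal mixture needs before its BIC stops improving. The data, component settings and use of scikit-learn are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)

# Skewed, heavy-tailed sample: an unbalanced two-component mixture of Student-t variates
data = np.concatenate([
    0.0 + 1.0 * rng.standard_t(3, size=700),
    6.0 + 2.0 * rng.standard_t(3, size=300),
]).reshape(-1, 1)

# A normal mixture typically needs extra components just to mimic the heavy tails,
# even though only two t components generated the data.
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(data)
    print(f"normal mixture with k = {k}: BIC = {gmm.bic(data):.0f}")
```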