137 results for exponential sums
Abstract:
Extensive research has highlighted the positive and exponential relationship between vehicle speed and crash risk and severity. Speed enforcement policies and practices throughout the world have developed dramatically as new technology becomes available; however, speeding remains a pervasive problem internationally that significantly contributes to road trauma. This paper adopted a three-pronged approach to review speed enforcement policies and practices by: (i) describing and comparing policies and practices adopted in a cross-section of international jurisdictions; (ii) reviewing the available empirical evidence evaluating the effectiveness of various approaches; and (iii) providing recommendations for the optimisation of speed enforcement. The review shows that the enforcement strategies adopted in various countries differ both in terms of the approaches used and how they are specifically applied. The literature review suggests strong and consistent evidence that police speed enforcement, in particular speed cameras, can be an effective tool for reducing vehicle speeds and subsequent traffic crashes. Drawing from this evidence, recommendations for best practice are proposed, including the specific instances in which various speed enforcement approaches typically produce the greatest road safety benefits and, perhaps most importantly, that speed enforcement programs must utilise a variety of strategies tailored to specific situations rather than a one-size-fits-all approach.
Abstract:
This paper presents a novel framework for the modelling of passenger facilitation in a complex environment. The research is motivated by the challenges in the airport complex system, where there are multiple stakeholders, differing operational objectives and complex interactions and interdependencies between different parts of the airport system. Traditional methods for airport terminal modelling do not explicitly address the need for understanding causal relationships in a dynamic environment. Additionally, existing Bayesian Network (BN) models, which provide a means for capturing causal relationships, only present a static snapshot of a system. A method to integrate a BN complex systems model with stochastic queuing theory is developed based on the properties of the Poisson and exponential distributions. The resultant Hybrid Queue-based Bayesian Network (HQBN) framework enables the simulation of arbitrary factors, their relationships, and their effects on passenger flow and vice versa. A case study implementation of the framework is demonstrated on the inbound passenger facilitation process at Brisbane International Airport. The predicted outputs of the model, in terms of cumulative passenger flow at intermediate and end points in the inbound process, are found to have an R² goodness of fit of 0.9994 and 0.9982 respectively over a 10 h test period. The utility of the framework is demonstrated on a number of usage scenarios including causal analysis and ‘what-if’ analysis. This framework provides the ability to analyse and simulate a dynamic complex system, and can be applied to other socio-technical systems such as hospitals.
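The HQBN framework rests on the standard queueing building block of Poisson arrivals and exponential service times. As a minimal illustration of that building block only (not the authors' implementation, and with all rates chosen arbitrarily), the following Python sketch simulates waiting times in an M/M/1 queue via the Lindley recursion and checks the simulated mean wait against the closed-form value.

```python
import random

def mm1_waiting_times(arrival_rate, service_rate, n_customers, seed=0):
    """Waiting times in an M/M/1 queue via the Lindley recursion:
    Poisson arrivals (exponential inter-arrival gaps) and exponential
    service times, the two distributions the HQBN framework builds on."""
    rng = random.Random(seed)
    waits, w = [], 0.0
    for _ in range(n_customers):
        waits.append(w)
        service = rng.expovariate(service_rate)   # exponential service time
        gap = rng.expovariate(arrival_rate)       # exponential gap to next arrival
        w = max(0.0, w + service - gap)           # Lindley recursion
    return waits

if __name__ == "__main__":
    lam, mu = 0.8, 1.0                            # illustrative rates only
    waits = mm1_waiting_times(lam, mu, 200_000)
    print(f"simulated mean wait: {sum(waits) / len(waits):.2f}")
    print(f"M/M/1 theory rho/(mu-lam): {(lam / mu) / (mu - lam):.2f}")
```

In a hybrid model of the kind the abstract describes, queue quantities like these would presumably interact with Bayesian Network factors; the sketch covers only the queueing side.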
Abstract:
Background The benign reputation of Plasmodium vivax is at odds with the burden and severity of the disease. This reputation, combined with restricted in vitro techniques, has slowed efforts to gain an understanding of the parasite biology and its interaction with the human host. Methods A simulation model of the within-host dynamics of P. vivax infection is described, incorporating distinctive characteristics of the parasite such as the preferential invasion of reticulocytes and hypnozoite production. The developed model is fitted using digitized time series from historic neurosyphilis studies, and subsequently validated against summary statistics from a larger study of the same population. The Chesson relapse pattern was used to demonstrate the impact of released hypnozoites. Results The typical pattern for the dynamics of the parasite population is a rapid exponential increase in the first 10 days, followed by a gradual decline. Gametocyte counts follow a similar trend, but are approximately two orders of magnitude lower. The model predicts that, on average, an infected naïve host in the absence of treatment becomes infectious 7.9 days post patency and is infectious for a mean of 34.4 days. In the absence of treatment, the effect of hypnozoite release was not apparent, as newly released parasites were obscured by the existing infection. Conclusions The results from the model provide useful insights into the dynamics of P. vivax infection in human hosts, in particular the timing of host infectiousness and the role of the hypnozoite in perpetuating infection.
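As a purely illustrative companion to the dynamics summarised above (all parameter values are invented and are not the fitted values from the paper), a toy two-phase curve reproduces the qualitative pattern of exponential growth over roughly the first ten days followed by a gradual decline, with gametocytes tracking the asexual population about two orders of magnitude lower.

```python
import math

def toy_parasite_density(day, p0=10.0, growth_rate=0.8, peak_day=10.0, decay_rate=0.1):
    """Toy two-phase curve echoing the dynamics described in the abstract:
    exponential growth for roughly the first ten days, then a gradual
    exponential decline.  All parameter values are illustrative only."""
    if day <= peak_day:
        return p0 * math.exp(growth_rate * day)
    peak = p0 * math.exp(growth_rate * peak_day)
    return peak * math.exp(-decay_rate * (day - peak_day))

if __name__ == "__main__":
    for day in (0, 5, 10, 20, 40):
        asexual = toy_parasite_density(day)
        gametocytes = asexual / 100.0    # roughly two orders of magnitude lower
        print(f"day {day:2d}: asexual {asexual:12.1f}   gametocytes {gametocytes:10.1f}")
```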
Abstract:
UAVs could one day save the lives of lost civilians and those sent to find them, and a competition in outback Australia is proving how soon that day might come. We have all seen news stories of people who ventured beyond the day-to-day reach of the community and got lost: search parties are formed, aircraft drafted in, and often large sums of money expended in the quest to find them.
Abstract:
Most standard algorithms for prediction with expert advice depend on a parameter called the learning rate. This learning rate needs to be large enough to fit the data well, but small enough to prevent overfitting. For the exponential weights algorithm, a sequence of prior work has established theoretical guarantees for higher and higher data-dependent tunings of the learning rate, which allow for increasingly aggressive learning. But in practice such theoretical tunings often still perform worse (as measured by their regret) than ad hoc tuning with an even higher learning rate. To close the gap between theory and practice we introduce an approach to learn the learning rate. Up to a factor that is at most (poly)logarithmic in the number of experts and the inverse of the learning rate, our method performs as well as if we knew the empirically best learning rate from a large range that includes both conservative small values and values much higher than those for which formal guarantees were previously available. Our method employs a grid of learning rates, yet runs in linear time regardless of the size of the grid.
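For readers unfamiliar with the baseline the abstract refers to, the following sketch implements the standard exponential weights (Hedge) forecaster with a fixed learning rate, to show where that parameter enters and how regret varies with it. This is not the paper's grid-based method for learning the learning rate, and the experts and losses are synthetic.

```python
import math
import random

def exponential_weights(expert_losses, learning_rate):
    """Fixed-learning-rate exponential weights (Hedge) forecaster.

    expert_losses: one list of per-expert losses in [0, 1] for each round.
    Returns (cumulative forecaster loss, cumulative loss of the best expert).
    """
    n_experts = len(expert_losses[0])
    log_weights = [0.0] * n_experts
    algo_loss = 0.0
    cum_expert = [0.0] * n_experts
    for losses in expert_losses:
        # turn log-weights into a probability distribution over experts
        m = max(log_weights)
        w = [math.exp(lw - m) for lw in log_weights]
        total = sum(w)
        probs = [x / total for x in w]
        # the forecaster suffers the expected loss under its weights
        algo_loss += sum(p * l for p, l in zip(probs, losses))
        # exponential update: a larger learning rate reacts more aggressively
        for i, l in enumerate(losses):
            log_weights[i] -= learning_rate * l
            cum_expert[i] += l
    return algo_loss, min(cum_expert)

if __name__ == "__main__":
    rng = random.Random(1)
    # two synthetic experts: one noisy, one consistently better
    rounds = [[rng.random(), 0.3 + 0.1 * rng.random()] for _ in range(1000)]
    for eta in (0.01, 0.1, 1.0):
        algo, best = exponential_weights(rounds, eta)
        print(f"learning rate {eta:5.2f}: regret {algo - best:7.2f}")
```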
Abstract:
The fractional Fokker-Planck equation is an important physical model for simulating anomalous diffusion with external forces. Because of the non-local property of the fractional derivative, an interesting problem is to explore high-accuracy numerical methods for fractional differential equations. In this paper, a space-time spectral method is presented for the numerical solution of the time fractional Fokker-Planck initial-boundary value problem. The proposed method employs Jacobi polynomials for the temporal discretization and Fourier-like basis functions for the spatial discretization. Because the Fourier-like basis functions are diagonalizable, the inner product in the Galerkin analysis admits a reduced representation. We prove that, with the present method, the time fractional Fokker-Planck equation attains the same approximation order as the time fractional diffusion equation treated in [23]. This indicates that exponential decay of the error may be achieved if the exact solution is sufficiently smooth. Finally, some numerical results are given to demonstrate the high-order accuracy and efficiency of the new numerical scheme. The results show that the errors of the numerical solutions obtained by the space-time spectral method decay exponentially.
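The exponential error decay claimed for smooth solutions is the usual hallmark of spectral methods. The sketch below illustrates that behaviour on a generic analytic function using Chebyshev interpolation from numpy; it is not the paper's Jacobi/Fourier-like discretisation of the fractional Fokker-Planck equation, just a reminder of what "errors decay exponentially" looks like in practice.

```python
import numpy as np

def chebyshev_interp_error(f, degree):
    """Maximum error of degree-n Chebyshev interpolation of f on [-1, 1].

    For analytic f the error decays exponentially in the degree, which is
    the 'spectral accuracy' referred to above."""
    k = np.arange(degree + 1)
    nodes = np.cos((2 * k + 1) * np.pi / (2 * (degree + 1)))   # Chebyshev points
    coeffs = np.polynomial.chebyshev.chebfit(nodes, f(nodes), degree)
    x = np.linspace(-1.0, 1.0, 2001)
    return np.max(np.abs(f(x) - np.polynomial.chebyshev.chebval(x, coeffs)))

if __name__ == "__main__":
    f = lambda x: np.exp(np.sin(3 * x))      # a generic smooth (analytic) function
    for n in (4, 8, 16, 32):
        print(f"degree {n:3d}: max interpolation error {chebyshev_interp_error(f, n):.3e}")
```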
Abstract:
As society matures, there is increasing pressure to preserve historic buildings. The economic cost of maintaining these important heritage legacies has become a prime consideration for every state. Dedicated intelligent monitoring systems supplementing traditional building inspections will enable stakeholders not only to carry out timely reactive responses but also to plan maintenance in a more vigilant manner, thus preventing further degradation, which is very costly and difficult to address if neglected. The application of intelligent structural health monitoring systems to these case studies of ‘modern heritage’ buildings is in its infancy, but it is an innovative approach to building maintenance. ‘Modern heritage’ buildings are the product of technological change and are made of synthetic materials such as reinforced concrete and steel, an architectural type that is very common in Oceania and the Pacific. Engineering problems arising from this type of building call for immediate engineering solutions, since the deterioration rate is exponential. The application of this newly emerging monitoring system will improve the traditional maintenance approach to heritage conservation. Savings in time and resources can be achieved if pathological results are on hand. This case study will validate that approach. This publication serves as a position paper for on-going research on the application of Structural Health Monitoring (SHM) systems to heritage buildings in Brisbane, Australia. The application of SHM systems and devices will be investigated to validate the integrity of the recent structural restoration of the newly re-strengthened heritage building, Brisbane City Hall.
Abstract:
Advancements in sleep medicine have been escalating ever since research began appearing in the 1950s. As with most early clinical trials, women were excluded from participation. Even if researchers included women or addressed sex differences by age, reproductive stage was seldom considered. Recently, there has been an exponential increase in research on sleep in midlife and older women. This Practice Pearl briefly reviews the importance of adequate sleep, clinical assessment for sleep disorders, and guidelines for practice.
Abstract:
Introduction A pedagogical relationship - the relationship produced through teaching and learning - is, according to phenomenologist Max van Maanen, ‘the most profound relationship an adult can have with a child’ (van Maanen 1982). But what does it mean for a teacher to have a ‘profound’ relationship with a student in digital times? What, indeed, is an optimal pedagogical relationship at a time when the exponential proliferation and transformation of information across the globe is making for unprecedented social and cultural change? Does it involve both parties in a Facebook friendship? Being snappy with Snapchat? Tumbling around on Tumblr? There is now ample evidence of a growing trend to displace face-to-face interaction by virtual connections. One effect of these technologically mediated relationships is that a growing number of young people experience relationships as ‘mile-wide, inch-deep’ phenomena. It is timely, in this context, to explore how pedagogical relationships are being transmuted by Big Data, and to ask about the implications this has for current and future generations of professional educators.
Abstract:
The world is facing an energy crisis due to exponential population growth and the limited availability of fossil fuels. Carbon, one of the most abundant materials found on Earth, and its allotrope forms have been proposed in this project for novel energy generation and storage devices. This study investigated the synthesis and properties of these carbon nanomaterials for applications in organic solar cells and supercapacitors.
Abstract:
The access to mobile technologies is growing at an exponential rate in developed and developing countries, with some developing countries surpassing developed countries in terms of device ownership. It is both the demand for, and the high usage of, mobile technologies that have driven new and emerging pedagogical practices in higher education. These technologies have also exponentially increased access to information in a knowledge economy. While differences are often drawn between developing and developed countries in terms of the access and use of information and communication technologies (ICT), this paper will report on a study detailing how higher education students use mobile technologies and social media in their studies and in their personal lives. It will compare the similarities in how students from an Australian and a Vietnamese university access and use mobile and social media technologies while also highlighting ways in which these technologies can be embraced by academics to connect and engage with students.
Abstract:
High-angular resolution diffusion imaging (HARDI) can reconstruct fiber pathways in the brain with extraordinary detail, identifying anatomical features and connections not seen with conventional MRI. HARDI overcomes several limitations of standard diffusion tensor imaging, which fails to model diffusion correctly in regions where fibers cross or mix. As HARDI can accurately resolve sharp signal peaks in angular space where fibers cross, we studied how many gradients are required in practice to compute accurate orientation density functions, to better understand the tradeoff between longer scanning times and greater angular precision. We computed orientation density functions analytically from tensor distribution functions (TDFs), which model the HARDI signal at each point as a unit-mass probability density on the 6D manifold of symmetric positive definite tensors. In simulated two-fiber systems with varying Rician noise, we assessed how many diffusion-sensitized gradients were sufficient to (1) accurately resolve the diffusion profile, and (2) measure the exponential isotropy (EI), a TDF-derived measure of fiber integrity that exploits the full multidirectional HARDI signal. At lower SNR, the reconstruction accuracy, measured using the Kullback-Leibler divergence, rapidly increased with additional gradients, and EI estimation accuracy plateaued at around 70 gradients.
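The reconstruction accuracy mentioned above is scored with the Kullback-Leibler divergence between orientation density functions. A minimal sketch of that generic measure, applied to a made-up two-fibre angular profile and a noisy copy of it, is shown below; the tensor distribution function machinery itself is not reproduced here.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two densities sampled
    on the same grid (e.g. orientation density functions over angular bins)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()                      # normalise to probability mass
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

if __name__ == "__main__":
    theta = np.linspace(0.0, np.pi, 180, endpoint=False)
    # made-up two-fibre angular profile with peaks 90 degrees apart
    truth = np.exp(4 * np.cos(2 * theta)) + np.exp(4 * np.cos(2 * (theta - np.pi / 2)))
    noisy = truth + 0.2 * np.abs(np.random.default_rng(0).normal(size=theta.size))
    print(f"KL(truth || noisy copy) = {kl_divergence(truth, noisy):.4f}")
```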
Abstract:
The concept of big data has already outperformed traditional data management efforts in almost all industries. In other instances it has succeeded in obtaining promising results that provide value from the large-scale integration and analysis of heterogeneous data sources, for example genomic and proteomic information. Big data analytics has become increasingly important for describing data sets and analytical techniques in software applications that are very large and complex, owing to its significant advantages including better business decisions, cost reduction and the delivery of new products and services [1]. In a similar context, the health community has experienced not only more complex and larger data content, but also information systems that contain a large number of data sources with interrelated and interconnected data attributes. This has resulted in challenging and highly dynamic environments, leading to the creation of big data with its innumerable complexities, for instance the sharing of information with the expected security requirements of stakeholders. When comparing big data analysis with other sectors, the health sector is still in its early stages. Key challenges include accommodating the volume, velocity and variety of healthcare data in the current deluge of exponential growth. Given the complexity of big data, it is understood that while data storage and accessibility are technically manageable, the implementation of Information Accountability measures for healthcare big data might be a practical solution in support of information security, privacy and traceability measures. Transparency is one important measure that can demonstrate integrity, which is a vital factor in healthcare services. Clarity about performance expectations is considered to be another Information Accountability measure, which is necessary to avoid data ambiguity and controversy about interpretation and, finally, liability [2]. According to current studies [3], Electronic Health Records (EHRs) are key information resources for big data analysis and are also composed of varied co-created values. Common healthcare information originates from and is used by different actors and groups, which facilitates understanding of its relationship to other data sources. Consequently, healthcare services often serve as an integrated service bundle. Although this is a critical requirement in healthcare services and analytics, it is difficult to find a comprehensive set of guidelines for adopting EHRs to fulfil big data analysis requirements. Therefore, as a remedy, this research work focuses on a systematic approach containing comprehensive guidelines on the accurate data that must be provided to apply and evaluate big data analysis until the necessary decision-making requirements are fulfilled to improve the quality of healthcare services. Hence, we believe that this approach would subsequently improve quality of life.
Abstract:
The measurement of radon (²²²Rn) activity flux using activated charcoal canisters was examined to investigate the distribution of the adsorbed ²²²Rn in the charcoal bed and the relationship between ²²²Rn activity flux and exposure time. The activity flux of ²²²Rn from five sources of varying strengths was measured for exposure times of one, two, three, five, seven, 10, and 14 days. The distribution of the adsorbed ²²²Rn in the charcoal bed was obtained by dividing the bed into six layers and counting each layer separately after the exposure. ²²²Rn activity decreased in the layers further away from the exposed surface. Nevertheless, the results demonstrated that only a small correction might be required in the actual application of charcoal canisters for activity flux measurement, where calibration standards were often prepared by the uniform mixing of radium (²²⁶Ra) in the matrix. This was because the diffusion of ²²²Rn in the charcoal bed and the detection efficiency as a function of the charcoal depth tended to counterbalance each other. The influence of exposure time on the measured ²²²Rn activity flux was observed in two canister exposure layouts: (a) a canister sealed to an open bed of the material and (b) a canister sealed over a jar containing the material. The measured ²²²Rn activity flux decreased as the exposure time increased. The change in the former layout was significant, with an exponential decrease as the exposure time increased. In the latter case, a smaller reduction in the observed activity flux with exposure time was noticed. This reduction might have been related to certain factors, such as adsorption site saturation or the back diffusion of ²²²Rn gas occurring at the canister-soil interface.
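The exponential decrease of measured flux with exposure time reported above is the kind of relationship one would typically quantify with a simple exponential fit. The sketch below shows such a fit with scipy's curve_fit on synthetic flux values generated only to demonstrate the procedure; the exposure times match those listed in the abstract, but the flux numbers are not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, f0, k):
    """Simple exponential model: flux(t) = f0 * exp(-k * t)."""
    return f0 * np.exp(-k * t)

if __name__ == "__main__":
    # exposure times (days) as listed in the abstract; flux values are synthetic
    days = np.array([1, 2, 3, 5, 7, 10, 14], dtype=float)
    rng = np.random.default_rng(0)
    flux = exp_decay(days, 50.0, 0.08) * (1.0 + 0.03 * rng.normal(size=days.size))
    (f0, k), _ = curve_fit(exp_decay, days, flux, p0=(flux[0], 0.1))
    print(f"fitted initial flux {f0:.1f}, decay constant {k:.3f} per day")
```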
Abstract:
Based on protein molecular dynamics, we investigate the fractal properties of energy, pressure and volume time series using multifractal detrended fluctuation analysis (MF-DFA) and the topological and fractal properties of their converted horizontal visibility graphs (HVGs). The energy parameters of protein dynamics we considered are bonded potential, angle potential, dihedral potential, improper potential, kinetic energy, Van der Waals potential, electrostatic potential, total energy and potential energy. The shape of the h(q) curves from MF-DFA indicates that these time series are multifractal. The numerical values of the exponent h(2) of MF-DFA show that the series of total energy and potential energy are non-stationary and anti-persistent; the other time series are stationary and persistent, apart from the series of pressure (with H ≈ 0.5, indicating the absence of long-range correlation). The degree distributions of their converted HVGs show that these networks are exponential. The results of fractal analysis show that fractality exists in these converted HVGs. For each energy, pressure or volume parameter, it is found that the value of h(2) from MF-DFA on the time series, the exponent λ of the exponential degree distribution and the fractal dimension d_B of the converted HVGs do not change much across different proteins (indicating some universality). We also found that, after taking the average over all proteins, there is a linear relationship between ⟨h(2)⟩ (from MF-DFA on the time series) and ⟨d_B⟩ of the converted HVGs across the different energy, pressure and volume parameters.
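The horizontal visibility graph construction referred to above has a simple definition: two time points are linked when every intermediate value lies below both of them. The following sketch builds an HVG for an i.i.d. series and compares its degree distribution with the known exponential law P(k) = (1/3)(2/3)^(k-2) for uncorrelated noise; it is a generic illustration of the HVG/degree-distribution step, not the MF-DFA analysis or the protein data from the paper.

```python
import math
import random
from collections import Counter

def horizontal_visibility_graph(series):
    """Edges of the horizontal visibility graph of a time series:
    points i < j are linked iff every intermediate value lies strictly
    below min(series[i], series[j])."""
    edges = set()
    n = len(series)
    for i in range(n - 1):
        highest_between = -math.inf
        for j in range(i + 1, n):
            if highest_between < min(series[i], series[j]):
                edges.add((i, j))
            highest_between = max(highest_between, series[j])
            if highest_between >= series[i]:
                break            # nothing further right can see node i
    return edges

def degree_histogram(edges):
    """Counts of node degrees, as a Counter {degree: number of nodes}."""
    deg = Counter()
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return Counter(deg.values())

if __name__ == "__main__":
    rng = random.Random(42)
    series = [rng.random() for _ in range(5000)]      # i.i.d. noise
    hist = degree_histogram(horizontal_visibility_graph(series))
    total = sum(hist.values())
    # for uncorrelated noise the HVG degree distribution is exponential:
    # P(k) = (1/3) * (2/3) ** (k - 2) for k >= 2
    for k in range(2, 9):
        print(f"k={k}: empirical {hist.get(k, 0) / total:.4f}"
              f"   theory {(1 / 3) * (2 / 3) ** (k - 2):.4f}")
```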