939 results for Bayesian belief network
Abstract:
In this paper it is demonstrated how the Bayesian parametric bootstrap can be adapted to models with intractable likelihoods. The approach is most appealing when the semi-automatic approximate Bayesian computation (ABC) summary statistics are selected. After a pilot run of ABC, the likelihood-free parametric bootstrap approach requires very few model simulations to produce an approximate posterior, which can be a useful approximation in its own right. An alternative is to use this approximation as a proposal distribution in ABC algorithms to make them more efficient. In this paper, the parametric bootstrap approximation is used to form the initial importance distribution for the sequential Monte Carlo and the ABC importance and rejection sampling algorithms. The new approach is illustrated through a simulation study of the univariate g-and-k quantile distribution, and is used to infer parameter values of a stochastic model describing expanding melanoma cell colonies.
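As a hedged illustration of the likelihood-free setting described above, the following Python sketch runs plain ABC rejection sampling for the univariate g-and-k distribution, which is defined through its quantile function. The uniform priors, octile summary statistics and acceptance rule are assumptions for illustration, not the paper's parametric bootstrap proposal or semi-automatic summaries.

    # Minimal ABC rejection sketch for the g-and-k distribution (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)

    def gk_sample(n, A, B, g, k, c=0.8):
        """Draw n values from the g-and-k distribution via its quantile function."""
        z = rng.standard_normal(n)
        return A + B * (1 + c * np.tanh(g * z / 2)) * (1 + z**2) ** k * z

    def summaries(x):
        """Octile summary statistics (proxies for location, scale, skew, kurtosis)."""
        return np.percentile(x, [12.5, 25, 37.5, 50, 62.5, 75, 87.5])

    # "Observed" data generated from known parameters, for illustration only.
    s_obs = summaries(gk_sample(1000, A=3.0, B=1.0, g=2.0, k=0.5))

    # Draw parameters from an assumed uniform prior, simulate, keep the closest draws.
    thetas = rng.uniform([0, 0, 0, 0], [10, 10, 10, 1], size=(20000, 4))
    dists = np.array([np.linalg.norm(summaries(gk_sample(1000, *t)) - s_obs)
                      for t in thetas])
    keep = thetas[np.argsort(dists)[:200]]      # retain the closest 1% of draws
    print("ABC posterior mean (A, B, g, k):", keep.mean(axis=0))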
Abstract:
The inverse temperature hyperparameter of the hidden Potts model governs the strength of spatial cohesion and therefore has a substantial influence over the resulting model fit. The difficulty arises from the dependence of an intractable normalising constant on the value of the inverse temperature, so there is no closed-form solution for sampling from the distribution directly. We review three computational approaches for addressing this issue, namely pseudolikelihood, path sampling, and the approximate exchange algorithm. We compare the accuracy and scalability of these methods using a simulation study.
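As a sketch of the first of these approaches, the following Python code maximises the Potts pseudolikelihood for the inverse temperature on a toy label image. The number of states, the random labels, the periodic boundary and the optimiser bounds are assumptions for illustration; real labels from a pilot segmentation would give a positive estimate.

    # Pseudolikelihood estimate of the Potts inverse temperature beta (sketch).
    import numpy as np
    from scipy.optimize import minimize_scalar

    q = 3                                        # number of Potts states (assumed)
    rng = np.random.default_rng(1)
    labels = rng.integers(0, q, size=(50, 50))   # stand-in for a pilot label image

    def neighbor_counts(labels, q):
        """For each pixel, count how many of its 4 neighbours carry each label
        (periodic boundary for simplicity)."""
        H, W = labels.shape
        counts = np.zeros((H, W, q))
        for dy, dx in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            shifted = np.roll(labels, (dy, dx), axis=(0, 1))
            for k in range(q):
                counts[..., k] += (shifted == k)
        return counts

    counts = neighbor_counts(labels, q)

    def neg_log_pseudolikelihood(beta):
        """-sum_i log p(z_i | neighbours of i, beta) for the Potts model."""
        own = np.take_along_axis(counts, labels[..., None], axis=2)[..., 0]
        logZ = np.log(np.exp(beta * counts).sum(axis=2))
        return -(beta * own - logZ).sum()

    res = minimize_scalar(neg_log_pseudolikelihood, bounds=(0.0, 3.0), method="bounded")
    print("pseudolikelihood estimate of beta:", res.x)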
Abstract:
While enhanced cybersecurity options, mainly based around cryptographic functions, are needed, the overall speed and performance of a healthcare network may take priority in many circumstances. As such, the security and performance metrics of those cryptographic functions in their embedded context need to be understood, and understanding those metrics has been the main aim of this research activity. This research reports on an implementation of one network security technology, Internet Protocol Security (IPSec), to assess security performance. It simulates sensitive healthcare information being transferred over networks, and then measures data delivery times with selected security parameters for various communication scenarios on Linux-based and Windows-based systems. Based on our test results, this research has revealed a number of network security metrics that need to be considered when designing and managing network security for healthcare-specific or non-healthcare-specific systems from security, performance and manageability perspectives. It proposes practical recommendations, based on the test results, for the effective selection of network security controls to achieve an appropriate balance between network security and performance.
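The following Python sketch illustrates the kind of measurement involved: timing the delivery of a fixed-size payload over TCP. The IPSec tunnel itself would be configured at the operating-system level (e.g. IPSec policies on Windows or a strongSwan-style setup on Linux); the endpoint address, payload size and the presence of a simple sink/acknowledgement server are assumptions for illustration.

    # Time the delivery of a dummy payload over a TCP connection (sketch).
    import socket
    import time

    HOST, PORT = "192.0.2.10", 5001          # test endpoint (assumed, TEST-NET address)
    PAYLOAD = b"x" * (10 * 1024 * 1024)      # 10 MB of dummy "patient record" data

    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        sock.sendall(PAYLOAD)
        sock.shutdown(socket.SHUT_WR)
        while sock.recv(4096):               # wait for the receiving sink to drain/close
            pass
    elapsed = time.perf_counter() - start

    print(f"delivered {len(PAYLOAD)/1e6:.1f} MB in {elapsed:.3f} s "
          f"({len(PAYLOAD)*8/elapsed/1e6:.1f} Mbit/s)")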
Abstract:
To further investigate susceptibility loci identified by genome-wide association studies, we genotyped 5,500 SNPs across 14 associated regions in 8,000 samples from a control group and 3 diseases: type 2 diabetes (T2D), coronary artery disease (CAD) and Graves' disease. We defined, using Bayes theorem, credible sets of SNPs that were 95% likely, based on posterior probability, to contain the causal disease-associated SNPs. In 3 of the 14 regions, TCF7L2 (T2D), CTLA4 (Graves' disease) and CDKN2A-CDKN2B (T2D), much of the posterior probability rested on a single SNP, and, in 4 other regions (CDKN2A-CDKN2B (CAD) and CDKAL1, FTO and HHEX (T2D)), the 95% sets were small, thereby excluding most SNPs as potentially causal. Very few SNPs in our credible sets had annotated functions, illustrating the limitations in understanding the mechanisms underlying susceptibility to common diseases. Our results also show the value of more detailed mapping to target sequences for functional studies.
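The construction of a credible set from per-SNP posterior probabilities can be sketched as follows, assuming a single causal variant per region. The SNP names and Bayes factors in this Python example are made-up values for illustration; in practice they would come from the association analysis.

    # Build a 95% credible set from per-SNP posterior probabilities (sketch).
    import numpy as np

    snps = ["rs1", "rs2", "rs3", "rs4", "rs5"]
    log10_bf = np.array([5.2, 4.9, 2.1, 1.0, 0.3])   # illustrative approximate Bayes factors

    bf = 10.0 ** log10_bf
    posterior = bf / bf.sum()                         # equal prior on each SNP being causal

    order = np.argsort(posterior)[::-1]               # rank SNPs by posterior probability
    cum = np.cumsum(posterior[order])
    n_in_set = int(np.searchsorted(cum, 0.95) + 1)    # smallest set reaching 95%

    credible_set = [snps[i] for i in order[:n_in_set]]
    print("95% credible set:", credible_set)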
Abstract:
Background: Multilevel and spatial models are being increasingly used to obtain substantive information on area-level inequalities in cancer survival. Multilevel models assume independent geographical areas, whereas spatial models explicitly incorporate geographical correlation, often via a conditional autoregressive prior. However, the relative merits of these methods for large population-based studies have not been explored. Using a case-study approach, we report on the implications of using multilevel and spatial survival models to study geographical inequalities in all-cause survival. Methods: Multilevel discrete-time and Bayesian spatial survival models were used to study geographical inequalities in all-cause survival for a population-based colorectal cancer cohort of 22,727 cases aged 20–84 years diagnosed during 1997–2007 from Queensland, Australia. Results: Both approaches were viable on this large dataset, and produced similar estimates of the fixed effects. After adding area-level covariates, the between-area variability in survival using multilevel discrete-time models was no longer significant. Spatial inequalities in survival were also markedly reduced after adjusting for aggregated area-level covariates. Only the multilevel approach, however, provided an estimate of the contribution of geographical variation to the total variation in survival between individual patients. Conclusions: With little difference observed between the two approaches in the estimation of fixed effects, multilevel models should be favored if there is a clear hierarchical data structure and measuring the independent impact of individual- and area-level effects on survival differences is of primary interest. Bayesian spatial analyses may be preferred if spatial correlation between areas is important and if the priority is to assess small-area variations in survival and map spatial patterns. Both approaches can be readily fitted to geographically enabled survival data from international settings.
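A minimal sketch of the person-period expansion that underlies a discrete-time survival model is shown below: each case contributes one row per follow-up year, with a binary event indicator in its final year. The toy records are assumptions for illustration; the study's multilevel (random area intercept) and Bayesian spatial (CAR prior) models would be fitted on data in this format.

    # Expand case-level survival records into person-period format (sketch).
    import pandas as pd

    cases = pd.DataFrame({
        "id":        [1, 2, 3],
        "area":      ["A", "A", "B"],
        "follow_up": [3, 5, 2],          # whole years of follow-up (toy values)
        "died":      [1, 0, 1],
    })

    rows = []
    for _, c in cases.iterrows():
        for year in range(1, c.follow_up + 1):
            rows.append({
                "id": c.id,
                "area": c.area,
                "year": year,
                # event indicator: 1 only in the final observed year of a case who died
                "event": int(c.died == 1 and year == c.follow_up),
            })

    person_period = pd.DataFrame(rows)
    print(person_period)
    # A logistic model for `event` on year effects, covariates and an area-level
    # random intercept then gives the discrete-time hazard.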
Abstract:
Based on protein molecular dynamics, we investigate the fractal properties of energy, pressure and volume time series using multifractal detrended fluctuation analysis (MF-DFA), and the topological and fractal properties of their converted horizontal visibility graphs (HVGs). The energy parameters of protein dynamics we considered are bonded potential, angle potential, dihedral potential, improper potential, kinetic energy, van der Waals potential, electrostatic potential, total energy and potential energy. The shape of the h(q) curves from MF-DFA indicates that these time series are multifractal. The numerical values of the exponent h(2) of MF-DFA show that the series of total energy and potential energy are non-stationary and anti-persistent; the other time series are stationary and persistent, apart from the series of pressure (with H ≈ 0.5, indicating the absence of long-range correlation). The degree distributions of their converted HVGs show that these networks are exponential. The results of fractal analysis show that fractality exists in these converted HVGs. For each energy, pressure or volume parameter, it is found that the values of h(2) from MF-DFA on the time series, the exponent λ of the exponential degree distribution and the fractal dimension dB of their converted HVGs do not change much for different proteins (indicating some universality). We also found that, after taking the average over all proteins, there is a linear relationship between 〈h(2)〉 (from MF-DFA on the time series) and 〈dB〉 of the converted HVGs across the different energy, pressure and volume parameters.
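A minimal sketch of the horizontal visibility graph construction is given below: nodes are time points, and two points i and j are linked when every value strictly between them lies below the smaller of x_i and x_j. The random series stands in for a molecular-dynamics energy, pressure or volume trajectory, and the networkx package is assumed to be available.

    # Convert a time series into its horizontal visibility graph (sketch).
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(2)
    x = rng.standard_normal(200)               # stand-in for an MD energy time series

    G = nx.Graph()
    G.add_nodes_from(range(len(x)))
    for i in range(len(x) - 1):
        blocker = -np.inf                      # highest value seen between i and j
        for j in range(i + 1, len(x)):
            if blocker < min(x[i], x[j]):      # horizontal visibility criterion
                G.add_edge(i, j)
            blocker = max(blocker, x[j])
            if blocker >= x[i]:                # nothing beyond j can see i any more
                break

    degrees = np.array([d for _, d in G.degree()])
    print("mean degree:", degrees.mean())      # degree counts feed the exponential-
                                               # distribution check described above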
Abstract:
Many studies have shown that we can gain additional information on time series by investigating their accompanying complex networks. In this work, we investigate the fundamental topological and fractal properties of recurrence networks constructed from fractional Brownian motions (FBMs). First, our results indicate that the constructed recurrence networks have exponential degree distributions; the average degree exponent 〈λ〉 increases first and then decreases with increasing Hurst index H of the associated FBMs, and the relationship between H and 〈λ〉 can be represented by a cubic polynomial function. We next focus on the motif rank distribution of recurrence networks, so that we can better understand networks at the local structure level. We find an interesting superfamily phenomenon: recurrence networks with the same motif rank pattern are grouped into two superfamilies. Last, we numerically analyze the fractal and multifractal properties of recurrence networks. We find that the average fractal dimension 〈dB〉 of recurrence networks decreases with the Hurst index H of the associated FBMs, and their dependence approximately satisfies the linear formula 〈dB〉 ≈ 2 - H, which means that the fractal dimension of the associated recurrence network is close to that of the graph of the FBM. Moreover, our numerical results of multifractal analysis show that multifractality exists in these recurrence networks, and that it first becomes stronger and then weaker as the Hurst index of the associated time series increases from 0.4 to 0.95; in particular, the recurrence network with Hurst index H = 0.5 possesses the strongest multifractality. In addition, the dependence of the average information dimension 〈D(1)〉 and the average correlation dimension 〈D(2)〉 on the Hurst index H can also be fitted well with linear functions. Our results strongly suggest that the recurrence network inherits the basic characteristics and the fractal nature of the associated FBM series.
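A minimal sketch of the recurrence network construction is given below: time-delay embedded points become nodes, linked whenever their distance falls below a threshold ε. For simplicity the trajectory is ordinary Brownian motion (FBM with H = 0.5); other Hurst indices would need a dedicated FBM generator (e.g. Davies-Harte), and the embedding dimension, delay and threshold are assumptions for illustration.

    # Build a recurrence network from a (fractional) Brownian motion sample path (sketch).
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(3)
    x = np.cumsum(rng.standard_normal(500))          # Brownian motion (H = 0.5) path

    m, tau = 3, 2                                    # embedding dimension and delay (assumed)
    points = np.array([x[i: i + m * tau: tau] for i in range(len(x) - (m - 1) * tau)])

    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    eps = np.percentile(dists[dists > 0], 5)         # threshold at the 5% distance quantile

    A = (dists < eps) & ~np.eye(len(points), dtype=bool)
    G = nx.from_numpy_array(A.astype(int))
    print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())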
Abstract:
Statistical comparison of oil samples is an integral part of oil spill identification, which deals with the process of linking an oil spill with its source of origin. In current practice, a frequentist hypothesis test is often used to evaluate evidence in support of a match between a spill and a source sample. As frequentist tests are only able to evaluate evidence against a hypothesis but not in support of it, we argue that this leads to unsound statistical reasoning. Moreover, currently only verbal conclusions on a very coarse scale can be made about the match between two samples, whereas a finer quantitative assessment would often be preferred. To address these issues, we propose a Bayesian predictive approach for evaluating the similarity between the chemical compositions of two oil samples. We derive the underlying statistical model from some basic assumptions on modeling assays in analytical chemistry, and, to further facilitate and improve numerical evaluations, we develop analytical expressions for the key elements of Bayesian inference for this model. The approach is illustrated with both simulated and real data and is shown to have appealing properties in comparison with both standard frequentist and Bayesian approaches.
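As a generic illustration only (not the authors' derived model), the following sketch evaluates a spill sample under the Student-t posterior predictive of a normal model fitted to a source sample's measurements of one compound. The reference prior, the independence treatment of the spill measurements and the made-up concentration values are all assumptions.

    # Generic Bayesian predictive check for two samples of one compound (sketch).
    import numpy as np
    from scipy import stats

    source = np.array([10.2, 10.5, 10.1, 10.4, 10.3])   # compound ratio, source sample (toy)
    spill  = np.array([10.3, 10.2, 10.6])                # compound ratio, spill sample (toy)

    n, xbar, s2 = len(source), source.mean(), source.var(ddof=1)
    # Posterior predictive under a reference (Jeffreys) prior: Student-t with n-1 df,
    # location xbar and scale s * sqrt(1 + 1/n).
    pred = stats.t(df=n - 1, loc=xbar, scale=np.sqrt(s2 * (1 + 1 / n)))

    # Treat the spill measurements as independent draws from the predictive
    # (an approximation made here for simplicity).
    log_pred = pred.logpdf(spill).sum()
    print("log posterior-predictive density of the spill sample:", round(log_pred, 3))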
Abstract:
Crashes at any particular transport network location consist of a chain of events arising from a multitude of potential causes and/or contributing factors, whose nature is likely to reflect geometric characteristics of the road, spatial effects of the surrounding environment, and human behavioural factors. It is postulated that these potential contributing factors do not arise from the same underlying risk process, and thus should be explicitly modelled and understood. The state of the practice in road safety network management applies a safety performance function (SPF) that represents a single risk process to explain crash variability across network sites. This study aims to elucidate the importance of differentiating among the various underlying risk processes contributing to the observed crash count at any particular network location. To demonstrate the principle of this theoretical and corresponding methodological approach, the study explores engineering (e.g. segment length, speed limit) and unobserved spatial factors (e.g. climatic factors, presence of schools) as two explicit sources of crash contributing factors. A Bayesian Latent Class (BLC) analysis is used to explore these two sources and to incorporate prior information about their contribution to crash occurrence. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared with those of the traditional Negative Binomial (NB) model. A comparison of goodness-of-fit measures indicates that the model with a double risk process outperforms the single-risk-process NB model, indicating the need for further research to capture all three crash generation processes in the SPFs.
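As a sketch of the baseline comparison model, the following Python code fits a negative binomial safety performance function to simulated segment-level crash counts. The covariates and the simulated data are assumptions for illustration; the Bayesian latent class model described above adds a second, latent risk process on top of such a baseline.

    # Fit a baseline negative binomial SPF to toy segment-level crash data (sketch).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(4)
    n = 500
    length = rng.uniform(0.5, 10.0, n)               # segment length, km (assumed)
    speed = rng.choice([60, 80, 100, 110], n)        # posted speed limit, km/h (assumed)

    # Simulate overdispersed crash counts with mean depending on length and speed.
    mu = np.exp(-2.0 + 1.0 * np.log(length) + 0.01 * speed)
    crashes = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

    X = sm.add_constant(np.column_stack([np.log(length), speed]))
    nb_fit = sm.NegativeBinomial(crashes, X).fit(disp=0)
    print(nb_fit.params)        # intercept, log-length and speed-limit coefficients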
Abstract:
The safety and performance of bridges can be monitored and evaluated by Structural Health Monitoring (SHM) systems. These systems aim to identify and locate damage in a structure and estimate its severity. Current SHM systems are applied to a single bridge and have not been used to monitor the structural condition of a network of bridges. This paper proposes a new method, to be used within the Synthetic Rating Procedures (SRP) developed by the authors, that utilizes SHM systems for monitoring and evaluating the condition of a network of bridges. Synthetic rating procedures are used to assess the condition of a network of bridges and identify their ratings. As an additional part of the SRP, the method proposed in this paper can continuously monitor the behaviour of a network of bridges and can therefore help prevent sudden bridge collapses or disruptions to their serviceability. The method could be an important part of a bridge management system (BMS) for managers and engineers who work on condition assessment of a network of bridges.
Abstract:
This contribution is a long-term study of the evolving use of organization-wide groupware in a service network. We describe the practices related to organization-wide groupware in conjunction with local groupware-related practices, and how they have developed since the organization was established. In the discussion of these practices we focus on issues such as: 1. tendencies towards proliferation and integration; 2. local appropriations of a variety of systems; 3. creative appropriations, including the creation of a unique heterogeneous groupware fabric; 4. the design strategy of multiple parallel experimental use; and 5. the relation between disparate local meanings and successful computer-supported cooperative practice. As an overarching theme, we explore the explanatory value of the concepts of objectification and appropriation as compared to the concepts of design vs. use.
Abstract:
Sensor networks for environmental monitoring present enormous benefits to the community and society as a whole. Currently there is a need for low-cost, compact, solar-powered sensors suitable for deployment in rural areas. The purpose of this research is to develop both a ground-based wireless sensor network and data collection using unmanned aerial vehicles (UAVs). The ground-based sensor system is capable of measuring environmental data such as temperature or air quality using cost-effective, low-power sensors. Each sensor is configured so that its data is stored on an ATMega16 microcontroller, which can communicate with a UAV flying overhead using UAV communication protocols. The data is then either sent to the ground in real time or stored on the UAV using a microcontroller until it lands or is close enough to enable transmission of the data to the ground station.
Abstract:
This technical report describes a Light Detection and Ranging (LiDAR) augmented optimal path planning methodology for low-level flight of remote sensing and sampling Unmanned Aerial Vehicles (UAVs). The UAV is used to perform remote air sampling and data acquisition from a network of sensors on the ground. The terrain data, in the form of 3D point cloud maps, is processed by the algorithms to find an optimal path. The results show that the method and algorithm are able to use the LiDAR data to avoid obstacles when planning a path from a start point to a target point. The report compares the performance of the method as the resolution of the LiDAR map is increased and when a Digital Elevation Model (DEM) is included. From a practical point of view, the optimal path plan is loaded into and works seamlessly with the UAV ground station, and the report also shows the UAV ground station software augmented with more accurate LiDAR data.
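A minimal sketch of grid-based optimal path planning over an obstacle map, such as one rasterised from LiDAR point clouds or a DEM, is given below. A* with four-connected moves is used here as a generic stand-in for the report's planner, and the toy grid is an assumption for illustration.

    # A* path planning over a small occupancy grid (sketch).
    import heapq

    grid = [                                  # 0 = free, 1 = obstacle (e.g. terrain/canopy)
        [0, 0, 0, 0, 0],
        [1, 1, 0, 1, 0],
        [0, 0, 0, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0],
    ]

    def astar(grid, start, goal):
        """A* over a 4-connected grid with a Manhattan-distance heuristic."""
        h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
        frontier = [(h(start), 0, start, [start])]
        seen = set()
        while frontier:
            _, cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if node in seen:
                continue
            seen.add(node)
            r, c = node
            for nr, nc in [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]:
                if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                    heapq.heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                              (nr, nc), path + [(nr, nc)]))
        return None                           # no feasible path through the obstacles

    print(astar(grid, (0, 0), (4, 4)))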
Abstract:
The rights of individuals to self-determination and participation in social, political and economic life are recognised and supported by Articles 1, 3 and 25 of the International Covenant on Civil and Political Rights 1966. Article 1 of the United Nations’ Human Rights Council’s Resolution on the Promotion and Protection of Human Rights on the Internet of July 2012 confirms that individuals have the same rights online as offline. Access to the internet is essential, and as such the UN: Calls upon all States to promote and facilitate access to the Internet and international cooperation aimed at the development of media and information and communications facilities in all countries (Article 3). Accordingly, access to the internet per se is a fundamental human right, which requires direct State recognition and support. The obligations of the State to ensure its citizens are able, and are enabled, to access the internet are not matters that should be delegated to commercial parties. Quite simply, access to the internet, and high-speed broadband, by whatever means, are “essential services” and therefore “should be treated as any other utility service”...