718 results for Non-Markov


Relevance: 20.00%

Abstract:

Australian climate, soils and agricultural management practices are significantly different from those of northern hemisphere nations. Consequently, experimental data on greenhouse gas production from European and North American agricultural soils, and its interpretation, are unlikely to be directly applicable to Australian systems. A programme of studies of non-CO2 greenhouse gas emissions from agriculture has been established that is designed to reduce the uncertainty of non-CO2 greenhouse gas emissions in the Australian National Greenhouse Gas Inventory and to provide outputs that will enable better on-farm management practices for reducing non-CO2 greenhouse gas emissions, particularly nitrous oxide. The systems being examined and their locations are irrigated pasture (Kyabram, Victoria), irrigated cotton (Narrabri, NSW), irrigated maize (Griffith, NSW), rain-fed wheat (Rutherglen, Victoria) and rain-fed wheat (Cunderdin, WA). The field studies include treatments with and without fertilizer addition, stubble burning versus stubble retention, conventional cultivation versus direct drilling, and crop rotation, to determine emission factors and treatment possibilities for best management options. The data to date suggest that nitrous oxide emissions from nitrogen fertilizer applied to irrigated dairy pastures and rain-fed winter wheat are much lower than the average of northern hemisphere grain and pasture studies. More variable emissions have been found in studies of an irrigated cotton/vetch/wheat rotation, and substantially higher emissions from irrigated maize.

Relevance: 20.00%

Abstract:

This instrument was used in the project named Teachers Reporting Child Sexual Abuse: Towards Evidence-based Reform of Law, Policy and Practice (ARC DP0664847)

Relevance: 20.00%

Abstract:

A time-resolved inverse spatially offset Raman spectrometer was constructed for depth profiling of Raman-active substances in both laboratory and field environments. The system's operating principles and performance are discussed, along with its advantages relative to a traditional continuous-wave spatially offset Raman spectrometer. The developed spectrometer uses a combination of space- and time-resolved detection to obtain high-quality Raman spectra from substances hidden behind coloured opaque surface layers, such as plastics and garments, in a single measurement. The time-gated spatially offset Raman spectrometer was successfully used to detect concealed explosives and drug precursors under incandescent and fluorescent background light as well as under daylight. The average screening time was 50 s per measurement. The excitation power requirements were relatively low (20 mW), which makes the probe safe for screening hazardous substances. The unit has been designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than previous picosecond-based systems, to provide a functional platform for in-line or in-field sensing of chemical substances.

Relevance: 20.00%

Abstract:

In this paper, spatially offset Raman spectroscopy (SORS) is demonstrated for non-invasively investigating the composition of drug mixtures inside an opaque plastic container. The mixtures consisted of three components: a target drug (acetaminophen or phenylephrine hydrochloride) and two diluents (glucose and caffeine). The target drug concentrations ranged from 5% to 100%. After conducting SORS analysis to obtain the Raman spectra of the concealed mixtures, principal component analysis (PCA) was performed on the SORS spectra to reveal trends within the data. Partial least squares (PLS) regression was used to construct models that predicted the concentration of each target drug in the presence of the two diluents. The PLS models were able to predict the concentration of acetaminophen in the validation samples with a root-mean-square error of prediction (RMSEP) of 3.8%, and the concentration of phenylephrine hydrochloride with an RMSEP of 4.6%. This work demonstrates the potential of SORS, used in conjunction with multivariate statistical techniques, to perform non-invasive, quantitative analysis of mixtures inside opaque containers. This has applications in pharmaceutical analysis, such as monitoring the degradation of pharmaceutical products on the shelf, in forensic investigations of counterfeit drugs, and in the analysis of illicit drug mixtures that may contain multiple components.
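
As an illustration of the chemometric workflow described above, the sketch below fits a PLS calibration model to simulated SORS-like spectra and reports an RMSEP on held-out samples. The spectra, concentrations and number of latent components are hypothetical placeholders, not data or settings from the study.

```python
# Minimal sketch of PLS calibration on SORS-like spectra (hypothetical data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Simulated spectra: 60 mixtures x 500 wavenumber channels. Each spectrum is a
# concentration-weighted sum of three component "signatures" plus noise -- a
# stand-in for real SORS measurements of three-component mixtures.
n_samples, n_channels = 60, 500
signatures = rng.random((3, n_channels))          # target drug + two diluents
conc = rng.dirichlet(np.ones(3), size=n_samples)  # fractional compositions
X = conc @ signatures + rng.normal(0, 0.01, (n_samples, n_channels))
y = conc[:, 0] * 100.0                            # target drug concentration (%)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pls = PLSRegression(n_components=5)               # component count is a guess
pls.fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()

rmsep = np.sqrt(np.mean((y_test - y_pred) ** 2))
print(f"RMSEP: {rmsep:.2f}% of target drug concentration")
```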

Relevance: 20.00%

Abstract:

Discrete Markov random field models provide a natural framework for representing images or spatial datasets. They model the spatial association present while providing a convenient Markovian dependency structure and strong edge-preservation properties. However, parameter estimation for discrete Markov random field models is difficult due to the complex form of the associated normalizing constant for the likelihood function. For large lattices, the reduced dependence approximation to the normalizing constant is based on the concept of performing computationally efficient and feasible forward recursions on smaller sublattices, which are then suitably combined to estimate the constant for the whole lattice. We present an efficient computational extension of the forward recursion approach for the autologistic model to lattices that have an irregularly shaped boundary and that may contain regions with no data; such lattices are typical in applications. Consequently, we also extend the reduced dependence approximation to these scenarios, enabling us to implement a practical and efficient non-simulation-based approach for spatial data analysis within the variational Bayesian framework. The methodology is illustrated through application to simulated data and example images. The supplemental materials include our C++ source code for computing the approximate normalizing constant and simulation studies.
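
To make the forward recursion idea concrete, the sketch below computes the exact log normalizing constant of a small binary autologistic lattice by recursing over row configurations; the reduced dependence approximation applies the same kind of recursion to narrower sublattices and combines the results. The parameter values and lattice size are illustrative, and this is a conceptual sketch, not the authors' C++ implementation.

```python
# Row-wise forward recursion for the autologistic normalizing constant.
# Exact for small lattices; feasible only when 'cols' is small.
from itertools import product
import numpy as np

def autologistic_log_Z(rows, cols, alpha, beta):
    """Exact log normalizing constant of a binary {0,1} autologistic model on a
    rows x cols lattice with 4-neighbour interactions."""
    configs = [np.array(c) for c in product((0, 1), repeat=cols)]

    def row_weight(x):
        # field term plus horizontal (within-row) interactions
        return alpha * x.sum() + beta * (x[:-1] * x[1:]).sum()

    def coupling(x, y):
        # vertical interactions between consecutive rows
        return beta * (x * y).sum()

    # forward pass: log-weights of partial lattices ending in each row config
    log_phi = np.array([row_weight(x) for x in configs])
    for _ in range(rows - 1):
        new_log_phi = np.empty(len(configs))
        for j, y in enumerate(configs):
            terms = [log_phi[i] + coupling(x, y) + row_weight(y)
                     for i, x in enumerate(configs)]
            new_log_phi[j] = np.logaddexp.reduce(terms)
        log_phi = new_log_phi
    return np.logaddexp.reduce(log_phi)

print(autologistic_log_Z(rows=6, cols=6, alpha=-0.2, beta=0.4))
```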

Relevance: 20.00%

Abstract:

Introduction: This study reports on the development of a self-report assessment tool to increase the efficacy of crash prediction within Australian fleet settings. Over the last 20 years an array of measures has been produced (Driver Anger Scale, Driving Skill Inventory, Manchester Driver Behaviour Questionnaire, Driver Attitude Questionnaire, Driver Stress Inventory, Safety Climate Questionnaire). While these tools are useful, research has demonstrated a limited ability to accurately identify the individuals most likely to be involved in a crash. Reasons cited include:
- crashes are relatively rare;
- other competing factors may influence a crash event;
- ongoing questions regarding the validity of self-report measures (common method variance, etc.);
- a lack of coverage of contemporary issues relating to fleet driving performance.

Relevance: 20.00%

Abstract:

Current research in secure messaging for Vehicular Ad hoc Networks (VANETs) appears to focus on employing a digital certificate-based Public Key Cryptosystem (PKC) to support security. The security overhead of such a scheme, however, creates a transmission delay and introduces a time-consuming verification process into VANET communications. This paper proposes a non-certificate-based public key management scheme for VANETs. A comprehensive evaluation of the performance and scalability of the proposed public key management regime is presented and compared to a certificate-based PKC through a number of quantified analyses and simulations. Not only does this paper demonstrate that the proposal can maintain security, but it also asserts that it can improve overall performance and scalability at a lower cost than the certificate-based PKC. It is believed that the proposed scheme will add a new dimension to key management and verification services for VANETs.

Relevance: 20.00%

Abstract:

Recently, an innovative composite panel system was developed in which a thin insulation layer is used externally between two plasterboards to improve the fire performance of light gauge cold-formed steel frame walls. In this research, finite-element thermal models of both the traditional light gauge cold-formed steel frame wall panels with cavity insulation and the new light gauge cold-formed steel frame composite wall panels were developed to simulate their thermal behaviour under standard and realistic fire conditions. Suitable apparent thermal properties of gypsum plasterboard, insulation materials and steel were proposed and used. The developed models were then validated by comparing their results with available fire test results. This article presents the details of the developed finite-element models of small-scale non-load-bearing light gauge cold-formed steel frame wall panels and the results of the thermal analyses. It has been shown that accurate finite-element models can be used to simulate the thermal behaviour of small-scale light gauge cold-formed steel frame walls with varying configurations of insulation and plasterboards. The numerical results show that the use of cavity insulation was detrimental to the fire rating of light gauge cold-formed steel frame walls, while the use of external insulation offered them superior thermal protection. The effects of real fire conditions are also presented.
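
As a rough orientation to the kind of thermal analysis involved, the sketch below runs a much-simplified one-dimensional explicit finite-difference model of heat conduction through a plasterboard-insulation-plasterboard section exposed to the ISO 834 standard fire curve on one side. It is not the authors' finite-element model: the layer thicknesses, constant thermal properties and boundary conditions are illustrative assumptions only, whereas the study used proposed apparent thermal properties for each material.

```python
# Simplified 1D explicit finite-difference heat conduction sketch for a
# plasterboard / insulation / plasterboard section under the ISO 834 fire
# curve. Constant properties and simple boundary conditions are crude
# simplifications; all values are illustrative.
import numpy as np

def iso834(t_s):
    """ISO 834 standard fire gas temperature (deg C) at time t_s in seconds."""
    return 20.0 + 345.0 * np.log10(8.0 * t_s / 60.0 + 1.0)

# Each layer: (thickness m, conductivity W/mK, density kg/m3, specific heat J/kgK)
layers = [(0.016, 0.25, 900.0, 1000.0),
          (0.025, 0.04, 100.0, 1000.0),
          (0.016, 0.25, 900.0, 1000.0)]

dx = 0.001                                   # grid spacing (m)
alpha = []                                   # thermal diffusivity per node
for thickness, k, rho, cp in layers:
    alpha += [k / (rho * cp)] * int(round(thickness / dx))
alpha = np.array(alpha)

T = np.full(alpha.size, 20.0)                # start at ambient temperature
dt = 0.4 * dx ** 2 / alpha.max()             # respect explicit stability limit
t, t_end = 0.0, 3600.0                       # simulate one hour of exposure

while t < t_end:
    T[0] = iso834(t)                         # fire-side surface follows gas temperature
    lap = np.zeros_like(T)
    lap[1:-1] = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx ** 2
    lap[-1] = 2.0 * (T[-2] - T[-1]) / dx ** 2   # adiabatic ambient-side boundary
    T = T + dt * alpha * lap
    t += dt

print(f"Ambient-side temperature after 1 h: {T[-1]:.0f} deg C")
```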

Relevance: 20.00%

Abstract:

A precise definition of the interaction behavior between services is a prerequisite for successful business-to-business integration. Service choreographies provide a view of message exchanges and their ordering constraints from a global perspective. Treating message sending and receiving as one atomic step reduces the modelers' effort. As a downside, problematic race conditions resulting in deadlocks might appear when the choreography is realized using services that exchange messages asynchronously. This paper presents typical issues that arise when desynchronizing service choreographies. Solutions from practice are discussed, and a formal approach based on Petri nets is introduced for identifying desynchronizable choreographies.
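
To illustrate the kind of state-space check involved, the sketch below encodes a toy asynchronous message exchange as a 1-safe Petri net and searches the reachable markings for deadlocks (markings with no enabled transition other than the intended final marking). The net and place names are hypothetical and far simpler than the choreography formalization in the paper.

```python
# Toy Petri-net reachability check for deadlocks after desynchronizing a
# two-party message exchange through an explicit buffer place.
# Each transition: (set of input places, set of output places). The net is
# assumed 1-safe, so a marking is simply the set of marked places.
TRANSITIONS = {
    "A_send":    ({"A_init"},                {"A_done", "buffer_AB"}),
    "B_receive": ({"B_init", "buffer_AB"},   {"B_done"}),
}
INITIAL = frozenset({"A_init", "B_init"})
FINAL   = frozenset({"A_done", "B_done"})

def enabled(marking):
    return [t for t, (pre, _) in TRANSITIONS.items() if pre <= marking]

def fire(marking, t):
    pre, post = TRANSITIONS[t]
    return frozenset((marking - pre) | post)

def deadlocks(initial, final):
    """Exhaustively explore reachable markings; return those with no enabled
    transition that are not the intended final marking (i.e. deadlocks)."""
    seen, stack, bad = {initial}, [initial], []
    while stack:
        m = stack.pop()
        succ = enabled(m)
        if not succ and m != final:
            bad.append(m)
        for t in succ:
            m2 = fire(m, t)
            if m2 not in seen:
                seen.add(m2)
                stack.append(m2)
    return bad

print(deadlocks(INITIAL, FINAL))   # [] -> this toy desynchronization is safe
```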

Relevance: 20.00%

Abstract:

Accurate reliability prediction for large-scale, long-lived engineering assets is a crucial foundation for effective asset risk management and optimal maintenance decision making. However, a lack of failure data for assets that fail infrequently, and changing operational conditions over long periods of time, make accurate reliability prediction for such assets very challenging. To address this issue, we present a Bayesian-Markov based approach to reliability prediction using prior knowledge and condition monitoring data. In this approach, Bayesian theory is used to incorporate prior information about failure probabilities and current information about asset health to make statistical inferences, while Markov chains are used to update and predict the health of assets based on condition monitoring data. The prior information can be supplied by domain experts, extracted from previous comparable cases or derived from basic engineering principles. Our approach differs from existing hybrid Bayesian models, which are normally used to update the parameter estimation of a given distribution, such as the Weibull-Bayesian distribution, or the transition probabilities of a Markov chain. Instead, our new approach can be used to update predictions of failure probabilities when failure data are sparse or nonexistent, as is often the case for large-scale, long-lived engineering assets.
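
As a rough illustration of how prior knowledge and condition monitoring data can be combined in this spirit, the sketch below places a Dirichlet prior on the rows of a Markov health-state transition matrix, updates it with sparse observed transition counts, and propagates the current health belief forward to predict failure probabilities. The states, prior pseudo-counts and observations are invented placeholders, not the authors' model.

```python
# Illustrative Bayesian update of a Markov health-state model (hypothetical).
import numpy as np

# Health states: 0 = good, 1 = degraded, 2 = failed.
# Prior belief about each row of the transition matrix, encoded as Dirichlet
# pseudo-counts supplied by domain experts or comparable assets.
prior_counts = np.array([[80.0, 15.0,  5.0],
                         [ 0.0, 70.0, 30.0],
                         [ 0.0,  0.0,  1.0]])   # failed is absorbing

# Observed transitions from condition monitoring (sparse -- few failures seen).
observed_counts = np.array([[40, 6, 0],
                            [ 0, 12, 1],
                            [ 0,  0, 0]])

# Posterior mean transition matrix: Dirichlet prior + multinomial counts.
posterior = prior_counts + observed_counts
P = posterior / posterior.sum(axis=1, keepdims=True)

# Current belief about the asset's health from the latest inspection.
state = np.array([0.7, 0.3, 0.0])

# Predict failure probability over the next 10 inspection intervals.
for step in range(1, 11):
    state = state @ P
    print(f"interval {step:2d}: P(failed) = {state[2]:.3f}")
```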

Relevance: 20.00%

Abstract:

Deep Raman spectroscopy has been utilized for the standoff detection of concealed chemical threat agents from a distance of 15 meters under real-life background illumination conditions. By using combined time- and space-resolved measurements, various explosive precursors hidden in opaque plastic containers were identified non-invasively. Our results confirm that combined time- and space-resolved Raman spectroscopy leads to higher selectivity towards the sub-layer over the surface layer, as well as enhanced rejection of fluorescence from the container surface, when compared with standoff spatially offset Raman spectroscopy. Raman spectra with minimal interference from the packaging material and a good signal-to-noise ratio were acquired within 5 seconds of measurement time. A new combined time- and space-resolved Raman spectrometer has been designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than picosecond-based laboratory systems.

Relevance: 20.00%

Abstract:

This paper seeks to identify and quantify sources of the lagging productivity in Singapore's retail sector as reported in the Economic Strategies Committee 2010 report. A two-stage analysis is adopted. In the first stage, the Malmquist productivity index is employed, which provides measures of productivity change, technological change and efficiency change. In the second stage, technical efficiency estimates are regressed against explanatory variables using a truncated regression model. Sources of technical efficiency were attributed to the quality of workers, while product assortment and competition negatively impacted efficiency.
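
For reference, the sketch below implements the standard decomposition of the Malmquist productivity index into efficiency change and technological change from four distance function values; in a study such as this, those values would come from frontier (DEA-type) estimation for each retailer and period. The numbers shown are hypothetical.

```python
# Malmquist productivity index decomposition from distance function values.
from math import sqrt

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Output-oriented Malmquist index between periods t and t+1.

    d_a_b = distance of the period-b observation to the period-a frontier.
    Returns (productivity change, efficiency change, technological change)."""
    efficiency_change = d_t1_t1 / d_t_t
    technological_change = sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    productivity_change = efficiency_change * technological_change
    return productivity_change, efficiency_change, technological_change

# Hypothetical distance values for one firm.
mi, ec, tc = malmquist(d_t_t=0.80, d_t_t1=0.95, d_t1_t=0.70, d_t1_t1=0.85)
print(f"Malmquist index {mi:.3f} = efficiency change {ec:.3f} x technical change {tc:.3f}")
```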

Relevance: 20.00%

Abstract:

Consider the concept combination 'pet human'. In word association experiments, human subjects produce the associate 'slave' in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of 'pet' or 'human' in isolation. In other words, the associate 'slave' seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data are used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory and the other on comparing a joint (true theoretic) probability distribution with another distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
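
The second method lends itself to a short sketch: compare the observed joint distribution of associates for a concept combination against the distribution implied by a separability (independence) assumption, using a chi-square goodness-of-fit test. The counts below are invented for illustration and do not come from the free association norms used in the article.

```python
# Chi-square test of a separability (independence) assumption for a concept
# combination, using hypothetical association counts.
import numpy as np
from scipy.stats import chi2

# Hypothetical counts of associate pairs produced for a combination such as
# 'pet human': rows index interpretations cued by the first word, columns
# those cued by the second word.
observed = np.array([[30.0,  5.0],
                     [10.0, 55.0]])

n = observed.sum()
row_marginal = observed.sum(axis=1) / n
col_marginal = observed.sum(axis=0) / n

# Expected counts under the separability assumption (product of marginals).
expected = n * np.outer(row_marginal, col_marginal)

chi2_stat = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
p_value = chi2.sf(chi2_stat, dof)

print(f"chi-square = {chi2_stat:.2f}, dof = {dof}, p = {p_value:.4f}")
# A small p-value is evidence against separability, i.e. the combination
# behaves non-compositionally.
```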

Relevance: 20.00%

Abstract:

Compositionality is a frequently made assumption in linguistics, and yet many human subjects reveal highly non-compositional word associations when confronted with novel concept combinations. This article will show how a non-compositional account of concept combinations can be supplied by modelling them as interacting quantum systems.