878 results for semi-Markov decision process
Abstract:
Multi-temporal land use information was derived from two decades of remote sensing data and simulated for 2012 and 2020 with Cellular Automata (CA), considering scenarios, change probabilities (through a Markov chain) and Multi-Criteria Evaluation (MCE). Agents and constraints were considered for modelling the urbanization process. Agents were normalized through fuzzification, and priority weights were assigned through Analytic Hierarchy Process (AHP) pairwise comparison for each factor (in the MCE) to derive behaviour-oriented transition rules for each land use class. The simulation shows good agreement with the classified data. Fuzzy logic and AHP helped in clearly analyzing the effects of the agents of growth, and CA-Markov proved to be a powerful modelling tool that helped capture and visualize the spatiotemporal patterns of urbanization. This provided a rapid land evaluation framework with essential insights into the urban trajectory for effective, sustainable city planning.
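The Markov-chain component of such a CA-Markov workflow can be sketched briefly. The following is a minimal illustration, not the authors' implementation: the two classified rasters, the class codes and the decade step are hypothetical stand-ins.

```python
import numpy as np

def transition_matrix(lu_t1, lu_t2, n_classes):
    """Estimate Markov transition probabilities between two classified
    land-use rasters holding integer class codes 0..n_classes-1."""
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(lu_t1.ravel(), lu_t2.ravel()):
        counts[a, b] += 1
    # Row-normalise: P[i, j] = Pr(class j at t2 | class i at t1)
    return counts / counts.sum(axis=1, keepdims=True)

# Hypothetical pair of classified maps a decade apart
# (0 = water, 1 = vegetation, 2 = urban)
rng = np.random.default_rng(0)
lu_1992 = rng.integers(0, 3, size=(100, 100))
lu_2002 = rng.integers(0, 3, size=(100, 100))

P = transition_matrix(lu_1992, lu_2002, n_classes=3)
# Class proportions projected one decade ahead, as in a 2012 simulation
p0 = np.bincount(lu_2002.ravel(), minlength=3) / lu_2002.size
print(p0 @ P)
```

In a full CA-Markov run these projected quantities are then allocated spatially by the cellular automaton, using the MCE suitability surfaces as transition rules.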
Abstract:
Wildlife conservation in human-dominated landscapes requires that we understand how animals, when making habitat-use decisions, obtain diverse and dynamically occurring resources while avoiding risks, induced by both natural predators and anthropogenic threats. Little is known about the underlying processes that enable wild animals to persist in densely populated human-dominated landscapes, particularly in developing countries. In a complex, semi-arid, fragmented, human-dominated agricultural landscape, we analyzed the habitat-use of blackbuck, a large herbivore endemic to the Indian sub-continent. We hypothesized that blackbuck would show flexible habitat-use behaviour and be risk averse when resource quality in the landscape is high, and less sensitive to risk otherwise. Overall, blackbuck appeared to be strongly influenced by human activity and they offset risks by using small protected patches (~3 km²) when they could afford to do so. Blackbuck habitat use varied dynamically corresponding with seasonally-changing levels of resources and risks, with protected habitats registering maximum use. The findings show that human activities can strongly influence and perhaps limit ungulate habitat-use and behaviour, but spatial heterogeneity in risk, particularly the presence of refuges, can allow ungulates to persist in landscapes with high human and livestock densities.
Abstract:
This work describes the deposition and characterisation of semi-insulating oxygen-doped silicon films for the development of high voltage polycrystalline silicon (poly-Si) circuitry on glass. The performance of a novel poly-Si High Voltage Thin Film Transistor (HVTFT) structure, incorporating a layer of semi-insulating material, has been investigated using a two-dimensional device simulator. The semi-insulating layer increases the operating voltage of the HVTFT structure by linearising the potential distribution in the device offset region. A glass-compatible semi-insulating layer, suitable for HVTFT applications, has been deposited by the Plasma Enhanced Chemical Vapour Deposition (PECVD) technique from silane (SiH4), nitrous oxide (N2O) and helium (He) gas mixtures. The as-deposited films are furnace annealed at 600°C, which is the maximum process temperature. By varying the N2O/SiH4 ratio, the conductivity of the annealed films can be accurately controlled up to a maximum of around 10⁻⁷ Ω⁻¹ cm⁻¹. Helium dilution of the reactant gases improves both film uniformity and reproducibility. Raman analysis shows the as-deposited and annealed films to be completely amorphous. A model for the microstructure of these Semi-Insulating Amorphous Oxygen-Doped Silicon (SIAOS) films is proposed to explain the observed physical and electrical properties.
Abstract:
This work describes the deposition, annealing and characterisation of semi-insulating oxygen-doped silicon films at temperatures compatible with polysilicon circuitry on glass. The semi-insulating layers are deposited by the plasma enhanced chemical vapour deposition technique from silane (SiH4), nitrous oxide (N2O) and helium (He) gas mixtures at a temperature of 350 °C. The as-deposited films are then furnace annealed at 600 °C, which is the maximum process temperature. Raman analysis shows the as-deposited and annealed films to be completely amorphous. The most important deposition variable is the N2O/SiH4 gas ratio. By varying the N2O/SiH4 ratio, the conductivity of the annealed films can be accurately controlled, for the first time, down to a minimum of ≈10⁻⁷ Ω⁻¹ cm⁻¹, where they exhibit a T^(-1/4) temperature dependence indicative of a hopping conduction mechanism. Helium dilution of the reactant gases is shown to improve both film uniformity and reproducibility. A model for the microstructure of these semi-insulating amorphous oxygen-doped silicon films is proposed to explain the observed physical and electrical properties.
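The T^(-1/4) dependence quoted above is the signature of Mott variable-range hopping; in its standard textbook form (a general relation, not parameters fitted in this work) the conductivity follows

```latex
\sigma(T) = \sigma_0 \, \exp\!\left[ -\left( \frac{T_0}{T} \right)^{1/4} \right]
```

where \sigma_0 is a prefactor and T_0 is a characteristic temperature set by the density of localized states near the Fermi level.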
Abstract:
This work describes the annealing and characterisation of semi-insulating oxygen-doped silicon films deposited by the Plasma Enhanced Chemical Vapour Deposition (PECVD) technique from silane (SiH4), nitrous oxide (N2O) and helium (He) gas mixtures. The maximum process temperature is chosen to be compatible with large area polycrystalline silicon (poly-Si) circuitry on glass. The most important deposition variable is shown to be the N2O/SiH4 gas ratio. Helium dilution results in improved film uniformity and reproducibility. Raman analysis shows the 'as-deposited' and annealed films to be completely amorphous. A model for the microstructure of these Semi-Insulating Amorphous Oxygen-doped Silicon (SIAOS) films is proposed to explain the observed physical and electrical properties.
Abstract:
The inhomogeneous Poisson process is a point process that has varying intensity across its domain (usually time or space). For nonparametric Bayesian modeling, the Gaussian process is a useful way to place a prior distribution on this intensity. The combination of a Poisson process and GP is known as a Gaussian Cox process, or doubly-stochastic Poisson process. Likelihood-based inference in these models requires an intractable integral over an infinite-dimensional random function. In this paper we present the first approach to Gaussian Cox processes in which it is possible to perform inference without introducing approximations or finite-dimensional proxy distributions. We call our method the Sigmoidal Gaussian Cox Process, which uses a generative model for Poisson data to enable tractable inference via Markov chain Monte Carlo. We compare our methods to competing methods on synthetic data and apply it to several real-world data sets.
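The generative model behind the Sigmoidal Gaussian Cox Process bounds the intensity by a constant, λ(t) = λ* σ(g(t)) with g drawn from a GP, which makes exact simulation by thinning straightforward. The sketch below is illustrative rather than the authors' code; the function g here is a fixed stand-in for a GP sample.

```python
import numpy as np

def sample_sgcp(g, lam_star, t_max, rng):
    """Simulate an inhomogeneous Poisson process on [0, t_max] by
    thinning: intensity lambda(t) = lam_star * sigmoid(g(t)) <= lam_star."""
    n = rng.poisson(lam_star * t_max)               # homogeneous candidates
    cand = rng.uniform(0.0, t_max, size=n)
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    keep = rng.uniform(size=n) < sigmoid(g(cand))   # accept w.p. lambda(t)/lam_star
    return np.sort(cand[keep])

rng = np.random.default_rng(1)
g = lambda t: 2.0 * np.sin(t)   # stand-in for a GP draw; the bound is what matters
events = sample_sgcp(g, lam_star=5.0, t_max=10.0, rng=rng)
print(len(events), "events")
```

The paper's contribution concerns the reverse direction, inferring g from observed events via Markov chain Monte Carlo, which this same bounding construction helps make tractable.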
A Semi-Empirical Equation of Penetration Depth on Concrete Target Impacted by Ogive-Nose Projectiles
Abstract:
In this paper, the penetration process of ogive-nose projectiles into a semi-infinite concrete target is investigated by dimensional analysis and FEM simulation. With the dimensional analysis, the main non-dimensional parameters that control the penetration depth are obtained under some reasonable hypotheses. A new semi-empirical equation is then presented, based on the original work of Forrestal et al., which has only two non-dimensional combined variables with definite physical meanings. To verify this equation, its predictions are compared with experiments over a wide range of impact velocities. A commercial FEM code, LS-DYNA, is then used to simulate the complex penetration process, and the results also show that the novel semi-empirical equation is reasonable for determining the penetration depth in a concrete target.
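The abstract does not state the two combined variables, but dimensional analysis of rigid-projectile penetration into concrete commonly produces groups of the following kind: an impact function comparing kinetic energy with target strength, and a nose-geometry factor of the Forrestal type. The sketch below is purely illustrative, with hypothetical values; it is not the equation proposed in the paper.

```python
# Illustrative non-dimensional groups of the kind produced by dimensional
# analysis of rigid-projectile penetration into concrete. All values are
# hypothetical; this is not the equation proposed in the paper.
m = 13.0      # projectile mass, kg
d = 0.0762    # shank diameter, m
V0 = 450.0    # impact velocity, m/s
fc = 40e6     # unconfined compressive strength of the concrete, Pa
psi = 3.0     # calibre-radius-head (CRH) of the ogive nose

impact_function = m * V0**2 / (fc * d**3)    # kinetic energy vs target resistance
nose_factor = (8 * psi - 1) / (24 * psi**2)  # Forrestal-type ogive nose shape factor
print(f"I = {impact_function:.1f}, N* = {nose_factor:.3f}")
```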
Abstract:
Statistical Process Control (SPC) techniques are well established across a wide range of industries. In particular, plotting key steady-state variables with their statistical limits against time (Shewhart charting) is a common approach for monitoring the normality of production. This paper is concerned with extending Shewhart charting techniques to the quality monitoring of variables driven by uncertain dynamic processes, which has particular application in the process industries, where it is desirable to monitor process variables on-line as well as the final product. The robust approach to dynamic SPC is based on previous work on guaranteed cost filtering for linear systems and is intended both to provide a basis for wider application of SPC monitoring and to motivate unstructured fault detection.
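For reference, the static Shewhart chart that this work extends can be sketched in a few lines; the limits below use the conventional mean ± 3-sigma rule, and the data series is a hypothetical steady-state variable with an injected shift.

```python
import numpy as np

def shewhart_limits(x, k=3.0):
    """Centre line and k-sigma control limits for an individuals chart."""
    mu, sigma = x.mean(), x.std(ddof=1)
    return mu, mu - k * sigma, mu + k * sigma

rng = np.random.default_rng(2)
x = rng.normal(10.0, 0.5, size=200)          # hypothetical in-control measurements
x[150:] += 2.0                               # injected mean shift to be flagged
centre, lcl, ucl = shewhart_limits(x[:100])  # limits from an in-control window
alarms = np.flatnonzero((x < lcl) | (x > ucl))
print("first out-of-control sample:", alarms[0] if alarms.size else None)
```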
Abstract:
Many problems in control and signal processing can be formulated as sequential decision problems for general state space models. However, except for some simple models, one cannot obtain analytical solutions and has to resort to approximation. In this thesis, we investigate problems where Sequential Monte Carlo (SMC) methods can be combined with a gradient-based search to provide solutions to online optimisation problems. We summarise the main contributions of the thesis as follows. Chapter 4 focuses on solving the sensor scheduling problem when cast as a controlled Hidden Markov Model. We consider the case in which the state, observation and action spaces are continuous. This general case is important, as it is the natural framework for many applications. In sensor scheduling, our aim is to minimise the variance of the estimation error of the hidden state with respect to the action sequence. We present a novel SMC method that uses a stochastic gradient algorithm to find optimal actions. This is in contrast to existing works in the literature, which only solve approximations to the original problem. In Chapter 5 we present how SMC can be used to solve a risk-sensitive control problem. We adopt the Feynman-Kac representation of a controlled Markov chain flow and exploit the properties of the logarithmic Lyapunov exponent, which lead to a policy gradient solution for the parameterised problem. The resulting SMC algorithm follows a structure similar to the Recursive Maximum Likelihood (RML) algorithm for online parameter estimation. In Chapters 6, 7 and 8, dynamic graphical models are combined with state space models for the purpose of online decentralised inference. We concentrate on the distributed parameter estimation problem using two Maximum Likelihood techniques, namely Recursive Maximum Likelihood (RML) and Expectation Maximization (EM). The resulting algorithms can be interpreted as an extension of the Belief Propagation (BP) algorithm to compute likelihood gradients. In order to design an SMC algorithm, Chapter 8 uses nonparametric approximations for Belief Propagation. The algorithms were successfully applied to solve the sensor localisation problem for sensor networks of small and medium size.
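As background to the SMC machinery the thesis builds on, a minimal bootstrap particle filter for a scalar state space model might look as follows. The linear-Gaussian model is a stand-in chosen for brevity, not one of the thesis applications.

```python
import numpy as np

def bootstrap_pf(y, n_particles, rng, sx=1.0, sy=0.5, phi=0.9):
    """Bootstrap particle filter for x_t = phi*x_{t-1} + N(0, sx^2),
    y_t = x_t + N(0, sy^2). Returns filtered means E[x_t | y_1:t]."""
    x = rng.normal(0.0, 1.0, n_particles)
    means = []
    for yt in y:
        x = phi * x + rng.normal(0.0, sx, n_particles)    # propagate
        logw = -0.5 * ((yt - x) / sy) ** 2                # weight by likelihood
        w = np.exp(logw - logw.max()); w /= w.sum()
        means.append(np.dot(w, x))
        x = x[rng.choice(n_particles, n_particles, p=w)]  # multinomial resample
    return np.array(means)

rng = np.random.default_rng(3)
y = 0.1 * np.cumsum(rng.normal(size=50)) + rng.normal(0, 0.5, 50)  # toy observations
print(bootstrap_pf(y, 500, rng)[:5])
```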
Abstract:
This paper analyzes the stationarity of the price-dividend ratio in the context of a Markov-switching model à la Hamilton (1989) in which an asymmetric speed of adjustment is introduced. This particular specification robustly supports a nonlinear reversion process and identifies two relevant episodes: the post-war period from the mid-1950s to the mid-1970s and the so-called "90's boom" period. A three-regime Markov-switching model displays the best regime identification and reveals that only the first part of the 90's boom (1985-1995) and the post-war period are near-nonstationary states. Interestingly, the last part of the 90's boom (1996-2000), characterized by a growing price-dividend ratio, is entirely attributed to a regime featuring a highly reverting process.
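Hamilton-style regime switching is simple to state generatively: a latent Markov chain selects the parameters of an autoregression. The sketch below simulates a two-regime switching AR(1) with hypothetical parameters; it is illustrative, not the estimated model of the paper.

```python
import numpy as np

def simulate_ms_ar1(T, P, mu, phi, sigma, rng):
    """Simulate an AR(1) whose mean, persistence and volatility switch
    with a latent Markov chain: regime s_t ~ Markov(P)."""
    s, x = 0, 0.0
    out, states = np.empty(T), np.empty(T, dtype=int)
    for t in range(T):
        s = rng.choice(len(mu), p=P[s])  # regime transition
        x = mu[s] + phi[s] * (x - mu[s]) + rng.normal(0.0, sigma[s])
        out[t], states[t] = x, s
    return out, states

rng = np.random.default_rng(4)
P = np.array([[0.95, 0.05],   # persistent regimes, hypothetical values
              [0.10, 0.90]])
x, s = simulate_ms_ar1(300, P, mu=[0.0, 2.0], phi=[0.5, 0.98],
                       sigma=[0.2, 0.4], rng=rng)
print(x[:5], s[:20])
```

A near-nonstationary state of the kind the paper identifies corresponds to a regime with phi close to 1, while a highly reverting regime has phi well below 1.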
Abstract:
At a time when Technology Supported Learning Systems are widely used, there is a lack of tools that allow their development in an automatic or semi-automatic way. Technology Supported Learning Systems require an appropriate Domain Module, i.e., the pedagogical representation of the domain to be mastered, in order to be effective. However, content authoring is a time- and effort-consuming task; therefore, efforts towards automatising Domain Module acquisition are necessary. Traditionally, textbooks have been the main mechanism to maintain and transmit the knowledge of a certain subject or domain. Textbooks are authored by domain experts who organise the contents in a way that facilitates understanding and learning, considering pedagogical issues. Given that textbooks are appropriate sources of information, they can be used to facilitate the development of the Domain Module, allowing the identification of the topics to be mastered and the pedagogical relationships among them, as well as the extraction of Learning Objects, i.e., meaningful fragments of the textbook with an educational purpose. Consequently, in this work DOM-Sortze, a framework for the semi-automatic construction of Domain Modules from electronic textbooks, has been developed. DOM-Sortze uses NLP techniques, heuristic reasoning and ontologies to carry out its work. DOM-Sortze has been designed and developed with the aim of automatising the development of the Domain Module, regardless of the subject, promoting knowledge reuse and facilitating the collaboration of the users during the process.
Abstract:
Case study on how Pembrokeshire College is using a digital tool and methodology called VocalEyes to support democratic decision-making and improve learner involvement and satisfaction.
Abstract:
Population pressure in coastal New Hampshire challenges land use decision-making and threatens the ecological health and functioning of Great Bay, an estuary designated as both a NOAA National Estuarine Research Reserve and an EPA National Estuary Program site. The regional population in the seacoast has quadrupled in four decades, resulting in sprawl, increased impervious surface cover and larger-lot rural development (Zankel et al., 2006). All of Great Bay's contributing watersheds face these challenges, resulting in calls for strategies addressing growth, development and land use planning. The communities within the Lamprey River watershed comprise this case study. Do these towns communicate upstream and downstream when making land use decisions? Are cumulative effects considered while debating development? Do town land use groups consider the Bay or the coasts in their decision-making? This presentation, a follow-up from the TCS 2008 conference and a completed dissertation, will discuss a novel social science approach to analyzing and understanding the social landscape of land use decision-making in the towns of the Lamprey River watershed. The methods include semi-structured interviews with GIS-based maps in a grounded theory analytical strategy. The discussion will include key findings, opportunities and challenges in moving towards a watershed approach for land use planning. This presentation reviews the results of the case study and the developed methodology, which can be used in watersheds elsewhere to map out the potential for moving towns towards EBM and watershed-scaled land use planning.