898 results for Two Approaches
Abstract:
This study gives an overview of the theoretical foundations, empirical procedures and derived results of the literature identifying determinants of land prices. Special attention is given to the effects of different government support policies on land prices. Since almost all empirical studies on the determination of land prices refer either to the net present value method or to the hedonic pricing approach as a theoretical basis, a short review of these models is provided. While the two approaches have different theoretical bases, their empirical implementations converge. Empirical studies use a broad range of variables to explain land values, which we systematise into six categories. To investigate the influence of different measures of government support on land prices, a meta-regression analysis is carried out. Our results reveal a significantly higher rate of capitalisation for decoupled direct payments and a significantly lower rate of capitalisation for agri-environmental payments, compared with the rest of government support. Furthermore, the results show that using theoretically consistent land rents (returns to land) and including non-agricultural variables such as urban pressure in the regression implies lower elasticities of capitalisation. In addition, we find a significant influence of the land type, the data type and the estimation technique on the capitalisation rate.
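For reference, the net present value approach mentioned above prices land as the discounted stream of future land rents; the version below, with a constant rent and discount rate, is a textbook simplification rather than a formulation taken from any particular study surveyed.

```latex
% Land price P as the net present value of the stream of land rents R_t,
% discounted at rate r; with a constant rent R the sum collapses to the
% familiar capitalisation formula.
P \;=\; \sum_{t=1}^{\infty} \frac{R_t}{(1+r)^t}
\qquad\Longrightarrow\qquad
P \;=\; \frac{R}{r} \quad \text{for } R_t \equiv R .
```

Under this simplification, a support payment that raises the rent by one unit raises the land price by 1/r, which is why the rate at which payments are capitalised into land values is the central quantity such meta-regressions estimate.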
Abstract:
We compare eight pollen records reflecting climatic and environmental change from the tropical Andes. Our analysis focuses on the last 50 ka, with particular emphasis on the Pleistocene-to-Holocene transition. We explore ecological grouping and downcore ordination as two approaches for extracting environmental variability from pollen records. We also use the records of aquatic and shoreline vegetation as markers for lake-level fluctuations and precipitation change. We focus on the signature of millennial-scale variability in the tropical Andes, in particular Heinrich stadials and Greenland interstadials. We identify rapid responses of the tropical vegetation to this climate variability, and relate differences between sites to moisture sources and site sensitivity.
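Downcore ordination of the kind referred to here typically amounts to an eigen-decomposition of the samples-by-taxa pollen percentage matrix. The sketch below uses principal component analysis as a stand-in (the study itself may use DCA or another ordination method); the data are randomly generated placeholders, not real pollen counts.

```python
# Minimal downcore ordination sketch: PCA on a pollen percentage matrix.
# Rows are stratigraphic samples (depths/ages), columns are pollen taxa.
# Axis-1 sample scores give a summary curve of compositional change
# through the core, interpretable as an environmental signal.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
counts = rng.poisson(lam=20, size=(120, 30)).astype(float)  # placeholder counts
percentages = 100 * counts / counts.sum(axis=1, keepdims=True)

pca = PCA(n_components=2)
scores = pca.fit_transform(percentages)

# scores[:, 0] plotted against depth/age is the downcore ordination curve.
print(pca.explained_variance_ratio_)
```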
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Thesis (D.M.A.)--University of Washington, 2016-06
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
We employ two different methods, based on belief propagation and on TAP, for decoding corrupted messages encoded by Sourlas's method, in which the code word comprises products of K bits selected randomly from the original message. We show that the equations obtained by the two approaches are similar and provide the same solution as the one obtained by the replica approach in the case K = 2. However, we also show that for K >= 3 and unbiased messages the iterative solution is sensitive to the initial conditions and is likely to provide erroneous solutions, and that it is generally beneficial to use Nishimori's temperature, especially in the case of biased messages.
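The structure of Sourlas's code is easy to state concretely: each codeword bit is the product of K message bits (in the +/-1 spin representation) chosen at random. The sketch below shows the encoder and a noisy channel; the parameters are illustrative, and the belief-propagation/TAP decoders compared in the paper are not reproduced here.

```python
# Sketch of Sourlas encoding: each codeword component is the product of
# K randomly selected message bits, with bits represented as +/-1 spins.
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 100, 400, 2          # message length, codeword length, bits per product
message = rng.choice([-1, 1], size=N)

index_sets = [rng.choice(N, size=K, replace=False) for _ in range(M)]
codeword = np.array([message[idx].prod() for idx in index_sets])

# A binary symmetric channel flips each transmitted bit with probability p;
# the decoder's task is to recover `message` from `received` and `index_sets`.
p = 0.1
flips = rng.random(M) < p
received = np.where(flips, -codeword, codeword)
```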
Abstract:
This thesis is a study of three techniques to improve the performance of some standard forecasting models, with application to energy demand and prices. We focus on forecasting demand and price one day ahead. First, the wavelet transform was used as a pre-processing procedure with two approaches: multicomponent forecasts and direct forecasts. We have empirically compared these approaches and found that the former consistently outperformed the latter. Second, adaptive models were introduced to continuously update model parameters in the testing period by combining filters with standard forecasting methods. Among these adaptive models, the adaptive LR-GARCH model was proposed for the first time in this thesis. Third, with regard to the noise distributions of the dependent variables in the forecasting models, we used either Gaussian or Student-t distributions. The thesis proposes a novel algorithm to infer the parameters of Student-t noise models. The method extends earlier work on models that are linear in their parameters to the non-linear multilayer perceptron, and so broadens the range of models that can use a Student-t noise distribution. Because these techniques cannot stand alone, they must be combined with prediction models to improve performance. We combined them with some standard forecasting models: the multilayer perceptron, radial basis functions, linear regression, and linear regression with GARCH. These techniques and forecasting models were applied to two datasets from the UK energy markets: daily electricity demand (which is stationary) and gas forward prices (non-stationary). The results showed that these techniques provided a good improvement in prediction performance.
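The multicomponent strategy the thesis finds superior can be sketched as follows: decompose the series with a wavelet transform, forecast each reconstructed component separately, and sum the component forecasts. The sketch below uses PyWavelets, with a naive last-value forecaster standing in for the MLP/RBF/regression models actually studied; in practice, boundary handling of the transform needs care to avoid leaking future information.

```python
# Multicomponent wavelet forecasting sketch: decompose, forecast each
# component independently, then sum. A persistence (last-value) forecast
# stands in for the trained models used in the thesis.
import numpy as np
import pywt

def wavelet_components(series, wavelet="db4", level=3):
    coeffs = pywt.wavedec(series, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        # Reconstruct with all other coefficient bands zeroed out.
        isolated = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(isolated, wavelet)[: len(series)])
    return components  # by linearity these sum back to the original series

def forecast_one_step(series):
    components = wavelet_components(series)
    return sum(comp[-1] for comp in components)  # persistence per component

demand = np.sin(np.linspace(0, 20, 512)) \
    + 0.1 * np.random.default_rng(2).standard_normal(512)
print(forecast_one_step(demand))
```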
Abstract:
Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research work concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigations using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems. Fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels, performing various control functions which include: the setting-up and clearing-down of calls, the hand-over of calls between cells, and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers inter-connected via voice-grade data channels would be capable of providing satisfactory control.
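For context, the "familiar Erlang formula" that the first approach modifies is the Erlang B blocking probability, which has a well-known numerically stable recursion. The sketch below computes it; this is standard teletraffic material, not the modified land-to-mobile/mobile-to-mobile formulas derived in the thesis.

```python
# Erlang B blocking probability via the standard recursion:
#   B(0) = 1;  B(k) = A*B(k-1) / (k + A*B(k-1))  for k = 1..C,
# where A is the offered traffic in erlangs and C the number of channels.
def erlang_b(offered_traffic: float, channels: int) -> float:
    b = 1.0
    for k in range(1, channels + 1):
        b = offered_traffic * b / (k + offered_traffic * b)
    return b

# e.g. probability that a call is blocked when 10 erlangs are offered
# to a single cell with 15 channels:
print(erlang_b(10.0, 15))
```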
Abstract:
To reduce global biodiversity loss, there is an urgent need to determine the most efficient allocation of conservation resources. Recently, there has been a growing trend for many governments to supplement public ownership and management of reserves with incentive programs for conservation on private land. This raises important questions, such as the extent to which private land conservation can improve conservation outcomes, and how it should be mixed with more traditional public land conservation. We address these questions, using a general framework for modelling environmental policies and a case study examining the conservation of endangered native grasslands to the west of Melbourne, Australia. Specifically, we examine three policies that involve i) spending all resources on creating public conservation areas; ii) spending all resources on an ongoing incentive program where private landholders are paid to manage vegetation on their property with 5-year contracts; and iii) splitting resources between these two approaches. The performance of each strategy is quantified with a vegetation condition change model that predicts future changes in grassland quality. Of the policies tested, no one policy was always best and policy performance depended on the objectives of those enacting the policy. Although policies to promote conservation on private land are proposed and implemented in many areas, they are rarely evaluated in terms of their ecological consequences. This work demonstrates a general method for evaluating environmental policies and highlights the utility of a model which combines ecological and socioeconomic processes.
Abstract:
This work studies the development of polymer membranes for the separation of hydrogen and carbon monoxide from a syngas produced by the partial oxidation of natural gas. The CO product is then used for the large-scale manufacture of acetic acid by reaction with methanol. A method of economic evaluation has been developed for the process as a whole, and a comparison is made between separation of the H2/CO mixture by a membrane system and by the conventional method of cryogenic distillation. Costs are based on bids obtained from suppliers for several different specifications of the purity of the CO fed to the acetic acid reactor. When the purity of the CO is set at that obtained by cryogenic distillation, it is shown that the membrane separator offers only a marginal cost advantage. Cost parameters for the membrane separation systems have been defined in terms of effective selectivity and cost permeability. These new parameters, obtained from an analysis of the bids, are then used in a procedure which defines the optimum degree of separation and recovery of carbon monoxide for a minimum cost of manufacture of acetic acid. It is shown that a significant cost reduction is achieved with a membrane separator at the optimum process conditions. A method of "targeting" the properties of new membranes has been developed. This involves defining the properties of new (hypothetical, yet-to-be-developed) membranes such that their use for the hydrogen/carbon monoxide separation will produce a reduced cost of acetic acid manufacture. The use of the targeting method is illustrated in the development of new membranes for the separation of hydrogen and carbon monoxide. The selection of polymeric materials for new membranes is based on molecular design methods which predict polymer properties from the molecular groups making up the polymer molecule. Two approaches have been used. The first develops the analogy between gas solubility in liquids and that in polymers, and then uses the UNIFAC group contribution method to predict gas solubility in liquids. In the second, the polymer Permachor number, developed by Salame, has been correlated with hydrogen and carbon monoxide permeabilities, and these correlations are used to predict the permeabilities of gases through polymers. Materials have been tested for hydrogen and carbon monoxide permeabilities, and improvements in expected economic performance have been achieved.
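A Permachor correlation of the kind mentioned above takes the broad form of a linear relationship between the logarithm of permeability and the polymer's Permachor number. The sketch below fits such a relationship by least squares; every number in it is an invented placeholder for illustration, and neither the data points nor the fitted coefficients come from the thesis.

```python
# Fitting a Salame-style correlation ln(P) = a - s * pi, where pi is the
# polymer Permachor number and P the gas permeability. All values below
# are hypothetical placeholders, not measured data from the thesis.
import numpy as np

permachor = np.array([20.0, 40.0, 60.0, 80.0, 100.0])   # hypothetical pi values
permeability = np.array([50.0, 12.0, 3.0, 0.8, 0.2])    # hypothetical P values

slope, intercept = np.polyfit(permachor, np.log(permeability), 1)
print(f"ln(P) = {intercept:.2f} - {-slope:.3f} * pi")

# Predicting the permeability of a candidate polymer with pi = 70:
print(np.exp(intercept + slope * 70.0))
```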
Abstract:
This paper presents some forecasting techniques for energy demand and price prediction, one day ahead. These techniques combine the wavelet transform (WT) with fixed and adaptive machine learning/time series models (multi-layer perceptron (MLP), radial basis functions, linear regression, or GARCH). To create an adaptive model, we use an extended Kalman filter or particle filter to update the parameters continuously on the test set. The adaptive GARCH model is a new contribution, broadening the applicability of GARCH methods. We empirically compared two approaches to combining the WT with prediction models: multicomponent forecasts and direct forecasts. These techniques are applied to large sets of real data (both stationary and non-stationary) from the UK energy markets, so as to provide comparative results that are statistically stronger than those previously reported. The results showed that forecasting accuracy is significantly improved by using the WT and adaptive models. The best models for the electricity demand and gas price forecasts are the adaptive MLP and the adaptive GARCH with the multicomponent forecast; their MSEs are 0.02314 and 0.15384, respectively.
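The adaptive models described here treat the model parameters as a slowly drifting state that is refined with each new observation. For a model that is linear in its parameters this update is the standard Kalman filter, sketched below; the paper's EKF and particle-filter variants extend the same idea to non-linear models such as the MLP. Dimensions and noise levels in the sketch are illustrative.

```python
# Kalman-filter updating of linear-regression weights: the weights w follow
# a random walk (state equation) and each observation y = x.w + noise
# (measurement equation) refines the estimate during the test period.
import numpy as np

def kalman_step(w, P, x, y, q=1e-4, r=1e-2):
    """One predict/update cycle for weight vector w and covariance P."""
    P = P + q * np.eye(len(w))      # predict: random-walk drift in the weights
    innovation = y - x @ w          # measurement residual
    s = x @ P @ x + r               # innovation variance (scalar)
    k = P @ x / s                   # Kalman gain
    w = w + k * innovation          # update the weight estimate
    P = P - np.outer(k, x) @ P      # update the covariance
    return w, P

rng = np.random.default_rng(3)
true_w = np.array([0.5, -1.2, 2.0])
w, P = np.zeros(3), np.eye(3)
for _ in range(500):
    x = rng.standard_normal(3)
    y = x @ true_w + 0.1 * rng.standard_normal()
    w, P = kalman_step(w, P, x, y)
print(w)  # converges towards true_w
```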
Abstract:
We address the question of how to obtain effective fusion of identification information such that it is robust to the quality of that information. As well as technical issues, data fusion is encumbered with a collection of (potentially confusing) practical considerations. These considerations are described in the early chapters, in which a framework for data fusion is developed. Following this process of diversification, it becomes clear that the original question is not well posed and requires more precise specification. We use the framework to focus on some of the technical issues relevant to the question being addressed. We show that fusion of hard decisions through the use of an adaptive version of the maximum a posteriori decision rule yields acceptable performance. Better performance is possible using probability-level fusion, as long as the probabilities are accurate. Of particular interest is the prevalence of overconfidence and the effect it has on fused performance. The production of accurate probabilities from poor-quality data forms the latter part of the thesis. Two approaches are taken: firstly, the probabilities may be moderated at source (either analytically or numerically); secondly, the probabilities may be transformed at the fusion centre. In each case an improvement in fused performance is demonstrated. We therefore conclude that, in order to obtain robust fusion, care should be taken to model the probabilities accurately, either at the source or centrally.
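The distinction drawn above between hard-decision and probability-level fusion can be made concrete: hard decisions are combined by voting, while calibrated probabilities from independent sources are combined by summing log-odds, and an overconfident source can be "moderated" by shrinking its log-odds towards zero. The sketch below is an illustrative construction of these ideas, not the thesis's adaptive MAP rule.

```python
# Probability-level fusion of independent sources via log-odds, with a
# simple moderation step that tempers overconfident probabilities.
import math

def fuse_probabilities(probs, moderation=1.0):
    """Fuse per-source P(target) values; moderation < 1 shrinks confidence."""
    log_odds = sum(moderation * math.log(p / (1.0 - p)) for p in probs)
    return 1.0 / (1.0 + math.exp(-log_odds))

sources = [0.9, 0.8, 0.3]                # per-source probabilities of "target"
print(fuse_probabilities(sources))        # naive fusion (assumes calibration)
print(fuse_probabilities(sources, 0.5))   # moderated: overconfidence tempered

# Hard-decision fusion of the same sources: a simple majority vote.
votes = [p > 0.5 for p in sources]
print(sum(votes) > len(votes) / 2)
```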
Abstract:
The introduction of agent technology raises several security issues that are beyond the capability and scope of conventional security mechanisms, but research in protecting the agent from malicious host attack is evolving. This research proposes two approaches to protecting an agent from being attacked by a malicious host. The first approach consists of an obfuscation algorithm that is able to protect the confidentiality of an agent and make it more difficult for a malicious host to spy on the agent. The algorithm uses multiple polynomial functions with multiple random inputs to convert an agent's critical data to a value that is meaningless to the malicious host. The effectiveness of the obfuscation algorithm is enhanced by the addition of noise code. The second approach consists of a mechanism that is able to protect the integrity of the agent by using state information, recorded during the agent execution process in a remote host environment, to detect a manipulation attack by a malicious host. Both approaches are implemented using a master-slave agent architecture that operates on a distributed migration pattern. Two sets of experimental tests were conducted. The first set measures the migration and migration-plus-computation overheads of the itinerary and distributed migration patterns. The second set measures the security overhead of the proposed approaches. The protection of the agent is assessed by analysing its effectiveness under known attacks. Finally, an agent-based application, known as the Secure Flight Finder Agent-based System (SecureFAS), is developed in order to demonstrate the operation of the proposed approaches.
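The general flavour of the first approach can be pictured with a toy construction: a critical value (say, a maximum price the agent may bid) is never shipped in the clear; instead it is blended with random inputs through polynomial functions that only the agent's owner can invert. The sketch below is our own illustration of that idea, not the algorithm from the thesis, and it omits the noise-code enhancement entirely.

```python
# Toy illustration of polynomial-based data obfuscation: the critical
# value is blended with random inputs through polynomial functions, so
# the value carried by the agent is meaningless to an observing host.
import secrets

def obfuscate(critical_value: int, r1: int, r2: int) -> int:
    # Hypothetical polynomials; the owner keeps (r1, r2) secret.
    return critical_value * (r1**2 + 3 * r1) + (r2**3 - 7 * r2)

def deobfuscate(blob: int, r1: int, r2: int) -> int:
    return (blob - (r2**3 - 7 * r2)) // (r1**2 + 3 * r1)

r1, r2 = secrets.randbelow(1000) + 2, secrets.randbelow(1000) + 2
secret = 250                       # e.g. a maximum bid the agent carries
blob = obfuscate(secret, r1, r2)   # this is all the malicious host sees
assert deobfuscate(blob, r1, r2) == secret
```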
Abstract:
Jackson System Development (JSD) is an operational software development method which addresses most of the software lifecycle, either directly or by providing a framework into which more specialised techniques can fit. The method has two major phases: first, an abstract specification is derived that is in principle executable; second, the specification is implemented using a variety of transformations. The object-oriented paradigm is based on data abstraction and encapsulation, coupled to an inheritance architecture that is able to support software reuse. Its claims of improved programmer productivity and easier program maintenance make it an important technology to be considered for building complex software systems. The mapping of JSD specifications into procedural languages typified by Cobol, Ada, etc., involves techniques such as inversion and state vector separation to produce executable systems of acceptable performance. However, at present, no strategy exists to map JSD specifications into object-oriented languages. The aim of this research is to investigate the relationship between JSD and the object-oriented paradigm, and to identify and implement transformations capable of mapping JSD specifications into an object-oriented language typified by Smalltalk-80. The transformational strategy followed is one whereby the concurrency of a specification is removed. Two approaches to implementing inversion (an architectural transformation resulting in the generation of a simulated coroutine mechanism) are described in detail. The first approach realises inversion directly by manipulating Smalltalk-80 system contexts. This is possible in Smalltalk-80 because contexts are first-class objects and are accessible to the user like any other system object; however, the problems associated with this approach are expounded. The second approach realises coroutine-like behaviour in a structure called a 'followmap'. A followmap is the result of a transformation on a JSD process in which a collection of followsets is generated. Each followset represents all possible state transitions a process can undergo from its current state. Followsets, together with exploitation of the class/instance mechanism for implementing state vector separation, form the basis for mapping JSD specifications into Smalltalk-80. A tool, itself built in Smalltalk-80, supports these derived transformations and enables a user to generate Smalltalk-80 prototypes of JSD specifications.
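The followset idea can be pictured with a small sketch: a JSD process becomes an object whose class carries, for each state, the set of transitions the process may take next, so resuming the process is a lookup in this map rather than a coroutine switch, and each instance carries its own state vector. The Python below is our own illustration of the concept (the thesis's followmaps are, of course, realised in Smalltalk-80).

```python
# Sketch of a followmap: for each state of a (flattened) JSD process, a
# followset records the permissible next transitions. Resuming the process
# is a map lookup plus a state change, simulating coroutine behaviour.
class Process:
    # followmap: state -> followset of (event, next_state) pairs
    FOLLOWMAP = {
        "idle":   {("open", "active")},
        "active": {("write", "active"), ("close", "closed")},
        "closed": set(),
    }

    def __init__(self):
        self.state = "idle"  # the state vector, held per instance

    def resume(self, event: str) -> None:
        for ev, nxt in self.FOLLOWMAP[self.state]:
            if ev == event:
                self.state = nxt
                return
        raise ValueError(f"{event!r} not permitted in state {self.state!r}")

p = Process()
p.resume("open"); p.resume("write"); p.resume("close")
print(p.state)  # 'closed'
```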
Abstract:
Many of the applications of geometric modelling are concerned with the computation of well-defined properties of the model. The applications which have received less attention are those which address questions to which there is no unique answer. This thesis describes such an application: the automatic production of a dimensioned engineering drawing. One distinctive feature of this operation is the requirement for sophisticated decision-making algorithms at each stage in the processing of the geometric model. Hence, the thesis is focussed upon the design, development and implementation of such algorithms. Various techniques for geometric modelling are briefly examined, and details are then given of the modelling package that was developed for this project. The principles of orthographic projection and dimensioning are treated, and some published work on the theory of dimensioning is examined. A new theoretical approach to dimensioning is presented and discussed. The existing body of knowledge on decision-making is sampled, and the author then shows how methods which were originally developed for management decisions may be adapted to serve the purposes of this project. The remainder of the thesis is devoted to reports on the development of decision-making algorithms for orthographic view selection, sectioning and cross-hatching, the preparation of orthographic views with essential hidden detail, and two approaches to the actual insertion of dimension lines and text. The thesis concludes that the theories of decision-making can be applied to work of this kind, and that it may be possible to generate computer solutions that are closer to the optimum than some man-made dimensioning schemes. Further work on important details is required before a commercially acceptable package could be produced.