924 results for Input and outputs
Abstract:
A major oceanographic event preserved in the Cocos plate sedimentary column survived subduction and is recorded in the changing composition of Nicaraguan magmas. A uranium increase in these magmas since the latest Miocene (after 7 Ma) resulted from the 'carbonate crash' at 10 Ma and the ensuing high organic carbon burial in the sediments. The response of the arc to this paleoceanographic event requires near steady-state sediment recycling at this margin since 20 Ma. This relative stability in sediment subduction invites one of the first attempts to balance sedimentary input and arc output across a subduction zone. Calculations based on Th indicate that as much as 75% of the sedimentary column was subducted beneath the arc. The Nicaraguan margin is one of the few places to observe such strong links between the oceans and the solid earth.
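As a rough illustration of the Th-based mass balance described above, the sketch below (with purely hypothetical flux values) shows how a recycled fraction follows from comparing sedimentary input with arc output; it is not the paper's actual calculation.

```python
# Illustrative sketch (hypothetical numbers) of the kind of Th mass balance
# used to compare sedimentary input with arc output at a subduction zone.

# Assumed, purely illustrative fluxes in arbitrary units (e.g. g Th per metre
# of arc per year); the paper's actual values are not reproduced here.
sediment_th_input = 1.0      # Th delivered by the subducting sedimentary column
arc_th_output = 0.75         # Th returned to the surface in arc magmas

# Fraction of the sedimentary Th budget that can be accounted for by the arc.
fraction_recycled = arc_th_output / sediment_th_input
print(f"Sedimentary Th accounted for by arc output: {fraction_recycled:.0%}")
```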
Abstract:
Dinoflagellate cysts are useful for reconstructing upper water conditions. For adequate reconstructions, detailed information is required about the relationship between modern-day environmental conditions and the geographic distribution of cysts in sediments. This Atlas summarises the modern global distribution of 71 organic-walled dinoflagellate cyst species. The synthesis is based on the integration of literature sources together with data from 2405 globally distributed surface sediment samples that have been prepared with a comparable methodology and taxonomy. The distribution patterns of individual cyst species are compared with environmental factors that are known to influence dinoflagellate growth, gamete production, encystment, excystment and preservation of their organic-walled cysts: surface water temperature, salinity, nitrate, phosphate, chlorophyll-a concentrations and bottom water oxygen concentrations. Graphs are provided for every species depicting the relationship between seasonal and annual variations of these parameters and the relative abundance of the species. Results have been compared with previously published records; an overview of the ecological significance as well as information about the seasonal production of each individual species is presented. The relationship between the cyst distribution and variation in the aforementioned environmental parameters was analysed by performing a canonical correspondence analysis. All tested variables showed a positive relationship at the 99% confidence level. Sea-surface temperature represents the parameter corresponding to the largest amount of variance within the dataset (40%), followed by nitrate, salinity, phosphate and bottom-water oxygen concentration, which correspond to 34%, 33%, 25% and 24% of the variance, respectively. Characterisations of selected environments, as well as a discussion of how these factors could have influenced the final cyst yield in sediments, are included.
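The sketch below is a loose, simplified stand-in for the per-species comparisons described above: it relates the relative abundance of a single hypothetical cyst species to annual sea-surface temperature with a plain correlation, not the canonical correspondence analysis used in the Atlas; all data are synthetic.

```python
import numpy as np

# Minimal stand-in for the per-species comparisons in the atlas: relate the
# relative abundance of one cyst species in surface-sediment samples to an
# environmental variable (here annual mean SST). This is a simple correlation,
# not the canonical correspondence analysis actually used in the study.
rng = np.random.default_rng(0)

n_samples = 200
sst = rng.uniform(-2, 30, n_samples)                  # hypothetical annual SST (degC)
# Hypothetical species with a warm-water preference plus noise.
rel_abundance = np.clip(0.03 * (sst - 10) + rng.normal(0, 0.1, n_samples), 0, None)

r = np.corrcoef(sst, rel_abundance)[0, 1]
print(f"Correlation between SST and relative abundance: {r:.2f}")
```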
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Risk-ranking protocols are used widely to classify the conservation status of the world's species. Here we report on the first empirical assessment of their reliability by using a retrospective study of 18 pairs of bird and mammal species (one species extinct and the other extant) with eight different assessors. The performance of individual assessors varied substantially, but performance was improved by incorporating uncertainty in parameter estimates and consensus among the assessors. When this was done, the ranks from the protocols were consistent with the extinction outcome in 70-80% of pairs and there were mismatches in only 10-20% of cases. This performance was similar to the subjective judgements of the assessors after they had estimated the range and population parameters required by the protocols, and better than any single parameter. When used to inform subjective judgement, the protocols therefore offer a means of reducing unpredictable biases that may be associated with expert input and have the advantage of making the logic behind assessments explicit. We conclude that the protocols are useful for forecasting extinctions, although they are prone to some errors that have implications for conservation. Some level of error is to be expected, however, given the influence of chance on extinction. The performance of risk assessment protocols may be improved by providing training in the application of the protocols, incorporating uncertainty in parameter estimates and using consensus among multiple assessors, including some who are experts in the application of the protocols. Continued testing and refinement of the protocols may help to provide better absolute estimates of risk, particularly by re-evaluating how the protocols accommodate missing data.
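A minimal sketch of the paired retrospective test described above, assuming hypothetical risk ranks (higher means greater assessed risk); it simply counts pairs in which the extinct species was ranked more severely than its extant counterpart.

```python
# Sketch of the paired retrospective test: for each extinct/extant pair, check
# whether the risk rank assigned to the extinct species was at least as severe
# as that of its extant counterpart. Ranks below are hypothetical (higher =
# greater assessed risk), not the study's data.
pairs = [
    {"extinct_rank": 4, "extant_rank": 2},   # consistent
    {"extinct_rank": 3, "extant_rank": 3},   # tie
    {"extinct_rank": 2, "extant_rank": 4},   # mismatch
]

consistent = sum(p["extinct_rank"] > p["extant_rank"] for p in pairs)
ties = sum(p["extinct_rank"] == p["extant_rank"] for p in pairs)
mismatches = len(pairs) - consistent - ties

print(f"consistent: {consistent}, ties: {ties}, mismatches: {mismatches}")
```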
Abstract:
The estimated parameters of output distance functions frequently violate the monotonicity, quasi-convexity and convexity constraints implied by economic theory, leading to estimated elasticities and shadow prices that are incorrectly signed, and ultimately to perverse conclusions concerning the effects of input and output changes on productivity growth and relative efficiency levels. We show how a Bayesian approach can be used to impose these constraints on the parameters of a translog output distance function. Implementing the approach involves the use of a Gibbs sampler with data augmentation. A Metropolis-Hastings algorithm is also used within the Gibbs sampler to simulate observations from truncated pdfs. Our methods are developed for the case where panel data are available and technical inefficiency effects are assumed to be time-invariant. Two models, a fixed effects model and a random effects model, are developed and applied to panel data on 17 European railways. We observe significant changes in estimated elasticities and shadow price ratios when regularity restrictions are imposed.
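A minimal sketch of the accept/reject idea behind imposing regularity constraints inside a Metropolis-Hastings step: proposals that violate the constraint are rejected outright. The target density and constraint below are placeholders, not the paper's translog output distance function specification.

```python
import numpy as np

# Minimal sketch of rejecting draws that violate a regularity (e.g. monotonicity)
# constraint inside a Metropolis-Hastings step. The target density and the
# constraint below are placeholders, not the translog distance function model.
rng = np.random.default_rng(1)

def log_target(beta):
    # Placeholder log posterior: standard normal.
    return -0.5 * np.sum(beta ** 2)

def satisfies_constraints(beta):
    # Placeholder monotonicity-type restriction: all coefficients non-negative.
    return np.all(beta >= 0)

beta = np.array([0.5, 0.5])
draws = []
for _ in range(5000):
    proposal = beta + rng.normal(0, 0.2, size=beta.shape)
    # Proposals outside the constrained region are rejected outright,
    # which is equivalent to a prior placing zero mass there.
    if satisfies_constraints(proposal):
        log_alpha = log_target(proposal) - log_target(beta)
        if np.log(rng.uniform()) < log_alpha:
            beta = proposal
    draws.append(beta.copy())

print("posterior mean:", np.mean(draws, axis=0))
```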
Abstract:
We demonstrate a portable process for developing a triple bottom line model to measure the knowledge production performance of individual research centres. For the first time, this study also empirically illustrates how a fully units-invariant model of Data Envelopment Analysis (DEA) can be used to measure the relative efficiency of research centres by capturing the interaction amongst a common set of multiple inputs and outputs. This study is particularly timely given the increasing transparency required by governments and industries that fund research activities. The process highlights the links between organisational objectives, desired outcomes and outputs while the emerging performance model represents an executive managerial view. This study brings consistency to current measures that often rely on ratios and univariate analyses that are not otherwise conducive to relative performance analysis.
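For concreteness, the sketch below solves a standard input-oriented, constant-returns DEA envelopment problem as a linear program with SciPy; the data are hypothetical and this is not necessarily the exact units-invariant formulation adopted in the study.

```python
import numpy as np
from scipy.optimize import linprog

# Hedged sketch of an input-oriented, constant-returns DEA envelopment problem
# solved as a linear program; the data are hypothetical research-centre inputs
# and outputs, and this is not necessarily the paper's units-invariant model.
# X: inputs (n_inputs x n_units), Y: outputs (n_outputs x n_units)
X = np.array([[4.0, 6.0, 5.0, 8.0],
              [3.0, 2.0, 4.0, 3.0]])
Y = np.array([[2.0, 3.0, 2.5, 3.5]])

def ccr_efficiency(o):
    n_inputs, n_units = X.shape
    n_outputs = Y.shape[0]
    # Decision variables: [theta, lambda_1, ..., lambda_n]
    c = np.zeros(1 + n_units)
    c[0] = 1.0                                   # minimise theta
    # Input constraints:  X @ lam - theta * x_o <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(n_inputs)
    # Output constraints: -Y @ lam <= -y_o  (i.e. Y @ lam >= y_o)
    A_out = np.hstack([np.zeros((n_outputs, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.hstack([b_in, b_out]),
                  bounds=[(None, None)] + [(0, None)] * n_units, method="highs")
    return res.fun

for unit in range(X.shape[1]):
    print(f"unit {unit}: efficiency = {ccr_efficiency(unit):.3f}")
```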
Abstract:
We develop foreign bank technical, cost and profit efficiency models for particular application with data envelopment analysis (DEA). Key motivations for the paper are (a) the often-observed practice of choosing inputs and outputs where the selection process is poorly explained and linkages to theory are unclear, and (b) foreign bank productivity analysis, which has been neglected in DEA banking literature. The main aim is to demonstrate a process grounded in finance and banking theories for developing bank efficiency models, which can bring comparability and direction to empirical productivity studies. We expect this paper to foster empirical bank productivity studies.
Abstract:
The goal of this manuscript is to introduce a framework for consideration of designs for population pharmacokinetic or pharmacokinetic-pharmacodynamic studies. A standard one-compartment pharmacokinetic model with first-order input and elimination is considered. A series of theoretical designs are considered that explore the influence of optimizing the allocation of sampling times, allocating patients to elementary designs, consideration of sparse sampling and unbalanced designs, and also the influence of single vs. multiple dose designs. It was found that what appears to be relatively sparse sampling (fewer blood samples per patient than the number of fixed effects parameters to estimate) can also be highly informative. Overall, it is evident that exploring the population design space can yield many parsimonious designs that are efficient for parameter estimation and that may not otherwise have been considered without the aid of optimal design theory.
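The sketch below gives the standard closed-form concentration profile for a one-compartment model with first-order input and elimination after a single oral dose, together with a two-sample sparse design; parameter values are hypothetical and the sketch is illustrative only, not the manuscript's design evaluation.

```python
import numpy as np

# Standard closed-form concentration for a one-compartment model with
# first-order input (absorption) and first-order elimination after a single
# oral dose; parameter values below are hypothetical.
def concentration(t, dose=100.0, F=1.0, ka=1.0, CL=5.0, V=50.0):
    ke = CL / V
    return (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# A sparse design: fewer samples per subject than the number of fixed-effect
# parameters (here ka, CL, V), e.g. two samples at 1 h and 8 h post-dose.
sparse_times = np.array([1.0, 8.0])
print(concentration(sparse_times))
```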
Abstract:
Wool tenderness is a significant problem in Australia, especially in areas where sheep graze under highly seasonal conditions. In this study, a profit function model is specified, estimated and simulated to assess the economic impact of staple-strength-enhancing research on the profits of Australian woolgrowers. The model is based on a number of fundamental characteristics of the Australian wool industry and the staple-strength-enhancing technology being assessed. The model consists of a system of demand and supply equations that are specified in terms of effective, rather than actual, prices. The interrelationships between the inputs and outputs are allowed for in the model in a manner that is consistent with theoretical restrictions. The adoption of the new feed management strategy results in a 4.4% increase in the expected profits of Australian wool producers in the short run, and a 2.2% increase in expected profits in the long run.
Abstract:
This paper incorporates hierarchical structure into the neoclassical theory of the firm. Firms are hierarchical in two respects: the organization of workers in production and the wage structure. The firm's hierarchy is represented as the sector of a circle, where the radius represents the hierarchy's height, the width of the sector represents the breadth of the hierarchy at a given height, and the angle of the sector represents the span of control for any given supervisor. A perfectly competitive firm then chooses height and width, as well as capital inputs, in order to maximize profit. We analyze the short-run and long-run impact of changes in scale economies, input substitutability, and input and output prices on the firm's hierarchical structure. We find that the firm unambiguously becomes more hierarchical as the specialization of its workers increases or as its output price increases relative to input prices. The effect of changes in scale economies is contingent on the output price. The model also brings forth an analysis of wage inequality within the firm, which is found to be independent of technological considerations and to depend only on the firm's wage schedule.
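One possible way to make the sector geometry concrete is sketched below, reading the breadth at a given level as the arc length at that radius; this is an illustrative interpretation with hypothetical numbers, not necessarily the paper's exact formalisation.

```python
import numpy as np

# One way to make the sector geometry concrete (not necessarily the paper's
# exact formalisation): with hierarchy height h (the radius) and span of
# control given by the sector angle, the breadth of the hierarchy at a level
# r below the apex is read as the arc length at radius r.
def breadth(r, angle):
    return angle * r          # arc length at radius r

h = 5.0                       # hierarchy height (hypothetical)
angle = 0.8                   # span-of-control angle in radians (hypothetical)

levels = np.linspace(1.0, h, 5)
for r in levels:
    print(f"level at radius {r:.1f}: breadth = {breadth(r, angle):.2f}")
```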
Abstract:
Error rates of a Boolean perceptron with threshold and either a spherical or an Ising constraint on the weight vector are calculated, within a one-step replica symmetry breaking (RSB) treatment, for storing patterns drawn from biased input and output distributions. For an unbiased output distribution and non-zero stability of the patterns, we find a critical load, α_p, above which two solutions to the saddle-point equations appear: one with higher free energy and zero threshold, and a dominant solution with non-zero threshold. We examine this second-order phase transition and the dependence of α_p on the required pattern stability, κ, for both one-step RSB and replica symmetry (RS) in the spherical case and for one-step RSB in the Ising case.
Abstract:
In recent years there has been an increased interest in applying non-parametric methods to real-world problems. Significant research has been devoted to Gaussian processes (GPs) due to their increased flexibility when compared with parametric models. These methods use Bayesian learning, which generally leads to analytically intractable posteriors. This thesis proposes a two-step solution to construct a probabilistic approximation to the posterior. In the first step we adapt Bayesian online learning to GPs: the final approximation to the posterior is the result of propagating the first and second moments of intermediate posteriors obtained by combining a new example with the previous approximation. The propagation of functional forms is solved by showing the existence of a parametrisation of the posterior moments that uses combinations of the kernel function at the training points, transforming the Bayesian online learning of functions into a parametric formulation. The drawback is the prohibitive quadratic scaling of the number of parameters with the size of the data, making the method inapplicable to large datasets. The second step solves the problem of the exploding parameter size and makes GPs applicable to arbitrarily large datasets. The approximation is based on a measure of distance between two GPs, the KL-divergence between GPs. This second approximation uses a constrained GP in which only a small subset of the whole training dataset is used to represent the GP. This subset is called the Basis Vector (BV) set, and the resulting GP is a sparse approximation to the true posterior. As this sparsity is based on KL-minimisation, it is probabilistic and independent of the way the posterior approximation from the first step is obtained. We combine the sparse approximation with an extension to the Bayesian online algorithm that allows multiple iterations over each input, thus approximating a batch solution. The resulting sparse learning algorithm is a generic one: for different problems we only change the likelihood. The algorithm is applied to a variety of problems and we examine its performance both on classical regression and classification tasks and on data assimilation and a simple density estimation problem.
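The sketch below illustrates the parametrisation mentioned above for plain batch GP regression: the posterior mean is written as a combination of the kernel evaluated at the training points. It is not the sparse online (BV set) algorithm developed in the thesis, and only the mean is shown; all data and kernel settings are hypothetical.

```python
import numpy as np

# Illustration of the parametrisation mentioned above for plain GP regression:
# the posterior mean is a combination of the kernel evaluated at the training
# points, m(x*) = sum_i alpha_i k(x*, x_i). This is exact batch GP regression,
# not the sparse online (BV set) algorithm developed in the thesis.
rng = np.random.default_rng(2)

def rbf(a, b, lengthscale=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

X = rng.uniform(-3, 3, 30)               # training inputs (hypothetical)
y = np.sin(X) + rng.normal(0, 0.1, 30)   # noisy targets
noise_var = 0.1 ** 2

K = rbf(X, X)
alpha = np.linalg.solve(K + noise_var * np.eye(len(X)), y)   # kernel weights

X_star = np.linspace(-3, 3, 5)
mean_star = rbf(X_star, X) @ alpha       # posterior mean at test points
print(mean_star)
```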
Abstract:
As a part of the Managing Uncertainty in Complex Models (MUCM) project, research at Aston University will develop methods for dimensionality reduction of the input and/or output spaces of models, as seen within the emulator framework. Towards this end this report describes a framework for generating toy datasets, whose underlying structure is understood, to facilitate early investigations of dimensionality reduction methods and to gain a deeper understanding of the algorithms employed, both in terms of how effective they are for given types of models / situations, and also their speed in applications and how this scales with various factors. The framework, which allows the evaluation of both screening and projection approaches to dimensionality reduction, is described. We also describe the screening and projection methods currently under consideration and present some preliminary results. The aim of this draft of the report is to solicit feedback from the project team on the dataset generation framework, the methods we propose to use, and suggestions for extensions that should be considered.
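A small sketch of the kind of toy dataset the framework is intended to generate: the simulator output depends only on a low-dimensional projection of the input, so screening and projection methods can be judged against known structure; the dimensions and test function below are arbitrary choices, not those of the report.

```python
import numpy as np

# Sketch of a toy dataset with known low-dimensional structure, of the kind the
# report's framework is meant to generate: the output depends only on a
# 2-dimensional projection of a 10-dimensional input, so a dimensionality
# reduction method can be judged against the known projection W.
rng = np.random.default_rng(3)

n, d, k = 500, 10, 2
W = rng.normal(size=(d, k))              # true projection (known by construction)
X = rng.normal(size=(n, d))              # simulator inputs
Z = X @ W                                # latent low-dimensional coordinates
y = np.sin(Z[:, 0]) + Z[:, 1] ** 2 + rng.normal(0, 0.05, n)   # simulator output

# A screening method should flag only directions spanned by W as relevant;
# a projection method should recover a subspace close to span(W).
print(X.shape, y.shape, W.shape)
```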