952 results for Stochastic Frontier Production Function
Abstract:
1. We examined the effect of thermal acclimation on fighting success and underlying performance traits in the crayfish Cherax destructor. We tested the hypothesis that animals will be more successful when fighting at their acclimation temperature than at a colder or warmer temperature, and that changes in metabolic capacity underlie differences in behavioural performance. 2. Thermal acclimation (to 20 °C and to 30 °C) had a significant effect on behavioural contests, and the likelihood of winning was significantly greater when individuals fought at their acclimation temperature against an individual from the alternate acclimation temperature. 3. The ratio of ADP-stimulated respiration to proton leak (respiratory control ratio) of isolated mitochondria increased significantly in chelae muscle of the cold-acclimated group, and differences in respiratory control ratio between winners and losers were significantly correlated with the outcome of agonistic encounters. However, acclimation did not affect tail muscle mitochondria or the activity of pyruvate kinase in either chelae or tail muscle. 4. The force produced by closing chelae was thermally insensitive within acclimation groups, and there were no significant differences between acclimation treatments. Nonetheless, differences in chelae width between contestants were significantly correlated with the outcome of agonistic encounters, but this perceived resource-holding power did not reflect the actual power of force production. 5. Thermal acclimation in C. destructor has beneficial consequences for dominance and competitive ability, and the success of cold-acclimated animals at cold temperatures can be at least partly explained by concomitant up-regulation of oxidative ATP production capacity.
Abstract:
Around the world, consumers and retailers of fresh produce are becoming more and more discerning about factors such as food safety and traceability, health, convenience and the sustainability of production systems, and in doing so they are changing the way in which fresh produce supply chains are configured and managed. When consumers demand fresh, safe, convenient, value-for-money produce, retailers in an increasingly competitive environment are attracted to those business models most capable of meeting these demands profitably. Traditional models are proving less and less able to deliver competitive advantage in such an environment. As a result, opportunistic, adversarial, price-based approaches to doing business between chain members are being replaced by approaches that are more strategic, collaborative and value-based. The shaping force behind this change is the need for producers, wholesalers, category managers, retailers and consumers to have more certainty about the performance of the supply chains upon which they rely. Certainty is generated through the supply chain's ability to create, deliver and share value. How to build supply chains that create, deliver and share value is arguably the single biggest challenge to the competitiveness of fresh produce firms, and therefore to the industries to which they belong.
Abstract:
Poly(ε-caprolactone) (PCL) fibers produced by wet spinning from solutions in acetone under low-shear (gravity-flow) conditions resulted in a fiber strength of 8 MPa and a stiffness of 0.08 GPa. Cold drawing to an extension of 500% increased fiber strength to 43 MPa and stiffness to 0.3 GPa. The growth rate of human umbilical vein endothelial cells (HUVECs) (seeded at a density of 5 × 10⁴ cells/mL) on as-spun fibers was consistently lower than that measured on tissue culture plastic (TCP) beyond day 2. Cell proliferation was similar on gelatin-coated fibers and TCP over 7 days and higher by a factor of 1.9 on 500% cold-drawn PCL fibers relative to TCP up to 4 days. Cell growth on PCL fibers exceeded that on Dacron monofilament by at least a factor of 3.7 at 9 days. Scanning electron microscopy revealed formation of a cell layer on samples of cold-drawn and gelatin-coated fibers after 24 hours in culture. Similar levels of ICAM-1 expression by HUVECs attached to PCL fibers and TCP were measured using RT-PCR and flow cytometry, indicative of low levels of immune activation. Retention of a specific function of HUVECs attached to PCL fibers was demonstrated by measuring their immune response to lipopolysaccharide. Levels of ICAM-1 expression increased by approximately 11% in cells attached to PCL fibers and TCP. The high fiber compliance, favorable endothelial cell proliferation rates, and retention of an important immune response of attached HUVECs support the use of gravity-spun PCL fibers for three-dimensional scaffold production in vascular tissue engineering. © Mary Ann Liebert, Inc.
Abstract:
This paper reports on the theoretical foundations and the practical reasons for the increasing popularity of enterprise management. The research has specifically aimed to investigate the dependency between the prevailing type of core competence and the emergent enterprise structure. Empirical inductive research has been conducted in the German automotive industry using the grounded theory approach. This has involved an initial literature review, transcription and codification of interviews to derive tentative propositions, and the validation of the tentative propositions through a questionnaire survey. The research has resulted in the consolidation of the most valid propositions into a conceptual framework. This conceptual framework has been proposed to support enterprise managers who have to make strategic decisions. This study indicates that the prevailing type of core competence is a significant factor that influences the design and management of the enterprise structure. © 2006 Elsevier B.V. All rights reserved.
Abstract:
Recently, Drăgulescu and Yakovenko proposed an analytical formula, based on the Heston model, for computing the probability density function of stock log returns, which they tested empirically. Their research design inadvertently biased the fit of the data to the Heston model favourably, thus overstating their empirical results. Furthermore, Drăgulescu and Yakovenko did not perform any goodness-of-fit statistical tests. This study employs a research design that facilitates statistical tests of the goodness-of-fit of the Heston model to empirical returns. Robustness checks are also performed. In brief, the Heston model outperformed the Gaussian model only at high frequencies, and even then did not provide a statistically acceptable fit to the data. The Gaussian model performed (marginally) better at medium and low frequencies, where the extra parameters of the Heston model have adverse impacts on the test statistics. © 2005 Taylor & Francis Group Ltd.
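The kind of formal goodness-of-fit check this abstract calls for can be sketched in a few lines. The example below is illustrative only: it uses synthetic Gaussian returns in place of the study's empirical data, fits the Gaussian model by maximum likelihood, and applies a Kolmogorov-Smirnov test.

```python
# Sketch of a goodness-of-fit test of the Gaussian model for log returns,
# using invented synthetic data (not the data from the study).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
log_returns = rng.normal(loc=0.0005, scale=0.01, size=2000)  # stand-in data

# Fit the Gaussian model by maximum likelihood (sample mean and std).
mu, sigma = log_returns.mean(), log_returns.std(ddof=1)

# KS test of the fitted Gaussian against the empirical distribution.
# Caveat: estimating the parameters from the same sample biases the KS
# p-value upward; a Lilliefors correction or a parametric bootstrap would
# be needed for a strictly valid test.
ks_stat, p_value = stats.kstest(log_returns, 'norm', args=(mu, sigma))
print(f"KS statistic: {ks_stat:.4f}, p-value: {p_value:.4f}")
```

The same test applied with the Heston density in place of the Gaussian CDF is, in essence, the comparison the study performs across sampling frequencies.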
Abstract:
This paper assumes that a primary function of management accounting is the representation of "accounting facts" for purposes such as organizational control. Accountants are able to offer conventional techniques of control, such as standard costing, as a consequence of their ability to deploy accounting representations within managerial and economic models of organizational processes. Accounting competes, at times, with other 'professional' groups, such as production planning or quality management people, in this role of representing the organization to management. The paper develops its arguments around a case illustration of cost accounting set in a low technology manufacturing environment. The research relates to a case organization in which accountants are attempting to establish the reliability of accounting inscriptions of a simple manufacturing process. The case research focuses on the documents, the inscriptions that vie for management's attention. It is these sometimes messy and inaccurate representations which enable control of complex and heterogeneous activities at a distance. At the end of our site visits we observe quality management systems in the ascendancy over the accountants' standard costing systems. © 2006 Elsevier Ltd. All rights reserved.
Abstract:
In 1957, Farrell proposed to measure technical (in)efficiency as the realised deviation from a frontier isoquant. Since then, research has developed several methods to derive the production frontier, and its scope has been extended by applying frontier techniques to the measurement of total factor productivity. In this paper, I present the core techniques for the measurement of technical efficiency and productivity based on the notion of a frontier, and introduce the more recent technological advances in the field.
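One of the classic deterministic frontier techniques covered by such surveys, corrected ordinary least squares (COLS), can be sketched briefly. The Cobb-Douglas data below are invented for illustration and are not from the paper: OLS fits the average production function, and shifting the intercept up by the largest residual turns it into an enveloping frontier from which Farrell-style efficiency scores follow.

```python
# COLS sketch: technical efficiency as deviation from a shifted OLS frontier.
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.uniform(1.0, 10.0, size=n)             # single input (invented data)
u = rng.exponential(scale=0.2, size=n)         # inefficiency term, >= 0
y = 2.0 * x ** 0.6 * np.exp(-u)                # output lies below the frontier

# OLS fit of the log-linear production function ln y = a + b ln x + e.
X = np.column_stack([np.ones(n), np.log(x)])
coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
resid = np.log(y) - X @ coef

# COLS correction: shift the intercept by the largest residual so the
# frontier envelops all observations; technical efficiency is then
# TE_i = exp(e_i - max e), which lies in (0, 1].
te = np.exp(resid - resid.max())
print(f"estimated elasticity: {coef[1]:.3f}, mean TE: {te.mean():.3f}")
```

Stochastic frontier methods differ from this sketch in decomposing the residual into noise and inefficiency components rather than attributing all deviation to inefficiency.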
Abstract:
This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz '63 (3-dimensional model). The algorithms are also applied to the 40 dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well known methods, such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system states and model parameters) and full weak-constraint 4D-Var. Their asymptotic behaviour is analysed empirically as the observation density or the length of the time window increases.
Abstract:
Receptor activity modifying protein 1 (RAMP1) is an integral component of several receptors including the calcitonin gene-related peptide (CGRP) receptor. It forms a complex with the calcitonin receptor-like receptor (CLR) and is required for receptor trafficking and ligand binding. The N-terminus of RAMP1 comprises three helices. The current study investigated regions of RAMP1 important for CGRP or CLR interactions by alanine mutagenesis. Modeling suggested the second and third helices were important in protein-protein interactions. Most of the conserved residues in the N-terminus (M48, W56, Y66, P85, N86, H97, F101, D113, P114, P115), together with a further 13 residues spread throughout the three helices of RAMP1, were mutated to alanine and coexpressed with CLR in Cos 7 cells. None of the mutations significantly reduced RAMP expression. Of the nine mutants from helix 1, only M48A had any effect, producing a modest reduction in trafficking of CLR to the cell surface. In helix 2, Y66A almost completely abolished CLR trafficking; L69A and T73A reduced the potency of CGRP to produce cAMP. In helix 3, H97A abolished CLR trafficking; P85A, N86A, and F101A caused modest reductions in CLR trafficking and also reduced the potency of CGRP on cAMP production. F93A caused a modest reduction in CLR trafficking alone, and L94A increased cAMP production. The data are consistent with a CLR recognition site particularly involving Y66 and H97, with lesser roles for adjacent residues in helix 3. L69 and T73 may contribute to a CGRP recognition site in helix 2 also involving nearby residues.
Abstract:
We present the case of two aphasic patients: one with fluent speech, MM, and one with dysfluent speech, DB. Both patients make similar proportions of phonological errors in speech production, and the errors have similar characteristics. A closer analysis, however, shows a number of differences. DB's phonological errors involve, for the most part, simplifications of syllabic structure; they affect consonants more than vowels; and, among vowels, they show effects of sonority/complexity. This error pattern may reflect articulatory difficulties. MM's errors, instead, show little effect of syllable structure, affect vowels at least as much as consonants, and affect all different vowels to a similar extent. This pattern is consistent with a more central impairment involving the selection of the right phoneme among competing alternatives. We propose that, at this level, vowel selection may be more difficult than consonant selection because vowels belong to a smaller set of repeatedly activated units.
Abstract:
The infiltration and persistence of hematopoietic immune cells within the rheumatoid arthritis (RA) joint results in elevated levels of pro-inflammatory cytokines and increased generation of reactive oxygen (ROS) and reactive nitrogen (RNS) species, which feeds a continuous self-perpetuating cycle of inflammation and destruction. Meanwhile, the controlled production of ROS is required for signaling within the normal physiological reaction to perceived "foreign matter" and for effective apoptosis. This review focuses on the signaling pathways responsible for the induction of the normal immune response and the contribution of ROS to this process. Evidence for defects in the ability of immune cells in RA to regulate the generation of ROS, and the consequences for their immune function and for RA progression, is considered. As the hypercellularity of the rheumatoid joint and the associated persistence of hematopoietic cells within it are symptomatic of unresponsiveness to apoptotic stimuli, the roles of apoptotic signaling proteins (specifically Bcl-2 family members and the tumor suppressor p53) as regulators of ROS generation and apoptosis are considered, evaluating evidence for their aberrant expression and function in RA. We postulate that ROS generation is required for effective therapeutic intervention.
Abstract:
This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein–Uhlenbeck process, the exact likelihood of which can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz '63 (3D model). As a special case the algorithm is also applied to the 40 dimensional stochastic Lorenz '96 system. In our investigation we compare this new approach with a variety of other well known methods, such as the hybrid Monte Carlo sampler, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and analyse empirically their asymptotic behaviour as the observation density or the length of the time window increases. In particular we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) parts of the model evolution equations using our new methods.
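The reason the Ornstein–Uhlenbeck process serves as a benchmark in work like this is that its transition density is Gaussian, so the exact likelihood of a discretely observed path is available in closed form and approximate inference schemes can be checked against exact maximum likelihood. A minimal sketch, with illustrative parameter values not taken from the paper:

```python
# Exact simulation and exact maximum-likelihood estimation of the drift
# parameter of an Ornstein-Uhlenbeck process dX = -theta*X dt + sigma dW.
import numpy as np

rng = np.random.default_rng(2)
theta_true, sigma, dt, n = 1.5, 0.5, 0.1, 2000

# Simulate exactly via the Gaussian transition:
# X_{t+dt} | X_t ~ N(X_t * exp(-theta*dt), sigma^2 (1 - e^{-2 theta dt}) / (2 theta)).
a = np.exp(-theta_true * dt)
s2 = sigma**2 * (1 - a**2) / (2 * theta_true)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = a * x[t - 1] + np.sqrt(s2) * rng.standard_normal()

def neg_log_lik(theta):
    """Exact negative log-likelihood of theta given the observed path."""
    a = np.exp(-theta * dt)
    s2 = sigma**2 * (1 - a**2) / (2 * theta)
    resid = x[1:] - a * x[:-1]
    return 0.5 * np.sum(resid**2 / s2 + np.log(2 * np.pi * s2))

# Crude grid search for the MLE of the drift parameter.
grid = np.linspace(0.1, 5.0, 491)
theta_hat = grid[np.argmin([neg_log_lik(th) for th in grid])]
print(f"true theta: {theta_true}, MLE: {theta_hat:.2f}")
```

For the double-well and Lorenz systems no such closed form exists, which is precisely where the variational approximations described above earn their keep.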
Abstract:
Since the oil crisis of 1973, considerable interest has been shown in the production of liquid fuels from alternative sources. In particular, processes utilizing coal as the feedstock have received considerable interest. These processes can be divided into direct and indirect liquefaction and pyrolysis. This thesis describes the modelling of indirect coal liquefaction processes for the purpose of performing technical and economic assessments of the production of liquid fuels from coal and lignite, using a variety of gasification and synthesis gas liquefaction technologies. The technologies were modelled on a 'step model' basis, where a step is defined as a combination of individual unit operations which together perform a significant function on the process streams, such as a methanol synthesis step or a gasification and physical gas cleaning step. Sample results of the modelling, covering a wide range of gasifiers, liquid synthesis processes and products, are presented in this thesis. Due to the large number of combinations of gasifier, liquid synthesis process, product and economic sensitivity case, a complete set of results is impractical to present in a single publication. The main results show that methanol is the cheapest fuel to produce from coal, followed by fuel alcohol, diesel from the Shell Middle Distillate Synthesis process, gasoline from the Mobil Methanol to Gasoline (MTG) process, diesel from the Mobil Methanol Olefins Gasoline Diesel (MOGD) process and finally gasoline from the same process. Some variation in the production costs of all the products was shown, depending on the type of gasifier chosen and the feedstock.
Abstract:
This project has been undertaken for Hamworthy Hydraulics Limited. Its objective was to design and develop a controller package for a variable displacement hydraulic pump for use mainly on mobile earth-moving machinery. A survey was undertaken of control options used in practice, and from this a design specification was formulated, the successful implementation of which would give Hamworthy an advantage over its competitors. Two different modes for the controller were envisaged: one used conventional hydro-mechanics and the other was based upon a microprocessor. To meet short-term customer prototype requirements, the first section of work was the realisation of the hydro-mechanical system. Mathematical models were made to evaluate controller stability and hence aid design. The final package met the requirements of the specification, and a single version could operate all sizes of variable displacement pump in the Hamworthy range. The choice of controller options and combinations totalled twenty-four. The hydro-mechanical controller was complex, and it was realised that a microprocessor system would allow all options to be implemented with just one design of hardware, thus greatly simplifying production. The final section of this project was to determine whether such a design was feasible. This entailed finding cheap, reliable transducers, using mathematical models to predict electro-hydraulic interface stability, testing such interfaces and finally incorporating a microprocessor in an interactive control loop. The study revealed that such a system was technically possible but would cost 60% more than its hydro-mechanical counterpart. It was therefore concluded that, in the short term, for the markets considered, the hydro-mechanical design was the better solution.
Regarding the microprocessor system, the final conclusion was that, because the relative costs of the two systems are decreasing, the electro-hydraulic controller will gradually become more attractive, and therefore Hamworthy should continue with its development.
Abstract:
This work introduces a novel inversion-based neurocontroller for solving control problems involving uncertain nonlinear systems which could also compensate for multi-valued systems. The approach uses recent developments in neural networks, especially in the context of modelling statistical distributions, which are applied to forward and inverse plant models. Provided that certain conditions are met, an estimate of the intrinsic uncertainty for the outputs of neural networks can be obtained using the statistical properties of networks. More generally, multicomponent distributions can be modelled by the mixture density network. Based on importance sampling from these distributions a novel robust inverse control approach is obtained. This importance sampling provides a structured and principled approach to constrain the complexity of the search space for the ideal control law. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localise the possible control solutions to consider. Convergence of the output error for the proposed control method is verified by using a Lyapunov function. Several simulation examples are provided to demonstrate the efficiency of the developed control method. The manner in which such a method is extended to nonlinear multi-variable systems with different delays between the input-output pairs is considered and demonstrated through simulation examples.
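The core difficulty this abstract addresses — that a multi-valued inverse cannot be captured by a single-valued regression, whose output averages the branches — can be shown with a toy example. Everything below is invented for illustration (a plant y = u², whose inverse u = ±√y is two-valued, and a hand-built two-component Gaussian mixture standing in for a trained mixture density network): sampling candidate controls from the mixture and screening them through the forward model localises the search, instead of committing to a single, averaged, and therefore wrong, inverse solution.

```python
# Toy sketch of sampling control candidates from a mixture model of a
# multi-valued inverse and screening them through the forward plant model.
import numpy as np

rng = np.random.default_rng(3)

def plant(u):
    """Forward plant y = u^2; its inverse u = +/- sqrt(y) is two-valued."""
    return u**2

y_target = 4.0

# Stand-in for a mixture density network's inverse model: one Gaussian
# component per branch of the inverse, with equal mixing weights.
# (A single-valued inverse model would average the branches to u ~ 0,
# giving plant output ~ 0 instead of the target.)
means = np.array([np.sqrt(y_target), -np.sqrt(y_target)])
std = 0.3

# Sample candidate controls from the mixture and keep the candidate whose
# forward-model output is closest to the target.
comp = rng.integers(0, 2, size=200)
candidates = means[comp] + std * rng.standard_normal(200)
best = candidates[np.argmin(np.abs(plant(candidates) - y_target))]
print(f"selected control: {best:.3f}, plant output: {plant(best):.3f}")
```

This is only the sampling-and-screening idea in isolation; the work itself couples it with network-predicted uncertainty and a Lyapunov-based convergence argument.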