22 results for DISTRIBUTION MODELS

in Aston University Research Archive


Relevance:

40.00%

Publisher:

Abstract:

The spatial distribution of self-employment in India: evidence from semiparametric geoadditive models, Regional Studies. The entrepreneurship literature has rarely considered spatial location as a micro-determinant of occupational choice. It has also ignored self-employment in developing countries. Using Bayesian semiparametric geoadditive techniques, this paper models spatial location as a micro-determinant of self-employment choice in India. The empirical results suggest the presence of spatial occupational neighbourhoods and a clear north–south divide in self-employment when the entire sample is considered; however, spatial variation in the non-agriculture sector disappears to a large extent when individual factors that influence self-employment choice are explicitly controlled for. The results further suggest non-linear effects of age, education and wealth on self-employment.

Relevance:

40.00%

Publisher:

Abstract:

Solid tumours display a complex drug resistance phenotype that involves inherent and acquired mechanisms. Multicellular resistance is an inherent feature of solid tumours and is known to present significant barriers to drug permeation in tumours. Given this barrier, do acquired resistance mechanisms such as P-glycoprotein (P-gp) contribute significantly to resistance? To address this question, the multicellular tumour spheroid (MCTS) model was used to examine the influence of P-gp on drug distribution in solid tissue. Tumour spheroids (TS) were generated from either drug-sensitive MCF7WT cells or a drug-resistant, P-gp-expressing derivative, MCF7Adr. Confocal microscopy was used to measure time courses and distribution patterns of three fluorescent compounds: calcein-AM, rhodamine 123 and BODIPY-taxol. These compounds were chosen because they are all substrates for P-gp-mediated transport, exhibit high fluorescence and are chemically dissimilar. Their behaviour in drug-sensitive spheroids (TSWT) nevertheless differed: BODIPY-taxol and rhodamine 123 showed high accumulation and distributed extensively throughout the TSWT, whereas calcein-AM accumulation was restricted to the outermost layers. The presence of P-gp in TSAdr resulted in negligible accumulation, regardless of the compound. Moreover, the inhibition of P-gp by nicardipine restored intracellular accumulation and distribution patterns to the levels observed in TSWT. The results demonstrate the effectiveness of P-gp in modulating drug distribution in solid tumour models. However, the penetration of agents throughout the tissue is strongly determined by the physico-chemical properties of the individual compounds.

Relevance:

40.00%

Publisher:

Abstract:

As a discipline, supply chain management (SCM) has traditionally been primarily concerned with the procurement, processing, movement and sale of physical goods. However, an important class of products has emerged - digital products - which cannot be described as physical, as they do not obey commonly understood physical laws. They do not possess mass or volume, and they require no energy in their manufacture or distribution. With the Internet, they can be distributed at speeds unimaginable in the physical world, and every copy produced is a 100% perfect duplicate of the original version. Furthermore, the ease with which digital products can be replicated has few analogues in the physical world. This paper assesses the effect of non-physicality on one such product – software – in relation to the practice of SCM. It explores the challenges that arise when managing the software supply chain and how practitioners are addressing these challenges. Using a two-pronged exploratory approach that combines a review of the software management literature with direct interviews with software distribution practitioners, a number of key challenges associated with software supply chains are uncovered, along with responses to these challenges. This paper proposes a new model for software supply chains that takes into account the non-physicality of the product being delivered. Central to this model are the replacement of physical flows with flows of intellectual property, the growing importance of innovation over duplication and the increased centrality of the customer in the entire process. Hybrid physical/digital supply chains are discussed and a framework for practitioners concerned with software supply chains is presented.

Relevance:

30.00%

Publisher:

Abstract:

Based on a statistical mechanics approach, we develop a method for approximately computing average case learning curves and their sample fluctuations for Gaussian process regression models. We give examples for the Wiener process and show that universal relations (that are independent of the input distribution) between error measures can be derived.
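The abstract does not reproduce the statistical-mechanics calculation itself, so the following is only a brute-force sketch of the quantities it refers to: a Monte Carlo estimate of the average-case learning curve (generalization error versus training-set size) and its sample fluctuations for GP regression with the Wiener-process covariance k(x, x') = min(x, x'), under assumed uniform inputs and an assumed noise level.

```python
# Minimal Monte Carlo sketch (not the authors' statistical-mechanics approximation):
# average-case learning curve and its sample fluctuations for GP regression with a
# Wiener-process prior, k(x, x') = min(x, x'), inputs drawn uniformly from (0, 1].
import numpy as np

def wiener_kernel(a, b):
    return np.minimum(a[:, None], b[None, :])

def gp_learning_curve_point(n_train, noise_var=0.01, n_trials=200, n_test=200, seed=0):
    """Mean and fluctuation of the Bayes generalization error over random training sets."""
    rng = np.random.default_rng(seed)
    x_test = rng.uniform(0.0, 1.0, n_test)
    errors = []
    for _ in range(n_trials):
        x = rng.uniform(0.0, 1.0, n_train)
        K = wiener_kernel(x, x) + noise_var * np.eye(n_train)
        k_star = wiener_kernel(x_test, x)              # (n_test, n_train)
        v = np.linalg.solve(K, k_star.T)               # K^{-1} k_*
        # In the matched (Bayes-optimal) setting the posterior variance at x* equals the
        # expected squared error of the posterior mean; prior variance is k(x*, x*) = x*.
        post_var = x_test - np.sum(k_star.T * v, axis=0)
        errors.append(post_var.mean())
    return np.mean(errors), np.std(errors)             # learning-curve value and its fluctuation

for n in (2, 5, 10, 20, 40):
    mean_err, fluct = gp_learning_curve_point(n)
    print(f"n={n:3d}  avg error={mean_err:.4f}  fluctuation={fluct:.4f}")
```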

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a general methodology for estimating and incorporating uncertainty in the controller and forward models for noisy nonlinear control problems. Conditional distribution modelling in a neural network context is used to estimate the uncertainty around the predictions of the neural network outputs. The developed methodology circumvents the dynamic programming problem by using the predicted neural network uncertainty to localize the possible control solutions that need to be considered. A nonlinear multivariable system with different delays between the input-output pairs is used to demonstrate the successful application of the developed control algorithm. The proposed method is suitable for redundant control systems and allows us to model strongly non-Gaussian distributions of the control signal as well as processes with hysteresis.
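The paper's own network architecture is not given in the abstract; the sketch below only illustrates the general idea of conditional distribution modelling with a neural network, using an assumed heteroscedastic MLP that outputs a mean and a log-variance and is trained with the Gaussian negative log-likelihood, so the predicted variance can be used to restrict the set of candidate control solutions.

```python
# Heteroscedastic-MLP sketch (an assumption, not the paper's exact model): the network
# predicts a mean and a log-variance for the forward model, and the Gaussian negative
# log-likelihood is minimised so the predicted variance expresses predictive uncertainty.
import torch
import torch.nn as nn

class HeteroscedasticMLP(nn.Module):
    def __init__(self, n_in, n_hidden=32):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(n_in, n_hidden), nn.Tanh())
        self.mean_head = nn.Linear(n_hidden, 1)
        self.logvar_head = nn.Linear(n_hidden, 1)   # log-variance keeps the variance positive

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(mu, logvar, y):
    # 0.5 * [log sigma^2 + (y - mu)^2 / sigma^2], averaged over the batch
    return 0.5 * (logvar + (y - mu) ** 2 / logvar.exp()).mean()

# Toy forward-model data: y = sin(u) with input-dependent noise (purely illustrative).
torch.manual_seed(0)
u = torch.rand(512, 1) * 6 - 3
y = torch.sin(u) + 0.05 * (1 + u.abs()) * torch.randn_like(u)

model = HeteroscedasticMLP(n_in=1)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(2000):
    mu, logvar = model(u)
    loss = gaussian_nll(mu, logvar, y)
    opt.zero_grad(); loss.backward(); opt.step()

# The predicted standard deviation can be used to discard candidate control inputs whose
# forward-model prediction is too uncertain, localising the search over control solutions.
mu, logvar = model(torch.tensor([[0.5]]))
print(float(mu), float(logvar.exp().sqrt()))
```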

Relevance:

30.00%

Publisher:

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach places greater emphasis on the potential pollution sources, in contrast to the traditional approach in which assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the same distribution as historically occurring events or an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and was automated using a DOS batch processing file. GIS software was employed to produce the input files and to present the results. The functionalities of the method, as well as its sensitivities to the model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, were evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of applying the method are presented in the form of tables, graphs and spatial maps. Varying the model grid size has no significant effect on the simulated groundwater head, and the simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, increase with increasing grid size, and the contaminant plume migrates faster with coarse grids than with finer grids. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases non-linearly with the frequency of occurrence of pollution events. The risk of pollution from a number of sources occurring together by chance was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential pollution sources is a major asset of the method and a significant advantage over contemporary risk and vulnerability methods.
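The actual implementation couples MODFLOW-2000 and MT3DMS with a FORTRAN risk model; the fragment below is only a simplified Python sketch of the Monte Carlo logic described in the abstract, with hypothetical names and a crude placeholder standing in for the transport simulation: at each cell and stress period a uniform random draw below the cell's pollution probability triggers a synthetic source term, and the risk is the fraction of realisations in which a user-defined concentration is exceeded at a monitoring point.

```python
# Simplified Monte Carlo risk sketch (hypothetical names; transport_step is a crude
# placeholder for the MT3DMS run used in the thesis).
import numpy as np

rng = np.random.default_rng(42)
n_rows, n_cols = 20, 20
n_realisations, n_stress_periods = 500, 30
p_event = np.full((n_rows, n_cols), 0.01)   # probability of a pollution event per cell per period
threshold = 250.0                           # user-defined chloride concentration limit (mg/L)
monitor = (10, 10)                          # monitoring cell

def transport_step(conc, sources):
    """Placeholder for the transport simulation: crude decay plus local spreading."""
    conc = 0.9 * conc + sources
    conc[1:, :] += 0.02 * conc[:-1, :]
    conc[:, 1:] += 0.02 * conc[:, :-1]
    return conc

exceedances = 0
for _ in range(n_realisations):
    conc = np.zeros((n_rows, n_cols))
    for _ in range(n_stress_periods):
        events = rng.random((n_rows, n_cols)) < p_event   # synthetic source-term trigger
        sources = np.where(events, rng.uniform(500.0, 2000.0, (n_rows, n_cols)), 0.0)
        conc = transport_step(conc, sources)
    if conc[monitor] > threshold:
        exceedances += 1

print("risk at monitoring point:", exceedances / n_realisations)
```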

Relevance:

30.00%

Publisher:

Abstract:

The objective was to study beta-amyloid (Abeta) deposition in dementia with Lewy bodies (DLB) with Alzheimer's disease (AD) pathology (DLB/AD). The size frequency distributions of the Abeta deposits were studied and fitted by log-normal and power-law models. The patients were ten clinically and pathologically diagnosed DLB/AD cases. The size distributions had a single peak, were positively skewed, and were similar to those described in AD and Down's syndrome, although their means were smaller in DLB/AD than in AD. The size distributions of the diffuse deposits were fitted by a power-law model, suggesting that aggregation/disaggregation of Abeta was the predominant factor, whereas the classic deposits were fitted by a log-normal distribution, suggesting that surface diffusion was important in their pathogenesis.
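For readers unfamiliar with the fitting step, the sketch below (synthetic sizes, not the study's measurements) shows how a log-normal and a continuous power-law model can be fitted to a size-frequency distribution by maximum likelihood and compared by log-likelihood; the comparison is deliberately crude and purely illustrative.

```python
# Illustrative fit of log-normal vs. continuous power-law models to synthetic deposit sizes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sizes = rng.lognormal(mean=3.0, sigma=0.6, size=400)   # synthetic deposit diameters (microns)

# Log-normal fit (location fixed at 0, as is usual for size data)
shape, loc, scale = stats.lognorm.fit(sizes, floc=0)
ll_lognorm = np.sum(stats.lognorm.logpdf(sizes, shape, loc=loc, scale=scale))

# Continuous power-law MLE above x_min: alpha_hat = 1 + n / sum(ln(x / x_min))
x_min = sizes.min()
alpha = 1.0 + sizes.size / np.sum(np.log(sizes / x_min))
ll_powerlaw = sizes.size * np.log((alpha - 1.0) / x_min) - alpha * np.sum(np.log(sizes / x_min))

print(f"log-normal log-likelihood: {ll_lognorm:.1f}")
print(f"power-law  log-likelihood: {ll_powerlaw:.1f} (alpha = {alpha:.2f})")
```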

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a novel methodology for inferring the parameters of probabilistic models whose output noise follows a Student-t distribution. The method extends earlier work on models that are linear in their parameters to nonlinear multi-layer perceptrons (MLPs). It combines an EM algorithm with a variational approximation, the evidence procedure and an optimisation algorithm. The technique was tested on two regression applications: a synthetic dataset and gas forward contract price data from the UK energy market. The results showed that forecasting accuracy is significantly improved by using Student-t noise models.
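The paper's method for MLPs combines EM with a variational approximation and the evidence procedure; the sketch below only shows the underlying EM idea in the linear-in-parameters setting that the paper extends, with a fixed number of degrees of freedom, treating the Student-t noise as a Gaussian scale mixture so that the E-step down-weights large residuals and the M-step is a weighted least-squares fit.

```python
# EM sketch for Student-t observation noise in a linear-in-parameters model
# (hypothetical names, fixed degrees of freedom nu; not the paper's MLP algorithm).
import numpy as np

def t_noise_em(Phi, y, nu=4.0, n_iter=50):
    n, d = Phi.shape
    w = np.linalg.lstsq(Phi, y, rcond=None)[0]      # ordinary least-squares initialisation
    sigma2 = np.var(y - Phi @ w)
    for _ in range(n_iter):
        r = y - Phi @ w
        lam = (nu + 1.0) / (nu + r**2 / sigma2)     # E-step: expected precision scales
        W = Phi * lam[:, None]
        w = np.linalg.solve(Phi.T @ W, W.T @ y)     # M-step: weighted least squares
        sigma2 = np.sum(lam * (y - Phi @ w) ** 2) / n
    return w, sigma2

# Toy data with heavy-tailed noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, 200)
Phi = np.column_stack([np.ones_like(X), X])
y = 1.0 + 2.0 * X + rng.standard_t(df=3, size=200)
print(t_noise_em(Phi, y))
```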

Relevance:

30.00%

Publisher:

Abstract:

This thesis is a study of three techniques for improving the performance of some standard forecasting models, with application to energy demand and prices. We focus on forecasting demand and price one day ahead. First, the wavelet transform was used as a pre-processing procedure with two approaches: multicomponent forecasts and direct forecasts. We compared these approaches empirically and found that the former consistently outperformed the latter. Second, adaptive models were introduced to continuously update model parameters in the testing period by combining filters with standard forecasting methods. Among these adaptive models, the adaptive LR-GARCH model was proposed for the first time in the thesis. Third, with regard to the noise distributions of the dependent variables in the forecasting models, we used either Gaussian or Student-t distributions. The thesis proposes a novel algorithm to infer the parameters of Student-t noise models. The method is an extension of earlier work for models that are linear in parameters to the non-linear multilayer perceptron, and therefore broadens the range of models that can use a Student-t noise distribution. Because these techniques cannot stand alone, they must be combined with prediction models to improve their performance; we combined them with some standard forecasting models: the multilayer perceptron, radial basis functions, linear regression, and linear regression with GARCH. These techniques and forecasting models were applied to two datasets from the UK energy markets: daily electricity demand (which is stationary) and gas forward prices (non-stationary). The results showed that these techniques provide good improvements in prediction performance.
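The thesis' exact wavelet and forecasting models are not given in the abstract; the fragment below is only a sketch of the multicomponent idea under stated assumptions (PyWavelets with a 'db4' wavelet, simple least-squares AR models per component): the series is split into multiresolution components, each component is forecast one step ahead, and the component forecasts are summed, whereas the direct approach fits one model to the raw series.

```python
# Multicomponent vs. direct wavelet forecasting sketch (assumed 'db4' wavelet and AR(p)
# component models; illustrative only, boundary effects of the DWT are ignored).
import numpy as np
import pywt

def ar_one_step(series, p=5):
    """One-step-ahead forecast from an AR(p) model fitted by least squares."""
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coef = np.linalg.lstsq(X, y, rcond=None)[0]
    return series[-p:] @ coef

def multicomponent_forecast(x, wavelet='db4', level=3, p=5):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    forecast = 0.0
    for i in range(len(coeffs)):                    # reconstruct one component at a time
        masked = [np.zeros_like(c) for c in coeffs]
        masked[i] = coeffs[i]
        component = pywt.waverec(masked, wavelet)[:len(x)]
        forecast += ar_one_step(component, p=p)
    return forecast

# Toy "demand" series: weekly cycle plus noise (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(600)
x = 10 + np.sin(2 * np.pi * t / 7) + 0.3 * rng.standard_normal(600)
print("multicomponent forecast:", multicomponent_forecast(x))
print("direct forecast:        ", ar_one_step(x))
```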

Relevance:

30.00%

Publisher:

Abstract:

We introduce models of heterogeneous systems with finite connectivity defined on random graphs to capture finite-coordination effects on the low-temperature behaviour of finite-dimensional systems. Our models use a description in terms of small deviations of particle coordinates from a set of reference positions, particularly appropriate for the description of low-temperature phenomena. A Born-von Karman-type expansion with random coefficients is used to model effects of frozen heterogeneities. The key quantity appearing in the theoretical description is a full distribution of effective single-site potentials which needs to be determined self-consistently. If microscopic interactions are harmonic, the effective single-site potentials turn out to be harmonic as well, and the distribution of these single-site potentials is equivalent to a distribution of localization lengths used earlier in the description of chemical gels. For structural glasses characterized by frustration and anharmonicities in the microscopic interactions, the distribution of single-site potentials involves anharmonicities of all orders, and both single-well and double-well potentials are observed, the latter with a broad spectrum of barrier heights. The appearance of glassy phases at low temperatures is marked by the appearance of asymmetries in the distribution of single-site potentials, as previously observed for fully connected systems. Double-well potentials with a broad spectrum of barrier heights and asymmetries would give rise to the well-known universal glassy low-temperature anomalies when quantum effects are taken into account. © 2007 IOP Publishing Ltd.
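The self-consistent determination of the distribution of single-site potentials is only summarised in the abstract; the sketch below is an assumed population-dynamics illustration for the harmonic case on a random regular graph, where integrating out a neighbour with cavity precision W and spring constant k contributes kW/(k+W) to the precision of the remaining site, so the distribution of effective single-site curvatures (equivalently, localization lengths) can be estimated self-consistently.

```python
# Population-dynamics sketch for the harmonic case (an assumption, not the paper's derivation):
# springs H = (1/2) sum_(ij) k_ij (x_i - x_j)^2 on a random c-regular graph; the cavity
# precision recursion is Omega' = sum over c-1 neighbours of k*Omega/(k + Omega).
import numpy as np

rng = np.random.default_rng(1)
c = 3                                   # coordination number of the random regular graph
pop_size = 100_000
draw_k = lambda size: rng.uniform(0.5, 1.5, size)   # hypothetical spring-constant distribution

omega = np.full(pop_size, 1.0)          # population of cavity precisions Omega_{i->j}
for sweep in range(200):
    idx = rng.integers(0, pop_size, size=(pop_size, c - 1))
    k = draw_k((pop_size, c - 1))
    omega = np.sum(k * omega[idx] / (k + omega[idx]), axis=1)

# Full single-site precisions (curvatures of the effective single-site potentials) use c neighbours.
idx = rng.integers(0, pop_size, size=(pop_size, c))
k = draw_k((pop_size, c))
site_precision = np.sum(k * omega[idx] / (k + omega[idx]), axis=1)
print("mean single-site curvature:", site_precision.mean())
print("mean squared localization length (inverse curvature):", np.mean(1.0 / site_precision))
```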

Relevance:

30.00%

Publisher:

Abstract:

It is important to maintain a uniform distribution of gas and liquid in large diameter packed columns in order to maintain mass transfer efficiency on scale-up. This work presents measurements of maldistributed gas flow in packed columns and methods for evaluating it; little or no previous work has been done in this field. A gas maldistribution number, f, was defined, based on point-to-point variations in the velocity of the gas emerging from the top of the packed bed. f has a minimum value for a uniformly distributed flow and much larger values for maldistributed flows. A method of testing the quality of vapour distributors is proposed, based on the variation of f with packed height: a good gas distributor requires only a short packed depth to give a good gas distribution. Measurements of gas maldistribution have shown that the principle of dynamic similarity is satisfied if two geometrically similar beds are operated at the same Reynolds number. The validity of f as a measure of gas maldistribution, and the principle of dynamic similarity, are tested statistically by multi-factor analysis of variance and visually by the response surface technique. Pressure distribution has been measured in a model of a large diameter packed bed and shown to be associated with the velocity of the gas in a tangential feed pipe. Two simplified theoretical models are proposed to describe the flow of gases through packed beds and to support the principle of dynamic similarity; these models explain why the packed bed itself causes the flow of gas to become more uniformly distributed. A 1.2 m diameter scaled-down model was constructed, geometrically similar to a 7.3 m diameter vacuum crude distillation column. The previously known internal cylinder gas distributor was tested, and three new distributors suitable for use in a large diameter column were developed and tested: an internal cylinder with slots and cross baffles, an internal cylinder with guides in the annulus, and an internal cylinder with internal cross baffles, the last of which was shown to be an excellent distributor.
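The thesis' exact definition of f is not reproduced in the abstract; the fragment below uses a common coefficient-of-variation form, purely to illustrate how a maldistribution number can be computed from point velocities measured above the packing.

```python
# Illustrative maldistribution number from point velocities (assumed coefficient-of-variation
# form, not necessarily the thesis' definition of f).
import numpy as np

def maldistribution_number(point_velocities):
    u = np.asarray(point_velocities, dtype=float)
    return np.sqrt(np.mean((u - u.mean()) ** 2)) / u.mean()

uniform_flow = [1.00, 1.02, 0.98, 1.01, 0.99]   # nearly flat velocity profile
maldistributed = [1.8, 1.4, 0.9, 0.4, 0.5]      # strongly skewed profile
print(maldistribution_number(uniform_flow))      # close to the minimum value
print(maldistribution_number(maldistributed))    # much larger
```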

Relevance:

30.00%

Publisher:

Abstract:

The thesis describes experimental work on the possibility of using deflection baffles in conventional distillation trays as flow straightening devices, with a view to enhancing tray efficiency. The mode of operation is based on deflecting part of the liquid momentum from the centre of the tray to the segment regions in order to drive stagnating liquid at the edges forward. The first part of the work was a detailed investigation into the two-phase flow patterns produced on a conventional sieve tray having 1 mm hole size perforations. The data provide a check on some earlier work and extend the range of the existing databank, particularly to conditions more typical of industrial operation. A critical survey of data collected on trays with different hole sizes (Hine, 1990; Chambers, 1993; Fenwick, 1996; this work) showed that the hole diameter has a significant influence on the flow regime, the size of the stagnant regions and the hydraulic and mass transfer performance. Five modified tray topologies were created with different configurations of baffles and tested extensively in the 2.44 m diameter air-water pilot distillation simulator for their efficacy in achieving uniform flow across the tray and for their impact on tray loading capacity and mass transfer efficiency. Special attention was given to the calibration of the over 100 temperature probes used in measuring the water temperature across the tray, on which the heat and mass transfer analogy is based. In addition to normal tray capacity experiments, higher weir load experiments were conducted using a 'half-tray' mode in order to extend the range of data to conditions more typical of industrial operation. The modified trays show superior flow characteristics compared to the conventional tray in terms of the ability to replenish the zones of exceptionally low temperatures and high residence times at the edges of the tray, to lower the bulk liquid gradient and to achieve a more uniform flow across the tray. These superior flow abilities, however, tend to diminish with increasing weir load because of the increasing tendency for the liquid to jump over the barriers instead of flowing over them. The modified tray topologies showed no tendency to cause undue limitation to tray loading capacity. Although the improvement in the efficiency of a single tray over that of the conventional tray was moderate and in some cases marginal, the multiplier effect in a multiple tray column situation would be significant (Porter et al., 1972). These results are in good agreement with associated CFD studies (Fischer, 1999) carried out by partners in the Advanced Studies in Distillation consortium. It is concluded that deflection baffles can be used in a conventional distillation sieve tray to achieve better liquid flow distribution and obtain enhanced mass transfer efficiency, without undermining the tray loading capacity. Unlike other controlled-flow trays, whose mechanical complexity imposes stringent manufacturing and installation tolerances, the baffled-tray models are simple to design, manufacture and install and thus provide an economical method of retrofitting badly performing sieve trays, in terms of both downtime and fabrication. Note: Appendices 2-5 are on a separate floppy disk, available for consultation only at Aston University Library by prior arrangement.

Relevance:

30.00%

Publisher:

Abstract:

Amongst all the objectives in the study of time series, uncovering the dynamic law of their generation is probably the most important. When the underlying dynamics are not available, time series modelling consists of developing a model which best explains a sequence of observations. In this thesis, we consider hidden space models for analysing and describing time series. We first provide an introduction to the principal concepts of hidden state models and draw an analogy between hidden Markov models and state space models. Central ideas such as hidden state inference and parameter estimation are reviewed in detail. A key part of multivariate time series analysis is identifying the delay between different variables. We present a novel approach to time delay estimation in a non-stationary environment. The technique makes use of hidden Markov models, and we demonstrate its application to estimating a crucial parameter in the oil industry. We then focus on hybrid models that we call dynamical local models. These models combine and generalise hidden Markov models and state space models. Probabilistic inference in these models is unfortunately computationally intractable, and we show how to make use of variational techniques to approximate the posterior distribution over the hidden state variables. Experimental simulations on synthetic and real-world data demonstrate the application of dynamical local models for segmenting a time series into regimes and providing predictive distributions.
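Hidden state inference is central to both hidden Markov models and state space models; the sketch below is a generic log-domain forward algorithm for a discrete HMM (not the thesis' dynamical local models or their variational approximation), showing how the filtered posterior over hidden states and the data log-likelihood are computed.

```python
# Generic log-domain forward algorithm for a discrete HMM (illustrative, with made-up numbers).
import numpy as np
from scipy.special import logsumexp

def forward_filter(log_pi, log_A, log_B, obs):
    """Return log p(z_t = k | x_{1:t}) for each t, and the log-likelihood log p(x_{1:T})."""
    T, K = len(obs), len(log_pi)
    log_alpha = np.zeros((T, K))
    log_alpha[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        log_alpha[t] = logsumexp(log_alpha[t - 1][:, None] + log_A, axis=0) + log_B[:, obs[t]]
    log_like = logsumexp(log_alpha[-1])
    filtered = log_alpha - logsumexp(log_alpha, axis=1, keepdims=True)
    return filtered, log_like

# Two hidden regimes, three observation symbols
log_pi = np.log([0.5, 0.5])
log_A = np.log([[0.95, 0.05], [0.10, 0.90]])          # transition probabilities
log_B = np.log([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])    # emission probabilities
obs = [0, 0, 1, 2, 2, 2, 0]
filtered, ll = forward_filter(log_pi, log_A, log_B, obs)
print(np.exp(filtered[-1]), ll)
```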

Relevance:

30.00%

Publisher:

Abstract:

This paper explores the use of the optimization procedures in SAS/OR software, applied to contemporary logistics distribution network design using an integrated multiple criteria decision making approach. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and goal programming (GP), considers both quantitative and qualitative factors. In the integrated approach, AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. A GP model incorporating the constraints of system, resource and AHP priority is then formulated to select the best set of warehouses without exceeding the limited available resources. To facilitate the use of the integrated multiple criteria decision making approach by SAS users, an ORMCDM code was implemented in the SAS programming language. The SAS macro developed in this paper selects the chosen variables from a SAS data file and constructs sets of linear programming models based on the selected GP model. An example is given to illustrate how one could use the code to design the logistics distribution network.
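The paper's implementation is the ORMCDM macro in the SAS programming language; the fragment below is only a Python sketch of the AHP priority step with a made-up pairwise comparison matrix, computing the warehouse weights as the normalised principal eigenvector and checking consistency. In the integrated approach these weights would then appear as coefficients of the AHP-priority goal in the GP model.

```python
# AHP priority sketch (illustrative; the paper's implementation is a SAS/OR macro).
import numpy as np

# Pairwise comparisons of three candidate warehouses on one criterion (Saaty's 1-9 scale)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                   # AHP priority weights of the warehouses

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
cr = ci / 0.58                             # Saaty's random index for n = 3 is 0.58
print("priorities:", weights.round(3), "consistency ratio:", round(cr, 3))
```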