961 results for Large Marangoni Number


Relevance: 30.00%

Publisher:

Abstract:

Purpose – In the 1990s, a growing number of companies in the UK adopted value-based management (VBM) techniques. The purpose of this paper is to explore the motivations for the adoption or non-adoption of VBM for managing a business. Design/methodology/approach – An interview-based study of 37 large UK companies. Insights from diffusion theory and institutional theory are used to theorise these motivations. Findings – The rate of adoption of VBM in the sample companies was found to follow the classical S-shape. The findings also suggest that the supply side of the diffusion process, most notably the role played by consultants, was an influence on many companies, although this was not a sufficient condition for companies to adopt the technique. The research also finds evidence of relocation diffusion: several adopters were influenced by new officers, for example chief executive officers and finance directors, importing VBM techniques that they had used in organizations where they had previously worked. Research limitations/implications – This is a small-scale study, and further work would be needed to develop the findings. Practical implications – Understanding and theorising the adoption of new management techniques will help in understanding the management of a business. Originality/value – This research adds further evidence of the value of studying management accounting, and more specifically management accounting change, in practice. It shows the developments in the adoption of a new technique and hence how a technique becomes accepted in practice.
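The classical S-shaped diffusion curve referred to in the findings is commonly modelled with a logistic function. A minimal sketch (the parameter values below are illustrative, not estimated from the 37-company sample):

```python
import math

def logistic_adoption(t, k=37, r=0.8, t0=5.0):
    """Cumulative adopters at time t: slow start, rapid middle,
    saturation near the ceiling k (the classic S-curve)."""
    return k / (1.0 + math.exp(-r * (t - t0)))

# Adoption accelerates, then saturates near k.
early, mid, late = (logistic_adoption(t) for t in (0.0, 5.0, 10.0))
```

At the inflection point t0, exactly half of the eventual adopters have adopted; in a diffusion study, r and t0 would be estimated from observed adoption dates.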

Relevance: 30.00%

Publisher:

Abstract:

Problem structuring methods (PSMs) aim to build shared understanding in a group of decision makers. This shared understanding is used as a basis for them to negotiate an agreed action plan that they are prepared to help implement. Engaging in a social process of negotiation with a large number of people is difficult, and so PSMs have typically focused on small groups of less than 20. This paper explores the legitimacy of deploying PSMs in large groups of people (50–1000), where the aim is to negotiate action and build commitment to its implementation. We review the difficulties of facilitating large groups with PSMs, drawing heavily on our experience of working with over 25 large groups. We offer a range of lessons learned and suggest concrete approaches to facilitating large groups to achieve the objectives of PSMs. This paper contributes to the evaluation and development of PSMs.

Relevance: 30.00%

Publisher:

Abstract:

Using methods of statistical physics, we study the average number and kernel size of general sparse random matrices over GF(q), with a given connectivity profile, in the thermodynamical limit of large matrices. We introduce a mapping of GF(q) matrices onto spin systems using the representation of the cyclic group of order q as the q-th complex roots of unity. This representation facilitates the derivation of the average kernel size of random matrices using the replica approach, under the replica symmetric ansatz, resulting in saddle point equations for general connectivity distributions. Numerical solutions are then obtained for particular cases by population dynamics. Similar techniques also allow us to obtain an expression for the exact and average number of random matrices for any general connectivity profile. We present numerical results for particular distributions.
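For small instances, the kernel of a matrix over GF(q) can be computed directly by Gaussian elimination over the finite field. The brute-force sketch below (prime q only, not the paper's replica calculation) illustrates what "kernel size" means here: a kernel of dimension d contains q**d vectors.

```python
def rank_gf(M, q):
    """Rank of matrix M over GF(q), q prime, by Gaussian elimination."""
    M = [row[:] for row in M]
    rows, cols = len(M), len(M[0])
    rank, pivot_row = 0, 0
    for c in range(cols):
        pr = next((r for r in range(pivot_row, rows) if M[r][c] % q != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        inv = pow(M[pivot_row][c], q - 2, q)        # inverse modulo prime q
        M[pivot_row] = [(x * inv) % q for x in M[pivot_row]]
        for r in range(rows):
            if r != pivot_row and M[r][c] % q:
                f = M[r][c]
                M[r] = [(a - f * b) % q for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        rank += 1
    return rank

def kernel_dim(M, q):
    """Dimension of the kernel; the kernel itself has q**kernel_dim elements."""
    return len(M[0]) - rank_gf(M, q)
```

For example, `kernel_dim([[1, 1], [1, 1]], 2)` is 1, so that matrix's kernel over GF(2) contains 2**1 = 2 vectors.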

Relevance: 30.00%

Publisher:

Abstract:

It is important to maintain a uniform distribution of gas and liquid in large-diameter packed columns to maintain mass-transfer efficiency on scaling up. This work presents measurements and methods of evaluating maldistributed gas flow in packed columns; little or no previous work has been done in this field. A gas maldistribution number, f, was defined, based on point-to-point velocity variations in the gas emerging from the top of packed beds. f has a minimum value for a uniformly distributed flow and much larger values for maldistributed flows. A method of testing the quality of vapour distributors is proposed, based on the variation of f with packed height: a good gas distributor requires only a short packed depth to achieve good gas distribution. Measurements of gas maldistribution have shown that the principle of dynamic similarity is satisfied if two geometrically similar beds are operated at the same Reynolds number. The validity of f as a measure of gas maldistribution, and the principle of dynamic similarity, are tested statistically by multi-factor analysis of variance, and visually by the response-surface technique. Pressure distribution has been measured in a model of a large-diameter packed bed and shown to be associated with the velocity of the gas in a tangential feed pipe. Two simplified theoretical models are proposed to describe the flow of gases through packed beds and to support the principle of dynamic similarity. These models explain why the packed bed itself causes the flow of gas to become more uniformly distributed. A 1.2 m diameter scaled-down model was constructed, geometrically similar to a 7.3 m diameter vacuum crude distillation column. The previously known internal-cylinder gas distributor was tested.
Three new distributors suitable for use in a large-diameter column were developed and tested: an internal cylinder with slots and cross baffles, an internal cylinder with guides in the annulus, and an internal cylinder with internal cross baffles; the last of these was shown to be an excellent distributor.
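The thesis's exact formula for f is not reproduced in this abstract. A common way to quantify point-to-point velocity variation is the coefficient of variation, which shares the stated property of being minimal (zero) for uniform flow; the sketch below assumes that form for illustration only.

```python
import statistics

def maldistribution_number(velocities):
    """Illustrative maldistribution index: coefficient of variation of
    point velocities measured above the bed. This is an assumed form,
    not the thesis's exact definition of f."""
    mean_v = statistics.fmean(velocities)
    return statistics.pstdev(velocities) / mean_v

uniform = maldistribution_number([2.0, 2.0, 2.0, 2.0])   # perfectly even flow
skewed = maldistribution_number([0.5, 1.0, 3.0, 3.5])    # maldistributed flow
```

A uniform velocity profile gives the minimum value (zero here), and any maldistribution raises the index, matching the behaviour described for f.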

Relevance: 30.00%

Publisher:

Abstract:

A variety of visual symptoms have been associated with Alzheimer's disease (AD). These include delays in flash visual evoked potentials, which indicate a disruption of the integrity of the visual pathway. Examination of the visual cortex has revealed the presence of both senile plaques and neurofibrillary tangles. The purpose of this study was to determine whether there were differences in the number and/or size of optic nerve axons between AD patients and non-demented, age-matched controls. Five optic nerves from AD patients and five from age-matched controls were embedded in epon resin, and 1 µm sections were prepared on a Reichert ultramicrotome. The sections were then stained with toluidine blue and examined at ×400 magnification. The numbers of axons were counted in photographs of three fields taken at random from each section. To evaluate the axon diameters, 70 axons were chosen at random from each patient and measured using a calibrated eyepiece graticule. The total axon counts revealed no significant differences between the AD optic nerves and the age-matched controls. However, the frequency distribution of axon diameters was significantly different in the two groups. In particular, there were fewer large-diameter axons in patients with AD, as previously reported. Degeneration of the large-diameter axons suggests involvement of the magnocellular as opposed to the parvocellular pathways. Hence, there could be differences in the visual performance of AD patients compared with normal subjects, which could be important in clinical diagnosis.
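The comparison of the two axon-diameter frequency distributions can be illustrated with a simple chi-square statistic over shared histogram bins. The counts below are invented for illustration; the abstract does not specify which statistical test the study used.

```python
def chi_square_stat(obs_a, obs_b):
    """Chi-square statistic comparing two frequency distributions
    binned identically (e.g. axon-diameter histograms)."""
    n_a, n_b = sum(obs_a), sum(obs_b)
    stat = 0.0
    for a, b in zip(obs_a, obs_b):
        total = a + b
        if total == 0:
            continue
        exp_a = total * n_a / (n_a + n_b)   # expected count if both groups share one distribution
        exp_b = total * n_b / (n_a + n_b)
        stat += (a - exp_a) ** 2 / exp_a + (b - exp_b) ** 2 / exp_b
    return stat

# Hypothetical diameter histograms: the AD group has fewer counts
# in the largest-diameter bins, as the study reports.
control = [10, 30, 40, 15, 5]
ad      = [15, 35, 40,  8, 2]
```

Identical distributions give a statistic of zero; depletion of the large-diameter bins inflates it, and the value would then be referred to a chi-square table for significance.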

Relevance: 30.00%

Publisher:

Abstract:

The principled statistical application of Gaussian random field models used in geostatistics has historically been limited to data sets of a small size. This limitation is imposed by the requirement to store and invert the covariance matrix of all the samples to obtain a predictive distribution at unsampled locations, or to use likelihood-based covariance estimation. Various ad hoc approaches to solve this problem have been adopted, such as selecting a neighborhood region and/or a small number of observations to use in the kriging process, but these have no sound theoretical basis and it is unclear what information is being lost. In this article, we present a Bayesian method for estimating the posterior mean and covariance structures of a Gaussian random field using a sequential estimation algorithm. By imposing sparsity in a well-defined framework, the algorithm retains a subset of “basis vectors” that best represent the “true” posterior Gaussian random field model in the relative entropy sense. This allows a principled treatment of Gaussian random field models on very large data sets. The method is particularly appropriate when the Gaussian random field model is regarded as a latent variable model, which may be nonlinearly related to the observations. We show the application of the sequential, sparse Bayesian estimation in Gaussian random field models and discuss its merits and drawbacks.
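The core idea of retaining a subset of "basis vectors" can be sketched as ordinary Gaussian-process (kriging) prediction computed from only the retained points. The relative-entropy selection rule itself is omitted here, and the kernel choice, length-scale, and noise level are illustrative assumptions.

```python
import numpy as np

def rbf(x, y, ell=1.0):
    """Squared-exponential covariance between 1-D location arrays."""
    return np.exp(-0.5 * ((x[:, None] - y[None, :]) / ell) ** 2)

def gp_posterior_mean(x_basis, y_basis, x_star, noise=1e-2):
    """Posterior mean of a Gaussian random field using only the retained
    subset of points -- an O(m^3) cost in the subset size m rather than
    the full sample size n."""
    K = rbf(x_basis, x_basis) + noise * np.eye(len(x_basis))
    k_star = rbf(x_star, x_basis)
    return k_star @ np.linalg.solve(K, y_basis)

# Toy field: six retained observations of sin(x), predicted at x = 2.5.
x = np.linspace(0.0, 5.0, 6)
y = np.sin(x)
mean = gp_posterior_mean(x, y, np.array([2.5]))
```

Because the covariance matrix is built only from the retained subset, storage and factorisation costs are controlled by the subset size, which is the practical point of the sparse approach.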

Relevance: 30.00%

Publisher:

Abstract:

In the present work, the elastic scattering of fast neutrons from iron and concrete samples was studied at incident neutron energies of 14.0 and 14.4 MeV, using a neutron spectrometer based on the associated-particle time-of-flight technique. These samples were chosen because of their importance in the design of fusion reactor shielding and construction. Using the S.A.M.E.S. accelerator and the 3 MV Dynamitron accelerator at the Radiation Centre, 14.0 and 14.4 MeV neutrons were produced by the T(d,n)4He reaction at an incident deuteron energy of 140 keV and with 900 keV mass-3 ions, respectively. The time of origin of each neutron was determined by detecting the associated alpha particle. The samples used were extended flat plates of thicknesses up to 1.73 mean free paths for iron and 2.3 mean free paths for concrete. The associated alpha particles and fast neutrons were detected by means of a plastic scintillator mounted on a fast focused photomultiplier tube. The differential neutron elastic-scattering cross-sections were measured for 14 MeV neutrons in various thicknesses of iron and concrete in the angular range from zero to 90°. In addition, the angular distributions of 14.4 MeV neutrons after passing through extended samples of iron were measured at several scattering angles in the same angular range. The measurements obtained for the thin sample of iron were compared with the results of Coon et al. The differential cross-sections for the thin iron sample were also analyzed on the optical model using the computer code RAROMP. For the concrete sample, the angular distribution of the thin sample was compared with the cross-sections calculated from the major constituent elements of concrete, and with the predicted values of the optical model for those elements. No published data could be found to compare with the results of the concrete differential cross-sections.
In the case of thick samples of iron and concrete, the number of scattered neutrons was compared with a phenomenological calculation based on the continuous-slowing-down model. The variation of the measured cross-sections with sample thickness was found to follow the empirical relation σ = σ₀e^(αx). By using the universal constant "K", good fits were obtained to the experimental data. In parallel with the work at 14.0 and 14.4 MeV, an associated-particle time-of-flight spectrometer using the 2H(d,n)3He reaction was investigated, producing 3.02 MeV neutrons at an incident deuteron energy of 1 MeV.
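An empirical relation of the form σ = σ₀e^(αx) is fitted by linear least squares on ln σ against thickness x. A sketch with synthetic numbers (not the thesis's measurements):

```python
import math

def fit_exponential(xs, sigmas):
    """Least-squares fit of ln(sigma) = ln(sigma0) + alpha * x,
    recovering (sigma0, alpha) for sigma = sigma0 * exp(alpha * x)."""
    n = len(xs)
    ys = [math.log(s) for s in sigmas]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    alpha = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    sigma0 = math.exp(ybar - alpha * xbar)
    return sigma0, alpha

# Synthetic data following sigma = 2.0 * exp(-0.4 x); thickness x in
# mean free paths (illustrative values only).
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
sigmas = [2.0 * math.exp(-0.4 * x) for x in xs]
s0, a = fit_exponential(xs, sigmas)
```

On noise-free data the fit recovers σ₀ and α exactly; with real measurements the residuals of the log-linear fit indicate how well the empirical relation holds over the thickness range.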

Relevance: 30.00%

Publisher:

Abstract:

With the ability to collect and store increasingly large datasets on modern computers comes the need to process the data in a way that is useful to a geostatistician or application scientist. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively, for likelihood-based geostatistics. Various methods have been proposed and are extensively used in an attempt to overcome these complexity issues. This thesis introduces a number of principled techniques for treating large datasets, with an emphasis on three main areas: reduced-complexity covariance matrices, sparsity in the covariance matrix, and parallel algorithms for distributed computation. These techniques are presented individually, but it is also shown how they can be combined to produce techniques for further improving computational efficiency.
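The quoted scalings can be made concrete with a rough resource model (the constants are illustrative; the cubic term uses the familiar n³/3 flop count for a Cholesky factorisation of the covariance matrix):

```python
def geostat_costs(n, bytes_per_float=8):
    """Rough resource model for likelihood-based geostatistics on n
    observations: raw data storage is linear, the dense covariance
    matrix is quadratic in memory, and its factorisation is cubic in
    time. Constants are illustrative, not measured."""
    return {
        "data_bytes": n * bytes_per_float,
        "cov_matrix_bytes": n * n * bytes_per_float,
        "factorise_flops": n ** 3 / 3,   # Cholesky ~ n^3/3 flops
    }

small, big = geostat_costs(1_000), geostat_costs(10_000)
```

Going from 1,000 to 10,000 observations multiplies storage by 10 but covariance memory by 100 and factorisation work by 1,000, which is exactly why the thesis's reduced-complexity, sparse, and parallel techniques are needed.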

Relevance: 30.00%

Publisher:

Abstract:

T-cell activation requires interaction of T-cell receptors (TCR) with peptide epitopes bound by major histocompatibility complex (MHC) proteins. This interaction occurs at a special cell-cell junction known as the immune or immunological synapse. Fluorescence microscopy has shown that the interplay among one agonist peptide-MHC (pMHC), one TCR and one CD4 provides the minimum complexity needed to trigger transient calcium signalling. We describe a computational approach to the study of the immune synapse. Using molecular dynamics simulation, we report here on a study of the smallest viable model, a TCR-pMHC-CD4 complex in a membrane environment. The computed structural and thermodynamic properties are in fair agreement with experiment. A number of biomolecules participate in the formation of the immunological synapse. Multi-scale molecular dynamics simulations may be the best opportunity we have to reach a full understanding of this remarkable supra-macromolecular event at a cell-cell junction.

Relevance: 30.00%

Publisher:

Abstract:

Using suitably coupled Navier-Stokes equations for an incompressible Newtonian fluid, we investigate the linear and non-linear steady-state solutions for both a homogeneously and a laterally heated fluid with finite Prandtl number (Pr = 7) in the vertical orientation of the channel. Both models are studied in the large-aspect-ratio, narrow-gap limit under constant-flux conditions with the channel closed. We use direct numerics to identify the linear stability criterion in parametric terms as a function of the Grashof number (Gr) and the streamwise infinitesimal perturbation wavenumber (making use of the generalised Squire's theorem). We find that higher-harmonic solutions with a 1:3 resonance exist at lower wavenumbers for both of the heating models considered. We proceed to identify 2D secondary steady-state solutions, which bifurcate from the laminar state. Our studies show that 2D solutions do not exist in certain regions of the pure manifold, where we instead find 1:3 resonant-mode 2D solutions for low-wavenumber perturbations. For the homogeneously heated fluid, we notice a jump phenomenon between the pure- and resonant-mode secondary solutions for very specific wavenumbers. We attempt to verify whether mixed-mode solutions are present for this model by considering the laterally heated model with the same geometry. We find mixed-mode solutions for the laterally heated model, showing that a bridge exists between the pure and 1:3 resonant-mode 2D solutions, of which some are stationary and some travelling. Further, we show for the homogeneously heated fluid that the 2D solutions bifurcate in Hopf bifurcations, and that there exists a manifold where the 2D solutions are stable according to the Eckhaus criterion; within this manifold we proceed to identify 3D tertiary solutions and find that the stability of said 3D bifurcations is not phase-locked to the 2D state.
For the homogeneously heated model we identify a closed loop within the neutral stability curve at higher perturbation wavenumbers and analyse the nature of the multiple 2D bifurcations around this loop for identical wavenumber, finding that a temperature inversion occurs within the loop. We conclude that for a homogeneously heated fluid it is possible to have abrupt transitions between the pure and resonant 2D solutions, and that for the laterally heated model there exists a transient bifurcation via mixed-mode solutions.

Relevance: 30.00%

Publisher:

Abstract:

Calibration of stochastic traffic microsimulation models is a challenging task. This paper proposes a fast iterative probabilistic precalibration framework and demonstrates how it can be successfully applied to a real-world traffic simulation model of a section of the M40 motorway and its surrounding area in the U.K. The efficiency of the method stems from the use of emulators of the stochastic microsimulator, which provides fast surrogates of the traffic model. The use of emulators minimizes the number of microsimulator runs required, and the emulators' probabilistic construction allows for the consideration of the extra uncertainty introduced by the approximation. It is shown that automatic precalibration of this real-world microsimulator, using turn-count observational data, is possible, considering all parameters at once, and that this precalibrated microsimulator improves on the fit to observations compared with the traditional expertly tuned microsimulation. © 2000-2011 IEEE.
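The role of an emulator can be sketched with a toy example: run the expensive simulator at a few design points, fit a cheap surrogate, and calibrate on the surrogate instead. The quadratic interpolant below is a deterministic stand-in for the paper's probabilistic Gaussian-process emulator, and the "simulator" and its parameter are invented for illustration.

```python
def simulator(theta):
    """Stand-in for one expensive microsimulator run: discrepancy
    between simulated and observed turn counts at parameter theta
    (entirely illustrative)."""
    return (theta - 1.3) ** 2 + 0.5

def quadratic_emulator(design, runs):
    """Interpolating quadratic (Lagrange form) through three design
    points -- a cheap, deterministic stand-in for a GP emulator."""
    (x0, x1, x2), (y0, y1, y2) = design, runs
    def emu(t):
        return (y0 * (t - x1) * (t - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (t - x0) * (t - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (t - x0) * (t - x1) / ((x2 - x0) * (x2 - x1)))
    return emu

design = (0.0, 1.0, 2.0)                 # only three expensive runs needed
emu = quadratic_emulator(design, tuple(simulator(t) for t in design))

# Calibrate on the cheap emulator: dense grid search is now affordable.
grid = [i / 100 for i in range(201)]
best = min(grid, key=emu)
```

Three simulator runs replace two hundred, which is the essence of minimising microsimulator calls; the paper's probabilistic construction additionally quantifies the approximation error that this deterministic sketch ignores.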

Relevance: 30.00%

Publisher:

Abstract:

Different procurement decisions taken by relief organizations can result in considerably different implications in regards to transport, storage, and distribution of humanitarian aid and ultimately can influence the performance of the humanitarian supply chain and the delivery of the humanitarian aid. In this article, we look into what resources are needed and how these resources evolve in the delivery of humanitarian aid. Drawing on the resource-based view of the firm, we develop a framework to categorize the impact of local resources on the configuration of humanitarian supply chains. In contrast to other papers, the importance of localizing the configuration of the humanitarian supply chain is not only conceptually recognized, but empirical investigations are also provided. In terms of methodology, this article is based on the analysis of secondary data from two housing reconstruction projects. Findings indicate that the use of local resources in humanitarian aid has positive effects on programs' overall supply chain performance and these effects are not only related to the macroeconomic perspective, but benefits expand to improvements related to the use of knowledge. At the same time, it was found that local sourcing often comes with a number of problems. For example, in one of the cases, significant problems existed, which were related to the scarcity of local supplies. Both housing reconstruction projects have indicated the continuous need for changes throughout the programs as a dynamic supply chain configuration is important for the long-term sustainability of reconstruction aid. © 2014 Decision Sciences Institute.

Relevance: 30.00%

Publisher:

Abstract:

The number of nodes has a large impact on the performance, lifetime, and cost of a wireless sensor network (WSN). It is difficult to determine because it depends on many factors, such as the network protocols, the collaborative signal processing (CSP) algorithms, etc. A mathematical model is proposed in this paper to calculate this number based on the required working time. It can be used in general situations by treating these factors as parameters of energy consumption. © 2004 IEEE.
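The abstract does not give the model's form. One simple version in the same spirit folds protocol and CSP costs into an average per-node power draw and rotates duty among spare nodes; all numbers and the model shape below are illustrative assumptions, not the paper's model.

```python
import math

def nodes_needed(lifetime_h, active_nodes, battery_j, avg_power_w):
    """Minimum node count so that `active_nodes` sensors stay on for
    `lifetime_h` hours, rotating duty among spares. Protocol and CSP
    costs are folded into avg_power_w, echoing the paper's treatment
    of those factors as energy-consumption parameters (model form is
    illustrative)."""
    node_life_h = battery_j / (avg_power_w * 3600.0)   # hours one node lasts
    return math.ceil(active_nodes * lifetime_h / node_life_h)

# e.g. keep 10 sensors active for 1000 h on 2340 J batteries
# (~two AA cells) at an average 10 mW draw:
n = nodes_needed(1000, 10, 2340.0, 0.010)
```

Each node here lasts 65 h, so 154 nodes are needed; the required count scales linearly with the demanded working time, making the lifetime/cost trade-off explicit.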

Relevance: 30.00%

Publisher:

Abstract:

Aim: To identify the incidence of vitreomacular traction (VMT) and the frequency of reduced vision in the absence of other coexisting macular pathology, using a pragmatic classification system for VMT in a population of patients referred to the hospital eye service. Methods: A detailed survey of consecutive optical coherence tomography (OCT) scans was carried out in a high-throughput ocular imaging service to ascertain cases of vitreomacular adhesion (VMA) and VMT using a departmental classification system. Analysis covered the stages of traction, visual acuity, and association with other macular conditions. Results: In total, 4384 OCT scan episodes of 2223 patients were performed. Two hundred and fourteen eyes had VMA/VMT, with 112 eyes having coexisting macular pathology. Of 102 patients without coexisting pathology, 57 had a VMT grade between 2 and 8, with a negative correlation between VMT grade and number of Snellen lines (r = -0.61717). There was a distinct cut-off in visual function when the VMT grade was higher than 4, with the presence of cysts, subretinal separation, and breaks in the retinal layers. Conclusions: VMT is a common encounter, often associated with other coexisting macular pathology. We estimated an incidence rate of 0.01% for VMT cases with reduced vision and without coexisting macular pathology that may potentially benefit from intervention. Grading of VMT to select eyes with cyst formation as well as hole formation may be useful for targeting patients who are at higher risk of visual loss from VMT.

Relevance: 30.00%

Publisher:

Abstract:

Human mesenchymal stem cell (hMSC) therapies have the potential to revolutionise the healthcare industry and replicate the success of the therapeutic protein industry; however, for this to be achieved there is a need to apply key bioprocess engineering principles and adopt a quantitative approach to large-scale, reproducible hMSC bioprocess development. Here we provide a quantitative analysis of the changes in concentration of glucose, lactate, and ammonium with time during hMSC monolayer culture over 4 passages, under 100% and 20% dissolved oxygen (dO2), where either a 100%, 50%, or 0% growth-medium exchange was performed after 72 h in culture. Yield coefficients, specific growth rates (h⁻¹), and doubling times (h) were calculated for all cases. The 100% dO2 flasks outperformed the 20% dO2 flasks with respect to cumulative cell number, with the latter consuming more glucose and producing more lactate and ammonium. Furthermore, the 100% and 50% medium-exchange conditions resulted in similar cumulative cell numbers, whilst the 0% condition was significantly lower. Cell immunophenotype and multipotency were not affected by the experimental culture conditions. This study demonstrates the importance of determining optimal culture conditions for hMSC expansion and highlights a potential cost saving from making only a 50% medium exchange, which may prove significant for large-scale bioprocessing. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
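The specific growth rate and doubling time reported above follow from the standard exponential-growth model; the cell counts in the sketch below are illustrative, not the study's data.

```python
import math

def specific_growth_rate(n0, nt, hours):
    """Specific growth rate mu (h^-1) from two cell counts, assuming
    exponential growth: nt = n0 * exp(mu * t)."""
    return math.log(nt / n0) / hours

def doubling_time(mu):
    """Population doubling time (h) corresponding to growth rate mu."""
    return math.log(2) / mu

# Illustrative passage: 1e5 cells grow to 4e5 over a 72 h culture period,
# i.e. two doublings, so the doubling time should be 36 h.
mu = specific_growth_rate(1e5, 4e5, 72.0)
td = doubling_time(mu)
```

Computing mu per passage and per condition (dO2 level, medium-exchange fraction) is what allows the quantitative comparison of culture conditions the study describes.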