721 results for Curved Girder
Abstract:
The difficulties associated with slurry transportation in autogenous (AG) and semi-autogenous (SAG) grinding mills have become more apparent in recent years with the increasing trend to build larger diameter mills for grinding high tonnages. This is particularly noticeable when AG/SAG mills are run in closed circuit with classifiers such as fine screens/cyclones. Extensive test work carried out on the slurry removal mechanism in grate discharge (AG/SAG) mills has shown that the conventional pulp lifters (radial and curved) have inherent drawbacks. They allow short-circuiting of slurry from the pulp lifters back into the grinding chamber, leading to slurry pool formation. The slurry pool absorbs part of the impact, thus inhibiting the grinding process. The Twin Chamber Pulp Lifter (TCPL), an efficient pulp lifter design developed by the authors, overcomes the inherent drawbacks of the conventional pulp lifters. Extensive testing in both laboratory and pilot scale mills has shown that the TCPL completely blocks the flow-back process, thus allowing the mill to operate close to its design flow capacity. The performance of the TCPL is also found to be independent of variations in charge volume and grate design, whereas these variations significantly affect the performance of conventional pulp lifters (radial and curved). (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
Introductory courses covering modern physics sometimes introduce elementary ideas from general relativity, though the idea of a geodesic is generally limited to the shortest Euclidean length on a curved surface of two spatial dimensions rather than extremal aging in spacetime. It is shown that Epstein charts provide a simple geometric picture of geodesics in one space and one time dimension and that, for a hypothetical uniform gravitational field, geodesics are straight lines on a planar diagram. This means that the properties of geodesics in a uniform field can be calculated with only a knowledge of elementary geometry and trigonometry, thus making the calculation of some basic results of general relativity accessible to students even in an algebra-based survey course on physics.
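For context on the principle this abstract relies on (textbook special relativity, not a formula quoted from the paper itself): extremal aging states that a free particle's worldline between two fixed events maximises the accumulated proper time, which in one space and one time dimension can be written as
\[
\tau \;=\; \sum_i \sqrt{(\Delta t_i)^2 - (\Delta x_i)^2}\,, \qquad \delta\tau = 0 \qquad (c = 1).
\]
The Epstein-chart construction described above recasts this extremum as an elementary geometric statement about lengths on a planar diagram.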
Abstract:
Quantum computers hold great promise for solving interesting computational problems, but it remains a challenge to find efficient quantum circuits that can perform these complicated tasks. Here we show that finding optimal quantum circuits is essentially equivalent to finding the shortest path between two points in a certain curved geometry. By recasting the problem of finding quantum circuits as a geometric problem, we open up the possibility of using the mathematical techniques of Riemannian geometry to suggest new quantum algorithms or to prove limitations on the power of quantum computers.
Abstract:
Virtual reality exposure therapy (VRET) developed using immersive or semi-immersive virtual environments presents a usability problem for practitioners. To meet practitioner requirements for lower cost and portability, VRET programs must often be ported onto desktop environments such as the personal computer (PC). However, the success of VRET has been shown to be linked to presence, and to the environment's ability to evoke the same reactions and emotions as a real experience. It is generally accepted that high-end virtual environments (VEs) are more immersive than desktop PCs, but level of immersion does not always predict level of presence. This paper reports on the impact on presence of porting a therapeutic VR application for schizophrenia from the initial research environment of a semi-immersive curved screen to a PC. Presence in these two environments is measured both introspectively and across a number of causal factors thought to underlie the experience of presence. Results show that the VR exposure program successfully made users feel they were present on both platforms. While the desktop PC achieved higher scores on presence across causal factors, participants reported that they felt more present in the curved screen environment. Although comparison of the two groups was statistically significant for the PQ but not for the IPQ, subjective reports of experiences in the environments should be considered in future research, as the success of VRET relies heavily on the emotional response of patients to the therapeutic program.
Abstract:
Computer display height and desk design to allow forearm support are two critical design features of workstations for information technology tasks. However, there is currently no 3D description of head and neck posture with different computer display heights, and no direct comparison to paper-based information technology tasks. There is also inconsistent evidence on the effect of forearm support on posture, and no evidence on whether these features interact. This study compared the 3D head, neck and upper limb postures of 18 male and 18 female young adults whilst working under different display and desk design conditions. There was no substantial interaction between display height and desk design. Lower display heights increased head and neck flexion, with more spinal asymmetry when working with paper. The curved desk, designed to provide forearm support, increased scapula elevation/protraction and shoulder flexion/abduction.
Abstract:
Absolute calibration relates the measured (arbitrary) intensity to the differential scattering cross section of the sample, which contains all of the quantitative information specific to the material. The importance of absolute calibration in small-angle scattering experiments has long been recognized. This work details the absolute calibration procedure of a small-angle X-ray scattering instrument from Bruker AXS. The absolute calibration presented here was achieved by using a number of different types of primary and secondary standards. The samples were: a glassy carbon specimen, which had been independently calibrated from neutron radiation; a range of pure liquids, which can be used as primary standards as their differential scattering cross section is directly related to their isothermal compressibility; and a suspension of monodisperse silica particles for which the differential scattering cross section is obtained from Porod's law. Good agreement was obtained between the different standard samples, provided that care was taken to obtain significant signal averaging and all sources of background scattering were accounted for. The specimen best suited for routine calibration was the glassy carbon sample, due to its relatively intense scattering and stability over time; however, initial calibration from a primary source is necessary. Pure liquids can be used as primary calibration standards, but the measurements take significantly longer and are, therefore, less suited for frequent use.
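As an aside on the pure-liquid primary standard mentioned in this abstract: in the forward-scattering limit, the absolute differential cross section of a pure liquid follows directly from its isothermal compressibility. The short sketch below is a minimal illustration of that relation, using literature values for water near 25 degrees C; the numbers and variable names are assumptions of this sketch, not data or code from the paper.

# dSigma/dOmega(q -> 0) = r_e^2 * rho_e^2 * k_B * T * kappa_T for a pure liquid,
# where r_e is the classical electron radius, rho_e the electron density,
# and kappa_T the isothermal compressibility.

R_E = 2.818e-13   # classical electron radius (cm)
K_B = 1.381e-23   # Boltzmann constant (J/K)

def liquid_cross_section(electron_density_cm3, kappa_t_pa, temperature_k):
    """Forward X-ray differential scattering cross section of a pure liquid, in 1/cm."""
    # k_B * T * kappa_T has units of m^3; convert to cm^3 (1 m^3 = 1e6 cm^3)
    fluctuation_volume_cm3 = K_B * temperature_k * kappa_t_pa * 1.0e6
    return (R_E * electron_density_cm3) ** 2 * fluctuation_volume_cm3

# Water near 25 degC: ~3.34e23 electrons per cm^3, kappa_T ~ 4.52e-10 1/Pa
print(liquid_cross_section(3.34e23, 4.52e-10, 298.15))   # roughly 0.016 1/cm

The result, of order 0.016 cm^-1, indicates why pure-liquid standards scatter weakly and require the long measurement times noted above.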
Abstract:
Neural networks are usually curved statistical models. They do not have finite-dimensional sufficient statistics, so on-line learning on the model itself inevitably loses information. In this paper we propose a new scheme for training curved models, inspired by the ideas of ancillary statistics and adaptive critics. At each point estimate, an auxiliary flat model (exponential family) is built to locally accommodate both the usual statistic (tangent to the model) and an ancillary statistic (normal to the model). The auxiliary model plays a role in determining credit assignment analogous to that played by an adaptive critic in solving temporal problems. The method is illustrated with the Cauchy model, and the algorithm is proved to be asymptotically efficient.
Abstract:
This thesis describes the Generative Topographic Mapping (GTM) --- a non-linear latent variable model, intended for modelling continuous, intrinsically low-dimensional probability distributions, embedded in high-dimensional spaces. It can be seen as a non-linear form of principal component analysis or factor analysis. It also provides a principled alternative to the self-organizing map --- a widely established neural network model for unsupervised learning --- resolving many of its associated theoretical problems. An important potential application of the GTM is visualization of high-dimensional data. Since the GTM is non-linear, the relationship between data and its visual representation may be far from trivial, but a better understanding of this relationship can be gained by computing the so-called magnification factor. In essence, the magnification factor relates the distances between data points, as they appear when visualized, to the actual distances between those data points. There are two principal limitations of the basic GTM model. The computational effort required will grow exponentially with the intrinsic dimensionality of the density model. However, if the intended application is visualization, this will typically not be a problem. The other limitation is the inherent structure of the GTM, which makes it most suitable for modelling moderately curved probability distributions of approximately rectangular shape. When the target distribution is very different to that, the aim of maintaining an 'interpretable' structure, suitable for visualizing data, may come in conflict with the aim of providing a good density model. The fact that the GTM is a probabilistic model means that results from probability theory and statistics can be used to address problems such as model complexity. Furthermore, this framework provides solid ground for extending the GTM to wider contexts than that of this thesis.
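As a small numerical aside on the magnification factor described above: for any smooth mapping from a 2-D latent space into data space, the local area magnification is sqrt(det(J^T J)), where J is the Jacobian of the mapping. The sketch below illustrates this quantity for a toy mapping; the mapping, the finite-difference Jacobian and all names are assumptions of this illustration, not the GTM implementation from the thesis.

import numpy as np

def magnification_factor(mapping, x, eps=1e-5):
    """Local areal magnification of a map from 2-D latent space to D-dim data space."""
    x = np.asarray(x, dtype=float)
    cols = []
    for i in range(2):                        # the two latent dimensions
        dx = np.zeros(2)
        dx[i] = eps
        cols.append((mapping(x + dx) - mapping(x - dx)) / (2.0 * eps))
    J = np.column_stack(cols)                 # D x 2 Jacobian, by central differences
    return np.sqrt(np.linalg.det(J.T @ J))    # area scaling factor at x

# Toy non-linear embedding of a 2-D latent space into 3-D data space
toy_map = lambda x: np.array([x[0], x[1], x[0] ** 2 + x[1] ** 2])
print(magnification_factor(toy_map, [0.5, -0.3]))   # > 1 where the surface is stretched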
Abstract:
The fabrication of micro-channels in single-mode optical fibers is demonstrated using focused femtosecond laser processing and chemical etching. Straight-line micro-channels are achieved using a simple technique which overcomes the limitations imposed by the fiber's curved surface.
Abstract:
Packed beds have many industrial applications and are increasingly used in the process industries due to their low pressure drop. With the introduction of more efficient packings, novel packing materials (e.g. adsorbents) and new applications (e.g. flue gas desulphurisation), the aspect ratio (height to diameter) of such beds is decreasing. Obtaining uniform gas distribution in such beds is of crucial importance in minimising operating costs and optimising plant performance. Since a packed bed acts to some extent as its own distributor, the importance of obtaining uniform gas distribution has increased as aspect ratios decrease. There is no rigorous design method for distributors, due to a limited understanding of the fluid flow phenomena and, in particular, of the effect of the bed base/free fluid interface. This study is based on a combined theoretical and modelling approach. The starting point is the Ergun equation, which is used to determine the pressure drop over a bed where the flow is uni-directional. This equation has been applied in a vectorial form, so that it can be applied to maldistributed and multi-directional flows, and has been realised in the computational fluid dynamics code PHOENICS. The use of this equation and its application have been verified by modelling experimental measurements of maldistributed gas flows where there is no free fluid/bed base interface. A novel, two-dimensional experiment has been designed to investigate the fluid mechanics of maldistributed gas flows in shallow packed beds. The flow through the outlet of the duct below the bed can be controlled, permitting a rigorous investigation. The results from this apparatus provide useful insights into the fluid mechanics of flow in and around a shallow packed bed and show the critical effect of the bed base. The PHOENICS/vectorial Ergun equation model has been adapted to model this situation. The model has been improved by the inclusion of spatial voidage variations in the bed and the prescription of a novel bed base boundary condition. This boundary condition is based on the logarithmic law for velocities near walls, without restricting the velocity at the bed base to zero, and is applied within a turbulence model. The flow in a curved bed section, which is three-dimensional in nature, is examined experimentally. The effect of the walls and of changes in gas direction on the gas flow is shown to be particularly significant. As before, the relative amounts of gas flowing through the bed and the duct outlet can be controlled. The model and the improved understanding of the underlying physical phenomena form the basis for the development of new distributors and of rigorous design methods for them.
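For reference, the scalar Ergun equation that this abstract takes as its starting point gives the pressure drop per unit bed length for uni-directional flow; the thesis applies a vectorial generalisation of it as a momentum sink in PHOENICS. The sketch below is a minimal illustration of the scalar relation only; the function name and the example values are assumptions of this sketch, not taken from the thesis.

def ergun_pressure_gradient(u, voidage, d_p, mu, rho):
    """Pressure drop per unit bed length (Pa/m) for superficial velocity u (m/s)."""
    # viscous (laminar) term of the Ergun equation
    viscous = 150.0 * mu * (1.0 - voidage) ** 2 / (voidage ** 3 * d_p ** 2) * u
    # inertial (turbulent) term; u * |u| keeps the sign for reversed flow
    inertial = 1.75 * rho * (1.0 - voidage) / (voidage ** 3 * d_p) * u * abs(u)
    return viscous + inertial

# Example: air at ambient conditions through 10 mm packing at 1 m/s superficial velocity
print(ergun_pressure_gradient(u=1.0, voidage=0.4, d_p=0.01, mu=1.8e-5, rho=1.2))

The vectorial form used in the CFD model replaces u by the velocity vector and the u-squared term by |u| times that vector, so the same coefficients act on maldistributed, multi-directional flows.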
Abstract:
The study of surfactant monolayers is certainly not a new technique, but the application of monolayer studies to elucidate controlling factors in liposome design remains an underutilised resource. Using a Langmuir-Blodgett trough, pure and mixed lipid monolayers can be investigated, both for their interactions within the monolayer and for interfacial interactions with drugs in the aqueous sub-phase. Despite these monolayers effectively being only half a bilayer, with a flat rather than curved structure, information from these studies can be effectively translated into liposomal systems. Here we outline the background, general protocols and application of Langmuir studies, with a focus on their application in liposomal systems. A range of case studies is discussed which shows how the technique can support the development of liposome drug delivery. Examples include investigations into the effect of cholesterol within the liposome bilayer, understanding effective lipid packaging within the bilayer to promote retention of water-soluble and poorly soluble drugs, the effect of alkyl chain length on lipid packaging, and drug-monolayer electrostatic interactions that promote bilayer repackaging.
Abstract:
The work described in this thesis deals with the development and application of a finite element program for the analysis of several cracked structures. In order to simplify the organisation of the material presented herein, the thesis has been subdivided into two Sections. In the first Section the development of a finite element program for the analysis of two-dimensional problems of plane stress or plane strain is described. The element used in this program is the six-node isoparametric triangular element, which permits the accurate modelling of curved boundary surfaces. Various cases of material anisotropy are included in the derivation of the element stiffness properties. A digital computer program is described and examples of its application are presented. In the second Section, on fracture problems, several cracked configurations are analysed by embedding into the finite element mesh a sub-region containing the singularities, over which an analytic solution is used. The modifications necessary to augment a standard finite element program, such as that developed in Section I, are discussed and complete programs for each cracked configuration are presented. Several examples are included to demonstrate the accuracy and flexibility of the technique.
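For context on the element type named above: the six-node isoparametric triangle uses quadratic shape functions in area (barycentric) coordinates, which is what lets its edges follow curved boundaries. The sketch below lists the standard quadratic shape functions; the node-numbering convention and function names are assumptions of this illustration, not necessarily those of the thesis.

def shape_functions(l1, l2):
    """Shape functions of the 6-node isoparametric triangle at area coordinates (l1, l2)."""
    l3 = 1.0 - l1 - l2
    return [
        l1 * (2.0 * l1 - 1.0),  # corner node 1
        l2 * (2.0 * l2 - 1.0),  # corner node 2
        l3 * (2.0 * l3 - 1.0),  # corner node 3
        4.0 * l1 * l2,          # mid-side node between corners 1 and 2
        4.0 * l2 * l3,          # mid-side node between corners 2 and 3
        4.0 * l3 * l1,          # mid-side node between corners 3 and 1
    ]

# The six functions form a partition of unity at any interior point
print(sum(shape_functions(0.3, 0.5)))  # -> 1.0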
Abstract:
The finite element method is now well established among engineers as an extremely useful tool in the analysis of problems with complicated boundary conditions. One aim of this thesis has been to produce a set of computer algorithms capable of efficiently analysing complex three-dimensional structures. This set of algorithms has been designed to permit much versatility. Provisions such as the use of only those parts of the system which are relevant to a given analysis, and the facility to extend the system by the addition of new elements, are incorporated. Five element types have been programmed, including prismatic members, rectangular plates, triangular plates and curved plates. The 'in and out of plane' stiffness matrices for a curved plate element are derived using the finite element technique. The performance of this type of element is compared with two other theoretical solutions as well as with a set of independent experimental observations. Additional experimental work was then carried out by the author to further evaluate the acceptability of this element. Finally, the analyses of two large civil engineering structures, the shell of an electrical precipitator and a concrete bridge, are presented to investigate the performance of the algorithms. Comparisons are made between the computer time, core store requirements and the accuracy of the analysis for the proposed system and those of another program.
Abstract:
We present recent results on femtosecond microfabrication of key components for integrated optics, such as highly curved low-loss waveguides in glasses and depressed cladding waveguides in crystals. Details of the microfabrication and characterisation are discussed.
Abstract:
Differential clinical diagnosis of the parkinsonian syndromes, viz., Parkinson’s disease (PD), progressive supranuclear palsy (PSP), dementia with Lewy bodies (DLB), and multiple system atrophy (MSA), can be difficult. Eye movement problems, however, are a chronic complication of many of these disorders and may be a useful aid to diagnosis. Hence, the presence in PSP of vertical supranuclear gaze palsy, fixation instability, lid retraction, blepharospasm, and apraxia of eyelid opening and closing is useful in separating PD from PSP. Moreover, atypical features of PSP include slowing of upward saccades, moderate slowing of downward saccades, the presence of a full range of voluntary vertical eye movements, a curved trajectory of oblique saccades, and absence of square-wave jerks. Downgaze palsy is probably the most useful diagnostic clinical sign of PSP. By contrast, DLB patients are specifically impaired in both reflexive saccadic execution and in the performance of more complex saccadic eye movement tasks. Problems in convergence in DLB are also accompanied by akinesia and rigidity. Abnormal ocular fixation may occur in a significant proportion of MSA patients, along with excessive square-wave jerks, a mild supranuclear gaze palsy, a gaze-evoked nystagmus, a positioning down-beat nystagmus, mild to moderate saccadic hypometria, impaired smooth pursuit movements, and reduced vestibulo-ocular reflex (VOR) suppression. There may be considerable overlap between the eye movement problems characteristic of the various parkinsonian disorders, but, taken together with other signs and symptoms, they can be a useful aid in differential diagnosis, especially in the separation of PD and PSP.