33 results for Sheet-metal work - Simulation methods

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

This investigation is in two parts: theory and experimental verification.

(1) Theoretical study. In this study it is, for obvious reasons, necessary to analyse the concept of formability first. For the purpose of the present investigation it is sufficient to define the four aspects of formability as follows: (a) the formability of the material at a critical section, (b) the formability of the material in general, (c) process efficiency, and (d) proportional increase in surface area. A method of quantitative assessment is proposed for each of the four aspects of formability. The theoretical study also distinguishes between the coaxial and non-coaxial strains which occur, respectively, in axisymmetric and unsymmetrical forming processes, and the inadequacy of the circular grid system for the assessment of formability is explained in the light of this distinction.

(2) Experimental study. As one of the bases of the experimental work, the determination of the end point of a forming process, which sets the limit to the formability of the work material, is discussed. The effects of three process parameters on draw-in are shown graphically. The delay of fracture in sheet metal forming resulting from draw-in is then analysed in kinematic terms, namely through the radial displacements, the radial and circumferential strains, and the projected thickness of the workpiece. Through the equilibrium equation of the membrane stresses, the effect of draw-in on the shape of the unsupported region of the workpiece, and hence on the position of the critical section, is explained. The effect of draw-in on the four aspects of formability is discussed throughout the investigation. The triangular coordinate system is used to present and analyse the triaxial strains involved. This coordinate system has the advantage of showing all three principal strains in a material simultaneously, as well as representing clearly the many types of strain involved in sheet metal work.
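
The triangular (ternary) strain coordinates mentioned above exploit the fact that, under plastic volume constancy, the three principal strains sum to approximately zero and therefore lie in a plane. As a hedged sketch of this idea (the axis convention below is an illustrative assumption, not necessarily the thesis's own), the mapping onto two in-plane coordinates might look like:

```python
import math

def triangular_coords(e1, e2, e3, tol=1e-6):
    """Project a volume-conserving principal-strain triple onto the
    plane e1 + e2 + e3 = 0, so all three strains plot in 2-D at once."""
    if abs(e1 + e2 + e3) > tol:
        raise ValueError("strains must satisfy volume constancy")
    # x along the e1-e2 difference, y along the e3 direction in-plane
    x = (e1 - e2) / math.sqrt(2.0)
    y = (2.0 * e3 - e1 - e2) / math.sqrt(6.0)
    return x, y

# Example: plane strain (e2 = 0, e3 = -e1)
print(triangular_coords(0.2, 0.0, -0.2))
```

This is why such a plot can show all three principal strains simultaneously: once volume constancy holds, only two of them are independent.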

Abstract:

In the bulge test, a sheet metal specimen is clamped over a circular hole in a die and formed into a bulge by hydraulic pressure on one side of the specimen. As the unsupported part of the specimen is deformed in this way, its area is increased; in other words, the material is generally stretched and its thickness generally decreased. The stresses causing this stretching action are the membrane stresses in the shell generated by the hydraulic pressure, in the same way as the rubber in a toy balloon is stretched by the membrane stresses caused by the air inside it. The bulge test is a widely used sheet metal test for determining the "formability" of sheet materials. Research on this forming process (2)-(15)* has hitherto been almost exclusively confined to predicting the behaviour of the bulged specimen through the constitutive equations (stresses and strains in relation to displacements and shapes) and empirical work-hardening characteristics of the material as determined in the tension test. In the present study the approach is reversed: the stresses and strains in the specimen are measured and determined from the geometry of the deformed shell. Thus, the bulge test can be used for determining the stress-strain relationship in the material under actual conditions in sheet metal forming processes.

When sheet materials are formed by fluid pressure, the workpiece assumes an approximately spherical shape. The exact nature and magnitude of the deviation from the perfect sphere can be defined and measured by an index called prolateness. The distribution of prolateness throughout the workpiece at any particular stage of the forming process is of fundamental significance, because it determines the variation of the stress ratio on which the mode of deformation depends. It is found that, before the process becomes unstable in sheet metal, the workpiece is exactly spherical only at the pole and at an annular ring. Between the pole and this annular ring the workpiece is more pointed than a sphere, and outside this ring it is flatter than a sphere.

In the forming of sheet materials, the stresses, and hence the incremental strains, are closely related to the curvatures of the workpiece. This relationship between geometry and state of stress can be formulated quantitatively through prolateness. The determination of the magnitudes of prolateness, however, requires special techniques. The success of the experimental work is due to the technique of measuring the profile inclination of the meridional section very accurately. A travelling microscope, workshop protractor and surface plate are used for measurements of circumferential and meridional tangential strains. The curvatures can be calculated from geometry. If, however, the shape of the workpiece is expressed in terms of the current radial and axial coordinates, it is very difficult to calculate the curvatures to an adequate degree of accuracy, owing to the double differentiation involved. In this project, a first differentiation is, in effect, by-passed by measuring the profile inclination directly, and the second differentiation is performed in a round-about way, as explained in later chapters. The variations of the stresses in the workpiece thus observed have not, to the knowledge of the author, been reported experimentally.

The static strength of shells to withstand fluid pressure and their buckling strength under concentrated loads both depend on the distribution of the thickness. Thickness distribution can be controlled to a limited extent by changing the work-hardening characteristics of the work material and by imposing constraints. A technique is provided in this thesis for determining accurately the stress distribution, on which the strains associated with thinning depend. Whether a problem of controlled thickness distribution is tackled by theory, or by experiments, or by both combined, the analysis in this thesis supplies the theoretical framework and some useful experimental techniques for the research applied to particular problems. The improvement of formability by allowing draw-in can also be analysed with the same theoretical and experimental techniques.

Results on stress-strain relationships are usually represented by single stress-strain curves plotted either between one stress and one strain (as in the tension or compression tests) or between the effective stress and effective strain, as in tests on tubular specimens under combined tension, torsion and internal pressure. In this study, the triaxial stresses and strains are plotted simultaneously in triangular coordinates. Thus, both stress and strain are represented by vectors, and the relationship between them by the relationship between two vector functions. From the results so obtained, conclusions are drawn on both the behaviour and the properties of the material in the bulge test. The stress ratios are found to be generally equal to the strain-rate ratios (stress vectors collinear with incremental strain vectors), and the work-hardening characteristics, which apply only to the particular strain paths, are deduced. Plastic instability of the material is generally considered to have been reached when the oil pressure has attained its maximum value, so that further deformation occurs under a constant or lower pressure. It is found that the instability regime of deformation occurs long before the maximum pressure is attained. Thus, a new concept of instability is proposed; under this criterion, instability can occur for any type of pressure growth curve.
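
The "equilibrium equation of the membrane stresses" invoked in these abstracts is presumably the standard thin-shell (Laplace) membrane relation; as a hedged sketch in conventional notation (the symbols are this note's assumptions, not necessarily the thesis's):

```latex
\frac{\sigma_m}{\rho_m} + \frac{\sigma_c}{\rho_c} = \frac{p}{t}
```

where \sigma_m, \sigma_c are the meridional and circumferential membrane stresses, \rho_m, \rho_c the corresponding principal radii of curvature, p the fluid pressure and t the current thickness. For a perfect sphere \rho_m = \rho_c = R, giving \sigma_m = \sigma_c = pR/2t; any deviation from the sphere (the prolateness above) therefore shifts the stress ratio, which is why curvature measurements can determine the state of stress.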

Abstract:

Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research work concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigations using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems. Fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels performing various control functions, which include the setting-up and clearing-down of calls, the hand-over of calls between cells and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers inter-connected via voice-grade data channels would be capable of providing satisfactory control.
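
The "familiar Erlang formula" on which the modified single-cell formulas build is the Erlang B loss formula. As a hedged illustration only (the classical formula, not the thesis's modified versions), it can be evaluated with the usual numerically stable recursion; the traffic and channel values below are arbitrary examples:

```python
def erlang_b(traffic, channels):
    """Erlang B blocking probability via the stable recursion
    B(0) = 1,  B(n) = A*B(n-1) / (n + A*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b

# 10 erlangs offered to a 10-channel cell
print(round(erlang_b(10.0, 10), 4))  # -> 0.2146
```

The recursive form avoids the factorials and large powers of the direct expression B = (A^N / N!) / sum(A^k / k!), so it stays accurate for large channel counts.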

Abstract:

This work presents a two-dimensional risk assessment method based on the quantification of the probability of the occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation methods, whereby synthetic contaminant source terms are generated according to the same distribution as historically occurring pollution events or an a priori potential probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as the risk. The utility of the method is demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together is evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. The capability to generate synthetic pollution events from numerous potential sources of pollution, based on the historical frequency of their occurrence, proves to be a great asset of the method and a large benefit over contemporary methods.
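
The core exceedance-counting idea can be sketched in a few lines. This is a hedged, much-simplified illustration: the "transport model" below is a placeholder stand-in for the integrated mathematical models, and all probabilities, source strengths and thresholds are invented example values, not taken from the paper.

```python
import random

def simulate_concentration(source_strength):
    # Placeholder for the integrated flow/transport models:
    # crude dilution with a random attenuation factor.
    return source_strength * random.uniform(0.01, 0.1)

def exceedance_risk(event_probability, strength_sampler, threshold,
                    n_realisations=10_000, seed=42):
    """Risk = fraction of Monte Carlo realisations in which the
    concentration at a monitoring point exceeds the threshold."""
    random.seed(seed)
    exceedances = 0
    for _ in range(n_realisations):
        # a pollution event occurs with its historical frequency
        if random.random() < event_probability:
            if simulate_concentration(strength_sampler()) > threshold:
                exceedances += 1
    return exceedances / n_realisations

risk = exceedance_risk(0.3, lambda: random.uniform(50.0, 200.0), threshold=5.0)
print(round(risk, 3))
```

Repeating this per monitoring point, and per user-defined concentration range, yields the values that the paper maps spatially.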

Abstract:

As more of the economy moves from traditional manufacturing to the service sector, the nature of work is becoming less tangible, and thus the representation of human behaviour in models is becoming more important. Representing human behaviour and decision making in models is challenging, both in terms of capturing the essence of the processes, and in terms of the way that those behaviours and decisions are, or can be, represented in the models themselves. In order to advance understanding in this area, a useful first step is to evaluate and start to classify the various types of behaviour and decision making that need to be modelled. This talk will set out an initial classification of the different types of behaviour and decision making that a modeller might want to represent in a model. It will then assess the main methods of simulation in terms of their capability to represent these various aspects. The three main simulation methods (System Dynamics, Agent Based Modelling and Discrete Event Simulation) all achieve this to varying degrees. There is some evidence that all three methods can, within limits, represent the key aspects of the system being modelled. The three simulation approaches are then assessed for their suitability in modelling these various aspects. Illustrations of behavioural modelling will be provided from cases in supply chain management, evacuation modelling and rail disruption.

Abstract:

Cold roll forming is an extremely important but little-studied sheet metal forming process. In this thesis, the process of cold roll forming is introduced and it is seen that form roll design is central to the cold roll forming process. The conventional design and manufacture of form rolls is discussed, and it is observed that surrounding the design process are a number of activities which, although peripheral, are time-consuming and a possible source of error. A CAD/CAM system is described which alleviates many of the problems traditional to form roll design. New techniques for the calculation of strip length and for controlling the means of forming bends are detailed. The CAD/CAM system's advantages and limitations are discussed and, whilst the system has numerous significant advantages, its principal limitation is the need to manufacture form rolls and test them on a mill before a design can be declared satisfactory. A survey of the previous theoretical and experimental analysis of cold roll forming is presented and is found to be limited. By considering the previous work, a method of numerical analysis of the cold roll forming process is proposed, based on a minimum energy approach. In parallel with the numerical analysis, a comprehensive range of software has been developed to enhance the designer's visualisation of the effects of his form roll design. A complementary approach to the analysis of form roll design is the generation of form roll designs, and a method for the partial generation of designs is described. It is suggested that the two approaches should continue in parallel and that the limiting factor of each approach is knowledge of the cold roll forming process. Hence, an initial experimental investigation of the rolling of channel sections is described. Finally, areas of potential future work are discussed.
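
One routine calculation underlying form roll design is the developed strip length of the finished section. As a hedged sketch only (a textbook bend-allowance calculation, not the new technique developed in the thesis; the K-factor and all dimensions are assumed example values):

```python
import math

def bend_allowance(angle_deg, inside_radius, thickness, k_factor=0.5):
    """Arc length of the neutral axis through one bend."""
    return math.radians(angle_deg) * (inside_radius + k_factor * thickness)

def strip_length(flats, bends, thickness):
    """Developed strip length: flat segments plus bend allowances.
    `bends` is a list of (angle_deg, inside_radius) pairs."""
    return sum(flats) + sum(bend_allowance(a, r, thickness) for a, r in bends)

# Example: a simple channel with two 90-degree bends (dimensions in mm)
print(round(strip_length([20.0, 40.0, 20.0], [(90, 2.0), (90, 2.0)], 1.6), 2))
```

The neutral-axis position (the K-factor) shifts with bend severity and material, which is one reason strip-length calculation merits the dedicated techniques the thesis describes.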

Abstract:

Cold roll forming of thin-walled sections is a very useful process in the sheet metal industry. However, the conventional method for the design and manufacture of form-rolls, the special tooling used in the cold roll forming process, is a very time-consuming and skill-demanding exercise. This thesis describes the establishment of a stand-alone minicomputer-based CAD/CAM system for assisting the design and manufacture of form-rolls. The work was undertaken in collaboration with a leading manufacturer of thin-walled sections. A package of computer programs has been developed to provide computer aids for every aspect of work in form-roll design and manufacture. The programs have been successfully implemented, as an integrated CAD/CAM software system, on the ICL PERQ minicomputer with graphics facilities. Thus, the developed CAD/CAM system is a single-user workstation, with software facilities to help the user perform the conventional roll design activities, including the design of the finished section, the flower pattern, and the form-rolls. A roll editor program can then be used to modify, if required, the computer-generated roll profiles. As far as manufacturing is concerned, a special-purpose roll machining program and postprocessor can be used in conjunction to generate the NC control part-programs for the production of form-rolls by NC turning. Graphics facilities have been incorporated into the CAD/CAM software to display drawings interactively on the computer screen throughout all stages of execution. It has been found that computerisation can shorten the lead time in all activities dealing with the design and manufacture of form-rolls, and that small or medium-sized manufacturing companies can gain benefits from CAD/CAM technology by developing, according to their own specifications, a tailor-made CAD/CAM software system on a low-cost minicomputer.

Abstract:

Simulation is an effective method for improving supply chain performance. However, there is limited advice available to assist practitioners in selecting the most appropriate method for a given problem, and much of the advice that does exist relies on custom and practice rather than on rigorous conceptual or empirical analysis. An analysis of the different modelling techniques applied in the supply chain domain was conducted, and the three main approaches to simulation used were identified: System Dynamics (SD), Discrete Event Simulation (DES) and Agent Based Modelling (ABM). This research has examined these approaches in two stages. Firstly, a first-principles analysis was carried out in order to challenge the received wisdom about their strengths and weaknesses, and a series of propositions was developed from this initial analysis. The second stage was to use the case study approach to test these propositions and to provide further empirical evidence to support their comparison. The contributions of this research are both in terms of knowledge and practice. In terms of knowledge, this research is the first holistic cross-paradigm comparison of the three main approaches in the supply chain domain. Case studies have involved building 'back to back' models of the same supply chain problem using SD and a discrete approach (either DES or ABM). This has led to contributions concerning the limitations of applying SD to operational problem types. SD has also been found to carry risks when applied to strategic and policy problems. Discrete methods have been found to have potential for exploring strategic problem types, and it has been found that discrete simulation methods can model material and information feedback successfully. Further insights have been gained into the relationship between modelling purpose and modelling approach.
In terms of practice, the findings have been summarised in the form of a framework linking modelling purpose, problem characteristics and simulation approach.

Abstract:

Over the past 50 years there has been considerable progress in our understanding of biomolecular interactions at an atomic level. This in turn has allowed molecular simulation methods employing full atomistic modeling at ever larger scales to develop. However, some challenging areas still remain where there is either a lack of atomic-resolution structures or where the simulation system is inherently complex. An area where both challenges are present is that of membranes containing membrane proteins. In this review we analyse new practical approaches to membrane protein study that offer a potential new route to high-resolution structures and the possibility of simplifying simulations. These new approaches collectively recognise that preservation of the interaction between the membrane protein and the lipid bilayer is often essential to maintain structure and function. The new methods preserve these interactions by producing nano-scale disc-shaped particles that include the bilayer and the chosen protein. Currently two approaches lead in this area: the MSP system, which relies on peptides to stabilise the discs, and SMALPs, where an amphipathic styrene maleic acid copolymer is used. Both methods greatly enable protein production and hence have the potential to accelerate atomic-resolution structure determination as well as providing a simplified format for simulations of membrane protein dynamics.

Abstract:

Sol-gel-synthesized bioactive glasses may be formed via a hydrolysis-condensation reaction, silica being introduced in the form of tetraethyl orthosilicate (TEOS) and calcium typically added in the form of calcium nitrate. The synthesis reaction proceeds in an aqueous environment; the resultant gel is dried before stabilization by heat treatment. These materials, being amorphous, are complex at the level of their atomic-scale structure, but their bulk properties may only be properly understood on the basis of that structural insight. Thus, a full understanding of their structure-property relationship may only be achieved through the application of a coherent suite of leading-edge experimental probes, coupled with the cogent use of advanced computer simulation methods. Using as an exemplar a calcia-silica sol-gel glass of the kind developed by Larry Hench, to whose memory this paper is dedicated, we illustrate the successful use of high-energy X-ray and neutron scattering (diffraction) methods, magic-angle spinning solid-state NMR, and molecular dynamics simulation as components of a powerful methodology for the study of amorphous materials.

Abstract:

This report gives an overview of the work being carried out, as part of the NEUROSAT project, in the Neural Computing Research Group at Aston University. The aim is to give a general review of the work and methods, with reference to other documents which provide the detail. The document is ongoing and will be updated as parts of the project are completed. Thus some of the references are not yet present. In the broadest sense, the Aston part of NEUROSAT is about using neural networks (and other advanced statistical techniques) to extract wind vectors from satellite measurements of ocean surface radar backscatter. The work involves several phases, which are outlined below. A brief summary of the theory and application of satellite scatterometers forms the first section. The next section deals with the forward modelling of the scatterometer data, after which the inverse problem is addressed. Dealiasing (or disambiguation) is discussed, together with proposed solutions. Finally a holistic framework is presented in which the problem can be solved.

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to the groundwater resources incorporates both the quantification of the probability of the occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes the need for a greater dependency on the potential pollution sources, rather than the traditional approach where assessment is based mainly on the intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation methods, whereby random pollution events are generated according to the same distribution as historically occurring events or an a priori potential probability distribution. Integrated mathematical models then simulate contaminant concentrations at the predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed in producing the input files and for the presentation of the results. The functionalities of the method, as well as its sensitivities to the model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, were evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input into the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps.
Varying the model grid sizes indicates no significant effects on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. Also, the migration of the contaminant plume advances faster with coarse grid sizes than with finer grid sizes. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases in non-linear proportion to the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and quantitatively presented as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a large benefit over contemporary risk and vulnerability methods.
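
The per-cell trigger described above (a random draw below the cell's pollution-occurrence probability spawns a synthetic source term) can be sketched as follows. This is a hedged illustration: the grid, probability and loading-rate values are invented, and the real method feeds MODFLOW-2000/MT3DMS rather than a dictionary.

```python
import random

def generate_source_terms(active_cells, event_probability, loading_rate, seed=1):
    """Return {cell: loading_rate} for every active cell where a
    pollution event is triggered in this stress period."""
    rng = random.Random(seed)
    sources = {}
    for cell in active_cells:
        # random number below the occurrence probability -> event
        if rng.random() < event_probability:
            sources[cell] = loading_rate
    return sources

cells = [(i, j) for i in range(10) for j in range(10)]
terms = generate_source_terms(cells, event_probability=0.05, loading_rate=100.0)
print(len(terms), "of", len(cells), "cells polluted this period")
```

Drawing independently per cell and per stress period is what lets the method quantify pollution from several sources all occurring by chance together.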

Abstract:

In multilevel analyses, problems may arise when using Likert-type scales at the lowest level of analysis. Specifically, increases in variance should lead to greater censoring for the groups whose true scores fall at either end of the distribution. The current study used simulation methods to examine the influence of single-item Likert-type scale usage on ICC(1), ICC(2), and group-level correlations. Results revealed substantial underestimation of ICC(1) when using Likert-type scales with common response formats (e.g., 5 points). ICC(2) and group-level correlations were also underestimated, but to a lesser extent. Finally, the magnitude of underestimation was driven in large part by an interaction between Likert-type scale usage and the amounts of within- and between-group variance. © Sage Publications.
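
The censoring mechanism behind these results can be reproduced in miniature: continuous group-nested scores are rounded and clipped onto a 5-point scale, and ICC(1) is computed before and after. This is a hedged toy illustration with arbitrary variance components and group sizes, not the study's actual simulation design.

```python
import random
import statistics

def icc1(groups):
    """One-way random-effects ICC(1) from equal-size groups:
    (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    k = len(groups[0])
    n_groups = len(groups)
    grand = statistics.mean(x for g in groups for x in g)
    msb = k * sum((statistics.mean(g) - grand) ** 2 for g in groups) / (n_groups - 1)
    msw = statistics.mean(statistics.variance(g) for g in groups)
    return (msb - msw) / (msb + (k - 1) * msw)

def likert(x, points=5):
    """Censor a continuous score onto a 1..points response format."""
    return min(points, max(1, round(x)))

rng = random.Random(0)
means = [rng.gauss(3.0, 1.0) for _ in range(50)]             # true group means
continuous = [[rng.gauss(mu, 1.0) for _ in range(10)] for mu in means]
censored = [[likert(x) for x in g] for g in continuous]
print(round(icc1(continuous), 3), round(icc1(censored), 3))
```

Groups whose true means sit near 1 or 5 lose between-group variance to the floor and ceiling, which is the interaction with variance components that the study reports.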

Abstract:

Groupe Spécial Mobile (GSM) has been developed as the pan-European second generation of digital mobile systems. GSM operates in the 900 MHz frequency band and employs digital technology instead of the analogue technology of its predecessors. Digital technology enables the GSM system to operate in much smaller zones than the analogue systems. The GSM system will offer greater roaming facilities to its subscribers, extended throughout the countries that have installed the system, and could be seen as a further enhancement to European integration. GSM has adopted a contention-based protocol for multipoint-to-point transmission. In particular, the slotted-ALOHA medium access protocol is used to coordinate the transmission of channel request messages between the scattered mobile stations. Collisions still happen when more than one mobile station with the same random reference number attempts to transmit in the same time-slot. In this research, a modified version of this protocol has been developed in order to reduce the number of collisions and hence increase the random access channel throughput compared to the existing protocol. The performance evaluation of the protocol has been carried out using simulation methods. Owing to the growing demand for mobile radio telephony as well as for data services, optimal usage of the scarce available radio spectrum is becoming increasingly important. In this research, a protocol has been developed whereby the number of transmitted information packets over the GSM system is increased without any additional increase in the allocated radio spectrum. Simulation results are presented to show the improvements achieved by the proposed protocol. Cellular mobile radio networks commonly respond to an increase in service demand by using smaller coverage areas. As a result, the volume of the signalling exchanges increases.
In this research, a proposal for interconnecting the various entities of the mobile radio network over future broadband networks based on the IEEE 802.6 Metropolitan Area Network (MAN) is outlined. Simulation results are presented to show the benefits achieved by interconnecting these entities over the broadband networks.
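
The slotted-ALOHA contention that the modified protocol targets can be sketched in a few lines: each mobile picks a random slot for its channel request, and a slot carries a successful request only if exactly one station chose it. The station and slot counts below are illustrative, not GSM RACH parameters.

```python
import random
from collections import Counter

def slotted_aloha_round(n_stations, n_slots, rng):
    """One contention frame: return (successful slots, collided slots)."""
    chosen = Counter(rng.randrange(n_slots) for _ in range(n_stations))
    successes = sum(1 for c in chosen.values() if c == 1)
    collisions = sum(1 for c in chosen.values() if c > 1)
    return successes, collisions

rng = random.Random(7)
ok, clash = slotted_aloha_round(n_stations=20, n_slots=40, rng=rng)
print(ok, "requests got through,", clash, "slots collided")
```

For an offered load of G requests per slot, the classical slotted-ALOHA throughput is S = G·e^(-G), peaking at about 0.368 at G = 1; the modified protocol in the abstract aims to raise the achievable throughput by reducing collisions.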