69 results for process modelling
Abstract:
Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and sources of which often go unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where the development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, for quantifying model goodness-of-fit and for assessing the significance of model covariates.
Abstract:
Utilising cameras as a means to survey the surrounding environment is becoming increasingly popular in a number of different research areas and applications. Central to using camera sensors as input to a vision system is the need to be able to manipulate and process the information captured in these images. One such application is the use of cameras to monitor the quality of airport landing lighting at aerodromes, where a camera is placed inside an aircraft and used to record images of the lighting pattern during the landing phase of a flight. The images are processed to determine a performance metric. This requires the development of custom software for the localisation and identification of luminaires within the image data. However, because of the necessity to keep airport operations functioning as efficiently as possible, it is difficult to collect enough image data to develop, test and validate any such software. In this paper, we present a technique to model a virtual landing lighting pattern. A mathematical model is postulated which represents the glide path of the aircraft, including random deviations from the expected path. A morphological method has been developed to localise and track the luminaires under different operating conditions. © 2011 IEEE.
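The glide-path model in the abstract above is not given in full; the core idea, a nominal approach slope plus random deviations from the expected path, can be sketched as follows. The 3° slope, the noise level and the function name are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def glide_path(x, slope_deg=3.0, sigma=0.5, seed=0):
    """Height of the aircraft above the runway threshold at horizontal
    distance x: a nominal straight glide path at slope_deg degrees,
    plus Gaussian random deviations (sigma, metres) from that path.
    All parameter values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    nominal = x * np.tan(np.radians(slope_deg))
    return nominal + rng.normal(0.0, sigma, size=x.shape)

# Sample the path every 100 m over the final kilometre of the approach.
x = np.linspace(0.0, 1000.0, 11)
heights = glide_path(x)
```

Setting `sigma=0.0` recovers the deterministic nominal path, which is a convenient check when validating the localisation software against the virtual lighting pattern.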
Abstract:
Among the key challenges in the modelling and optimisation of composite structures against impact is the computational expense involved in setting up accurate simulations of the impact event and then performing the iterations required to optimise the designs. Given the limitations of the resources and time available, it is often of more interest to find good designs than the best possible design. In this paper, low cost but sufficiently accurate finite element (FE) models were generated in LS-DYNA for several experimentally characterised materials by semi-automating the modelling process and using existing material models. These models were then used by an optimisation algorithm to generate new hybrid offspring, leading to minimum weight and/or cost designs, drawn from a selection of isotropic metals, polymers and orthotropic fibre-reinforced laminates, that countered a specified impact threat. Experimental validation of the optimal designs thus identified was then successfully carried out using a single stage gas gun. With sufficient computational hardware, the techniques developed in this pilot study can further utilise fine meshes, equations of state and sophisticated material models, so that optimal hybrid systems can be identified from a wide range of materials, designs and threats.
Abstract:
An appreciation of the quantity of streamflow derived from the main hydrological pathways involved in transporting diffuse contaminants is critical when addressing a wide range of water resource management issues. In order to assess hydrological pathway contributions to streams, it is necessary to provide feasible upper and lower bounds for flows in each pathway. An important first step in this process is to provide reliable estimates of the slower responding groundwater pathways and subsequently the quicker overland and interflow pathways. This paper investigates the effectiveness of a multi-faceted approach applying different hydrograph separation techniques, supplemented by lumped hydrological modelling, for calculating the Baseflow Index (BFI), with the aim of developing an integrated approach to hydrograph separation. A semi-distributed, lumped and deterministic rainfall runoff model known as NAM has been applied to ten catchments (ranging from 5 to 699 km²). While this modelling approach is useful as a validation method, NAM itself is also an important investigative tool. The separation techniques produce a large variation in BFI: a difference of 0.741 was predicted for one catchment when the less reliable fixed interval, sliding interval and local minimum turning point methods were included; with these methods omitted, the variation is reduced to 0.167. The Boughton and Eckhardt algorithms, while quite subjective in their use, provide quick and easily implemented approaches for obtaining physically realistic hydrograph separations. While the different separation techniques give varying BFI values for each of the catchments, a recharge coefficient approach developed in Ireland, applied in conjunction with the Master Recession Curve Tabulation method, predicts estimates in agreement with those obtained using the NAM model, and these estimates are also consistent with the study catchments' geology.
These two separation methods, in conjunction with the NAM model, were selected to form an integrated approach to assessing BFI in catchments.
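The Eckhardt algorithm named above is a one-parameter recursive digital filter for splitting a streamflow record into baseflow and quickflow. A minimal sketch follows; the recession constant, BFImax value, initialisation and the flow record itself are illustrative assumptions, not values calibrated to the study catchments.

```python
import numpy as np

def eckhardt_baseflow(q, a=0.98, bfi_max=0.80):
    """Eckhardt recursive digital filter:
    b[k] = ((1 - BFImax)*a*b[k-1] + (1 - a)*BFImax*q[k]) / (1 - a*BFImax),
    with baseflow constrained never to exceed total streamflow."""
    b = np.zeros(len(q))
    b[0] = bfi_max * q[0]  # simple initialisation assumption
    for k in range(1, len(q)):
        b[k] = ((1 - bfi_max) * a * b[k - 1]
                + (1 - a) * bfi_max * q[k]) / (1 - a * bfi_max)
        b[k] = min(b[k], q[k])
    return b

# Toy daily flow record (m3/s); the numbers are invented for the sketch.
q = np.array([5.0, 4.5, 4.0, 8.0, 12.0, 9.0, 6.0, 5.0, 4.5, 4.2])
baseflow = eckhardt_baseflow(q)
bfi = baseflow.sum() / q.sum()  # Baseflow Index for the record
```

The subjectivity mentioned in the abstract enters through the choice of `a` and `bfi_max`, which is why the study cross-checks the result against NAM and the recharge coefficient approach.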
Abstract:
Polypropylene sheets have been stretched at 160 °C to a state of large biaxial strain of extension ratio 3, and the stresses then allowed to relax at constant strain. The state of strain is reached via a path consisting of two sequential planar extensions, the second perpendicular to the first, under plane stress conditions with zero stress acting normal to the sheet. This strain path is highly relevant to solid phase deformation processes such as stretch blow moulding and thermoforming, and also reveals fundamental aspects of the flow rule required in the constitutive behaviour of the material. The rate of decay of stress is rapid, and such as to be highly significant in the modelling of processes that include stages of constant strain. A constitutive equation is developed that includes Eyring processes to model both the stress relaxation and strain rate dependence of the stress. The axial and transverse stresses observed during loading show that the use of a conventional Levy-Mises flow rule is ineffective, and instead a flow rule is used that takes account of the anisotropic state of the material via a power law function of the principal extension ratios. Finally the constitutive model is demonstrated to give quantitatively useful representation of the stresses both in loading and in stress relaxation.
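The Eyring mechanism invoked above can be illustrated with a single-process sketch: at constant strain, the plastic strain rate follows a sinh law in stress, so the stress decays rapidly at first and then ever more slowly. The parameter values, units and function name below are illustrative assumptions, not the paper's fitted constants.

```python
import numpy as np

def eyring_relaxation(sigma0, modulus, rate_a, v_over_kt, t_end, dt=0.01):
    """Explicit-Euler integration of stress relaxation at constant strain
    for a single Eyring process: the plastic strain rate is
    rate_a * sinh(v_over_kt * sigma), so the stress obeys
    d(sigma)/dt = -modulus * rate_a * sinh(v_over_kt * sigma)."""
    steps = int(round(t_end / dt))
    sigma = np.empty(steps + 1)
    sigma[0] = sigma0
    for i in range(steps):
        sigma[i + 1] = sigma[i] - dt * modulus * rate_a * np.sinh(v_over_kt * sigma[i])
    return sigma

# Illustrative values only: sigma0 (MPa), modulus (MPa), rate_a (1/s),
# v_over_kt (1/MPa), t_end (s).
stress = eyring_relaxation(sigma0=10.0, modulus=1000.0, rate_a=1e-4,
                           v_over_kt=0.5, t_end=5.0)
```

The sinh term is what produces the "rapid decay" noted in the abstract: early on, when the stress is high, the relaxation rate is exponentially larger than the late-time, near-linear regime.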
Abstract:
Reinforced concrete (RC) beams may be strengthened for shear using externally bonded fiber reinforced polymer (FRP) composites in the form of side bonding, U-jacketing or complete wrapping. The shear failure of almost all RC beams shear-strengthened with side bonded FRP and the majority of those strengthened with FRP U-jackets, is due to debonding of the FRP. The bond behavior between the externally-bonded FRP reinforcement (referred to as FRP strips for simplicity) and the concrete substrate therefore plays a crucial role in the failure process of these beams. Despite extensive research in the past decade, there is still a lack of understanding of how debonding of FRP strips in such a beam propagates and how the debonding process affects its shear behavior. This paper presents an analytical study on the progressive debonding of FRP strips in such strengthened beams. The complete debonding process is modeled and the contribution of the FRP strips to the shear capacity of the beam is quantified. The validity of the analytical solution is verified by comparing its predictions with numerical results from a finite element analysis. This analytical treatment represents a significant step forward in understanding how interaction between FRP strips, steel stirrups and concrete affects the shear resistance of RC beams shear-strengthened with FRP strips.
Abstract:
Simulations of the injection stretch-blow moulding process have been developed for the manufacture of poly(ethylene terephthalate) bottles using the commercial finite element package ABAQUS/Standard. Initially a simulation of the manufacture of a 330 mL bottle was developed with three different material models (hyperelastic, creep, and a non-linear viscoelastic model (Buckley model)) to ascertain their suitability for modelling poly(ethylene terephthalate). The Buckley model was found to give results for the sidewall thickness that matched best with those measured from bottles off the production line. Following the investigation of the material models, the Buckley model was chosen to conduct a three-dimensional simulation of the manufacture of a 2 L bottle. It was found that the model was also capable of predicting the wall thickness distribution accurately for this bottle. In the development of the three-dimensional simulation a novel approach, which uses an axisymmetric model until the material reaches the petaloid base, was developed. This resulted in substantial savings in computing time. © 2000 IoM Communication Ltd.
Abstract:
The use of joint modelling approaches is becoming increasingly popular when an association exists between survival and longitudinal processes. Widely recognized for their gain in efficiency, joint models also offer a reduction in bias compared with naïve methods. With the increasing popularity comes a constantly expanding literature on joint modelling approaches. The aim of this paper is to give an overview of recent literature relating to joint models, in particular those that focus on the time-to-event survival process. A discussion is provided on the range of survival submodels that have been implemented in a joint modelling framework. A particular focus is given to the recent advancements in software used to build these models. Illustrated through the use of two different real-life data examples that focus on the survival of end-stage renal disease patients, the use of the JM and joineR packages within R is demonstrated. The possible future direction for this field of research is also discussed. © 2013 International Statistical Institute.
Abstract:
In this article the multibody simulation software package MADYMO for analysing and optimizing occupant safety design was used to model crash tests for Normal Containment barriers in accordance with EN 1317. The verification process was carried out by simulating a TB31 and a TB32 crash test performed on vertical portable concrete barriers and by comparing the numerical results to those obtained experimentally. The same modelling approach was applied to both tests to evaluate the predictive capacity of the modelling at two different impact speeds. A sensitivity analysis of the vehicle stiffness was also carried out. The capacity to predict all of the principal EN 1317 criteria was assessed for the first time: the acceleration severity index, the theoretical head impact velocity, the barrier working width and the vehicle exit box. Results showed a maximum error of 6% for the acceleration severity index and 21% for the theoretical head impact velocity in comparison to the recorded data. The exit box position was predicted with a maximum error of 4°. For the working width, a large percentage difference was observed for test TB31 due to the small absolute value of the barrier deflection, but the results were well within the limit value from the standard for both tests. The sensitivity analysis showed the robustness of the modelling with respect to contact stiffness changes of ±20% and ±40%. This is the first multibody model of portable concrete barriers that can reproduce not only the acceleration severity index but all the test criteria of EN 1317, and it is therefore a valuable tool for new product development and for injury biomechanics research.
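For reference, the acceleration severity index in EN 1317 is computed from the three vehicle acceleration components (in g), averaged over a 50 ms moving window and scaled by the limit values 12 g, 9 g and 10 g for the x, y and z axes; the ASI is the maximum over time of the resulting root-sum-of-squares. A minimal sketch, with function and variable names of our choosing:

```python
import numpy as np

def asi(ax, ay, az, dt, window=0.05):
    """Acceleration severity index per EN 1317: component accelerations
    (in g, sampled at interval dt seconds) are averaged over a 50 ms
    moving window, scaled by the 12 g / 9 g / 10 g limits, combined in
    quadrature, and the maximum over time is returned."""
    n = max(1, int(round(window / dt)))
    kernel = np.ones(n) / n
    avg = lambda a: np.convolve(a, kernel, mode="valid")
    return float(np.sqrt((avg(ax) / 12) ** 2
                         + (avg(ay) / 9) ** 2
                         + (avg(az) / 10) ** 2).max())
```

As a sanity check, a constant 12 g purely longitudinal pulse gives an ASI of exactly 1.0, i.e. it sits right at the x-axis limit value.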
Abstract:
Background: Large-scale randomised controlled trials are relatively rare in education. The present study approximates to, but is not exactly, a randomised controlled trial. It was an attempt to scale up previous small peer tutoring projects, while investing only modestly in continuing professional development for teachers. Purpose: A two-year study of peer tutoring in reading was undertaken in one local education authority in Scotland. The relative effectiveness of cross-age versus same-age tutoring, light versus intensive intervention, and reading versus reading and mathematics tutoring were investigated. Programme description: The intervention was Paired Reading, a freely available cross-ability tutoring method applied to books of the pupils' choice but above the tutee's independent readability level. It involves Reading Together and Reading Alone, and switching from one to the other according to need. Sample: Eighty-seven primary schools of overall average socio-economic status, ability and gender in one council in Scotland. There were few ethnic minority students. Proportions of students with special needs were low. Children were eight and 10 years old as the intervention started. Macro-evaluation n = 3520. Micro-evaluation: Year 1, 15 schools, n = 592; Year 2, a different 15 schools, n = 591; compared with a comparison group of five schools, n = 240. Design and methods: Almost all the primary schools in the local authority participated and were randomly allocated to condition. A macro-evaluation tested and retested over a two-year period using Performance Indicators in Primary Schools. A micro-evaluation tested and retested within each year using norm-referenced tests of reading comprehension. Macro-evaluation was with multi-level modelling, micro-evaluation with descriptive statistics and effect sizes, analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA). Results: Macro-evaluation yielded significant pre-post gains in reading attainment for cross-age tutoring over both years. No other differences were significant. Micro-evaluation yielded pre-post changes in Year 1 (selected) and Year 2 (random) greater than controls, with no difference between same-age and cross-age tutoring. Light and intensive tutoring were equally effective. Tutoring reading and mathematics together was more effective than only tutoring reading. Lower socio-economic and lower reading ability students did better. Girls did better than boys. Regarding observed implementation quality, some factors were high and others low. Few implementation variables correlated with attainment gain. Conclusions: Paired Reading tutoring does lead to better reading attainment compared with students not participating. This is true in the long term (macro-evaluation) for cross-age tutoring, and in the short term (micro-evaluation) for both cross-age and same-age tutoring. Tutors and tutees benefited. Intensity had no effect but dual tutoring did have an effect. Low-socio-economic status, low-ability and female students did better. The results of the different forms of evaluation were indeed different. There are implications for practice and for future research. © 2012 Copyright Taylor and Francis Group, LLC.
Abstract:
This paper presents an integrated design and costing method for large stiffened panels for the purpose of investigating the influence and interaction of lay-up technology and production rate on manufacturing cost. A series of wing cover panels (≈586 kg, 19.9 m²) have been sized with realistic requirements considering manual and automated lay-up routes. The integrated method has enabled the quantification of component unit cost sensitivity to changes in annual production rate and employed equipment maximum deposition rate. Moreover, the results demonstrate the interconnected relationship between lay-up process, panel design and unit cost. The optimum unit cost solution when using automated lay-up is a combination of the minimum deposition rate and minimum number of lay-up machines needed to meet the required production rate. However, the location of the optimum unit cost, at the boundaries between the numbers of lay-up machines required, can make unit cost very sensitive to small changes in component design, production rate and equipment maximum deposition rate.
Abstract:
In the production process of polyethylene terephthalate (PET) bottles, the initial temperature of the preforms plays a central role in the final thickness, intensity and other structural properties of the bottles. The difference between the inside and outside temperature profiles can also have a significant impact on the final product quality. The preforms are preheated by an infrared heating oven system, which is often operated open loop and relies heavily on a trial and error approach to adjust the lamp power settings. In this paper, a radial basis function (RBF) neural network model, optimized by a two-stage selection (TSS) algorithm combined with particle swarm optimization (PSO), is developed to model the nonlinear relations between the lamp power settings and the output temperature profile of PET bottles. An improved PSO method for lamp setting adjustment using the above model is then presented. Simulation results based on experimental data confirm the effectiveness of the modelling and optimization method.
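The forward pass of an RBF network of the kind described above can be sketched in a few lines. The TSS/PSO training stage is omitted here, a Gaussian basis with a shared width is assumed, and all names and numbers are illustrative, not the paper's model:

```python
import numpy as np

def rbf_predict(x, centres, width, weights):
    """Gaussian RBF network forward pass:
    y = sum_j weights[j] * exp(-||x - centres[j]||^2 / (2 * width^2)).
    In the paper the centres and weights would be selected by the TSS
    algorithm and tuned by PSO; here they are simply fixed inputs."""
    phi = np.exp(-np.sum((centres - x) ** 2, axis=1) / (2.0 * width ** 2))
    return phi @ weights

# Toy example: two basis centres mapping a 2-D lamp-setting vector to a
# single temperature value (the numbers are invented for the sketch).
centres = np.array([[0.0, 0.0], [1.0, 1.0]])
weights = np.array([2.0, 0.5])
y = rbf_predict(np.array([0.0, 0.0]), centres, 1.0, weights)
```

Because the output is linear in the weights, only the centre positions and widths make the optimisation non-convex, which is one reason population-based methods such as PSO are attractive here.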
Abstract:
Extrusion is one of the fundamental production methods in the polymer processing industry and is used in the production of a large number of commodities in a diverse industrial sector. Being an energy intensive production method, process energy efficiency is one of the major concerns and the selection of the most energy efficient processing conditions is a key to reducing operating costs. Usually, extruders consume energy through the drive motor, barrel heaters, cooling fans, cooling water pumps, gear pumps, etc. Typically the drive motor is the largest energy consuming device in an extruder while barrel/die heaters are responsible for the second largest energy demand. This study is focused on investigating the total energy demand of an extrusion plant under various processing conditions while identifying ways to optimise the energy efficiency. Initially, a review was carried out on the monitoring and modelling of the energy consumption in polymer extrusion. Also, the power factor, energy demand and losses of a typical extrusion plant were discussed in detail. The mass throughput, total energy consumption and power factor of an extruder were experimentally observed over different processing conditions and the total extruder energy demand was modelled empirically and also using a commercially available extrusion simulation software. The experimental results show that extruder energy demand is heavily coupled between the machine, material and process parameters. The total power predicted by the simulation software exhibits a lagging offset compared with the experimental measurements. Empirical models are in good agreement with the experimental measurements and hence these can be used in studying process energy behaviour in detail and to identify ways to optimise the process energy efficiency.
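The empirical models mentioned above are not specified in the abstract; as a hedged illustration only, total power demand can be regressed against a single process variable such as screw speed with an ordinary polynomial least-squares fit. The data points below are invented for the sketch:

```python
import numpy as np

# Hypothetical measurements: screw speed (rpm) vs total power demand (kW).
# A real empirical model would also include material and temperature terms.
speed = np.array([30.0, 50.0, 70.0, 90.0, 110.0])
power = np.array([4.1, 6.0, 8.2, 10.5, 13.1])

coeffs = np.polyfit(speed, power, 2)   # quadratic empirical model
predict = np.poly1d(coeffs)            # callable model: power = f(speed)
```

A model like `predict(80.0)` can then interpolate the energy demand at untested operating points, which is the role the paper's empirical models play in studying process energy behaviour.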
Abstract:
Quantum yields of the photocatalytic degradation of methyl orange under controlled periodic illumination (CPI) have been modelled using existing models. A modified Langmuir-Hinshelwood (L-H) rate equation was used to predict the degradation reaction rates of methyl orange at various duty cycles and a simple photocatalytic model was applied in modelling quantum yield enhancement of the photocatalytic process due to the CPI effect. A good agreement between the modelled and experimental data was observed for quantum yield modelling. The modified L-H model, however, did not accurately predict the photocatalytic decomposition of the dye under periodic illumination.
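The classical Langmuir-Hinshelwood rate law and the CPI duty cycle referred to above take the following textbook forms; the paper's modified L-H equation is not reproduced in the abstract, so only the standard expressions are sketched, with illustrative parameter values:

```python
def lh_rate(conc, k_r, K_ads):
    """Classical Langmuir-Hinshelwood rate law:
    r = k_r * K * C / (1 + K * C), where k_r is the reaction rate
    constant and K the adsorption equilibrium constant."""
    return k_r * K_ads * conc / (1.0 + K_ads * conc)

def duty_cycle(t_on, t_off):
    """CPI duty cycle: the fraction of each light/dark period for which
    the UV source is on."""
    return t_on / (t_on + t_off)
```

At high concentration (K*C >> 1) the rate saturates at k_r, while at low concentration it reduces to first-order kinetics, r ≈ k_r*K*C; the CPI experiments in the paper vary `duty_cycle` to probe the quantum yield enhancement.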
Abstract:
The technique of externally bonding fibre reinforced polymer (FRP) composites has become popular worldwide for retrofitting existing reinforced concrete (RC) structures. A major failure mode in such strengthened structures is the debonding of the FRP from the concrete substrate. The bond behaviour between FRP and concrete thus plays a crucial role in these structures. The FRP-to-concrete bond behaviour has been extensively investigated experimentally, commonly using the pull-off test of an FRP-to-concrete bonded joint. Comparatively, much less research has been concerned with the numerical simulation of this bond behaviour, chiefly due to difficulties in accurately modelling the complex behaviour of concrete. This paper proposes a robust finite element (FE) model for simulating the bond behaviour over the entire loading process in the pull-off test. A concrete damage plasticity model based on the plastic degradation theory is proposed to overcome the weakness of the elastic degradation theory commonly adopted in previous studies. The model produces results in very close agreement with test data. © Tsinghua University Press, Beijing and Springer-Verlag Berlin Heidelberg 2011.