28 results for Cost modelling


Relevance: 30.00%

Abstract:

The motivation for this paper is to present procedures for automatically creating idealised finite element models from the 3D CAD solid geometry of a component. The procedures produce an accurate and efficient analysis model with little effort on the part of the user. The technique is applicable to thin-walled components with local complex features and automatically creates analysis models where 3D elements representing the complex regions in the component are embedded in an efficient shell mesh representing the mid-faces of the thin sheet regions. As the resulting models contain elements of more than one dimension, they are referred to as mixed dimensional models. Although these models are computationally more expensive than some of the idealisation techniques currently employed in industry, they do allow the structural behaviour of the model to be analysed more accurately, which is essential if appropriate design decisions are to be made. Also, using these procedures, analysis models can be created automatically, whereas the current idealisation techniques are mostly manual, have long preparation times, and are based on engineering judgement. In the paper, the idealisation approach is first applied to 2D models that are used to approximate axisymmetric components for analysis. For these models, 2D elements representing the complex regions are embedded in a 1D mesh representing the midline of the cross section of the thin sheet regions. Also discussed is the coupling necessary to link the elements of different dimensionality together. Analysis results from a 3D mixed dimensional model created using the techniques in this paper are compared to those from a stiffened shell model and a 3D solid model to demonstrate the improved accuracy of the new approach. At the end of the paper, a quantitative analysis of the reduction in computational cost due to shell meshing thin sheet regions demonstrates that the reduction in degrees of freedom is proportional to the square of the aspect ratio of the region; for long slender solids, the reduction can be proportional to the aspect ratio of the region if appropriate meshing algorithms are used.
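
As a back-of-envelope illustration of the quoted scaling (not the paper's derivation), assume well-shaped solid elements must be sized near the sheet thickness t, while shell elements of the mid-face can be sized independently of t; the solid-to-shell element-count ratio, a proxy for the DOF ratio, then grows with the square of the aspect ratio a = L/t:

```python
# Illustrative estimate only. For a square thin-sheet region of side L and
# thickness t (aspect ratio a = L/t), solid elements of size ~t give roughly
# a^2 elements in-plane, while a shell mesh of the mid-face can use a fixed
# number of elements regardless of t.

def dof_ratio(aspect_ratio: float, shell_elems_per_side: int = 10) -> float:
    """Approximate ratio of solid-mesh to shell-mesh element (and DOF) counts."""
    solid_elems = aspect_ratio ** 2          # ~ (L/t)^2 solid elements in-plane
    shell_elems = shell_elems_per_side ** 2  # fixed-resolution shell mesh
    return solid_elems / shell_elems         # grows with the square of a

for a in (10, 50, 100):
    print(f"aspect ratio {a:>3}: solid/shell ratio ~ {dof_ratio(a):,.0f}")
```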

Relevance: 30.00%

Abstract:

Among the key challenges in the modelling and optimisation of composite structures against impact is the computational expense involved in setting up accurate simulations of the impact event and then performing the iterations required to optimise the designs. Given the limits on available resources and time, it is often of more interest to find good designs than the best possible design. In this paper, low-cost but sufficiently accurate finite element (FE) models were generated in LS-DYNA for several experimentally characterised materials by semi-automating the modelling process and using existing material models. These models were then used by an optimisation algorithm to generate new hybrid offspring, leading to minimum weight and/or cost designs from a selection of isotropic metals, polymers and orthotropic fibre-reinforced laminates that countered a specified impact threat. Experimental validation of the optimal designs thus identified was then successfully carried out using a single-stage gas gun. With sufficient computational hardware, the techniques developed in this pilot study can further utilise fine meshes, equations of state and sophisticated material models, so that optimal hybrid systems can be identified from a wide range of materials, designs and threats.
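
The optimisation loop itself can be sketched with a generic genetic algorithm; the material library, fitness weighting and threat constraint below are invented placeholders standing in for the FE-derived models, not the authors' LS-DYNA-coupled optimiser:

```python
import random

random.seed(1)

# Hypothetical material library: density [g/cm^3] and an invented per-mm
# "resistance" score standing in for FE-predicted ballistic performance.
MATERIALS = {
    "steel":  {"rho": 7.85, "resist": 9.0},
    "al7075": {"rho": 2.81, "resist": 5.0},
    "uhmwpe": {"rho": 0.97, "resist": 6.0},
    "cfrp":   {"rho": 1.60, "resist": 7.0},
}
THREAT = 20.0  # required total resistance score for the specified impact threat

def random_design(n_layers=4):
    # a design is a stack of (material, thickness mm) layers
    return [(random.choice(list(MATERIALS)), random.uniform(1.0, 10.0))
            for _ in range(n_layers)]

def fitness(design):
    """Penalised areal mass: minimise weight while countering the threat."""
    mass = sum(MATERIALS[m]["rho"] * t for m, t in design)
    resist = sum(MATERIALS[m]["resist"] * t / 10.0 for m, t in design)
    penalty = max(0.0, THREAT - resist) * 1e3   # heavy penalty if threat not met
    return mass + penalty

def crossover(a, b):
    cut = random.randrange(1, len(a))           # single-point layer crossover
    return a[:cut] + b[cut:]

def mutate(design, p=0.2):
    # randomly perturb layer thicknesses, keeping them positive
    return [(m, max(0.5, t + random.gauss(0, 1)) if random.random() < p else t)
            for m, t in design]

pop = [random_design() for _ in range(40)]
for gen in range(100):
    pop.sort(key=fitness)
    parents = pop[:10]                           # truncation selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(30)]         # hybrid offspring
best = min(pop, key=fitness)
print("best design:", best, "| fitness:", round(fitness(best), 2))
```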

Relevance: 30.00%

Abstract:

Gabions are stone-filled wire containers that are frequently used as retaining walls. Owing to their high mass, relatively low cost and visual appeal, however, a row of single gabion blocks joined at the ends also has the potential to be used as a roadside impact-absorption device where traditional steel or concrete devices may not be suitable. To evaluate this application, the shear and bending deformation of gabions under vehicle impact needs to be investigated. In this paper, the shear response of a single gabion block is analytically modelled and a gabion beam multibody model is developed using a discretisation method to capture the deformability of the gabion structure. The material properties of the gabion beam are adopted from experimental values available in the literature, and the modelling is statically validated against a three-point bending test and a distributed-loading test. The results show that the discretised multibody modelling can be used effectively to describe the static deformation behaviour of gabion blocks.
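
A minimal 1D analogue of such a discretised deformable beam (not the authors' multibody gabion model) integrates the curvature of a simply supported span in three-point bending and recovers the textbook mid-span deflection:

```python
import numpy as np

# Discretise a simply supported beam under a central point load: the bending
# moment M(x) gives a curvature kappa = M/EI at each station, which is
# integrated twice (slope, then deflection), enforcing w(0) = w(L) = 0.

L, EI, P, n = 2.0, 5.0e4, 1.0e3, 200     # span [m], stiffness [N m^2], load [N], segments
x = np.linspace(0.0, L, n + 1)
M = np.where(x <= L / 2, P * x / 2, P * (L - x) / 2)   # bending moment diagram
kappa = M / EI                                          # curvature at each station

# trapezoidal cumulative integration: curvature -> slope -> deflection
theta = np.concatenate(([0.0], np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * np.diff(x))))
w = np.concatenate(([0.0], np.cumsum(0.5 * (theta[1:] + theta[:-1]) * np.diff(x))))
w -= x * w[-1] / L                                      # enforce zero deflection at both supports

print(f"discretised mid-span deflection: {w[n // 2]: .6f} m")
print(f"analytic  -P*L^3/(48*EI):        {-P * L**3 / (48 * EI): .6f} m")
```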

Relevance: 30.00%

Abstract:

In this paper, a novel method for modelling a scaled vehicle–barrier crash test similar to the 20° angled barrier test specified in EN 1317 is reported. The intended application is for proof-of-concept evaluation of novel roadside barrier designs, and as a cost-effective precursor to full-scale testing or detailed computational modelling. The method is based on the combination of the conservation-of-energy law and the equation of motion of a spring–mass system representing the impact, and shows, for the first time, the feasibility of applying classical scaling theories to the evaluation of roadside barrier designs. The scaling method is used to set the initial velocity of the vehicle in the scaled test and to provide scaling factors that convert the measured vehicle accelerations in the scaled test to predicted full-scale accelerations. These values can then be used to calculate the Acceleration Severity Index score of the barrier for a full-scale test. The theoretical validity of the method is demonstrated by comparison with numerical simulations of scaled and full-scale angled barrier impacts using multibody analysis implemented in the crash simulation software MADYMO. Results show a maximum error of 0.3% ascribable to the scaling method.
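
For orientation, the standard replica-scaling relations from the family of classical scaling theories the abstract invokes are given below, assuming identical materials at both scales; the paper's exact derivation may differ:

```latex
% Classical replica scaling for a geometrically scaled impact test,
% assuming the same materials (density, modulus) at both scales.
% With length scale factor \lambda = L_{\text{model}} / L_{\text{full}}:
\begin{align*}
  \text{length} &\sim \lambda, & \text{mass} &\sim \lambda^{3}, & \text{time} &\sim \lambda, \\
  \text{velocity} &\sim 1, & \text{acceleration} &\sim \lambda^{-1}, & \text{energy} &\sim \lambda^{3}.
\end{align*}
% Hence the impact speed is left unscaled, and a measured model-scale
% acceleration a_m maps to a predicted full-scale value a_f = \lambda\, a_m.
```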

Relevance: 30.00%

Abstract:

This paper presents a physics-based modelling procedure to predict the thermal damage of composite material when struck by lightning. The procedure uses the Finite Element Method with non-linear material models to represent the extreme thermal material behaviour of the composite material (carbon/epoxy) and an embedded copper mesh protection system. Simulation predictions are compared against published experimental data, illustrating the potential accuracy and computational cost of virtual lightning strike tests and the requirement for temperature-dependent material modelling. The modelling procedure is then used to examine and explain a number of practical solutions to minimise thermal material damage.
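
A minimal 1D finite-difference sketch of this kind of non-linear thermal analysis is given below; the temperature-dependent conductivity and all material values are hypothetical placeholders, not measured carbon/epoxy data or the paper's FE procedure:

```python
import numpy as np

# Explicit 1D transient conduction through the laminate thickness, with a
# (placeholder) conductivity that degrades above a charring temperature and
# a surface heat-flux pulse standing in for the lightning arc attachment.

nx, Lz = 101, 4e-3                  # nodes, laminate thickness [m]
dz = Lz / (nx - 1)
rho_cp = 1.6e3 * 1.2e3              # density * specific heat [J/(m^3 K)]

def k(T):
    # placeholder: conductivity drops as the resin chars above ~600 K
    return np.where(T < 600.0, 0.8, 0.3)

T = np.full(nx, 300.0)              # initial temperature [K]
q_surf, t_pulse = 2.0e6, 0.05       # surface flux [W/m^2], pulse duration [s]
dt = 0.4 * rho_cp * dz**2 / 0.8     # explicit stability limit for k_max = 0.8

t = 0.0
while t < 0.2:
    kf = 0.5 * (k(T[1:]) + k(T[:-1]))          # face conductivities
    flux = kf * (T[1:] - T[:-1]) / dz          # internal conductive fluxes
    dTdt = np.zeros_like(T)
    dTdt[1:-1] = (flux[1:] - flux[:-1]) / (rho_cp * dz)
    # front face: half control volume, heated during the pulse
    dTdt[0] = (flux[0] + (q_surf if t < t_pulse else 0.0)) / (rho_cp * dz / 2)
    # back face held at ambient (crude heat-sink boundary: dTdt[-1] stays 0)
    T += dt * dTdt
    t += dt

print(f"peak front-face temperature: {T[0]:.0f} K")
```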

Relevance: 30.00%

Abstract:

Introduction: Asthma is now one of the most common long-term conditions in the UK. It is therefore important to develop a comprehensive appreciation of the healthcare and societal costs in order to inform decisions on care provision and planning. We plan to build on our earlier estimates of national prevalence and costs of asthma by filling the data gaps previously identified in relation to healthcare and broadening the field of enquiry to include societal costs. This work will provide the first UK-wide estimates of the costs of asthma. In the context of asthma for the UK and its member countries (i.e., England, Northern Ireland, Scotland and Wales), we seek to: (1) produce a detailed overview of estimates of incidence, prevalence and healthcare utilisation; (2) estimate health and societal costs; (3) identify any remaining information gaps and explore the feasibility of filling these; and (4) provide insights into future research that has the potential to inform changes in policy leading to the provision of more cost-effective care.

Methods and analysis: Secondary analyses of data from national health surveys, primary care, prescribing, emergency care, hospital, mortality and administrative data sources will be undertaken to estimate prevalence, healthcare utilisation and outcomes from asthma. Data linkages and economic modelling will be undertaken in an attempt to populate data gaps and estimate costs. Separate prevalence and cost estimates will be calculated for each of the UK member countries and these will then be aggregated to generate UK-wide estimates.

Ethics and dissemination: Approvals have been obtained from the NHS Scotland Information Services Division's Privacy Advisory Committee, the Secure Anonymised Information Linkage Collaboration Review System, the NHS South-East Scotland Research Ethics Service and The University of Edinburgh's Centre for Population Health Sciences Research Ethics Committee. We will produce a report for Asthma-UK, submit papers to peer-reviewed journals and construct an interactive map.

Relevance: 30.00%

Abstract:

Reduced Order Models (ROMs) have proven to be a valid and efficient approach to modelling the thermal behaviour of building zones. The main issues associated with the use of zonal/lumped models are how to (1) divide the domain into lumps and (2) evaluate the parameters which characterise the lump-to-lump exchange of energy and momentum. The objective of this research is to develop a methodology for the generation of ROMs from CFD models. The lumps of the ROM and their average property values are automatically extracted from the CFD models through user-defined constraints. This methodology has been applied to validated CFD models of a zone of the Environmental Research Institute (ERI) Building at University College Cork (UCC). The ROM predicts the temperature distribution in the domain with an average error lower than 2%, and it is computationally efficient, with an execution time of 3.45 seconds. Future steps in this research will be the development of a procedure to automatically extract the parameters which define lump-to-lump energy and momentum exchange; at present, these parameters are evaluated through the minimisation of a cost function. The ROMs will also be utilised to predict the transient thermal behaviour of the building zone.
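
The cost-function minimisation mentioned above can be illustrated with a toy two-lump thermal network, in which a single inter-lump conductance is identified against reference temperatures standing in for the CFD results; all values are illustrative, not the ERI building model:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Two thermal lumps T1 -> T2 with one unknown inter-lump conductance G [W/K],
# fitted by minimising a squared-error cost against noisy reference data.

C1, C2 = 5e4, 5e4                  # lump heat capacities [J/K]
q_in, T_amb, G_env = 500.0, 293.0, 5.0
dt, nsteps = 60.0, 240             # 4 h of 1-minute explicit Euler steps

def simulate(G):
    T1, T2 = T_amb, T_amb
    out = []
    for _ in range(nsteps):
        T1 += dt * (q_in - G * (T1 - T2)) / C1
        T2 += dt * (G * (T1 - T2) - G_env * (T2 - T_amb)) / C2
        out.append(T2)
    return np.array(out)

# reference data standing in for CFD output (true G = 12, small noise added)
reference = simulate(12.0) + np.random.default_rng(0).normal(0, 0.05, nsteps)

cost = lambda G: float(np.mean((simulate(G) - reference) ** 2))
res = minimize_scalar(cost, bounds=(1.0, 50.0), method="bounded")
print(f"identified conductance G = {res.x:.2f} W/K (true value 12.0)")
```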

Relevance: 30.00%

Abstract:

Building Information Modelling (BIM) is growing in pace, not only in the design and construction stages but also in the analysis of facilities throughout their life cycle. With this continued growth in BIM processes comes the possibility of adopting such procedures to measure the energy efficiency of buildings and to estimate their energy usage accurately. To this end, the aim of this research is to investigate whether BIM energy performance assessment, in the form of software analysis, provides accurate results when compared with actual recorded energy consumption. Through selective sampling, three domestic case studies are scrutinised, with baseline figures taken from existing energy providers and compared with calculations from two separate BIM energy analysis software packages. Of the numerous packages available, criterion sampling is used to select two of the most prominent platforms on the market today: Integrated Environmental Solutions - Virtual Environment (IES-VE) and Green Building Studio (GBS). The results indicate that IES-VE estimated energy use to within approximately ±8% in two of the three case studies, while GBS estimated usage to within approximately ±5%. The findings indicate that BIM energy performance assessment using proprietary software analysis is a viable alternative to manual calculation of building energy use, mainly due to the accuracy and speed with which even the most complex models can be assessed. Given the surge in accurate and detailed BIM models and the importance placed on the continued monitoring and control of buildings' energy use within today's environmentally conscious society, this provides an alternative means of assessing a building's energy usage accurately, quickly and cost-effectively.

Relevance: 30.00%

Abstract:

BACKGROUND: Despite vaccines and improved medical intensive care, clinicians must remain vigilant for possible Meningococcal Disease in children. The objective was to establish whether the procalcitonin test is a cost-effective adjunct for detecting prodromal Meningococcal Disease in children presenting at the emergency department with fever without source.

METHODS AND FINDINGS: Data to evaluate procalcitonin, C-reactive protein and white cell count tests as indicators of Meningococcal Disease were collected from six independent studies identified through a systematic literature search, applying PRISMA guidelines. The data included 881 children with fever without source in developed countries. The optimal cut-off value for each of the procalcitonin, C-reactive protein and white cell count tests as an indicator of Meningococcal Disease was determined. Summary receiver operating characteristic (SROC) curve analysis determined the overall diagnostic performance of each test with 95% confidence intervals. A decision-analytic model was designed to reflect realistic clinical pathways for a child presenting with fever without source by comparing two diagnostic strategies: standard testing using combined C-reactive protein and white cell count tests, and standard testing plus the procalcitonin test. The costs of each of the four diagnosis groups (true positive, false negative, true negative and false positive) were assessed from a National Health Service payer perspective. The procalcitonin test was more accurate (sensitivity = 0.89, 95% CI 0.76-0.96; specificity = 0.74, 95% CI 0.4-0.92) for early Meningococcal Disease than standard testing alone (sensitivity = 0.47, 95% CI 0.32-0.62; specificity = 0.8, 95% CI 0.64-0.9). Decision-analytic model outcomes indicated that the incremental cost-effectiveness ratio for the base case was -£8,137.25 (-US$13,371.94) per correctly treated patient.
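
The decision-analytic comparison reduces to the standard incremental cost-effectiveness ratio; the sketch below states the definition as it applies here (the sign interpretation is the usual one for a dominant strategy, consistent with the conclusions):

```latex
% Incremental cost-effectiveness ratio of adding procalcitonin (PCT)
% to standard C-reactive protein + white-cell-count testing:
\[
  \mathrm{ICER}
  = \frac{C_{\text{standard}+\mathrm{PCT}} - C_{\text{standard}}}
         {E_{\text{standard}+\mathrm{PCT}} - E_{\text{standard}}}
\]
% with C the expected cost per child and E the proportion of correctly
% treated patients. A negative ICER arising from lower cost and higher
% effectiveness indicates a dominant strategy, as suggested by the
% reported base case of -£8,137.25 per correctly treated patient.
```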

CONCLUSIONS: Procalcitonin plus the standard recommended tests improved the discriminatory ability for fatal Meningococcal Disease and was more cost-effective; it was also a superior biomarker in infants. Further research is recommended into point-of-care procalcitonin testing and Markov modelling to incorporate cost per QALY with a lifetime model.

Relevance: 30.00%

Abstract:

A novel digital image correlation (DIC) technique has been developed to track changes in textile yarn orientations during shear characterisation experiments, requiring only low-cost digital imaging equipment. Fabric shear angles and effective yarn strains are calculated and visualised using this new DIC technique for bias extension testing of an aerospace-grade carbon-fibre reinforcement material with a plain-weave architecture. The DIC results are validated by direct measurement, and the use of a wide bias extension sample is evaluated against a more commonly used narrow sample. Wide samples exhibit a shear angle range 25% greater than narrow samples and peak loads which are 10 times higher. This is primarily due to excessive yarn slippage in the narrow samples; hence, the wide sample configuration is recommended for characterisation of the shear properties required for accurate modelling of textile draping.
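
For reference, the pin-jointed-net kinematics commonly used to cross-check measured shear angles in bias extension testing are given below; they assume inextensible, non-slipping yarns, an assumption that the yarn slippage in the narrow samples violates:

```latex
% Pin-jointed net kinematics for a bias extension test: yarns start at
% \pm 45^{\circ} to the loading axis. For a gauge region of initial
% length L_0 and crosshead displacement d, the half-angle \phi between
% yarn and loading axis satisfies \cos\phi = (L_0 + d)/(\sqrt{2}\,L_0),
% giving the theoretical fabric shear angle
\[
  \gamma \;=\; \frac{\pi}{2} \;-\; 2\,
  \arccos\!\left(\frac{L_0 + d}{\sqrt{2}\,L_0}\right).
\]
% At d = 0, \gamma = 0; yarn slippage makes the measured angle fall
% below this theoretical value.
```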

Relevance: 30.00%

Abstract:

The development of a virtual testing environment, as a cost-effective industrial design tool in the design and analysis of composite structures, requires the ability to create models efficiently, as well as to accelerate the analysis by reducing the number of degrees of freedom, while still satisfying the need to track accurately the evolution of a debond, delamination or crack front. The eventual aim is to simulate both damage initiation and propagation in components with realistic geometrical features, where crack propagation paths are not trivial. Meshless approaches such as the Element-Free Galerkin (EFG) method are particularly suitable for problems involving changes in topology and have been successfully applied to simulate damage in homogeneous materials and concrete. In this work, the method is utilised to model initiation and mixed-mode propagation of cracks in composite laminates, and to simulate experimentally observed crack migration which is difficult to model using standard finite element analysis.
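
The approximation underlying EFG shape functions is moving least squares (MLS); the following is a minimal 1D sketch of that construction only, not a crack model (a full EFG analysis also needs weak-form assembly and crack-visibility or enrichment treatment):

```python
import numpy as np

# 1D moving least squares: no mesh, only nodes and compactly supported
# weights. The fit is evaluated point-by-point from a local weighted
# least-squares problem with a linear basis p = [1, x].

nodes = np.linspace(0.0, 1.0, 11)
u_nodes = np.sin(2 * np.pi * nodes)          # nodal data to approximate
support = 0.3                                 # weight-function radius

def weight(r):
    """Cubic spline weight on normalised distance r = |x - x_i| / support."""
    w = np.zeros_like(r)
    m1, m2 = r <= 0.5, (r > 0.5) & (r <= 1.0)
    w[m1] = 2/3 - 4*r[m1]**2 + 4*r[m1]**3
    w[m2] = 4/3 - 4*r[m2] + 4*r[m2]**2 - (4/3)*r[m2]**3
    return w

def mls_value(x):
    """Evaluate the MLS fit at point x."""
    w = weight(np.abs(x - nodes) / support)
    P = np.column_stack((np.ones_like(nodes), nodes))   # basis at nodes
    A = (P * w[:, None]).T @ P                          # moment matrix
    b = (P * w[:, None]).T @ u_nodes
    a = np.linalg.solve(A, b)                           # local coefficients
    return a[0] + a[1] * x

for x in np.linspace(0.05, 0.95, 7):
    print(f"x={x:.2f}  MLS={mls_value(x): .4f}  exact={np.sin(2*np.pi*x): .4f}")
```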

Relevance: 30.00%

Abstract:

DESIGN: We will address our research objectives by searching the published and unpublished literature and conducting an evidence synthesis of i) studies of the effectiveness of psychosocial interventions provided for children and adolescents who have suffered maltreatment, ii) economic evaluations of these interventions and iii) studies of their acceptability to children, adolescents and their carers.

SEARCH STRATEGY: Evidence will be identified via electronic databases for health and allied health literature, social sciences and social welfare, education and other evidence-based depositories, and economic databases. We will identify material generated by user-led, voluntary-sector enquiry by searching the internet and browsing the websites of relevant UK government departments and charities. Additionally, studies will be identified via the bibliographies of retrieved articles/reviews, targeted author searches and forward citation searching. We will also use our extensive professional networks, and our planned consultations with key stakeholders and our study steering committee. Databases will be searched from inception to the time of search.

REVIEW STRATEGY: Inclusion criteria:
1) Infants, children or adolescents who have experienced maltreatment between the ages of 0 and 17 years.
2) All psychosocial interventions available for maltreated children and adolescents, by any provider and in any setting, aiming to address the sequelae of any form of maltreatment, including fabricated illness.
3) For the synthesis of evidence of effectiveness: all controlled studies in which psychosocial interventions are compared with no-treatment, treatment-as-usual, waitlist or other-treated controls. For the synthesis of evidence of acceptability, we will include any design that asks participants for their views or provides data on non-participation. For decision-analytic modelling we may include uncontrolled studies.

Primary and secondary outcomes will be confirmed in consultation with stakeholders. Provisional primary outcomes are i) psychological distress/mental health (particularly PTSD, depression, anxiety and self-harm); ii) behaviour; iii) social functioning; iv) cognitive/academic attainment; v) quality of life; and vi) costs. After studies that meet the inclusion criteria have been identified (independently by two reviewers), data will be extracted and risk of bias (RoB) assessed (independently by two reviewers) using the Cochrane Collaboration RoB Tool (effectiveness), quality hierarchies of data sources for economic analyses (cost-effectiveness) and the CASP tool for qualitative research (acceptability). Where interventions are similar and appropriate data are available (or can be obtained), evidence synthesis will be performed to pool the results. Where possible, we will explore the extent to which age, maltreatment history (including whether intra- or extra-familial), time since maltreatment, care setting (family / out-of-home care, including foster care and residential care), care history, and characteristics of the intervention (type, setting, provider, duration) moderate the effects of psychosocial interventions. A synthesis of acceptability data will be undertaken using a narrative approach. A decision-analytic model will be constructed to compare the expected cost-effectiveness of the different types of intervention identified in the systematic review. We will also conduct a value-of-information analysis if the data permit.

EXPECTED OUTPUTS: A synthesis of the effectiveness and cost-effectiveness of psychosocial interventions for maltreated children (taking into account age, maltreatment profile and setting) and their acceptability to key stakeholders.

Relevance: 30.00%

Abstract:

Muddy floods occur when rainfall generates runoff on agricultural land, detaching and transporting sediment into the surrounding natural and built environment. In the Belgian Loess Belt, muddy floods occur regularly and lead to considerable economic costs associated with damage to property and infrastructure. Mitigation measures designed to manage the problem have been tested in a pilot area within Flanders and were found to be cost-effective within three years. This study assesses whether these mitigation measures will remain effective under a changing climate. To test this, the Water Erosion Prediction Project (WEPP) model was used to examine muddy flooding diagnostics (precipitation, runoff, soil loss and sediment yield) for a case study hillslope in Flanders where grass buffer strips are currently used as a mitigation measure. The model was run for present-day conditions and then under 33 future site-specific climate scenarios. These future scenarios were generated from three earth system models driven by four representative concentration pathways and downscaled using quantile mapping and the weather generator CLIGEN. Results reveal that under the majority of future scenarios, muddy flooding diagnostics are projected to increase, mostly as a consequence of large-scale precipitation events rather than mean changes. The magnitude of muddy flood events for a given return period is also generally projected to increase. These findings indicate that present-day mitigation measures may have a reduced capacity to manage muddy flooding given the changes imposed by a warming climate with an enhanced hydrological cycle. Revisions to the design of existing mitigation measures within existing policy frameworks are considered the most effective way to account for the impacts of climate change in future mitigation planning.
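
The quantile-mapping step named above can be sketched in a few lines of empirical bias correction; the gamma-distributed series below are synthetic stand-ins for observed and modelled daily precipitation, not the study's data:

```python
import numpy as np

# Empirical quantile mapping: each raw model value is replaced by the
# observed value at the same cumulative probability,
# x -> F_obs^{-1}(F_model(x)), via a piecewise-linear transfer function.

rng = np.random.default_rng(42)
obs = rng.gamma(shape=0.6, scale=8.0, size=5000)        # "observed" wet-day rain [mm]
mod_hist = rng.gamma(shape=0.9, scale=5.0, size=5000)   # biased model, historical
mod_fut = rng.gamma(shape=0.9, scale=6.0, size=5000)    # biased model, future

def quantile_map(x, model_ref, observed):
    """Map x through the historical model CDF onto the observed CDF."""
    q = np.linspace(0.0, 1.0, 1001)
    model_q = np.quantile(model_ref, q)
    obs_q = np.quantile(observed, q)
    return np.interp(x, model_q, obs_q)

corrected = quantile_map(mod_fut, mod_hist, obs)
print(f"raw future mean:       {mod_fut.mean():.2f} mm")
print(f"corrected future mean: {corrected.mean():.2f} mm")
print(f"observed mean:         {obs.mean():.2f} mm")
```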