917 results for Inflation (Finance) - Mathematical models
Abstract:
Computer based mathematical models describing the aircraft evacuation process have a vital role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria and in post-mortem accident investigation. As the risk of personal injury and the costs involved in performing large-scale evacuation experiments for the next generation of 'Ultra High Capacity Aircraft' (UHCA) are expected to be high, the development and use of these evacuation modelling tools may become essential if these aircraft are to prove a viable reality. In this paper the capabilities and limitations of the airEXODUS evacuation model are described. Its successful application to the prediction of a recent certification trial, prior to the actual trial taking place, is described. Also described is a newly defined parameter known as OPS, which can be used as a measure of evacuation trial optimality. Finally, the data requirements of aircraft evacuation models are discussed, along with several projects currently underway at the University of Greenwich designed to obtain this data. Included in this discussion is a description of the AASK (Aircraft Accident Statistics and Knowledge) database, which contains detailed information from aircraft accident survivors.
Abstract:
Computer based mathematical models describing the aircraft evacuation process have a vital role to play in aviation safety. However, such models have a heavy dependency on real evacuation data. The Fire Safety Engineering Group of the University of Greenwich is undertaking a large data extraction exercise in order to address this issue. This paper describes the extraction and application of data from aviation accident reports. To aid in the storage and analysis of the raw data, a computer database known as AASK (Aircraft Accident Statistics and Knowledge) is under development. AASK is being developed to store human observational and anecdotal data contained in accident reports and interview transcripts. AASK currently contains information from 25 survivable aviation accidents covering the period 04/04/77 to 06/08/95, involving some 2415 passengers, 2210 survivors, 205 fatalities and accounts from 669 people. Copyright © 1999 John Wiley & Sons, Ltd.
Abstract:
High pollution levels have often been observed in urban street canyons due to increased traffic emissions and reduced natural ventilation. Microscale dispersion models with different levels of complexity may be used to assess urban air quality and support decision-making for pollution control strategies and traffic planning. Mathematical models calculate pollutant concentrations by solving either analytically a simplified set of parametric equations or numerically a set of differential equations that describe in detail wind flow and pollutant dispersion. Street canyon models, which might also include simplified photochemistry and particle deposition–resuspension algorithms, are often nested within larger-scale urban dispersion codes. Reduced-scale physical models in wind tunnels may also be used for investigating atmospheric processes within urban canyons and validating mathematical models. A range of monitoring techniques is used to measure pollutant concentrations in urban streets. Point measurement methods (continuous monitoring, passive and active pre-concentration sampling, grab sampling) are available for gaseous pollutants. A number of sampling techniques (mainly based on filtration and impaction) can be used to obtain the mass concentration, size distribution and chemical composition of particles. A combination of different sampling/monitoring techniques is often adopted in experimental studies. Relatively simple mathematical models have usually been used in association with field measurements to obtain and interpret time series of pollutant concentrations at a limited number of receptor locations in street canyons. On the other hand, advanced numerical codes have often been applied in combination with wind tunnel and/or field data to simulate small-scale dispersion within the urban canopy.
Abstract:
Computer based mathematical models describing the aircraft evacuation process have a vital role to play in the design and development of safer aircraft, in the implementation of safer and more rigorous certification criteria, in cabin crew training and in post-mortem accident investigation. As the risk of personal injury and the costs involved in performing full-scale certification trials are high, the development and use of these evacuation modelling tools are essential. Furthermore, evacuation models provide insight into the evacuation process that is impossible to derive from a single certification trial. The airEXODUS evacuation model has been under development since 1989 with support from the UK CAA and the aviation industry. In addition to describing the capabilities of the airEXODUS evacuation model, this paper describes the findings of a recent CAA project aimed at investigating model accuracy in predicting past certification trials. Furthermore, airEXODUS is used to examine issues related to Blended Wing Body (BWB) and Very Large Transport Aircraft (VLTA). These radical new aircraft concepts pose considerable challenges to designers, operators and certification authorities. BWB concepts involving one or two decks, with possibly four or more aisles, offer even greater challenges. Can the largest exits currently available cope with the passenger flow arising from four or five aisles? Do we need to consider new concepts in exit design? Should the main aisle be made wider to accommodate more passengers? In this paper we discuss various evacuation issues associated with VLTA and BWB aircraft and demonstrate how computer based evacuation models can be used to investigate these issues through an examination of aisle/exit configurations for BWB cabin layouts.
Abstract:
We consider single machine scheduling and due date assignment problems in which the processing time of a job depends on its position in a processing sequence. The objective functions include the cost of changing the due dates, the total cost of discarded jobs that cannot be completed by their due dates and, possibly, the total earliness of the scheduled jobs. We present polynomial-time dynamic programming algorithms in the case of two popular due date assignment methods: CON and SLK. The considered problems are related to mathematical models of cooperation between the manufacturer and the customer in supply chain scheduling.
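For orientation, the two due date assignment rules named here are conventionally defined as follows in the scheduling literature (standard forms, not taken from the paper itself):

d_j = d for every job j (CON: a common, constant due date d chosen by the scheduler),
d_j = p_j + q for every job j (SLK: a common slack allowance q added to the processing time p_j of job j).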
Abstract:
Dosators and other dosing mechanisms operating on generally similar principles are very widely used in the pharmaceutical industry for capsule filling, and for dosing products that are delivered to the customer in powder form, such as inhalers. This is a trend that is set to increase. However, a significant problem for this technology is being able to predict, prior to manufacture, how accurately and reliably new drug formulations will be dosed from these machines. This paper presents a review of the literature relating to powder dosators, covering mathematical models for predicting dosator performance and the effects of dosator geometry and machine settings on the accuracy of the dose weight. An overview of a model based on classical powder mechanics theory that has been developed at The University of Greenwich is presented. The model uses inputs from a range of powder characterisation tests, including wall friction, bulk density, stress ratio and permeability. To validate the model, it is anticipated that it will be trialled for a range of powders alongside a single-shot dosator test rig.
Abstract:
Theoretical and experimental studies of cross correlation techniques applied to non-restrictive velocity measurement of pneumatically conveyed solids using ring-shaped electrodynamic flow sensors are presented. In-depth studies of the electrodynamic sensing mechanism, and also of the spatial sensitivity and spatial filtering properties of the sensor are included, together with their relationships to measurement accuracy and the effects of solids' velocity profiles. The experimental evaluation of a 53 mm bore sensing head is described, including trials using a calibrated pneumatic conveyor circulating pulverized fuel and cement. Comparisons of test results with the mathematical models of the sensor are used to identify important aspects of the instrument design. Off-line test results obtained using gravity-fed solids flow show that the system repeatability is within ±0.5% over the velocity range of 2–4 m/s for volumetric concentrations of solids no greater than 0.2%. Results obtained in the pilot-plant trials demonstrate that the system is capable of achieving repeatability better than ±2% and linearity within ±2% over the velocity range 20–40 m/s for volumetric concentrations of solids in the range 0.01–0.44%.
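For readers unfamiliar with the measurement principle, cross-correlation velocity metering of this kind conventionally infers the solids velocity from the transit time between two axially separated sensors; in generic notation (not taken from the paper):

R12(τ) = (1/T) ∫ s1(t) s2(t + τ) dt,  v = L / τ*,

where s1 and s2 are the upstream and downstream sensor signals, T is the integration time, τ* is the delay at which the cross-correlation R12 peaks and L is the axial spacing between the sensing electrodes.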
Abstract:
Human activity causes ocean acidification (OA) through the dissolution of anthropogenically generated CO2 into seawater, and eutrophication through the addition of inorganic nutrients. Eutrophication increases the phytoplankton biomass that can be supported during a bloom, and the resultant uptake of dissolved inorganic carbon during photosynthesis increases water-column pH (bloom-induced basification). This increased pH can adversely affect plankton growth. With OA, basification commences at a lower pH. Using experimental analyses of the growth of three contrasting phytoplankton under different pH scenarios, coupled with mathematical models describing growth and death as functions of pH and nutrient status, we show how different pH conditions modify the scope for competitive interactions between phytoplankton species. We then use the models previously configured against experimental data to explore how the commencement of bloom-induced basification at lower pH with OA, operating against a background of changing patterns in nutrient loads, may modify phytoplankton growth and competition. We conclude that OA and changed nutrient supply into shelf seas with eutrophication or de-eutrophication (the latter owing to pollution control) have clear scope to alter phytoplankton succession, thus affecting future trophic dynamics and impacting both biogeochemical cycling and fisheries.
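Purely as an illustration of the class of model described (the authors' specific formulation is not reproduced here), phytoplankton growth models of this kind often scale a maximum growth rate by nutrient- and pH-dependent factors, e.g.

μ = μ_max · S / (K_S + S) · f(pH),  0 ≤ f(pH) ≤ 1,

where S is the limiting nutrient concentration, K_S a half-saturation constant, and f(pH) a factor that declines as pH rises towards the basified limit.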
Abstract:
This paper provides an overview of the basic theory underlying 1D unsteady gas dynamics, the computational method developed at Queen’s University Belfast (QUB), the use of CFD as an alternative and some experimental results that demonstrate the techniques used to develop the mathematical models.
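For context, the governing equations of 1D unsteady compressible flow that such methods solve can be written in conservation form (standard Euler equations, stated here for reference rather than quoted from the paper):

∂U/∂t + ∂F(U)/∂x = S(U),  U = (ρ, ρu, ρE)ᵀ,  F(U) = (ρu, ρu² + p, u(ρE + p))ᵀ,

where ρ is density, u velocity, p pressure and E specific total energy; in duct-flow codes the source term S typically accounts for area variation, wall friction and heat transfer.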
Abstract:
Artifact removal from physiological signals is an essential component of the biosignal processing pipeline. The need for powerful and robust methods for this process has become particularly acute as healthcare technology deployment undergoes transition from the current hospital-centric setting toward a wearable and ubiquitous monitoring environment. Currently, determining the relative efficacy and performance of the multiple artifact removal techniques available on real-world data can be problematic, due to incomplete information on the uncorrupted desired signal. The majority of techniques are presently evaluated using simulated data, and therefore, the quality of the conclusions is contingent on the fidelity of the model used. Consequently, in the biomedical signal processing community, there is considerable focus on the generation and validation of appropriate signal models for use in artifact suppression. Most approaches rely on mathematical models which capture suitable approximations to the signal dynamics or underlying physiology and, therefore, introduce some uncertainty to subsequent predictions of algorithm performance. This paper describes a more empirical approach to the modeling of the desired signal, demonstrated here for functional brain monitoring tasks, which allows for the procurement of a ground-truth signal that is highly correlated with the true desired signal contaminated by artifacts. The availability of this ground truth, together with the corrupted signal, can then aid in determining the efficacy of selected artifact removal techniques. A number of commonly implemented artifact removal techniques were evaluated using the described methodology to validate the proposed novel test platform. © 2012 IEEE.
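As a purely illustrative example of how such a ground truth can be exploited (the specific figures of merit used in the paper are not reproduced here), a common generic metric is the signal-to-noise ratio improvement of the cleaned estimate ŝ over the corrupted recording x, both referenced to the ground-truth signal s:

ΔSNR = 10 log10 [ Σ_n (s[n] − x[n])² / Σ_n (s[n] − ŝ[n])² ] dB.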
Abstract:
Purpose
Recent in vitro results have shown significant contributions to cell killing from signaling effects at doses that are typically used in radiation therapy. This study investigates whether these in vitro observations can be reconciled with in vivo knowledge and how signaling may have an impact on future developments in radiation therapy.
Methods and Materials
Prostate cancer treatment plans were generated for a series of 10 patients using 3-dimensional conformal therapy, intensity modulated radiation therapy (IMRT), and volumetric modulated arc therapy techniques. These plans were evaluated using mathematical models of survival following modulated radiation exposures that were developed from in vitro observations and incorporate the effects of intercellular signaling. The impact on dose-volume histograms and mean doses was evaluated by converting these survival levels into "signaling-adjusted doses" for comparison.
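One plausible reading of this conversion, offered here only as an illustrative assumption rather than the authors' stated definition, is that the modelled survival S in a given region is mapped back through a reference survival curve such as the linear-quadratic model, so that the signaling-adjusted dose D_adj is the uniform physical dose yielding the same survival:

exp[ −(α·D_adj + β·D_adj²) ] = S,  solved for D_adj given tissue-specific α and β.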
Results
Inclusion of intercellular communication leads to significant differences between the signaling-adjusted and physical doses across a large volume. Organs in low-dose regions near target volumes see the largest increases, with mean signaling-adjusted bladder doses increasing from 23 to 33 Gy in IMRT plans. By contrast, in high-dose regions, there is a small decrease in signaling-adjusted dose due to reduced contributions from neighboring cells, with planning target volume mean doses falling from 74 to 71 Gy in IMRT. Overall, however, the dose distributions remain broadly similar, and comparisons between the treatment modalities are largely unchanged whether physical or signaling-adjusted dose is compared.
Conclusions
Although incorporating cellular signaling significantly affects cell killing in low-dose regions and suggests a different interpretation for many phenomena, its effect in high-dose regions for typical planning techniques is comparatively small. This indicates that the significant signaling effects observed in vitro are not contradicted by comparison with clinical observations. Future investigations are needed to validate these effects in vivo and to quantify their ranges and potential impact on more advanced radiation therapy techniques.
Abstract:
Parasites play pivotal roles in structuring communities, often via indirect interactions with non-host species. These effects can be density-mediated (through mortality) or trait-mediated (behavioural, physiological and developmental), and may be crucial to population interactions, including biological invasions. For instance, parasitism can alter intraguild predation (IGP) between native and invasive crustaceans, reversing invasion outcomes. Here, we use mathematical models to examine how parasite-induced trait changes influence the population dynamics of hosts that interact via IGP. We show that trait-mediated indirect interactions impart keystone effects, promoting or inhibiting host coexistence. Parasites can thus have strong ecological impacts, even if they have negligible virulence, underscoring the need to consider trait-mediated effects when predicting effects of parasites on community structure in general and biological invasions in particular.
Abstract:
Mathematical modelling has become an essential tool in the design of modern catalytic systems. Emissions legislation is becoming increasingly stringent, and so mathematical models of aftertreatment systems must become more accurate in order to provide confidence that a catalyst will convert pollutants over the required range of conditions.
Automotive catalytic converter models contain several sub-models that represent processes such as mass and heat transfer, and the rates at which the reactions proceed on the surface of the precious metal. Of these sub-models, the prediction of the surface reaction rates is by far the most challenging, due to the complexity of the reaction system and the large number of gas species involved. The reaction rate sub-model uses global reaction kinetics to describe the surface reaction rate of the gas species and is based on the Langmuir-Hinshelwood equation as further developed by Voltz et al. [1]. The reactions can be modelled using the pre-exponential factors and activation energies of the Arrhenius equations and the inhibition terms.
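For reference, the building blocks referred to here take the familiar generic forms (the exact species set and inhibition expressions vary between implementations):

k_i = A_i · exp( −E_a,i / (R·T) )  (Arrhenius rate constant for reaction i),
r_i = k_i · c_a · c_b / G(c, T)  (Langmuir-Hinshelwood global rate with a Voltz-type inhibition term G),

where A_i is the pre-exponential factor, E_a,i the activation energy, R the gas constant, T the temperature and c_a, c_b the concentrations of the reacting species.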
The reaction kinetic parameters of aftertreatment models are found from experimental data, where a measured light-off curve is compared against a predicted curve produced by a mathematical model. The kinetic parameters are usually manually tuned to minimize the error between the measured and predicted data. This process is typically long, laborious and prone to misinterpretation, due to the large number of parameters and the risk of multiple sets of parameters giving acceptable fits. Moreover, the number of coefficients increases greatly with the number of reactions, so as the number of reactions grows, the task of manually tuning the coefficients becomes increasingly challenging.
In the presented work, the authors have developed and implemented a multi-objective genetic algorithm to automatically optimize reaction parameters in AxiSuite®, [2] a commercial aftertreatment model. The genetic algorithm was developed and expanded from the code presented by Michalewicz et al. [3] and was linked to AxiSuite using the Simulink add-on for Matlab.
The default kinetic values stored within the AxiSuite model were used to generate a series of light-off curves under rich conditions for a number of gas species, including CO, NO, C3H8 and C3H6. These light-off curves were used to generate an objective function.
This objective function was used to generate a measure of fit for the kinetic parameters. The multi-objective genetic algorithm was subsequently used to search within specified limits in an attempt to minimize the objective function. In total, the pre-exponential factors and activation energies of ten reactions were simultaneously optimized.
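A typical objective of this kind (an illustrative form; the error measure and weighting actually employed are not specified in the abstract) compares the predicted and measured light-off conversion curves for each species across the temperature ramp:

F_j(θ) = Σ_k [ η_j,meas(T_k) − η_j,pred(T_k; θ) ]²,  one objective per species j,

where θ collects the pre-exponential factors and activation energies being optimized and η_j is the conversion of species j at temperature T_k; the multi-objective genetic algorithm then searches for θ that minimizes all F_j simultaneously.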
The results reported here demonstrate that, given accurate experimental data, the optimization algorithm is successful and robust in defining the correct kinetic parameters of a global kinetic model describing aftertreatment processes.
Abstract:
Gels obtained by complexation of octablock star polyethylene oxide/polypropylene oxide copolymers (Tetronic 90R4) with α-cyclodextrin (α-CD) were evaluated as matrices for drug release. Both molecules are biocompatible, so they can potentially be applied in drug delivery systems. Two different types of matrices of Tetronic 90R4 and α-CD were evaluated: gels and tablets. The gels are capable of gelling in situ and show sustained erosion kinetics in aqueous media. The tablets were prepared by freeze-drying the gels. Using these two different matrices, the release of two model molecules, L-tryptophan (Trp) and the protein bovine serum albumin (BSA), was evaluated. The release profiles of these molecules from gels and tablets show that they are suitable for sustained delivery. Mathematical models were applied to the release curves from the tablets to elucidate the drug delivery mechanism. Good correlations were found for the fittings of the release curves to different equations. The results indicate that the release of Trp from the different tablets is always governed by Fickian diffusion, whereas the release of BSA is governed by a combination of diffusion and tablet erosion.
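Typical equations used for such fittings (listed for orientation; the abstract does not name the specific models applied) include the Higuchi and Korsmeyer-Peppas forms:

M_t / M_∞ = k_H · √t  (Higuchi, purely diffusion-controlled release),
M_t / M_∞ = k · tⁿ  (Korsmeyer-Peppas power law),

where M_t/M_∞ is the fraction of drug released at time t; values of the exponent n at or below the Fickian limit indicate diffusion-controlled release, while larger values indicate an increasing contribution from matrix erosion or relaxation.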