164 results for Electricity -- Prices -- Mathematical models.
Abstract:
The release of ultrafine particles (UFP) from laser printers and office equipment was analyzed using a particle counter with high time resolution (FMPS; Fast Mobility Particle Sizer) together with appropriate mathematical models. Measurements were carried out in a 1 m³ chamber, a 24 m³ chamber and an office. The time-dependent emission rates were calculated for these environments using a deconvolution model, after which the total number of emitted particles was calculated. The total numbers of released particles were found to be independent of the environmental parameters and are therefore, in principle, suitable for comparing different printers. On the basis of the time-dependent emission rates, “initial burst” emitters and constant emitters could also be distinguished. For an “initial burst” emitter, the comparison with other devices is generally affected by strong variations between individual measurements. When conducting exposure assessments for UFP in an office, the spatial distribution of the particles also has to be considered. In this work, the spatial distribution was predicted on a case-by-case basis using CFD simulation.
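The deconvolution step described above rests on a well-mixed mass balance for the chamber. As a minimal sketch (not the authors' implementation), assuming a single first-order loss rate that lumps air exchange and deposition, the emission rate can be recovered as S(t) = V·(dC/dt + λ·C) and integrated to give the total emitted particle number; the variable names and the synthetic concentration trace below are illustrative only.

```python
import numpy as np

def emission_rate(times, conc, volume_m3, loss_rate_per_s):
    """Recover the time-dependent emission rate S(t) [particles/s] from measured
    number concentrations C(t) [particles/m^3] in a well-mixed chamber, using
    the mass balance  V*dC/dt = S(t) - loss_rate*V*C."""
    dC_dt = np.gradient(conc, times)            # finite-difference derivative
    return volume_m3 * (dC_dt + loss_rate_per_s * conc)

def total_emitted(times, conc, volume_m3, loss_rate_per_s):
    """Integrate S(t) over the print job to get the total particle number."""
    S = emission_rate(times, conc, volume_m3, loss_rate_per_s)
    return np.trapz(np.clip(S, 0.0, None), times)   # clip noise-driven negatives

# Hypothetical 1 m^3 chamber measurement sampled every second
t = np.arange(0.0, 600.0, 1.0)
C = 1e4 * np.exp(-((t - 120.0) / 90.0) ** 2)        # synthetic concentration trace
N_total = total_emitted(t, C, volume_m3=1.0, loss_rate_per_s=8e-3)
print(f"total emitted particles ~ {N_total:.2e}")
```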
Abstract:
Experiments were undertaken to study the drying kinetics of moist cylindrical food particulates during fluidised bed drying. Cylindrical particles were prepared from green beans with three different length-to-diameter ratios: 3:1, 2:1 and 1:1. A batch fluidised bed dryer connected to a heat pump system was used for the experimentation. The heat pump and fluid bed combination was used to increase overall energy efficiency and achieve higher drying rates. Drying kinetics were evaluated in terms of non-dimensional moisture at three drying temperatures of 30, 40 and 50 °C. Numerous mathematical models can be used to describe drying kinetics, ranging from analytical models with simplified assumptions to empirical models built by regression on experimental data. Empirical models are commonly used for various food materials because of their simpler approach; however, problems with accuracy limit their application. Some limitations of empirical models can be reduced by using semi-empirical models based on the heat and mass transfer of the drying operation. One such method is the quasi-stationary approach. In this study, a modified quasi-stationary approach was used to model the drying kinetics of the cylindrical food particles at the three drying temperatures.
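The abstract does not spell out the modified quasi-stationary model, so the following is only a hedged stand-in showing the general workflow of fitting a thin-layer drying model to non-dimensional moisture data; the Lewis-type form MR(t) = exp(-k·t), the data points and the temperature are all assumed for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def moisture_ratio(t, k):
    """Lewis-type thin-layer drying model: MR(t) = exp(-k*t)."""
    return np.exp(-k * t)

# Hypothetical non-dimensional moisture data at 50 degC (time in minutes)
t_min = np.array([0, 10, 20, 30, 45, 60, 90, 120], dtype=float)
MR    = np.array([1.00, 0.78, 0.61, 0.49, 0.35, 0.26, 0.15, 0.09])

(k_fit,), _ = curve_fit(moisture_ratio, t_min, MR, p0=[0.02])
print(f"fitted drying constant k = {k_fit:.4f} 1/min")
```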
Abstract:
The healing process for bone fractures is sensitive to the mechanical stability and blood supply at the fracture site. Most currently available mechanobiological algorithms of bone healing are based solely on mechanical stimuli, while the explicit analysis of revascularization and its influence on the healing process has not been thoroughly investigated in the literature. In this paper, revascularization was described by two separate processes: angiogenesis and nutrition supply. Mathematical models for angiogenesis and nutrition supply have been proposed and integrated into an existing fuzzy algorithm of fracture healing. The computational algorithm of fracture healing, consisting of stress analysis, analyses of angiogenesis and nutrient supply, and tissue differentiation, has been tested on and compared with animal experimental results published previously. The simulation results showed that, for small and medium-sized fracture gaps, the nutrient supply is sufficient for bone healing, whereas for a large fracture gap, non-union may be induced either by deficient nutrient supply or by inadequate mechanical conditions. The comparisons with experimental results demonstrated that the improved computational algorithm is able to simulate a broad spectrum of fracture healing cases and to predict and explain delayed unions and non-unions induced by large gap sizes and different mechanical conditions. The new algorithm will allow the simulation of more realistic clinical fracture healing cases with various fracture gaps and geometries, and may be helpful for optimising implants and methods of fracture fixation.
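As a rough illustration of why nutrient supply becomes limiting in large gaps, the sketch below solves a one-dimensional diffusion-consumption balance across the fracture gap, with the vascularized bone ends held at full nutrient concentration. The diffusivity, consumption rate and gap sizes are invented for illustration and are not the parameters of the paper's fuzzy algorithm.

```python
import numpy as np

def nutrient_profile(gap_mm, D=0.1, consumption=0.05, dx=0.1, steps=5000):
    """Approximate steady nutrient concentration across a fracture gap (1-D sketch).
    Nutrients diffuse in from the vascularized bone ends (c = 1 at both
    boundaries) and are consumed by cells inside the gap."""
    n = int(gap_mm / dx) + 1
    c = np.zeros(n)
    dt = 0.2 * dx * dx / D                          # stable explicit time step
    for _ in range(steps):
        c[0], c[-1] = 1.0, 1.0                      # vascularized boundaries
        lap = (c[:-2] - 2.0 * c[1:-1] + c[2:]) / dx**2
        c[1:-1] += dt * (D * lap - consumption * c[1:-1])
    return c

for gap in (1.0, 3.0, 6.0):                         # small, medium, large gap (mm)
    print(f"gap {gap} mm: minimum nutrient level = {nutrient_profile(gap).min():.2f}")
```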
Abstract:
In sport and exercise biomechanics, forward dynamics analyses or simulations have frequently been used in attempts to establish optimal techniques for the performance of a wide range of motor activities. However, the accuracy and validity of these simulations are largely dependent on the complexity of the mathematical model used to represent the neuromusculoskeletal system. It could be argued that complex mathematical models are superior to simple mathematical models, as they enable basic mechanical insights to be made and individual-specific optimal movement solutions to be identified. Contrary to some claims in the literature, however, we suggest that it is currently not possible to identify the complete optimal solution for a given motor activity. For a complete optimization of human motion, dynamical systems theory implies that mathematical models must incorporate a much wider range of organismic, environmental and task constraints. These ideas encapsulate why sports medicine specialists need to adopt more individualized clinical assessment procedures when interpreting why performers' movement patterns may differ.
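For readers unfamiliar with the term, a forward dynamics analysis integrates the equations of motion forward in time from prescribed joint torques or muscle excitations. The sketch below uses a deliberately simple single-segment, torque-driven pendulum model, far cruder than the neuromusculoskeletal models discussed above, purely to make the idea concrete; the segment parameters and torque profile are assumptions.

```python
import numpy as np

def simulate_swing(torque_profile, I=0.35, m=3.5, l_com=0.25, dt=1e-3, t_end=0.5):
    """Forward dynamics of a single rigid segment (e.g. a shank) swinging about
    a fixed joint:  I*theta_ddot = tau(t) - m*g*l_com*sin(theta)."""
    g = 9.81
    theta, omega, t = 0.0, 0.0, 0.0
    while t < t_end:
        tau = torque_profile(t)
        alpha = (tau - m * g * l_com * np.sin(theta)) / I
        omega += alpha * dt                     # semi-implicit Euler integration
        theta += omega * dt
        t += dt
    return theta, omega

# Constant 10 N*m joint torque as a stand-in "technique"
final_angle, final_velocity = simulate_swing(lambda t: 10.0)
print(f"final angle = {np.degrees(final_angle):.1f} deg, "
      f"angular velocity = {final_velocity:.2f} rad/s")
```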
Abstract:
Modern Engineering Asset Management (EAM) requires the accurate assessment of current asset health and the prediction of future asset health condition. Appropriate mathematical models that are capable of estimating times to failure and the probability of failure in the future are essential in EAM. In most real-life situations, the lifetime of an engineering asset is influenced and/or indicated by different factors, termed covariates. Hazard prediction with covariates is an elemental notion in reliability theory: it estimates the tendency of an engineering asset to fail instantaneously beyond the current time, given that it has survived up to the current time. A number of statistical covariate-based hazard models have been developed; however, none of them explicitly incorporates both external and internal covariates into one model. This paper introduces a novel covariate-based hazard model to address this concern. The model is named the Explicit Hazard Model (EHM). Both the semi-parametric and the non-parametric forms of this model are presented in the paper. The major purpose of this paper is to illustrate the theoretical development of EHM. Due to page limitation, a case study with reliability field data is presented in the applications part of this study.
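The exact form of EHM is not given here, so the sketch below falls back on the familiar proportional-hazards structure h(t|z) = h₀(t)·exp(β·z), with separate coefficient vectors for external and internal covariates, to illustrate how a covariate-based hazard and the corresponding survival probability are evaluated. The Weibull baseline, covariate values and coefficients are all hypothetical.

```python
import numpy as np

def hazard(t, external_z, internal_z, beta_ext, beta_int, shape=1.5, scale=2000.0):
    """Illustrative covariate-based hazard: Weibull baseline scaled by exponential
    terms for external and internal covariates (EHM combines both kinds of
    covariates; its exact functional form differs)."""
    baseline = (shape / scale) * (t / scale) ** (shape - 1.0)
    return baseline * np.exp(np.dot(beta_ext, external_z) + np.dot(beta_int, internal_z))

def survival(t, *args, **kwargs):
    """S(t) = exp(-integral of the hazard from 0 to t), by numerical quadrature."""
    ts = np.linspace(1e-6, t, 2000)
    return np.exp(-np.trapz(hazard(ts, *args, **kwargs), ts))

# Hypothetical covariates: ambient load (external) and measured wear (internal)
z_ext, z_int = np.array([0.8]), np.array([1.2])
b_ext, b_int = np.array([0.4]), np.array([0.9])
print(f"P(survive 1500 h) = {survival(1500.0, z_ext, z_int, b_ext, b_int):.3f}")
```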
Abstract:
The driving task requires sustained attention over prolonged periods and can be performed in highly predictable or repetitive environments. Such conditions can create drowsiness or hypovigilance and impair the ability to react to critical events. Identifying vigilance decrement in monotonous conditions has been a major subject of research, but no research to date has attempted to predict this vigilance decrement. This pilot study aims to show that vigilance decrements due to monotonous tasks can be predicted through mathematical modelling. A short vigilance task sensitive to brief lapses of vigilance, the Sustained Attention to Response Task (SART), is used to assess participants’ performance. This task models the driver’s ability to cope with unpredicted events by performing the expected action. A Hidden Markov Model (HMM) is proposed to predict participants’ hypovigilance. The driver’s vigilance evolution is modelled as a hidden state and is correlated with an observable variable: the participant’s reaction times. The experiment shows that the monotony of the task can lead to a significant vigilance decline in less than five minutes. This impairment can be predicted four minutes in advance with 86% accuracy using HMMs. The experiment showed that mathematical models such as HMMs can efficiently predict hypovigilance through surrogate measures. The presented model could lead to the development of an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, thus offering the potential to enhance road safety and prevent road crashes.
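As an illustration of the modelling idea (not the study's fitted model), the sketch below filters a two-state HMM, vigilant versus hypovigilant, with Gaussian reaction-time emissions, producing at each trial the probability that the participant has drifted into hypovigilance. All transition and emission parameters, and the reaction-time sequence, are invented.

```python
import numpy as np

# Two hidden states: 0 = vigilant, 1 = hypovigilant (illustrative parameters)
A   = np.array([[0.95, 0.05],      # state transition probabilities
                [0.10, 0.90]])
pi  = np.array([0.9, 0.1])         # initial state distribution
mu  = np.array([0.30, 0.55])       # mean reaction time per state (s)
sig = np.array([0.05, 0.12])       # reaction-time std dev per state (s)

def gaussian(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def filter_vigilance(reaction_times):
    """Forward (filtering) pass: P(hidden state | observations so far)."""
    belief = pi * gaussian(reaction_times[0], mu, sig)
    belief /= belief.sum()
    history = [belief]
    for rt in reaction_times[1:]:
        belief = (belief @ A) * gaussian(rt, mu, sig)
        belief /= belief.sum()
        history.append(belief)
    return np.array(history)

# Hypothetical SART reaction times drifting upward as the task grows monotonous
rts = np.array([0.29, 0.31, 0.30, 0.34, 0.42, 0.50, 0.58, 0.61])
posterior = filter_vigilance(rts)
print("P(hypovigilant) over time:", np.round(posterior[:, 1], 2))
```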
Abstract:
Modern computer graphics systems are able to construct renderings of such high quality that viewers are deceived into regarding the images as coming from a photographic source. Large amounts of computing resources are expended in this rendering process, using complex mathematical models of lighting and shading. However, psychophysical experiments have revealed that viewers attend only to certain informative regions within a presented image. Furthermore, it has been shown that these visually important regions contain low-level visual feature differences that attract the attention of the viewer. This thesis presents a new approach to image synthesis that exploits these experimental findings by modulating the spatial quality of image regions according to their visual importance. Efficiency gains are therefore reaped without sacrificing much of the perceived quality of the image. Two tasks must be undertaken to achieve this goal: firstly, the design of an appropriate region-based model of visual importance, and secondly, the modification of progressive rendering techniques to effect an importance-based rendering approach. A rule-based fuzzy logic model is presented that computes, using spatial feature differences, the relative visual importance of regions in an image. This model improves upon previous work by incorporating threshold effects induced by global feature difference distributions and by using texture concentration measures. A modified approach to progressive ray tracing is also presented. This new approach uses the visual importance model to guide the progressive refinement of an image. In addition, the concept of visual importance has been incorporated into supersampling, texture mapping and computer animation techniques. Experimental results are presented, illustrating the efficiency gains obtained from this method of progressive rendering. This visual importance-based rendering approach is expected to have applications in the entertainment industry, where image fidelity may be sacrificed for efficiency, as long as the overall visual impression of the scene is maintained. Different aspects of the approach should find many other applications in image compression, image retrieval, progressive data transmission and active robotic vision.
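One way to picture the importance-based refinement step is to allocate the progressive ray-tracing sample budget across image regions in proportion to their visual-importance scores. The sketch below assumes the fuzzy-logic model has already produced a per-region score in [0, 1]; the function name and the scores are illustrative, not the thesis' implementation.

```python
import numpy as np

def allocate_samples(importance, total_samples, min_per_region=1):
    """Distribute a progressive-refinement sample budget across image regions
    in proportion to their visual importance (importance model assumed given)."""
    importance = np.asarray(importance, dtype=float)
    weights = importance / importance.sum()
    return np.maximum(min_per_region,
                      np.floor(weights * total_samples).astype(int))

# Hypothetical importance scores for five image regions (0..1 from fuzzy model)
scores = [0.9, 0.7, 0.3, 0.1, 0.05]
print(allocate_samples(scores, total_samples=10000))
```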
Abstract:
Experiments were undertaken to study the drying kinetics of moist food particulates of different shapes during heat pump assisted fluidised bed drying. Three geometrical shapes, parallelepiped, cylindrical and spherical, were selected from potatoes (aspect ratio = 1:1, 2:1, 3:1), cut beans (length-to-diameter ratio = 1:1, 2:1, 3:1) and peas, respectively. A batch fluidised bed dryer connected to a heat pump system was used for the experimentation. The heat pump and fluid bed combination was used to increase overall energy efficiency and achieve higher drying rates. Drying kinetics were evaluated in terms of non-dimensional moisture at three drying temperatures of 30, 40 and 50 °C. Owing to the complex hydrodynamics of fluidised beds, drying kinetics are dryer- or material-specific. Numerous mathematical models can be used to describe drying kinetics, ranging from analytical models with simplified assumptions to empirical models built by regression on experimental data. Empirical models are commonly used for various food materials because of their simpler approach; however, problems with accuracy limit their application. Some limitations of empirical models can be reduced by using semi-empirical models based on the heat and mass transfer of the drying operation. One such method is the quasi-stationary approach. In this study, a modified quasi-stationary approach was used to model the drying kinetics of the cylindrical food particles at the three drying temperatures.
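Beyond fitting a drying constant at each temperature (see the sketch after the green-bean abstract above), the temperature dependence itself is often summarised with an Arrhenius-type expression. The sketch below, using invented drying constants for the three temperatures, shows how an apparent activation energy could be extracted; it is not a result from the study.

```python
import numpy as np

# Hypothetical drying constants fitted at the three drying temperatures
T_celsius = np.array([30.0, 40.0, 50.0])
k_per_min = np.array([0.012, 0.019, 0.029])

# Arrhenius form  k = A*exp(-Ea/(R*T))  =>  ln k is linear in 1/T
R = 8.314                                   # gas constant, J/(mol*K)
inv_T = 1.0 / (T_celsius + 273.15)
slope, intercept = np.polyfit(inv_T, np.log(k_per_min), 1)
Ea = -slope * R                             # apparent activation energy, J/mol
print(f"activation energy ~ {Ea / 1000:.1f} kJ/mol, "
      f"pre-exponential factor ~ {np.exp(intercept):.3g} 1/min")
```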