951 results for dynamic response parameters


Relevance:

40.00%

Publisher:

Abstract:

The ability to measure ocular surface temperature (OST) with thermal imaging offers potential insight into ocular physiology that has been acknowledged in the literature. The TH7102MX thermo-camera (NEC San-ei, Japan) continuously records dynamic information about OST without sacrificing spatial resolution. Using purpose-designed image analysis software, it was possible to select and quantify the principal components of absolute temperature values and the magnitude and rate of temperature change that followed blinking. The technique was examined for repeatability, reproducibility and the effects of extrinsic factors, and a suitable experimental protocol was thus developed. The precise source of the measured thermal radiation has previously been subject to dispute: in this thesis, the results of a study examining the relationships between physical parameters of the anterior eye and OST confirmed a principal role for the tear film in OST. The dynamic changes in OST were studied in a large group of young subjects: quantifying the post-blink changes in temperature with time also established a role for tear flow dynamics in OST. Using dynamic thermography, the effects of hydrogel contact lens wear on OST were investigated using a model eye for in vitro work, and both neophyte and adapted contact lens wearers for in vivo studies. Significantly greater OST was observed in contact lens wearers, particularly with silicone hydrogel lenses compared to etafilcon A, and it tended to be greatest when lenses had been worn continuously. This finding is important to understanding the ocular response to contact lens wear. In a group of normal subjects, dynamic thermography appeared to measure the ocular response to the application of artificial tear drops: this may prove to be a significant research and clinical tool.
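
The post-blink analysis described above reduces each inter-blink temperature trace to a magnitude and a rate of change. The sketch below shows one minimal way to extract those two quantities from a sampled OST time series; the function name, the one-second fitting window and the synthetic cooling curve are illustrative assumptions, not the thesis software.

```python
import numpy as np

def post_blink_change(t, temp, initial_window=1.0):
    """Quantify the post-blink OST change from a sampled time series.

    t    : time in seconds, t[0] = moment the eye reopens after a blink
    temp : ocular surface temperature in deg C at each sample
    Returns (total_change, initial_rate): the overall temperature change over
    the inter-blink interval and the initial rate of change (deg C/s) from a
    straight-line fit over the first `initial_window` seconds.
    """
    t = np.asarray(t, dtype=float)
    temp = np.asarray(temp, dtype=float)
    total_change = temp[-1] - temp[0]              # magnitude of post-blink change
    mask = t - t[0] <= initial_window
    slope, _ = np.polyfit(t[mask], temp[mask], 1)  # deg C per second
    return total_change, slope

# Synthetic example: OST cooling by ~0.4 deg C over a 5 s inter-blink interval.
t = np.linspace(0.0, 5.0, 126)                     # ~25 Hz sampling
temp = 36.0 - 0.4 * (1.0 - np.exp(-t / 1.5))
print(post_blink_change(t, temp))
```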

Relevance:

40.00%

Publisher:

Abstract:

This thesis demonstrates that the use of finite elements need not be confined to space alone, but that they may also be used in the time domain. It is shown that finite element methods may be used successfully to obtain the response of systems to applied forces, including, for example, the accelerations in a tall structure subjected to an earthquake shock. It is further demonstrated that at least one of these methods may be considered a practical alternative to more usual methods of solution. A detailed investigation of the accuracy and stability of finite element solutions is included, and methods of application to both single- and multi-degree-of-freedom systems are described. Solutions using two different temporal finite elements are compared with those obtained by conventional methods, and a comparison of computation times for the different methods is given. The application of finite element methods to distributed systems is described, using both separate discretizations in space and time and a combined space-time discretization. The inclusion of both viscous and hysteretic damping is shown to add little to the difficulty of the solution. Temporal finite elements are also seen to be of considerable interest when applied to non-linear systems, both when the system parameters are time-dependent and when they are functions of displacement. Solutions are given for many different examples, and the computer programs used for the finite element methods are included in an Appendix.
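
As a point of reference for the comparisons mentioned above, the sketch below implements one of the "conventional methods" against which temporal finite elements are typically benchmarked: the constant-average-acceleration Newmark scheme for a single-degree-of-freedom system m*u'' + c*u' + k*u = f(t). It is not the thesis's temporal finite-element formulation, and the numerical values are illustrative.

```python
import numpy as np

def newmark_sdof(m, c, k, f, dt, u0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Response of m*u'' + c*u' + k*u = f(t) by the Newmark average-acceleration method."""
    n = len(f)
    u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (f[0] - c * v0 - k * u0) / m
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    for i in range(n - 1):
        # effective load at step i+1 from the current state
        p_eff = (f[i + 1]
                 + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt) + (0.5 / beta - 1) * a[i])
                 + c * (gamma / (beta * dt) * u[i] + (gamma / beta - 1) * v[i]
                        + dt * (0.5 * gamma / beta - 1) * a[i]))
        u[i + 1] = p_eff / k_eff
        a[i + 1] = (u[i + 1] - u[i]) / (beta * dt**2) - v[i] / (beta * dt) - (0.5 / beta - 1) * a[i]
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a

# Undamped SDOF under a suddenly applied constant force: static deflection f/k = 0.01.
t = np.arange(0.0, 2.0, 0.01)
u, _, _ = newmark_sdof(m=1.0, c=0.0, k=100.0, f=np.full(len(t), 1.0), dt=0.01)
print(u.max())   # close to 2*(f/k) = 0.02 for the undamped step response
```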

Relevance:

40.00%

Publisher:

Abstract:

The architecture and learning algorithm of a self-learning spiking neural network for fuzzy clustering tasks are outlined. Fuzzy receptive neurons for pulse-position transformation of the input data are considered. It is proposed to treat a spiking neural network in terms of the classical automatic control theory apparatus based on the Laplace transform. It is shown that synapse functioning can be easily modeled by a second-order damped response unit. The spiking neuron soma is presented as a threshold detection unit. Thus, the proposed fuzzy spiking neural network is an analog-digital nonlinear pulse-position dynamic system. It is demonstrated how fuzzy probabilistic and possibilistic clustering approaches can be implemented on the basis of the presented spiking neural network.
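
A minimal numerical sketch of the two building blocks named above: a synapse modelled as a critically damped second-order unit (transfer function a^2 / (s + a)^2, an alpha-shaped postsynaptic response) feeding a soma treated as a threshold detection unit. The time constant, weight and threshold are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def spiking_neuron(spike_times, t_end=0.05, dt=1e-4, a=200.0, w=0.01, threshold=1.0):
    """Synapse = critically damped second-order unit; soma = threshold detector."""
    n = int(t_end / dt)
    t = np.arange(n) * dt
    x = np.zeros(n)                      # presynaptic impulse train
    for ts in spike_times:
        x[int(ts / dt)] += 1.0 / dt      # unit-area impulse at each input spike
    y = yd = 0.0                         # synaptic response and its derivative
    above = False
    out = []
    for i in range(n):
        ydd = a * a * (w * x[i] - y) - 2.0 * a * yd   # second-order damped response unit
        yd += ydd * dt                                # semi-implicit Euler step
        y += yd * dt
        if y >= threshold and not above:              # soma: threshold detection
            out.append(round(t[i], 4))
            above = True
        elif y < threshold:
            above = False
    return out

# One input spike leaves the response below threshold; two spikes 5 ms apart sum
# and trigger a single output spike shortly after the second input.
print(spiking_neuron([0.005]), spiking_neuron([0.005, 0.010]))
```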

Relevance:

40.00%

Publisher:

Abstract:

Long-span bridges are flexible and therefore sensitive to wind-induced effects. One way to improve the stability of long-span bridges against flutter is to use cross-sections that involve twin side-by-side decks. However, this can amplify responses due to vortex-induced oscillations. Wind tunnel testing is a well-established practice for evaluating the stability of bridges against wind loads. In order to study the response of the prototype in the laboratory, dynamic similarity requirements should be satisfied. One of the parameters that is normally violated in wind tunnel testing is the Reynolds number. In this dissertation, the effects of Reynolds number on the aerodynamics of a double-deck bridge were evaluated by measuring fluctuating forces on a motionless sectional model of a bridge at different wind speeds representing different Reynolds regimes. The efficacy of vortex mitigation devices was also evaluated at different Reynolds number regimes. Another parameter that is frequently ignored in wind tunnel studies is the correct simulation of turbulence characteristics. Due to the difficulties in simulating flow with a large turbulence length scale on a sectional model, wind tunnel tests are often performed in smooth flow as a conservative approach. The validity of simplifying assumptions in the calculation of buffeting loads, as the direct impact of turbulence, needs to be verified for twin-deck bridges. The effects of turbulence characteristics were investigated by testing sectional models of a twin-deck bridge under two different turbulent flow conditions. Not only do the flow properties play an important role in the aerodynamic response of the bridge, but the geometry of the cross-section is also expected to have significant effects. In this dissertation, the effects of deck details, such as the width of the gap between the twin decks and the traffic barriers, on the aerodynamic characteristics of a twin-deck bridge were investigated, particularly on the vortex shedding forces, with the aim of clarifying how these shape details can alter the wind-induced responses. Finally, a summary of the issues involved in designing a dynamic test rig for high Reynolds number tests is given, using the studied cross-section as an example.
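
To make the Reynolds-number mismatch concrete: Re = U*D/nu, so a geometrically scaled section model tested at practical wind-tunnel speeds operates at a Re that is orders of magnitude below that of the prototype. The sketch below illustrates this with an assumed deck width, scale and wind speeds, not values from the dissertation.

```python
def reynolds_number(wind_speed, length, nu=1.5e-5):
    """Re = U * D / nu, with nu the kinematic viscosity of air (~1.5e-5 m^2/s at 20 C)."""
    return wind_speed * length / nu

# Illustrative (assumed) numbers: a 30 m wide prototype deck at 40 m/s versus a
# 1:50 section model tested at 10 m/s -- the model Re is over two orders of magnitude lower.
print(f"prototype Re ~ {reynolds_number(40.0, 30.0):.2e}")
print(f"model     Re ~ {reynolds_number(10.0, 30.0 / 50.0):.2e}")
```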

Relevance:

40.00%

Publisher:

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC-sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimation of parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
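
For reference, the "conventional conjugate Poisson-Gamma updating" that PEWMA is compared against, together with Monte Carlo sampling of the posterior rate into a basic-event probability, can be sketched as follows; the prior, yearly failure counts and mission time are assumed for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_poisson_update(alpha, beta, failures, exposure):
    """Conjugate Bayesian update of a Poisson failure rate:
    Gamma(alpha, beta) prior + `failures` events in `exposure` hours
    -> Gamma(alpha + failures, beta + exposure) posterior (rate parameterisation)."""
    return alpha + failures, beta + exposure

# Assumed prior and data, for illustration only.
alpha, beta = 1.0, 1000.0                               # prior mean rate = 1e-3 per hour
for n, T in [(2, 8760.0), (0, 8760.0), (1, 8760.0)]:    # yearly failure counts
    alpha, beta = gamma_poisson_update(alpha, beta, n, T)
    print(f"posterior mean rate = {alpha / beta:.2e} /h")

# Monte Carlo propagation of the posterior rate into a basic-event probability
# (probability of at least one failure over an assumed 720 h mission time).
lam = rng.gamma(shape=alpha, scale=1.0 / beta, size=100_000)
p_event = 1.0 - np.exp(-lam * 720.0)
print(f"basic-event probability: mean {p_event.mean():.3f}")
```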

Relevance:

40.00%

Publisher:

Abstract:

The authors would like to express their gratitude to their supporters. Drs Jim Cousins, S.R. Uma and Ken Gledhill facilitated this research by providing access to GeoNet seismic data and structural building information. Piotr Omenzetter’s work within the Lloyd’s Register Foundation Centre for Safety and Reliability Engineering at the University of Aberdeen is supported by Lloyd’s Register Foundation. The Foundation helps to protect life and property by supporting engineering-related education, public engagement and the application of research.

Relevance:

40.00%

Publisher:

Abstract:

Background: Non-alcoholic steatohepatitis (NASH) is a chronic liver disease that is capable of progressing to end-stage liver disease, but generally has a benign course. NASH is a growing public health problem with no approved therapy and is projected to be the leading cause of liver transplantation in the United States by 2020. Obesity, non-insulin-dependent diabetes mellitus and hyperlipidaemia are the most common associations of the disease. The global prevalence of NASH is 10-24% in the general population but increases to 25-75% in obese diabetic individuals. Objective: There is an urgent need for efficient therapeutic options as there is still no approved medication. The aim of this study was to detect changes in biochemical parameters including insulin resistance, cytokines, blood lipid profile and liver enzymes following weight loss in patients with non-alcoholic steatohepatitis. Materials and methods: One hundred obese patients with NASH, aged 35-50 years with a body mass index (BMI) of 30 to 35 kg/m2, were included in the study in two subgroups: the first group (A) received moderate aerobic exercise training in addition to a diet regimen, while the second group (B) received no treatment intervention. Results: The mean values of leptin, TNF-α, IL-6, IL-8, Alanine Aminotransferase (ALT), Aspartate Aminotransferase (AST), the Homeostasis Model Assessment of Insulin Resistance index (HOMA-IR), Total Cholesterol (TC), Low-Density Lipoprotein Cholesterol (LDL-c), Triglycerides (TG) and BMI were significantly decreased in group (A), whereas the mean values of adiponectin and High-Density Lipoprotein Cholesterol (HDL-c) were significantly increased; there were no significant changes in group (B). Also, there was a significant difference between both groups at the end of the study. Conclusion: Weight loss modulates insulin resistance, adiponectin, leptin, inflammatory cytokine levels and markers of hepatic function in patients with non-alcoholic steatohepatitis.
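
Among the outcome measures above, HOMA-IR has a standard closed-form definition, so a small sketch is given below; the glucose and insulin values are illustrative, not data from the study.

```python
def homa_ir(fasting_glucose_mg_dl, fasting_insulin_uU_ml):
    """HOMA-IR = fasting glucose (mg/dL) x fasting insulin (uU/mL) / 405
    (equivalently, glucose in mmol/L x insulin / 22.5)."""
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405.0

# Illustrative values only (not data from the study).
print(homa_ir(110.0, 15.0))   # ~4.1, a level consistent with insulin resistance
```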

Relevance:

30.00%

Publisher:

Abstract:

Bomb attacks carried out by terrorists, targeting high-occupancy buildings, have become increasingly common in recent times. Large numbers of casualties and extensive property damage result from the overpressure of the blast followed by the failure of structural elements. Understanding the blast response of multi-storey buildings and evaluating their remaining life have therefore become important. The response and damage analysis of single structural components, such as columns or slabs, under explosive loads has been examined in the literature, but studies on the blast response and damage analysis of structural frames in multi-storey buildings are limited, and such studies are necessary for assessing their vulnerability. This paper investigates the blast response and damage evaluation of reinforced concrete (RC) frames, designed for normal gravity loads, in order to evaluate their remaining life. Numerical modelling and analysis were carried out using the explicit finite element software LS-DYNA. The modelling and analysis take into consideration reinforcement details together with material performance under high strain rates. Damage indices for columns are calculated based on their residual and original capacities. Numerical results generated in the study can be used to identify relationships between the blast load parameters and the column damage. The damage index curves will provide a simple means of assessing the damage to a typical multi-storey RC building frame under an external bomb blast.
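
The abstract does not give the exact expression used for the damage indices, but a commonly used residual-capacity-based form is sketched below; the function name and the capacity values are illustrative assumptions.

```python
def damage_index(original_capacity, residual_capacity):
    """Residual-capacity-based damage index: 0 = undamaged, 1 = total loss of capacity.
    A commonly used form, DI = 1 - P_residual / P_original (the paper's exact
    expression is not stated in the abstract)."""
    return 1.0 - residual_capacity / original_capacity

# Illustrative (assumed) axial capacities in kN before and after the blast analysis.
print(damage_index(original_capacity=5000.0, residual_capacity=3200.0))   # 0.36
```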

Relevance:

30.00%

Publisher:

Abstract:

John Frazer's architectural work is inspired by living and generative processes. Both evolutionary and revolutionary, it explores information ecologies and the dynamics of the spaces between objects. Fuelled by an interest in the cybernetic work of Gordon Pask and Norbert Wiener, and the possibilities of the computer and the "new science" it has facilitated, Frazer and his team of collaborators have conducted a series of experiments that utilize genetic algorithms, cellular automata, emergent behaviour, complexity and feedback loops to create a truly dynamic architecture. Frazer studied at the Architectural Association (AA) in London from 1963 to 1969, and later became unit master of Diploma Unit 11 there. He was subsequently Director of Computer-Aided Design at the University of Ulster - a post he held while writing An Evolutionary Architecture in 1995 - and a lecturer at the University of Cambridge. In 1983 he co-founded Autographics Software Ltd, which pioneered microprocessor graphics. Frazer was awarded a personal chair at the University of Ulster in 1984. In Frazer's hands, architecture becomes machine-readable, formally open-ended and responsive. His work as computer consultant to Cedric Price's Generator Project of 1976 (see p84) led to the development of a series of tools and processes; these have resulted in projects such as the Calbuild Kit (1985) and the Universal Constructor (1990). These subsequent computer-orientated architectural machines are makers of architectural form beyond the full control of the architect-programmer. Frazer makes much reference to the multi-celled relationships found in nature, and their ongoing morphosis in response to continually changing contextual criteria. He defines the elements that describe his evolutionary architectural model thus: "A genetic code script, rules for the development of the code, mapping of the code to a virtual model, the nature of the environment for the development of the model and, most importantly, the criteria for selection." In setting out these parameters for designing evolutionary architectures, Frazer goes beyond the usual notions of architectural beauty and aesthetics. Nevertheless his work is not without an aesthetic: some pieces are a frenzy of mad wire, while others have a modularity that is reminiscent of biological form. Algorithms form the basis of Frazer's designs. These algorithms determine a variety of formal results dependent on the nature of the information they are given. His work, therefore, is always dynamic, always evolving and always different. Designing with algorithms is also critical to other architects featured in this book, such as Marcos Novak (see p150). Frazer has made an unparalleled contribution to defining architectural possibilities for the twenty-first century, and remains an inspiration to architects seeking to create responsive environments. Architects were initially slow to pick up on the opportunities that the computer provides. These opportunities are both representational and spatial: computers can help architects draw buildings and, more importantly, they can help architects create varied spaces, both virtual and actual. Frazer's work was groundbreaking in this respect, and well before its time.
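
Purely as a schematic illustration of the five elements Frazer lists (a code script, rules for its development, a mapping to a virtual model, an environment and selection criteria), the toy evolutionary loop below evolves a bit-string "code script" into a small modular form. It is an assumed teaching sketch, not Frazer's software or any project named above.

```python
import random

random.seed(1)

def decode(genome):                      # code script -> "virtual model" (module heights)
    return [1 + g for g in genome]

def fitness(model, target=12):           # "environment": prefer a given total built volume
    return -abs(sum(model) - target)

def evolve(pop_size=20, genome_len=8, generations=30):
    pop = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(decode(g)), reverse=True)
        parents = pop[: pop_size // 2]                     # selection criteria
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)
            child = a[:cut] + b[cut:]                      # rules for developing the code
            if random.random() < 0.1:                      # mutation
                i = random.randrange(genome_len)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    best = max(pop, key=lambda g: fitness(decode(g)))
    return decode(best), fitness(decode(best))

print(evolve())   # the evolved model's module heights and its (negated) error
```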

Relevance:

30.00%

Publisher:

Abstract:

We investigate the potential for the third-order aberrations coma and trefoil to provide a signed cue to accommodation. It is first demonstrated theoretically (with some assumptions) that the point spread function is insensitive to the sign of spherical defocus in the presence of odd-order aberrations. In an experimental investigation, the accommodation response to a sinusoidal change in vergence (1–3 D, 0.2 Hz) of a monochromatic stimulus was obtained with a dynamic infrared optometer. Measurements were obtained in 10 young visually normal individuals with and without custom contact lenses that induced low and high values of r.m.s. trefoil (0.25, 1.03 μm) and coma (0.34, 0.94 μm). Despite variation between subjects, we did not find any statistically significant increase or decrease in the accommodative gain for low levels of trefoil and coma, although effects approached or reached significance for the high levels of trefoil and coma. Theoretical and experimental results indicate that the presence of Zernike third-order aberrations on the eye does not seem to play a crucial role in the dynamics of the accommodation response.
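
The accommodative gain reported above is the ratio of the response amplitude to the stimulus amplitude at the 0.2 Hz driving frequency. A minimal sketch of extracting that gain from an optometer trace by a least-squares sinusoid fit is shown below; the synthetic response (0.7 D amplitude with a phase lag and noise) is an illustrative assumption.

```python
import numpy as np

def accommodative_gain(t, response, stim_amplitude=1.0, freq=0.2):
    """Gain of the accommodation response to a sinusoidal vergence stimulus:
    amplitude of the best-fit sinusoid at the stimulus frequency (here 0.2 Hz)
    divided by the stimulus amplitude (here 1 D, i.e. a 1-3 D swing about 2 D)."""
    w = 2.0 * np.pi * freq
    A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    coeffs, *_ = np.linalg.lstsq(A, response, rcond=None)
    response_amplitude = np.hypot(coeffs[0], coeffs[1])
    return response_amplitude / stim_amplitude

# Synthetic example: a response with 0.7 D amplitude, a phase lag and noise -> gain ~0.7.
t = np.linspace(0.0, 30.0, 1500)
resp = 2.0 + 0.7 * np.sin(2 * np.pi * 0.2 * t - 0.4) \
       + 0.05 * np.random.default_rng(0).normal(size=t.size)
print(accommodative_gain(t, resp))
```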

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a detailed description of the influence of the critical parameters that govern the vulnerability of columns under lateral impact loads. Numerical simulations are conducted using the finite element program LS-DYNA, incorporating steel reinforcement, material models and strain rate effects. A simplified method based on impact pulses generated from full-scale impact tests is used for impact reconstruction, and the effects of the various pulse loading parameters are investigated under low- to medium-velocity impacts. A constitutive material model that can simulate failure under a tri-axial state of stress is used for the concrete. Confinement effects are also introduced into the numerical simulation, and columns of Grade 30 to 50 concrete under pure axial loading are analysed in detail. This research confirmed that the vulnerability of axially loaded columns can be mitigated by reducing the slenderness ratio and concrete grade, and by choosing the design option with a minimal amount of longitudinal steel. Additionally, it is evident that approximately a 50% increase in impact capacity can be gained for columns in medium-rise buildings by enhancing the confinement effects alone. Results also indicated that the ductility as well as the mode of failure under impact can be changed with the volumetric ratio of lateral steel. Moreover, to increase the impact capacity of the vulnerable columns, a higher confining stress is required. The general provisions of current design codes do not sufficiently cover this aspect, and hence this research provides additional guidelines to overcome the inadequacies of the code provisions.
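
The impact pulse used for the reconstruction above is characterised by a few loading parameters (peak force, rise time, duration). The paper's actual pulse shape and values are not given in the abstract; the sketch below builds an idealised triangular pulse from assumed parameters and checks its impulse.

```python
import numpy as np

def triangular_impact_pulse(t, peak_force, rise_time, duration):
    """Idealised triangular impact-force pulse defined by its peak force, rise time
    and total duration (illustrative assumptions, not the paper's pulse parameters)."""
    t = np.asarray(t, dtype=float)
    rising = peak_force * t / rise_time
    falling = peak_force * (duration - t) / (duration - rise_time)
    pulse = np.where(t <= rise_time, rising, falling)
    return np.where((t >= 0.0) & (t <= duration), pulse, 0.0)

# 500 kN peak, 5 ms rise, 20 ms duration, sampled over a 50 ms analysis window.
t = np.linspace(0.0, 0.05, 501)
F = triangular_impact_pulse(t, peak_force=500e3, rise_time=0.005, duration=0.02)
impulse = ((F[:-1] + F[1:]) * np.diff(t) / 2.0).sum()   # trapezoidal integral of F(t)
print(impulse)   # ~0.5 * peak * duration = 5000 N.s
```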