33 results for Full-scale Physical Modelling
Abstract:
The object of this research was to investigate the behaviour of birdcage scaffolding as used in falsework structures, assess the suitability of existing design methods and make recommendations for a set of design rules. Since excessive deflection is as undesirable in a structure as total collapse, the project was divided into two sections. These were to determine the ultimate vertical and horizontal load-carrying capacity and also the deflection characteristics of any falsework. Theoretical analyses were therefore developed to ascertain the ability of both the individual standards to resist vertical load, and of the bracing to resist horizontal load. Furthermore, a model was evolved which would predict the horizontal deflection of a scaffold under load using strain energy methods. These models were checked by three series of experiments. The first was on individual standards under vertical load only. The second series was carried out on full-scale falsework structures loaded vertically and horizontally to failure. Finally, experiments were conducted on scaffold couplers to provide additional verification of the method of predicting deflections. This thesis gives the history of the project and an introduction to the field of scaffolding. It details both the experiments conducted and the theories developed, and the correlation between theory and experiment. Finally, it makes recommendations for a design method to be employed by scaffolding designers.
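As a general illustration of the strain energy approach mentioned above (not the thesis's specific formulation), the horizontal deflection at the point of application of a load P is typically obtained from Castigliano's second theorem, summing the strain energy stored in the scaffold members:

```latex
% Deflection at the point of application of load P, from total strain energy U
\delta = \frac{\partial U}{\partial P},
\qquad
U = \sum_{\text{members}} \int_0^{L} \frac{M(x)^2}{2EI}\,dx
  + \sum_{\text{members}} \int_0^{L} \frac{N(x)^2}{2EA}\,dx
```

Here U is the total elastic strain energy, M and N are the bending moment and axial force distributions, and EI and EA are the flexural and axial rigidities of the members; coupler flexibility can be added as further energy terms.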
Abstract:
In this thesis, details of a proposed method for the elastic-plastic failure load analysis of complete building structures are given. In order to handle the problem, a computer programme in Atlas Autocode is produced. The structures consist of a number of parallel shear walls and intermediate frames connected by floor slabs. The results of an experimental investigation are given to verify the theoretical results and to demonstrate various factors that may influence the behaviour of these structures. Large full-scale practical structures are also analysed by the proposed method, and suggestions are made for achieving design economy as well as for extending research in various aspects of this field. The existing programme for elastic-plastic analysis of large frames is modified to allow for the effect of composite action of structural members, i.e. reinforced concrete floor slabs and the supporting steel beams. This modified programme is used to analyse some framed structures with composite action as well as those which incorporate plates and shear walls. The results obtained are studied to ascertain the influence of composite action and other factors on the load-carrying capacity of both bare frames and complete building structures. The theoretical failure load presented in this thesis does not predict the overall failure load of the structure, nor does it predict the partial failure load of the shear walls and slabs; it merely predicts the partial failure load of a single frame and assumes that the loss of stiffness of such a frame renders the overall structure unusable. For most structures the analysis proposed in this thesis is likely to break down prematurely due to the failure of the slab and shear wall system, and this factor must be taken into account in any future work on such structures. The experimental work reported in this thesis is acknowledged to be unsatisfactory as a verification of the limited theory proposed. In particular, perspex was not found to be a suitable material for testing at high loads; micro-concrete may be more suitable.
Abstract:
This thesis encompasses an investigation of the behaviour of a concrete frame structure under localised fire scenarios, carried out by implementing a constitutive model in a finite-element computer program. The investigation covered material properties at elevated temperatures, a description of the computer program, and thermal and structural analyses. Transient thermal material properties have been employed in this study to achieve reasonable results. The finite-element package ANSYS is utilized in the present analyses to examine the effect of fire on the concrete frame under five different fire scenarios. In addition, a report on a full-scale BRE Cardington concrete building, designed to Eurocode 2 and BS 8110 and subjected to a realistic compartment fire, is also presented. The transient analyses of the present model included additional specific heat, above the base value for dry concrete, at temperatures of 100°C and 200°C. The combined convective-radiative heat transfer coefficient and transient thermal expansion have also been considered in the analyses. For the analyses with transient strains included, a constitutive model based on the empirical formula in the full thermal strain-stress model proposed by Li and Purkiss (2005) is employed. Comparisons between the models with and without transient strains included are also discussed. Results of the present study indicate that the behaviour of the complete structure is significantly different from the behaviour of the individual isolated members on which current design methods are based. Although the current tabulated design procedures are conservative when the entire building performance is considered, it should be noted that the beneficial and detrimental effects of thermal expansion in complete structures should be taken into account. Therefore, developing new fire engineering methods from the study of complete structures, rather than from individual isolated member behaviour, is essential.
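For context, the combined convective-radiative boundary condition referred to above is normally written in the standard structural fire engineering form (as in EN 1991-1-2); this is a general illustration rather than the exact expression used in the thesis:

```latex
% Net heat flux to the exposed concrete surface: convection plus radiation
\dot{h}_{net} = \alpha_c\,(\Theta_g - \Theta_m)
  + \Phi\,\varepsilon_m\,\varepsilon_f\,\sigma\left[(\Theta_g + 273)^4 - (\Theta_m + 273)^4\right]
```

Here α_c is the convective coefficient, Θ_g and Θ_m are the fire (gas) and member surface temperatures in °C, Φ is a configuration factor, ε_m and ε_f are the member and fire emissivities, and σ is the Stefan-Boltzmann constant.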
Abstract:
This thesis presents research within empirical financial economics with a focus on liquidity and portfolio optimisation in the stock market. The discussion on liquidity is focused on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors, while portfolio optimisation is treated in the full-scale optimisation (FSO) framework. Furthermore, a framework for treatment of the two topics in combination is provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed for applying TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity co-variances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns, as compared to static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when using inventory cost as the liquidity measure and a moving-window PCA as the systematic liquidity derivation technique. Systematic factors derived in this setting also have a strong ability to explain cross-sectional liquidity variation. Portfolio optimisation in the FSO framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution. The studies show that, relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by application of differential evolution.
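A minimal sketch of the moving-window PCA idea described above, assuming a matrix of daily per-stock liquidity measures (e.g. an inventory-cost proxy); the function name and inputs are hypothetical and this is not the thesis's estimation code:

```python
import numpy as np
from sklearn.decomposition import PCA

def moving_window_systematic_liquidity(liq, window=252, n_factors=1):
    """Extract systematic liquidity factors with a moving-window PCA.

    liq: (T days x N stocks) array of a per-stock liquidity measure,
    e.g. an inventory-cost proxy (hypothetical input). For each day after
    the first `window` days, the PCA is re-estimated on the trailing window,
    so liquidity co-variances between stocks are allowed to vary over time.
    Returns a (T - window, n_factors) array of daily factor scores.
    """
    liq = np.asarray(liq, dtype=float)
    scores = []
    for t in range(window, liq.shape[0]):
        past = liq[t - window:t]
        mu, sd = past.mean(axis=0), past.std(axis=0)
        pca = PCA(n_components=n_factors).fit((past - mu) / sd)
        z_t = (liq[t] - mu) / sd          # standardise today's cross-section
        scores.append(pca.transform(z_t.reshape(1, -1))[0])
    return np.array(scores)
```

An expanding-window variant would simply replace the trailing slice with all observations up to day t.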
Abstract:
This volume brings together French and British scholars of France to analyse one of French politics' most intellectually compelling phenomena: the presidency of the republic. It examines the strengths and weaknesses of that leadership, the way executive power has been established in the Fifth Republic, and how presidential power and the subsequent full-scale development of 'personality politics' developed within an essentially party-driven, democratic and, most importantly, republican system. Hence the authors in this volume examine the phenomenon of a strong presidency within the French republican framework. The individual chapters focus on the presidency and on the individual presidents, and on the way in which they have addressed their own relation to the presidencies over which they presided, in addition to a range of other factors informing their terms of office. A conclusion sums up and appraises the contemporary role of the French presidency within the party system and the republic. The project has generated a great deal of interest in the French political studies community.
Abstract:
Previous research into formulaic language has focussed on specialised groups of people (e.g. L1 acquisition by infants and adult L2 acquisition), with ordinary adult native speakers of English receiving less attention. Additionally, whilst some features of formulaic language have been used as evidence of authorship (e.g. the Unabomber's use of 'you can't eat your cake and have it too'), there has been no systematic investigation into this as a potential marker of authorship. This thesis reports the first full-scale study into the use of formulaic sequences by individual authors. The theory of formulaic language hypothesises that formulaic sequences contained in the mental lexicon are shaped by experience combined with what each individual has found to be communicatively effective. Each author's repertoire of formulaic sequences should therefore differ. To test this assertion, three automated approaches to the identification of formulaic sequences are tested on a specially constructed corpus containing 100 short narratives. The first approach explores a limited subset of formulaic sequences using recurrence across a series of texts as the criterion for identification. The second approach focuses on a word which frequently occurs as part of formulaic sequences and also investigates alternative non-formulaic realisations of the same semantic content. Finally, a reference list approach is used. Whilst claiming authority for any reference list can be difficult, the proposed method utilises internet examples derived from lists prepared by others, a procedure which, it is argued, is akin to asking large groups of judges to reach consensus about what is formulaic. The empirical evidence supports the notion that formulaic sequences have potential as a marker of authorship, since in some cases a Questioned Document was correctly attributed. Although this marker of authorship is not universally applicable, it does promise to become a viable new tool in the forensic linguist's tool-kit.
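As an illustration of the first, recurrence-based identification approach, the sketch below flags word n-grams that recur across different texts in a corpus; the function and thresholds are hypothetical and far simpler than the thesis's procedure:

```python
from collections import Counter

def recurrent_ngrams(texts, n=3, min_texts=2):
    """Word n-grams that recur across a series of texts.

    A deliberately simple stand-in for the recurrence criterion: an n-gram
    is kept only if it appears in at least `min_texts` different texts,
    rather than being merely frequent within a single text.
    """
    texts_containing = Counter()
    for text in texts:
        words = text.lower().split()
        ngrams = {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
        texts_containing.update(ngrams)   # each n-gram counted once per text
    return {" ".join(g) for g, count in texts_containing.items() if count >= min_texts}
```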
Abstract:
This article examines the development and impact of German citizenship policy over the past decade. As its point of departure, it takes the 2000 Citizenship Law, which sought to undertake a full-scale reform and liberalisation of access to German citizenship. The article discusses this law's content and subsequent amendments, focusing particularly on its quantitative impact and asking why the number of naturalisations has been lower than originally expected. The article outlines current challenges to the law's structure and operation and identifies potential trajectories for its future development.
Abstract:
Detection thresholds for two visual- and two auditory-processing tasks were obtained for 73 children and young adults who varied broadly in reading ability. A reading-disabled subgroup had significantly higher thresholds than a normal-reading subgroup for the auditory tasks only. When analyzed across the whole group, the auditory tasks and one of the visual tasks, coherent motion detection, were significantly related to word reading. These effects were largely independent of ADHD ratings; however, none of these measures accounted for significant variance in word reading after controlling for full-scale IQ. In contrast, phoneme awareness, rapid naming, and nonword repetition each explained substantial, significant word reading variance after controlling for IQ, suggesting more specific roles for these oral language skills in the development of word reading.
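The IQ-controlled analyses described above can be illustrated with a generic hierarchical regression sketch; the column names are hypothetical and this does not reproduce the study's statistics:

```python
import statsmodels.api as sm

def incremental_r2(df, outcome, control, predictor):
    """R-squared gained by `predictor` after controlling for `control`
    (e.g. full-scale IQ), via two nested OLS models. Column names are
    hypothetical; this does not reproduce the study's analysis."""
    base = sm.OLS(df[outcome], sm.add_constant(df[[control]])).fit()
    full = sm.OLS(df[outcome], sm.add_constant(df[[control, predictor]])).fit()
    return full.rsquared - base.rsquared

# Example (hypothetical column names):
# gain = incremental_r2(data, "word_reading", "fsiq", "phoneme_awareness")
```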
Abstract:
Clogging is the main operational problem associated with horizontal subsurface flow constructed wetlands (HSSF CWs). The measurement of saturated hydraulic conductivity has proven to be a suitable technique for assessing clogging within HSSF CWs. The vertical and horizontal distribution of hydraulic conductivity was assessed in two full-scale HSSF CWs by using two different in situ permeameter methods (the falling head (FH) and constant head (CH) methods). Horizontal hydraulic conductivity profiles showed that the two methods are related by a power function (FH = CH^0.7821, r^2 = 0.76) within the recorded range of hydraulic conductivities (0-70 m/day). However, the FH method provided lower values of hydraulic conductivity than the CH method (one to three times lower). Despite discrepancies between the magnitudes of the recorded readings, the relative distribution of clogging obtained via the two methods was similar. Therefore, both methods are useful when exploring the general distribution of clogging and, especially, when assessing clogged areas originating from preferential flow paths within full-scale HSSF CWs. Discrepancies between the methods (in both magnitude and pattern) arose from the vertical hydraulic conductivity profiles under highly clogged conditions. It is believed this can be attributed to procedural differences between the methods, such as the method of permeameter insertion (twisting versus hammering). Results from both methods suggest that clogging develops along the shortest distance between water input and output. Results also show that the design and maintenance of inlet distributors and outlet collectors appear to have a great influence on the pattern of clogging, and hence on the asset lifetime of HSSF CWs.
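Purely as an illustration of the reported correlation, the helper below converts a constant-head reading into an approximate falling-head equivalent using the fitted power function; any real comparison would still need site-specific calibration:

```python
def fh_from_ch(ch_m_per_day):
    """Approximate falling-head (FH) hydraulic conductivity from a
    constant-head (CH) reading using the power function reported above,
    FH = CH**0.7821 (fitted over 0-70 m/day, r^2 = 0.76).
    Purely illustrative; site-specific calibration is still required."""
    if not 0.0 <= ch_m_per_day <= 70.0:
        raise ValueError("relation was fitted for 0-70 m/day only")
    return ch_m_per_day ** 0.7821

# Example: a CH reading of 40 m/day corresponds to roughly fh_from_ch(40) ≈ 18 m/day.
```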
Abstract:
Background: Qualitative research has suggested that spousal carers of someone with dementia differ in terms of whether they perceive their relationship with that person as continuous with the premorbid relationship or as radically different, and that a perception of continuity may be associated with more person-centered care and the experience of fewer of the negative emotions associated with caring. The aim of the study was to develop and evaluate a quantitative measure of the extent to which spousal carers perceive the relationship to be continuous. Methods: An initial pool of 42 questionnaire items was generated on the basis of the qualitative research about relationship continuity. These were completed by 51 spousal carers and item analysis was used to reduce the pool to 23 items. The retained items, comprising five subscales, were then administered to a second sample of 84 spousal carers, and the questionnaire's reliability, discriminative power, and validity were evaluated. Results: The questionnaire showed good reliability: Cronbach's α for the full scale was 0.947, and test-retest reliability was 0.932. Ferguson's δ was 0.987, indicating good discriminative power. Evidence of construct validity was provided by predicted patterns of subscale correlations with the Closeness and Conflict Scale and the Marwit-Meuser Caregiver Grief Inventory. Conclusion: Initial psychometric evaluation of the measure was encouraging. The measure provides a quantitative means of investigating ideas from qualitative research about the role of relationship continuity in influencing how spousal carers provide care and how they react emotionally to their caring role.
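The reliability coefficient reported above can be computed for any item-response matrix with the standard formula; the sketch below is a generic Cronbach's alpha calculation, not the study's own analysis code:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.
    Generic internal-consistency formula; not the study's own analysis code."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)
```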
Abstract:
A method of accurately controlling the position of a mobile robot using an external large volume metrology (LVM) instrument is presented in this article. By utilising an LVM instrument such as a laser tracker or indoor GPS (iGPS), many of the most difficult problems in mobile robot navigation can be simplified or avoided. Using the real-time position information from the laser tracker, a very simple navigation algorithm and a low-cost robot, 5 mm repeatability was achieved over a volume of 30 m radius. A surface digitisation scan of a wind turbine blade section was also demonstrated, illustrating possible applications of the method to manufacturing processes. Further, iGPS guidance of a small KUKA omni-directional robot has been demonstrated, and a full-scale prototype system is being developed in cooperation with KUKA Robotics, UK.
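A minimal sketch of externally guided waypoint navigation of the kind described above, assuming hypothetical get_tracked_pose() and send_velocity() interfaces to the metrology instrument and robot; it is not the authors' algorithm:

```python
import math

def drive_to_waypoint(get_tracked_pose, send_velocity, goal,
                      tol=0.005, k_lin=0.5, k_ang=1.5):
    """Minimal waypoint-seeking loop guided by an external metrology instrument.

    get_tracked_pose() -> (x, y, heading): pose from e.g. a laser tracker or
    iGPS (hypothetical interface); send_velocity(v, w): robot command
    (hypothetical interface). Stops within `tol` metres of the goal
    (5 mm chosen to echo the repeatability quoted above).
    """
    while True:
        x, y, heading = get_tracked_pose()
        dx, dy = goal[0] - x, goal[1] - y
        distance = math.hypot(dx, dy)
        if distance < tol:
            send_velocity(0.0, 0.0)   # stop at the target
            return
        bearing_error = math.atan2(dy, dx) - heading
        # wrap the error to [-pi, pi] so the robot turns the short way round
        bearing_error = math.atan2(math.sin(bearing_error), math.cos(bearing_error))
        send_velocity(k_lin * distance, k_ang * bearing_error)
```

Because the pose comes from the external instrument rather than odometry, drift does not accumulate, which is the main simplification the abstract points to.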
Abstract:
The tobacco industry's future depends on increasing tobacco use in low-income and middle-income countries (LMICs), which face a growing burden of tobacco-related disease yet have the potential to prevent full-scale escalation of this epidemic. To drive up sales, the industry markets its products heavily, deliberately targeting non-smokers, and keeps prices low until smoking and local economies are sufficiently established to drive prices and profits up. The industry systematically flouts existing tobacco control legislation and works aggressively to prevent future policies, using its resource advantage to present highly misleading economic arguments, to rebrand political activities as corporate social responsibility, and to establish and use third parties to make its arguments more palatable. Increasingly, it is using domestic litigation and international arbitration to bully LMICs out of implementing effective policies, and it is hijacking the problem of tobacco smuggling for policy gain, attempting to put itself in control of an illegal trade in which there is overwhelming historical evidence of its complicity. Progress will not be realised until tobacco industry interference is actively addressed, as outlined in Article 5.3 of the Framework Convention on Tobacco Control. Exemplar LMICs show that this action can be achieved and indicate that exposing tobacco industry misconduct is an essential first step.
Abstract:
Purpose—This article considers North Korea and the notion of crisis, by linking historical developments on the Korean peninsula to the conflict resolution literature, and investigates why, despite a large number of destabilizing events, a war involving Pyongyang has yet to erupt. Design/methodology—This article uses historical data and a framework developed by Aggarwal et al. in order to highlight patterns of interaction between states such as the United States, North Korea and South Korea, organizations such as the United Nations, as well as processes such as the Six-Party Talks and the Agreed Framework. The article then develops a crisis framework based on the conflict resolution and negotiation literature, and applies it to three North Korean administrations. Findings—Findings suggest that an open-ended understanding of time (for all parties involved on the peninsula) makes it impossible to reach a threshold at which full-scale war would be triggered, thus leaving the parties in a stable state of crisis for which escalating moves and de-escalating techniques might become irrelevant. Practical implications—It is hoped that this article will help further endeavours linking conflict resolution theoretical frameworks to the security situation on the Korean peninsula. In the case of the Korean peninsula, time has been understood as open-ended, leading parties to a lingering state of heightened hostilities that oscillates toward war but is controlled enough not to reach it. In-depth analysis of particular security sectors such as nuclear energy, food security, or missile testing would prove particularly useful in understanding the complexity of the Korean peninsula situation to a greater extent. Originality/value—This research suggests that, regarding the Korean peninsula, time has been understood as open-ended, leading parties to a lingering state of heightened hostilities.
Abstract:
This article considers North Korea and the notion of crisis, by linking historical developments on the Korean peninsula to the conflict resolution literature, and investigates why, despite a large number of destabilising events, a war involving Pyongyang has yet to erupt. The paper considers historical data and uses a framework developed by Aggarwal et al. in order to highlight patterns of interaction between states such as the United States, North Korea and South Korea, organisations such as the United Nations, as well as processes such as the Six-Party Talks and the Agreed Framework. The paper then develops a crisis framework based on the conflict resolution and negotiation literature, and applies it to three North Korean administrations. Findings suggest that an elastic understanding of time (for all parties involved on the peninsula) makes it impossible to reach a threshold at which full-scale war would be triggered, thus leaving the parties in a stable state of crisis for which escalating moves and de-escalating techniques might become irrelevant.
Abstract:
This study presents a computational fluid dynamic (CFD) study of dimethyl ether (DME) adsorptive gas separation and steam reforming (DME-SR) in a large-scale circulating fluidized bed (CFB) reactor. The CFD model is based on the Eulerian-Eulerian dispersed flow approach and is solved using commercial software (ANSYS FLUENT). Hydrogen is currently receiving increasing interest as an alternative source of clean energy and has high-potential applications, including the transportation sector and power generation. CFD modelling has attracted considerable recognition in the engineering sector, consequently leading to its use as a tool for process design and optimisation in many industrial processes. In most cases, these processes are difficult or expensive to investigate in lab-scale experiments. CFD provides a cost-effective methodology to gain detailed information down to the microscopic level. The main objectives of this project are to: (i) develop a predictive model using the ANSYS FLUENT (CFD) commercial code to simulate the flow hydrodynamics, mass transfer, reactions and heat transfer in a large-scale dual fluidized bed system for combined gas separation and steam reforming processes; (ii) implement a suitable adsorption model in the CFD code, through a user-defined function, to predict selective separation of a gas from a mixture; (iii) develop a model for dimethyl ether steam reforming (DME-SR) to predict hydrogen production; and (iv) carry out detailed parametric analysis in order to establish ideal operating conditions for future industrial application. The project originated from a real industrial case problem in collaboration with the industrial partner Dow Corning (UK) and was jointly funded by the Engineering and Physical Sciences Research Council (UK) and Dow Corning. The research examined gas separation by adsorption in a bubbling bed, as part of a dual fluidized bed system. The adsorption process was simulated based on kinetics derived from experimental data produced as part of a separate PhD project completed under the same funding. The kinetic model was incorporated into the FLUENT CFD tool as a pseudo-first-order rate equation; some of the parameters for the pseudo-first-order kinetics were obtained using MATLAB. The modelling of DME adsorption in the designed bubbling bed was performed for the first time in this project and highlights the novelty of the investigation. The simulation results were analysed to provide an understanding of the flow hydrodynamics, the reactor design and the optimum operating conditions for efficient separation. Validation of the bubbling bed, by estimation of the bed expansion and the solid and gas distributions, showed good agreement with trends reported in the literature. Parametric analysis of the adsorption process demonstrated that increasing the fluidizing velocity reduced the adsorption of DME. This is a result of the reduction in the gas residence time, which appears to have a greater effect than the solid residence time. The removal efficiency of DME from the bed was found to be more than 88%. Simulation of the DME-SR in FLUENT CFD was conducted using kinetics selected from the literature and implemented in the model using an in-house developed user-defined function. The kinetics were validated by simulating a case replicating an experimental study of a laboratory-scale bubbling bed by Vicente et al. [1].
Good agreement was achieved in the validation of the models, which were then applied to the DME-SR in the large-scale riser section of the dual fluidized bed system. This is the first study to use the selected DME-SR kinetics in a circulating fluidized bed (CFB) system and for the geometry size proposed for the project. As a result, the simulation produced the first detailed data on the spatial variation and final gas product in such an industrial-scale fluidized bed system. The simulation results provided insight into the flow hydrodynamics, the reactor design and the optimum operating conditions. The solid and gas distributions in the CFB showed good agreement with the literature. The parametric analysis showed that increases in temperature and in the steam to DME molar ratio increased the production of hydrogen due to increased DME conversion, whereas an increase in the space velocity was found to have an adverse effect. Increasing the temperature from 200 °C to 350 °C increased DME conversion from 47% to 99%, while the hydrogen yield increased substantially from 11% to 100%. The CO2 selectivity decreased from 100% to 91% due to the water gas shift reaction favouring CO at higher temperatures. The higher conversions observed as the temperature increased were reflected in the quantities of unreacted DME and methanol in the product gas, both of which decreased to very low values of 0.27 mol% and 0.46 mol% respectively at 350 °C. Increasing the steam to DME molar ratio from 4 to 7.68 increased the DME conversion from 69% to 87%, while the hydrogen yield increased from 40% to 59%. The CO2 selectivity decreased from 100% to 97%. Decreasing the space velocity from 37104 ml/g/h to 15394 ml/g/h increased the DME conversion from 87% to 100% while increasing the hydrogen yield from 59% to 87%. The parametric analysis suggests that the operating condition for maximum hydrogen yield is in the region of 300 °C and a steam/DME molar ratio of 5. The analysis of the industrial sponsor's case, for the given flow and composition of the gas to be treated, suggests that 88% of the DME can be adsorbed in the bubbling bed, consequently producing 224.4 t/y of hydrogen in the riser section of the dual fluidized bed system. The process also produces 1458.4 t/y of CO2 and 127.9 t/y of CO as part of the product gas. The developed models and the parametric analysis carried out in this study provide essential guidelines for the future design of DME-SR at industrial level; in particular, this work has been of tremendous importance for the industrial collaborator in drawing conclusions and planning for future potential implementation of the process at an industrial scale.
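For orientation, the overall DME steam reforming stoichiometry and a generic pseudo-first-order (Lagergren-type) adsorption rate form are shown below; these are standard textbook expressions given for illustration, not the specific kinetics fitted in the project:

```latex
% Overall DME steam reforming stoichiometry (hydrolysis followed by methanol reforming)
\mathrm{CH_3OCH_3 + 3\,H_2O \;\rightarrow\; 2\,CO_2 + 6\,H_2}

% Generic pseudo-first-order adsorption rate: loading q approaches the
% equilibrium value q_e at a rate set by the lumped constant k_1
\frac{dq}{dt} = k_1\,(q_e - q) \quad\Longrightarrow\quad q(t) = q_e\,(1 - e^{-k_1 t})
```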