39 results for full-scale testing
Abstract:
Diagnosing faults in wastewater treatment, like diagnosis of most problems, requires bi-directional plausible reasoning. This means that both predictive (from causes to symptoms) and diagnostic (from symptoms to causes) inferences have to be made, depending on the evidence available, in reasoning towards the final diagnosis. The use of computer technology for diagnosing faults in the wastewater process has been explored, and a rule-based expert system was initially developed. It was found that such an approach has serious limitations in its ability to reason bi-directionally, which makes it unsuitable for diagnostic tasks under conditions of uncertainty. The probabilistic approach known as Bayesian Belief Networks (BBNs) was then critically reviewed and was found to be well suited to diagnosis under uncertainty. The theory and application of BBNs are outlined. A full-scale BBN for the diagnosis of faults in a wastewater treatment plant based on the activated sludge system has been developed in this research. Results from the BBN show good agreement with the predictions of wastewater experts. It can be concluded that BBNs are far superior to rule-based systems based on certainty factors in their ability to diagnose faults and predict system behaviour in complex operating systems with inherently uncertain behaviour.
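The bi-directional inference described above can be illustrated with a minimal, hypothetical two-node belief network; the node names and probabilities below are purely illustrative and are not taken from the thesis:

```python
# Minimal illustration of bi-directional inference in a two-node Bayesian
# belief network: Bulking (cause) -> HighEffluentSolids (symptom).
# All probability values are hypothetical, for illustration only.

p_bulking = 0.10                     # prior P(bulking)
p_solids_given_bulking = 0.85        # P(high effluent solids | bulking)
p_solids_given_no_bulking = 0.05     # P(high effluent solids | no bulking)

# Predictive reasoning (cause -> symptom): read the conditional directly.
predictive = p_solids_given_bulking

# Diagnostic reasoning (symptom -> cause): Bayes' rule inverts the conditional.
p_solids = (p_solids_given_bulking * p_bulking
            + p_solids_given_no_bulking * (1 - p_bulking))
diagnostic = p_solids_given_bulking * p_bulking / p_solids

print(f"P(high solids | bulking) = {predictive:.2f}")
print(f"P(bulking | high solids) = {diagnostic:.2f}")   # ~0.65 with these numbers
```

The same network answers both questions, which is exactly the capability the abstract argues rule-based systems with certainty factors lack.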
Abstract:
The objective of this research was to investigate the behaviour of birdcage scaffolding as used in falsework structures, assess the suitability of existing design methods and make recommendations for a set of design rules. Since excessive deflection is as undesirable in a structure as total collapse, the project was divided into two sections. These were to determine the ultimate vertical and horizontal load-carrying capacity and also the deflection characteristics of any falsework. Theoretical analyses were therefore developed to ascertain the ability of the individual standards to resist vertical load, and of the bracing to resist horizontal load. Furthermore, a model was evolved which would predict the horizontal deflection of a scaffold under load using strain energy methods. These models were checked by three series of experiments. The first was on individual standards under vertical load only. The second series was carried out on full-scale falsework structures loaded vertically and horizontally to failure. Finally, experiments were conducted on scaffold couplers to provide additional verification of the method of predicting deflections. This thesis gives the history of the project and an introduction to the field of scaffolding. It details the experiments conducted, the theories developed, and the correlation between theory and experiment. Finally, it makes recommendations for a design method to be employed by scaffolding designers.
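For background, the strain-energy approach to deflection prediction mentioned above can be stated in its generic textbook form via Castigliano's second theorem; this is shown only as orientation and is not the thesis's specific model for couplers and bracing:

```latex
% Castigliano's second theorem: the deflection at the point of application
% of a load P, in the direction of P, equals the partial derivative of the
% total strain energy U with respect to that load.
\[
  \delta = \frac{\partial U}{\partial P},
  \qquad
  U = \int_0^L \frac{M(x)^2}{2EI}\,\mathrm{d}x
    + \int_0^L \frac{N(x)^2}{2EA}\,\mathrm{d}x ,
\]
% with M(x) and N(x) the bending moment and axial force distributions,
% E the elastic modulus, and I and A the section properties of the members
% (shear contributions omitted for brevity).
```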
Abstract:
This thesis encompasses an investigation of the behaviour of a concrete frame structure under localised fire scenarios, carried out by implementing a constitutive model in a finite-element computer program. The investigation covered material properties at elevated temperatures, a description of the computer program, and thermal and structural analyses. Transient thermal properties of the materials have been employed in this study to achieve reasonable results. The finite-element package ANSYS is utilised in the present analyses to examine the effect of fire on the concrete frame under five different fire scenarios. In addition, an analysis of the full-scale BRE Cardington concrete building, designed to Eurocode 2 and BS 8110 and subjected to a realistic compartment fire, is also presented. The transient analyses of the present model include additional specific heat, above the base value for dry concrete, at temperatures of 100°C and 200°C. The combined convective-radiation heat transfer coefficient and transient thermal expansion have also been considered in the analyses. For the analyses with transient strains included, a constitutive model based on the empirical formulae of the full thermal strain-stress model proposed by Li and Purkiss (2005) is employed. Comparisons between the models with and without transient strains included are also discussed. Results of the present study indicate that the behaviour of the complete structure is significantly different from the behaviour of individual isolated members as assessed by current design methods. Although the current tabulated design procedures are conservative when the entire building performance is considered, the beneficial and detrimental effects of thermal expansion in complete structures should be taken into account. Therefore, developing new fire engineering methods from the study of complete structures, rather than from individual isolated member behaviour, is essential.
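For reference, a combined convective-radiative thermal boundary condition of the kind mentioned above is commonly written in the following generic, Eurocode-style form; this is standard background and not necessarily the exact formulation used in the thesis:

```latex
% Net heat flux to the exposed surface: convection plus radiation.
\[
  \dot{q}_{\mathrm{net}}
    = h_c \,(\Theta_g - \Theta_m)
    + \Phi \, \varepsilon_m \, \varepsilon_f \, \sigma
      \left[ (\Theta_g + 273)^4 - (\Theta_m + 273)^4 \right]
\]
% h_c: convective heat transfer coefficient; \Theta_g, \Theta_m: gas and
% member surface temperatures (degrees C); \Phi: configuration factor;
% \varepsilon_m, \varepsilon_f: member and fire emissivities;
% \sigma: Stefan-Boltzmann constant.
```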
Abstract:
This thesis presents research within empirical financial economics with a focus on liquidity and portfolio optimisation in the stock market. The discussion on liquidity is focused on measurement issues, including TAQ data processing and the measurement of systematic liquidity factors, while the portfolio optimisation part centres on full-scale optimisation (FSO). Furthermore, a framework for treating the two topics in combination is provided. The liquidity part of the thesis gives a conceptual background to liquidity and discusses several different approaches to liquidity measurement. It contributes to liquidity measurement by providing detailed guidelines on the data processing needed for applying TAQ data to liquidity research. The main focus, however, is the derivation of systematic liquidity factors. The principal component approach to systematic liquidity measurement is refined by the introduction of moving and expanding estimation windows, allowing for time-varying liquidity co-variances between stocks. Under several liquidity specifications, this improves the ability to explain stock liquidity and returns, as compared to static-window PCA and market-average approximations of systematic liquidity. The highest ability to explain stock returns is obtained when using inventory cost as the liquidity measure and a moving-window PCA as the systematic liquidity derivation technique. Systematic factors derived in this setting also show a strong ability to explain cross-sectional liquidity variation. Portfolio optimisation in the FSO framework is tested in two empirical studies. These contribute to the assessment of FSO by expanding its applicability to stock indexes and individual stocks, by considering a wide selection of utility function specifications, and by showing explicitly how the full-scale optimum can be identified using either grid search or the heuristic search algorithm of differential evolution. The studies show that, relative to mean-variance portfolios, FSO performs well in these settings and that the computational expense can be mitigated dramatically by applying differential evolution.
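A moving-window principal component extraction of the kind described can be sketched as follows; the function and variable names are hypothetical and this is not the thesis's own code, only an illustration of the re-estimation idea:

```python
import numpy as np

def moving_window_pca_factor(liquidity, window=60, n_factors=1):
    """Leading principal component(s) of stock liquidity, re-estimated on a
    trailing window so liquidity co-variances may vary over time.

    liquidity : (T, N) array of a liquidity measure for N stocks over T dates.
    Returns factor scores for dates t >= window.
    """
    T, N = liquidity.shape
    scores = []
    for t in range(window, T):
        win = liquidity[t - window:t]              # trailing estimation window
        mu, sd = win.mean(axis=0), win.std(axis=0)
        z = (win - mu) / sd                        # standardise each stock
        cov = np.cov(z, rowvar=False)              # time-varying covariance
        eigval, eigvec = np.linalg.eigh(cov)
        loadings = eigvec[:, -n_factors:]          # leading eigenvectors
        x_t = (liquidity[t] - mu) / sd             # today's cross-section
        scores.append(x_t @ loadings)              # systematic liquidity score(s)
    return np.asarray(scores)
```

An expanding-window variant would simply replace the fixed-length slice with `liquidity[:t]`, trading responsiveness for estimation noise.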
Abstract:
This volume brings together French and British scholars of France to analyse one of French politics' most intellectually compelling phenomena: the presidency of the republic. It examines the strengths and weaknesses of that leadership as well as the way executive power has been established in the Fifth Republic, and how presidential power and the subsequent full-scale development of 'personality politics' evolved within an essentially party-driven, democratic and, most importantly, republican system. Hence the authors in this volume examine the phenomenon of a strong presidency within the French republican framework. The individual chapters focus on the presidency and on the individual presidents, and on the way in which each has addressed his own relationship to the presidency he held, alongside a range of other factors informing their terms of office. A conclusion sums up and appraises the contemporary role of the French presidency within the party system and the republic. The project has generated a great deal of interest in the French political studies community.
Abstract:
Previous research into formulaic language has focussed on specialised groups of people (e.g. L1 acquisition by infants and adult L2 acquisition) with ordinary adult native speakers of English receiving less attention. Additionally, whilst some features of formulaic language have been used as evidence of authorship (e.g. the Unabomber’s use of you can’t eat your cake and have it too) there has been no systematic investigation into this as a potential marker of authorship. This thesis reports the first full-scale study into the use of formulaic sequences by individual authors. The theory of formulaic language hypothesises that formulaic sequences contained in the mental lexicon are shaped by experience combined with what each individual has found to be communicatively effective. Each author’s repertoire of formulaic sequences should therefore differ. To test this assertion, three automated approaches to the identification of formulaic sequences are tested on a specially constructed corpus containing 100 short narratives. The first approach explores a limited subset of formulaic sequences using recurrence across a series of texts as the criterion for identification. The second approach focuses on a word which frequently occurs as part of formulaic sequences and also investigates alternative non-formulaic realisations of the same semantic content. Finally, a reference list approach is used. Whilst claiming authority for any reference list can be difficult, the proposed method utilises internet examples derived from lists prepared by others, a procedure which, it is argued, is akin to asking large groups of judges to reach consensus about what is formulaic. The empirical evidence supports the notion that formulaic sequences have potential as a marker of authorship since in some cases a Questioned Document was correctly attributed. Although this marker of authorship is not universally applicable, it does promise to become a viable new tool in the forensic linguist’s tool-kit.
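The recurrence criterion used in the first approach could be operationalised roughly as follows; this is a sketch under the assumption that a word sequence counts as a candidate formulaic sequence if it recurs in a minimum number of different texts, and the tokenisation and thresholds are illustrative rather than those of the thesis:

```python
from collections import defaultdict

def recurrent_ngrams(texts, n=3, min_texts=5):
    """Word n-grams that recur across at least `min_texts` different texts.

    texts : list of strings, one per narrative in the corpus.
    Returns a dict mapping each recurrent n-gram (a tuple of words) to the
    number of texts it appears in. Tokenisation here is deliberately naive.
    """
    texts_containing = defaultdict(set)
    for i, text in enumerate(texts):
        tokens = text.lower().split()
        for j in range(len(tokens) - n + 1):
            gram = tuple(tokens[j:j + n])
            texts_containing[gram].add(i)      # record which text it occurred in
    return {gram: len(ids) for gram, ids in texts_containing.items()
            if len(ids) >= min_texts}
```

Sequences surviving the cut-off could then be compared across authors to test whether individual repertoires differ, which is the hypothesis the abstract sets out.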
Abstract:
This article examines the development and impact of German citizenship policy over the past decade. As its point of departure, it takes the 2000 Citizenship Law, which sought to undertake a full-scale reform and liberalisation of access to German membership. The article discusses this law's content and subsequent amendments, focusing particularly on its quantitative impact and asking why the number of naturalisations has been lower than originally expected. The article outlines current challenges to the law's structure and operation and identifies potential trajectories for its future development.
A conceptual framework for supply chain collaboration: empirical evidence from the agri-food industry
Abstract:
Purpose - The purpose of this paper is to analyse the concept of supply chain collaboration and to provide an overall framework that can be used as a conceptual landmark for further empirical research. In addition, the concept is explored in the context of the agri-food industry and its particularities are identified. Finally, the paper presents empirical evidence from an exploratory case study in the agri-food industry, at the grower-processor interface, along with information regarding the way the concept is actually applied in small and medium-sized enterprises (SMEs). Design/methodology/approach - The paper employed case study research, conducting in-depth interviews in two companies. Findings - The supply chain collaboration concept is of significant importance for the agri-food industry; however, some constraints arise due to the nature of the industry's products and the specific structure of the sector. Consequently, collaboration in the supply chain is often limited to operational issues and to logistics-related activities. Research limitations/implications - Research is limited to a single case study, and further qualitative testing of the conceptual model is needed in order to adjust the model before large-scale testing. Practical implications - Case study findings may be transferable to other similar dual relationships at the grower-processor interface. Weaker parties in asymmetric relationships have opportunities to improve their position, altering the dependence balance, by achieving product/process excellence. Originality/value - The paper provides evidence regarding the applicability of the supply chain collaboration concept in the agri-food industry. It considers relationships not between big multinational companies but between SMEs. © Emerald Group Publishing Limited.
Abstract:
Detection thresholds for two visual- and two auditory-processing tasks were obtained for 73 children and young adults who varied broadly in reading ability. A reading-disabled subgroup had significantly higher thresholds than a normal-reading subgroup for the auditory tasks only. When analyzed across the whole group, the auditory tasks and one of the visual tasks, coherent motion detection, were significantly related to word reading. These effects were largely independent of ADHD ratings; however, none of these measures accounted for significant variance in word reading after controlling for full-scale IQ. In contrast, phoneme awareness, rapid naming, and nonword repetition each explained substantial, significant word reading variance after controlling for IQ, suggesting more specific roles for these oral language skills in the development of word reading. © 2004 Elsevier Inc. All rights reserved.
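The "after controlling for full-scale IQ" comparisons correspond to a hierarchical regression of the following general shape; this is an illustrative sketch only, with hypothetical variable names, and is not the study's analysis code:

```python
import statsmodels.api as sm

def incremental_r2(df, outcome="word_reading", control="fsiq",
                   predictor="phoneme_awareness"):
    """Variance in the outcome explained by `predictor` over and above the
    control variable: R^2 of the full model minus R^2 of the control-only model."""
    base = sm.OLS(df[outcome], sm.add_constant(df[[control]])).fit()
    full = sm.OLS(df[outcome], sm.add_constant(df[[control, predictor]])).fit()
    return full.rsquared - base.rsquared
```

Repeating this for each sensory or oral-language measure shows which predictors retain unique explanatory power once IQ is entered first, which is the pattern the abstract reports.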
Abstract:
Clogging is the main operational problem associated with horizontal subsurface flow constructed wetlands (HSSF CWs). The measurement of saturated hydraulic conductivity has proven to be a suitable technique for assessing clogging within HSSF CWs. The vertical and horizontal distribution of hydraulic conductivity was assessed in two full-scale HSSF CWs by using two different in situ permeameter methods (the falling head (FH) and constant head (CH) methods). Horizontal hydraulic conductivity profiles showed that the two methods are related by a power function (FH = CH^0.7821, r^2 = 0.76) within the recorded range of hydraulic conductivities (0-70 m/day). However, the FH method provided lower values of hydraulic conductivity than the CH method (one to three times lower). Despite discrepancies between the magnitudes of the readings, the relative distribution of clogging obtained via both methods was similar. Therefore, both methods are useful when exploring the general distribution of clogging and, especially, for assessing clogged areas originating from preferential flow paths within full-scale HSSF CWs. Discrepancies between the methods (in both magnitude and pattern) arose from the vertical hydraulic conductivity profiles under highly clogged conditions. It is believed this can be attributed to procedural differences between the methods, such as the method of permeameter insertion (twisting versus hammering). Results from both methods suggest that clogging develops along the shortest distance between water input and output. Results also show that the design and maintenance of inlet distributors and outlet collectors appear to have a great influence on the pattern of clogging, and hence on the asset lifetime of HSSF CWs. © Springer Science+Business Media B.V. 2011.
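The reported power relation between the two permeameter methods can be reproduced in form by a simple one-parameter curve fit; the paired readings below are purely illustrative stand-ins, not the study's field data:

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(ch, b):
    """FH = CH**b, the single-parameter form reported in the abstract."""
    return ch ** b

# Hypothetical paired hydraulic conductivity readings (m/day).
ch = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 70.0])   # constant head method
fh = np.array([1.6, 3.4, 6.0, 10.2, 17.5, 27.0])    # falling head method

(b_hat,), _ = curve_fit(power_law, ch, fh, p0=[0.8])
print(f"fitted exponent b = {b_hat:.3f}")   # near 0.78 for data shaped like these
```

With real paired measurements, the same fit (plus the resulting r^2) gives the conversion between methods over the recorded conductivity range.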
Abstract:
Background: Qualitative research has suggested that spousal carers of someone with dementia differ in terms of whether they perceive their relationship with that person as continuous with the premorbid relationship or as radically different, and that a perception of continuity may be associated with more person-centered care and the experience of fewer of the negative emotions associated with caring. The aim of the study was to develop and evaluate a quantitative measure of the extent to which spousal carers perceive the relationship to be continuous. Methods: An initial pool of 42 questionnaire items was generated on the basis of the qualitative research about relationship continuity. These were completed by 51 spousal carers and item analysis was used to reduce the pool to 23 items. The retained items, comprising five subscales, were then administered to a second sample of 84 spousal carers, and the questionnaire's reliability, discriminative power, and validity were evaluated. Results: The questionnaire showed good reliability: Cronbach's α for the full scale was 0.947, and test-retest reliability was 0.932. Ferguson's δ was 0.987, indicating good discriminative power. Evidence of construct validity was provided by predicted patterns of subscale correlations with the Closeness and Conflict Scale and the Marwit-Meuser Caregiver Grief Inventory. Conclusion: Initial psychometric evaluation of the measure was encouraging. The measure provides a quantitative means of investigating ideas from qualitative research about the role of relationship continuity in influencing how spousal carers provide care and how they react emotionally to their caring role. © 2012 International Psychogeriatric Association.
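Cronbach's α, as reported for the full scale above, is a standard internal-consistency statistic; a minimal computation is sketched below under the assumption that item responses are arranged as a respondents-by-items matrix (this is generic, not the study's analysis code):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Applying this to the 23 retained items (or to each subscale separately) yields the kind of reliability figures quoted in the abstract.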
Abstract:
A method of accurately controlling the position of a mobile robot using an external large volume metrology (LVM) instrument is presented in this article. By utilising an LVM instrument such as a laser tracker or indoor GPS (iGPS) for mobile robot navigation, many of the most difficult problems in the field can be simplified or avoided. Using the real-time position information from the laser tracker, a very simple navigation algorithm, and a low-cost robot, 5 mm repeatability was achieved over a volume of 30 m radius. A surface digitisation scan of a wind turbine blade section was also demonstrated, illustrating possible applications of the method in manufacturing processes. Further, iGPS guidance of a small KUKA omni-directional robot has been demonstrated, and a full-scale prototype system is being developed in cooperation with KUKA Robotics, UK. © 2011 Taylor & Francis.
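The "very simple navigation algorithm" is not spelled out in the abstract; a generic sketch of closed-loop correction from an external position measurement, with all names and gains hypothetical, might look like this:

```python
import numpy as np

def correction_step(target_xy, measured_xy, gain=0.5, max_step=0.05):
    """One proportional correction step toward the target position (metres).

    measured_xy comes from the external LVM instrument (laser tracker / iGPS),
    so the robot's own odometry error does not accumulate between cycles.
    """
    error = np.asarray(target_xy) - np.asarray(measured_xy)
    step = gain * error                      # proportional correction
    norm = np.linalg.norm(step)
    if norm > max_step:                      # limit commanded motion per cycle
        step *= max_step / norm
    return step                              # offset command sent to the robot
```

The key design point is that accuracy is inherited from the tracker rather than from the robot's internal sensing, which is why a low-cost platform can reach millimetre-level repeatability.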
Abstract:
The tobacco industry's future depends on increasing tobacco use in low-income and middle-income countries (LMICs), which face a growing burden of tobacco-related disease yet have the potential to prevent full-scale escalation of this epidemic. To drive up sales, the industry markets its products heavily, deliberately targeting non-smokers, and keeps prices low until smoking and local economies are sufficiently established to drive prices and profits up. The industry systematically flouts existing tobacco control legislation and works aggressively to prevent future policies, using its resource advantage to present highly misleading economic arguments, rebrand political activities as corporate social responsibility, and establish and use third parties to make its arguments more palatable. Increasingly it is using domestic litigation and international arbitration to bully LMICs out of implementing effective policies, and is hijacking the problem of tobacco smuggling for policy gain, attempting to put itself in control of an illegal trade in which there is overwhelming historical evidence of its complicity. Progress will not be realised until tobacco industry interference is actively addressed, as outlined in Article 5.3 of the Framework Convention on Tobacco Control. Exemplar LMICs show this action can be achieved and indicate that exposing tobacco industry misconduct is an essential first step.
Abstract:
This article considers North Korea and the notion of crisis by linking historical developments on the Korean peninsula to the conflict resolution literature, and investigates why, despite a large number of destabilising events, a war involving Pyongyang has yet to erupt. The paper considers historical data and uses a framework developed by Aggarwal et al. to highlight patterns of interaction between states such as the United States, North Korea and South Korea, organisations such as the United Nations, and processes such as the Six-Party Talks and the Agreed Framework. The paper then develops a crisis framework based on the conflict resolution and negotiation literature, and applies it to three North Korean administrations. Findings suggest that an elastic understanding of time (for all parties involved on the peninsula) makes it impossible to reach the threshold at which full-scale war would be triggered, thus leaving the parties in a stable state of crisis for which escalating moves and de-escalating techniques may become irrelevant.
Abstract:
This thesis introduces and develops a novel real-time predictive maintenance system to estimate machine system parameters using the motion current signature. Recently, motion current signature analysis has been addressed as an alternative to the use of sensors for monitoring internal faults of a motor. A maintenance system based upon the analysis of the motion current signature avoids the need for the implementation and maintenance of expensive motion-sensing technology. By developing nonlinear dynamical analysis for the motion current signature, the research described in this thesis implements a novel real-time predictive maintenance system for current and future manufacturing machine systems. A crucial concept underpinning this project is that the motion current signature contains information relating to the machine system parameters and that this information can be extracted using nonlinear mapping techniques, such as neural networks. Towards this end, a proof-of-concept procedure is performed, which substantiates this concept. A simulation model, TuneLearn, is developed to simulate the large amount of training data required by the neural network approach. Statistical validation and verification of the model are performed to establish confidence in the simulated motion current signature. The validation experiments conclude that, although the simulation model generates a good macro-dynamical mapping of the motion current signature, it fails to accurately map the micro-dynamical structure owing to the lack of knowledge regarding the behaviour of higher-order and nonlinear factors, such as backlash and compliance. The failure of the simulation model to capture the micro-dynamical structure suggests the presence of nonlinearity in the motion current signature, which motivated surrogate data testing for nonlinearity. The results confirm the presence of nonlinearity in the motion current signature, thereby motivating the use of nonlinear techniques for further analysis. The outcomes of the experiments show that nonlinear noise reduction combined with a linear reverse algorithm offers precise machine system parameter estimation from the motion current signature for the implementation of the real-time predictive maintenance system. Finally, a linear reverse algorithm, BJEST, is developed and applied to the motion current signature to estimate the machine system parameters.
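Surrogate data testing for nonlinearity, as used above, typically compares a discriminating statistic computed on the measured signal with its distribution over phase-randomised surrogates; the sketch below is generic, and the choice of statistic and number of surrogates is illustrative rather than taken from the thesis:

```python
import numpy as np

def phase_randomised_surrogate(x, rng):
    """Surrogate with the same power spectrum as x but randomised phases,
    which destroys nonlinear structure while preserving linear correlations."""
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.size)
    phases[0] = 0.0                              # keep the mean (DC) component
    surrogate = np.abs(spectrum) * np.exp(1j * phases)
    return np.fft.irfft(surrogate, n=len(x))

def nonlinearity_test(x, n_surrogates=99, seed=0):
    """Compare a simple nonlinear statistic (third moment of increments)
    between the signal and its surrogates; an observed value outside the
    surrogate distribution suggests nonlinearity."""
    rng = np.random.default_rng(seed)
    stat = lambda s: float(np.mean(np.diff(s) ** 3))
    observed = stat(x)
    surrogate_stats = [stat(phase_randomised_surrogate(x, rng))
                       for _ in range(n_surrogates)]
    rank = sum(observed > s for s in surrogate_stats)
    return observed, rank / (n_surrogates + 1)    # empirical percentile
```

If the observed statistic falls in the extreme tail of the surrogate distribution, the linear-stochastic null hypothesis is rejected, justifying the nonlinear analysis route the thesis then pursues.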