35 results for Combined Web crippling and Flange Crushing
Abstract:
The aim of this research is to promote the use of G.R.P. as a structural material. In the past, the use of G.R.P. has been confined to non-load-carrying applications. Such uses are still increasing rapidly, but in addition significant changes have been made during the last decade in the development of semi-structural and now even fully structural applications. Glass-reinforced plastic is characterized by a high strength but a relatively low modulus of elasticity. For this reason a G.R.P. structure can be expected to show large deformations, as a result of which the individual structural members will fail under load through a loss of stability rather than by approaching the ultimate strength of the material. Consequently, the selection of the geometrical shapes of G.R.P. structural elements is considered an important factor in designing G.R.P. structures. The first chapter of this thesis gives a general review of the theoretical and experimental methods used to describe the structural properties of G.R.P. The research programme comprises five stages dealing with the structural behaviour of G.R.P. The first stage (Chapter 2) begins with selecting and designing an optimum box-beam cross-section which gives the maximum flexural and torsional rigidity. The second stage of the investigation (Chapter 3) deals with beam-to-beam connections. A joint was designed and manufactured with different types of fasteners used to connect two beam units. A suitable fastener was selected and the research was extended to cover the behaviour of long-span beams using multiple joints. The third part of the investigation includes a study of the behaviour of box beams subjected to combined bending, shear and torsion. A special torque rig was developed to perform the tests. Creep deformation of 6 m span G.R.P. beams was investigated as the fourth stage under a range of loading conditions. As a result of the post-buckling behaviour exhibited by the compression flange during the box-beam tests in earlier stages of the investigation, it was decided to consider this phenomenon in more detail in the final stage. G.R.P. plates with different fibre orientations were subjected to uniaxial compression and tested up to failure. In all stages of the investigation theoretical predictions and experimental results were compared, and generally good correlation between theory and experimental data was observed.
Abstract:
Reported in this thesis are test results of 37 eccentrically prestressed beams with stirrups. Single-variable parameters were investigated, including the prestressing force, the prestressing steel area, the concrete strength, the aspect ratio h/b, and the stirrup size and spacing. The interaction of bending, torsion and shear was also investigated by testing a series of beams subjected to varying bending/torsional moment ratios. For the torsional strength an empirical expression of linear format is proposed, which can be rearranged in a non-dimensional interaction form: T/To + V/Vo + M/Mo + Ps/Po + Fs/Fo = Pc2/Fsp. This formula, which is based on an average experimental steel stress lower than the yield point, is compared with 243 prestressed beams containing stirrups, including the author's test beams, and good agreement is obtained. For the theoretical analysis of the problem of torsion combined with bending and shear in concrete beams with stirrups, the torque-friction method is proposed and developed using an average steel stress. A general linear interaction equation for combined torsion with bending and/or shear is proposed in the following format: φ·(T/Tu) = 1, where φ is a combined loading factor that modifies the pure ultimate strength for differing cases of torsion with bending and/or shear. From the analysis of 282 reinforced and prestressed concrete beams containing stirrups, including the present investigation, good agreement is obtained between the method and the test results. It is concluded that the proposed method provides a rational and simple basis for predicting the ultimate torsional strength and may also be developed for design purposes.
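To illustrate how a combined loading factor of this kind modifies a pure ultimate strength, here is a minimal sketch; the function name and the numerical values are hypothetical and are not taken from the thesis.

```python
# A minimal sketch of how a combined loading factor phi reduces the pure
# ultimate torsional strength when phi * (T / T_u) = 1 at failure.
# The numbers below are hypothetical, not values from the thesis.
def combined_ultimate_torque(T_u, phi):
    """Predicted ultimate torque under combined loading for factor phi."""
    return T_u / phi

# e.g. a pure torsional strength of 12 kN·m and phi = 1.4 for a given
# bending/torsion ratio gives a reduced combined capacity of ~8.6 kN·m.
print(combined_ultimate_torque(T_u=12.0, phi=1.4))
```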
Abstract:
This thesis investigates corporate financial disclosure practices on Web sites and their impact. This is done, first, by examining the views of various Saudi user groups (institutional investors, financial analysts and private investors) on the disclosure of financial reporting on the Internet and assessing differences, if any, in the perceptions of the groups. In total, 303 individuals from the three groups responded to a questionnaire. Views were elicited regarding: users' attitudes to the Internet infrastructure in Saudi Arabia, users' information sources about companies in Saudi Arabia, respondents' perceptions of the advantages and disadvantages of Internet financial reporting (IFR), respondents' attitudes to the quality of IFR provided by Saudi public companies, and the impact of IFR on users' information needs. Overall, it was found that the professional groups (institutional investors and financial analysts) hold similar views on many issues, while the opinions of private investors differ considerably. Second, the thesis examines the use of the Internet for the disclosure of financial and investor-related information by Saudi public companies (113 companies) and looks to identify reasons for the differences in the online disclosure practices of companies by testing the association between eight firm-specific factors and the level of online disclosure. A financial disclosure index (167 items) is used to measure public company disclosure in Saudi Arabia. The descriptive part of the study reveals that 95 (84%) of the Saudi public companies in the sample had a website and 51 (45%) had a financial information section of some description. Furthermore, none of the sample companies provided 100% of the 167 index items applicable to the company. Results of multivariate analysis show that firm size and stock market listing are significant explanatory variables for the amount of information disclosed on corporate Web sites. The thesis finds a significant and negative relationship between the proportion of institutional ownership of a company's shares and the level of IFR.
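As an illustration of the multivariate analysis described, a minimal sketch of regressing a disclosure-index score on firm-specific factors is given below; the toy data, the two factors shown and the use of OLS via statsmodels are assumptions for illustration, not the thesis's actual data or full model.

```python
# A minimal sketch: regress a disclosure-index score on firm-specific factors.
# The simulated data and coefficients are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 95  # companies with a website, as in the descriptive sample

log_firm_size = rng.normal(8.0, 1.0, n)   # e.g. log of total assets
listed = rng.integers(0, 2, n)            # 1 if stock-market listed
disclosure = (5 + 3 * log_firm_size + 10 * listed
              + rng.normal(0, 5, n))      # toy score against a 167-item index

X = sm.add_constant(np.column_stack([log_firm_size, listed]))
model = sm.OLS(disclosure, X).fit()
print(model.summary())
```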
Abstract:
Mental simulations and analogies have been identified as powerful learning tools for really new products (RNPs). Furthermore, visuals in advertising have recently been conceptualized as meaningful sources of information rather than peripheral cues, and thus may help consumers learn about RNPs. The study of visual attention may also contribute to understanding the links between conceptual and perceptual analyses when learning about an RNP. Two conceptual models are developed. The first model consists of causal relationships between the attributes of advertising stimuli for RNPs and consumer responses, as well as mediating influences. The second model focuses on the role of visual attention in product comprehension as a response to advertising stimuli. Two experiments are conducted: a Web experiment and an eye-tracking experiment. The first experiment (858 subjects) examines the effect of learning strategies (mental simulation vs. analogy vs. no analogy/no mental simulation) and presentation formats (words vs. pictures) on individual responses. The mediating role of emotions is assessed. The second experiment investigates the effect of learning strategies and presentation formats on product comprehension, along with the role of attention (17 subjects). The findings from experiment 1 indicate that learning strategies and presentation formats can either enhance or undermine the effect of advertising stimuli on individual responses. Moreover, the nature of the product (i.e. hedonic vs. utilitarian vs. hybrid) should be considered when designing communications for RNPs. The mediating role of emotions is verified. Experiment 2 suggests that an increase in attention to the message may reflect either enhanced comprehension or confusion.
Abstract:
Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced development time to market and, hence, reduced software production cost. Despite the huge potential, the lack of reasoning support and development environments for component modeling and verification may hinder its development. Methods and tools that can support component model analysis are highly appreciated by industry. Such tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale up well, as it may need to handle the hundreds or even thousands of components that a modern software system may have. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis of the component models. Moreover, we also propose a service-oriented architecture (SOA)-based Semantic Web environment for CBD. The adoption of Semantic Web services and SOA makes our component environment more reusable, scalable, dynamic and adaptive.
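A minimal sketch of the kind of machine-readable component description involved is given below, using rdflib in Python with a hypothetical vocabulary (ex:Component, ex:requires, ex:provides) and a naive consistency check standing in for a full OWL/SWRL reasoner; it is not the paper's actual ontology or toolchain.

```python
# A minimal sketch of describing a toy component model in RDF/OWL terms
# and checking one simple composition constraint. The vocabulary and the
# check are illustrative assumptions, not the paper's component model.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

EX = Namespace("http://example.org/component#")
g = Graph()
g.bind("ex", EX)

# Classes and properties of the toy component model.
g.add((EX.Component, RDF.type, OWL.Class))
g.add((EX.Interface, RDF.type, OWL.Class))
g.add((EX.provides, RDF.type, OWL.ObjectProperty))
g.add((EX.requires, RDF.type, OWL.ObjectProperty))
g.add((EX.provides, RDFS.domain, EX.Component))
g.add((EX.provides, RDFS.range, EX.Interface))

# Instances: a Logger component requires a Storage interface
# provided by a FileStore component.
g.add((EX.Logger, RDF.type, EX.Component))
g.add((EX.FileStore, RDF.type, EX.Component))
g.add((EX.Storage, RDF.type, EX.Interface))
g.add((EX.Logger, EX.requires, EX.Storage))
g.add((EX.FileStore, EX.provides, EX.Storage))

# Naive constraint check (in place of an OWL/SWRL reasoner): every
# required interface must be provided by some component in the model.
unsatisfied = [
    (c, i)
    for c, i in g.subject_objects(EX.requires)
    if (None, EX.provides, i) not in g
]
print("Unsatisfied requirements:", unsatisfied)
```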
Abstract:
The rodent ventrobasal (VB) thalamus receives sensory inputs from the whiskers and projects to the cortex, from which it receives reciprocal excitatory afferents. Much is known about the properties and functional roles of these glutamatergic inputs to thalamocortical neurons in the VB, but no data are available on how these afferents can affect thalamic glial cells. In this study, we used combined electrophysiological recordings and intracellular calcium ([Ca²⁺]ᵢ) imaging to investigate glial cell responses to synaptic afferent stimulation. VB thalamus glial cells can be divided into two groups based on their [Ca²⁺]ᵢ and electrophysiological responses to sensory and corticothalamic stimulation. One group consists of astrocytes, which stain positively for S100B and preferentially load with SR101, have linear current-voltage relations and low input resistance, show no voltage-dependent [Ca²⁺]ᵢ responses, but express mGluR5-dependent [Ca²⁺]ᵢ transients following stimulation of the sensory and/or corticothalamic excitatory afferent pathways. Cells of the other glial group, by contrast, stain positively for NG2 and are characterized by high input resistance, the presence of voltage-dependent [Ca²⁺]ᵢ elevations and voltage-gated inward currents. There were no synaptically induced [Ca²⁺]ᵢ elevations in these cells under control conditions. These results show that thalamic glial cell responses to synaptic input exhibit different properties from those of thalamocortical neurons. As VB astrocytes can respond to synaptic stimulation and signal to neighbouring neurons, this glial cell organization may have functional implications for the processing of somatosensory information and the modulation of behavioural state-dependent thalamocortical network activities.
Abstract:
This study presents a computational fluid dynamics (CFD) study of dimethyl ether (DME) gas adsorptive separation and steam reforming (DME-SR) in a large-scale circulating fluidized bed (CFB) reactor. The CFD model is based on an Eulerian-Eulerian dispersed-flow formulation and is solved using commercial software (ANSYS FLUENT). Hydrogen is currently receiving increasing interest as an alternative source of clean energy and has high-potential applications, including the transportation sector and power generation. CFD modelling has attracted considerable recognition in the engineering sector, leading to its use as a tool for process design and optimisation in many industrial processes. In most cases, these processes are difficult or expensive to study in lab-scale experiments. CFD provides a cost-effective methodology to gain detailed information down to the microscopic level. The main objectives of this project are to: (i) develop a predictive model using the ANSYS FLUENT (CFD) commercial code to simulate the flow hydrodynamics, mass transfer, reactions and heat transfer in a large-scale dual fluidized bed system for combined gas separation and steam reforming processes; (ii) implement a suitable adsorption model in the CFD code, through a user-defined function, to predict selective separation of a gas from a mixture; (iii) develop a model for dimethyl ether steam reforming (DME-SR) to predict hydrogen production; and (iv) carry out detailed parametric analysis in order to establish ideal operating conditions for future industrial application. The project originated from a real industrial case problem in collaboration with the industrial partner Dow Corning (UK) and was jointly funded by the Engineering and Physical Sciences Research Council (UK) and Dow Corning. The research examined gas separation by adsorption in a bubbling bed, as part of a dual fluidized bed system. The adsorption process was simulated based on kinetics derived from experimental data produced as part of a separate PhD project completed under the same funding. The kinetic model was incorporated in the FLUENT CFD tool as a pseudo-first-order rate equation; some of the parameters for the pseudo-first-order kinetics were obtained using MATLAB. The modelling of DME adsorption in the designed bubbling bed was performed for the first time in this project and highlights the novelty of the investigation. The simulation results were analysed to provide an understanding of the flow hydrodynamics, reactor design and optimum operating conditions for efficient separation. Validation of the bubbling bed, through estimates of bed expansion and the solid and gas distributions from the simulation, showed good agreement with trends reported in the literature. Parametric analysis of the adsorption process demonstrated that increasing the fluidizing velocity reduced the adsorption of DME. This is a result of the reduction in gas residence time, which appears to have a greater effect than the solid residence time. The removal efficiency of DME from the bed was found to be more than 88%. Simulation of the DME-SR in FLUENT CFD was conducted using selected kinetics from the literature, implemented in the model using an in-house developed user-defined function. The kinetics were validated by simulating a case replicating an experimental study of a laboratory-scale bubbling bed by Vicente et al. [1].
Good agreement was achieved in the validation of the models, which were then applied to DME-SR in the large-scale riser section of the dual fluidized bed system. This is the first study to use the selected DME-SR kinetics in a circulating fluidized bed (CFB) system and for the geometry size proposed for the project. As a result, the simulation produced the first detailed data on the spatial variation and final gas product in such an industrial-scale fluidized bed system. The simulation results provided insight into the flow hydrodynamics, reactor design and optimum operating conditions. The solid and gas distribution in the CFB showed good agreement with the literature. The parametric analysis showed that increases in temperature and in the steam-to-DME molar ratio increased the production of hydrogen owing to increased DME conversion, whereas an increase in the space velocity was found to have an adverse effect. Increasing the temperature from 200 °C to 350 °C increased DME conversion from 47% to 99%, while the hydrogen yield increased substantially from 11% to 100%. The CO2 selectivity decreased from 100% to 91% because the water-gas shift reaction favours CO at higher temperatures. The higher conversions observed as the temperature increased were reflected in the quantities of unreacted DME and methanol in the product gas, both of which decreased to very low values of 0.27 mol% and 0.46 mol% respectively at 350 °C. Increasing the steam-to-DME molar ratio from 4 to 7.68 increased the DME conversion from 69% to 87%, while the hydrogen yield increased from 40% to 59%. The CO2 selectivity decreased from 100% to 97%. Decreasing the space velocity from 37104 ml/g/h to 15394 ml/g/h increased the DME conversion from 87% to 100% while increasing the hydrogen yield from 59% to 87%. The parametric analysis suggests that the operating condition for maximum hydrogen yield is in the region of 300 °C and a steam/DME molar ratio of 5. Analysis of the industrial sponsor's case, for the given flow and composition of the gas to be treated, suggests that 88% of the DME can be adsorbed in the bubbling bed, consequently producing 224.4 t/y of hydrogen in the riser section of the dual fluidized bed system. The process also produces 1458.4 t/y of CO2 and 127.9 t/y of CO as part of the product gas. The models developed and the parametric analysis carried out in this study provide essential guidelines for the future design of DME-SR at industrial level, and this work has been of particular importance to the industrial collaborator in drawing conclusions and planning for potential future implementation of the process at an industrial scale.
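As an illustration of the pseudo-first-order adsorption kinetics mentioned in this abstract, the sketch below fits q(t) = q_e(1 - exp(-kt)) to hypothetical uptake data with SciPy; the data and fitted values are illustrative, not the project's kinetics, which were derived from the companion experimental study.

```python
# A minimal sketch of fitting a pseudo-first-order adsorption model,
# q(t) = q_e * (1 - exp(-k * t)), to hypothetical uptake data.
import numpy as np
from scipy.optimize import curve_fit

def pseudo_first_order(t, q_e, k):
    """Adsorbed amount q(t) for equilibrium capacity q_e and rate constant k."""
    return q_e * (1.0 - np.exp(-k * t))

# Hypothetical DME uptake measurements (time in s, uptake in mol per kg sorbent).
t_data = np.array([0.0, 30.0, 60.0, 120.0, 240.0, 480.0])
q_data = np.array([0.0, 0.35, 0.58, 0.82, 0.96, 1.01])

(q_e_fit, k_fit), _ = curve_fit(pseudo_first_order, t_data, q_data, p0=[1.0, 0.01])
print(f"q_e = {q_e_fit:.3f} mol/kg, k = {k_fit:.4f} 1/s")

# The resulting rate, dq/dt = k * (q_e - q), is the form a CFD user-defined
# function could evaluate per cell as an adsorption source term.
```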
Abstract:
Web APIs have gained increasing popularity in recent Web service technology development owing to their simplicity of technology stack and the proliferation of mashups. However, efficiently discovering Web APIs and the relevant documentation on the Web is still a challenging task, even with the best resources available on the Web. In this paper we cast the problem of detecting Web API documentation as a text classification problem: classifying a given Web page as Web API associated or not. We propose a supervised generative topic model called feature latent Dirichlet allocation (feaLDA), which offers a generic probabilistic framework for the automatic detection of Web APIs. feaLDA not only captures the correspondence between data and the associated class labels, but also provides a mechanism for incorporating side information, such as labelled features automatically learned from the data, that can effectively help improve classification performance. Extensive experiments on our Web API documentation dataset show that the feaLDA model outperforms three strong supervised baselines, including naive Bayes, support vector machines, and the maximum entropy model, by over 3% in classification accuracy. In addition, feaLDA also gives superior performance when compared against other existing supervised topic models.
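For context, a minimal sketch of the kind of supervised baseline the paper compares against, a naive Bayes text classifier over page text, is shown below; the example pages and labels are hypothetical, and this is not an implementation of feaLDA.

```python
# A minimal sketch of a naive Bayes baseline for classifying Web pages as
# Web API documentation or not; the toy pages and labels are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

pages = [
    "GET /v1/photos returns a JSON array; authenticate with an API key",
    "REST endpoints, OAuth tokens and rate limits are described below",
    "Our team visited the trade fair and met many partners this spring",
    "Read the latest travel blog posts and holiday photo galleries",
]
labels = [1, 1, 0, 0]  # 1 = Web API documentation page, 0 = other page

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit(pages, labels)

print(clf.predict(["POST /v2/users creates a user; see the API reference"]))
```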
Abstract:
Aims: It is well established that the bile salt sodium taurocholate acts as a germinant for Clostridium difficile spores and that the amino acid glycine acts as a co-germinant. The aim of this study was to determine whether any other amino acids act as co-germinants. Methods and Results: Clostridium difficile spore suspensions were exposed to different germinant solutions comprising taurocholate, glycine and an additional amino acid for 1 h before heat shocking (to kill germinating cells) or chilling on ice. Samples were then re-germinated and cultured to recover the remaining viable cells. Only five of the 19 common amino acids tested (valine, aspartic acid, arginine, histidine and serine) demonstrated co-germination activity with taurocholate and glycine. Of these, only histidine produced high levels of germination (97.9–99.9%) consistently in four strains of Cl. difficile spores. Some variation in the level of germination was observed between different PCR ribotypes, and the optimum concentration of amino acids with taurocholate for the germination of Cl. difficile NCTC 11204 spores was 10–100 mmol l⁻¹. Conclusions: Histidine was found to be a co-germinant for Cl. difficile spores when combined with glycine and taurocholate. Significance and Impact of the Study: The findings of this study enhance current knowledge regarding the agents required for germination of Cl. difficile spores, which may be utilized in the development of novel applications to prevent the spread of Cl. difficile infection.
Abstract:
UK universities are accepting increasing numbers of students whose L1 is not English on a wide range of programmes at all levels. These students require additional support and training in English, focussing on their academic disciplines. Corpora have been used in EAP since the 1980s, mainly for research, but a growing number of researchers and practitioners have been advocating the use of corpora in EAP pedagogy, and such use is gradually increasing. This paper outlines the processes and factors to be considered in the design and compilation of an EAP corpus (e.g., the selection and acquisition of texts, metadata, data annotation, software tools and outputs, web interface, and screen displays), especially one intended to be used for teaching. Such a corpus would also facilitate EAP research in terms of longitudinal studies, student progression and development, and course and materials design. The paper has been informed by the preparatory work on the EAP subcorpus of the ACORN corpus project at Aston University.
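A minimal sketch of how corpus texts might be stored with the kind of metadata discussed is given below; the fields and values are illustrative assumptions, not the ACORN design.

```python
# A minimal sketch of corpus texts carrying metadata that enables the
# longitudinal and progression studies mentioned above; fields are hypothetical.
from dataclasses import dataclass

@dataclass
class CorpusText:
    text_id: str
    discipline: str      # e.g. "Business", "Engineering"
    level: str           # e.g. "UG year 1", "MSc"
    genre: str           # e.g. "essay", "lab report"
    l1: str              # student's first language
    body: str

corpus = [
    CorpusText("t001", "Business", "UG year 1", "essay", "Mandarin", "..."),
    CorpusText("t002", "Engineering", "MSc", "lab report", "Arabic", "..."),
]

# Metadata allows filtering by discipline, level, genre or L1 before analysis.
business_essays = [t for t in corpus
                   if t.discipline == "Business" and t.genre == "essay"]
print([t.text_id for t in business_essays])
```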
Abstract:
In current organizations, valuable enterprise knowledge is often buried under rapidly expanding volumes of unstructured information in the form of web pages, blogs, and other forms of human text communication. We present a novel unsupervised machine learning method called CORDER (COmmunity Relation Discovery by named Entity Recognition) to turn these unstructured data into structured information for knowledge management in these organizations. CORDER exploits named entity recognition and co-occurrence data to associate individuals in an organization with their expertise and associates. We discuss the problems associated with evaluating unsupervised learners and report our initial evaluation experiments: an expert evaluation, a quantitative benchmarking, and an application of CORDER in a social networking tool called BuddyFinder.
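A minimal sketch of relation discovery from named-entity co-occurrence, in the spirit of (but far simpler than) CORDER, is shown below; the documents, names and document-level counting are hypothetical.

```python
# A minimal sketch: count document-level co-occurrences of named people to
# suggest candidate "associates". Toy data and scoring are hypothetical.
from collections import Counter
from itertools import combinations

documents = [
    "Alice Chen presented the ontology mapping results with Bob Singh.",
    "Bob Singh and Priya Patel maintain the knowledge management wiki.",
    "Alice Chen reviewed Priya Patel's named entity recognition module.",
]
people = {"Alice Chen", "Bob Singh", "Priya Patel"}

# Count how often each pair of people appears in the same document.
pair_counts = Counter()
for doc in documents:
    present = sorted(p for p in people if p in doc)
    for a, b in combinations(present, 2):
        pair_counts[(a, b)] += 1

# Pairs with the highest counts are candidate associates.
for (a, b), n in pair_counts.most_common():
    print(f"{a} -- {b}: co-occurs in {n} document(s)")
```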
Abstract:
The Semantic Web (SW) offers an opportunity to develop novel, sophisticated forms of question answering (QA). Specifically, the availability of distributed semantic markup on a large scale opens the way to QA systems that can make use of such semantic information to provide precise, formally derived answers to questions. At the same time, the distributed, heterogeneous, large-scale nature of the semantic information introduces significant challenges. In this paper we describe the design of PowerAqua, a QA system designed to exploit semantic markup on the web to provide answers to questions posed in natural language. PowerAqua does not assume that the user has any prior information about the semantic resources. The system takes a natural language query as input and translates it into a set of logical queries, which are then answered by consulting and aggregating information derived from multiple heterogeneous semantic sources.
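A minimal sketch of the final step such a system depends on, answering an already-formulated logical query against a public semantic source, is shown below using SPARQLWrapper and the DBpedia endpoint; the query is illustrative, and PowerAqua's own natural-language translation and aggregation are not shown.

```python
# A minimal sketch of consulting one public semantic source with a logical
# (SPARQL) query; the endpoint and query are illustrative, not PowerAqua's.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://dbpedia.org/sparql")
endpoint.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    PREFIX dbr: <http://dbpedia.org/resource/>
    SELECT ?abstract WHERE {
        dbr:Semantic_Web dbo:abstract ?abstract .
        FILTER (lang(?abstract) = "en")
    } LIMIT 1
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["abstract"]["value"][:200])  # print the first 200 characters
```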
Abstract:
Browsing constitutes an important part of the user's information-searching process on the Web. In this paper, we present a browser plug-in called ESpotter, which recognizes entities of various types on Web pages and highlights them according to their types to assist user browsing. ESpotter uses a range of standard named entity recognition techniques. In addition, a key new feature of ESpotter is that it addresses the problem of multiple domains on the Web by adapting its lexicon and patterns to these domains.
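A minimal sketch of the underlying idea, highlighting recognized entities in page text using per-domain lexicons/patterns, is shown below; the domains, regular expressions and markup are illustrative assumptions, not ESpotter's actual rules.

```python
# A minimal sketch of domain-adapted entity highlighting; the patterns,
# domains and HTML markup are hypothetical.
import re

DOMAIN_PATTERNS = {
    "academic.example.org": {
        "person": re.compile(r"\b(?:Prof|Dr)\.\s+[A-Z][a-z]+\b"),
        "project": re.compile(r"\b[A-Z][a-zA-Z]+ Project\b"),
    },
    "news.example.org": {
        "organisation": re.compile(r"\b[A-Z][a-zA-Z]+ (?:Ltd|Inc|plc)\b"),
    },
}

def highlight(text: str, domain: str) -> str:
    """Wrap recognized entities in <span> tags keyed by entity type."""
    for etype, pattern in DOMAIN_PATTERNS.get(domain, {}).items():
        text = pattern.sub(
            lambda m, t=etype: f'<span class="entity {t}">{m.group(0)}</span>',
            text,
        )
    return text

print(highlight("Dr. Smith leads the Acorn Project.", "academic.example.org"))
```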
Abstract:
Background: The molecular mechanisms underlying similarities and differences between physiological and pathological left ventricular hypertrophy (LVH) are of intense interest. Most previous work has involved targeted analysis of individual signaling pathways or screening of transcriptomic profiles. We developed a network biology approach using genomic and proteomic data to study the molecular patterns that distinguish pathological and physiological LVH. Methods and Results: A network-based analysis using graph theory methods was undertaken on 127 genome-wide expression arrays of in vivo murine LVH. This revealed phenotype-specific pathological and physiological gene coexpression networks. Despite >1650 common genes in the two networks, the network structures are significantly different. This is largely because of rewiring of genes that are differentially coexpressed in the two networks; this novel concept of differential wiring was further validated experimentally. Functional analysis of the rewired network revealed several distinct cellular pathways and gene sets. Deeper exploration was undertaken by targeted proteomic analysis of mitochondrial, myofilament, and extracellular subproteomes in pathological LVH. A notable finding was that mRNA-protein correlation was greater at the cellular pathway level than for individual loci. Conclusions: This first combined gene network and proteomic analysis of LVH reveals novel insights into the integrated pathomechanisms that distinguish pathological versus physiological phenotypes. In particular, we identify differential gene wiring as a major distinguishing feature of these phenotypes. This approach provides a platform for the investigation of potentially novel pathways in LVH and offers a freely accessible protocol (http://sites.google.com/site/cardionetworks) for similar analyses in other cardiovascular diseases.
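A minimal sketch of building two coexpression networks and comparing their edges (the differential-wiring idea) is shown below; the random expression matrices, correlation cutoff and gene labels are hypothetical and do not reproduce the study's analysis.

```python
# A minimal sketch of gene coexpression networks and "differential wiring":
# connect genes whose expression correlates, then compare edge sets between
# two phenotypes. Data, threshold and labels are hypothetical.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
genes = [f"gene{i}" for i in range(30)]

def coexpression_network(expr, threshold=0.8):
    """Connect gene pairs whose expression correlation exceeds the threshold."""
    corr = np.corrcoef(expr)              # genes x genes correlation matrix
    g = nx.Graph()
    g.add_nodes_from(genes)
    for i in range(len(genes)):
        for j in range(i + 1, len(genes)):
            if abs(corr[i, j]) > threshold:
                g.add_edge(genes[i], genes[j])
    return g

# Hypothetical expression matrices (genes x arrays) for two phenotypes.
patho = coexpression_network(rng.normal(size=(30, 60)))
physio = coexpression_network(rng.normal(size=(30, 60)))

# Edges present in one network but not the other are the "rewired"
# (differentially coexpressed) gene pairs.
rewired = set(patho.edges()) ^ set(physio.edges())
print(f"{len(rewired)} differentially wired gene pairs")
```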
Abstract:
Disasters cause widespread harm and disrupt the normal functioning of society, and effective management requires the participation and cooperation of many actors. While advances in information and networking technology have made the transmission of data easier than ever before, communication and coordination of activities between actors remain exceptionally difficult. This paper employs Semantic Web technology and Linked Data principles to create a network of intercommunicating and inter-dependent on-line sites for managing resources. Each site publishes its available resources openly, and a lightweight open-data protocol is used to issue and respond to requests for resources between sites in the network.
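A minimal sketch of the kind of lightweight exchange such a protocol implies, one site publishing a resource request and a peer answering with an offer, is shown below; the message fields and URLs are illustrative assumptions, not the paper's actual protocol.

```python
# A minimal sketch of a resource request/offer exchange between sites;
# field names, URLs and the JSON encoding are hypothetical.
import json
from dataclasses import dataclass, asdict

@dataclass
class ResourceRequest:
    resource_type: str      # e.g. "blankets", "water", "4x4 vehicle"
    quantity: int
    location: str           # where the resource is needed
    requesting_site: str    # URI of the site issuing the request

@dataclass
class ResourceOffer:
    request_id: str
    offering_site: str
    quantity_available: int

request = ResourceRequest("blankets", 500, "Riverside shelter",
                          "https://site-a.example.org")
print(json.dumps(asdict(request)))  # would be sent to each peer site

offer = ResourceOffer("req-001", "https://site-b.example.org", 350)
print(json.dumps(asdict(offer)))    # a peer's response to the request
```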