847 results for Combined Web crippling and Flange Crushing
Abstract:
Ontologies have become the knowledge representation medium of choice in recent years for a range of computer science specialities including the Semantic Web, Agents, and Bio-informatics. There has been a great deal of research and development in this area combined with hype and reaction. This special issue is concerned with the limitations of ontologies and how these can be addressed, together with a consideration of how we can circumvent or go beyond these constraints. The introduction places the discussion in context and presents the papers included in this issue.
Abstract:
The main argument of this paper is that Natural Language Processing (NLP) does, and will continue to, underlie the Semantic Web (SW), including its initial construction from unstructured sources like the World Wide Web (WWW), whether its advocates realise this or not. Chiefly, we argue, such NLP activity is the only way up to a defensible notion of meaning at conceptual levels (in the original SW diagram) based on lower level empirical computations over usage. Our aim is definitely not to claim logic-bad, NLP-good in any simple-minded way, but to argue that the SW will be a fascinating interaction of these two methodologies, again like the WWW (which has been basically a field for statistical NLP research) but with deeper content. Only NLP technologies (and chiefly information extraction) will be able to provide the requisite RDF knowledge stores for the SW from existing unstructured text databases in the WWW, and in the vast quantities needed. There is no alternative at this point, since a wholly or mostly hand-crafted SW is also unthinkable, as is a SW built from scratch and without reference to the WWW. We also assume that, whatever the limitations on current SW representational power we have drawn attention to here, the SW will continue to grow in a distributed manner so as to serve the needs of scientists, even if it is not perfect. The WWW has already shown how an imperfect artefact can become indispensable.
Abstract:
Background & aims It has been suggested that retinal lutein may improve visual acuity for images that are illuminated by white light. Our aim was to determine the effect of a lutein and antioxidant dietary supplement on visual function. Methods A prospective, 9- and 18-month, double-masked randomised controlled trial. For the 9-month trial, 46 healthy participants were randomised (using a random number generator) to placebo (n=25) or active (n=21) groups. Twenty-nine of these subjects went on to complete 18 months of supplementation, 15 from the placebo group and 14 from the active group. The active group supplemented daily with 6 mg lutein combined with vitamins and minerals. Outcome measures were distance and near visual acuity, contrast sensitivity, and photostress recovery time. The study had 80% power at the 5% significance level for each outcome measure. Data were collected at baseline, 9, and 18 months. Results There were no statistically significant differences between groups for any of the outcome measures over 9 or 18 months. Conclusion There was no evidence of an effect of 9 or 18 months of daily supplementation with a lutein-based nutritional supplement on visual function in this group of people with healthy eyes. ISRCTN78467674.
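The 80% power / 5% significance figure quoted above comes from a standard two-group sample-size calculation. A minimal sketch of that calculation (normal approximation; the effect size and standard deviation below are illustrative, not the study's actual planning values):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Sample size per group for a two-sample comparison of means
    (normal approximation, two-sided test)."""
    z = NormalDist()                  # standard normal
    z_a = z.inv_cdf(1 - alpha / 2)    # ~1.96 for alpha = 0.05
    z_b = z.inv_cdf(power)            # ~0.84 for 80% power
    return ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

# Illustrative only: detect a 0.3 log-unit change with SD 0.3
n = n_per_group(delta=0.3, sigma=0.3)
```

With these illustrative inputs the formula gives 16 participants per group; the study's own calculation would have used its specific outcome measures and variability estimates.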
Abstract:
Objective: The aim of the study is to determine the effect of lutein combined with vitamin and mineral supplementation on contrast sensitivity in people with age-related macular disease (ARMD). Design: A prospective, 9-month, double-masked randomized controlled trial. Setting: Aston University, Birmingham, UK and a UK optometric clinical practice. Subjects: Age-related maculopathy (ARM) and atrophic age-related macular degeneration (AMD) participants were randomized (using a random number generator) to either placebo (n=10) or active (n=15) groups. Three of the placebo group and two of the active group dropped out. Interventions: The active group supplemented daily with 6 mg lutein combined with vitamins and minerals. The outcome measure was contrast sensitivity (CS) measured using the Pelli-Robson chart, for which the study had 80% power at the 5% significance level to detect a change of 0.3 log units. Results: The CS score increased by 0.07 ± 0.07 and decreased by 0.02 ± 0.18 log units for the placebo and active groups, respectively. The difference between these values is not statistically significant (z = 0.903, P = 0.376). Conclusion: The results suggest that 6 mg of lutein supplementation in combination with other antioxidants is not beneficial for this group. Further work is required to establish optimum dosage levels.
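The between-group comparison reported above (z = 0.903, P = 0.376) is a standard two-sample test on summary statistics. A hedged sketch of that kind of test; the call plugs in the abstract's group changes and completer numbers, but the study's exact analysis may have differed (e.g. in its handling of variances or baseline adjustment), so the resulting numbers are illustrative:

```python
from math import sqrt
from statistics import NormalDist

def two_sample_z(m1, s1, n1, m2, s2, n2):
    """z-test for a difference in means from summary statistics."""
    se = sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)   # standard error of the difference
    z = (m1 - m2) / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value
    return z, p

# Placebo: +0.07 +/- 0.07 (7 completers); active: -0.02 +/- 0.18 (13 completers)
z, p = two_sample_z(0.07, 0.07, 7, -0.02, 0.18, 13)
```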
Abstract:
When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either not available or not even possible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system - both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty.
The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interacting between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision-making process.
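A SHELF-style elicitation typically ends with fitting a parametric distribution to the experts' consensus judgements. A minimal sketch under the assumption of a normal fit to elicited quartiles (SHELF itself supports several distribution families, and the quartile values here are hypothetical):

```python
from statistics import NormalDist

def fit_normal_to_quartiles(q25, q50, q75):
    """Fit a normal distribution to elicited quartiles.
    The median gives mu; the interquartile range gives sigma."""
    z75 = NormalDist().inv_cdf(0.75)      # ~0.6745, the 75th-percentile z-score
    mu = q50
    sigma = (q75 - q25) / (2 * z75)
    return mu, sigma

# Hypothetical elicited judgements about an uncertain model input
mu, sigma = fit_normal_to_quartiles(q25=8.0, q50=10.0, q75=12.0)
```

In practice the fitted distribution, together with the elicitation metadata, would then be serialised (e.g. as UncertML) for use in downstream workflows.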
Abstract:
The aim of this work has been to investigate the behaviour of a continuous rotating annular chromatograph (CRAC) under a combined biochemical reaction and separation duty. Two biochemical reactions have been employed, namely the inversion of sucrose to glucose and fructose in the presence of the enzyme invertase and the saccharification of liquefied starch to maltose and dextrin using the enzyme maltogenase. Simultaneous biochemical reaction and separation has been successfully carried out for the first time in a CRAC by inverting sucrose to fructose and glucose using the enzyme invertase and continuously collecting pure fractions of glucose and fructose from the base of the column. The CRAC was made of two concentric cylinders which form an annulus 140 cm long by 1.2 cm wide, giving an annular space of 14.5 dm3. The ion exchange resin used was an industrial grade calcium form Dowex 50W-X4 with a mean diameter of 150 microns. The mobile phase used was deionised and deaerated water and contained the appropriate enzyme. The annular column was slowly rotated at speeds of up to 240° h-1 while the sucrose substrate was fed continuously through a stationary feed pipe to the top of the resin bed. A systematic investigation of the factors affecting the performance of the CRAC under simultaneous biochemical reaction and separation conditions was carried out by employing a factorial experimental procedure. The main factors affecting the performance of the system were found to be the feed rate, feed concentration and eluent rate. Results from the experiments indicated that complete conversion could be achieved for feed concentrations of up to 50% w/v sucrose and at feed throughputs of up to 17.2 kg sucrose per m3 resin/h. The second enzymic reaction, namely the saccharification of liquefied starch to maltose employing the enzyme maltogenase, has also been successfully carried out on a CRAC.
Results from the experiments using soluble potato starch showed that conversions of up to 79% were obtained for a feed concentration of 15.5% w/v at a feed flow rate of 400 cm3/h. The product maltose obtained was over 95% pure. Mathematical modelling and computer simulation of the sucrose inversion system have been carried out. A finite difference method was used to solve the partial differential equations, and the simulation results showed good agreement with the experimental results obtained.
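The abstract notes that the sucrose-inversion model was solved by finite differences. A heavily simplified sketch of that style of scheme: explicit upwind differencing of plug flow with first-order reaction along the column. The thesis model also includes dispersion, adsorption equilibria and column rotation, none of which is represented here, and all parameters are illustrative:

```python
import numpy as np

def simulate_reaction_column(k=0.5, u=1.0, L=1.4, nz=140, dt=0.001, t_end=5.0):
    """Explicit upwind finite-difference solution of
    dC/dt + u*dC/dz = -k*C: first-order conversion of substrate
    as it moves down the column (a gross simplification of the CRAC model)."""
    dz = L / nz
    C = np.zeros(nz)                       # substrate concentration along z
    for _ in range(int(t_end / dt)):
        C_in = 1.0                         # continuous feed at the top
        upwind = np.empty_like(C)
        upwind[0] = (C[0] - C_in) / dz
        upwind[1:] = (C[1:] - C[:-1]) / dz
        C = C + dt * (-u * upwind - k * C)
    return C

C = simulate_reaction_column()
conversion = 1.0 - C[-1]   # fraction of feed converted at the outlet
```

At steady state this reduces to C(z) = exp(-k z / u), so the outlet conversion can be checked against the analytical profile.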
Abstract:
The work presented in this thesis is divided into two distinct sections. In the first, the functional neuroimaging technique of magnetoencephalography (MEG) is described and a new technique is introduced for accurate combination of MEG and MRI co-ordinate systems. In the second part of this thesis, MEG and the analysis technique of SAM are used to investigate responses of the visual system in the context of functional specialisation within the visual cortex. In chapter one, the sources of MEG signals are described, followed by a brief description of the instrumentation necessary for accurate MEG recordings. The chapter concludes by introducing the forward and inverse problems of MEG, techniques to solve the inverse problem, and a comparison of MEG with other neuroimaging techniques. Chapter two provides an important contribution to the field of research with MEG. Firstly, it is described how MEG and MRI co-ordinate systems are combined for localisation and visualisation of activated brain regions. A previously used co-registration method is then described, and a new technique is introduced. In a series of experiments, it is demonstrated that using fixed fiducial points provides a considerable improvement in the accuracy and reliability of co-registration. Chapter three introduces the visual system, starting from the retina and ending with the higher visual areas. The functions of the magnocellular and the parvocellular pathways are described and it is shown how the parallel visual pathways remain segregated throughout the visual system. The structural and functional organisation of the visual cortex is then described. Chapter four presents strong evidence in favour of the link between conscious experience and synchronised brain activity. The spatiotemporal responses of the visual cortex are measured in response to specific gratings.
It is shown that stimuli that induce visual discomfort and visual illusions share their physical properties with those that induce highly synchronised gamma frequency oscillations in the primary visual cortex. Finally, chapter five is concerned with the localisation of colour processing in the visual cortex. In this first-ever use of Synthetic Aperture Magnetometry to investigate colour processing in the visual cortex, it is shown that in response to isoluminant chromatic gratings, the highest magnitude of cortical activity arises from area V2.
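The MEG/MRI co-registration described in chapter two is, at its core, a rigid-body alignment of fiducial points between two coordinate frames. A sketch of the standard SVD-based (Kabsch) solution on three hypothetical fiducials; the thesis's fixed-fiducial protocol and its error analysis are not reproduced here:

```python
import numpy as np

def rigid_align(A, B):
    """Rotation R and translation t mapping point set A onto B
    (least-squares rigid-body fit via SVD; the Kabsch algorithm)."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)                # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cB - R @ cA
    return R, t

# Hypothetical fiducials (nasion, left/right pre-auricular) in two frames:
# the "MRI" frame is the "MEG" frame rotated 90 degrees about z and shifted
meg = np.array([[0.0, 10.0, 0.0], [-7.0, 0.0, 0.0], [7.0, 0.0, 0.0]])
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
mri = meg @ R_true.T + np.array([1.0, 2.0, 3.0])

R, t = rigid_align(meg, mri)
aligned = meg @ R.T + t
```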
Abstract:
The aim of this research is to promote the use of G.R.P. as a structural material. In the past, the use of G.R.P. has been confined to non-load carrying applications. Such uses are still rapidly increasing, but in addition significant changes have been made during the last decade in the development of semi-structural and now even fully structural applications. Glass-reinforced plastic is characterized by a high strength but a relatively low modulus of elasticity. For this reason a G.R.P. structure can be expected to show large deformations, as a result of which the individual structural members will fail under load due to a loss of stability rather than by approaching the ultimate strength of the material. For this reason the selection of the geometrical shapes of G.R.P. structural elements is considered to be an important factor in designing G.R.P. structures. The first chapter of this thesis deals with a general review of the theoretical and experimental methods used to describe the structural properties of G.R.P. The research programme includes five stages dealing with the structural behaviour of G.R.P. The first stage (Chapter 2) begins with selecting and designing an optimum box beam cross-section which gives the maximum flexural and torsional rigidity. The second stage of investigation (Chapter 3) deals with beam-to-beam connections. A joint was designed and manufactured with different types of fasteners used to connect two beam units. A suitable fastener was selected and the research extended to cover the behaviour of long span beams using multiple joints. The third part of the investigation includes a study of the behaviour of box beams subjected to combined bending, shear and torsion. A special torque rig was developed to perform the tests. Creep deformation of a 6 m span G.R.P. beam was investigated as the fourth stage under a range of loading conditions.
As a result of the post-buckling behaviour exhibited in the compression flange during the testing of box beams in earlier stages of the investigation, it was decided to consider this phenomenon in more detail in the final stage. G.R.P. plates with different fibre orientations were subjected to uniaxial compression and tested up to failure. In all stages of the investigation theoretical predictions and experimental results were compared, and generally good correlation between theory and experimental data was observed.
Abstract:
Reported in this thesis are test results of 37 eccentrically prestressed beams with stirrups. Single-variable parameters were investigated, including the prestressing force, the prestressing steel area, the concrete strength, the aspect ratio h/b, and the stirrup size and spacing. Interaction of bending, torsion and shear was also investigated by testing a series of beams subjected to varying bending/torsional moment ratios. For the torsional strength an empirical expression of linear format is proposed, which can be rearranged in a non-dimensional interaction form: T/To + V/Vo + M/Mo + Ps/Po + Fs/Fo = Pc²/Fsp. This formula, which is based on an average experimental steel stress lower than the yield point, is compared with 243 prestressed beams containing stirrups, including the author's test beams, and good agreement is obtained. For the theoretical analysis of the problem of torsion combined with bending and shear in concrete beams with stirrups, the method of torque-friction is proposed and developed using an average steel stress. A general linear interaction equation for combined torsion with bending and/or shear is proposed in the following format: φ(T/Tu) = 1, where φ is a combined loading factor to modify the pure ultimate strength for differing cases of torsion with bending and/or shear. From the analysis of 282 reinforced and prestressed concrete beams containing stirrups, including the present investigation, good agreement is obtained between the method and the test results. It is concluded that the proposed method provides a rational and simple basis for predicting the ultimate torsional strength and may also be developed for design purposes.
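The proposed interaction format φ(T/Tu) = 1 lends itself to a simple numerical check. The sketch below assumes a purely linear combination of the bending and shear ratios for φ; the actual terms and coefficients in the thesis's expression are not reproduced, so every value here is illustrative:

```python
def combined_loading_utilization(T, M, V, Tu, Mu, Vu):
    """Hypothetical linear interaction check in the spirit of the
    phi*(T/Tu) = 1 format: phi scales the pure torsional demand
    to account for coexisting bending and shear.
    The linear form of phi is an assumption, not the thesis's formula."""
    phi = 1.0 + (M / Mu) + (V / Vu)   # assumed combined loading factor
    return phi * (T / Tu)             # failure predicted when this reaches 1

# A member at half its pure torsional capacity with modest bending and shear
utilization = combined_loading_utilization(T=5.0, M=20.0, V=10.0,
                                           Tu=10.0, Mu=100.0, Vu=50.0)
```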
Abstract:
This thesis investigates corporate financial disclosure practices on Web sites and their impact. This is done, first, by examining the views of various Saudi user groups (institutional investors, financial analysts and private investors) on disclosure of financial reporting on the Internet and assessing differences, if any, in the perceptions of the groups. In total, 303 individuals from the three groups responded to a questionnaire. Views were elicited regarding: users' attitudes to the Internet infrastructure in Saudi Arabia, users' information sources about companies in Saudi Arabia, respondents' perceptions of the advantages and disadvantages of Internet financial reporting (IFR), respondents' attitudes to the quality of IFR provided by Saudi public companies, and the impact of IFR on users' information needs. Overall, it was found that the professional groups (institutional investors, financial analysts) hold similar views in relation to many issues, while the opinions of private investors differ considerably. Second, the thesis examines the use of the Internet for the disclosure of financial and investor-related information by Saudi public companies (113 companies) and looks to identify reasons for the differences in the online disclosure practices of companies by testing the association between eight firm-specific factors and the level of online disclosure. A financial disclosure index (167 items) is used to measure public company disclosure in Saudi Arabia. The descriptive part of the study reveals that 95 (84%) of the Saudi public companies in the sample had a website and 51 (45%) had a financial information section of some description. Furthermore, none of the sample companies provided 100% of the 167 index items applicable to the company. Results of multivariate analysis show that firm size and stock market listing are significant explanatory variables for the amount of information disclosed on corporate Web sites.
The thesis finds a significant and negative relationship between the proportion of institutional ownership of a company's shares and the level of IFR.
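The multivariate analysis described above regresses a disclosure score on firm-specific factors. A hedged sketch with entirely synthetic data; the variable definitions and index construction are assumptions for illustration, not the thesis's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 113                                   # number of sample companies
firm_size = rng.normal(10.0, 1.5, n)      # e.g. log total assets (synthetic)
listed = rng.integers(0, 2, n)            # stock-market listing dummy
inst_own = rng.uniform(0.0, 0.6, n)       # institutional ownership share

# Synthetic disclosure score out of a 167-item index, built so that
# size and listing raise disclosure and institutional ownership lowers it
score = 40 + 6 * firm_size + 15 * listed - 30 * inst_own + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), firm_size, listed, inst_own])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)   # OLS coefficients
```

With real data, the sign and significance of each coefficient would be the quantities of interest, e.g. the negative coefficient on institutional ownership reported in the abstract.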
Abstract:
Mental simulations and analogies have been identified as powerful learning tools for really new products (RNPs). Furthermore, visuals in advertising have recently been conceptualized as meaningful sources of information, as opposed to peripheral cues, and thus may help consumers learn about RNPs. The study of visual attention may also contribute to understanding the links between conceptual and perceptual analyses when learning about an RNP. Two conceptual models are developed. The first model consists of causal relationships between the attributes of advertising stimuli for RNPs and consumer responses, as well as mediating influences. The second model focuses on the role of visual attention in product comprehension as a response to advertising stimuli. Two experiments are conducted: a Web experiment and an eye-tracking experiment. The first experiment (858 subjects) examines the effect of learning strategies (mental simulation vs. analogy vs. no analogy/no mental simulation) and presentation formats (words vs. pictures) on individual responses. The mediating role of emotions is assessed. The second experiment investigates the effect of learning strategies and presentation formats on product comprehension, along with the role of attention (17 subjects). The findings from experiment 1 indicate that learning strategies and presentation formats can either enhance or undermine the effect of advertising stimuli on individual responses. Moreover, the nature of the product (i.e. hedonic vs. utilitarian vs. hybrid) should be considered when designing communications for RNPs. The mediating role of emotions is verified. Experiment 2 suggests that an increase in attention to the message may reflect either enhanced comprehension or confusion.
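Assessing "the mediating role of emotions" is commonly done with regression-based mediation (the product-of-coefficients approach). A sketch on synthetic data; the variables and effect sizes are invented, and the study's actual mediation procedure is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 858                                           # matching experiment 1's sample size
stimulus = rng.integers(0, 2, n).astype(float)    # e.g. mental simulation vs. control
emotion = 0.5 * stimulus + rng.normal(0, 1, n)    # synthetic mediator
response = 0.6 * emotion + 0.2 * stimulus + rng.normal(0, 1, n)

def ols(y, *cols):
    """OLS coefficients (intercept first) via least squares."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(emotion, stimulus)[1]             # stimulus -> mediator path
b = ols(response, stimulus, emotion)[2]   # mediator -> response path (controlling for stimulus)
indirect = a * b                          # mediated (indirect) effect
```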
Abstract:
Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced time to market and, hence, reduced software production cost. Despite the huge potential, the lack of reasoning support and development environments for component modeling and verification may hinder its development. Methods and tools that can support component model analysis are highly appreciated by industry. Such tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale up well, as it may need to handle the hundreds or even thousands of components that a modern software system may have. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis of the component models. Moreover, we also propose a service-oriented architecture (SOA)-based Semantic Web environment for CBD. The adoption of Semantic Web services and SOA makes our component environment more reusable, scalable, dynamic and adaptive.
Abstract:
The rodent ventrobasal (VB) thalamus receives sensory inputs from the whiskers and projects to the cortex, from which it receives reciprocal excitatory afferents. Much is known about the properties and functional roles of these glutamatergic inputs to thalamocortical neurons in the VB, but no data are available on how these afferents can affect thalamic glial cells. In this study, we used combined electrophysiological recordings and intracellular calcium ([Ca(2+)](i)) imaging to investigate glial cell responses to synaptic afferent stimulation. VB thalamus glial cells can be divided into two groups based on their [Ca(2+)](i) and electrophysiological responses to sensory and corticothalamic stimulation. One group consists of astrocytes, which stain positively for S100B and preferentially load with SR101, have linear current-voltage relations and low input resistance, show no voltage-dependent [Ca(2+)](i) responses, but express mGluR5-dependent [Ca(2+)](i) transients following stimulation of the sensory and/or corticothalamic excitatory afferent pathways. Cells of the other glial group, by contrast, stain positively for NG2, and are characterized by high input resistance, the presence of voltage-dependent [Ca(2+)](i) elevations and voltage-gated inward currents. There were no synaptically induced [Ca(2+)](i) elevations in these cells under control conditions. These results show that thalamic glial cell responses to synaptic input exhibit different properties to those of thalamocortical neurons. As VB astrocytes can respond to synaptic stimulation and signal to neighbouring neurons, this glial cell organization may have functional implications for the processing of somatosensory information and modulation of behavioural state-dependent thalamocortical network activities.
Abstract:
This study presents a computational fluid dynamics (CFD) study of dimethyl ether (DME) gas adsorptive separation and steam reforming (DME-SR) in a large scale circulating fluidized bed (CFB) reactor. The CFD model is based on an Eulerian-Eulerian dispersed flow approach and solved using commercial software (ANSYS FLUENT). Hydrogen is currently receiving increasing interest as an alternative source of clean energy and has high-potential applications, including the transportation sector and power generation. CFD modelling has attracted considerable recognition in the engineering sector, consequently leading to its use as a tool for process design and optimisation in many industrial processes. In most cases, these processes are difficult or expensive to investigate in lab scale experiments. CFD provides a cost effective methodology to gain detailed information down to the microscopic level. The main objectives of this project are to: (i) develop a predictive model using the ANSYS FLUENT (CFD) commercial code to simulate the flow hydrodynamics, mass transfer, reactions and heat transfer in a large scale dual fluidized bed system for combined gas separation and steam reforming processes; (ii) implement a suitable adsorption model in the CFD code, through a user defined function, to predict selective separation of a gas from a mixture; (iii) develop a model for dimethyl ether steam reforming (DME-SR) to predict hydrogen production; (iv) carry out detailed parametric analysis in order to establish ideal operating conditions for future industrial application. The project originated from a real industrial case problem in collaboration with the industrial partner Dow Corning (UK) and was jointly funded by the Engineering and Physical Sciences Research Council (UK) and Dow Corning. The research examined gas separation by adsorption in a bubbling bed, as part of a dual fluidized bed system.
The adsorption process was simulated based on kinetics derived from experimental data produced as part of a separate PhD project completed under the same funding. The kinetic model was incorporated into the FLUENT CFD tool as a pseudo-first order rate equation; some of the parameters for the pseudo-first order kinetics were obtained using MATLAB. The modelling of DME adsorption in the designed bubbling bed was performed for the first time in this project and highlights the novelty of the investigation. The simulation results were analysed to provide understanding of the flow hydrodynamics, reactor design and optimum operating conditions for efficient separation. Validation of the bubbling bed by estimation of bed expansion and of the solid and gas distribution from simulation agreed well with trends seen in the literature. Parametric analysis of the adsorption process demonstrated that increasing the fluidizing velocity reduced adsorption of DME. This is a result of the reduction in gas residence time, which appears to have a much greater effect than the solid residence time. The removal efficiency of DME from the bed was found to be more than 88%. Simulation of the DME-SR in FLUENT CFD was conducted using selected kinetics from the literature, implemented in the model using an in-house developed user defined function. Validation of the kinetics was achieved by simulating a case replicating an experimental study of a laboratory scale bubbling bed by Vicente et al [1]. Good agreement was achieved in the validation of the models, which were then applied to the DME-SR in the large scale riser section of the dual fluidized bed system. This is the first study to use the selected DME-SR kinetics in a circulating fluidized bed (CFB) system and for the geometry size proposed for the project. As a result, the simulation produced the first detailed data on the spatial variation and final gas product in such an industrial scale fluidized bed system.
The simulation results provided insight into the flow hydrodynamics, reactor design and optimum operating conditions. The solid and gas distribution in the CFB was observed to show good agreement with the literature. The parametric analysis showed that increases in temperature and in the steam to DME molar ratio increased the production of hydrogen due to increased DME conversion, whereas an increase in the space velocity was found to have an adverse effect. Increasing the temperature from 200 °C to 350 °C increased DME conversion from 47% to 99%, while the hydrogen yield increased substantially from 11% to 100%. The CO2 selectivity decreased from 100% to 91% due to the water gas shift reaction favouring CO at higher temperatures. The higher conversions observed as the temperature increased were reflected in the quantities of unreacted DME and methanol in the product gas, where both decreased to very low values of 0.27 mol% and 0.46 mol% respectively at 350 °C. Increasing the steam to DME molar ratio from 4 to 7.68 increased the DME conversion from 69% to 87%, while the hydrogen yield increased from 40% to 59%. The CO2 selectivity decreased from 100% to 97%. Decreasing the space velocity from 37104 ml/g/h to 15394 ml/g/h increased the DME conversion from 87% to 100% while increasing the hydrogen yield from 59% to 87%. The parametric analysis suggests that the operating condition for maximum hydrogen yield is in the region of 300 °C and a steam/DME molar ratio of 5. The analysis of the industrial sponsor's case, for the given flow and composition of the gas to be treated, suggests that 88% of the DME can be adsorbed from the bubbling bed, consequently producing 224.4 t/y of hydrogen in the riser section of the dual fluidized bed system. The process also produces 1458.4 t/y of CO2 and 127.9 t/y of CO as part of the product gas.
The models developed and the parametric analysis carried out in this study provide essential guidelines for the future design of DME-SR plants at industrial scale; in particular, this work has been of tremendous importance to the industrial collaborator in drawing conclusions and planning for potential future implementation of the process at an industrial scale.
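The pseudo-first order rate equation used for the adsorption kinetics has the general form dq/dt = k(qe − q). A minimal sketch integrating it explicitly and checking against the closed-form solution; the rate constant and equilibrium loading are illustrative, not the fitted values from the project:

```python
import numpy as np

def pseudo_first_order(q_e=2.0, k=0.8, dt=0.01, t_end=10.0):
    """Explicit-Euler integration of dq/dt = k*(q_e - q): the adsorbed
    amount q approaches the equilibrium loading q_e at a rate set by k.
    Analytical solution: q(t) = q_e * (1 - exp(-k*t))."""
    t = np.arange(0.0, t_end + dt, dt)
    q = np.empty_like(t)
    q[0] = 0.0
    for i in range(1, len(t)):
        q[i] = q[i - 1] + dt * k * (q_e - q[i - 1])
    return t, q

t, q = pseudo_first_order()
analytic = 2.0 * (1.0 - np.exp(-0.8 * t))   # closed form for the same parameters
```

In the CFD model this rate law would be evaluated per cell inside a user defined function rather than integrated stand-alone as here.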
Abstract:
Web APIs have gained increasing popularity in recent Web service technology development owing to their simplicity of technology stack and the proliferation of mashups. However, efficiently discovering Web APIs and the relevant documentation on the Web is still a challenging task, even with the best resources available on the Web. In this paper we cast the problem of detecting Web API documentation as a text classification problem: classifying a given Web page as Web API associated or not. We propose a supervised generative topic model called feature latent Dirichlet allocation (feaLDA), which offers a generic probabilistic framework for automatic detection of Web APIs. feaLDA not only captures the correspondence between data and the associated class labels, but also provides a mechanism for incorporating side information, such as labelled features automatically learned from data, that can effectively help improve classification performance. Extensive experiments on our Web API documentation dataset show that the feaLDA model outperforms three strong supervised baselines (naive Bayes, support vector machines, and the maximum entropy model) by over 3% in classification accuracy. In addition, feaLDA also gives superior performance when compared against other existing supervised topic models.
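One of the baselines feaLDA is compared against, naive Bayes text classification, can be sketched in a few lines with scikit-learn on a toy stand-in for the Web API documentation dataset (the pages and labels below are entirely synthetic):

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny synthetic stand-in for pages labelled as Web API docs (1) or not (0)
pages = [
    "GET /v1/users returns JSON response with an API key header",
    "REST endpoint documentation: POST /orders, authentication token",
    "API reference: rate limits, OAuth, JSON payload examples",
    "Our holiday photo gallery from the summer trip",
    "Recipe blog: how to bake sourdough bread at home",
    "Local football club announces new season fixtures",
]
labels = [1, 1, 1, 0, 0, 0]

# Bag-of-words features feeding a multinomial naive Bayes classifier
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(pages, labels)
pred = clf.predict(["endpoint returns JSON, send your API token"])[0]
```

feaLDA's contribution over this kind of baseline is the joint topic/label model and its use of labelled features as side information, which a plain bag-of-words classifier has no mechanism for.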