916 results for Bonding interface analysis
Abstract:
An implementation of a Lexical Functional Grammar (LFG) natural language front-end to a database is presented, and its capabilities are demonstrated by reference to a set of queries used in the Chat-80 system. The potential of LFG for such applications is explored. Other grammars previously used for this purpose are briefly reviewed and contrasted with LFG. The basic LFG formalism is fully described, both its syntax and its semantics, and the deficiencies of the latter for database access applications are shown. Other current LFG implementations are reviewed and contrasted with the LFG implementation developed here specifically for database access. The implementation described here allows a natural language interface to a specific Prolog database to be produced from a set of grammar-rule and lexical specifications in an LFG-like notation. In addition, the interface system uses a simple database description to compile metadata about the database for later use in planning the execution of queries. Extensions to LFG's semantic component are shown to be necessary to produce a satisfactory functional analysis and semantic output for querying a database. A diverse set of natural language constructs is analysed using LFG, and the derivation of Prolog queries from the F-structure output of LFG is illustrated. The functional description produced by LFG is proposed as sufficient for resolving many problems of quantification and attachment.
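As an illustration of the final step described above, the sketch below maps a toy F-structure to a Prolog-style query. The dictionary encoding, the predicate names, and the example query ("Which countries border France?") are illustrative assumptions, not the thesis's actual grammar or implementation.

```python
# Minimal sketch (not the thesis's implementation): deriving a Prolog-style
# query from a simplified F-structure for "Which countries border France?".
# The F-structure encoding and predicate names are illustrative assumptions.

def fstructure_to_prolog(fstruct):
    """Map a flat F-structure (dict) to a Prolog query string."""
    pred = fstruct["PRED"]                      # main predicate, e.g. 'border'
    subj = fstruct["SUBJ"]
    obj = fstruct["OBJ"]
    goals = []
    # An interrogative SUBJ with a noun restriction becomes a typed variable.
    if subj.get("SPEC") == "wh":
        goals.append(f"{subj['PRED']}(X)")      # e.g. country(X)
        subj_term = "X"
    else:
        subj_term = subj["PRED"]
    obj_term = obj["PRED"].lower()              # proper noun as a Prolog atom
    goals.append(f"{pred}({subj_term}, {obj_term})")
    return ", ".join(goals) + "."

query = fstructure_to_prolog({
    "PRED": "border",
    "SUBJ": {"PRED": "country", "SPEC": "wh"},
    "OBJ": {"PRED": "france"},
})
print(query)   # country(X), border(X, france).
```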
Abstract:
The aim of this research was to improve the quantitative support for project planning and control, principally through more accurate forecasting, for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network-based risk analysis (PERT). The former, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and in practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
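For readers unfamiliar with growth-curve forecasting, the sketch below fits a generic cumulative cubic cost curve to invented monthly data by least squares. It illustrates the general idea only; it is not the DHSS formula nor the new forecasting techniques developed in the study.

```python
# Minimal sketch: fitting a generic cumulative cubic cost curve to project cost
# data by least squares. This illustrates the idea of growth-curve forecasting
# only; it is not the DHSS model or the thesis's new techniques, and the data
# values are invented for illustration.
import numpy as np

t = np.linspace(0.1, 1.0, 10)                      # fraction of project duration elapsed
cumulative_cost = np.array([2, 6, 14, 25, 39, 55, 70, 83, 93, 100.0])  # % of final cost

# Fit cumulative cost as a cubic in normalised time (no intercept: cost starts at zero).
A = np.vstack([t, t**2, t**3]).T
coeffs, *_ = np.linalg.lstsq(A, cumulative_cost, rcond=None)

def forecast(fraction_elapsed):
    """Predicted cumulative cost (% of final) at a given fraction of duration."""
    x = np.array([fraction_elapsed, fraction_elapsed**2, fraction_elapsed**3])
    return float(coeffs @ x)

print(forecast(0.5))   # forecast at mid-project
```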
Abstract:
The work described in this thesis is directed towards the reduction of tyre/road interface noise and embodies a study of the factors involved in its generation. These factors comprise: (a) materials and construction of tyres and road surfaces; (b) the spectral distribution of the noise. The importance of this work has become greater with the reduction in engine noise. A review of the literature shows what has been achieved so far, and stresses the importance of maintaining other desirable tyre properties such as adhesion in wet conditions. The work has involved an analysis of mechanical factors in tyre construction and the behaviour of road surfaces. Measurements of noise have been carried out under practical conditions and also on replica surfaces in the laboratory, and in addition tests of wet road adhesion have been carried out with a variety of road surfaces. Consideration has been given to the psychological effects of the spectral distribution of noise. A major part of the work undertaken has been the development of a computer program, the results of which have made it possible to design a tyre tread block pattern to give an optimum spectral distribution. Sample tyres built to this design have been subjected to noise measurements, which have been shown to agree closely with the theoretical prediction, and other properties of this tyre have proved to be satisfactory.
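The sketch below hints at how a tread-block pitch sequence shapes the noise spectrum, by treating each block impact as an identical impulse and taking a Fourier transform. The pitch sequence, speed and impulse model are assumptions for illustration; this is not the thesis's design program.

```python
# Minimal sketch: estimating the noise spectrum implied by a tread-block pitch
# sequence, assuming each block impact is an identical impulse. This only
# illustrates the idea of shaping the spectral distribution by varying pitch
# lengths; the pitch sequence and speed are invented.
import numpy as np

pitches_mm = np.array([28, 31, 25, 33, 27, 30, 26, 32] * 9)   # varied pitch lengths around the tyre
speed_mm_s = 80 / 3.6 * 1000                                   # 80 km/h in mm/s

impact_times = np.cumsum(pitches_mm) / speed_mm_s              # one impulse per block edge
fs = 44100
signal = np.zeros(int(impact_times[-1] * fs) + 1)
signal[(impact_times * fs).astype(int)] = 1.0                  # impulse train

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]                      # skip the DC component
print(f"dominant tonal component near {peak:.0f} Hz")
```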
Abstract:
Anchorage-dependent cell culture is a useful model for investigating the interface that becomes established when a synthetic polymer is placed in contact with a biological system. The primary aim of this interdisciplinary study was to systematically investigate a number of properties that were already considered to have an influence on cell behaviour and thereby establish the extent of their importance. It is envisaged that investigations such as these will not only further the understanding of the mechanisms that affect cell adhesion but may ultimately lead to the development of improved biomaterials. In this study, surface analysis of materials was carried out in parallel with culture studies using fibroblast cells. Polarity, in its ability to undergo hydrogen bonding (e.g. with water and proteins), had an important effect on cell behaviour, although structural arrangement and crystallinity were not found to exert any marked influence. In addition, the extent of oxidation that had occurred during the manufacture of substrates was also important. The treatment of polystyrene with a selected series of acids and gas plasmas confirmed the importance of polarity, structural groups and surface charge, and it was shown that this polymer was not unique among 'hydrophobic' materials in its inability to support cell adhesion. The individual water-structuring groups within hydrogel polymers were also observed to have controlling effects on cell behaviour. An overall view of the biological response to both hydrogel and non-hydrogel materials highlighted the importance of surface oxidation, polarity, water-structuring groups and surface charge. Initial steps were also taken to analyse foetal calf serum, which is widely used to supplement cell culture media. Using an array of analytical techniques, further experiments were carried out to observe any possible differences in the amounts of lipids and calcium deposited on tissue culture and bacteriological grade plastic under cell culture conditions.
Abstract:
The present thesis is located within the framework of descriptive translation studies and critical discourse analysis. Modern translation studies have increasingly taken into account the complexities of power relations and ideological management involved in the production of translations. Paradoxically, persuasive political discourse has not been much touched upon, except for studies following functional (e.g. Schäffner 2002) or systemic-linguistic approaches (e.g. Calzada Pérez 2001). By taking 11 English translations of Hitler’s Mein Kampf as prime examples, the thesis aims to contribute to a better understanding of the translation of politically sensitive texts. Actors involved in political discourse are usually more concerned with the emotional appeal of their message than they are with its factual content. When such political discourse becomes the locus of translation, it may equally be crafted rhetorically, being used as a tool to persuade. It is thus the purpose of the thesis to describe subtle ‘persuasion strategies’ in institutionally translated political discourse. The subject of the analysis is an illustrative corpus of four full-text translations, two abridgements, and five extract translations of Mein Kampf. Methodologically, the thesis pursues a top-down approach. It begins by delineating sociocultural and situative-agentive conditions as causal factors impinging on the individual translations. Such interactive and interpersonal factors determined textual choices. The overall textual analysis consists of an interrelated corpus-driven and corpus-based approach. It demonstrates how corpus software can be fruitfully harnessed to discern ‘ideological significations’ in the translated texts. Altogether, the thesis investigates how translational decision-makers attempted to position the source text author and his narrative in line with overall rhetorical purposes.
Abstract:
The densities of diffuse, primitive, and classic β-amyloid (Aβ) deposits were studied in the temporal lobe in cognitively normal brain, dementia with Lewy bodies (DLB), familial Alzheimer’s disease (FAD), and sporadic AD (SAD). Principal components analysis (PCA) was used to determine whether there were distinct differences between groups or whether Aβ pathology was more continuously distributed from group to group. Three principal components (PC) were extracted from the data, accounting for 56% of the total variance. Plots of cases in relation to the PC did not result in distinct groups but suggested overlap in Aβ deposition between the groups. In addition, there were linear correlations between the densities of Aβ deposits and the distribution of the cases along the PC in specific brain regions, suggesting continuous variation from group to group. PC1 was associated with the degree of maturation of Aβ deposits, PC2 with differences between FAD and SAD, and PC3 with the degree of spread of Aβ pathology into the hippocampus. Apolipoprotein E (APOE) genotype was not associated with variation in Aβ deposition between cases. PCA may be a useful method of studying the pathological interface between closely related neurodegenerative disorders.
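The sketch below shows the kind of principal components analysis described above, applied to an invented matrix of regional Aβ-deposit densities, and reports the variance explained by the first three components. The data layout and values are assumptions; this is not the study's dataset.

```python
# Minimal sketch: principal components analysis of regional amyloid-deposit
# densities, reporting how much variance the first three components explain.
# The data layout (cases x regional densities) and the values are assumed
# for illustration only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
densities = rng.gamma(shape=2.0, scale=3.0, size=(60, 12))   # 60 cases, 12 region/deposit-type densities

pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(densities))

print(pca.explained_variance_ratio_.sum())   # fraction of total variance captured by PC1-PC3
print(scores[:5])                            # case positions along PC1-PC3, ready for plotting
```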
Abstract:
The standard reference clinical score quantifying average Parkinson's disease (PD) symptom severity is the Unified Parkinson's Disease Rating Scale (UPDRS). At present, UPDRS is determined by the subjective clinical evaluation of the patient's ability to adequately cope with a range of tasks. In this study, we extend recent findings that UPDRS can be objectively assessed to clinically useful accuracy using simple, self-administered speech tests, without requiring the patient's physical presence in the clinic. We apply a wide range of known speech signal processing algorithms to a large database (approx. 6000 recordings from 42 PD patients, recruited to a six-month, multi-centre trial) and propose a number of novel, nonlinear signal processing algorithms which reveal pathological characteristics in PD more accurately than existing approaches. Robust feature selection algorithms select the optimal subset of these algorithms, which is fed into non-parametric regression and classification algorithms, mapping the signal processing algorithm outputs to UPDRS. We demonstrate rapid, accurate replication of the UPDRS assessment with clinically useful accuracy (about 2 UPDRS points difference from the clinicians' estimates, p < 0.001). This study supports the viability of frequent, remote, cost-effective, objective, accurate UPDRS telemonitoring based on self-administered speech tests. This technology could facilitate large-scale clinical trials into novel PD treatments.
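A minimal sketch of the overall pipeline, feature selection followed by non-parametric regression onto UPDRS with the error reported in UPDRS points, is given below. The synthetic features and the choice of LASSO-based selection with a random forest regressor are assumptions for illustration, not the algorithms used in the study.

```python
# Minimal sketch: mapping speech-derived features to UPDRS with feature
# selection and non-parametric regression, reporting the mean absolute error.
# The synthetic features and the LASSO + random forest choices are
# illustrative assumptions, not the study's algorithms.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LassoCV
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 40))                     # 600 recordings x 40 dysphonia features
updrs = 30 + X[:, :5] @ rng.uniform(1, 3, 5) + rng.normal(scale=2, size=600)

model = make_pipeline(
    SelectFromModel(LassoCV(cv=5)),                # keep the most informative features
    RandomForestRegressor(n_estimators=300, random_state=0),
)
pred = cross_val_predict(model, X, updrs, cv=5)
print(np.mean(np.abs(pred - updrs)))               # mean absolute error in UPDRS points
```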
Abstract:
This thesis provides a set of tools for managing uncertainty in Web-based models and workflows. To support the use of these tools, this thesis firstly provides a framework for exposing models through Web services. An introduction to uncertainty management, Web service interfaces, and workflow standards and technologies is given, with a particular focus on the geospatial domain. An existing specification for exposing geospatial models and processes, the Web Processing Service (WPS), is critically reviewed. A processing service framework is presented as a solution to usability issues with the WPS standard. The framework implements support for Simple Object Access Protocol (SOAP), Web Service Description Language (WSDL) and JavaScript Object Notation (JSON), allowing models to be consumed by a variety of tools and software. Strategies for communicating with models from Web service interfaces are discussed, demonstrating the difficulty of exposing existing models on the Web. This thesis then reviews existing mechanisms for uncertainty management, with an emphasis on emulator methods for building efficient statistical surrogate models. A tool is developed to solve accessibility issues with such methods, by providing a Web-based user interface and backend to ease the process of building and integrating emulators. These tools, plus the processing service framework, are applied to a real case study as part of the UncertWeb project. The usability of the framework is demonstrated with the implementation of a Web-based workflow for predicting future crop yields in the UK, also demonstrating the abilities of the tools for emulator building and integration. Future directions for the development of the tools are discussed.
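The sketch below illustrates the general idea of exposing a model through a JSON-over-HTTP Web service. The use of Flask, the /crop_yield route and the toy yield model are assumptions for illustration; this is not the thesis's processing service framework or the UncertWeb workflow.

```python
# Minimal sketch: exposing a simple model through a JSON-over-HTTP endpoint.
# This only illustrates the general idea of wrapping a model as a Web service;
# the use of Flask, the /crop_yield route, and the toy model are assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

def crop_yield_model(rainfall_mm, temperature_c):
    """Toy surrogate model returning a yield estimate with a crude uncertainty band."""
    mean = 2.0 + 0.004 * rainfall_mm + 0.05 * temperature_c
    return {"yield_t_per_ha": mean, "std": 0.3}

@app.route("/crop_yield", methods=["POST"])
def crop_yield():
    inputs = request.get_json()
    result = crop_yield_model(inputs["rainfall_mm"], inputs["temperature_c"])
    return jsonify(result)

if __name__ == "__main__":
    app.run(port=5000)   # POST {"rainfall_mm": 600, "temperature_c": 14} to /crop_yield
```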
Abstract:
Developers of interactive software are confronted by an increasing variety of software tools to help engineer the interactive aspects of software applications. Not only do these tools fall into different categories in terms of functionality, but within each category there is a growing number of competing tools with similar, although not identical, features. Choice of user interface development tool (UIDT) is therefore becoming increasingly complex.
Abstract:
Impedance spectroscopy (IS) analysis is carried out to investigate the electrical properties of a metal-oxide-semiconductor (MOS) structure fabricated on hydrogen-terminated single crystal diamond. Al2O3 deposited by low-temperature atomic layer deposition is employed as the insulator in the MOS structure. By numerically analysing the impedance of the MOS structure at various biases, the equivalent circuit of the diamond MOS structure is derived; it is composed of two parallel resistance-capacitance pairs in series with both a resistance and an inductance. The two capacitive components result from the insulator, the hydrogenated diamond surface, and their interface. Physical parameters such as the insulator capacitance are obtained while circumventing the series resistance and inductance effects. By comparing the IS and capacitance-voltage measurements, the frequency dispersion of the capacitance-voltage characteristic is discussed.
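A sketch of the stated equivalent circuit, two parallel resistance-capacitance pairs in series with a resistance and an inductance, is given below. The component values are invented, but fitting such a model to measured spectra is how parameters such as the insulator capacitance would be extracted.

```python
# Minimal sketch: impedance of the equivalent circuit described above, i.e. two
# parallel RC pairs in series with a resistance and an inductance. Component
# values are invented for illustration.
import numpy as np

def circuit_impedance(freq_hz, Rs, L, R1, C1, R2, C2):
    """Complex impedance Z(f) = Rs + jwL + R1/(1 + jwR1C1) + R2/(1 + jwR2C2)."""
    jw = 1j * 2 * np.pi * freq_hz
    z_rc1 = R1 / (1 + jw * R1 * C1)
    z_rc2 = R2 / (1 + jw * R2 * C2)
    return Rs + jw * L + z_rc1 + z_rc2

freqs = np.logspace(2, 7, 200)                               # 100 Hz to 10 MHz
Z = circuit_impedance(freqs, Rs=50.0, L=1e-7,
                      R1=1e5, C1=5e-10, R2=2e4, C2=2e-9)
print(abs(Z[0]), np.degrees(np.angle(Z[0])))                 # |Z| and phase at 100 Hz
```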
Abstract:
Aim: To identify the incidence of vitreomacular traction (VMT) and the frequency of reduced vision in the absence of other coexisting macular pathology, using a pragmatic classification system for VMT in a population of patients referred to the hospital eye service. Methods: A detailed survey of consecutive optical coherence tomography (OCT) scans was carried out in a high-throughput ocular imaging service to ascertain cases of vitreomacular adhesion (VMA) and VMT using a departmental classification system. Analysis covered the stages of traction, visual acuity, and association with other macular conditions. Results: In total, 4384 OCT scan episodes of 2223 patients were performed. Two hundred and fourteen eyes had VMA/VMT, with 112 eyes having coexisting macular pathology. Of 102 patients without coexisting pathology, 57 patients had a VMT grade between 2 and 8, with a negative correlation between VMT grade and number of Snellen lines (r = -0.61717). There was a distinct cutoff in visual function when the VMT grade was higher than 4, with the presence of cysts, subretinal separation, and breaks in the retinal layers. Conclusions: VMT is a common finding, often associated with other coexisting macular pathology. We estimated an incidence rate of 0.01% for VMT cases with reduced vision and without coexisting macular pathology that may potentially benefit from intervention. Grading of VMT to select eyes with cyst formation as well as hole formation may be useful for targeting patients at higher risk of visual loss from VMT.
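The sketch below shows the correlation computation reported in the results, Pearson's r between VMT grade and number of Snellen lines, on invented paired values; the study itself reported r ≈ -0.62 across 57 patients.

```python
# Minimal sketch: correlating VMT grade with visual acuity (number of Snellen
# lines) using Pearson's r. The paired values are invented for illustration.
from scipy.stats import pearsonr

vmt_grade     = [2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8]
snellen_lines = [9, 8, 9, 7, 7, 6, 5, 6, 4, 5, 3, 4, 2, 3]

r, p_value = pearsonr(vmt_grade, snellen_lines)
print(f"r = {r:.2f}, p = {p_value:.3g}")   # a negative r indicates worse acuity at higher grades
```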
Abstract:
This work is devoted to the development of a computer-aided system for semantic analysis of technical specification text. Its purpose is to increase the efficiency of software engineering by automating the semantic analysis of technical specifications. A model for analysing technical specification text is proposed and investigated; an attribute grammar of the technical specification, intended to formalise a restricted subset of Russian, is constructed for analysing the sentences of the specification text; the stylistic features of technical specifications as a class of documents are considered; and recommendations on preparing specification text for automated processing are formulated. The computer-aided system for semantic analysis of technical specification text is described. It consists of the following subsystems: preliminary text processing, syntactic and semantic analysis with construction of software models, document storage, and the interface.
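A minimal sketch of a pipeline mirroring the listed subsystems (preliminary text processing, syntactic and semantic analysis with model construction, and storage) is given below. The class names and the trivial processing steps are illustrative assumptions, not the system described in the abstract.

```python
# Minimal sketch: a pipeline mirroring the subsystems listed above
# (preliminary text processing -> syntactic/semantic analysis -> model
# construction -> storage). Names and processing steps are assumptions.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    sentence: str
    tokens: list = field(default_factory=list)
    relations: list = field(default_factory=list)   # (subject, action, object) triples

def preprocess(text):
    """Split specification text into candidate requirement sentences."""
    return [Requirement(s.strip()) for s in text.split(".") if s.strip()]

def analyse(req):
    """Very crude 'semantic analysis': tokenise and extract a subject-verb-object triple."""
    req.tokens = req.sentence.split()
    if len(req.tokens) >= 3:
        req.relations.append((req.tokens[0], req.tokens[1], " ".join(req.tokens[2:])))
    return req

def store(requirements, repository):
    """Document storage stand-in: append analysed requirements to a repository list."""
    repository.extend(requirements)

repository = []
spec = "The system shall log every transaction. The interface must support JSON export."
store([analyse(r) for r in preprocess(spec)], repository)
print(repository[0].relations)
```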
Abstract:
In 1962, D. June Sutor published the first crystallographic analysis of C–H…O hydrogen bonding based on a selection of structures then known. Her follow-up paper the next year cited more structures and provided more details, but her ideas met with formidable opposition. This review begins by describing knowledge of C–H…O hydrogen bonding available at the time from physico-chemical and spectroscopic studies. By comparison of structures cited by Sutor with modern redeterminations, the soundness of her basic data set is assessed. The plausibility of the counter-arguments against her is evaluated. Finally, her biographical details are presented along with consideration of factors that might have impeded the acceptance of her work. © 2012 Taylor & Francis.
Abstract:
This work is devoted to the development of a computer-aided system for semantic analysis of technical specification text. Its purpose is to increase the efficiency of software engineering by automating the semantic analysis of technical specifications. A technique for analysing technical specification text is proposed and investigated; an expanded fuzzy attribute grammar of the technical specification, intended to formalise a restricted subset of the Russian language, is constructed for analysing the sentences of the specification text; the stylistic features of technical specifications as a class of documents are considered; and recommendations on preparing specification text for automated processing are formulated. The computer-aided system for semantic analysis of technical specification text is described. It consists of the following subsystems: preliminary text processing, syntactic and semantic analysis with construction of software models, document storage, and the interface.
Abstract:
In 1972 the ionized cluster beam (ICB) deposition technique was introduced as a new method for thin film deposition. At that time the use of clusters was postulated to enhance film nucleation and adatom surface mobility, resulting in high quality films. Although a few researchers reported singly ionized clusters containing 10²–10³ atoms, others were unable to repeat their work. The consensus now is that film effects in the early investigations were due to self-ion bombardment rather than clusters. Subsequently, in recent work (early 1992), synthesis of large clusters of zinc without the use of a carrier gas was demonstrated by Gspann and repeated in our laboratory. Clusters resulted from very significant changes in two source parameters. Crucible pressure was increased from the earlier 2 Torr to several thousand Torr, and a converging-diverging nozzle 18 mm long and 0.4 mm in diameter at the throat was used in place of the 1 mm x 1 mm nozzle used in the early work. While this is practical for zinc and other high vapor pressure materials, it remains impractical for many materials of industrial interest such as gold, silver, and aluminum. The work presented here describes results using gold and silver at pressures of around 1 and 50 Torr in order to study the effect of the pressure and nozzle shape. Significant numbers of large clusters were not detected. Deposited films were studied by atomic force microscopy (AFM) for roughness analysis, and by X-ray diffraction. Nanometer-size islands of zinc deposited on flat silicon substrates by ICB were also studied by atomic force microscopy, and the number of atoms/cm² was calculated and compared to data from Rutherford backscattering spectrometry (RBS). To improve the agreement between data from AFM and RBS, convolution and deconvolution algorithms were implemented to study and simulate the interaction between tip and sample in atomic force microscopy. The deconvolution algorithm takes into account the physical volume occupied by the tip, resulting in an image that is a more accurate representation of the surface. One method increasingly used to study the deposited films, both during and after growth, is ellipsometry. Ellipsometry is a surface analytical technique used to determine the optical properties and thickness of thin films. In situ measurements can be made through the windows of a deposition chamber. A method for determining the optical properties of a film that is sensitive only to the growing film and accommodates underlying interfacial layers, multiple unknown underlayers, and other unknown substrates was developed. This method is carried out by making an initial ellipsometry measurement well past the real interface and by defining a virtual interface in the vicinity of this measurement.
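The sketch below illustrates the standard morphological approach to AFM tip-sample deconvolution, reconstructing a surface by grey-scale erosion of the measured image with the tip shape. The parabolic tip model and the synthetic island image are assumptions for illustration; this is not necessarily the algorithm implemented in the thesis.

```python
# Minimal sketch: tip-sample deconvolution of AFM images by grey-scale erosion
# of the measured image with the tip shape (the standard morphological
# approach). The parabolic tip and the synthetic island are assumptions.
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def parabolic_tip(radius_px=5, curvature=0.4):
    """Height map of an inverted parabolic tip apex, zero at the centre."""
    y, x = np.mgrid[-radius_px:radius_px + 1, -radius_px:radius_px + 1]
    return -curvature * (x**2 + y**2).astype(float)

# Synthetic surface: a single nanometre-scale island on a flat substrate.
surface = np.zeros((64, 64))
surface[28:36, 28:36] = 10.0

tip = parabolic_tip()
measured = grey_dilation(surface, structure=tip)        # imaging broadens features by the tip shape
reconstructed = grey_erosion(measured, structure=tip)   # erosion with the same tip undoes part of that

print(int((measured > 5).sum()), int((reconstructed > 5).sum()))  # apparent vs corrected island area (px)
```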