40 results for derivation
in Aston University Research Archive
Abstract:
This research involves a study of the questions "what is considered safe", how safety levels are defined or decided, and according to whom. Tolerable or acceptable risk questions raise various issues: about the values and assumptions inherent in such levels; about decision-making frameworks at the highest level of policy making as well as at the individual level; and about the suitability and competency of decision-makers to decide and to communicate their decisions. The wide-ranging topics, covering philosophical and practical concerns, examined in the literature review reveal the multi-disciplinary scope of this research. To support this theoretical study, empirical research was undertaken at the European Space Research and Technology Centre (ESTEC) of the European Space Agency (ESA). ESTEC is a large, multinational, high-technology organisation which presented an ideal case study for exploring how decisions are made with respect to safety, from a personal as well as an organisational perspective. A qualitative methodology was employed to gather, analyse and report the findings of this research. Significant findings reveal how experts perceive risks, and the prevalence of informal decision-making processes, partly due to the inadequacy of formal methods for deciding risk tolerability. In the field of occupational health and safety, this research has highlighted the importance of, and need for, criteria to decide whether a risk is great enough to warrant attention in setting standards and priorities for risk control and resources. From a wider perspective, and with the recognition that risk is an inherent part of life, the establishment of risk tolerability levels can be viewed as a cornerstone indicating our progress, expectations and values, of life and work, in an increasingly litigious, knowledgeable and global society.
Abstract:
In efficiency studies using the stochastic frontier approach, the main focus is to explain inefficiency in terms of some exogenous variables and to compute the marginal effects of each of these determinants. Although inefficiency is estimated by its mean conditional on the composed error term (the Jondrow et al., 1982 estimator), the marginal effects are computed from the unconditional mean of inefficiency (Wang, 2002). In this paper, we derive the marginal effects based on the Jondrow et al. estimator and use the bootstrap method to compute confidence intervals for the marginal effects.
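A minimal sketch of the bootstrap confidence-interval idea referred to above, assuming a generic setting: the marginal-effect statistic of the paper is stood in for here by a simple regression slope, so names and data are illustrative only, not the authors' estimator.

import numpy as np

def bootstrap_ci(data, statistic, n_boot=1000, alpha=0.05, seed=None):
    """Percentile bootstrap confidence interval for an arbitrary statistic of the data rows."""
    rng = np.random.default_rng(seed)
    n = len(data)
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        sample = data[rng.integers(0, n, size=n)]   # resample rows with replacement
        estimates[b] = statistic(sample)
    lower, upper = np.quantile(estimates, [alpha / 2, 1 - alpha / 2])
    return statistic(data), (lower, upper)

# Illustrative use: a slope coefficient as a stand-in for a marginal effect.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 0.5 * x + rng.normal(scale=0.3, size=200)
data = np.column_stack([x, y])
slope = lambda d: np.polyfit(d[:, 0], d[:, 1], 1)[0]
print(bootstrap_ci(data, slope, seed=1))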
Abstract:
This paper proposes a semantic analysis of the French free-choice indefinite 'n’importe qui'. The semantics of the indefinite is organised as a ternary structure. The (1) abstract meaning underlies all uses of the item and acts as a principle of creative interpretation generation and comprehension. This principle is actualised via (2) discrete contextual features through to (3) contextual interpretations. Thus, the "existential" reading of 'n’importe qui' is derived by a veridical reading of the arbitrary selection of a qualitatively-marked occurrence from the set of human animates. The derivation of contextual readings from the enrichment by contextual cues of an underspecified meaning has a claim to being an explanatory model of the semantics of polysemous grammatical items, and is certainly relevant to model-theoretic approaches inasmuch as formal semantic notions are intricately linked to the contextual interpretation of items. It is not 'n’importe qui' itself, but its contextual interpretations, which may be weak or strong, and a homonymous treatment is not possible given the continuity of the quality and free-choice dimensions from one observed reading of 'n’importe qui' to the next.
Abstract:
This paper examines the status of scalarity in the analysis of the meaning of the English determiner 'any'. The latter's position as a prime exemplar of the category of polarity-sensitive items has led it to be generally assumed to have scalar meaning. Scalar effects are, however, absent from a number of common uses of this word. This suggests that 'any' does not involve scales as part of its core meaning, but produces them as a derived interpretative property. The role of three factors in the derivation of the expressive effect of scalarity is explored: grammatical number, stress, and the presence of gradable concepts in the NP. The general conclusions point to the importance of developing a causal semantic analysis in which the contributions of each of the various meaningful components of an utterance to the overall message expressed are carefully distinguished.
Abstract:
On-line learning is examined for the radial basis function network, an important and practical type of neural network. The evolution of generalization error is calculated within a framework which allows the phenomena of the learning process, such as the specialization of the hidden units, to be analyzed. The distinct stages of training are elucidated, and the role of the learning rate is described. The three most important stages of training, the symmetric phase, the symmetry-breaking phase, and the convergence phase, are analyzed in detail; the convergence phase analysis allows the derivation of maximal and optimal learning rates. As well as finding the evolution of the mean system parameters, the variances of these parameters are derived and shown to be typically small. Finally, the analytic results are strongly confirmed by simulations.
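A minimal sketch of the on-line (one example per update) training setting described above, under assumed simplifications: a student radial basis function network with Gaussian units and adaptable centres learns a teacher-defined rule by stochastic gradient descent. This is not the paper's analytical framework, only an illustration of the learning dynamics it studies.

import numpy as np

rng = np.random.default_rng(0)
N, K = 8, 3                          # input dimension, number of hidden units
centres = rng.normal(size=(K, N))    # student centres
weights = rng.normal(size=K)         # student output weights
teacher_c = rng.normal(size=(K, N))  # teacher network defining the target rule
teacher_w = rng.normal(size=K)
eta = 0.05                           # learning rate

def forward(x, c, w):
    phi = np.exp(-0.5 * np.sum((c - x) ** 2, axis=1))   # unit-width Gaussian units
    return phi @ w, phi

for step in range(50000):
    x = rng.normal(size=N)                               # one fresh example per step (on-line)
    target, _ = forward(x, teacher_c, teacher_w)
    y, phi = forward(x, centres, weights)
    err = y - target
    grad_w = err * phi                                   # gradient w.r.t. output weights
    grad_c = err * (weights * phi)[:, None] * (x - centres)  # gradient w.r.t. centres
    weights -= eta * grad_w
    centres -= eta * grad_c

# Rough generalization-error estimate on fresh inputs after training
xs = rng.normal(size=(2000, N))
errs = [(forward(x, centres, weights)[0] - forward(x, teacher_c, teacher_w)[0]) ** 2 for x in xs]
print("approx. generalization error:", 0.5 * np.mean(errs))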
Abstract:
This technical report builds on previous reports to derive the likelihood and its derivatives for a Gaussian Process with a covariance function based on the modified Bessel function. The full derivation is shown. The likelihood (with gradient information) can be used in maximum likelihood procedures (i.e., gradient-based optimisation) and in Hybrid Monte Carlo sampling (i.e., within a Bayesian framework).
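A sketch of the general likelihood-plus-gradient structure used in such procedures, with assumptions: a squared-exponential covariance stands in for the modified Bessel covariance treated in the report, and the hyperparameter names are illustrative.

import numpy as np

def gp_nll_and_grad(X, y, lengthscale, noise=1e-2):
    """Negative log marginal likelihood of a GP and its gradient w.r.t. the lengthscale."""
    n = len(y)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K_se = np.exp(-0.5 * d2 / lengthscale ** 2)          # squared-exponential kernel matrix
    K = K_se + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    nll = 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * n * np.log(2 * np.pi)
    dK = K_se * d2 / lengthscale ** 3                    # dK / d(lengthscale)
    K_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(n)))
    grad = -0.5 * np.trace((np.outer(alpha, alpha) - K_inv) @ dK)
    return nll, grad

# Toy usage: the (nll, grad) pair is what a gradient-based optimiser or HMC sampler consumes.
X = np.linspace(0, 1, 30)[:, None]
y = np.sin(6 * X[:, 0]) + 0.1 * np.random.default_rng(0).normal(size=30)
print(gp_nll_and_grad(X, y, lengthscale=0.2))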
Abstract:
This report outlines the derivation and application of a non-zero-mean, polynomial-exponential covariance function based Gaussian process, which forms the prior wind field model used in 'autonomous' disambiguation. It is used principally because the non-zero mean permits the computation of realistic local wind vector prior probabilities, which are required when applying the scaled-likelihood trick, as the marginals of the full wind field prior. As the full prior is multivariate normal, these marginals are very simple to compute.
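An illustrative sketch of the last point only: for a multivariate normal prior, the marginal over any subset of components is read off directly from the corresponding entries of the mean vector and covariance matrix. The wind-field-specific construction of the report is not reproduced here.

import numpy as np

def mvn_marginal(mean, cov, idx):
    """Marginal mean and covariance of the components listed in idx."""
    idx = np.asarray(idx)
    return mean[idx], cov[np.ix_(idx, idx)]

# Toy full prior over four scalar components
mean = np.array([1.0, 2.0, 0.5, -0.3])
cov = 0.05 + np.diag([0.2, 0.3, 0.1, 0.4])
mu_01, cov_01 = mvn_marginal(mean, cov, [0, 1])   # marginal over the first two components
print(mu_01, cov_01)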
Abstract:
Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps - problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This study shows LFA application in three service processes in one hospital. This very limited population sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance. It may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws. There is an absence of an integrated approach which can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.
Abstract:
Healthcare services available these days deploy high technology to satisfy both internal and external customers by continuously improving various quality parameters. Quality improvement in healthcare services is a complex and multidimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, there is an absence of an integrated approach which can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement and evaluate those solutions. This study introduces an integrated and uniform quality management framework for healthcare services. This study uses Logical Framework Analysis (LFA) to improve the performance of healthcare services. LFA has three major steps - problem identification, solution derivation, and formation of a planning matrix for implementation and evaluation. LFA has been applied in a case study environment to three acute healthcare services (Operating Room (OR) utilisation, Accident and Emergency (A&E) and intensive care) in order to demonstrate its effectiveness. Copyright © 2007 Inderscience Enterprises Ltd.
Abstract:
Using methods of statistical physics, we study the average number and kernel size of general sparse random matrices over GF(q), with a given connectivity profile, in the thermodynamic limit of large matrices. We introduce a mapping of GF(q) matrices onto spin systems using the representation of the cyclic group of order q as the q-th complex roots of unity. This representation facilitates the derivation of the average kernel size of random matrices using the replica approach, under the replica-symmetric ansatz, resulting in saddle-point equations for general connectivity distributions. Numerical solutions are then obtained for particular cases by population dynamics. Similar techniques also allow us to obtain an expression for the exact and average number of random matrices for any general connectivity profile. We present numerical results for particular distributions.
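A brute-force numerical counterpart to the quantity studied above, not the replica method itself: the kernel dimension of a random sparse matrix over GF(q) can be checked directly by Gaussian elimination for prime q. The connectivity model below (independent Bernoulli entries) is an assumed simplification.

import numpy as np

def rank_gf(A, q):
    """Rank of an integer matrix over GF(q), q prime, by row reduction."""
    A = A.copy() % q
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if A[i, c]), None)
        if pivot is None:
            continue
        A[[r, pivot]] = A[[pivot, r]]
        inv = pow(int(A[r, c]), q - 2, q)       # modular inverse via Fermat's little theorem
        A[r] = (A[r] * inv) % q
        for i in range(rows):
            if i != r and A[i, c]:
                A[i] = (A[i] - A[i, c] * A[r]) % q
        r += 1
    return r

q, n, m, density = 3, 40, 30, 0.1
rng = np.random.default_rng(0)
kernel_dims = []
for _ in range(200):
    A = (rng.random((m, n)) < density) * rng.integers(1, q, size=(m, n))
    kernel_dims.append(n - rank_gf(A, q))       # kernel dimension = n - rank
print("average kernel dimension:", np.mean(kernel_dims))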
Abstract:
An efficient Bayesian inference method for problems that can be mapped onto dense graphs is presented. The approach is based on message passing, where messages are averaged over a large number of replicated variable systems exposed to the same evidential nodes. An assumption about the symmetry of the solutions is required for carrying out the averages; here we extend the previous derivation based on a replica-symmetric (RS)-like structure to include a more complex one-step replica-symmetry-breaking-like (1RSB-like) ansatz. To demonstrate the potential of the approach, it is employed for studying critical properties of the Ising linear perceptron and for multiuser detection in code division multiple access (CDMA) under different noise models. Results obtained under the RS assumption in the noncritical regime give rise to a highly efficient signal detection algorithm in the context of CDMA, while in the critical regime one observes a first-order transition line that ends in a continuous phase transition point. Finite-size effects are also observed. While the 1RSB ansatz is not required for the original problems, it was applied to the CDMA signal detection problem with a more complex noise model that exhibits RSB behavior, resulting in an improvement in performance. © 2007 The American Physical Society.
Abstract:
We examined the effects on extinction of grouping by collinearity of edges and grouping by alignment of the internal axes of shapes, in a patient (GK) with simultanagnosia following bilateral parietal brain damage. GK’s visual extinction was reduced when items (equilateral triangles and angles) could be grouped by base alignment (i.e., collinearity) or by axis alignment, relative to a condition in which items were ungrouped. These grouping effects disappeared when inter-item spacing was increased, even though factors such as display symmetry remained constant. Overall, the results suggest that, under some conditions, grouping by alignment of axes of symmetry can have a beneficial effect on visual extinction equal to that of edge-based grouping; thus, in the extinguished field, there is derivation of axis-based representations from the contours present.
Abstract:
An implementation of a Lexical Functional Grammar (LFG) natural language front-end to a database is presented, and its capabilities are demonstrated by reference to a set of queries used in the Chat-80 system. The potential of LFG for such applications is explored. Other grammars previously used for this purpose are briefly reviewed and contrasted with LFG. The basic LFG formalism is fully described, both as to its syntax and its semantics, and the deficiencies of the latter for database access applications are shown. Other current LFG implementations are reviewed and contrasted with the LFG implementation developed here specifically for database access. The implementation described here allows a natural language interface to a specific Prolog database to be produced from a set of grammar rule and lexical specifications in an LFG-like notation. In addition, the interface system uses a simple database description to compile metadata about the database for later use in planning the execution of queries. Extensions to LFG's semantic component are shown to be necessary to produce a satisfactory functional analysis and semantic output for querying a database. A diverse set of natural language constructs is analysed using LFG, and the derivation of Prolog queries from the F-structure output of LFG is illustrated. The functional description produced by LFG is proposed as sufficient for resolving many problems of quantification and attachment.
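A purely illustrative toy of the F-structure-to-query step, not the thesis implementation: the F-structure below is drastically simplified, and the predicate names (country/1, borders/2, in the style of the Chat-80 world database) are assumptions made for the example.

f_structure = {
    "PRED": "borders",                              # relation assumed to map to borders/2
    "SUBJ": {"PRED": "country", "SPEC": "which"},   # wh-phrase introduces the query variable
    "OBJ":  {"PRED": "france"},
}

def to_prolog(f):
    var = "X"                                       # variable bound by the wh-specifier
    goals = [f"{f['SUBJ']['PRED']}({var})", f"{f['PRED']}({var}, {f['OBJ']['PRED']})"]
    return "?- " + ", ".join(goals) + "."

print(to_prolog(f_structure))                       # ?- country(X), borders(X, france).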
Abstract:
The work described in this thesis deals with the development and application of a finite element program for the analysis of several cracked structures. In order to simplify the organisation of the material presented herein, the thesis has been subdivided into two Sections. In the first Section, the development of a finite element program for the analysis of two-dimensional problems of plane stress or plane strain is described. The element used in this program is the six-node isoparametric triangular element, which permits the accurate modelling of curved boundary surfaces. Various cases of material anisotropy are included in the derivation of the element stiffness properties. A digital computer program is described and examples of its application are presented. In the second Section, on fracture problems, several cracked configurations are analysed by embedding into the finite element mesh a sub-region containing the singularities, over which an analytic solution is used. The modifications necessary to augment a standard finite element program, such as that developed in Section I, are discussed, and complete programs for each cracked configuration are presented. Several examples are included to demonstrate the accuracy and flexibility of the technique.
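For reference, a sketch of the standard textbook quadratic shape functions of the six-node isoparametric triangle in area coordinates L1, L2, L3 (L1 + L2 + L3 = 1), assumed here to correspond to the element described; the thesis's own stiffness derivation is not reproduced.

import numpy as np

def shape_functions(L1, L2):
    L3 = 1.0 - L1 - L2
    return np.array([
        L1 * (2 * L1 - 1),   # corner node 1
        L2 * (2 * L2 - 1),   # corner node 2
        L3 * (2 * L3 - 1),   # corner node 3
        4 * L1 * L2,         # mid-side node between nodes 1 and 2
        4 * L2 * L3,         # mid-side node between nodes 2 and 3
        4 * L3 * L1,         # mid-side node between nodes 3 and 1
    ])

# Partition of unity: the six functions sum to one at any point of the element.
print(shape_functions(0.3, 0.4).sum())   # -> 1.0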
Abstract:
The primary objective of this research has been to investigate the interfacial phenomenon of protein adsorption in relation to the bulk and surface structure-property effects of hydrogel polymers. In order to achieve this, it was first necessary to characterise the bulk and surface properties of the hydrogels with regard to the structural chemistry of their component monomers. The bulk properties of the hydrogels were established using equilibrium water content measurements, together with water-binding studies by differential scanning calorimetry (D.S.C.). Hamilton and captive air bubble contact angle techniques were employed to characterise the hydrogel-water interface, from which, by a mathematical derivation, the interfacial free energy (γ_sw) and the surface free energy components (γ^p_sv, γ^d_sv, γ_sv) were obtained. From the adsorption studies using the radiolabelled iodinated (125I) proteins human serum albumin (H.S.A.) and human fibrinogen (H.Fb.), it was found that multi-layered adsorption was occurring and that the rate and type of this adsorption were dependent on the physico-chemical behaviour of the adsorbing protein (and its bulk concentration in solution), together with the surface energetics of the adsorbent polymer. A potential method for the in vitro evaluation of a material's 'biocompatibility' was also investigated, based on an empirically observed relationship between the adsorption of albumin and fibrinogen and the 'biocompatibility' of polymeric materials. Furthermore, some consideration was also given to the biocompatibility problem of proteinaceous deposit formation on hydrophilic 'soft' contact lenses, and in addition a number of potential continual-wear contact lens formulations now undergoing clinical trials were characterised by the above techniques.
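One widely used route from contact angles to polar and dispersive surface free energy components is the Owens-Wendt geometric-mean relation solved with two probe liquids; the sketch below is an assumed stand-in for illustration, not necessarily the derivation used in this thesis, and the angles and liquid component values are literature-style example numbers.

import numpy as np

def owens_wendt(theta_deg, gamma_l, gamma_l_d, gamma_l_p):
    """Dispersive and polar solid surface energy components from two probe liquids (mN/m)."""
    theta = np.radians(np.asarray(theta_deg, float))
    gamma_l = np.asarray(gamma_l, float)
    lhs = gamma_l * (1 + np.cos(theta)) / 2                  # gamma_l (1 + cos theta) / 2
    A = np.column_stack([np.sqrt(gamma_l_d), np.sqrt(gamma_l_p)])
    x = np.linalg.solve(A, lhs)                              # [sqrt(gamma_s^d), sqrt(gamma_s^p)]
    return x[0] ** 2, x[1] ** 2

# Example: water and diiodomethane as probe liquids (illustrative component values, mN/m).
g_d, g_p = owens_wendt([65.0, 40.0], [72.8, 50.8], [21.8, 50.8], [51.0, 0.0])
print("dispersive:", g_d, "polar:", g_p, "total:", g_d + g_p)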