63 results for Effects-Based Approach to Operations
Abstract:
A Bayesian approach to analysing data from family-based association studies is developed. This permits direct assessment of the range of possible values of model parameters, such as the recombination frequency and allelic associations, in the light of the data. In addition, sophisticated comparisons of different models may be handled easily, even when such models are not nested. The methodology is developed in such a way as to allow separate inferences to be made about linkage and association by including theta, the recombination fraction between the marker and disease susceptibility locus under study, explicitly in the model. The method is illustrated by application to a previously published data set. The data analysis raises some interesting issues, notably with regard to the weight of evidence necessary to convince us of linkage between a candidate locus and disease.
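As a minimal illustration of the general idea (not the paper's model), a grid approximation to the posterior of a recombination fraction θ under a uniform prior on [0, 0.5] and a binomial likelihood can be written in a few lines; the meiosis counts below are invented:

```python
def posterior_theta(n_recomb, n_total, grid_size=501):
    """Grid posterior over theta in [0, 0.5]: uniform prior, binomial
    likelihood for n_recomb recombinants among n_total informative meioses."""
    grid = [0.5 * i / (grid_size - 1) for i in range(grid_size)]
    like = [t ** n_recomb * (1.0 - t) ** (n_total - n_recomb) for t in grid]
    z = sum(like)  # normalising constant (uniform prior cancels)
    return grid, [l / z for l in like]

# hypothetical data: 2 recombinants in 40 informative meioses
grid, post = posterior_theta(n_recomb=2, n_total=40)
theta_map = grid[max(range(len(post)), key=post.__getitem__)]
```

Evidence for linkage can then be assessed directly, e.g. by the posterior mass on θ < 0.5 relative to the prior.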
Abstract:
Building services are worth about 2% of GDP and are essential for the effective and efficient operation of buildings. It is increasingly recognised that the value of a building is related to the way it supports the client organisation’s ongoing business operations. Building services are central to the functional performance of buildings and provide the necessary conditions for the health, well-being, safety and security of occupants. They frequently comprise several technologically distinct sub-systems, and their design and construction require the involvement of numerous disciplines and trades. Designers and contractors working on the same project are frequently employed by different companies. Materials and equipment are supplied by a diverse range of manufacturers. Facilities managers are responsible for the operation of building services in use. Coordination between these participants is crucial to achieving optimum performance, but it is too often neglected, leaving room for serious faults and underlining the need for effective integration. Modern technology offers increasing opportunities for integrated personal-control systems for lighting, ventilation and security, as well as for interoperability between systems. The emergence of PFI/PPP procurement frameworks provides opportunities for a new mode of systems integration. This paper attempts to establish how systems integration can be achieved in the process of designing, constructing and operating building services. The essence of the paper is therefore to envisage the emergent organisational responses to the realisation of building services as an interactive systems network.
Abstract:
Movement disorders (MD) comprise a group of neurological disorders that involve the neuromotor systems. MD can result in several abnormalities, ranging from an inability to move to severe, constant and excessive movements. Stroke is a leading cause of disability, largely affecting older people worldwide. Traditional treatments rely on physiotherapy that is partially based on theories and heavily reliant on the therapist's training and past experience. The lack of evidence that one treatment is more effective than another makes the rehabilitation of stroke patients a difficult task. Upper limb (UL) motor re-learning and recovery levels tend to improve with intensive physiotherapy delivery. Traditional methods, however, lack high motivational content as well as objective, standardised analytical methods for evaluating a patient's performance and assessing therapy effectiveness. Despite all the advances in machine-mediated therapies, there is still a need to improve therapy tools. This chapter describes a new approach to robot-assisted neuro-rehabilitation for upper limb rehabilitation. Gentle/S introduces a new approach to the integration of appropriate haptic technologies with high-quality virtual environments, so as to deliver challenging and meaningful therapies to people with upper limb impairment as a consequence of stroke. The described approach can enhance traditional therapy tools, provide therapy "on demand" and present accurate, objective measurements of a patient's progression. Our recent studies suggest that the use of tele-presence and VR-based systems can motivate patients to exercise for longer periods of time. Two identical prototypes have undergone extended clinical trials in the UK and Ireland with a cohort of 30 stroke subjects.
From the lessons learnt with the Gentle/S approach, it is also clear that high-quality therapy devices of this nature have a role in the future delivery of stroke rehabilitation, and that machine-mediated therapies should be available to the patient and their clinical team from initial hospital admission through to long-term placement in the patient's home following hospital discharge.
Abstract:
Novel macrocyclic receptors which bind electron-donor aromatic substrates via π-stacking donor–acceptor interactions are obtained by cyclo-imidization of an amine-functionalized arylether-sulfone with pyromellitic and 1,4,5,8-naphthalene-tetracarboxylic dianhydrides. These macrocycles complex with a wide variety of π-donor substrates including tetrathiafulvalene, naphthalene, anthracene, pyrene, perylene, and functional derivatives of these polycyclic hydrocarbons. The resulting supramolecular assemblies range from simple 1:1 complexes, to [2]- and [3]-pseudorotaxanes, and even (as a result of crystallographic disorder) an apparent polyrotaxane. Direct, five-component self-assembly of a metal-centred [3]pseudorotaxane is also observed, on complexation of a macrocyclic ether-imide with 8-hydroxyquinoline in the presence of palladium(II) ions. Binding studies in solution were carried out by ¹H NMR and UV-visible spectroscopy, and the stoichiometries of binding were confirmed by Job plots based on charge-transfer absorption bands. The highest association constants are found for strong π-donor guests with large surface areas, notably perylene and 1-hydroxypyrene, for which Ka values of 1.4 × 10³ and 2.3 × 10³ M⁻¹ respectively are found. Single-crystal X-ray analyses of the receptors and their derived complexes reveal large, induced-fit distortions of the macrocyclic frameworks as a result of complexation. These structures provide compelling evidence for the existence of strong, attractive forces between the electronically complementary aromatic π-systems of host and guest.
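For illustration, the 1:1 host-guest equilibrium underlying such association constants has an exact closed-form solution (the positive root of the binding quadratic). The concentrations below are invented; only the Ka value echoes the abstract:

```python
import math

def complex_conc(h0, g0, ka):
    """Equilibrium [HG] for 1:1 binding H + G <=> HG with association
    constant ka (M^-1) and total concentrations h0, g0 (M).
    Solves hg^2 - (h0 + g0 + 1/ka)*hg + h0*g0 = 0 for the physical root."""
    b = h0 + g0 + 1.0 / ka
    return (b - math.sqrt(b * b - 4.0 * h0 * g0)) / 2.0

# illustrative: perylene-like guest, Ka = 1.4e3 M^-1, 1 mM host and guest
hg = complex_conc(1e-3, 1e-3, 1.4e3)
```

Sweeping the host:guest mole fraction at fixed total concentration reproduces the Job-plot maximum at 0.5 expected for 1:1 stoichiometry.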
Abstract:
This paper describes a technique that can be used as part of a simple and practical agile method for requirements engineering. It is based on disciplined goal-responsibility modelling but eschews formality in favour of a set of practicality objectives. The technique can be used together with Agile Programming to develop software in internet time. We illustrate the technique and introduce lazy refinement, responsibility composition and context sketching. Goal sketching has been used in a number of real-world developments.
Abstract:
Building Management Systems (BMS) are widely adopted in modern buildings around the world in order to provide high-quality building services and reduce the running cost of the building. However, most BMS are functionality-oriented and do not consider user personalization. The aim of this research is to capture and represent building management rules using organizational semiotics methods. We apply Semantic Analysis, which determines the semantic units in building management and their relationships and patterns of behaviour, and Norm Analysis, which extracts and specifies the norms that establish how and when these management actions occur. Finally, we propose a multi-agent framework for norm-based building management. This framework contributes to the design domain of intelligent building management systems by defining a set of behaviour patterns and the norms that govern real-time behaviour in a building.
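A norm of the kind produced by Norm Analysis can be sketched as a condition-action rule evaluated by an agent; the lighting norm below is entirely hypothetical and is not taken from the paper:

```python
# Hypothetical norm: "after 19:00, if a zone is unoccupied, switch its
# lights off" - the agent checks the condition and returns the prescribed
# action, or None when the norm does not apply.
def lighting_norm(hour, occupancy):
    if hour >= 19 and occupancy == 0:
        return "lights_off"
    return None

# the norm fires only in the first context below
actions = [lighting_norm(h, o) for h, o in [(20, 0), (20, 3), (10, 0)]]
```

In a multi-agent framework each agent would hold a set of such norms and evaluate them against sensed context in real time.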
Abstract:
We address the problem of automatically identifying and restoring damaged and contaminated images. We suggest a novel approach based on a semi-parametric model. This has two components, a parametric component describing known physical characteristics and a more flexible non-parametric component. The latter avoids the need for a detailed model for the sensor, which is often costly to produce and lacking in robustness. We assess our approach using an analysis of electroencephalographic images contaminated by eye-blink artefacts and highly damaged photographs contaminated by non-uniform lighting. These experiments show that our approach provides an effective solution to problems of this type.
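A toy sketch of the two-component decomposition (not the authors' estimator: the linear parametric part and the moving-average smoother below are assumptions chosen for brevity):

```python
def semi_parametric_fit(x, y, smooth_window=3):
    """Parametric part: least-squares line (stand-in for the known physical
    component). Non-parametric part: moving-average smooth of the residuals
    (stand-in for the flexible, model-free component)."""
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    beta = (sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
            / sum((xi - xm) ** 2 for xi in x))
    alpha = ym - beta * xm
    resid = [yi - (alpha + beta * xi) for xi, yi in zip(x, y)]
    half = smooth_window // 2
    smooth = [sum(resid[max(0, i - half):i + half + 1])
              / len(resid[max(0, i - half):i + half + 1]) for i in range(n)]
    return [alpha + beta * xi + si for xi, si in zip(x, smooth)]

x = list(range(10))
fit = semi_parametric_fit(x, [2 * xi + 1 for xi in x])  # clean linear signal
```

On contaminated data the smoothed residual term absorbs structured artefacts that the parametric component cannot represent.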
Abstract:
Very high-resolution Synthetic Aperture Radar (SAR) sensors represent an alternative to aerial photography for delineating floods in built-up environments, where flood risk is highest. However, even with currently available SAR image resolutions of 3 m and higher, signal returns from man-made structures hamper the accurate mapping of flooded areas. Enhanced image processing algorithms and better exploitation of image archives are required to facilitate the use of microwave remote sensing data for monitoring flood dynamics in urban areas. In this study a hybrid methodology combining radiometric thresholding, region growing and change detection is introduced as an approach enabling automated, objective and reliable flood extent extraction from very high-resolution urban SAR images. The method is based on the calibration of a statistical distribution of “open water” backscatter values inferred from SAR images of floods. SAR images acquired during dry conditions enable the identification of areas i) that are not “visible” to the sensor (i.e. regions affected by ‘layover’ and ‘shadow’) and ii) that systematically behave as specular reflectors (e.g. smooth tarmac, permanent water bodies). Change detection with respect to a pre- or post-flood reference image thereby reduces over-detection of inundated areas. A case study of the July 2007 Severn River flood (UK), observed by the very high-resolution SAR sensor on board TerraSAR-X as well as by airborne photography, highlights the advantages and limitations of the proposed method. We conclude that even though the fully automated SAR-based flood mapping technique overcomes some limitations of previous methods, further technological and methodological improvements are necessary for SAR-based flood detection in urban areas to match the flood mapping capability of high-quality aerial photography.
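The core thresholding-plus-change-detection step can be sketched on toy backscatter grids (region growing is omitted, and all dB values below are invented, not calibrated "open water" statistics):

```python
def flood_mask(flood_db, dry_db, threshold_db):
    """Flag pixels dark ("open water"-like) in the flood image but not in
    the dry reference image. The change-detection condition excludes areas
    that are permanently dark (layover, shadow, smooth tarmac, permanent
    water bodies), reducing over-detection of inundated areas."""
    return [[(f < threshold_db) and not (d < threshold_db)
             for f, d in zip(frow, drow)]
            for frow, drow in zip(flood_db, dry_db)]

flood = [[-15.0, -5.0],   # toy 2x2 backscatter images in dB
         [-14.0, -3.0]]
dry   = [[-14.0, -4.0],
         [ -5.0, -3.0]]
mask = flood_mask(flood, dry, threshold_db=-10.0)
```

Only the pixel that darkened between the dry and flood acquisitions is flagged; the pixel dark in both images is treated as a permanent specular reflector.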
Abstract:
Top Down Induction of Decision Trees (TDIDT) is the most commonly used method of constructing a model from a dataset, in the form of classification rules, to classify previously unseen data. Alternative algorithms have been developed, such as the Prism algorithm. Prism constructs modular rules which are qualitatively better than those induced by TDIDT. However, with the increasing size of databases, many existing rule-learning algorithms have proved to be computationally expensive on large datasets. To tackle the problem of scalability, parallel classification rule induction algorithms have been introduced. As TDIDT is the most popular classifier, even though there are strongly competitive alternative algorithms, most parallel approaches to inducing classification rules are based on TDIDT. In this paper we describe work on a distributed classifier that induces classification rules in a parallel manner based on Prism.
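A minimal serial sketch of the basic Prism strategy (no tie-breaking, clash handling or pruning, and not the distributed version described in the paper), run on an invented toy dataset:

```python
def prism(instances, target_attr, target_class):
    """Basic Prism: grow one rule at a time for target_class by repeatedly
    adding the attribute-value test with the highest covering probability,
    then remove the covered instances and repeat until none remain."""
    rules = []
    remaining = list(instances)
    while any(x[target_attr] == target_class for x in remaining):
        rule = []
        covered = remaining
        # specialise the rule until it covers only target_class instances
        while any(x[target_attr] != target_class for x in covered):
            best, best_p = None, -1.0
            used = {a for a, _ in rule}
            for attr in covered[0]:
                if attr == target_attr or attr in used:
                    continue
                for val in sorted({x[attr] for x in covered}):
                    subset = [x for x in covered if x[attr] == val]
                    p = (sum(x[target_attr] == target_class for x in subset)
                         / len(subset))
                    if p > best_p:
                        best, best_p = (attr, val), p
            rule.append(best)
            covered = [x for x in covered if x[best[0]] == best[1]]
        rules.append(rule)
        remaining = [x for x in remaining
                     if not all(x[a] == v for a, v in rule)]
    return rules

weather = [  # invented toy data
    {"outlook": "sunny", "windy": "no",  "play": "yes"},
    {"outlook": "sunny", "windy": "yes", "play": "no"},
    {"outlook": "rain",  "windy": "no",  "play": "yes"},
    {"outlook": "rain",  "windy": "yes", "play": "no"},
]
rules = prism(weather, "play", "yes")
```

Because each rule is induced independently of the others, the per-class rule sets are modular, which is also what makes the algorithm amenable to parallelisation.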
Abstract:
Farming freshwater prawns with fish in rice fields is widespread in coastal regions of southwest Bangladesh because of favourable resources and ecological conditions. This article provides an overview of an ecosystem-based approach to integrated prawn-fish-rice farming in southwest Bangladesh. The practice of prawn and fish farming in rice fields is a form of integrated aquaculture-agriculture which provides a wide range of social, economic and environmental benefits. Integrated prawn-fish-rice farming plays an important role in the economy of Bangladesh, earning foreign exchange and increasing food production. However, this unique farming system in coastal Bangladesh is particularly vulnerable to climate change. We suggest that community-based adaptation strategies must be developed to cope with the challenges. We propose that integrated prawn-fish-rice farming could be relocated from the coastal region to less vulnerable upland areas, but caution that this will require appropriate adaptation strategies and an enabling institutional environment.
Abstract:
Equilibrium theory occupies an important position in chemistry and is traditionally based on thermodynamics. A novel mathematical approach to chemical equilibrium theory for gaseous systems at constant temperature and pressure is developed. Six theorems that illustrate the power of mathematics to explain chemical observations are presented and combined logically to create a coherent system. This mathematical treatment provides more insight into chemical equilibrium and creates more tools for investigating complex situations. Although some of the issues covered have previously been treated in the literature, new mathematical representations are provided. Compared to traditional treatments, the new approach relies on straightforward mathematics and less on thermodynamics, thus giving a new and complementary perspective on equilibrium theory. It provides a new theoretical basis for a thorough and deep presentation of traditional chemical equilibrium. This work demonstrates that new research in a traditional field such as equilibrium theory, generally thought to have been completed many years ago, can still offer new insights, and that more efficient ways to present the content can be established. The work presented here can be considered appropriate as part of a mathematical chemistry course at university level.
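The paper's six theorems are not reproduced here, but the elementary comparison of the reaction quotient Q with the equilibrium constant K, which underlies any treatment of the direction of equilibrium shift, can be illustrated as follows (the reaction and the K value are standard textbook examples, not taken from the paper):

```python
def reaction_quotient(conc, coeffs):
    """Q = product of concentrations raised to their stoichiometric
    coefficients: positive for products, negative for reactants."""
    q = 1.0
    for c, nu in zip(conc, coeffs):
        q *= c ** nu
    return q

def shift_direction(k_eq, conc, coeffs):
    """Q < K: reaction shifts forward; Q > K: reverse; Q = K: equilibrium."""
    q = reaction_quotient(conc, coeffs)
    if q < k_eq:
        return "forward"
    if q > k_eq:
        return "reverse"
    return "at equilibrium"

# N2 + 3 H2 <=> 2 NH3 with an illustrative K = 0.5
direction = shift_direction(0.5, [1.0, 1.0, 0.1], [-1, -3, 2])
```

Here Q = (0.1)² / (1.0 · 1.0³) = 0.01 < K, so the system shifts forward towards more product.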
Abstract:
Straightforward mathematical techniques are used innovatively to form a coherent theoretical system for dealing with chemical equilibrium problems. For a systematic theory it is necessary to establish a framework connecting different concepts. This paper shows the usefulness and consistency of the system through applications of the theorems introduced previously. Some theorems are shown, somewhat unexpectedly, to be mathematically correlated, and relationships are obtained in a coherent manner. Theorem 1 is shown to play an important part in interconnecting most of the theorems. The usefulness of theorem 2 is illustrated by proving it to be consistent with theorem 3. A set of uniform mathematical expressions is associated with theorem 3. A variety of mathematical techniques based on theorems 1–3 are shown to establish the direction of equilibrium shift. The equilibrium properties expressed in initial and equilibrium conditions are shown to be connected via theorem 5. Theorem 6 is connected with theorem 4 through the mathematical representation of theorem 1.
Abstract:
The paper develops a more precise specification and understanding of the process of national-level knowledge accumulation and absorptive capabilities by applying the reasoning and evidence from the firm-level analysis pioneered by Cohen and Levinthal (1989, 1990). In doing so, we acknowledge that significant cross-border effects due to the role of both inward and outward FDI exist and that assimilation of foreign knowledge is not only confined to catching-up economies but is also carried out by countries at the frontier-sharing phase. We postulate a non-linear relationship between national absorptive capacity and the technological gap, due to the effects of the cumulative nature of the learning process and the increase in complexity of external knowledge as the country approaches the technological frontier. We argue that national absorptive capacity and the accumulation of knowledge stock are simultaneously determined. This implies that different phases of technological development require different strategies. During the catching-up phase, knowledge accumulation occurs predominantly through the absorption of trade and/or inward FDI-related R&D spillovers. From the pre-frontier-sharing phase onwards, increases in the knowledge base occur largely through independent knowledge creation and actively accessing foreign-located technological spillovers, inter alia through outward FDI-related R&D, joint ventures and strategic alliances.
Abstract:
Relating the measurable, large scale, effects of anaesthetic agents to their molecular and cellular targets of action is necessary to better understand the principles by which they affect behavior, as well as enabling the design and evaluation of more effective agents and the better clinical monitoring of existing and future drugs. Volatile and intravenous general anaesthetic agents (GAs) are now known to exert their effects on a variety of protein targets, the most important of which seem to be the neuronal ion channels. It is hence unlikely that anaesthetic effect is the result of a unitary mechanism at the single cell level. However, by altering the behavior of ion channels GAs are believed to change the overall dynamics of distributed networks of neurons. This disruption of regular network activity can be hypothesized to cause the hypnotic and analgesic effects of GAs and may well present more stereotypical characteristics than its underlying microscopic causes. Nevertheless, there have been surprisingly few theories that have attempted to integrate, in a quantitative manner, the empirically well documented alterations in neuronal ion channel behavior with the corresponding macroscopic effects. Here we outline one such approach, and show that a range of well documented effects of anaesthetics on the electroencephalogram (EEG) may be putatively accounted for. In particular we parameterize, on the basis of detailed empirical data, the effects of halogenated volatile ethers (a clinically widely used class of general anaesthetic agent). The resulting model is able to provisionally account for a range of anaesthetically induced EEG phenomena that include EEG slowing, biphasic changes in EEG power, and the dose dependent appearance of anomalous ictal activity, as well as providing a basis for novel approaches to monitoring brain function in both health and disease.
Abstract:
In this paper we propose an alternative model of what is often called land value capture in the planning system. Based on development viability models, negotiations and policy formation regarding the level of planning obligations have taken place at the local level with little clear guidance on technique, approach and method. It is argued that current approaches are regressive and fail to reflect how the ability of sites to generate planning gain can vary over time and between sites. The alternative approach suggested here attempts to rationalise rather than replace the existing practice of development viability appraisal. It is based upon the assumption that schemes with similar development values should produce similar levels of return to the landowner, developer and other stakeholders in the development, as well as similar levels of planning obligations, in all parts of the country. Given the high level of input uncertainty in viability modelling, a simple viability model is ‘good enough’ to quantify the maximum level of planning obligations for a given level of development value. We argue that such an approach can deliver a more durable, equitable, simpler, consistent and cheaper method for policy formation regarding planning obligations.
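A hypothetical residual-style sketch of such a ‘good enough’ viability model (all figures, and the convention of taking developer profit as a share of gross development value, are illustrative assumptions, not the authors' calibration):

```python
def max_planning_obligations(gdv, build_cost, land_value, profit_rate):
    """Residual sketch: whatever remains of gross development value (gdv)
    after build costs, a benchmark land value for the landowner, and the
    developer's profit (taken here as profit_rate * gdv) caps the level
    of planning obligations the scheme can bear."""
    residual = gdv - build_cost - land_value - profit_rate * gdv
    return max(0.0, residual)  # obligations cannot be negative

# illustrative scheme: £10m GDV, £6m build cost, £1.5m land, 15% profit
cap = max_planning_obligations(gdv=10_000_000, build_cost=6_000_000,
                               land_value=1_500_000, profit_rate=0.15)
```

Under the paper's premise, two schemes with the same development value would face the same cap regardless of location, which is what makes the approach consistent across the country.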