20 results for Quantitative systems pharmacology

in Aston University Research Archive


Relevance:

80.00%

Publisher:

Abstract:

The immune system is perhaps the largest yet most diffuse and distributed somatic system in vertebrates. It plays vital roles in fighting infection and in the homeostatic control of chronic disease. As such, the immune system in both pathological and healthy states is a prime target for therapeutic interventions by drugs-both small-molecule and biologic. Comprising both the innate and adaptive immune systems, human immunity is awash with potential unexploited molecular targets. Key examples include the pattern recognition receptors of the innate immune system and the major histocompatibility complex of the adaptive immune system. Moreover, the immune system is also the source of many current and, hopefully, future drugs, of which the prime example is the monoclonal antibody, the most exciting and profitable type of present-day drug moiety. This brief review explores the identity and synergies of the hierarchy of drug targets represented by the human immune system, with particular emphasis on the emerging paradigm of systems pharmacology. © the authors, publisher and licensee Libertas Academica Limited.

Relevance:

40.00%

Publisher:

Abstract:

The software underpinning today’s IT systems needs to adapt dynamically and predictably to rapid changes in system workload, environment and objectives. We describe a software framework that achieves such adaptiveness for IT systems whose components can be modelled as Markov chains. The framework comprises (i) an autonomic architecture that uses Markov-chain quantitative analysis to dynamically adjust the parameters of an IT system in line with its state, environment and objectives; and (ii) a method for developing instances of this architecture for real-world systems. Two case studies are presented that use the framework successfully for the dynamic power management of disk drives, and for the adaptive management of cluster availability within data centres, respectively.
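As a rough illustration of the kind of Markov-chain quantitative analysis the framework applies, the sketch below computes the stationary distribution of a small, invented disk-drive power model. The states, transition probabilities and per-state power figures are all assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical disk-drive model: states 0 = active, 1 = idle, 2 = sleep
P = np.array([
    [0.7, 0.2, 0.1],   # transitions from active
    [0.3, 0.5, 0.2],   # transitions from idle
    [0.4, 0.1, 0.5],   # transitions from sleep
])

def stationary(P):
    """Solve pi P = pi with sum(pi) = 1 for the long-run state distribution."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

pi = stationary(P)

# Expected long-run power draw, given assumed per-state power in watts
power = np.array([8.0, 3.0, 0.5])
expected_power = pi @ power
```

An autonomic manager of the kind described could recompute this distribution whenever the observed workload changes, and adjust a parameter such as the idle time-out so that the expected power draw meets the current objective.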

Relevance:

30.00%

Publisher:

Abstract:

Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing `correct' and timely responses. Since these systems are increasingly being used in systems with safety implications, it is crucial that they are designed and developed to operate in a correct manner. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component because it is considered that the informality provides ease of understanding and the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between each formalism are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. 
A common problem with Petri net based techniques is the complexity associated with generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
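To make the reachability-graph notion concrete, here is a minimal breadth-first generator for a toy Petri net. The net itself (three places, three transitions) is invented for illustration; the concurrency-set technique described in the thesis is precisely a way of limiting how much of this graph must be built:

```python
from collections import deque

# Transitions as (consume, produce) vectors over 3 places (a hypothetical net)
transitions = {
    "t1": ((1, 0, 0), (0, 1, 0)),
    "t2": ((1, 0, 0), (0, 0, 1)),
    "t3": ((0, 1, 1), (1, 0, 0)),
}

def enabled(marking, consume):
    # A transition may fire only if every input place holds enough tokens
    return all(m >= c for m, c in zip(marking, consume))

def fire(marking, consume, produce):
    return tuple(m - c + p for m, c, p in zip(marking, consume, produce))

def reachability_graph(initial):
    """Breadth-first exploration of all markings reachable from `initial`."""
    seen = {initial}
    edges = []
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for name, (consume, produce) in transitions.items():
            if enabled(m, consume):
                m2 = fire(m, consume, produce)
                edges.append((m, name, m2))
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen, edges

states, edges = reachability_graph((2, 0, 0))
```

Even this tiny net produces nine reachable markings; the state-space explosion for realistic nets is what motivates generating only a partial graph pertaining to the state of interest.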

Relevance:

30.00%

Publisher:

Abstract:

The work is a logical continuation of research started at Aston some years ago when studies were conducted on fermentations in bubble columns. The present work highlights typical design and operating problems that could arise in such systems as waste water, chemical, biochemical and petroleum operations involving three-phase, gas-liquid-solid fluidisation; such systems are in increasing use. It is believed that this is one of few studies concerned with `true' three-phase, gas-liquid-solid fluidised systems, and that this work will contribute significantly to closing some of the gaps in knowledge in this area. The research work was mainly experimentally based and involved studies of the hydrodynamic parameters, phase holdups (gas and solid), particle mixing and segregation, and phase flow dynamics (flow regime and circulation patterns). The studies have focused particularly on the solid behaviour and the influence of properties of solids present on the above parameters in three-phase, gas-liquid-solid fluidised systems containing single particle components and those containing binary and ternary mixtures of particles. All particles were near spherical in shape and two particle sizes and total concentration levels were used. Experiments were carried out in two- and three-dimensional bubble columns. Quantitative results are presented in graphical form and are supported by qualitative results from visual studies which are also shown as schematic diagrams and in photographic form. Gas and solid holdup results are compared for air-water containing single, binary and ternary component particle mixtures. It should be noted that the criteria for selection of the materials used are very important if true three-phase fluidisation is to be achieved: this is very evident when comparing the results with those in the literature. 
The fluid flow and circulation patterns observed were assessed for validation of the generally accepted patterns, and the author believes that the present work provides more accurate insight into the modelling of liquid circulation in bubble columns. The characteristic bubbly flow at low gas velocity in a two-phase system is suppressed in the three-phase system. The degree of mixing within the system is found to be dependent on flow regime, liquid circulation and the ratio of solid phase physical properties. Evidence of strong `trade-off' of properties is shown; the overall solid holdup is believed to be a major parameter influencing the gas holdup structure.

Relevance:

30.00%

Publisher:

Abstract:

The generation of very short range forecasts of precipitation in the 0-6 h time window is traditionally referred to as nowcasting. Most existing nowcasting systems essentially extrapolate radar observations in some manner; however, very few systems account for the uncertainties involved. Thus deterministic forecasts are produced, which have limited use when decisions must be made, since they have no measure of confidence or spread of the forecast. This paper develops a Bayesian state space modelling framework for quantitative precipitation nowcasting which is probabilistic from conception. The model treats the observations (radar) as noisy realisations of the underlying true precipitation process, recognising that this process can never be completely known, and thus must be represented probabilistically. In the model presented here the dynamics of the precipitation are dominated by advection, so this is a probabilistic extrapolation forecast. The model is designed in such a way as to minimise the computational burden, while maintaining a full, joint representation of the probability density function of the precipitation process. The update and evolution equations avoid the need to sample, thus only one model need be run as opposed to the more traditional ensemble route. It is shown that the model works well on both simulated and real data, but that further work is required before the model can be used operationally. © 2004 Elsevier B.V. All rights reserved.
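As a hedged illustration of the sort of sampling-free update and evolution equations described, the following sketch runs a linear-Gaussian (Kalman-style) filter on a one-dimensional grid where precipitation evolves by pure advection. The grid size, noise levels and simulated "radar" observations are all invented, and the paper's actual model is considerably richer than this:

```python
import numpy as np

n = 20                                 # grid cells (assumed)
A = np.roll(np.eye(n), 1, axis=0)      # advection: shift field one cell per step
Q = 0.01 * np.eye(n)                   # process noise covariance (assumed)
R = 0.25 * np.eye(n)                   # radar observation noise covariance (assumed)

def kalman_step(mean, cov, obs):
    """Evolution (advection) then measurement update; the radar observes the
    field directly, so the observation operator is the identity."""
    mean_p = A @ mean
    cov_p = A @ cov @ A.T + Q
    K = cov_p @ np.linalg.inv(cov_p + R)        # Kalman gain
    mean_u = mean_p + K @ (obs - mean_p)
    cov_u = (np.eye(n) - K) @ cov_p
    return mean_u, cov_u

rng = np.random.default_rng(0)
truth = np.exp(-0.5 * ((np.arange(n) - 5.0) / 2.0) ** 2)  # a rain "blob"
mean, cov = np.zeros(n), np.eye(n)
for _ in range(10):
    truth = A @ truth                           # the blob advects downstream
    obs = truth + rng.normal(0, 0.5, n)         # a noisy radar scan
    mean, cov = kalman_step(mean, cov, obs)
```

Because the mean and covariance are propagated in closed form, only one model run is needed; an ensemble approach would instead propagate many sampled fields to represent the same uncertainty.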

Relevance:

30.00%

Publisher:

Abstract:

The thesis reports a study into the effect upon organisations of co-operative information systems (CIS) incorporating flexible communications, group support and group working technologies. A review of the literature leads to the development of a model of effect based upon co-operative business tasks. CIS have the potential to change how co-operative business tasks are carried out, and their principal effect (or performance) may therefore be evaluated by determining to what extent they are being employed to perform these tasks. A significant feature of CIS use identified is the extent to which they may be designed to fulfil particular tasks or, by contrast, may be applied creatively by users in an emergent fashion to perform tasks. A research instrument is developed using a survey questionnaire to elicit users' judgements of the extent to which a CIS is employed to fulfil a range of co-operative tasks. This research instrument is applied to a longitudinal study of the introduction of Novell GroupWise at Northamptonshire County Council, during which qualitative as well as quantitative data were gathered. A method of analysis of questionnaire results using principles from fuzzy mathematics and artificial intelligence is developed and demonstrated. Conclusions from the longitudinal study include the importance of early experiences in setting patterns of use for CIS, the persistence of patterns of use over time, and the dominance of designed usage of the technology over emergent use.
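A possible sketch of the fuzzy-mathematics flavour of the questionnaire analysis is shown below. The Likert scale, membership grades and responses are all invented for illustration and are not taken from the thesis:

```python
# Likert responses to "to what extent is the CIS used for this task?"
# (1 = not at all ... 5 = to a great extent), one value per respondent
responses = [4, 5, 3, 4, 2, 5, 4]

# Assumed membership grade of each scale point in the fuzzy set
# "the CIS fulfils this task"
membership = {1: 0.0, 2: 0.25, 3: 0.5, 4: 0.75, 5: 1.0}

grades = [membership[r] for r in responses]

# Aggregate: the mean grade gives a degree of task fulfilment in [0, 1];
# the max-min range is a crude indicator of (lack of) consensus
degree = sum(grades) / len(grades)
spread = max(grades) - min(grades)
```

Treating judgements as fuzzy membership grades rather than crisp categories lets degrees of use for different tasks be compared and combined, which is the general idea behind the analysis method described.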

Relevance:

30.00%

Publisher:

Abstract:

The modulation of 5-hydroxytryptamine (5-HT)-related head-twitch behaviour by antimigraine drugs and migraine triggers was examined in mice. The antimigraine drugs examined produced either inhibition or no effect on 5-HT-related head-twitching. On the basis of these results it is suggested that 5-HT-related head-twitching is unlikely to be useful in the preclinical screening and discovery of systemically-active antimigraine agents. The migraine triggers examined, tyramine and beta-PEA, initially produced a repeatable, complex, time-related effect on 5-HT-related head-twitching, with both inhibition and potentiation of this behaviour being observed; however, when further examination of the effect of the migraine triggers on 5-HT-related head-twitching was attempted some time later, the effects seen initially were no longer produced. The effect of (±)-1-(2,5-dimethoxy-4-iodophenyl)-2-aminopropane ((±)DOI) on on-going behaviour of mice and rats was examined. Shaking behaviour was observed in both species. In mice, excessive scratching behaviour was also present. (±)DOI-induced scratching and shaking behaviour were found to be differentially modulated by noradrenergic and serotonergic agents; however, the fact that both behaviours were blocked by ritanserin (a 5-HT2/5-HT1c receptor antagonist) and inhibited by FLA-63 (a dopamine-beta-oxidase inhibitor which depletes noradrenaline) suggests that the pathways mediating these behaviours must be convergent in some manner, and that both behaviours require intact 5-HT receptors, probably 5-HT2 receptors, for their production. In general, the behavioural profile of (±)DOI was as expected for an agent which exhibits high-affinity binding to 5-HT2/5-HT1c receptors. Little sign of the 5-HT1-related '5-HT syndrome' was seen in either mice or rats. The effect of a variety of noradrenergic agents on head-twitching induced by a variety of shake-inducing agents was examined.
A pattern of modulatory effect was seen whereby the modulatory effect of the noradrenergic agents on 5-hydroxytryptophan (5-HTP) (and in some cases, 5-methoxy-N,N-dimethyltryptamine (5-MeODMT)) was found to be the opposite of that observed with quipazine and (±)DOI. The relationship between these effects, and their implications for understanding the pharmacology of centrally acting drugs, is discussed.

Relevance:

30.00%

Publisher:

Abstract:

Manufacturing firms are driven by competitive pressures to continually improve the effectiveness and efficiency of their organisations. For this reason, manufacturing engineers often implement changes to existing processes, or design new production facilities, with the expectation of making further gains in manufacturing system performance. This thesis relates to how the likely outcome of this type of decision should be predicted prior to its implementation. The thesis argues that since manufacturing systems must also interact with many other parts of an organisation, the expected performance improvements can often be significantly hampered by constraints that arise elsewhere in the business. As a result, decision-makers should attempt to predict just how well a proposed design will perform when these other factors, or 'support departments', are taken into consideration. However, the thesis also demonstrates that, in practice, where quantitative analysis is used to evaluate design decisions, the analysis model invariably ignores the potential impact of support functions on a system's overall performance. A more comprehensive modelling approach is therefore required. A study of how various business functions interact establishes that to properly represent the kind of delays that give rise to support department constraints, a model should actually portray the dynamic and stochastic behaviour of entities in both the manufacturing and non-manufacturing aspects of a business. This implies that computer simulation be used to model design decisions but current simulation software does not provide a sufficient range of functionality to enable the behaviour of all of these entities to be represented in this way. The main objective of the research has therefore been the development of a new simulator that will overcome limitations of existing software and so enable decision-makers to conduct a more holistic evaluation of design decisions. 
It is argued that the application of object-oriented techniques offers a potentially better way of fulfilling both the functional and ease-of-use issues relating to development of the new simulator. An object-oriented analysis and design of the system, called WBS/Office, is therefore presented that extends to modelling a firm's administrative and other support activities in the context of the manufacturing system design process. A particularly novel feature of the design is the ability for decision-makers to model how a firm's specific information and document processing requirements might hamper shop-floor performance. The simulator is primarily intended for modelling make-to-order batch manufacturing systems, and the thesis presents example models created using a working version of WBS/Office that demonstrate the feasibility of using the system to analyse manufacturing system designs in this way.
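The modelling idea, that office delays can cap shop-floor throughput, can be sketched as a toy discrete-event simulation. The classes and timings below are illustrative assumptions, not part of WBS/Office itself: each order must wait for a support-department activity (paperwork) before machining can start:

```python
import heapq

class Simulator:
    """A minimal event-driven simulator: a clock plus a time-ordered event heap."""
    def __init__(self):
        self.clock = 0.0
        self.events = []          # (time, sequence number, action) min-heap
        self.seq = 0
    def schedule(self, delay, action):
        heapq.heappush(self.events, (self.clock + delay, self.seq, action))
        self.seq += 1
    def run(self):
        while self.events:
            self.clock, _, action = heapq.heappop(self.events)
            action()

completed = []

def make_order(sim, order_id):
    def paperwork_done():
        sim.schedule(2.0, machining_done)   # machining takes 2 h (assumed)
    def machining_done():
        completed.append((order_id, sim.clock))
    sim.schedule(5.0, paperwork_done)       # office approval takes 5 h (assumed)

sim = Simulator()
for i in range(3):
    make_order(sim, i)
sim.run()
```

With these assumed timings every order completes at t = 7.0 h, of which 5 h is administrative delay: exactly the kind of support-function constraint the thesis argues a purely shop-floor model would miss.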

Relevance:

30.00%

Publisher:

Abstract:

This exploratory study is concerned with the integrated appraisal of multi-storey dwelling blocks which incorporate large concrete panel systems (LPS). The first step was to look at the U.K. multi-storey dwelling stock in general, and that under the management of Birmingham City Council in particular. The information was taken from the databases of three departments in the City of Birmingham, and rearranged in a new database using a suite of PC software called `PROXIMA' for clarity and analysis. One hundred blocks in this stock were built using large concrete panel systems. Thirteen LPS blocks were chosen as case studies for the purpose of this study, based mainly on the height and age of the block. A new integrated appraisal technique has been created for the LPS dwelling blocks, which takes into account the principal physical and social factors affecting the condition and acceptability of these blocks. This appraisal technique is built up in a hierarchical form moving from the general approach to particular elements (a tree model). It comprises two main approaches: physical and social. In the physical approach, the building is viewed as a series of manageable elements and sub-elements covering every physical or environmental factor of the block, through which the condition of the block is analysed. A quality score system has been developed which depends mainly on the qualitative and quantitative conditions of each category in the appraisal tree model, and leads to a physical ranking order of the study blocks. In the social appraisal approach, the residents' satisfaction and attitude toward their multi-storey dwelling block was analysed in relation to: a. biographical and housing-related characteristics; and b. social, physical and environmental factors associated with this sort of dwelling, block and estate in general. The random sample consisted of 268 residents living in the 13 case study blocks.
The data collected were analysed using frequency counts, percentages, means, standard deviations, Kendall's tau, r-correlation coefficients, t-tests, analysis of variance (ANOVA) and multiple regression analysis. The analysis showed a marginally positive satisfaction and attitude towards living in the block. The five most significant factors associated with the residents' satisfaction and attitude, in descending order, were: the estate in general; the service categories in the block, including the heating system and lift services; vandalism; the neighbours; and the security system of the block. An important attribute of this method is that it is relatively inexpensive to implement, especially when compared to alternatives adopted by some local authorities and the BRE. It is designed to save time, money and effort, to aid decision making, and to provide a ranked priority ordering of the multi-storey dwelling stock, in addition to many other advantages. A series of solution options to the problems of the block was sought for selection and testing before implementation. The traditional solutions have usually resulted in either demolition or costly physical maintenance and social improvement of the blocks. However, a new solution has now emerged, which is particularly suited to structurally sound units. The solution of `re-cycling' might incorporate the reuse of an entire block or part of it, by removing panels, slabs and so forth from the upper floors in order to reconstruct them as low-rise accommodation.
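For illustration, two of the listed statistics, Kendall's tau and the Pearson r-correlation coefficient, can be computed by hand as follows. The satisfaction and block-condition scores are invented, not the study's survey data:

```python
import numpy as np

satisfaction = np.array([3, 4, 2, 5, 4, 3, 1, 4])   # hypothetical resident scores
condition    = np.array([2, 4, 3, 5, 4, 2, 1, 5])   # hypothetical block scores

def kendall_tau(x, y):
    """Tau-a: (concordant - discordant) pairs over all pairs; ties count as neither."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def pearson_r(x, y):
    """Pearson product-moment correlation from centred vectors."""
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

tau = kendall_tau(satisfaction, condition)
r = pearson_r(satisfaction, condition)
```

Kendall's tau suits ordinal survey responses because it uses only rank order, while Pearson's r assumes an interval scale; reporting both, as the study does, guards against over-interpreting Likert-type data.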

Relevance:

30.00%

Publisher:

Abstract:

In this paper we present a back-to-back comparison of quantitative phase and refractive index from a microscopic image of a waveguide previously obtained by Allsop et al. The paper also shows a microscopic image of the first three waveguides from the sample. Tomlins et al. have demonstrated the use of femtosecond-fabricated artefacts as OCT calibration samples. Here we present the use of femtosecond waveguides, inscribed with optimised parameters, to test and calibrate the sensitivity of OCT systems.

Relevance:

30.00%

Publisher:

Abstract:

Vesicular adjuvant systems comprising dimethyldioctadecylammonium (DDA) can promote both cell-mediated and humoral immune responses to the tuberculosis vaccine fusion protein in mice. However, these DDA preparations were found to be physically unstable, forming aggregates under ambient storage conditions. There is therefore a need to improve the stability of such systems without undermining their potent adjuvanticity. To this end, the effect of incorporating non-ionic surfactants, such as 1-monopalmitoyl glycerol (MP), in addition to cholesterol (Chol) and trehalose 6,6′-dibehenate (TDB), on the stability and efficacy of these vaccine delivery systems was investigated. Differential scanning calorimetry revealed a reduction in the phase transition temperature (T c) of DDA-based vesicles by ∼12°C when MP and cholesterol (1:1 molar ratio) were incorporated into the DDA system. Transmission electron microscopy (TEM) revealed that the addition of MP to DDA vesicles resulted in the formation of multi-lamellar vesicles. Environmental scanning electron microscopy (ESEM) of MP-Chol-DDA-TDB (16:16:4:0.5 μmol) indicated that incorporation of antigen led to increased stability of the vesicles, perhaps as a result of the antigen embedding within the vesicle bilayers. At 4°C DDA liposomes showed significant vesicle aggregation after 28 days, although addition of MP-Chol or TDB was shown to inhibit this instability. Alternatively, at 25°C only the MP-based systems retained their original size. The presence of MP within the vesicle formulation was also shown to promote a sustained release of antigen in vitro. The adjuvant activity of the various systems was tested in mice against three subunit antigens, including the mycobacterial fusion protein Ag85b-ESAT-6, and two malarial antigens (Merozoite surface protein 1, MSP1, and the glutamate rich protein, GLURP).
The MP- and DDA-based systems induced antibody responses at comparable levels whereas the DDA-based systems induced more powerful cell-mediated immune responses. © 2006 The Authors.

Relevance:

30.00%

Publisher:

Abstract:

In our attempts to thwart the unwanted attentions of microbes by prophylactic and therapeutic vaccination, the knowledge of interactions at the molecular level may prove to be an invaluable asset. This article examines how particulate delivery systems such as liposomes and polymer microspheres can be applied in the light of recent advances in immunological understanding. Some of the biological interactions of these delivery systems are discussed with relevance for antigen trafficking and molecular pathways of immunogenicity and emphasis on the possible interaction of liposomal components. In particular, traditional concepts such as antigen protection, delivery to antigen presenting cells and depot formation remain important aspects, whilst the inclusion of selected co-adjuvants and enhanced delivery of these moieties in conjunction with antigen now has a firm rationale. © 2006 The Authors.

Relevance:

30.00%

Publisher:

Abstract:

In the global economy, innovation is one of the most important competitive assets for companies willing to compete in international markets. As competition moves from standardised products to customised ones, depending on each specific market's needs, economies of scale are no longer the only winning strategy. Innovation requires firms to establish processes to acquire and absorb new knowledge, leading to the recent theory of Open Innovation. Knowledge sharing and acquisition happen when firms are embedded in networks with other firms, universities, institutions and many other economic actors. Several typologies of innovation and firm networks have been identified, with various geographical spans. One of the first to be modelled was the Industrial Cluster (in Italian, Distretto Industriale), which was long considered the benchmark for innovation and economic development. Other kinds of networks have been modelled since the late 1970s; Regional Innovation Systems represent one of the latest and most widespread models of innovation networks, specifically introduced to combine local networks and the global economy. This model has been explored qualitatively since its introduction but, together with National Innovation Systems, is among the most inspiring for policy makers and is often cited by them, not always properly. The aim of this research is to set up an econometric model describing Regional Innovation Systems, making it one of the first attempts to test and enhance this theory with a quantitative approach. A dataset of 104 secondary and primary data points from European regions was built in order to run a multiple linear regression, testing whether Regional Innovation Systems are really correlated with regional innovation and with regional innovation in cooperation with foreign partners.
Furthermore, an exploratory multiple linear regression was performed to verify which variables, among those describing a Regional Innovation System, are the most significant for innovating, alone or with foreign partners. The effectiveness of present innovation policies was then tested against the findings of the econometric model. The developed model confirmed the role of Regional Innovation Systems in creating innovation, even in cooperation with international partners: this represents one of the first quantitative confirmations of a theory previously based on qualitative models only. The results also indicated a minor influence of National Innovation Systems: comparing the analysis of existing innovation policies, at both regional and national level, with our findings revealed the need for a potentially pivotal change in the direction currently followed by policy makers. Last, while confirming the role of the presence of a learning environment in a region and the catalytic role of regional administrations, this research offers a potential new perspective for the whole private sector in creating a Regional Innovation System.
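A sketch of the kind of multiple linear regression described can be run on synthetic data with known coefficients. The variable names and values below are assumptions standing in for the thesis's 104-region dataset, not the actual data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 104                                   # one row per region, as in the study
rd_spend   = rng.normal(2.0, 0.5, n)      # e.g. R&D expenditure (% of GDP)
univ_links = rng.normal(5.0, 1.5, n)      # e.g. firm-university collaborations
training   = rng.normal(3.0, 1.0, n)      # e.g. a learning-environment proxy

# Synthetic outcome generated with known coefficients plus noise
innovation = (1.0 + 0.8 * rd_spend + 0.3 * univ_links + 0.5 * training
              + rng.normal(0, 0.2, n))

# Ordinary least squares: design matrix with an intercept column
X = np.column_stack([np.ones(n), rd_spend, univ_links, training])
beta, *_ = np.linalg.lstsq(X, innovation, rcond=None)

resid = innovation - X @ beta
r_squared = 1 - (resid @ resid) / np.sum((innovation - innovation.mean()) ** 2)
```

On real regional data, the size and significance of each estimated coefficient is what allows testing which components of a Regional Innovation System actually drive innovation, alone or with foreign partners.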

Relevance:

30.00%

Publisher:

Abstract:

Purpose – This paper describes a “work in progress” research project being carried out with a public health care provider in the UK, a large NHS hospital Trust. Enhanced engagement with patients is one of the Trust’s core principles, but it is recognised that much more needs to be done to achieve this, and that ICT systems may be able to provide some support. The project is intended to find ways to better capture and evaluate the “voice of the patient” in order to lead to improvements in health care quality, safety and effectiveness. Design/methodology/approach – We propose to investigate the use of a patient-orientated knowledge management system (KMS) in managing knowledge about and from patients. The study is a mixed methods (quantitative and qualitative) investigation based on traditional action research, intended to answer the following three research questions: (1) How can a KMS be used as a mechanism to capture and evaluate patient experiences to provoke patient service change? (2) How can the KMS assist in providing a mechanism for systematising patient engagement? (3) How can patient feedback be used to stimulate improvements in care, quality and safety? Originality/value – This methodology aims to involve patients at all phases of the study from its initial design onwards, thus leading to an understanding of the issues associated with using a KMS to manage knowledge about and for patients that is driven by the patients themselves. Practical implications – The outcomes of the project for the collaborating hospital will be, firstly, a system for capturing and evaluating knowledge about and from patients and, as a consequence, improved outcomes for both the patients and the service provider. More generally, it will produce a set of guidelines for managing patient knowledge in an NHS hospital that have been tested in one case example.

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). Recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimise this data processing and rendering time.
These processing techniques include standard processing methods, comprising a set of algorithms to process the raw (interference) data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time; its processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of the OCT system. Currently, investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
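As a hedged sketch of the "standard processing" step that turns detector data into A-scans, the following simulates a single-reflector interference fringe and recovers its depth with an inverse FFT. All parameters (camera pixel count, reflector depth, fringe amplitude) are invented for illustration; a GPU implementation would run the same pipeline with a CUDA-backed array library in place of NumPy:

```python
import numpy as np

n_pixels = 2048                              # spectrometer camera pixels (assumed)
k = np.linspace(-np.pi, np.pi, n_pixels)     # (already linearised) wavenumber axis
depth_bin = 300                              # reflector depth in FFT bins (assumed)

# Interference fringe from a single reflector: DC term plus cosine modulation
fringe = 1.0 + 0.5 * np.cos(depth_bin * k)

# Standard processing: subtract the DC level, apodise with a window,
# then inverse FFT to obtain the depth profile (A-scan) magnitude
spectrum = (fringe - fringe.mean()) * np.hanning(n_pixels)
a_scan = np.abs(np.fft.ifft(spectrum))[: n_pixels // 2]

peak = int(np.argmax(a_scan))                # recovered reflector depth, in bins
```

Each camera line requires such an FFT, so a full B-scan involves thousands of independent transforms; that embarrassing parallelism is exactly why GPU processing can bring the pipeline to real time, as the thesis describes.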