26 results for Classifier Combination Systems

in Aston University Research Archive


Relevance: 30.00%

Abstract:

Solid dispersions can be used to improve the dissolution of poorly soluble drugs, and PVP is a common polymeric carrier in such systems. The mechanisms controlling drug release from solid dispersions are not fully understood, and the proposed theories depend on an understanding of the dissolution behaviour of both components of the dispersion. This study uses microviscometry to measure small changes in the viscosity of the dissolution medium as the polymer dissolves from ibuprofen-PVP solid dispersions. The microviscometer determines the dynamic and kinematic viscosity of liquids based on the rolling/falling-ball principle. Using a standard USP dissolution apparatus, the dissolution of the polymer from the solid dispersion was easily measured alongside drug release. Drug release was found to closely follow polymer dissolution at the molecular weights and ratios used. The combination of sensitivity and ease of use makes microviscometry a valuable technique for elucidating the mechanisms governing drug release from polymeric delivery systems. © 2004 Elsevier B.V. All rights reserved.
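As a rough illustration of the measurement principle mentioned above, rolling/falling-ball instruments rest on the relation eta = K * (rho_ball - rho_fluid) * t, with K a ball/tube calibration constant and t the measured rolling time. The sketch below applies that relation; the calibration constant and densities are made up and none of the values come from the study.

```python
# Minimal sketch of the rolling/falling-ball viscometry relation.
# K is in mPa*s*cm^3/(g*s), densities in g/cm^3, time in s; all values
# below are illustrative, not taken from the ibuprofen-PVP study.

def dynamic_viscosity(K, rho_ball, rho_fluid, roll_time_s):
    """Dynamic viscosity (mPa*s) from eta = K * (rho_ball - rho_fluid) * t."""
    return K * (rho_ball - rho_fluid) * roll_time_s

def kinematic_viscosity(eta_mPa_s, rho_fluid):
    """Kinematic viscosity (mm^2/s): nu = eta / rho."""
    return eta_mPa_s / rho_fluid

# A small rise in rolling time as PVP dissolves signals polymer release.
eta = dynamic_viscosity(K=0.5, rho_ball=7.85, rho_fluid=1.00, roll_time_s=2.4)
print(eta, kinematic_viscosity(eta, rho_fluid=1.00))
```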

Relevance: 30.00%

Abstract:

Vaccination remains a key tool in the protection against, and eradication of, diseases. However, the development of new safe and effective vaccines is not easy. Various currently licensed live-organism vaccines exhibit high efficacy, but this benefit carries risk, owing to the adverse reactions associated with these vaccines; the risk-benefit balance therefore needs to be addressed in vaccine development. Sub-unit proteins offer a much safer alternative; however, their efficacy is low. Adjuvanted systems have been shown to enhance the immunogenicity of these sub-unit vaccines through protection (i.e. preventing degradation of the antigen in vivo) and enhanced targeting of the antigens to professional antigen-presenting cells. Understanding the immunological implications of the relevant disease enables the rational design and development of potential adjuvant systems. Novel adjuvant research combines pharmaceutical analysis with detailed immunological investigation: pharmaceutically designed adjuvants are driven by an increased understanding of the mechanisms of adjuvant activity, largely facilitated by the description of highly specific innate immune recognition of components usually associated with invading bacteria or viruses. The majority of pharmaceutical adjuvants currently being investigated are particulate delivery systems, such as liposome formulations. As adjuvants, liposomes have been shown to enhance immunity against the associated disease, particularly when a cationic lipid is used within the formulation; the inclusion of components such as immunomodulators enhances immunity further. This review examines the use and application of effective adjuvants, with particular emphasis on liposome-based systems, and considers the mechanisms of adjuvant activity, the analysis of complex immunological characteristics, and the formulation and delivery of these vaccines.

Relevance: 30.00%

Abstract:

Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing 'correct' and timely responses. Since these systems are increasingly being used in systems with safety implications, it is crucial that they are designed and developed to operate correctly. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques ranging from the mathematically intensive to those with only some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component, since the informality provides ease of understanding while the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between the formalisms are examined. Such techniques have previously been applied to the formal analysis of concurrent systems; this thesis adapts them for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system, and it is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. A common problem with Petri-net-based techniques is the complexity of generating the reachability graph. This thesis addresses the problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state; these sets also allow each state to be checked for inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves a model-based controller, which uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system, synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
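For readers unfamiliar with the Petri net firing rule and the reachability construction referred to above, the following minimal sketch shows a toy net and a depth-bounded (partial) expansion of the markings reachable from one state. It illustrates the general idea only, not the thesis's technique or tool; the transitions and the depth bound are invented.

```python
# Toy Petri net: transitions map a name to (preconditions, postconditions),
# each given as {place: tokens}. A transition is enabled when every
# precondition place holds enough tokens; firing moves the tokens.
from collections import deque

TRANSITIONS = {
    "start_feed": ({"idle": 1}, {"feeding": 1}),
    "stop_feed":  ({"feeding": 1}, {"idle": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def partial_reachability(initial, depth=3):
    """Breadth-first expansion of markings, cut off at a fixed depth,
    giving a partial reachability graph rooted at one state."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([(initial, 0)])
    while queue:
        marking, d = queue.popleft()
        if d == depth:
            continue
        for pre, post in TRANSITIONS.values():
            if enabled(marking, pre):
                m2 = fire(marking, pre, post)
                key = tuple(sorted(m2.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append((m2, d + 1))
    return seen

print(partial_reachability({"idle": 1}))
```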

Relevance: 30.00%

Abstract:

The detection of signals in the presence of noise is one of the most basic and important problems encountered by communication engineers. Although the literature abounds with analyses of communications in Gaussian noise, relatively little work has appeared dealing with communications in non-Gaussian noise. In this thesis several digital communication systems disturbed by non-Gaussian noise are analysed. The thesis is divided into two main parts. In the first part, a filtered-Poisson impulse noise model is utilized to calculate the error probability characteristics of a linear receiver operating in additive impulsive noise. First, the effect that non-Gaussian interference has on the performance of a receiver optimized for Gaussian noise is determined. The factors affecting the choice of modulation scheme so as to minimize the detrimental effects of non-Gaussian noise are then discussed. In the second part, a new theoretical model of impulsive noise is developed that fits well with the observed statistics of noise in radio channels below 100 MHz. This empirical noise model is applied to the detection of known signals in the presence of noise to determine the optimal receiver structure. The performance of such a detector is assessed and found to depend on the signal shape and the time-bandwidth product, as well as the signal-to-noise ratio. The optimal signal for minimizing the detector's probability of error is determined. Attention is then turned to the problem of threshold detection: detector structure, large-sample performance and robustness against errors in the detector parameters are examined. Finally, estimators of quantities such as the occurrence time of an impulse and the parameters of the empirical noise model are developed for an adaptive system with slowly varying conditions.
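To give a feel for the kind of model named above, the sketch below simulates a simple filtered-Poisson-style impulsive channel (Gaussian background plus randomly arriving large impulses) and estimates the bit error rate of a sign-decision receiver designed for Gaussian noise. All rates and amplitudes are invented; this is a generic illustration, not the thesis's model.

```python
# Impulsive (filtered-Poisson-style) noise: a Gaussian background plus
# Poisson-arriving impulses of much larger variance. Values are illustrative.
import random

def impulsive_noise(n, rate=0.05, gauss_sigma=0.3, impulse_sigma=3.0):
    """n noise samples; each sample gains a large impulse with prob. 'rate'."""
    return [random.gauss(0, gauss_sigma) +
            (random.gauss(0, impulse_sigma) if random.random() < rate else 0.0)
            for _ in range(n)]

def ber_antipodal(n_bits=100_000):
    """BER of +/-1 antipodal signalling under a sign decision (the decision
    rule a Gaussian-optimized linear receiver would use)."""
    noise = impulsive_noise(n_bits)
    errors = sum(1 for k in range(n_bits)
                 if (1.0 + noise[k]) < 0.0)   # transmit +1, decide by sign
    return errors / n_bits

print(f"BER under impulsive noise: {ber_antipodal():.4f}")
```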

Relevance: 30.00%

Abstract:

Liposome systems are well reported for their activity as vaccine adjuvants; however, novel lipid-based microbubbles have also been reported to enhance the targeting of antigens into dendritic cells (DCs) in cancer immunotherapy (Suzuki et al 2009). This research initially focused on the formulation of gas-filled, lipid-coated microbubbles and their potential activation of macrophages using in vitro models; later studies in the thesis concentrated on aqueous-filled liposomes as vaccine delivery systems. Initial work involved formulating and characterising four different methods of producing lipid-coated microbubbles (sometimes referred to as gas-filled liposomes), by homogenisation, sonication, a gas-releasing chemical reaction, and agitation/pressurisation, in terms of stability and physico-chemical characteristics. Two of the preparations were tested as pressure probes in MRI studies: the first was composed of a standard phospholipid (DSPC) filled with air or nitrogen (N2), whilst in the second the microbubbles were composed of a fluorinated phospholipid (F-GPC) filled with a fluorocarbon-saturated gas. The studies showed that, whilst maintaining high sensitivity, a novel contrast agent allowing stable MRI measurements of fluid pressure over time could be produced using lipid-coated microbubbles. The F-GPC microbubbles were found to withstand pressures up to 2.6 bar with minimal damage, whereas the DSPC microbubbles were damaged above 1.3 bar. However, N2-filled DSPC microbubbles were also extremely robust to pressure, performing similarly to the F-GPC-based microbubbles. Following the MRI studies, the air- and N2-filled DSPC microbubbles were assessed for their potential activation of macrophages using in vitro models and compared to equivalent aqueous-filled liposomes. The microbubble formulations did not stimulate macrophage uptake, so subsequent studies focused on aqueous-filled liposomes. Further studies concentrated on formulating and characterising, both physico-chemically and immunologically, cationic liposomes based on the potent adjuvant dimethyldioctadecylammonium (DDA) and the immunomodulator trehalose dibehenate (TDB), with the addition of polyethylene glycol (PEG). One proposed hypothesis for the mechanism behind the immunostimulatory effect of DDA:TDB is the 'depot effect', in which the liposomal carrier helps to retain the antigen at the injection site, thereby increasing the time of vaccine exposure to immune cells; this effect has been suggested to be primarily due to the liposomes' cationic nature. Results reported within this thesis demonstrate that higher levels of PEG (25%) significantly inhibited the formation of a liposome depot at the injection site and severely limited the retention of antigen there, resulting in faster drainage of the liposomes from the site of injection. The versatility of DDA:TDB-based cationic liposomes in combination with different immunostimulatory ligands, including polyinosinic-polycytidylic acid (poly(I:C), a TLR 3 ligand) and CpG (a TLR 9 ligand), either entrapped within the vesicles or adsorbed onto the liposome surface, was investigated for immunogenic capacity as vaccine adjuvants.
Small unilamellar vesicles (SUVs) of DDA:TDB (20-100 nm native size) with protein antigen adsorbed to the vesicle surface were the most potent in inducing both T cell (7-fold increase) and antibody (up to 2 log increase) antigen-specific responses. The addition of the TLR agonists poly(I:C) and CpG to SUV liposomes had little or no effect on their adjuvanticity. Finally, threitol ceramide (ThrCer), a new immunostimulatory agent, was incorporated into the bilayers of liposomes composed of DDA or DSPC to investigate its uptake by dendritic cells (DCs) and presentation on CD1d molecules to invariant natural killer T cells. These systems were prepared both as multilamellar vesicles (MLVs) and as SUVs. IFN-γ secretion was higher for the DDA SUV liposome formulation (p < 0.05), suggesting that ThrCer encapsulation in this formulation resulted in higher uptake by DCs.

Relevance: 30.00%

Abstract:

We present the prototype tool CADS* for the computer-aided development of an important class of self-* systems, namely systems whose components can be modelled as Markov chains. Given a Markov chain representation of the IT components to be included in a self-* system, CADS* automates or aids (a) the development of the artifacts necessary to build the self-* system; and (b) their integration into a fully operational self-* solution. This is achieved through a combination of formal software development techniques, including model transformation, model-driven code generation and dynamic software reconfiguration.
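As an illustration of the kind of Markov chain component model a tool like CADS* takes as input, the sketch below computes the stationary distribution of a toy two-state (ok/failed) component by power iteration. The matrix values are invented, and the code is not part of CADS* itself.

```python
# Stationary distribution of a discrete-time Markov chain by power iteration.
def stationary(P, iters=1000):
    """Iterate pi <- pi * P for a row-stochastic matrix P (list of rows)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical component: ok -> ok 0.95, ok -> failed 0.05;
# failed -> ok 0.60 (repair), failed -> failed 0.40.
P = [[0.95, 0.05],
     [0.60, 0.40]]
print(stationary(P))  # long-run availability vs. downtime of the component
```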

Relevance: 30.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content: while low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven, component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies. It also proposes a number of developments in dynamic data driven application systems in order to further empower the role of data in this field.
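The following minimal sketch illustrates the general idea of data-driven, component-based behaviour described above: a self-describing XML document selects and parameterizes components at runtime, so content data rather than code determines what the program does. The element names and the registry are invented and do not reflect the Fluid project's actual formats or API.

```python
# Data-driven component assembly: XML content chooses and configures
# components at runtime. Element and attribute names are hypothetical.
import xml.etree.ElementTree as ET

CONTENT = """
<scene>
  <component type="rotator" speed="2.5"/>
  <component type="emitter" rate="100"/>
</scene>
"""

# Component registry: maps a content-data type name to a factory.
REGISTRY = {
    "rotator": lambda attrs: f"rotating at {attrs['speed']} rad/s",
    "emitter": lambda attrs: f"emitting {attrs['rate']} particles/s",
}

for elem in ET.fromstring(CONTENT):
    factory = REGISTRY[elem.get("type")]   # the data selects the component
    print(factory(elem.attrib))            # the data parameterizes it
```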

Relevance: 30.00%

Abstract:

We propose a novel approach to characterize the parabolically shaped pulses that can be generated from more conventional pulses via nonlinear propagation in cascaded sections of commercially available normally dispersive (ND) fibers. The impact of the initial pulse chirp on the passive pulse reshaping is examined. We furthermore demonstrate that the combination of pulse pre-chirping and propagation in a single ND fiber yields a simple, passive method for generating various temporal waveforms of practical interest.
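Pulse reshaping of this kind is conventionally modelled with the split-step Fourier method for the nonlinear Schrödinger equation; the sketch below is a generic implementation of that method with an illustrative pre-chirped Gaussian input. The fiber parameters, chirp value and sign convention are assumptions, not values taken from the paper.

```python
# Generic split-step Fourier propagation for the nonlinear Schroedinger
# equation; beta2 > 0 models a normally dispersive fiber. Sign conventions
# vary between texts; this is one common choice.
import numpy as np

def propagate(u, dt, length, steps, beta2=20e-27, gamma=3e-3):
    """Propagate the field envelope u(t) over 'length' metres of fiber,
    alternating a dispersion step (frequency domain) and a Kerr step."""
    w = 2 * np.pi * np.fft.fftfreq(u.size, dt)        # angular frequencies
    dz = length / steps
    disp = np.exp(0.5j * beta2 * w**2 * dz)           # linear operator
    for _ in range(steps):
        u = np.fft.ifft(disp * np.fft.fft(u))         # dispersion step
        u = u * np.exp(1j * gamma * np.abs(u)**2 * dz)  # nonlinear step
    return u

t = np.linspace(-10e-12, 10e-12, 2**12)               # 20 ps time window
dt = t[1] - t[0]
u0 = np.exp(-t**2 / (2 * (1e-12)**2)).astype(complex)  # 1 ps, 1 W Gaussian
C = 5.0                                                # illustrative chirp
u0 *= np.exp(-0.5j * C * (t / 1e-12)**2)               # pre-chirp the pulse
out = propagate(u0, dt, length=1000.0, steps=400)      # 1 km of ND fiber
print(float(np.max(np.abs(out)**2)))                   # reshaped peak power
```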

Relevance: 30.00%

Abstract:

An investigation was carried out into the different approaches used by Expert Systems researchers to solve problems in the domain of Mechanical Design. The techniques used for conventional formal logic programming were compared with those used when applying Expert Systems concepts. A literature survey of design processes was also conducted with a view to adopting a suitable model of the design process; a model comprising a variation on two established ones was developed and applied to a problem within what are described as class 3 design tasks. The research explored the application of these concepts to Mechanical Engineering design problems and their implementation on a microcomputer using an Expert System building tool. It was necessary to explore the use of Expert Systems in this manner so as to bridge the gap between their use as a control structure and their use for detailed analytical design; the former application is well researched, and this thesis discusses the latter. Some Expert System building tools available to the author at the beginning of his work were evaluated specifically for their suitability for Mechanical Engineering design problems. Microsynics was found to be the most suitable on which to implement a design problem, because of its simple but powerful semantic-net knowledge representation structure and its ability to use other types of representation scheme. Two major implementations were carried out: the first a design program for a helical compression spring, and the second a gear-pair system design. Two concepts are proposed in the thesis for the modelling and implementation of design systems involving many equations. The proposed method enables equation manipulation and analysis using a combination of frames, semantic nets and production rules. The use of semantic nets for purposes other than psychology and natural language interpretation is quite new and represents one of the author's major contributions to knowledge. The development of a purpose-built shell program for this type of design problem is recommended as an extension of the research; Microsynics may usefully be used as a platform for this development.
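As a flavour of how design equations and production rules can be combined for a task like the helical compression spring example, the sketch below pairs the standard spring-rate formula k = G*d^4 / (8*D^3*n) with a simple accept/reject rule on the spring index. It is a generic illustration, not the Microsynics implementation; the rule thresholds are conventional rules of thumb, not values from the thesis.

```python
# Design equation plus production rule for a helical compression spring.
def spring_rate(G, d, D, n_coils):
    """Spring rate k (N/mm): k = G*d^4 / (8*D^3*n), the standard formula
    with shear modulus G, wire diameter d, mean coil diameter D."""
    return G * d**4 / (8 * D**3 * n_coils)

def spring_index_rule(d, D):
    """Production rule: keep the spring index C = D/d in a workable band."""
    C = D / d
    if C < 4:
        return "reject: index too low, spring is hard to manufacture"
    if C > 12:
        return "reject: index too high, spring may buckle or tangle"
    return "accept"

G = 79_300                 # shear modulus of spring steel, N/mm^2
d, D, n = 2.0, 16.0, 8.0   # illustrative wire/coil geometry, mm
print(spring_rate(G, d, D, n), spring_index_rule(d, D))
```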

Relevance: 30.00%

Abstract:

This research investigates the contribution that Geographic Information Systems (GIS) can make to the land suitability process used to determine the effects of a climate change scenario. The research is intended to redress the severe under-representation of developing countries within the literature examining the impacts of climatic change upon crop productivity. The methodology adopts some of the Intergovernmental Panel on Climate Change (IPCC) estimates for regional climate variations, based upon General Circulation Model (GCM) predictions, and applies them to a baseline climate for Bangladesh. Utilising the United Nations Food and Agriculture Organization's Agro-ecological Zones land suitability methodology and crop yield model, the effects of the scenario upon the agricultural productivity of 14 crops are determined. A Geographic Information System (IDRISI) is adopted in order to facilitate the methodology, in conjunction with a specially designed spreadsheet used to determine the yield and suitability rating for each crop. A simple optimisation routine using the GIS is incorporated to provide an indication of the 'maximum theoretical' yield available to the country, should the most calorifically significant crops be cultivated on each land unit both before and after the climate change scenario. This routine provides an estimate of the theoretical population supporting capacity of the country, both now and in the future, to assist with planning strategies and research. The research evaluates the utility of this alternative GIS-based methodology for the land evaluation process and determines the relative changes in crop yields that may result from changes in temperature, photosynthesis and flooding hazard frequency. In summary, the combination of a GIS and a spreadsheet was successful; the yield prediction model indicates that the application of the climate change scenario would have a deleterious effect upon the yields of the study crops, and any yield reductions would have severe implications for agricultural practices. The optimisation routine suggests that the 'theoretical maximum' population supporting capacity is well in excess of current and projected population figures; if this agricultural potential could be realised, however, it might provide some amelioration of the effects of climate change.
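The 'maximum theoretical' yield routine described above can be pictured with the following toy sketch: for each raster cell, choose the suitable crop with the highest calorific output, then convert the total to a population supporting capacity. All yields, calorie values and conversion figures here are invented for illustration and are not the study's data.

```python
# Per-cell max-calorie crop selection over a tiny 2x2 raster, then a
# population-supporting-capacity estimate. All numbers are illustrative.
YIELDS = {                        # suitability-adjusted yields, t/ha
    "rice":  [[4.1, 3.8], [0.0, 2.9]],   # 0.0 marks an unsuitable cell
    "wheat": [[2.2, 2.5], [1.9, 0.0]],
}
KCAL_PER_TONNE = {"rice": 3.6e6, "wheat": 3.4e6}
HA_PER_CELL = 10_000              # hypothetical cell size
KCAL_PER_PERSON_YEAR = 2_200 * 365

def supporting_capacity():
    total_kcal = 0.0
    for r in range(2):
        for c in range(2):
            # pick the calorifically best crop for this land unit
            best = max(YIELDS[crop][r][c] * KCAL_PER_TONNE[crop]
                       for crop in YIELDS)
            total_kcal += best * HA_PER_CELL
    return total_kcal / KCAL_PER_PERSON_YEAR

print(f"theoretical capacity: {supporting_capacity():,.0f} people")
```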

Relevance: 30.00%

Abstract:

A Product-Service System (PSS) is an integrated combination of products and services. This Western concept embraces a service-led competitive strategy, environmental sustainability, and a basis for differentiation from competitors who simply offer lower-priced products. This paper aims to report the state-of-the-art of PSS research by presenting a clinical review of the literature currently available on this topic. The literature is classified, and the major outcomes of each study are addressed and analysed. On this basis, the paper defines the PSS concept, reports on its origin and features, gives examples of applications along with potential benefits and barriers to adoption, summarizes available tools and methodologies, and identifies future research challenges.

Relevance: 30.00%

Abstract:

There is growing interest in the potential value of service-led competitive strategies to UK-based manufacturers. A Product-Service System (PSS) is one form of such a strategy and is based on an integrated combination of products and services; the concept also embraces environmental sustainability. This paper aims to summarise the state-of-the-art of PSS research by presenting a review of the literature currently available on this topic. The literature search is described and the major outcomes of the study are presented. On this basis, the paper defines the PSS concept and reports on its origin and features.

Relevance: 30.00%

Abstract:

The paper discusses both the complementary factors and the contradictions of adopting ERP-based systems together with Enterprise 2.0. ERP is well known for its efficient business process management; it is equally well known for the high failure rate of its implementations. According to [1], ERP systems can achieve efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. Enterprise 2.0, by contrast, supports flexible business process management and informal, less structured interactions [3],[4],[21]. Traditional research has claimed that efficiency and flexibility may seem incompatible, in that they are different business objectives and may exist in different organizational environments. This paper, however, breaks with that norm by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Drawing on multiple case studies, four cases presenting different attitudes to the use of ERP systems and enterprise social systems are examined, and, based on socio-technical theory, the paper presents an in-depth analysis of the benefits of combining ERP with Enterprise 2.0 for these firms.

Relevance: 30.00%

Abstract:

This paper discusses demand and supply chain management and examines how artificial intelligence techniques and RFID technology can enhance the responsiveness of the logistics workflow. The proposed system is expected to have a significant impact on the performance of logistics networks by virtue of its capability to adapt to unexpected supply and demand changes in a volatile marketplace, with responsiveness provided by an advanced technology, Radio Frequency Identification (RFID). Recent studies have found that RFID and artificial intelligence techniques are driving the development of total solutions in the logistics industry. Apart from tracking the movement of goods, RFID can play an important role in reflecting the inventory levels of various distribution areas. In today's globalized industrial environment, physical logistics operations and the associated flow of information are the essential elements for companies seeking an efficient logistics workflow. Fundamentally, a flexible logistics workflow, characterized by fast responsiveness in dealing with customer requirements through the integration of various value chain activities, is key to leveraging the business performance of enterprises. The significance of this research is its demonstration of the synergy obtained by using a combination of advanced technologies to form an integrated system that helps achieve a lean and agile logistics workflow.
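As a minimal picture of the inventory-reflection role the paper assigns to RFID, the sketch below updates a live stock count from tag-read events at a distribution area and raises a replenishment alert for a downstream planning module. Event fields, thresholds and the alert hook are assumptions, not the paper's system.

```python
# Live inventory picture driven by RFID tag-read events. All names,
# thresholds and the alert hook are hypothetical.
from collections import defaultdict

inventory = defaultdict(lambda: defaultdict(int))   # area -> sku -> count
REORDER_POINT = {"SKU-42": 10}

def on_tag_read(area, sku, direction):
    """Update stock when a tagged item passes an entry/exit reader; on an
    outbound read, alert the planning module if stock falls too low."""
    inventory[area][sku] += 1 if direction == "in" else -1
    if direction == "out" and inventory[area][sku] < REORDER_POINT.get(sku, 0):
        print(f"replenish {sku} at {area}")          # hook for the AI planner

for _ in range(12):
    on_tag_read("hub-east", "SKU-42", "in")          # stock rises to 12
for _ in range(3):
    on_tag_read("hub-east", "SKU-42", "out")         # 9 < 10 triggers alert
```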

Relevance: 30.00%

Abstract:

The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems together with Enterprise 2.0. ERP is well known for its efficient business process management, while Enterprise 2.0 supports flexible business process management and informal, less structured interactions. Traditional studies indicate that efficiency and flexibility may seem incompatible because they are different business objectives and may exist in different organizational environments. However, the chapter breaks with these traditional norms by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, the chapter analyzes the benefits and risks of combining ERP with Enterprise 2.0 from the process, organization, and people paradigms. © 2013 by IGI Global.