809 results for Agent-based brokerage platform
Abstract:
Multi-agent systems have become increasingly mature, but their emergence does not make the traditional OO approach obsolete. On the contrary, OO methodologies can benefit from the principles and tools designed for agent systems. The Agent-Rule-Class (ARC) framework is proposed as an approach that builds agents upon traditional OO system components and uses business rules to dictate agent behaviour with the aid of OO components. By modelling agent knowledge in business rules, the proposed paradigm provides a straightforward means to develop agent-oriented systems based on existing object-oriented systems and offers features that are otherwise difficult to achieve in the original OO systems. The main outcome of using ARC is adaptivity: the framework is supported by a tool that ensures agents implement up-to-date requirements from business people, reflecting the currently desired behaviour, without frequent system rebuilds. ARC is illustrated with a rail track example.
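As a rough illustration of the rule-driven architecture this abstract describes, the following minimal Python sketch shows an agent whose behaviour is dictated entirely by externally supplied business rules acting on existing OO components. The class names, the rule, and the wear threshold are illustrative assumptions, not taken from the ARC paper.

```python
# Sketch: a rule-driven agent wrapping an existing OO component.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Rule:
    """A business rule: when `condition` holds, run `action` on OO components."""
    name: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]


class Agent:
    """An agent whose behaviour comes entirely from its (reloadable) rules."""

    def __init__(self, rules: List[Rule]):
        self.rules = rules  # rules can be swapped at run time, no rebuild needed

    def perceive_and_act(self, facts: dict) -> None:
        for rule in self.rules:
            if rule.condition(facts):
                rule.action(facts)


class MaintenanceScheduler:
    """Stand-in for an existing OO component reused by the agent."""

    def schedule_inspection(self, section_id: str) -> None:
        print(f"Inspection scheduled for track section {section_id}")


scheduler = MaintenanceScheduler()
rules = [
    Rule(
        name="inspect-worn-track",
        condition=lambda f: f["wear_mm"] > 2.0,
        action=lambda f: scheduler.schedule_inspection(f["section_id"]),
    )
]

agent = Agent(rules)
agent.perceive_and_act({"section_id": "S-17", "wear_mm": 2.4})
```

Because the rules are plain data handed to the agent, replacing the rule list at run time changes behaviour without recompiling the underlying OO classes, which is the adaptivity the abstract emphasises.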
Abstract:
This paper describes an efficient approach for provisioning network resources based on SLAs and a range of negotiating agents. The work arose from direct collaboration with Fujitsu research and involved a worldwide press release of their agent brokering system, which was based on this work; it also led to a plenary address by A. Marshall (QUB) and A. Campbell (Columbia, USA) at the 4th IFIP/IEEE International Conference on Management of Multimedia Networks and Services 2001 (MMNS'01). ISSN: 0926-6801
Abstract:
We present a generic Service Level Agreement (SLA)-driven service provisioning architecture, which enables dynamic and flexible bandwidth reservation schemes on a per-user or a per-application basis. Various session-level SLA negotiation schemes involving bandwidth allocation, service start time and service duration parameters are introduced and analysed. The results show that these negotiation schemes can be used for the benefit of both the end user and the network provider, such as achieving the highest individual SLA optimisation in terms of Quality of Service (QoS) and price. A prototype based on an industrial agent platform has also been built to demonstrate the negotiation scenario, and this is presented and discussed.
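To make the bandwidth/start-time/duration negotiation concrete, here is a minimal Python sketch of one possible counter-offer policy over a single bottleneck link. The per-hour spare-capacity model, the pricing rule, and all parameter names are illustrative assumptions, not the paper's negotiation schemes.

```python
# Sketch: session-level SLA negotiation over bandwidth, start time and duration.
from dataclasses import dataclass


@dataclass
class SLAOffer:
    bandwidth_mbps: float
    start_hour: int
    duration_h: int
    price: float


class ProviderAgent:
    """Holds the spare bandwidth of a single bottleneck link, per hour slot."""

    def __init__(self, spare_mbps_per_hour):
        self.spare = spare_mbps_per_hour

    def respond(self, offer: SLAOffer) -> SLAOffer:
        # First try to honour the requested bandwidth by delaying the start time.
        for start in range(offer.start_hour, len(self.spare) - offer.duration_h + 1):
            window = self.spare[start:start + offer.duration_h]
            if all(slot >= offer.bandwidth_mbps for slot in window):
                return SLAOffer(offer.bandwidth_mbps, start, offer.duration_h, offer.price)
        # Otherwise counter-offer the largest bandwidth that fits at the requested
        # time, at a proportionally reduced price.
        window = self.spare[offer.start_hour:offer.start_hour + offer.duration_h]
        best = min(window)
        return SLAOffer(best, offer.start_hour, offer.duration_h,
                        round(offer.price * best / offer.bandwidth_mbps, 2))


provider = ProviderAgent(spare_mbps_per_hour=[4, 4, 10, 10, 10, 10])
request = SLAOffer(bandwidth_mbps=8.0, start_hour=0, duration_h=2, price=5.0)
print(provider.respond(request))  # counter-offer: same bandwidth, start delayed to hour 2
```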
Abstract:
Implementing effective time analysis methods quickly and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure-oriented, knowledge-based approach for intelligent time analysis of aircraft assembly processes within a digital manufacturing framework. A knowledge system is developed so that design knowledge can be intelligently retrieved to perform assembly time analysis automatically. A time estimation method based on MOST (Maynard Operation Sequence Technique) is reviewed and employed. Knowledge capture, transfer and storage within the digital manufacturing environment are discussed in detail. Configured plan types, GUIs and functional modules are designed and developed for the automated time analysis. An exemplar study using an aircraft panel assembly from a regional jet is also presented. Although the method currently focuses on aircraft assembly, it can also be applied in other industry sectors, such as transportation, automotive and shipbuilding. The main contribution of the work is a methodology that facilitates the integration of time analysis with design and manufacturing using a digital manufacturing platform solution.
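For readers unfamiliar with MOST-style estimation, the sketch below shows the arithmetic for a single assembly step, assuming the Basic MOST "General Move" sequence A B G A B P A, a step time of 10 TMU per index unit, and 1 TMU = 0.036 s. The index values in the example are illustrative, not taken from the paper's knowledge base.

```python
# Sketch: Basic MOST "General Move" time estimate for one assembly step.
TMU_SECONDS = 0.036  # 1 TMU = 0.036 s (100,000 TMU per hour)


def general_move_tmu(a1: int, b1: int, g1: int, a2: int, b2: int, p1: int, a3: int) -> int:
    """Time in TMU for one A-B-G-A-B-P-A general move (sum of indices x 10)."""
    return 10 * (a1 + b1 + g1 + a2 + b2 + p1 + a3)


# Example: walk a few steps (A6), no body motion (B0), grasp a light part (G1),
# move it within reach (A1), no body motion (B0), position with adjustments (P3),
# no return (A0).  All index choices are illustrative.
tmu = general_move_tmu(6, 0, 1, 1, 0, 3, 0)
print(f"{tmu} TMU = {tmu * TMU_SECONDS:.2f} s")
```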
Abstract:
Modifications based upon a metabolite of ciglitazone afforded BRL 49653 (I), a novel, potent insulin sensitizer. A facile synthesis of this compound is described.
Abstract:
The development of artificial neural network (ANN) models to predict the rheological behavior of grouts is described in this paper, and the sensitivity of these rheological parameters to variations in the mixture ingredients is also evaluated. The input parameters of the neural network were the mixture ingredients influencing the rheological behavior of grouts, namely the cement content, fly ash, ground-granulated blast-furnace slag, limestone powder, silica fume, water-binder ratio (w/b), high-range water-reducing admixture, and viscosity-modifying agent (welan gum). The six outputs of the ANN models were the mini-slump, the apparent viscosity at low shear, and the yield stress and plastic viscosity values of the Bingham and modified Bingham models, respectively. The model is based on a multi-layer feed-forward neural network. The details of the proposed ANN, with its architecture, training, and validation, are presented in this paper. A database of 186 mixtures from eight different studies was developed to train and test the ANN model. The effectiveness of the trained ANN model is evaluated by comparing its responses with the experimental data used in the training process. The results show that the ANN model can accurately predict the mini-slump, the apparent viscosity at low shear, and the yield stress and plastic viscosity values of the Bingham and modified Bingham models for the pseudo-plastic grouts used in the training process. The model can also predict these properties for new mixtures within the practical range of the input variables used in training, with absolute errors of 2%, 0.5%, 8%, 4%, 2%, and 1.6%, respectively. The sensitivity analysis of the ANN model showed that the trends obtained by the models were in good agreement with the actual experimental results, demonstrating the effect of the mixture ingredients on fluidity and on the rheological parameters of both the Bingham and modified Bingham models.
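A minimal sketch of the kind of multi-layer feed-forward model the abstract describes, with its 8 mixture inputs and 6 rheological outputs, is shown below using scikit-learn. The hidden-layer size, hyperparameters and randomly generated data are placeholders standing in for the study's 186-mixture database and tuned architecture.

```python
# Sketch: 8-input, 6-output feed-forward network for grout rheology.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Inputs: cement, fly ash, slag, limestone powder, silica fume, w/b, HRWRA, welan gum.
X = rng.uniform(0.0, 1.0, size=(186, 8))   # stand-in for the real mixture database
# Outputs: mini-slump, apparent viscosity, Bingham yield stress / plastic viscosity,
# modified-Bingham yield stress / plastic viscosity.
y = rng.uniform(0.0, 1.0, size=(186, 6))   # stand-in for measured values

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(12,), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)

new_mix = scaler.transform(rng.uniform(0.0, 1.0, size=(1, 8)))
print(model.predict(new_mix))  # six predicted rheological properties for a new mixture
```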
Abstract:
The Microarray Innovations in Leukemia study assessed the clinical utility of gene expression profiling as a single test to subtype leukemias into conventional categories of myeloid and lymphoid malignancies. METHODS: The investigation was performed in 11 laboratories across three continents and included 3,334 patients. An exploratory retrospective stage I study was designed for biomarker discovery and generated whole-genome expression profiles from 2,143 patients with leukemias and myelodysplastic syndromes. The diagnostic accuracy of gene expression profiling was further validated in a prospective second study stage on an independent cohort of 1,191 patients. RESULTS: On the basis of 2,096 samples, the stage I study achieved 92.2% classification accuracy for all 18 distinct classes investigated (median specificity of 99.7%). In a second cohort of 1,152 prospectively collected patients, a classification scheme reached 95.6% median sensitivity and 99.8% median specificity for 14 standard subtypes of acute leukemia (eight acute lymphoblastic leukemia and six acute myeloid leukemia classes, n = 693). In 29 (57%) of 51 discrepant cases, the microarray results outperformed routine diagnostic methods. CONCLUSION: Gene expression profiling is a robust technology for the diagnosis of hematologic malignancies with high accuracy. It may complement current diagnostic algorithms and could offer a reliable platform for patients who lack access to today's state-of-the-art diagnostic work-up. Our comprehensive gene expression data set will be submitted to the public domain to foster research focusing on the molecular understanding of leukemias.
Abstract:
The synthesis and photophysical evaluation of a new lanthanide luminescence imaging agent is presented. The agent, a terbium-based cyclen complex, can, through the use of an iminodiacetate moiety, bind to damaged bone surfaces via chelation to exposed Ca(II) sites, enabling the imaging of the damage using confocal fluorescence scanning microscopy.
Abstract:
Background: Digital pathology provides a digital environment for the management and interpretation of pathological images and associated data. It is becoming increasingly popular to use modern computer-based tools and applications in pathology education, tissue-based research and clinical diagnosis. Uptake of this new technology is stymied by its single-user orientation and by its reliance on a cumbersome combination of mouse and keyboard for navigation and annotation.
Methodology: In this study we developed SurfaceSlide, a dedicated viewing platform which enables the navigation and annotation of gigapixel digitised pathological images using fingertip touch. SurfaceSlide was developed using the Microsoft Surface, a 30-inch multi-touch tabletop computing platform. SurfaceSlide users can perform direct panning and zooming operations on digitised slide images, which are downloaded onto the Microsoft Surface platform from a remote server on demand. Users can also draw annotations and key in text using an on-screen virtual keyboard. We also developed a smart caching protocol which caches the regions surrounding a field of view at multiple resolutions, thus providing a smooth and vivid user experience and reducing the delay of downloading images from the internet (a minimal sketch of such a prefetching cache follows this abstract). We compared the usability of SurfaceSlide against the Aperio ImageScope and PathXL online viewers.
Conclusion: SurfaceSlide is intuitive, fast and easy to use. SurfaceSlide represents the most direct, effective and intimate human–digital slide interaction experience. It is expected that SurfaceSlide will significantly enhance digital pathology tools and applications in education and clinical practice.
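The sketch below illustrates the kind of prefetching cache described in the Methodology paragraph: tiles in a margin around the current field of view are requested at the current zoom level and at one coarser level, so that panning and zooming hit the cache rather than the remote server. The tile size, margin, level scaling and fetch function are illustrative assumptions, not SurfaceSlide's actual protocol.

```python
# Sketch: multi-resolution tile prefetching around a field of view.
from functools import lru_cache

TILE = 256  # tile edge in pixels (assumed)


@lru_cache(maxsize=4096)
def fetch_tile(level: int, col: int, row: int) -> bytes:
    # Stand-in for an HTTP request to the remote slide server.
    return f"tile L{level} ({col},{row})".encode()


def prefetch(level: int, view_x: int, view_y: int, view_w: int, view_h: int,
             margin_tiles: int = 1) -> None:
    """Warm the cache for the view plus a margin, at this level and one coarser level."""
    for lvl, scale in ((level, 1), (level + 1, 2)):  # level+1 assumed to be 2x coarser
        c0 = (view_x // scale) // TILE - margin_tiles
        r0 = (view_y // scale) // TILE - margin_tiles
        c1 = ((view_x + view_w) // scale) // TILE + margin_tiles
        r1 = ((view_y + view_h) // scale) // TILE + margin_tiles
        for col in range(max(c0, 0), c1 + 1):
            for row in range(max(r0, 0), r1 + 1):
                fetch_tile(lvl, col, row)


prefetch(level=0, view_x=10_000, view_y=20_000, view_w=1920, view_h=1080)
print(fetch_tile.cache_info())  # subsequent pans within the margin are cache hits
```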
Abstract:
Paralytic shellfish poisoning (PSP) toxins are produced by certain marine dinoflagellates and may accumulate in bivalve molluscs through filter feeding. The Mouse Bioassay (MBA) is the internationally recognised reference method of analysis, but it is prone to technical difficulties and is regarded with increasing disapproval for ethical reasons. As such, alternative methods are required. A rapid surface plasmon resonance (SPR) biosensor inhibition assay was developed to detect PSP toxins in shellfish by employing a saxitoxin polyclonal antibody (R895). Using an assay developed for and validated on the Biacore Q biosensor system, this project focused on transferring the assay to a high-throughput Biacore T100 biosensor in another laboratory. This was achieved using a prototype PSP toxin kit and recommended assay parameters based on the Biacore Q method. A monoclonal antibody (GT13A) was also assessed. Even though these two instruments are based on SPR principles, they vary widely in their mode of operation, including differences in the integrated microfluidic cartridges, autosampler system, and sensor chip compatibilities. Shellfish samples (n = 60), extracted using a simple, rapid procedure, were analysed on each platform, and the results were compared with the AOAC high-performance liquid chromatography (HPLC) and MBA methods. The overall agreement between each pair of methods, based on statistical 2 x 2 comparison tables, ranged from 85% to 94.4% using R895 and from 77.8% to 100% using GT13A. The results demonstrated that antibody-based assays with high sensitivity and broad specificity to PSP toxins can be applied to different biosensor platforms. (C) 2011 Elsevier B.V. All rights reserved.
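The overall agreement figures quoted above come from 2 x 2 method-comparison tables. The short sketch below shows that arithmetic; the counts are illustrative, not the study's 60-sample data.

```python
# Sketch: overall agreement between two qualitative methods from a 2 x 2 table.
def overall_agreement(both_pos: int, biosensor_only: int,
                      reference_only: int, both_neg: int) -> float:
    """Percentage of samples on which the two methods agree (positive or negative)."""
    total = both_pos + biosensor_only + reference_only + both_neg
    return 100.0 * (both_pos + both_neg) / total


# e.g. 22 samples positive by both methods, 35 negative by both, 3 discrepant.
print(f"{overall_agreement(22, 1, 2, 35):.1f}% agreement")
```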
Abstract:
A solvent-based, irreversible oxygen-indicator ink is described, comprising semiconductor photocatalyst nanoparticles, a solvent-soluble redox dye, a mild reducing agent and a polymer. Based on such an ink, a film made of titanium dioxide, a blue, solvent-soluble, ion-paired methylene blue dye, glycerol and the polymer zein loses its colour rapidly (
Abstract:
Convincing conversational agents require a coherent set of behavioral responses that can be interpreted by a human observer as indicative of a personality. This paper discusses the continued development and subsequent evaluation of virtual agents based on sound psychological principles. We use Eysenck's theoretical basis to explain aspects of the characterization of our agents, and we describe an architecture where personality affects the agent's global behavior quality as well as their back-channel productions. Drawing on psychological research, we evaluate perception of our agents' personalities and credibility by human viewers (N = 187). Our results suggest that we succeeded in validating theoretically grounded indicators of personality in our virtual agents, and that it is feasible to place our characters on Eysenck's scales. A key finding is that the presence of behavioral characteristics reinforces the prescribed personality profiles that are already emerging from the still images. Our long-term goal is to enhance agents' ability to sustain realistic interaction with human users, and we discuss how this preliminary work may be further developed to include more systematic variation of Eysenck's personality scales. © 2012 IEEE.
Abstract:
OpenPMU is an open platform for the development of phasor measurement unit (PMU) technology. A need has been identified for an open-source alternative to commercial PMU devices, tailored to the needs of the university researcher and enabling the development of new synchrophasor instruments from this foundation. OpenPMU achieves this through open-source hardware design specifications and software source code, allowing duplicates of the OpenPMU to be fabricated under open-source licenses. This paper presents the OpenPMU device based on the LabVIEW development environment. The device is performance-tested according to the IEEE C37.118.1 standard. Compatibility with the IEEE C37.118.2 messaging format is achieved through middleware which is readily adaptable to other PMU projects or applications. Improvements have been made to the original design to increase its flexibility. A new modularized architecture for the OpenPMU is presented, using an open messaging format which the authors propose be adopted as a platform for PMU research.
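At the core of any PMU is an estimate of the voltage or current phasor over a measurement window. The sketch below shows a single-bin DFT phasor estimate over one nominal cycle as a plain Python illustration; it is not OpenPMU's LabVIEW implementation, and the sampling rate, nominal frequency and test signal are assumed.

```python
# Sketch: single-bin DFT phasor estimate over one nominal cycle.
import cmath
import math

F0 = 50.0          # nominal system frequency (Hz), assumed
FS = 5000.0        # sampling rate (Hz), assumed
N = int(FS / F0)   # samples per nominal cycle


def phasor(samples):
    """RMS magnitude and phase angle (degrees) of the fundamental over one cycle."""
    acc = sum(samples[n] * cmath.exp(-2j * math.pi * n / N) for n in range(N))
    ph = (math.sqrt(2) / N) * acc  # 2/N DFT scaling combined with peak-to-RMS conversion
    return abs(ph), math.degrees(cmath.phase(ph))


# Synthetic 230 V (RMS) waveform with a 30 degree phase shift.
wave = [230 * math.sqrt(2) * math.cos(2 * math.pi * F0 * n / FS + math.radians(30))
        for n in range(N)]
print(phasor(wave))  # approximately (230.0, 30.0)
```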
Abstract:
To develop real-time simulations of wind instruments, digital waveguide filters can be used as an efficient representation of the air column. Many aerophones are shaped as horns, which can be approximated using conical sections; the derivation of conical waveguide filters is therefore of special interest. When these filters are used in combination with a generalized reed excitation, several classes of wind instruments can be simulated. In this paper we present methods for transforming a continuous description of conical tube segments into a discrete filter representation. The coupling of the reed model with the conical waveguide and a simplified model of the termination at the open end are described in the same way. It turns out that the complete lossless conical waveguide requires only one type of filter. Furthermore, we developed a digital reed excitation model which is based purely on numerical integration methods, i.e., without the use of a look-up table.
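For orientation, the sketch below shows the general delay-line structure that waveguide models build on, using a much-simplified cylindrical (not conical) bore: two delay lines with an inverting one-pole low-pass reflection at the open end. It does not reproduce the paper's conical filters or reed excitation model; all coefficients are assumed.

```python
# Sketch: simplified cylindrical-bore digital waveguide (two delay lines).
class Waveguide:
    def __init__(self, length: int):
        self.fwd = [0.0] * length  # wave travelling toward the open end
        self.bwd = [0.0] * length  # reflected wave travelling back to the mouthpiece
        self.lp_state = 0.0        # one-pole low-pass state at the open end

    def tick(self, injected: float) -> float:
        # Wave arriving at the open end is partially reflected, inverted and low-passed.
        out_end = self.fwd[-1]
        self.lp_state = 0.5 * self.lp_state + 0.5 * (-0.95 * out_end)
        reflected = self.lp_state

        # Advance both delay lines by one sample.
        self.fwd = [injected + self.bwd[0]] + self.fwd[:-1]
        self.bwd = self.bwd[1:] + [reflected]

        # Rough "radiated" output at the open end.
        return out_end + reflected


wg = Waveguide(length=40)
signal = [wg.tick(1.0 if n == 0 else 0.0) for n in range(200)]  # impulse response
print([round(s, 3) for s in signal[38:44]])  # the pulse emerges after ~40 samples
```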
Abstract:
Norms constitute a powerful coordination mechanism among heterogeneous agents. In this paper, we propose a rule language to specify and explicitly manage the normative positions of agents (permissions, prohibitions and obligations), with which distinct deontic notions and their relationships can be captured. Our rule-based formalism includes constraints for greater expressiveness and precision and makes it possible to supplement (and implement) electronic institutions with norms. We also show how some normative aspects are given a computational interpretation. © 2008 Springer Science+Business Media, LLC.
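As a rough illustration of how normative positions with constraints might be represented and checked computationally, here is a minimal Python sketch in which each norm couples a deontic modality with an action and a constraint over the agent's state. The syntax, the auction-style state fields and the norms themselves are illustrative stand-ins, not the paper's rule language.

```python
# Sketch: norms as (modality, action, constraint) and a simple violation check.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Norm:
    modality: str                       # "permitted" | "prohibited" | "obliged"
    action: str
    constraint: Callable[[dict], bool]  # condition under which the norm applies


norms = [
    Norm("prohibited", "bid", lambda s: s["price"] > s["credit_limit"]),
    Norm("obliged", "pay", lambda s: s["auction_closed"] and s["is_winner"]),
]


def violations(state: dict, performed: set) -> list:
    """Norms the agent has violated in the given state."""
    out = []
    for n in norms:
        if n.modality == "prohibited" and n.action in performed and n.constraint(state):
            out.append(f"prohibition violated: {n.action}")
        if n.modality == "obliged" and n.constraint(state) and n.action not in performed:
            out.append(f"obligation unfulfilled: {n.action}")
    return out


state = {"price": 120, "credit_limit": 100, "auction_closed": True, "is_winner": True}
print(violations(state, performed={"bid"}))  # both norms flagged for this state
```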