963 results for common agent architecture design


Relevance:

30.00%

Publisher:

Abstract:

This thesis describes work carried out on the design of new routes to a range of bisindolylmaleimide and indolo[2,3-a]carbazole analogs, and investigation of their potential as successful anti-cancer agents. Following initial investigation of classical routes to indolo[2,3-a]pyrrolo[3,4-c]carbazole aglycons, a new strategy employing base-mediated condensation of thiourea and guanidine with a bisindolyl β-ketoester intermediate afforded novel 5,6-bisindolylpyrimidin-4(3H)-ones in moderate yields. Chemical diversity within this H-bonding scaffold was then studied by substitution with a panel of biologically relevant electrophiles, and by reductive desulfurisation. Optimisation of difficult heterogeneous literature conditions for oxidative desulfurisation of thiouracils was also accomplished, enabling a mild route to a novel 5,6-bisindolyluracil pharmacophore to be developed within this work. The oxidative cyclisation of selected acyclic bisindolyl systems to form a new planar class of indolo[2,3-a]pyrimido[5,4-c]carbazoles was also investigated. Successful conditions for this transformation, as well as the limitations currently prevailing for this approach, are discussed. Synthesis of 3,4-bisindolyl-5-aminopyrazole as a potential isostere of bisindolylmaleimide agents was achieved, along with a comprehensive derivatisation study, in order to probe the chemical space for potential protein backbone H-bonding interactions. Synthesis of a related 3,4-arylindolyl-5-aminopyrazole series was also undertaken, based on identification of potent kinase inhibition within a closely related heterocyclic template. Following synthesis of approximately 50 novel compounds with a diversity of H-bonding enzyme-interacting potential within these classes, biological studies confirmed that significant topo II inhibition was present for 9 lead compounds, in previously unseen pyrazolo[1,5-a]pyrimidine, indolo[2,3-c]carbazole and branched S,N-disubstituted thiouracil derivative series. NCI-60 cancer cell line growth inhibition data for 6 representative compounds also revealed interesting selectivity differences between each compound class, while a new pyrimido[5,4-c]carbazole agent strongly inhibited cancer cell division at 10 µM, with appreciable cytotoxic activity observed across several tumour types.

Relevance:

30.00%

Publisher:

Abstract:

Emerging healthcare applications can benefit enormously from recent advances in pervasive technology and computing. This paper introduces the CLARITY Modular Ambient Health and Wellness Measurement Platform, which is a heterogeneous and robust pervasive healthcare solution currently under development at the CLARITY Center for Sensor Web Technologies. This intelligent and context-aware platform comprises the Tyndall Wireless Sensor Network prototyping system, augmented with an agent-based middleware and frontend computing architecture. The key contribution of this work is to highlight how interoperability, expandability, reusability and robustness can be manifested in the modular design of the constituent nodes and the inherently distributed nature of the controlling software architecture.

Relevance:

30.00%

Publisher:

Abstract:

With the rapid growth of the Internet and digital communications, the volume of sensitive electronic transactions being transferred and stored over and on insecure media has increased dramatically in recent years. The growing demand for cryptographic systems to secure this data, across a multitude of platforms, ranging from large servers to small mobile devices and smart cards, has necessitated research into low cost, flexible and secure solutions. As constraints on architectures such as area, speed and power become key factors in choosing a cryptosystem, methods for speeding up the development and evaluation process are necessary. This thesis investigates flexible hardware architectures for the main components of a cryptographic system. Dedicated hardware accelerators can provide significant performance improvements when compared to implementations on general purpose processors. Each of the proposed designs is analysed in terms of speed, area, power, energy and efficiency. Field Programmable Gate Arrays (FPGAs) are chosen as the development platform due to their fast development time and reconfigurable nature. Firstly, a reconfigurable architecture for performing elliptic curve point scalar multiplication on an FPGA is presented. Elliptic curve cryptography is one such method to secure data, offering similar security levels to traditional systems, such as RSA, but with smaller key sizes, translating into lower memory and bandwidth requirements. The architecture is implemented using different underlying algorithms and coordinates for dedicated Double-and-Add algorithms, twisted Edwards algorithms and SPA secure algorithms, and its power consumption and energy on an FPGA are measured. Hardware implementation results for these new algorithms are compared against their software counterparts and the best choices for minimum area-time and area-energy circuits are then identified and examined for larger key and field sizes. Secondly, implementation methods for another component of a cryptographic system, namely hash functions, developed in the recently concluded SHA-3 hash competition are presented. Various designs from the three rounds of the NIST-run competition are implemented on FPGA along with an interface to allow fair comparison of the different hash functions when operating in a standardised and constrained environment. Different methods of implementation for the designs and their subsequent performance are examined in terms of throughput, area and energy costs using various constraint metrics. Comparing many different implementation methods and algorithms is nontrivial. Another aim of this thesis is the development of generic interfaces used both to reduce implementation and test time and also to enable fair baseline comparisons of different algorithms when operating in a standardised and constrained environment. Finally, a hardware-software co-design cryptographic architecture is presented. This architecture is capable of supporting multiple types of cryptographic algorithms and is described through an application for performing public key cryptography, namely the Elliptic Curve Digital Signature Algorithm (ECDSA). This architecture makes use of the elliptic curve architecture and the hash functions described previously. These components, along with a random number generator, provide hardware acceleration for a Microblaze-based cryptographic system.
The trade-off between performance and flexibility is discussed using dedicated software and hardware-software co-design implementations of the elliptic curve point scalar multiplication block. Results are then presented in terms of the overall cryptographic system.
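For context, the Double-and-Add method referenced above is the textbook way of computing an elliptic curve point scalar multiplication. The sketch below is a generic, unprotected affine-coordinate version in Python; it stands in for no specific design from the thesis, which targets FPGA hardware, projective/twisted Edwards coordinates and SPA-secure variants.

```python
# Generic left-to-right double-and-add scalar multiplication on a short
# Weierstrass curve y^2 = x^3 + a*x + b (mod p), in affine coordinates.
# Illustrative only: not the thesis's FPGA architecture.

def inv_mod(x, p):
    return pow(x, p - 2, p)  # modular inverse via Fermat's little theorem

def point_add(P, Q, a, p):
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                         # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * inv_mod(2 * y1, p) % p    # doubling slope
    else:
        lam = (y2 - y1) * inv_mod(x2 - x1, p) % p           # addition slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

def scalar_mult(k, P, a, p):
    """Compute k*P by scanning the bits of k from most to least significant."""
    R = None                              # point at infinity
    for bit in bin(k)[2:]:
        R = point_add(R, R, a, p)         # always double
        if bit == '1':
            R = point_add(R, P, a, p)     # conditionally add
    return R

# Toy example on y^2 = x^3 + 2x + 2 over GF(17) with base point (5, 1)
print(scalar_mult(7, (5, 1), 2, 17))
```

Real hardware implementations avoid the key-dependent branch in the loop (the source of SPA leakage) by using ladder-style algorithms, which is one motivation for the SPA-secure variants mentioned in the abstract.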

Relevance:

30.00%

Publisher:

Abstract:

In the field of embedded systems design, coprocessors play an important role as a component to increase performance. Many embedded systems are built around a small General Purpose Processor (GPP). If the GPP cannot meet the performance requirements for a certain operation, a coprocessor can be included in the design. The GPP can then offload the computationally intensive operation to the coprocessor, thus increasing the performance of the overall system. A common application of coprocessors is the acceleration of cryptographic algorithms. The work presented in this thesis discusses coprocessor architectures for various cryptographic algorithms that are found in many cryptographic protocols. Their performance is then analysed on a Field Programmable Gate Array (FPGA) platform. Firstly, the acceleration of Elliptic Curve Cryptography (ECC) algorithms is investigated through the use of instruction set extension of a GPP. The performance of these algorithms in a full hardware implementation is then investigated, and an architecture for the acceleration of the ECC-based digital signature algorithm is developed. Hash functions are also an important component of a cryptographic system. FPGA implementations of recent hash function designs from the SHA-3 competition are discussed, and a fair comparison methodology for hash functions is presented. Many cryptographic protocols involve the generation of random data, for keys or nonces. This requires a True Random Number Generator (TRNG) to be present in the system. Various TRNG designs are discussed and a secure implementation, including post-processing and failure detection, is introduced. Finally, a coprocessor for the acceleration of operations at the protocol level is discussed; a novel aspect of the design is the secure method in which private-key data is handled.
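As background to the TRNG post-processing mentioned above, a classic de-biasing step is the von Neumann corrector sketched below. This is a generic illustration and not necessarily the post-processing or failure-detection scheme adopted in the thesis.

```python
# Von Neumann corrector: a classic de-biasing post-processing step for raw
# TRNG output. Generic illustration only.

def von_neumann_debias(raw_bits):
    """Map non-overlapping bit pairs 01 -> 0 and 10 -> 1; discard 00 and 11."""
    out = []
    for i in range(0, len(raw_bits) - 1, 2):
        a, b = raw_bits[i], raw_bits[i + 1]
        if a != b:
            out.append(a)          # 01 yields 0, 10 yields 1
    return out

# Example: a biased raw stream still yields (nearly) unbiased output bits.
raw = [1, 1, 1, 0, 0, 1, 1, 1, 1, 0]
print(von_neumann_debias(raw))      # [1, 0, 1]
```

The price of de-biasing is a variable, reduced output rate, which is one reason hardware TRNGs pair post-processing with online failure detection of the entropy source.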

Relevance:

30.00%

Publisher:

Abstract:

Can my immediate physical environment affect how I feel? The instinctive answer to this question must be a resounding “yes”. What might seem a throwaway remark is increasingly borne out by research in environmental and behavioural psychology, and in the more recent discipline of Evidence-Based Design. Research outcomes are beginning to converge with findings in neuroscience and neurophysiology, as we discover more about how the human brain and body function and react to environmental stimuli. What we see, hear, touch, and sense affects each of us psychologically and, by extension, physically, on a continual basis. The physical characteristics of our daily environment thus have the capacity to profoundly affect all aspects of our functioning, from biological systems to cognitive ability. This has long been understood on an intuitive basis, and utilised on a more conscious basis by architects and other designers. Recent research in evidence-based design, coupled with advances in neurophysiology, confirms what has previously been held as commonalities, but also illuminates an almost frightening potential to do enormous good, or alternatively, terrible harm, by virtue of how we make our everyday surroundings. The thesis adopts a design methodology in its approach to exploring the potential use of wireless sensor networks in environments for elderly people. Vitruvian principles of “commodity, firmness and delight” inform the research process and become embedded in the final design proposals and research conclusions. The issue of person-environment fit becomes a key principle in describing a model of continuously-evolving responsive architecture which makes the individual user its focus, with the intention of promoting wellbeing. The key research questions are: What are the key system characteristics of an adaptive therapeutic single-room environment? How can embedded technologies be utilised to maximise the adaptive and therapeutic aspects of the personal life-space of an elderly person with dementia?

Relevance:

30.00%

Publisher:

Abstract:

Focussing on Paul Rudolph’s Art & Architecture Building at Yale, this thesis demonstrates how the building synthesises the architect’s attitude to architectural education, urbanism and materiality. It tracks the evolution of the building from its origins – which bear a relationship to Rudolph’s pedagogical ideas – to later moments when its occupants and others reacted to it in a series of ways that could never have been foreseen. The A&A became the epicentre of the university’s counter culture movement before it was ravaged by a fire of undetermined origins. Arguably, it represents the last of its kind in American architecture, a turning point at the threshold of postmodernism. Using an archive that was only made available to researchers in 2009, this is the first study to draw extensively on the research files of the late architectural writer and educator, C. Ray Smith. Smith’s 1981 manuscript about the A&A entitled “The Biography of a Building,” was never published. The associated research files and transcripts of discussions with some thirty interviewees, including Rudolph, provide a previously unavailable wealth of information. Following Smith’s methodology, meetings were recorded with those involved in the A&A including, where possible, some of Smith’s original interviewees. When placed within other significant contexts – the physicality of the building itself as well as the literature which surrounds it – these previously untold accounts provide new perspectives and details, which deepen the understanding of the building and its place within architectural discourse. Issues revealed include the importance of the influence of Louis Kahn’s Yale Art Gallery and Yale’s Collegiate Gothic Campus on the building’s design. Following a tumultuous first fifty years, the A&A remains an integral part of the architectural education of Yale students and, furthermore, constitutes an important didactic tool for all students of architecture.

Relevance:

30.00%

Publisher:

Abstract:

The insider threat is a well-known security problem with a long history, yet it still remains an invisible enemy. Insiders know the security processes and have access that allows them to easily cover their tracks. In recent years the idea of monitoring separately for these threats has come into its own. However, the tools currently in use have disadvantages, and one of the most effective techniques, human review, is costly. This paper explores the development of an intelligent agent that uses computing resources already in place for inference, as an inexpensive monitoring tool for insider threats. Design Science Research (DSR) is a methodology used to explore and develop an IT artifact, such as for this intelligent agent research. This methodology provides a structure that can guide a deep search method for problems that may not be possible to solve or that could add to a phenomenological instantiation.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a design science approach to solving persistent problems in the international shipping ecosystem by creating the missing common information infrastructures. Specifically, this paper reports on an ongoing dialogue between stakeholders in the shipping industry and information systems researchers engaged in the design and development of a prototype for an innovative IT artifact called the Shipping Information Pipeline, which is a kind of “internet” for shipping information. The instrumental aim is to enable information to cross seamlessly the organizational boundaries and national borders within international shipping, which is a rather complex domain. The intellectual objective is to generate and evaluate the efficacy and effectiveness of design principles for inter-organizational information infrastructures in the international shipping domain that can have positive impacts on global trade and local economies.

Relevance:

30.00%

Publisher:

Abstract:

Copper is the main interconnect material in microelectronic devices, and a 2 nm-thick continuous Cu film seed layer needs to be deposited to produce microelectronic devices with the smallest features and more functionality. Atomic layer deposition (ALD) is the most suitable method to deposit such thin films. However, the reaction mechanism and the surface chemistry of copper ALD remain unclear, which is deterring the development of better precursors and design of new ALD processes. In this thesis, we study the surface chemistries during ALD of copper by means of density functional theory (DFT). To understand the effect of temperature and pressure on the composition of copper with substrates, we used ab initio atomistic thermodynamics to obtain the phase diagram of the Cu(111)/SiO2(0001) interface. We found that the interfacial oxide Cu2O phases prefer high oxygen pressure and low temperature, while the silicide phases are stable at low oxygen pressure and high temperature for the Cu/SiO2 interface, which is in good agreement with experimental observations. Understanding precursor adsorption on surfaces is important for understanding the surface chemistry and reaction mechanism of the Cu ALD process. Focusing on two common Cu ALD precursors, Cu(dmap)2 and Cu(acac)2, we studied the precursor adsorption on Cu surfaces by means of van der Waals (vdW) inclusive DFT methods. We found that the adsorption energies and adsorption geometries are dependent on the adsorption sites and on the method used to include vdW in the DFT calculation. Both precursor molecules are partially decomposed and the Cu cations are partially reduced in their chemisorbed structure. It is found that clean cleavage of the ligand−metal bond is one of the requirements for selecting precursors for ALD of metals. Bonding between the surface and an atom in the ligand that is not coordinated to the Cu may result in impurities in the thin film. To gain insight into the reaction mechanism of a full ALD cycle of Cu ALD, we proposed reaction pathways based on activation energies and reaction energies for a range of surface reactions between Cu(dmap)2 and Et2Zn. The butane formation and desorption steps are found to be extremely exothermic, explaining the ALD reaction scheme of the original experimental work. Endothermic ligand diffusion and re-ordering steps may result in residual dmap ligands blocking surface sites at the end of the Et2Zn pulse, and in residual Zn being reduced and incorporated as an impurity. This may lead to a very slow growth rate, as was the case in the experimental work. By investigating the reduction of CuO to metallic Cu, we elucidated the role of the reducing agent in indirect ALD of Cu. We found that CuO bulk is protected from reduction during vacuum annealing by the CuO surface and that H2 is required in order to reduce that surface, which shows that the strength of the reducing agent is important to obtain fully reduced metal thin films during indirect ALD processes. Overall, in this thesis, we studied the surface chemistries and reaction mechanisms of Cu ALD processes and the nucleation of Cu to form a thin film.
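For reference, the adsorption energies compared in this work are conventionally defined as the total-energy difference below; this is the standard DFT convention rather than an expression quoted from the thesis.

```latex
% Adsorption energy of a precursor molecule on a surface slab (standard DFT
% convention); a more negative value indicates stronger, exothermic adsorption.
E_{\mathrm{ads}} = E_{\mathrm{slab+molecule}} - E_{\mathrm{slab}} - E_{\mathrm{molecule}}
```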

Relevance:

30.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes) and this figure is expected to have grown by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. The multiplicity of analysis techniques introduces another layer of heterogeneity, that is heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques on the same raw data without the danger of incorporating hidden bias that may exist. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
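The workflow idea of producing, interpreting and consuming data under a maintainable provenance record can be pictured with the minimal sketch below. The class and function names are illustrative assumptions, not the API of the platform built in the dissertation.

```python
# Minimal sketch of an analysis step that records provenance alongside its
# output, so that a third party can later verify how a conclusion was derived.
# Names (ProvenanceLog, moving_average) are hypothetical, for illustration only.
import hashlib, json, time

class ProvenanceLog:
    def __init__(self):
        self.records = []

    def record(self, step_name, input_data, output_data, params):
        # Hash inputs/outputs so the derivation can be checked independently.
        self.records.append({
            "step": step_name,
            "timestamp": time.time(),
            "params": params,
            "input_hash": hashlib.sha256(json.dumps(input_data).encode()).hexdigest(),
            "output_hash": hashlib.sha256(json.dumps(output_data).encode()).hexdigest(),
        })

def moving_average(signal, window, log):
    """One 'interpretation' step: smooth a raw signal and log its provenance."""
    out = [sum(signal[i:i + window]) / window
           for i in range(len(signal) - window + 1)]
    log.record("moving_average", signal, out, {"window": window})
    return out

log = ProvenanceLog()
smoothed = moving_average([1, 4, 2, 8, 5, 7], window=3, log=log)
print(smoothed)
print(json.dumps(log.records, indent=2))
```

The point of the sketch is the separation of concerns: the analysis technique itself is interchangeable, while the provenance record stays uniform, which is what allows independent re-analysis of the same raw data.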

Relevance:

30.00%

Publisher:

Abstract:

Whooping cough still represents a major health problem, despite the use of effective vaccines for several decades. Classically a childhood disease, whooping cough is now more common in young adults than it used to be, suggesting that protection after vaccination wanes during adolescence. As an alternative to the current vaccines, we wish to develop live attenuated vaccines to be delivered by the nasal route, so as to mimic the natural route of infection and to induce long-lasting immunity. Bordetella pertussis, the etiological agent of whooping cough, produces a number of virulence factors, including toxins. Its recently determined genome sequence now makes it possible to apply functional genomics, such as transcriptomics and systematic knock-out mutagenesis. The expression of most known B. pertussis virulence genes is controlled by the two-component system BvgA/S. DNA microarray analyses have led to the identification of novel genes in the BvgA/S regulon, some of which are activated by BvgA/S and others are repressed by BvgA/S. In addition, some genes appear to be differentially modulated by nicotinic acid and MgSO4, both known to modulate the expression of BvgA/S-regulated genes. Among others, the functional genomics approach has uncovered two strongly BvgA/S-activated genes, named hotA and hotB (for 'homolog of toxin'), the products of which show high sequence similarities to pertussis toxin subunits. The identification of the full array of virulence factors, as well as an integrated understanding of the bacterial physiology, should allow us to design attenuated B. pertussis strains useful for intranasal vaccination. A first generation of attenuated strains has already shown full protection in mice after a single intranasal administration. Such strains may also serve as vaccine carriers for heterologous antigens, in order to vaccinate against several different pathogens simultaneously.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes an industrial application of case-based reasoning in engineering. The application involves an integration of case-based reasoning (CBR) retrieval techniques with a relational database. The database is specially designed as a repository of experiential knowledge, with the CBR application in mind, so as to include qualitative search indices. The application is an intelligent assistant for design and material engineers in the submarine cable industry. The system consists of three components: a material classifier, a database of experiential knowledge, and a CBR system used to retrieve similar past cases based on component descriptions. Work has shown that an uncommon retrieval technique, hierarchical searching, represents several search indices well and that this technique aids the implementation of advanced techniques such as context-sensitive weights. The system is currently undergoing user testing at the Alcatel Submarine Cables site in Greenwich. Plans are for wider testing and deployment over several sites internationally.
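For readers unfamiliar with CBR retrieval, the sketch below shows a generic weighted nearest-neighbour similarity search over a small case base. It is illustrative only, with made-up example data, and does not reproduce the hierarchical searching or context-sensitive weighting of the system described above, whose details the abstract does not give.

```python
# Generic weighted nearest-neighbour retrieval as commonly used in CBR systems.
# Case data, attribute names and weights below are purely illustrative.

def similarity(query, case, weights):
    """Weighted average of per-attribute similarities (1.0 = identical)."""
    total = sum(weights.values())
    score = 0.0
    for attr, w in weights.items():
        q, c = query[attr], case[attr]
        if isinstance(q, (int, float)):
            # Numeric attribute: similarity falls off with relative distance.
            local = max(0.0, 1.0 - abs(q - c) / max(abs(q), abs(c), 1e-9))
        else:
            # Symbolic attribute: exact match or nothing.
            local = 1.0 if q == c else 0.0
        score += w * local
    return score / total

def retrieve(query, case_base, weights, k=3):
    """Return the k most similar past cases."""
    ranked = sorted(case_base, key=lambda c: similarity(query, c, weights), reverse=True)
    return ranked[:k]

cases = [
    {"material": "polyethylene", "temperature": 40, "depth": 2000, "outcome": "pass"},
    {"material": "polyethylene", "temperature": 60, "depth": 5000, "outcome": "fail"},
    {"material": "steel",        "temperature": 40, "depth": 5000, "outcome": "pass"},
]
query = {"material": "polyethylene", "temperature": 45, "depth": 4800}
weights = {"material": 2.0, "temperature": 1.0, "depth": 1.0}
print(retrieve(query, cases, weights, k=2))
```

Context-sensitive weighting, as mentioned in the abstract, would make the `weights` dictionary depend on other attributes of the query rather than being fixed.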

Relevance:

30.00%

Publisher:

Abstract:

The future of many companies will depend to a large extent on their ability to initiate techniques that bring schedules, performance, tests, support, production, life-cycle-costs, reliability prediction and quality control into the earliest stages of the product creation process. Important questions for an engineer who is responsible for the quality of electronic parts such as printed circuit boards (PCBs) during design, production, assembly and after-sales support are: What is the impact of temperature? What is the impact of this temperature on the stress produced in the components? What is the electromagnetic compatibility (EMC) associated with such a design? At present, thermal, stress and EMC calculations are undertaken using different software tools that each require model build and meshing. This leads to a large investment in time, and hence cost, to undertake each of these simulations. This paper discusses the progression towards a fully integrated software environment, based on a common data model and user interface, having the capability to predict temperature, stress and EMC fields in a coupled manner. Such a modelling environment used early within the design stage of an electronic product will provide engineers with fast solutions to questions regarding thermal, stress and EMC issues. The paper concentrates on recent developments in creating such an integrated modeling environment with preliminary results from the analyses conducted. Further research into the thermal and stress related aspects of the paper is being conducted under a nationally funded project, while their application in reliability prediction will be addressed in a new European project called PROFIT.
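As a minimal illustration of the coupled thermal/stress prediction discussed above (and not the integrated environment described in the paper), a one-way coupled estimate can feed a computed temperature rise into the constrained thermal-expansion relation σ = E·α·ΔT. The component names and material values below are typical order-of-magnitude figures used purely for illustration.

```python
# One-way thermo-mechanical coupling on a shared component list: an
# illustrative sketch only, using sigma = E * alpha * dT for a fully
# constrained component. Material values are typical, not measured data.

components = [
    # name, Young's modulus E (Pa), thermal expansion coefficient alpha (1/K)
    {"name": "FR-4 board",   "E": 22e9, "alpha": 14e-6},
    {"name": "solder joint", "E": 30e9, "alpha": 22e-6},
]

def thermal_solve(power_w, theta_ja_k_per_w, ambient_c=25.0):
    """Crude steady-state temperature estimate from power and thermal resistance."""
    return ambient_c + power_w * theta_ja_k_per_w

def stress_solve(components, delta_t):
    """Thermal stress (Pa) of each fully constrained component for a temperature rise."""
    return {c["name"]: c["E"] * c["alpha"] * delta_t for c in components}

t_device = thermal_solve(power_w=2.0, theta_ja_k_per_w=20.0)   # about 65 C
print(stress_solve(components, delta_t=t_device - 25.0))
```

The appeal of the common-data-model approach described in the paper is precisely that such chained analyses reuse one model and mesh instead of rebuilding them per discipline.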

Relevance:

30.00%

Publisher:

Abstract:

When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design is safe from an evacuation point of view? In the building and aviation industries, computer-based evacuation models are being used to tackle similar issues. In these industries, the traditional restrictive prescriptive approach to design is making way for performance-based design methodologies using risk assessment and computer simulation. In the maritime industry, ship evacuation models offer the promise to quickly and efficiently bring these considerations into the design phase, while the ship is "on the drawing board". This paper describes the development of evacuation models with applications to passenger ships and further discusses issues concerning data requirements and validation.

Relevance:

30.00%

Publisher:

Abstract:

When designing a new passenger ship or modifying an existing design, how do we ensure that the proposed design and crew emergency procedures are safe in the event of an evacuation resulting from fire or another incident? In the wake of major maritime disasters such as the Scandinavian Star, Herald of Free Enterprise and Estonia, and in light of the growth in the number of high-density, high-speed ferries and large-capacity cruise ships, issues concerning the evacuation of passengers and crew at sea are receiving renewed interest. Fire and evacuation models are now available with features such as the ability to realistically simulate the spread of heat and smoke and the human response to fire, as well as the capability to model human performance in heeled orientations, linked to a virtual reality environment that produces realistic visualisations of the modelled scenarios; these can be used to aid the engineer in assessing ship design and procedures. This paper describes the maritimeEXODUS ship evacuation model and the SMARTFIRE fire simulation model and provides an example application demonstrating the use of the models in performing fire and evacuation analysis for a large passenger ship, partially based on the requirements of MSC circular 1033.