5 results for Agent-based brokerage platform
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
Emerging healthcare applications can benefit enormously from recent advances in pervasive technology and computing. This paper introduces the CLARITY Modular Ambient Health and Wellness Measurement Platform, which is a heterogeneous and robust pervasive healthcare solution currently under development at the CLARITY Center for Sensor Web Technologies. This intelligent and context-aware platform comprises the Tyndall Wireless Sensor Network prototyping system, augmented with an agent-based middleware and frontend computing architecture. The key contribution of this work is to highlight how interoperability, expandability, reusability and robustness can be manifested in the modular design of the constituent nodes and the inherently distributed nature of the controlling software architecture.
Abstract:
The technological role of handheld devices is fundamentally changing. Portable computers were traditionally application specific: they were designed and optimised to deliver a specific task. However, it is now commonly acknowledged that future handheld devices need to be multi-functional and capable of executing a range of high-performance applications. This thesis coins the term pervasive handheld computing systems to refer to this type of mobile device. Portable computers face a number of constraints in trying to meet these objectives. They are physically constrained by their size, their computational power, their memory resources, their power usage, and their networking ability. These constraints challenge pervasive handheld computing systems in achieving their multi-functional and high-performance requirements. This thesis proposes a two-pronged methodology to enable pervasive handheld computing systems to meet their future objectives. The methodology is a fusion of two independent and yet complementary concepts. The first step utilises reconfigurable technology to enhance the physical hardware resources within the environment of a handheld device. This approach recognises that reconfigurable computing has the potential to dynamically increase the system functionality and versatility of a handheld device without major loss in performance. The second step of the methodology incorporates agent-based middleware protocols to enable handheld devices to effectively manage and utilise these reconfigurable hardware resources within their environment. The thesis asserts that the combined characteristics of reconfigurable computing and agent technology can meet the objectives of pervasive handheld computing systems.
Abstract:
This Portfolio is about the changes that can be supported and achieved through transformational education that has impact at personal, professional and organisational levels. Having lived through an era of tremendous change over the second half of the twentieth century and into the twenty-first, the author has a great drawing board to contemplate in the context of professional career experience as an engineer. The ability to engage in ‘subject-object’ separation is the means by which Kegan (1994, 2009) explains that transformation takes place, and the Essays in this Portfolio aim to support and bring about such change. Exploration of aspects of ‘Kerry’ is the material selected to both challenge and support a change in the way of knowing: from being subject to certain information and knowledge to being able to consider it more objectively. The task of being able to distance judgement about the economy and economic development of Kerry was facilitated by readings of a number of key thinkers including Kegan, Drucker, Porter and Penrose. The central themes of Kerry and its potential for economic development are built into each Essay. Essay One focuses on reflections of Kerry life - on Kerry people within and without Kerry - and events as they affected understandings of how people related to and worked with one another. These reflections formed the basis for the transformational goals identified, which required a shift from an engineering mindset to encompass an economics-based view. In Essay Two, knowledge of economic concepts is developed by exploring the writings of Drucker, Penrose, and Porter as they pertain to economic development generally, and to Kerry in particular in the form of an ‘entrepreneurial platform’. These concepts and theories were the basis of the explorations presented in Essays Three and Four.
Essay Three focuses on Kerry’s potential for economic development given its current economic profile and includes results from interviews with selected businesses. Essay Four is an exercise in the application of Porter’s ‘Cluster’ concept to the equine sector.
Abstract:
Real-time monitoring of oxygenation and respiration is on the cutting edge of bioanalysis, including studies of cell metabolism, bioenergetics, mitochondrial function and drug toxicity. This thesis presents the development and evaluation of new luminescent probes and techniques for intracellular O2 sensing and imaging. A new oxygen consumption rate (OCR) platform was developed, based on commercial microfluidic perfusion channel μ-slides and compatible with extra- and intracellular O2-sensitive probes, different cell lines and measurement conditions. The design of semi-closed channels allowed cell treatments, multiplexing with other assays and two-fold higher sensitivity compared with the microtiter plate. We compared three common OCR platforms: hermetically sealed quartz cuvettes for absolute OCRs, 96-WPs partially sealed with mineral oil for relative OCRs, and open 96-WPs for local cell oxygenation. Both 96-WP platforms were calibrated against the absolute OCR platform with the MEF cell line, the phosphorescent O2 probe MitoXpress-Intra and a time-resolved fluorescence reader. The correlations found allow tracing of cell respiration over time in a high-throughput format, with the possibility of cell stimulation and of changing measurement conditions. A new multimodal intracellular O2 probe was described, based on the phosphorescent reporter dye PtTFPP, the fluorescent FRET donor and two-photon antenna PFO, and cationic RL-100 nanoparticles. This probe, called MM2, possesses high brightness, photo- and chemical stability, low toxicity and efficient cell staining, and enables high-resolution intracellular O2 imaging of 2D and 3D cell cultures in intensity, ratiometric and lifetime-based modalities with luminescence readers and FLIM microscopes. An extended range of O2-sensitive probes was designed and studied in order to optimise their spectral characteristics and intracellular targeting, using different NP materials, delivery vectors, ratiometric pairs and IR dyes.
The presented improvements provide a useful tool for highly sensitive monitoring and imaging of intracellular O2 in different measurement formats, with a wide range of physiological applications.
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes) and this figure is expected to have grown by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, because there is a real-world need to provide a solution for this domain.
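The production/interpretation/consumption workflow with a provenance record described in this abstract can be illustrated with a minimal sketch. This is not the author's implementation; all names and the hashing scheme are hypothetical, chosen only to show how interchangeable analysis techniques can be applied to the same untouched raw data while every step is logged for later third-party verification.

```python
# Hypothetical sketch of a produce -> interpret -> consume workflow
# that keeps an append-only provenance record of every step.
import hashlib
import json


class ProvenanceLog:
    """Append-only record of each step applied to the data."""

    def __init__(self):
        self.entries = []

    def record(self, step, payload):
        # Hash the payload so a third party can verify it later.
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append({"step": step, "sha256": digest})


def produce(raw, log):
    log.record("produce", raw)
    return raw


def interpret(data, analysis, log):
    # Any analysis technique can be plugged in; the raw data is not
    # modified, so a different technique can later reuse the same input.
    result = analysis(data)
    log.record("interpret", result)
    return result


def consume(result, log):
    log.record("consume", result)
    return result


log = ProvenanceLog()
raw = produce([3, 1, 2], log)
mean = interpret(raw, lambda xs: sum(xs) / len(xs), log)
consume(mean, log)
```

Because the interpretation step receives the analysis technique as a parameter, two competing techniques can be run against identical raw data and their provenance trails compared, which is the bias-avoidance property the abstract emphasises.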