6 results for modularised computing unit

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance:

30.00%

Publisher:

Abstract:

Wireless Inertial Measurement Units (WIMUs) combine motion sensing, processing & communications functions in a single device. Data gathered using these sensors has the potential to be converted into high-quality motion data, and by outfitting a subject with multiple WIMUs, full motion data can be gathered. With a potential cost of ownership several orders of magnitude lower than traditional camera-based motion capture, WIMU systems have the potential to be crucially important in supplementing or replacing traditional motion capture and in opening up entirely new application areas and potential markets, particularly in the rehabilitative, sports & at-home healthcare spaces. Currently, WIMUs are underutilised in these areas, and a major barrier to adoption is perceived complexity: sample rates, sensor types & dynamic sensor ranges may need to be adjusted on multiple axes for each device depending on the scenario. We therefore present an advanced WIMU in conjunction with a Smart WIMU system that simplifies this aspect with three usage modes: Manual, Intelligent and Autonomous. Attendees will be able to compare the three modes and see the effects of good and bad set-ups on the quality of data gathered in real time.
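The configuration burden described above (per-axis sample rates, sensor types and dynamic ranges, chosen per scenario) can be pictured with a small sketch. All names, presets and numeric values below are hypothetical illustrations, not the actual WIMU firmware interface:

```python
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    MANUAL = "manual"            # user sets everything by hand
    INTELLIGENT = "intelligent"  # system suggests, user confirms
    AUTONOMOUS = "autonomous"    # system configures itself

@dataclass
class AxisConfig:
    sample_rate_hz: int   # e.g. 100 Hz for gait, 1000 Hz for impact sports
    dynamic_range: int    # +/- g for the accelerometer, +/- deg/s for the gyro

@dataclass
class WIMUConfig:
    mode: Mode
    accel: dict  # per-axis settings keyed "x"/"y"/"z"
    gyro: dict

def preset_for(scenario: str) -> WIMUConfig:
    """Return a manual preset; the Intelligent and Autonomous modes would
    instead infer these values from observed motion."""
    presets = {
        "rehab":  (100, 4, 250),     # slow movements, fine resolution
        "sports": (1000, 16, 2000),  # fast impacts, wide range
    }
    rate, g_range, dps = presets.get(scenario, (200, 8, 500))
    accel = {axis: AxisConfig(rate, g_range) for axis in "xyz"}
    gyro = {axis: AxisConfig(rate, dps) for axis in "xyz"}
    return WIMUConfig(Mode.MANUAL, accel, gyro)
```

Even this toy version shows why manual set-up is error-prone: a "rehab" preset applied to an impact sport would saturate the sensors, which is exactly the kind of bad set-up the demonstration lets attendees observe.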

Relevance:

20.00%

Publisher:

Abstract:

Constraint programming has emerged as a successful paradigm for modelling combinatorial problems arising from practical situations. In many of those situations, we are not provided with an immutable set of constraints. Instead, a user will modify his requirements, in an interactive fashion, until he is satisfied with a solution. Examples of such applications include, amongst others, model-based diagnosis, expert systems and product configurators. The system the user interacts with must be able to assist him by showing the consequences of his requirements, and explanations are the ideal tool for providing this assistance. However, existing notions of explanation fail to provide sufficient information, so we define new forms of explanation that aim to be more informative. Although explanation generation is a very hard task, in the applications we consider we must provide a satisfactory level of interactivity and therefore cannot afford long computation times. We introduce the concept of representative sets of relaxations: a compact set of relaxations that shows the user at least one way to satisfy each of his requirements and at least one way to relax each of them, and we present an algorithm that efficiently computes such sets. We also introduce the concept of most soluble relaxations, which maximise the number of products they allow, and present algorithms that compute such relaxations in times compatible with interactivity, achieved by making use of different types of compiled representations interchangeably. We propose to generalise the concept of prime implicates to constraint problems through the concept of domain consequences, and suggest generating them as a compilation strategy. This sets out a new approach to compilation and allows explanation-related queries to be addressed efficiently. Finally, we define ordered automata to compactly represent large sets of domain consequences, orthogonally to existing compilation techniques that represent large sets of solutions.
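The idea of a representative set of relaxations can be conveyed with a deliberately naive brute-force sketch. The thesis presents efficient algorithms; this toy version only illustrates the definition, and the `satisfiable` helper, the subset-enumeration strategy and the example constraints are all illustrative assumptions:

```python
from itertools import combinations

def satisfiable(constraints, domain):
    """True if some value in `domain` satisfies every constraint."""
    return any(all(c(v) for c in constraints) for v in domain)

def representative_relaxations(constraints, domain):
    """Enumerate maximal satisfiable subsets of the constraints
    ("relaxations") and collect, for each constraint, one relaxation
    that keeps it and, where possible, one that drops it."""
    n = len(constraints)
    maximal = []
    for k in range(n, -1, -1):                 # largest subsets first
        for idx in combinations(range(n), k):
            if any(set(idx) <= set(m) for m in maximal):
                continue                       # dominated by a larger relaxation
            if satisfiable([constraints[i] for i in idx], domain):
                maximal.append(idx)
    representative = []
    for i in range(n):
        keep = next((m for m in maximal if i in m), None)
        drop = next((m for m in maximal if i not in m), None)
        for m in (keep, drop):
            if m is not None and m not in representative:
                representative.append(m)
    return representative
```

For instance, with requirements x < 5, x > 7 and x even over the domain 0..9, the first two conflict, and the representative set contains the two maximal relaxations {x < 5, x even} and {x > 7, x even}: each requirement is satisfied in at least one of them and relaxed in at least one.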

Relevance:

20.00%

Publisher:

Abstract:

The technological role of handheld devices is fundamentally changing. Portable computers were traditionally application-specific: they were designed and optimised to deliver a specific task. However, it is now commonly acknowledged that future handheld devices need to be multi-functional and capable of executing a range of high-performance applications. This thesis coins the term pervasive handheld computing systems to refer to this type of mobile device. Portable computers face a number of constraints in trying to meet these objectives: they are physically constrained by their size, computational power, memory resources, power usage and networking ability. These constraints challenge pervasive handheld computing systems in achieving their multi-functional and high-performance requirements. This thesis proposes a two-pronged methodology to enable pervasive handheld computing systems to meet their future objectives. The methodology is a fusion of two independent yet complementary concepts. The first step utilises reconfigurable technology to enhance the physical hardware resources of a handheld device, recognising that reconfigurable computing has the potential to dynamically increase the functionality and versatility of a handheld device without major loss in performance. The second step incorporates agent-based middleware protocols that help handheld devices effectively manage and utilise these reconfigurable hardware resources. The thesis asserts that the combined characteristics of reconfigurable computing and agent technology can meet the objectives of pervasive handheld computing systems.

Relevance:

20.00%

Publisher:

Abstract:

In this thesis I theoretically study quantum states of ultracold atoms. The majority of the chapters focus on engineering specific quantum states of single atoms with high fidelity in experimentally realistic systems, while the sixth chapter investigates the stability and dynamics of new multidimensional solitonic states that can be created in inhomogeneous atomic Bose-Einstein condensates. In Chapter three I present two papers in which I demonstrate how the coherent tunnelling by adiabatic passage (CTAP) process can be implemented in an experimentally realistic atom chip system to coherently transfer the centre-of-mass of a single atom between two spatially distinct magnetic waveguides. In these works I also utilise GPU (Graphics Processing Unit) computing, which offers a significant performance increase in the numerical simulation of the Schrödinger equation. In Chapter four I investigate the CTAP process for a linear arrangement of radio frequency traps, in which the centre-of-mass of both single atoms and clouds of interacting atoms can be coherently controlled. In Chapter five I present a theoretical study of adiabatic radio frequency potentials, using Floquet theory to model more accurately situations where frequencies are close and/or field amplitudes are large. I also show how one can create highly versatile 2D adiabatic radio frequency potentials using multiple radio frequency fields with arbitrary field orientation, and demonstrate their utility by simulating the creation of ring vortex solitons. In Chapter six I discuss the stability and dynamics of a family of multidimensional solitonic states created in harmonically confined Bose-Einstein condensates. I demonstrate that these solitonic states have interesting dynamical instabilities, in which a continuous collapse and revival of the initial state occurs. Through Bogoliubov analysis, I determine the modes responsible for the observed instabilities of each solitonic state and extract information about the time at which the instability can be observed.
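Numerical simulation of the (Gross-Pitaevskii) Schrödinger equation of the kind mentioned above is commonly done with a split-step Fourier method, which is also what makes GPU acceleration attractive (the cost is dominated by FFTs). A minimal CPU-only sketch, with NumPy standing in for a GPU implementation and illustrative units ħ = m = 1, might look like:

```python
import numpy as np

def split_step_evolve(psi, dt, steps, dx, potential, g=0.0, hbar=1.0, m=1.0):
    """Second-order (Strang) split-step Fourier evolution of the 1D equation
    i*hbar dpsi/dt = (-hbar^2/(2m) d^2/dx^2 + V(x) + g|psi|^2) psi.
    The kinetic term is applied exactly in momentum space, the potential
    and interaction terms in position space, in half steps."""
    n = len(psi)
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)          # momentum grid
    kinetic = np.exp(-1j * hbar * k**2 * dt / (2 * m))
    for _ in range(steps):
        # half step of potential + nonlinearity
        psi = np.exp(-1j * (potential + g * np.abs(psi)**2) * dt / (2 * hbar)) * psi
        # full kinetic step in momentum space
        psi = np.fft.ifft(kinetic * np.fft.fft(psi))
        # second half step of potential + nonlinearity
        psi = np.exp(-1j * (potential + g * np.abs(psi)**2) * dt / (2 * hbar)) * psi
    return psi
```

Because each factor is a pure phase and the FFT is unitary, the norm of the wavefunction is conserved to machine precision, which is a convenient sanity check for any implementation; evolving the harmonic-oscillator ground state should also leave the density essentially unchanged.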

Relevance:

20.00%

Publisher:

Abstract:

The contribution of buildings towards total worldwide energy consumption in developed countries is between 20% and 40%. Heating, Ventilation and Air Conditioning (HVAC), and more specifically Air Handling Unit (AHU), energy consumption accounts on average for 40% of a typical medical device manufacturing or pharmaceutical facility's energy consumption. Studies have indicated that 20-30% energy savings are achievable by recommissioning HVAC systems, and more specifically AHU operations, to rectify faulty operation. Automated Fault Detection and Diagnosis (AFDD) is a process concerned with partially or fully automating the commissioning process through the detection of faults. An expert system is a knowledge-based system which employs Artificial Intelligence (AI) methods to replicate the knowledge of a human subject-matter expert in a particular field, such as engineering, medicine, finance or marketing. This thesis details the research and development work undertaken in developing and testing a new AFDD expert system for AHUs which can be installed, with minimal set-up time, on a large cross-section of AHU types in a building-management-system-vendor-neutral manner. Both simulated and extensive field testing were undertaken against a widely available and industry-known expert rule set, the Air Handling Unit Performance Assessment Rules (APAR) (and a later, more developed version known as APAR_extended), in order to prove its effectiveness. Specifically, in tests against a dataset of 52 simulated faults, the new AFDD expert system identified all 52 derived issues whereas the APAR rule set identified just 10. In tests using actual field data from 5 operating AHUs in 4 manufacturing facilities, the newly developed AFDD expert system identified four fault case categories that the APAR method did not, as well as showing improvements in the area of fault diagnosis.
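The APAR rules referenced above are simple temperature-balance checks evaluated per AHU operating mode. The sketch below shows the general shape of one such rule together with a persistence filter; the threshold values and the persistence criterion are illustrative assumptions, not the published APAR tuning:

```python
def apar_rule_heating(t_supply, t_mixed, dT_fan=1.1, eps=2.0):
    """APAR-style temperature-balance check for heating mode: with the
    heating coil active, the supply air temperature should exceed the
    mixed air temperature plus the supply-fan heat gain. Returns True
    when the rule is violated, i.e. a potential fault. Thresholds
    (fan gain dT_fan, tolerance eps, in deg C) are illustrative."""
    return t_supply < t_mixed + dT_fan - eps

def detect_fault(samples, rule, min_violations=3):
    """Flag a fault only when the rule is violated persistently across
    successive samples, suppressing transients such as mode changes
    (a common AFDD practice)."""
    return sum(rule(*s) for s in samples) >= min_violations
```

A stuck or leaking heating coil valve, for example, produces supply temperatures persistently below the mixed-air temperature while the controller demands heating, which this rule shape catches.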

Relevance:

20.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2⁷⁰ bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating any hidden bias that may exist. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
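The core mechanism described above, a workflow in which every analysis step leaves a maintainable record of provenance, could be sketched as follows. The class, the record fields and the hashing scheme are hypothetical illustrations, not the platform's actual API:

```python
import hashlib
import time

class ProvenancePipeline:
    """Each analysis step records what it consumed, what it produced and
    with which parameters, so an independent third party can audit and
    re-derive every conclusion from the raw data."""

    def __init__(self):
        self.records = []  # append-only provenance log

    def run_step(self, name, func, data, params=None):
        """Run one analysis step and log its provenance."""
        params = params or {}
        result = func(data, **params)
        self.records.append({
            "step": name,
            "params": params,
            "input_hash": hashlib.sha256(repr(data).encode()).hexdigest(),
            "output_hash": hashlib.sha256(repr(result).encode()).hexdigest(),
            "timestamp": time.time(),
        })
        return result
```

Because each record ties an output hash to an input hash and the step's parameters, different analysis techniques can be applied to the same raw data and later compared or re-verified without any hidden dependence on how the data was previously processed.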