7 results for standards-based reforms
in University of Queensland eSpace - Australia
Abstract:
Pervasive computing applications must be sufficiently autonomous to adapt their behaviour to changes in computing resources and user requirements. This capability is known as context-awareness. In some cases, context-aware applications must be implemented as autonomic systems capable of dynamically discovering and replacing context sources (sensors) at run time. Unlike other types of application autonomy, this kind of dynamic reconfiguration has not yet been sufficiently investigated by the research community. However, application-level context models are becoming common as a way to ease the programming of context-aware applications and to support evolution by decoupling applications from context sources. We can leverage these context models to develop general (i.e., application-independent) solutions for dynamic, run-time discovery of context sources (i.e., context management). This paper presents a model and architecture for a reconfigurable context management system that supports interoperability by building on emerging standards for sensor description and classification.
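The run-time discovery and replacement of context sources described above can be sketched as follows. This is a minimal illustration only; the `ContextManager` registry, the `ContextSource` interface, and the sensor names are hypothetical stand-ins, not the paper's actual architecture.

```python
class ContextSource:
    """A sensor that can supply values for a named context attribute."""
    def __init__(self, name, attribute, quality):
        self.name = name
        self.attribute = attribute   # e.g. "location"
        self.quality = quality       # higher is better
        self.alive = True            # source availability at run time


class ContextManager:
    """Binds a context attribute to the best currently available source,
    rebinding transparently when a source disappears at run time."""
    def __init__(self):
        self.registry = []

    def register(self, source):
        self.registry.append(source)

    def resolve(self, attribute):
        candidates = [s for s in self.registry
                      if s.alive and s.attribute == attribute]
        if not candidates:
            raise LookupError(f"no live source for {attribute!r}")
        return max(candidates, key=lambda s: s.quality)


mgr = ContextManager()
mgr.register(ContextSource("gps", "location", quality=0.9))
mgr.register(ContextSource("wifi-triangulation", "location", quality=0.6))

best = mgr.resolve("location")       # highest-quality live source
best.alive = False                   # the GPS source drops out
fallback = mgr.resolve("location")   # manager rebinds to the Wi-Fi source
```

Because applications ask the manager for an *attribute* rather than a concrete sensor, they stay decoupled from the context sources, which is exactly what makes this kind of general, application-independent reconfiguration possible.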
Abstract:
A specialised reconfigurable architecture is targeted at wireless base-band processing. It caters for multiple wireless standards, consumes less power than processor-based solutions, and can be scaled to run in parallel for processing multiple channels. Test resources and testing strategies are embedded in the architecture. The architecture is functionally partitioned according to operations common to wireless standards, such as CRC error detection, convolution and interleaving. These modules are linked via Virtual Wire hardware modules and route-through switch matrices, so data can be processed in any order through the interconnect structure. Virtual Wire provides the same flexibility as conventional interconnects while reducing both the area occupied and the number of switches needed. The testing algorithm exhaustively scans all possible paths within the interconnection network and searches for faults in the processing modules. It starts by scanning the externally addressable memory space and testing the master controller. The controller then tests every switch in the route-through switch matrix by forming loops from the shared memory to each switch; the local switch matrix is tested in the same way. Next, the local memory is scanned. Finally, pre-defined test vectors are loaded into local memory to check the processing modules. This paper compares various base-band processing solutions, describes the proposed platform and its implementation, outlines the test resources and algorithm, and concludes with the mapping of Bluetooth and GSM base-band processing onto the platform.
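The staged self-test order in the abstract can be sketched as a simple sequence of checks that stops reporting a stage as healthy only if its test passes. The `StubPlatform` class and its method names are hypothetical placeholders for the platform's embedded test resources, not the real interface.

```python
class StubPlatform:
    """Hypothetical stand-in for the embedded test resources;
    here every stage passes."""
    def scan_memory_map(self): return True         # externally addressable memory space
    def test_master_controller(self): return True
    def test_shared_switch_matrix(self): return True  # loop-back via shared memory
    def test_local_switch_matrix(self): return True
    def scan_local_memory(self): return True
    def run_test_vectors(self): return True        # pre-defined vectors in local memory


def self_test(p):
    """Run the stages in the order given in the abstract and
    return the names of any failing stages."""
    stages = [
        ("memory map scan", p.scan_memory_map),
        ("master controller", p.test_master_controller),
        ("route-through switch matrix", p.test_shared_switch_matrix),
        ("local switch matrix", p.test_local_switch_matrix),
        ("local memory scan", p.scan_local_memory),
        ("processing-module test vectors", p.run_test_vectors),
    ]
    return [name for name, run in stages if not run()]


faults = self_test(StubPlatform())   # empty list: no faults found
```

The ordering matters: each stage exercises only resources that earlier stages have already validated (controller before switches, switches before local memory, memory before test vectors), so a reported fault can be localised to the newly exercised component.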
Abstract:
Current policy issues surrounding management of the Great Artesian Basin - historical development of existing legislation and institutions - hydrological and historical background information - development of concerns over unsustainable use of resources and possible adverse environmental impacts - recent developments associated with the general reforms to water law and policy initiated by the Council of Australian Governments (COAG) - comparison of issues surrounding the Murray-Darling Basin and the Great Artesian Basin.
Abstract:
Polyethylene-based passive air samplers (PSDs) were loaded with performance reference compounds (PRCs) and deployed in a wind tunnel to examine the effects of wind speed on sampler performance. PRCs could be loaded reproducibly into PSDs, with coefficients of variation exceeding 20% only for the more volatile compounds. When PSDs were exposed to low (0.5-1.5 m s⁻¹) and high (3.5-4.5 m s⁻¹) wind speeds, PRC loss rate constants generally increased with increasing wind speed and decreased with increasing sampler/air partition coefficients. Air concentrations calculated from PRC loss rate constants and sampler/air partition coefficients agreed closely with air concentrations measured using active samplers. PRCs can therefore be used to account for the effect of differences in wind speed on sampler performance and to measure air concentrations with reasonable accuracy.
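The calculation the abstract relies on is the first-order exchange model commonly used for PRC-corrected passive samplers; using the measured PRC loss rate constant for k_e folds the wind-speed effect on exchange kinetics into the estimate. The function below is a sketch of that general model, and the numerical values are illustrative only, not data from the study.

```python
from math import exp

def air_concentration(n_absorbed, k_sa, v_sampler, k_e, t):
    """Common first-order uptake model for passive samplers:

        C_air = N / (K_sa * V_s * (1 - exp(-k_e * t)))

    where N is the amount of analyte accumulated in the sampler,
    K_sa the sampler/air partition coefficient, V_s the sampler
    volume, and k_e the exchange rate constant estimated from
    PRC loss during deployment."""
    return n_absorbed / (k_sa * v_sampler * (1.0 - exp(-k_e * t)))


# Illustrative inputs only (not from the paper):
c = air_concentration(
    n_absorbed=50.0,    # ng accumulated in the PSD
    k_sa=1.0e6,         # sampler/air partition coefficient
    v_sampler=5.0e-6,   # sampler volume, m^3
    k_e=0.1,            # PRC-derived loss rate constant, day^-1
    t=28,               # deployment time, days
)                       # -> C_air in ng per m^3 of air
```

For long deployments (k_e * t large) the exponential term vanishes and the estimate reduces to the equilibrium form N / (K_sa * V_s); for short deployments it approaches the linear-uptake form N / (K_sa * V_s * k_e * t).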
Abstract:
In the last decade, with the expansion of organizational scope and the tendency towards outsourcing, there has been an increasing need for Business Process Integration (BPI), understood as the sharing of data and applications among business processes. The research efforts and development paths in BPI pursued by many academic groups and system vendors, targeting heterogeneous system integration, continue to face several conceptual and technological challenges. This article begins with a brief review of major approaches and emerging standards for BPI. Next, we introduce a rule-driven messaging approach to BPI, based on the harmonization of messages in order to compose a new, often cross-organizational, process. We then present the design of a temporal first-order language (Harmonized Messaging Calculus) that provides the formal foundation for general rules governing business process execution. Definitions of the language's terms, formulae, safety, and expressiveness are introduced and considered in detail.
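A harmonization rule of the kind described above pairs a condition on an incoming message with a transformation that produces the message expected by the next (possibly cross-organizational) process. The sketch below is a loose software analogue only; the message schemas, field names, and rule shape are hypothetical and are not the Harmonized Messaging Calculus itself.

```python
def make_rule(condition, transform):
    """A harmonization rule: when `condition` matches an incoming
    message, `transform` emits the message for the downstream
    process; otherwise the rule does not fire."""
    def rule(msg):
        return transform(msg) if condition(msg) else None
    return rule


# Hypothetical schemas for two organizations' order messages.
to_partner_order = make_rule(
    condition=lambda m: m.get("type") == "PurchaseOrder" and m["amount"] > 0,
    transform=lambda m: {"type": "PartnerOrder",
                         "ref": m["order_id"],
                         "total": m["amount"]},
)

incoming = {"type": "PurchaseOrder", "order_id": "PO-17", "amount": 250.0}
outgoing = to_partner_order(incoming)   # harmonized message for the partner
```

Composing a cross-organizational process then amounts to chaining such rules, with the formal language supplying the temporal and safety guarantees that this untyped sketch lacks.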
Abstract:
The introduction of standard on-chip buses has eased integration and boosted the production of IP functional cores. However, once an IP core is bus-specific, retargeting it to a different bus is time-consuming and tedious, which reduces the reusability of the bus-specific IP. As new bus standards are introduced and different interconnection methods are proposed, this problem grows. Many solutions have been proposed, but they either limit the IP block's performance or are restricted to a particular platform. A new concept is presented that can connect IP blocks to a wide variety of interface architectures with low overhead. This is achieved through the use of a special interface adaptor logic layer.
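The adaptor-layer idea can be illustrated in software terms: the core exposes one fixed internal interface, and a thin per-bus adaptor translates bus transactions into that interface, so retargeting touches only the adaptor. This is a generic adapter-pattern sketch under assumed names (`IPCore`, `AXIAdaptor`, and the heavily simplified AXI-style methods), not the paper's actual logic layer.

```python
class IPCore:
    """Bus-agnostic core: exposes one fixed read/write interface
    and never needs to change when the bus standard does."""
    def __init__(self):
        self.regs = {}

    def write(self, addr, value):
        self.regs[addr] = value

    def read(self, addr):
        return self.regs.get(addr, 0)


class AXIAdaptor:
    """Thin adaptor layer: translates (hypothetical, drastically
    simplified) AXI-style transactions into the core's internal
    interface. Supporting another bus means writing another
    adaptor, not rewriting the core."""
    def __init__(self, core):
        self.core = core

    def axi_write(self, awaddr, wdata):
        self.core.write(awaddr, wdata)

    def axi_read(self, araddr):
        return self.core.read(araddr)


core = IPCore()
bus = AXIAdaptor(core)
bus.axi_write(0x10, 0xCAFE)
value = bus.axi_read(0x10)   # read back through the adaptor layer
```

The design trade-off the abstract alludes to is visible even here: the adaptor adds a level of indirection (the "low overhead"), in exchange for keeping the IP block itself untouched across interface architectures.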
Abstract:
The importance of appropriate training in the use of videoconferencing equipment for clinical purposes is often underestimated when telemedicine projects are established. We developed a user training programme which was delivered via videoconferencing to a group of 130 nurses. Training was delivered on a one-to-one basis. A questionnaire was developed to evaluate user satisfaction and the effectiveness of training. One hundred and two fully completed questionnaires were returned (a 79% response rate). High levels of satisfaction were obtained, but the level of user competence reached 100% only when training was supported by a training manual and at least weekly practice. Before establishing a telemedicine service, the following steps appear to be important: identify the required training competencies; deliver a 'hands-on' training programme based on those competencies; back up the training programme with an instruction booklet; ensure that trainees practise at least weekly; and measure the level of user competence.