87 results for Pervasive computing


Relevance:

20.00%

Publisher:

Abstract:

One of the important goals of intelligent buildings, especially in commercial applications, is not only to minimize energy consumption but also to enhance occupants' comfort. However, most current development in intelligent buildings focuses on implementing automatic building control systems that support energy efficiency, and occupants' preferences are not adequately considered. To improve occupant wellbeing and energy efficiency in intelligent environments, we develop four types of agent that combine to form a multi-agent system for controlling intelligent buildings. Conflicts between users' preferences are discussed, and a negotiation mechanism for conflict resolution is proposed in order to reach an agreement; it is represented in syntax-directed translation schemes for future implementation and testing. Keywords: conflict resolution, intelligent buildings, multi-agent systems (MAS), negotiation strategy, syntax directed translation schemes (SDTS).
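To make the negotiation idea concrete, the following minimal Python sketch shows one way occupant agents with conflicting temperature preferences could concede towards a shared setpoint. The utility function, concession step and preference values are illustrative assumptions, not the paper's SDTS-based mechanism.

```python
# Minimal sketch of preference-conflict negotiation between occupant agents.
# Utility shape, concession step and preferences are illustrative assumptions.

def utility(preferred, proposal):
    """Occupant satisfaction falls linearly with distance from the preferred setpoint."""
    return max(0.0, 1.0 - abs(preferred - proposal) / 5.0)

def negotiate(preferences, threshold=0.6, step=0.05):
    """Agents concede gradually: lower the acceptance threshold until some
    candidate setpoint satisfies every occupant, or give up."""
    candidates = [t / 2 for t in range(30, 62)]   # 15.0 .. 30.5 degC in 0.5 degC steps
    while threshold > 0:
        for c in candidates:
            if all(utility(p, c) >= threshold for p in preferences):
                return c
        threshold -= step                          # concession round
    return None

if __name__ == "__main__":
    occupants = [20.0, 23.5, 22.0]                 # preferred temperatures (degC)
    print("Agreed setpoint:", negotiate(occupants), "degC")
```

Running the sketch with the three preferences above settles on 22.0 degC, the first candidate every occupant still rates above the acceptance threshold.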

Relevance:

20.00%

Publisher:

Abstract:

Markowitz showed that assets can be combined to produce an 'Efficient' portfolio that will give the highest level of portfolio return for any level of portfolio risk, as measured by the variance or standard deviation. These portfolios can then be connected to generate what is termed an 'Efficient Frontier' (EF). In this paper we discuss the calculation of the Efficient Frontier for combinations of assets, again using the spreadsheet Optimiser. To illustrate the derivation of the Efficient Frontier, we use data from the Investment Property Databank Long Term Index of Investment Returns for the period 1971 to 1993. Many investors might require a specific level of holding, or a restriction on holdings, in at least some of the assets. Such additional constraints may be readily incorporated into the model to generate a constrained EF with upper and/or lower bounds, which can then be compared with the unconstrained EF to see whether the reduction in return is acceptable. To see the effect that these additional constraints may have, we adopt a fairly typical pension fund profile, with no more than 20% of the total held in Property. The paper shows that it is now relatively easy to use the Optimiser available in at least one spreadsheet (EXCEL) to calculate efficient portfolios for various levels of risk and return, both constrained and unconstrained, and so to generate any number of Efficient Frontiers.
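As a rough illustration of the constrained versus unconstrained calculation, the sketch below minimises portfolio variance for a target return using a numerical optimiser (scipy) in place of the EXCEL Optimiser described above. The three assets, expected returns and covariance matrix are placeholder values, not the IPD 1971-1993 data.

```python
# Hedged sketch of one point on the (constrained) Efficient Frontier.
# Asset names, returns and covariances are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.11, 0.10])           # expected returns: gilts, equities, property
cov = np.array([[0.02, 0.01, 0.00],
                [0.01, 0.06, 0.01],
                [0.00, 0.01, 0.03]])         # assumed covariance matrix

def frontier_point(target, upper=(1.0, 1.0, 1.0)):
    """Minimise variance for a target return, with optional upper bounds
    on holdings (e.g. no more than 20% in property -> upper=(1, 1, 0.2))."""
    n = len(mu)
    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
            {"type": "eq", "fun": lambda w: w @ mu - target}]
    bounds = [(0.0, u) for u in upper]
    res = minimize(lambda w: w @ cov @ w, x0=np.full(n, 1.0 / n),
                   bounds=bounds, constraints=cons)
    return res.x, float(np.sqrt(res.fun))    # weights, portfolio risk (std dev)

for target in (0.09, 0.10):
    _, risk_u = frontier_point(target)
    _, risk_c = frontier_point(target, upper=(1.0, 1.0, 0.2))   # constrained EF
    print(f"target {target:.2%}: risk {risk_u:.4f} unconstrained vs {risk_c:.4f} constrained")
```

Sweeping the target return over a range of values and joining the resulting (risk, return) points traces out the frontier; the constrained run shows the cost, in extra risk, of the 20% property cap.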

Relevance:

20.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change occur mainly through changes in the statistics of regional weather variations, the scientific and computational requirements for reliable prediction are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenge of developing the capability to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities, each with computing capability of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to establish what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limitations in computing power have severely restricted such investigations, which are now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and will ultimately enable the climate community to provide society with climate predictions based on our best scientific knowledge and the most advanced technology.

Relevance:

20.00%

Publisher:

Abstract:

Pocket Data Mining (PDM) is our new term describing collaborative mining of streaming data in mobile and distributed computing environments. With sheer volumes of data streams now available for subscription on our smart mobile phones, the potential for using this data for decision making with data stream mining techniques has become achievable owing to the increasing power of these handheld devices. Wireless communication among these devices using Bluetooth and WiFi technologies has opened the door wide for collaborative mining among mobile devices within the same range that are running data mining techniques targeting the same application. This paper proposes a new architecture that we have prototyped for realizing significant applications in this area. We propose using mobile software agents in this application for several reasons; most importantly, the autonomic, intelligent behaviour of agent technology has been the driving force for using it in this application. Other efficiency reasons are discussed in detail in this paper. Experimental results showing the feasibility of the proposed architecture are presented and discussed.
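The following toy sketch illustrates the collaborative aspect of the idea: several device-resident agents classify the same stream record locally and combine their answers by majority vote. The class names, toy models and voting rule are assumptions for illustration, not the prototyped PDM architecture.

```python
# Illustrative sketch of collaborative stream mining across nearby devices.
# Class/method names and the toy classifiers are assumptions, not PDM's API.
from collections import Counter

class MinerAgent:
    """One handheld device running a lightweight, locally updated model."""
    def __init__(self, name, model):
        self.name, self.model = name, model

    def classify(self, record):
        return self.model(record)

def collaborative_label(agents, record):
    """Agents within Bluetooth/WiFi range exchange local predictions and
    return the majority vote (ties broken arbitrarily)."""
    votes = Counter(agent.classify(record) for agent in agents)
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    # Toy models standing in for on-device stream classifiers.
    agents = [
        MinerAgent("phone-A", lambda r: "spam" if r["len"] > 100 else "ham"),
        MinerAgent("phone-B", lambda r: "spam" if r["links"] > 2 else "ham"),
        MinerAgent("phone-C", lambda r: "ham"),
    ]
    print(collaborative_label(agents, {"len": 140, "links": 3}))   # -> "spam"
```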

Relevance:

20.00%

Publisher:

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories, an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.

Relevance:

20.00%

Publisher:

Abstract:

The introduction of multimedia on pervasive and mobile communication devices raises a number of perceptual quality issues. However, limited work has been done examining the 3-way interaction between use of equipment, user perceptual quality and quality of service. Our work measures user perceptual quality with the quality of perception (QoP) metric, which comprises levels of informational transfer (objective) and user satisfaction (subjective) when users are presented with multimedia video clips at three different frame rates, using four different display devices. Our results show that variation in frame rate does not impact a user's level of information assimilation (IA), but does impact users' perception of multimedia video 'quality'.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: This paper aims to design an evaluation method that enables an organization to assess its current IT landscape and provide a readiness assessment prior to Software as a Service (SaaS) adoption. Design/methodology/approach: The research employs a mix of quantitative and qualitative approaches for conducting an IT application assessment. Quantitative data, such as end users' feedback on the IT applications, contribute to the technical impact on efficiency and productivity. Qualitative data, such as business domain, business services and IT application cost drivers, are used to determine the business value of the IT applications in an organization. Findings: The assessment of IT applications leads to decisions on the suitability of each IT application for migration to the cloud environment. Research limitations/implications: The evaluation of how a particular IT application impacts a business service is based on logical interpretation; a data mining method is suggested in order to derive patterns of IT application capabilities. Practical implications: This method has been applied in a local council in the UK, helping the council to decide the future status of its IT applications for cost-saving purposes.
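A simplified sketch of the assessment logic might combine end-user feedback scores with a qualitative business-value rating to flag applications for possible SaaS migration; the field names, weights and thresholds below are illustrative assumptions, not the paper's evaluation method.

```python
# Rough sketch: combine quantitative user feedback with a qualitative
# business-value rating to suggest a migration decision per application.
# All names, values and cut-offs are illustrative assumptions.
from statistics import mean

applications = [
    {"name": "payroll",   "user_scores": [4, 5, 4], "business_value": "high"},
    {"name": "room-book", "user_scores": [2, 3, 2], "business_value": "low"},
]

VALUE_WEIGHT = {"low": 1, "medium": 2, "high": 3}

def migration_recommendation(app, score_cutoff=3.0, value_cutoff=2):
    """Flag low-satisfaction, low-value applications as SaaS migration candidates."""
    efficiency = mean(app["user_scores"])           # technical impact proxy
    value = VALUE_WEIGHT[app["business_value"]]     # business value proxy
    if efficiency < score_cutoff and value < value_cutoff:
        return "candidate for SaaS migration"
    return "retain on current platform"

for app in applications:
    print(app["name"], "->", migration_recommendation(app))
```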

Relevance:

20.00%

Publisher:

Abstract:

Information architecture (IA) is defined as the high-level information requirements of an organisation. It is applied in areas such as information systems development, enterprise architecture, business process management and organisational change management. Still, the lack of methods and theories prevents information architecture from becoming a distinct discipline. Healthcare organisations are typically seen as information-intensive organisations, all the more so in a pervasive healthcare environment. Pervasive healthcare aims to provide healthcare services to anyone, anywhere and at any time by incorporating mobile devices and wireless networks. Information architecture therefore plays an important role in information provisioning within the context of pervasive healthcare, in order to support decision making and communication between clinicians and patients. Organisational semiotics is a socio-technical approach that considers information through the norms and activities performed within an organisation prior to a pervasive healthcare implementation. This paper proposes a conceptual design of an information architecture for pervasive healthcare, illustrated with a scenario of mental health patient monitoring.

Relevance:

20.00%

Publisher:

Abstract:

Wireless technology-based pervasive healthcare has been proposed for many applications, such as disease management and accident prevention, for cost saving and promoting citizens' wellbeing. However, the emphasis so far has been on the artefacts, with limited attention to guiding the development of an effective and efficient solution for pervasive healthcare. This paper therefore proposes a framework for multi-agent systems design for pervasive healthcare by adopting the concept of pervasive informatics and using the methods of organisational semiotics. The proposed multi-agent system for pervasive healthcare utilises sensory information to support healthcare professionals in providing appropriate care. The key contributions are both theoretical and practical. In theory, this paper articulates the information interactions between the pervasive healthcare environment and stakeholders by using the methods of organisational semiotics; in practice, the proposed framework improves healthcare quality by providing appropriate medical attention when and as needed. In this paper, both the system and functional architecture of the multi-agent system are elaborated with the use of wireless technologies such as RFID and wireless sensor networks. Future work will focus on the implementation of the proposed framework.
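As a minimal illustration of the sensory-information flow described above, the sketch below shows a monitoring agent that escalates out-of-range wireless sensor readings to a clinician agent. The class names, safe ranges and message format are assumptions for illustration, not the framework's actual design.

```python
# Minimal sketch: a monitoring agent forwards out-of-range sensor readings
# to a clinician agent. Thresholds and class names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SensorReading:
    patient_id: str
    kind: str        # e.g. "heart_rate", "temperature"
    value: float

SAFE_RANGES = {"heart_rate": (50.0, 110.0), "temperature": (35.5, 38.0)}

class ClinicianAgent:
    def notify(self, reading: SensorReading) -> None:
        print(f"ALERT: patient {reading.patient_id} {reading.kind}={reading.value}")

class MonitoringAgent:
    def __init__(self, clinician: ClinicianAgent):
        self.clinician = clinician

    def handle(self, reading: SensorReading) -> None:
        low, high = SAFE_RANGES.get(reading.kind, (float("-inf"), float("inf")))
        if not low <= reading.value <= high:
            self.clinician.notify(reading)      # escalate only when out of range

monitor = MonitoringAgent(ClinicianAgent())
monitor.handle(SensorReading("p-017", "heart_rate", 132.0))   # triggers an alert
```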

Relevance:

20.00%

Publisher:

Abstract:

Safety is an element of extreme priority in mining operations, and many traditional mining countries are currently investing in the implementation of wireless sensors capable of detecting risk factors, providing early warning signs to prevent accidents and significant economic losses. The objective of this research is to contribute to the implementation of sensors for continuous monitoring inside underground mines by providing technical parameters for the design of sensor networks applied in underground coal mines. The application of sensors capable of measuring variables of interest in real time promises to have a great impact on safety in the mining industry. The relationship between the geological conditions and the mining method design establishes how to implement a system of continuous monitoring. In this paper, the main causes of accidents in underground coal mines are established based on existing worldwide reports. Variables (temperature, gas, structural faults, fires) that can be related to the most frequent causes of disaster, and their relevant measuring ranges, are then presented; the advantages, management and mining operations are also discussed, including an analysis of applying these systems in terms of Benefit, Opportunity, Cost and Risk. The publication focuses on coal mining, given the proportion of these events each year worldwide in which a significant number of workers are seriously injured or killed. Finally, a dynamic assessment of safety at underground mines is proposed; this approach offers a contribution to the design of personalised monitoring networks, and the experience developed in coal mines provides a tool that facilitates the development and application of this technology within underground coal mines.
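A minimal sketch of such threshold-based early warning is given below; the monitored variables and warning limits are placeholder values for illustration, not the technical parameters derived in this research.

```python
# Illustrative sketch of continuous-monitoring alerting in an underground
# coal mine: compare each sensor reading with an assumed warning limit.
WARNING_LIMITS = {
    "methane_pct":   1.0,    # warn above 1% CH4 (assumed limit)
    "co_ppm":        30.0,
    "temperature_c": 32.0,
}

def check_readings(readings):
    """Return early-warning messages for readings above their limits."""
    alerts = []
    for variable, value in readings.items():
        limit = WARNING_LIMITS.get(variable)
        if limit is not None and value > limit:
            alerts.append(f"{variable} = {value} exceeds limit {limit}")
    return alerts

print(check_readings({"methane_pct": 1.4, "co_ppm": 12.0, "temperature_c": 29.0}))
```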