973 results for Platform approach


Relevance: 30.00%

Publisher:

Abstract:

Can autonomic computing concepts be applied to traditional multi-core systems found in high performance computing environments? In this paper, we propose a novel synergy between parallel computing and swarm robotics to offer a new computing paradigm, 'Swarm-Array Computing', that can harness and apply autonomic computing to parallel computing systems. One of the three approaches proposed in swarm-array computing, based on landscapes of intelligent cores in which the cores of a parallel computing system are abstracted as swarm agents, is investigated. In the proposed approach, a task is executed and transferred seamlessly between cores, thereby achieving the self-ware properties that characterize autonomic computing. FPGAs are considered as an experimental platform, taking into account their application in space robotics. The feasibility of the proposed approach is validated on the SeSAm multi-agent simulator.
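
The paper validates the idea in the SeSAm multi-agent simulator rather than in code reproduced here; purely as an illustration of the core idea, the minimal Python sketch below mimics a task being transferred between cores abstracted as agents until a healthy core completes it, with the class names and failure model being assumptions.

    class CoreAgent:
        """A processing core abstracted as a swarm agent (illustrative only)."""
        def __init__(self, core_id):
            self.core_id = core_id
            self.healthy = True

        def execute(self, task):
            # A faulty core cannot finish the task; it must be transferred onward.
            if not self.healthy:
                return False
            print(f"core {self.core_id} executed {task}")
            return True

    def run_with_self_healing(task, cores):
        """Transfer the task between cores until a healthy one completes it."""
        for core in cores:
            if core.execute(task):
                return core.core_id
        raise RuntimeError("no healthy core available")

    cores = [CoreAgent(i) for i in range(4)]
    cores[0].healthy = False          # simulate a failed core
    print("completed on core", run_with_self_healing("task-A", cores))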

Relevance: 30.00%

Publisher:

Abstract:

Body Sensor Networks (BSNs) have recently been introduced for the remote monitoring of human activities in a broad range of application domains, such as health care, emergency management, fitness and behaviour surveillance. BSNs can be deployed in a community of people and can generate large amounts of contextual data that require a scalable approach to storage, processing and analysis. Cloud computing can provide a flexible storage and processing infrastructure to perform both online and offline analysis of the data streams generated in BSNs. This paper proposes BodyCloud, a SaaS approach for community BSNs that supports the development and deployment of Cloud-assisted BSN applications. BodyCloud is a multi-tier, application-level architecture that integrates a Cloud computing platform with BSN data-stream middleware. BodyCloud provides programming abstractions that allow the rapid development of community BSN applications. This work describes the general architecture of the proposed approach and presents a case study on the real-time monitoring and analysis of the cardiac data streams of many individuals.
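
BodyCloud's actual middleware and SaaS tiers are not specified in this abstract; the fragment below is only a toy sketch, with assumed names, of the general pattern it describes: buffering per-subject sensor streams from a community of users and running a simple online analysis on each.

    from collections import defaultdict, deque

    # Illustrative online analysis of per-subject heart-rate streams.
    WINDOW = 5
    streams = defaultdict(lambda: deque(maxlen=WINDOW))

    def ingest(subject_id, heart_rate):
        """Buffer a new sample and return a rolling average for the subject."""
        buf = streams[subject_id]
        buf.append(heart_rate)
        return sum(buf) / len(buf)

    for subject, hr in [("alice", 72), ("bob", 90), ("alice", 75), ("alice", 78)]:
        print(subject, "rolling average:", round(ingest(subject, hr), 1))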

Relevance: 30.00%

Publisher:

Abstract:

Component-based software engineering has recently emerged as a promising solution to the development of system-level software. Unfortunately, current approaches are limited to specific platforms and domains. This lack of generality is particularly problematic as it prevents knowledge sharing and generally drives development costs up. In the past, we have developed a generic approach to component-based software engineering for system-level software called OpenCom. In this paper, we present OpenComL, an instantiation of OpenCom for Linux environments, and show how it can be profiled to meet the needs of a range of system-level software in such environments. To demonstrate this, we apply it to the construction of a programmable router platform and a middleware for parallel environments.
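
OpenCom's real API is not reproduced here; the snippet below is merely a generic sketch, with all names assumed, of the component-and-binding style that such component models rely on, where a composite (here a hypothetical router) is profiled by plugging a different component in behind a declared interface.

    from typing import Protocol

    class PacketFilter(Protocol):
        """An interface a component may provide or require."""
        def accept(self, packet: bytes) -> bool: ...

    class DropAll:
        def accept(self, packet: bytes) -> bool:
            return False

    class Router:
        """A composite component whose filter is bound at configuration time."""
        def __init__(self, packet_filter: PacketFilter):
            self.packet_filter = packet_filter

        def forward(self, packet: bytes) -> str:
            return "forwarded" if self.packet_filter.accept(packet) else "dropped"

    router = Router(DropAll())        # rebind with another filter to re-profile
    print(router.forward(b"hello"))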

Relevance: 30.00%

Publisher:

Abstract:

Stakeholders perceive the role of accountants to reflect trust, honesty, impartiality, fairness and transparency. The aim of this paper is to explore avenues to strengthen the moral integrity of professional bodies and their members. The resulting recommendations include a community or "milieu" approach.

Relevance: 30.00%

Publisher:

Abstract:

Shared clusters represent an excellent platform for the execution of parallel applications, given their low price/performance ratio and the presence of cluster infrastructure in many organisations. The focus of recent research efforts is on parallelism management, transport and efficient access to resources, and making clusters easy to use. In this thesis, we examine reliable parallel computing on clusters. The aim of this research is to demonstrate the feasibility of developing an operating system facility that provides transparent fault tolerance using existing, enhanced and newly built operating system services to support parallel applications. In particular, we use existing process duplication and process migration services, and synthesise a group communications facility for use in a transparent checkpointing facility. This research is carried out using the methods of experimental computer science.

To provide a foundation for the synthesis of the group communications and checkpointing facilities, we survey and review related work in both fields. For group communications, we examine the V Distributed System, the x-kernel and Psync, the ISIS Toolkit, and Horus, and identify a need for services that consider the placement of processes on computers in the cluster. For checkpointing, we examine Manetho, KeyKOS, libckpt, and Diskless Checkpointing, and observe the use of remote computer memories for storing checkpoints and the use of copy-on-write mechanisms to reduce the time taken to create a checkpoint of a process.

We propose a group communications facility providing two sets of services: user-oriented services and system-oriented services. User-oriented services provide transparency and target applications; system-oriented services supplement the user-oriented services to support other operating system services and do not provide transparency. Additional flexibility is achieved by providing delivery and ordering semantics independently. An operating system facility providing transparent checkpointing is synthesised using coordinated checkpointing. To ensure that a consistent set of checkpoints is generated, instead of blindly blocking the processes of a parallel application, only non-deterministic events are blocked; this allows the processes of the parallel application to continue execution during the checkpoint operation. Checkpoints are created by adapting process duplication mechanisms, and checkpoint data is transferred to remote computer memories and disk for storage using the mechanisms of process migration. The services of the group communications facility are used to coordinate the checkpoint operation and to transport checkpoint data to remote computer memories and disk.

Both the group communications facility and the checkpointing facility have been implemented in the GENESIS cluster operating system and provide a proof of concept. GENESIS uses a microkernel and client-server based operating system architecture, and is demonstrated to provide an appropriate environment for the development of these facilities. We design a number of experiments to test the performance of both facilities and to provide a proof of performance, and we present our approach to testing, the challenges raised in testing the facilities, and how we overcame them. For group communications, we examine the performance of a number of delivery semantics. Good speed-ups are observed, and system-oriented group communication services are shown to provide significant performance advantages over user-oriented semantics in the presence of packet loss. For checkpointing, we examine the scalability of the facility given different levels of resource usage and a variable number of computers; low overheads are observed for checkpointing a parallel application. This research makes clear that a microkernel, client-server based cluster operating system provides an ideal environment for developing a high-performance group communications facility and a transparent checkpointing facility, yielding a platform for reliable parallel computing on clusters.
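
The facilities described above are implemented inside the GENESIS operating system; as an illustration only of the coordination idea (block non-deterministic events such as message receives, not whole processes, while states are copied away), the Python sketch below uses assumed names and an in-memory stand-in for remote storage.

    import copy

    class Process:
        def __init__(self, pid):
            self.pid = pid
            self.state = {"counter": 0}
            self.block_nondeterministic = False   # e.g. message receives

        def deterministic_step(self):
            # Deterministic work may continue during a checkpoint.
            self.state["counter"] += 1

        def receive_message(self, msg):
            if self.block_nondeterministic:
                return "deferred"                 # blocked until checkpoint ends
            self.state.setdefault("inbox", []).append(msg)
            return "delivered"

    def coordinated_checkpoint(processes, remote_store):
        """Block only non-deterministic events, copy each state, then resume."""
        for p in processes:
            p.block_nondeterministic = True
        for p in processes:
            remote_store[p.pid] = copy.deepcopy(p.state)   # stand-in for remote memory
        for p in processes:
            p.block_nondeterministic = False

    procs = [Process(pid) for pid in (1, 2)]
    store = {}
    coordinated_checkpoint(procs, store)
    print(store)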

Relevance: 30.00%

Publisher:

Abstract:

This paper introduces a practical security model based on key security considerations, derived by examining a number of infrastructure aspects of Cloud Computing, such as SaaS, Utility, Web, Platform and Managed Services, Service commerce platforms and Internet Integration, and is presented together with a concise literature review. The purpose of this paper is to offer a macro-level solution for the identified common infrastructure security requirements. This model, together with a number of emerging patterns, can be applied to the infrastructure aspect of Cloud Computing as a proposed shared security approach in the system development life cycle, focusing on the plan-build-run scope.

Relevance: 30.00%

Publisher:

Abstract:

Performing underwater robotic research requires specialized equipment; a few pieces of electronics atop a set of wheels are not going to cut it. An underwater research platform must be waterproof, reliable, robust, recoverable and easy to maintain. It must also be able to move in three dimensions, navigate and avoid obstacles. Further, if the platform is to be part of a swarm of like platforms, it must be cost-effective and relatively small. Purchasing such a platform can be very expensive; however, for shallow water, a suitable platform can be built mostly from off-the-shelf items at little cost. This article describes the design of one such underwater robot, including the various sensors and communications systems that allow for swarm robotics. While the robotic platform performs well, exploring what many of them would do together, more than are physically available, requires simulation. The article therefore goes on to study how best to simulate these robots for a swarm, or system-of-systems, approach.
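
The article's own simulator is not described in detail here; the toy sketch below only illustrates the general point that many more simple 3-D platforms can be simulated than built, with the motion model and waypoints entirely assumed.

    import random

    def step(position, target, speed=0.5):
        """Move one platform a small increment towards its target in 3-D."""
        return tuple(p + speed * (t - p) for p, t in zip(position, target))

    # Simulate more robots than could be built, each heading to a shared waypoint.
    robots = [(random.uniform(-5, 5), random.uniform(-5, 5), -1.0) for _ in range(20)]
    targets = [(0.0, 0.0, -3.0)] * len(robots)

    for _ in range(10):                       # ten simulation ticks
        robots = [step(r, t) for r, t in zip(robots, targets)]

    print("first robot after 10 ticks:", tuple(round(c, 2) for c in robots[0]))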

Relevance: 30.00%

Publisher:

Abstract:

There is a paucity of data regarding hydrocarbon exposure of tropical fish species inhabiting the waters near oil and gas platforms on the Northwest Shelf of Australia. A comprehensive field study assessed the exposure and potential effects associated with the produced water (PW) plume from the Harriet A production platform on the Northwest Shelf in a local reef species, the Stripey seaperch (Lutjanus carponotatus). This field study was a continuation of an earlier pilot study, which concluded that there were "warning signs" of potential biological effects on fish populations exposed to PW. A 10-day field caging study was conducted, deploying 15 individual fish into 6 separate steel cages set 1 m subsurface at 3 stations along a concentration gradient moving away from the platform. A battery of biomarkers was evaluated, including the hepatosomatic index (HSI), total cytochrome P450, bile metabolites, CYP1A-, CYP2K- and CYP2M-like proteins, cholinesterase (ChE) activity, and histopathology of liver and gill tissues. Water column and PW effluent samples were also collected. Results confirmed that PAH metabolites in bile, CYP1A-, CYP2K- and CYP2M-like proteins, and liver histopathology provided evidence of significant exposure and effects after 10 days at the near-field site (~200 m off the Harriet A platform). Hepatosomatic index, total cytochrome P450 and ChE did not show site-specific differences by day 10 of exposure to PW. Principal component analysis (PCA) showed the CYP proteins to be the best diagnostic tool for determining exposure to, and the associated biological effects of, PW on L. carponotatus. Using a suite of biomarkers has been widely advocated as a vital component of environmental risk assessments worldwide. This study demonstrates the usefulness of biomarkers for assessing the Harriet A PW discharge into Australian waters, with broader applications for other PW discharges. The approach has merit as a valuable addition to environmental management strategies for protecting Australia's tropical environment and its rich biodiversity.
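
The study's biomarker measurements are not reproduced; the sketch below only illustrates, on random stand-in data, the kind of PCA projection used to separate exposure groups, with the number of biomarkers and the site labels assumed.

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in biomarker matrix: rows = fish, columns = biomarkers (e.g. CYP-like proteins).
    near_field = rng.normal(loc=2.0, scale=0.5, size=(15, 4))
    reference  = rng.normal(loc=0.0, scale=0.5, size=(15, 4))
    X = np.vstack([near_field, reference])

    # PCA via SVD of the mean-centred matrix.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:2].T                     # project onto first two components

    print("PC1 mean, near-field vs reference:",
          scores[:15, 0].mean().round(2), scores[15:, 0].mean().round(2))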

Relevance: 30.00%

Publisher:

Abstract:

Purpose – The application of "Google" econometrics (Geco) has evolved rapidly in recent years and can be applied in various fields of research. Based on accepted theories in the existing economic literature, this paper seeks to contribute to the innovative use of Google search query data by providing a new, innovative approach to property research.

Design/methodology/approach – In this study, existing data from Google Insights for Search (GI4S) is extended into a new potential source of consumer sentiment data based on visits to a commonly-used UK online real-estate agent platform (Rightmove.co.uk). In order to contribute to knowledge about the use of Geco's black box, namely the unknown sampling population and the specific search queries influencing the variables, the GI4S series are compared to direct web navigation.

Findings – The main finding from this study is that GI4S data produce immediate real-time results with a high level of reliability in explaining the future volume of transactions and house prices in comparison to the direct website data. Furthermore, the results reveal that the number of visits to Rightmove.co.uk is driven by GI4S data and vice versa, and indeed without a contemporaneous relationship.

Originality/value – This study contributes to the new emerging and innovative field of research involving search engine data. It also contributes to the knowledge base about the increasing use of online consumer data in economic research in property markets.
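
The paper's econometric models are not reproduced; the fragment below is only a hedged sketch of the simplest way to look for a lead-lag relationship between a search-interest series and a website-visit series, using synthetic data in place of the GI4S and Rightmove.co.uk series.

    import numpy as np

    rng = np.random.default_rng(1)
    search_interest = rng.normal(size=120).cumsum()   # stand-in for a GI4S index
    # Synthetic visits that lag search interest by two periods, plus noise.
    site_visits = np.roll(search_interest, 2) + rng.normal(scale=0.3, size=120)

    def lagged_corr(lead, other, lag):
        """Correlation between the leading series and the other series lag steps later."""
        return np.corrcoef(lead[:-lag], other[lag:])[0, 1]

    for lag in (1, 2, 3):
        print("lag", lag, "correlation:",
              round(lagged_corr(search_interest, site_visits, lag), 2))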

Relevance: 30.00%

Publisher:

Abstract:

This paper proposes a vision-based autonomous move-to-grasp approach for a compact mobile manipulator operating in low, confined environments. The visual information of a specified object, marked with a radial symbol and an overhead colour block, is extracted from two CMOS cameras in an embedded way. The mobile platform and the posture of the manipulator are adjusted continuously by vision-based control, which drives the mobile manipulator towards the object. When the mobile manipulator is sufficiently close to the object, only the manipulator moves to grasp it, using incremental movements in which the tip centre of the end-effector follows a Bezier curve. The effectiveness of the proposed approach is verified by experiments.
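
The paper's visual servoing pipeline is its own; the snippet below only illustrates, with assumed start, control and grasp points, how incremental waypoints for the end-effector tip can be generated along a cubic Bezier curve.

    def cubic_bezier(p0, p1, p2, p3, t):
        """Point on a cubic Bezier curve at parameter t in [0, 1]."""
        return tuple(
            (1 - t) ** 3 * a + 3 * (1 - t) ** 2 * t * b
            + 3 * (1 - t) * t ** 2 * c + t ** 3 * d
            for a, b, c, d in zip(p0, p1, p2, p3)
        )

    # Assumed poses (metres): current tip position, two shaping control points, grasp point.
    start = (0.0, 0.0, 0.20)
    ctrl1 = (0.10, 0.00, 0.25)
    ctrl2 = (0.20, 0.05, 0.15)
    grasp = (0.25, 0.05, 0.05)

    waypoints = [cubic_bezier(start, ctrl1, ctrl2, grasp, i / 10) for i in range(11)]
    for wp in waypoints[::5]:                          # print a few incremental targets
        print(tuple(round(c, 3) for c in wp))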

Relevance: 30.00%

Publisher:

Abstract:

Deakin University has a long association with e-learning platforms, utilising the functionality of various Learning Management Systems (LMS) over a period of years. Transforming learning and teaching is a key priority of the University and moving to a new generation e-learning platform that supports engaging learning experiences through quality course design is a strategic imperative.

In 2010 Deakin University selected Desire2Learn as its replacement LMS, an innovative platform that offers next-generation functionality. The University is investing significant resources in 2011 to implement the new system. The Library is harnessing the opportunity to embed search, discovery and information access throughout the LMS, including a presence at the highest level of navigation. A Library widget providing students with clear pathways and immediate access to key library collections, services and features is being developed by the Library in conjunction with the Faculties' academic champions and educational developers. Liaison Librarians are negotiating with academic staff to create context-specific pathways, to utilise Desire2Learn's Web 2.0 capabilities and to embed more personalised resources and LibGuides aligned with units of study. This is happening at a time when libraries are introducing new approaches to information discovery.

This paper describes Deakin University Library's journey in partnering with academic staff and others across the University to implement Desire2Learn as a vital new e-learning platform. It reports on many outcomes, including: value created by embedding quality information in learner-centred course delivery; increased awareness of library subscription resources when they are accessible within students' workspace; strong and continuing relationships built with academic staff; and enhanced Library staff engagement with flexible learning principles and new technologies. The question of where embedding information access in online courses and units fits with the Library's exploration of web-scale solutions is also touched upon. Finally, an insight into how recent research undertaken by Deakin University Library has influenced our approach to information discovery solutions suggests an opportunity for many more questions to be explored.

Relevance: 30.00%

Publisher:

Abstract:

The use of transgenic plants to produce novel products has great biotechnological potential, as the relatively inexpensive inputs of light, water and nutrients are utilised in return for potentially valuable bioactive metabolites, diagnostic proteins and vaccines. Extensive research is ongoing in this area internationally, with the aim of producing plant-made vaccines of importance for both animals and humans. Vaccine purification is generally regarded as being integral to the preparation of safe and effective vaccines for use in humans. However, the use of crude plant extracts for animal immunisation may enable plant-made vaccines to become a cost-effective and efficacious approach to safely immunise large numbers of farm animals against diseases such as avian influenza. Since the technology associated with genetic transformation and large-scale propagation is very well established in Nicotiana, the genus has attributes well suited to the production of plant-made vaccines. However, the presence of potentially toxic alkaloids in Nicotiana extracts impedes their use as crude vaccine preparations. In the current study, we describe a Nicotiana tabacum and N. glauca hybrid that expresses the HA glycoprotein of influenza A in its leaves but does not synthesize alkaloids. We demonstrate that injection with crude leaf extracts from these interspecific hybrid plants is a safe and effective approach for immunising mice. Moreover, this antigen-producing, alkaloid-free, transgenic interspecific hybrid is vigorous, with a high capacity for vegetative shoot regeneration after harvesting. These plants are easily propagated by vegetative cuttings and have the added benefit of not producing viable pollen, thus reducing potential problems associated with bio-containment. Hence, these Nicotiana hybrids provide an advantageous production platform for partially purified, plant-made vaccines, which may be particularly well suited for use in veterinary immunization programs.

Relevance: 30.00%

Publisher:

Abstract:

Thousands of the world's offshore oil and gas structures are approaching obsolescence and will require decommissioning within the next decade. Many nations have blanket regulations requiring obsolete structures to be removed, yet this option is unlikely to yield optimal environmental, societal and economic outcomes in all situations. We propose that nations adopt a flexible approach that allows decommissioning options to be selected from the full range of alternatives (including 'rigs-to-reefs' options) on a case-by-case basis. We outline a method of multi-criteria decision analysis, Multi-criteria Approval (MA), for evaluating and comparing alternative decommissioning options across key selection criteria, including environmental, financial, socioeconomic, and health and safety considerations. The MA approach structures the decision problem, forces explicit consideration of trade-offs and directly involves stakeholder groups in the decision process. We identify major decommissioning options and provide a generic list of selection criteria for inclusion in the MA decision process. To deal with knowledge gaps concerning the environmental impacts of decommissioning, we suggest that expert opinion feed into the MA approach until sufficient data become available. We conducted a limited trial of the MA decision approach to demonstrate its application to a complex and controversial decommissioning scenario, Platform Grace in southern California. For this example, the approach indicated that the option 'leave in place intact' would likely provide the best environmental outcomes in the event of future decommissioning. In summary, the MA approach allows the environmental, social and economic impacts of decommissioning decisions to be assessed simultaneously in a transparent manner. © 2013 Elsevier Ltd.
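
The published MA analysis rests on expert-elicited judgements; the sketch below only encodes the counting step of a Multi-criteria Approval comparison on made-up approvals, so the options, criteria and approval values are assumptions, not the study's results.

    # Illustrative approval matrix: True = option approved on that criterion.
    approvals = {
        "full removal":          {"environment": False, "cost": False, "safety": True,  "socioeconomic": False},
        "leave in place intact": {"environment": True,  "cost": True,  "safety": False, "socioeconomic": True},
        "partial removal":       {"environment": True,  "cost": False, "safety": True,  "socioeconomic": True},
    }

    def approval_count(option):
        """Number of criteria on which the option is approved."""
        return sum(approvals[option].values())

    for option in sorted(approvals, key=approval_count, reverse=True):
        print(option, approval_count(option))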

Relevance: 30.00%

Publisher:

Abstract:

This study is concerned with the design of a non-fragile controller for an offshore steel jacket platform with nonlinear perturbations. Delay-dependent sufficient conditions are derived in terms of linear matrix inequalities, based on a suitable Lyapunov–Krasovskii functional, the second-order reciprocally convex approach and the lower bound lemma. The results establish asymptotic stability of the offshore steel jacket platform under the proposed non-fragile controller. In addition, robust stability conditions are derived for an uncertain offshore platform subject to the non-fragile controller. A numerical example is given to illustrate the effectiveness of the proposed theoretical results.
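
The paper's exact system matrices, delay structure and LMI conditions are not reproduced here; the LaTeX display below is only a generic sketch of the kind of perturbed, delayed model and norm-bounded non-fragile feedback law for which such delay-dependent LMI conditions are typically derived, with all symbols being assumptions rather than the authors' notation.

    % illustrative model only; A, B, H, E, K, tau(t), mu are assumed symbols
    \dot{x}(t) = A\,x(t) + B\,u\bigl(t-\tau(t)\bigr) + g\bigl(x(t),t\bigr),
    \qquad \|g(x(t),t)\| \le \mu\,\|x(t)\|,

    % non-fragile state feedback with a norm-bounded gain perturbation
    u(t) = \bigl(K + \Delta K(t)\bigr)\,x(t),
    \qquad \Delta K(t) = H\,F(t)\,E,\quad F(t)^{\top}F(t) \le I .

In this generic setting, asymptotic stability is typically argued by choosing a Lyapunov-Krasovskii functional V(x_t) and recasting the requirement that its derivative be negative along trajectories, with the delay-related integral terms bounded via the reciprocally convex approach and the lower bound lemma, as a feasibility problem over linear matrix inequalities.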

Relevance: 30.00%

Publisher:

Abstract:

HydroShare is an online, collaborative system being developed for the open sharing of hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access hydrologic data and models, and to retrieve them to their desktop or perform analyses in a distributed computing environment that may include grid, cloud or high-performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about, and collaboration around, hydrologic data and models.

One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system metadata from science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These include the different data types used in the hydrology community, as well as models and workflows that require metadata on execution functionality. The HydroShare web interface and social media functions are being developed using the Drupal content management system. A geospatial visualization and analysis component enables searching, visualizing and analyzing geographic datasets. The integrated Rule-Oriented Data System (iRODS) is being used to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows.

This presentation introduces the HydroShare functionality developed to date, describes key elements of the Resource Data Model and outlines the roadmap for future development.
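
HydroShare's actual schema is not shown in this abstract; the dataclass sketch below merely illustrates the stated separation of system metadata common to all resources from science metadata specific to a resource type, with every field name here being an assumption.

    from dataclasses import dataclass, field

    @dataclass
    class SystemMetadata:
        """Elements common to every resource (illustrative fields only)."""
        resource_id: str
        owner: str
        created: str

    @dataclass
    class TimeSeriesScienceMetadata:
        """Elements specific to one resource type, e.g. an observed time series."""
        variable: str
        units: str
        site_code: str

    @dataclass
    class Resource:
        system: SystemMetadata
        science: TimeSeriesScienceMetadata
        files: list = field(default_factory=list)

    r = Resource(SystemMetadata("abc123", "jdoe", "2014-05-01"),
                 TimeSeriesScienceMetadata("discharge", "m3/s", "USGS-01010000"))
    print(r.system.resource_id, r.science.variable)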