953 results for Hybrid Multicast Framework
Abstract:
This paper presents an input-orientated data envelopment analysis (DEA) framework which allows the measurement and decomposition of economic, environmental and ecological efficiency levels in agricultural production across different countries. Economic, environmental and ecological optimisations search for optimal input combinations that minimise total costs, the total amount of nutrients and the total amount of cumulative exergy contained in inputs, respectively. The application of the framework to an agricultural dataset of 30 OECD countries revealed that (i) there was significant scope to make their agricultural production systems more environmentally and ecologically sustainable; (ii) improvements in environmental and ecological sustainability could be achieved by being more technically efficient and, even more significantly, by changing the input combinations; (iii) the rankings of sustainability varied significantly across OECD countries within frontier-based environmental and ecological efficiency measures and between frontier-based measures and indicators.
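The input-oriented optimisation described above can be sketched as a standard DEA linear programme: for each country, find the smallest factor theta by which its inputs can be scaled while a convex combination of the observed countries still produces at least its outputs. The sketch below is a minimal constant-returns (CCR) variant, not the paper's exact model; the data, function name and choice of solver are illustrative.

```python
# Minimal input-oriented DEA (CCR) sketch using an off-the-shelf LP solver.
# Assumes SciPy is available; data below are made up, not the OECD dataset.
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, o):
    """Efficiency of unit o: minimise theta such that a composite unit built
    from all units uses at most theta * inputs of o and produces at least
    its outputs. X is (n units x m inputs), Y is (n units x s outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # decision vector: [theta, lambda_1 .. lambda_n]
    c = np.zeros(1 + n)
    c[0] = 1.0                                  # minimise theta
    A_ub = np.zeros((m + s, 1 + n))
    b_ub = np.zeros(m + s)
    for i in range(m):                          # inputs: sum lam*x <= theta*x_o
        A_ub[i, 0] = -X[o, i]
        A_ub[i, 1:] = X[:, i]
    for r in range(s):                          # outputs: sum lam*y >= y_o
        A_ub[m + r, 1:] = -Y[:, r]
        b_ub[m + r] = -Y[o, r]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (1 + n), method="highs")
    return res.x[0]
```

A score of 1 marks a unit on the efficient frontier; replacing physical input quantities with costs, nutrient contents or cumulative exergy would give the economic, environmental and ecological variants respectively.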
Abstract:
Aims and objectives: To evaluate the safety and quality of nurse practitioner service using the audit framework of Structure, Process and Outcome. Background: Health service and workforce reform are on the agenda of governments and other service providers seeking to contain healthcare costs while providing safe and effective health care to communities. The nurse practitioner service is one health workforce innovation that has been adopted globally to improve timely access to clinical care, but there is scant literature reporting evaluation of the quality of this service innovation. Design: A mixed-methods design within the Donabedian evaluation framework was used. Methods: The Donabedian framework was used to evaluate the Structure, Process and Outcome of nurse practitioner service. A range of data collection approaches was used, including a stakeholder survey (n=36), in-depth interviews (11 patients and 13 nurse practitioners) and health records data on service processes. Results: The study identified that adequate and detailed preparation of Structure and Process is essential for the successful implementation of a service innovation. The multidisciplinary team accepted the addition of the nurse practitioner service, and nurse practitioner clinical care was shown to be effective, satisfactory and safe from the perspective of clinician stakeholders and patients. Conclusions: This study demonstrated that the Donabedian framework of Structure, Process and Outcome evaluation is a valuable and validated approach to examining the safety and quality of a service innovation. Furthermore, specific Structure elements were shown to influence the quality of service processes, further validating the framework and the interdependence of the Structure, Process and Outcome components.
Relevance to clinical practice: Understanding the Structure and Process requirements for establishing a nursing service innovation lays the foundation for safe, effective and patient-centred clinical care.
Abstract:
Background: Historically, rail organisations have operated in silos and devised their own training agendas. However, with the harmonisation of Australian workplace health and safety legislation and the appointment of a national rail safety regulator in 2013, rail incident investigation experts are exploring the possibility of developing a unified approach to investigator training. Objectives: The Australian CRC for Rail Innovation commissioned a training needs analysis to identify whether common training needs existed between organisations and to assess support for the development of a national competency framework for rail incident investigations. Method: Fifty-two industry experts were consulted to explore the possibility of developing a standardised training framework. These experts were drawn from 19 Australasian organisations, comprising rail operators and regulators in Queensland, New South Wales, Victoria, Western Australia, South Australia and New Zealand. Results: Although some competency requirements appear to be organisation-specific, the vast majority of reported training requirements were generic across the Australasian rail operators and regulators. Industry experts consistently reported strong support for the development of a national training framework. Significance: The identification of both generic training requirements across organisations and strong support for standardised training indicates that the rail industry is receptive to the development of a structured training framework. The development of an Australasian learning framework could: increase efficiency in course development and reduce costs; establish recognised career pathways; and facilitate consistency with regard to investigator training.
Abstract:
Despite the compelling case for moving towards cloud computing, the upstream oil & gas industry faces several technical challenges—most notably, a pronounced emphasis on data security, a reliance on extremely large data sets, and significant legacy investments in information technology (IT) infrastructure—that make a full migration to the public cloud difficult at present. Private and hybrid cloud solutions have consequently emerged within the industry to yield as much benefit from cloud-based technologies as possible while working within these constraints. This paper argues, however, that the move to private and hybrid clouds will very likely prove only to be a temporary stepping stone in the industry’s technological evolution. By presenting evidence from other market sectors that have faced similar challenges in their journey to the cloud, we propose that enabling technologies and conditions will probably fall into place in a way that makes the public cloud a far more attractive option for the upstream oil & gas industry in the years ahead. The paper concludes with a discussion about the implications of this projected shift towards the public cloud, and calls for more of the industry’s services to be offered through cloud-based “apps.”
Abstract:
Recent decades have witnessed a global acceleration of legislative and private sector initiatives to deal with cross-border insolvency. Legislative institutions include the various national implementations of the Model Law on Cross-Border Insolvency (Model Law) published by the United Nations Commission on International Trade Law (UNCITRAL). Private mechanisms include cross-border protocols developed and utilised by insolvency professionals and their advisers (often with the imprimatur of the judiciary), on both general and ad hoc bases. The Asia Pacific region has not escaped the effect of those developments, and the economic turmoil of the past few years has provided an early test for some of the emerging initiatives in that region. This two-part article explores the operation of those institutions through the medium of three recent cases.
Abstract:
The aim of this paper is to implement a game-theory-based offline mission path planner for aerial inspection tasks over large linear infrastructures. Like most real-world optimisation problems, mission path planning involves a number of objectives which ideally should be minimised simultaneously. The goal of this work is therefore to develop a Multi-Objective (MO) optimisation tool able to provide a set of optimal solutions for the inspection task, given the environment data, the mission requirements and the definition of the objectives to minimise. Results indicate the robustness of the method and its capability to find trade-offs among the Pareto-optimal solutions.
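The set of optimal solutions such an MO planner returns is the Pareto front: candidate plans whose objective vectors (e.g. path length, risk, energy) are not dominated by any other candidate. A minimal dominance filter, with made-up objective values standing in for real mission plans, might look like:

```python
# Sketch of Pareto-optimal filtering: keep only points whose objective
# vectors (to be minimised component-wise) are not dominated by another.
def pareto_front(points):
    """points: list of equal-length tuples of objective values."""
    front = []
    for p in points:
        dominated = any(
            all(q[i] <= p[i] for i in range(len(p)))
            and any(q[i] < p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front
```

For example, with objective pairs such as (length, risk), the filter keeps the non-dominated trade-off curve and discards any plan that is worse on every objective than some alternative.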
Abstract:
This paper presents an efficient method using a system state sampling technique in Monte Carlo simulation for reliability evaluation of multi-area power systems at Hierarchical Level One (HLI). System state sampling is one of the common methods used in Monte Carlo simulation, but its CPU time and memory requirements can be problematic. The combination of analytical and Monte Carlo methods, known as the hybrid method and presented in this paper, can enhance the efficiency of the solution. The load model in this study can be incorporated either by sampling or by enumeration; both cases are examined by applying the methods to the Roy Billinton Test System (RBTS).
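The system-state sampling idea can be illustrated with a small sketch: each generating unit is sampled as up or down according to its forced outage rate, and the loss-of-load probability is estimated as the fraction of sampled states whose surviving capacity falls below the load. The unit data below are hypothetical, not the RBTS figures.

```python
# Illustrative system-state sampling for a single-area generation model.
# Each unit is (capacity_mw, forced_outage_rate); numbers are made up.
import random

def sample_lolp(units, load, n_samples, seed=1):
    """Estimate loss-of-load probability by sampling unit states."""
    rng = random.Random(seed)
    loss_states = 0
    for _ in range(n_samples):
        # a unit is available when its uniform draw clears the outage rate
        available = sum(cap for cap, outage_rate in units
                        if rng.random() >= outage_rate)
        if available < load:
            loss_states += 1
    return loss_states / n_samples

units = [(40, 0.03), (40, 0.03), (20, 0.02)]       # hypothetical 3-unit system
lolp = sample_lolp(units, load=60, n_samples=50_000)
```

In the hybrid scheme the abstract describes, part of this state space would instead be handled analytically (or the load model enumerated), reducing the number of samples needed for a given accuracy.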
Abstract:
Due to the increased complexity, scale and functionality of information and telecommunication (IT) infrastructures, new exploits and vulnerabilities are discovered every day. These vulnerabilities are most often used by malicious actors to penetrate IT infrastructures, mainly to disrupt business or steal intellectual property. Recent incidents prove that it is no longer sufficient to perform manual security tests of the IT infrastructure based on sporadic security audits; instead, networks should be continuously tested against possible attacks. In this paper we present current results and challenges towards realizing automated and scalable solutions for identifying possible attack scenarios in an IT infrastructure. Specifically, we define an extensible framework which uses public vulnerability databases to identify probable multi-step attacks in an IT infrastructure, and provides recommendations in the form of patching strategies, topology changes and configuration updates.
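The multi-step attack identification such a framework performs can be pictured as a reachability search: starting from the attacker's foothold, each hop to a neighbouring host is allowed only if that host exposes a known vulnerability. The topology, host names and vulnerability identifiers below are invented for illustration; a real implementation would populate them from public vulnerability databases.

```python
# Minimal sketch of multi-step attack discovery as a breadth-first search
# over a host graph; hosts and vulnerability IDs are hypothetical.
from collections import deque

def attack_path(topology, vulns, start, target):
    """Return one shortest chain of compromised hosts, or None."""
    parent = {start: None}
    queue = deque([start])
    while queue:
        host = queue.popleft()
        if host == target:
            chain = []
            while host is not None:
                chain.append(host)
                host = parent[host]
            return chain[::-1]
        for nxt in topology.get(host, []):
            # a hop is possible only if the next host has a known exploit
            if nxt not in parent and vulns.get(nxt):
                parent[nxt] = host
                queue.append(nxt)
    return None

topology = {"internet": ["web"], "web": ["app"], "app": ["db"]}
vulns = {"web": ["demo-vuln-1"], "app": ["demo-vuln-2"], "db": ["demo-vuln-3"]}
```

Patching recommendations then fall out naturally: removing any vulnerability along the returned chain (e.g. the one on `app`) breaks the attack path.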
Abstract:
This chapter presents a comparative survey of recent key management (key distribution, discovery, establishment and update) solutions for wireless sensor networks. We consider both distributed and hierarchical sensor network architectures where unicast, multicast and broadcast types of communication take place. Probabilistic, deterministic and hybrid key management solutions are presented, and we determine a set of metrics to quantify their security properties and resource usage such as processing, storage and communication overheads. We provide a taxonomy of solutions, and identify trade-offs in these schemes to conclude that there is no one-size-fits-all solution.
Abstract:
ZnO is a wide band-gap semiconductor with several desirable properties for optoelectronic devices. With its large exciton binding energy of ~60 meV, ZnO is a promising candidate for high-stability, room-temperature luminescent and lasing devices [1]. Ultraviolet light-emitting diodes (LEDs) based on ZnO homojunctions have been reported [2,3], but preparing stable p-type ZnO remains a challenge. An alternative is to use other p-type semiconductors, either inorganic or organic, to form heterojunctions with the naturally n-type ZnO. The crystal structure of wurtzite ZnO can be described as Zn and O atomic layers alternately stacked along the [0001] direction. Because growth is fastest over the polar (0001) facet, ZnO crystals tend to grow into one-dimensional structures such as nanowires and nanobelts. Since the first report of ZnO nanobelts in 2001 [4], ZnO nanostructures have been studied intensively for their potential applications in nano-sized devices. Various methods have been developed for growing ZnO nanostructures, such as chemical vapor deposition (CVD), metal-organic CVD (MOCVD), aqueous growth and electrodeposition [5]. Building on the successful synthesis of ZnO nanowires/nanorods, various types of hybrid light-emitting diodes (LEDs) have been made. Inorganic p-type semiconductors, such as GaN, Si and SiC, have been used as substrates on which to grow ZnO nanorods/nanowires for LEDs. GaN is an ideal match for ZnO, both in crystal structure and in energy band levels; however, preparing Mg-doped p-GaN films by epitaxial growth is still costly. In comparison, organic semiconductors are inexpensive and offer many options, as a large variety of p-type polymer and small-molecule semiconductors are now commercially available. Organic semiconductors, however, are limited in durability and environmental stability.
Many polymer semiconductors are susceptible to damage by humidity or mere exposure to oxygen in the air, and their carrier mobilities are generally lower than those of inorganic semiconductors. Nevertheless, the combination of polymer semiconductors and ZnO nanostructures opens the way to flexible LEDs. A few reports exist on hybrid LEDs based on ZnO/polymer heterojunctions, some of which showed the characteristic UV electroluminescence (EL) of ZnO. This chapter reports recent progress on hybrid LEDs based on ZnO nanowires and other inorganic/organic semiconductors. We provide an overview of ZnO-nanowire-based hybrid LEDs from the perspectives of device configuration, growth methods for ZnO nanowires and the selection of p-type semiconductors. Device performances and remaining issues are also presented.
Abstract:
Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user friendly supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries, and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with the assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
Abstract:
Organizations invest heavily in Supply Chain Management Systems, expecting the benefits promised by the software vendors and the implementation partners. However, both academic and industry reports suggest growing dissatisfaction among client organizations due to an increasing gap between the benefits purported by the software vendors and the benefits realised by the client. In order to better manage the expectations of the client organization, this study proposes a Benefit Expectation Management Framework for Supply Chain Management Systems, based on Expectation-Confirmation Theory. The study derives 60 expected benefits of Supply Chain Management Systems from 41 vendor-reported customer stories and academic papers. By comparing those benefits with the benefits received by a case organization that has run SAP Supply Chain Management Systems for seven years, two salient factors – long timetable and multiple stakeholders – are identified as the controlling factors affecting the confirmation level of Supply Chain Management System expectations and, in turn, the satisfaction of a client organization. The case study also highlights the likely causes of realised benefits and enduring issues in relation to Supply Chain Management Systems.