932 results for Competing Values Framework
A hybrid simulation framework to assess the impact of renewable generators on a distribution network
Abstract:
With an increasing number of small-scale renewable generator installations, distribution network planners are faced with new technical challenges (intermittent load flows, network imbalances…). At the same time, these decentralized generators (DGs) present opportunities for savings on network infrastructure if installed at strategic locations. How can we consider both of these aspects when building decision tools for planning future distribution networks? This paper presents a simulation framework that combines two modeling techniques: agent-based modeling (ABM) and particle swarm optimization (PSO). ABM is used to represent the different system units of the network accurately and dynamically, simulating over short time periods. PSO is then used to find the most economical configuration of DGs over longer periods of time. The infrastructure of the framework is introduced, presenting the two modeling techniques and their integration. A case study of Townsville, Australia, is then used to illustrate the platform implementation and the outputs of a simulation.
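The abstract's PSO layer searches for the most economical placement of DGs. As an illustration only, here is a minimal particle swarm optimiser in Python; the quadratic cost function is a hypothetical stand-in for the network-cost evaluation that the ABM layer would actually supply (the paper's real cost model and parameters are not given here):

```python
import random

def pso(cost, dim, n_particles=20, iters=100, bounds=(0.0, 1.0), seed=0):
    """Minimal particle swarm optimisation: minimise `cost` over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]   # swarm-wide best
    w, c1, c2 = 0.7, 1.4, 1.4  # inertia, cognitive and social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Hypothetical cost: minimised when each of 3 DG sites hosts 0.5 units of capacity.
best, best_cost = pso(lambda x: sum((xi - 0.5) ** 2 for xi in x), dim=3)
```

In the paper's framework the inner cost evaluation would be a full ABM run over short time periods rather than a closed-form expression.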
Abstract:
This paper presents an input-orientated data envelopment analysis (DEA) framework which allows the measurement and decomposition of economic, environmental and ecological efficiency levels in agricultural production across different countries. Economic, environmental and ecological optimisations search for optimal input combinations that minimise total costs, the total amount of nutrients, and the total amount of cumulative exergy contained in inputs, respectively. The application of the framework to an agricultural dataset of 30 OECD countries revealed that (i) there was significant scope to make their agricultural production systems more environmentally and ecologically sustainable; (ii) the improvement in environmental and ecological sustainability could be achieved by being more technically efficient and, even more significantly, by changing the input combinations; (iii) the rankings of sustainability varied significantly across OECD countries within frontier-based environmental and ecological efficiency measures and between frontier-based measures and indicators.
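The input-oriented DEA optimisation described above amounts to one small linear program per decision-making unit. A minimal sketch under constant returns to scale, using `scipy.optimize.linprog` and toy single-input, single-output data (the paper's actual cost, nutrient and exergy inputs are not reproduced here):

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y):
    """Input-oriented, constant-returns-to-scale DEA: for each unit, find the
    largest radial contraction of its inputs that a non-negative combination
    of peers could still match while producing at least its outputs."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)  # rows = units
    n = X.shape[0]
    scores = []
    for j in range(n):
        # Decision vector: [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  sum_k lambda_k * x_k <= theta * x_j
        A_in = np.hstack([-X[j][:, None], X.T])
        b_in = np.zeros(X.shape[1])
        # Outputs: sum_k lambda_k * y_k >= y_j
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        b_out = -Y[j]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)  # theta = input efficiency of unit j
    return scores

# Toy data: one aggregate input and one output per country (illustrative only).
scores = dea_input_efficiency(X=[[2], [4], [6]], Y=[[2], [4], [3]])
```

Swapping the input matrix between cost, nutrient and exergy measures gives the economic, environmental and ecological variants the abstract describes.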
Abstract:
This article presents partial results of ongoing research that draws on the ethnographic method to investigate the community communication, journalism and photojournalism projects developed by two non-governmental organisations in the city of Rio de Janeiro. The fieldwork, carried out over three months in Rio's favelas, yielded theoretical questions relevant to journalism studies, most notably around the notion of news values. Focused on producing narratives centred on everyday life in the communities, the community photojournalists consider it essential to challenge the news values formulated by the mainstream press and to propose “counter-values”.
Abstract:
Aims and objectives To evaluate the safety and quality of nurse practitioner service using the audit framework of Structure, Process and Outcome. Background Health service and workforce reform are on the agenda of governments and other service providers seeking to contain healthcare costs whilst providing safe and effective health care to communities. The nurse practitioner service is one health workforce innovation that has been adopted globally to improve timely access to clinical care, but there is scant literature reporting evaluation of the quality of this service innovation. Design A mixed-methods design within the Donabedian evaluation framework was used. Methods The Donabedian framework was used to evaluate the Structure, Process and Outcome of nurse practitioner service. A range of data collection approaches was used, including a stakeholder survey (n=36), in-depth interviews (11 patients and 13 nurse practitioners) and health records data on service processes. Results The study identified that adequate and detailed preparation of Structure and Process is essential for the successful implementation of a service innovation. The multidisciplinary team was accepting of the addition of nurse practitioner service, and nurse practitioner clinical care was shown to be effective, satisfactory and safe from the perspective of the clinician stakeholders and patients. Conclusions This study demonstrated that the Donabedian framework of Structure, Process and Outcome evaluation is a valuable and validated approach to examining the safety and quality of a service innovation. Furthermore, in this study, specific Structure elements were shown to influence the quality of service processes, further validating the framework and the interdependence of the Structure, Process and Outcome components.
Relevance to clinical practice Understanding the structure and process requirements for establishing nursing service innovation lays the foundation for safe, effective and patient-centred clinical care.
Abstract:
There has been significant attention from the managers and purchasers of health services regarding the economic advantages that result from changes to the patterns of health care delivery in the acute hospital setting. While these changes often yield advantages at the economic management level of health care, they can have different consequences for the people who deliver and the people who receive health services. This paper reports on a study that was conducted with a group of nurses to investigate the practice milieu of a critical care unit in the context of changes to health service management. Interpretive methods were used to capture the perspective of the nurses and the way they interpret the multiple factors that influence their practice and their practice environment. The findings indicate that the nurses in the study setting interpret these factors according to the influences they have on the structure, the geography and the value of their work. Explication of these findings provides a research base to inform recommendations for improving the practice milieu of the critical care environment.
Abstract:
Background: Historically, rail organisations have been operating in silos and devising their own training agendas. However, with the harmonisation of Australian workplace health and safety legislation and the appointment of a national rail safety regulator in 2013, rail incident investigation experts are exploring the possibility of developing a unified approach to investigator training. Objectives: The Australian CRC for Rail Innovation commissioned a training needs analysis to identify whether common training needs existed between organisations and to assess support for the development of a national competency framework for rail incident investigations. Method: Fifty-two industry experts were consulted to explore the possible development of a standardised training framework. These experts were sourced from within 19 Australasian organisations, comprising rail operators and regulators in Queensland, New South Wales, Victoria, Western Australia, South Australia and New Zealand. Results: Although some competency requirements appear to be organisation specific, the vast majority of reported training requirements were generic across the Australasian rail operators and regulators. Industry experts consistently reported strong support for the development of a national training framework. Significance: The identification of both generic training requirements across organisations and strong support for standardised training indicates that the rail industry is receptive to the development of a structured training framework. The development of an Australasian learning framework could: increase efficiency in course development and reduce costs; establish recognised career pathways; and facilitate consistency with regard to investigator training.
Abstract:
Vietnam has a unique culture which is revealed in the way that people have built and designed their traditional housing. Vietnamese dwellings reflect occupants’ activities in their everyday lives, while adapting to tropical climatic conditions shaped by seasonal monsoons. These characteristics of Vietnamese dwellings are said to have remained unchanged until the economic reform in 1986, when Vietnam experienced accelerated development based on a market-oriented economy. New housing types, including modern shop-houses, detached houses, and apartments, have been designed in many places, particularly to satisfy dwellers’ new lifestyles in Vietnamese cities. Contemporary housing, which has mostly been designed by architects, reflects rules of spatial organisation through which occupants’ social activities are carried out. However, contemporary housing spaces seem unsustainable in relation to socio-cultural values because they have been influenced by a globalism that advocates the use of homogeneous spatial patterns, modern technologies, materials and construction methods. This study investigates the rules of spaces in Vietnamese houses that were built before and after the reform to define the socio-cultural implications for Vietnamese housing design. Firstly, it describes occupants’ views of their current dwellings in terms of indoor comfort conditions and social activities in spaces. Then, it examines the use of spaces in pre-reform Vietnamese housing through occupants’ activities and material applications. Finally, it discusses the organisation of spaces in both pre- and post-reform housing to understand how Vietnamese housing has been designed for occupants to live, act, work, and conduct traditional activities. Understanding spatial organisation is a way to identify characteristics of the lived spaces of the occupants created from the conceived space, which is designed by designers.
The characteristics of the housing spaces will inform designers how to design future Vietnamese housing in response to cultural contexts. The study applied an abductive approach to the investigation of housing spaces. It used a conceptual framework drawing on Henri Lefebvre’s (1991) theory to understand space as the main factor constituting the language of design, and the principles of semiotics to examine spatial structure in housing as a language used in everyday life. The study involved a door-knocking survey of 350 households in four regional cities of Vietnam for interpretation of occupancy conditions and levels of occupants’ comfort. A statistical analysis was applied to interpret the survey data. The study also required a process of data selection and collection for fourteen cases of housing in three main climatic regions of the country for analysing spatial organisation and housing characteristics. The study found that there has been a shift in the relationship of spaces from pre- to post-reform Vietnamese housing. It also identified that the space for guest welcoming and family activity has been the central space of Vietnamese housing. Based on the relationships of the central space with the others, theoretical models were proposed for three types of contemporary Vietnamese housing. The models will be significant in adapting housing design to Vietnamese conditions to achieve socio-environmental characteristics because they were developed from the occupants’ requirements for their social activities. Another contribution of the study is the use of methodological concepts to understand the language of living spaces. Further work will be needed to test future Vietnamese housing designs based on applications of the models.
Abstract:
Recent decades have witnessed a global acceleration of legislative and private sector initiatives to deal with cross-border insolvency. Legislative institutions include the various national implementations of the Model Law on Cross-Border Insolvency (Model Law) published by the United Nations Commission on International Trade Law (UNCITRAL). Private mechanisms include cross-border protocols developed and utilised by insolvency professionals and their advisers (often with the imprimatur of the judiciary), on both general and ad hoc bases. The Asia Pacific region has not escaped the effect of those developments, and the economic turmoil of the past few years has provided an early test for some of the emerging initiatives in that region. This two-part article explores the operation of those institutions through the medium of three recent cases.
Abstract:
In this paper, we examine the lawfulness of a proposal to provide elective ventilation to incompetent patients who are potential organ donors. Under the current legal framework, this depends on whether the best interests test could be satisfied. It might be argued that, because the Mental Capacity Act 2005 (UK) (and the common law) makes it clear that the best interests test is not confined to the patient's clinical interests, but extends to include the individual's own values, wishes and beliefs, the proposal will be in the patient's best interests. We reject this claim. We argue that, as things currently stand, the proposal could not lawfully be justified as a blanket proposition by reference to the best interests test. Accordingly, a modification of the law would be necessary to render the proposal lawful. We conclude with a suggestion about how that could be achieved.
Abstract:
Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way.
If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression due to the body’s natural attempts at compensation.
Part III of this thesis builds upon a previous Bayesian technique, which develops a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainties. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
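The likelihood-free idea this abstract describes — compare simulated and observed summary statistics instead of evaluating an intractable likelihood — is easiest to see in the basic ABC rejection sampler, of which the thesis's SMC algorithms are more efficient refinements. A minimal sketch on a toy normal model (the model, prior and tolerance below are illustrative assumptions, not the thesis's):

```python
import random

def abc_rejection(observed_stat, simulate, prior_sample, n_accept=200, tol=0.2, seed=1):
    """Minimal ABC rejection sampler: keep prior draws whose simulated
    summary statistic lies within `tol` of the observed one."""
    rng = random.Random(seed)
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed_stat) <= tol:
            accepted.append(theta)
    return accepted

# Toy model: data are N(theta, 1); the summary statistic is the sample mean.
def simulate(theta, rng, n=50):
    return sum(rng.gauss(theta, 1.0) for _ in range(n)) / n

posterior = abc_rejection(
    observed_stat=2.0,                                # "observed" sample mean
    simulate=simulate,
    prior_sample=lambda rng: rng.uniform(-5.0, 5.0),  # flat prior on theta
)
post_mean = sum(posterior) / len(posterior)
```

The cost of rejection sampling — most prior draws are wasted — is precisely what motivates the SMC-based ABC algorithms of Part I.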
Abstract:
Due to the increased complexity, scale, and functionality of information and telecommunication (IT) infrastructures, new exploits and vulnerabilities are discovered every day. These vulnerabilities are most often used by malicious actors to penetrate IT infrastructures, mainly to disrupt business or steal intellectual property. Current incidents prove that it is no longer sufficient to perform manual security tests of the IT infrastructure based on sporadic security audits. Instead, networks should be continuously tested against possible attacks. In this paper we present current results and challenges towards realizing automated and scalable solutions to identify possible attack scenarios in an IT infrastructure. Namely, we define an extensible framework which uses public vulnerability databases to identify probable multi-step attacks in an IT infrastructure, and provide recommendations in the form of patching strategies, topology changes, and configuration updates.
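The multi-step attack identification such a framework performs can be sketched as path enumeration over a host/vulnerability graph. A toy illustration with placeholder hostnames and CVE identifiers (the real framework would populate these from public vulnerability databases):

```python
from collections import deque

# Hypothetical reachability graph: edge (src, dst, vuln) means a vulnerability
# on `dst` is exploitable from `src`. All identifiers are placeholders.
edges = [
    ("internet", "web01", "CVE-A"),
    ("web01", "app01", "CVE-B"),
    ("app01", "db01", "CVE-C"),
    ("web01", "db01", "CVE-D"),
]

def attack_paths(edges, start, target):
    """Enumerate simple multi-step attack paths from `start` to `target`,
    each reported as the sequence of vulnerabilities exploited."""
    adj = {}
    for src, dst, vuln in edges:
        adj.setdefault(src, []).append((dst, vuln))
    paths, queue = [], deque([(start, [start], [])])
    while queue:
        node, visited, vulns = queue.popleft()
        if node == target:
            paths.append(vulns)
            continue
        for dst, vuln in adj.get(node, []):
            if dst not in visited:                    # keep paths simple
                queue.append((dst, visited + [dst], vulns + [vuln]))
    return paths

paths = attack_paths(edges, "internet", "db01")
```

Ranking such paths by exploitability then suggests which patch, topology or configuration change cuts the most attack scenarios.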
Abstract:
Flexible information exchange is critical to successful design-analysis integration, but current top-down, standards-based and model-oriented strategies impose restrictions that contradict this flexibility. In this article we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. We then discuss how a shared mapping process that is flexible and user-friendly supports non-programmers in creating these custom connections. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We then discuss potential challenges and opportunities for its development as a flexible, visual, collaborative, scalable and open system.
Abstract:
Flexible information exchange is critical to successful design integration, but current top-down, standards-based and model-oriented strategies impose restrictions that are contradictory to this flexibility. In this paper we present a bottom-up, user-controlled and process-oriented approach to linking design and analysis applications that is more responsive to the varied needs of designers and design teams. Drawing on research into scientific workflows, we present a framework for integration that capitalises on advances in cloud computing to connect discrete tools via flexible and distributed process networks. Adopting a services-oriented system architecture, we propose a web-based platform that enables data, semantics and models to be shared on the fly. We discuss potential challenges and opportunities for the development thereof as a flexible, visual, collaborative, scalable and open system.
Abstract:
Student performance on examinations is influenced by the level of difficulty of the questions. It therefore seems reasonable to propose that assessment of the difficulty of exam questions could be used to gauge the level of skills and knowledge expected at the end of a course. This paper reports the results of a study investigating the difficulty of exam questions using a subjective assessment of difficulty and a purpose-built exam question complexity classification scheme. The scheme, devised for exams in introductory programming courses, assesses the complexity of each question using six measures: external domain references, explicitness, linguistic complexity, conceptual complexity, length of code involved in the question and/or answer, and intellectual complexity (Bloom level). We apply the scheme to 20 introductory programming exam papers from five countries, and find substantial variation across the exams for all measures. Most exams include a mix of questions of low, medium, and high difficulty, although seven of the 20 have no questions of high difficulty. All of the complexity measures correlate with assessment of difficulty, indicating that the difficulty of an exam question relates to each of these more specific measures. We discuss the implications of these findings for the development of measures to assess learning standards in programming courses.
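The reported correlation between each complexity measure and assessed difficulty is the kind of relationship a rank correlation such as Spearman's rho captures for ordinal ratings. A self-contained sketch with made-up ratings (the study's actual data and its chosen correlation statistic are not reproduced here):

```python
def rank(values):
    """Assign 1-based ranks, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1                 # average rank of the tied group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical ratings for five questions: Bloom level vs assessed difficulty.
bloom = [1, 2, 3, 4, 5]
difficulty = [1, 3, 2, 5, 4]
rho = spearman(bloom, difficulty)
```

A rho near 1 for every measure would correspond to the paper's finding that all six complexity measures track assessed difficulty.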