863 results for Ershov hierarchy


Relevance:

20.00%

Publisher:

Abstract:

Background: The Analytic Hierarchy Process (AHP), developed by Saaty in the late 1970s, is one of the methods for multi-criteria decision making. The AHP disaggregates a complex decision problem into different hierarchical levels. The weights for each criterion and alternative are judged in pairwise comparisons and priorities are calculated by the eigenvector method. The increasing application of the AHP was the motivation for this study to explore the current state of its methodology in the healthcare context. Methods: A systematic literature review was conducted by searching the PubMed and Web of Science databases for articles with the following keywords in their titles or abstracts: "Analytic Hierarchy Process," "Analytical Hierarchy Process," "multi-criteria decision analysis," "multiple criteria decision," "stated preference," and "pairwise comparison." In addition, we developed reporting criteria to indicate whether the authors reported important aspects, and evaluated the resulting studies' reporting. Results: The systematic review yielded 121 articles. The number of studies applying the AHP has increased since 2005. Most studies were from Asia (almost 30%), followed by the US (25.6%). On average, the studies used 19.64 criteria across their hierarchical levels. Furthermore, we restricted a detailed analysis to those articles published within the last 5 years (n = 69). The mean number of participants in these studies was 109, and we identified major differences in how the surveys were conducted. The evaluation of reporting showed that the mean number of reported elements was about 6.75 out of 10, and 12 out of 69 studies reported less than half of the criteria. Conclusion: The AHP has been applied inconsistently in healthcare research. A minority of studies described all the relevant aspects. Thus, the statements in this review may be biased, as they are restricted to the information available in the papers. 
Hence, further research is required to discover who should be interviewed and how, how inconsistent answers should be dealt with, and how the outcome and stability of the results should be presented. In addition, we need new insights to determine which target group can best handle the challenges of the AHP. © 2015 Schmidt et al.
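As a quick illustration of the eigenvector step the abstract refers to, the sketch below derives priority weights from a pairwise comparison matrix by power iteration and computes Saaty's consistency index. The 3x3 matrix is invented for illustration (values on Saaty's 1-9 scale); it is not data from any of the reviewed studies.

```python
# Hypothetical pairwise comparison matrix for three criteria:
# criterion 1 is judged 3x as important as criterion 2 and 5x as criterion 3.
A = [
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   2.0],
    [1/5.0, 1/2.0, 1.0],
]

def ahp_priorities(matrix, iters=100):
    """Approximate the principal eigenvector of a positive matrix
    by power iteration; the normalised eigenvector gives the weights."""
    n = len(matrix)
    w = [1.0 / n] * n                      # start from uniform weights
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]         # renormalise each iteration
    return w

weights = ahp_priorities(A)

# Estimate lambda_max and the consistency index CI = (lambda_max - n)/(n - 1);
# CI near 0 indicates nearly consistent judgements.
n = len(A)
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
print([round(w, 3) for w in weights], round(CI, 4))
```

For this matrix the first criterion receives the largest weight, as the pairwise judgements suggest, and the consistency index is close to zero.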

Relevance:

20.00%

Publisher:

Abstract:

Operationally, we will understand artistic legitimation as the process by which a symbolic-cultural production is recognised as a work of art within one of the spheres of the artistic. We start from the idea that works of art are recognised as such through validation processes that take place in an institutional field in which very diverse forces and agents operate and interact in complex ways. This leads us to think that this "artistic institution" (Dickie, 2005) neither behaves homogeneously nor is monolithic; rather, there are diverse forces, some more intense than others, that operate and establish values that not everyone accepts. Depending on which agent of the institution establishes the value, and on the weight that agent carries within the institution, so will be the degree of recognition the work obtains.

Relevance:

10.00%

Publisher:

Abstract:

Networked control over data networks has received increasing attention in recent years. Among the many problems in networked control systems (NCSs) is the need to reduce control latency and jitter and to deal with packet dropouts. This paper introduces our recent progress on a queuing communication architecture for real-time NCS applications, and simple strategies for dealing with packet dropouts. Case studies for a medium-scale process or multiple small-scale processes are presented for TCP/IP based real-time NCSs. Variations of network architecture design are modelled, simulated, and analysed to evaluate control latency and jitter performance. It is shown that a simple bandwidth upgrade or adding hierarchy does not necessarily improve control latency and jitter. A co-design of network and control is necessary to maximise the real-time control performance of NCSs.
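As a rough illustration of the latency and jitter metrics being evaluated, the sketch below computes them from a set of sample latencies. The values are invented, not taken from the paper's simulations, and jitter is computed here as the standard deviation of latency, which is one common definition.

```python
import statistics

# Hypothetical end-to-end control latencies (ms) observed for one control loop;
# the two ~10 ms samples represent occasional network-induced delay spikes.
latencies_ms = [4.1, 4.3, 9.8, 4.2, 4.4, 4.2, 10.1, 4.3]

mean_latency = statistics.mean(latencies_ms)
# Jitter as sample standard deviation of the latency series.
jitter = statistics.stdev(latencies_ms)
# The worst-case latency matters most for hard real-time control deadlines.
worst_case = max(latencies_ms)

print(f"mean={mean_latency:.2f} ms, jitter={jitter:.2f} ms, worst={worst_case:.1f} ms")
```

Note that the two delay spikes barely move the mean but dominate both the jitter and the worst case, which is why mean latency alone is a poor measure of real-time control performance.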

Relevance:

10.00%

Publisher:

Abstract:

In practical terms, conceptual modeling is at the core of systems analysis and design. The plurality of modeling methods available has, however, been regarded as detrimental, and as a strong indication that a common view or theoretical grounding of modeling is wanting. This theoretical foundation must universally address all potential matters to be represented in a model, which consequently suggested ontology as the point of departure for theory development. The Bunge–Wand–Weber (BWW) ontology has become a widely accepted modeling theory. Its application has simultaneously led to the recognition that, although suitable as a meta-model, the BWW ontology needs to be enhanced regarding its expressiveness in empirical domains. In this paper, a first step in this direction is made by revisiting Bunge's ontology, and by proposing the integration of a "hierarchy of systems" into the BWW ontology to accommodate domain-specific conceptualizations.

Relevance:

10.00%

Publisher:

Abstract:

The lack of satisfactory consensus on characterizing system intelligence, together with the absence of structured analytical decision models, has inhibited developers and practitioners from understanding and configuring optimum intelligent building systems in a fully informed manner. So far, little research has been conducted on this aspect. This research is designed to identify the key intelligence indicators and to develop analytical models for computing the system intelligence score of smart building systems in the intelligent building. The integrated building management system (IBMS) was used as an illustrative example to present a framework. The models presented in this study applied system intelligence theory and the conceptual analytical framework. A total of 16 key intelligence indicators were first identified from a general survey. Then, two multi-criteria decision making (MCDM) approaches, the analytic hierarchy process (AHP) and analytic network process (ANP), were employed to develop the system intelligence analytical models. Top intelligence indicators of the IBMS include: self-diagnosis of operation deviations; an adaptive limiting control algorithm; and year-round time schedule performance. The developed conceptual framework was then transformed into a practical model, whose effectiveness was evaluated by means of expert validation. The main contribution of this research is to promote understanding of the intelligence indicators and to lay the foundation for a systemic framework that provides developers and building stakeholders with a consolidated, inclusive tool for the system intelligence evaluation of proposed component design configurations.

Relevance:

10.00%

Publisher:

Abstract:

We introduce K-tree in an information retrieval context. It is an efficient approximation of the k-means clustering algorithm, but unlike k-means it forms a hierarchy of clusters. It has been extended to address issues with sparse representations. We compare its performance and quality to CLUTO using document collections. The K-tree has a low time complexity that is suitable for large document collections, and its tree structure allows for efficient disk-based implementations where space requirements exceed those of main memory.
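A hierarchy of clusters built by recursive k-means splits conveys the general idea that K-tree approximates. The sketch below is not the authors' implementation and ignores the sparse-representation extensions; it uses 1-D points and invented parameters purely for illustration.

```python
import random

def kmeans(points, k, iters=10):
    """Plain 1-D k-means; returns the final non-empty clusters."""
    centroids = random.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Recompute centroids; keep the old one if a cluster emptied out.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return [c for c in clusters if c]

def build_tree(points, k=2, leaf_size=4):
    """Recursively split with k-means to form a hierarchy of clusters."""
    clusters = kmeans(points, k) if len(points) > leaf_size else []
    if len(clusters) < 2:            # small or unsplittable: stop here
        return {"leaf": points}
    return {"children": [build_tree(c, k, leaf_size) for c in clusters]}

random.seed(0)
data = ([random.gauss(0, 1) for _ in range(20)] +
        [random.gauss(10, 1) for _ in range(20)])
tree = build_tree(data)
```

Because each level only runs k-means on a subset of the data, the recursion keeps per-split costs low, which is the property that makes this family of methods attractive for large collections.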

Relevance:

10.00%

Publisher:

Abstract:

The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments. The research proposes a solution to address the difficulties associated with multi-step attack scenario specification and detection for such environments. The research has focused on two distinct problems: the representation of events derived from heterogeneous sources, and multi-step attack specification and detection. The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models, where low-level information may be discarded during the abstraction process, the event abstraction model presented in this work preserves all low-level information as well as providing high-level information in the form of abstract events. The event abstraction model presented in this work was designed independently of any particular IDS and thus may be used by any IDS, intrusion forensic tool, or monitoring tool. The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect as they often involve the correlation of events from multiple sources which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario matching mechanism by using variable instantiation, where variables represent events as defined in the event abstraction model. The third part of the research looks into a solution to address time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios which involve logs from multiple hosts. 
Issues involving time uncertainty have been largely neglected by intrusion detection research. The system presented in this research introduces two techniques for addressing time uncertainty issues: clock skew compensation and clock drift modelling using linear regression. An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA) and the scenario detection module. The scenario detection module implements our signature language, developed based on the Python programming language syntax, and the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs as well as a synthetic dataset. A distinct feature of the public dataset is that it contains multi-step attacks which involve multiple hosts with clock skew and clock drift. These features allow us to demonstrate the application and the advantages of the contributions of this research. All instances of multi-step attacks in the dataset were correctly identified even though significant clock skew and drift exist in the dataset. Future work identified by this research is to develop a refined unification algorithm suitable for processing streams of events to enable on-line detection. In terms of time uncertainty, identified future work is to develop mechanisms which allow automatic clock skew and clock drift identification and correction. The immediate application of the research presented in this thesis is the framework of an off-line IDS which processes events from heterogeneous sources using abstraction and which can detect multi-step attack scenarios that may involve time uncertainty.
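The unification-based matching described above, where scenario variables are instantiated against events, can be sketched roughly as follows. The event fields and scenario syntax here are invented for illustration and are not the thesis's Python-based signature language.

```python
def unify(pattern, event, bindings):
    """Match one pattern dict against one event dict.
    Strings starting with '?' are variables; returns the extended
    bindings on success, or None on mismatch."""
    bindings = dict(bindings)
    for key, pat in pattern.items():
        val = event.get(key)
        if isinstance(pat, str) and pat.startswith("?"):
            if pat in bindings and bindings[pat] != val:
                return None          # variable already bound to another value
            bindings[pat] = val
        elif pat != val:
            return None              # literal field mismatch
    return bindings

def match_scenario(patterns, events):
    """Find bindings under which each pattern matches some event, in order."""
    def search(i, start, bindings):
        if i == len(patterns):
            return bindings
        for j in range(start, len(events)):
            b = unify(patterns[i], events[j], bindings)
            if b is not None:
                result = search(i + 1, j + 1, b)
                if result is not None:
                    return result
        return None
    return search(0, 0, {})

# Hypothetical multi-step scenario: a port scan followed by a login
# from the same source host (the shared variable ?host correlates them).
scenario = [
    {"type": "portscan", "src": "?host"},
    {"type": "login", "src": "?host"},
]
log = [
    {"type": "portscan", "src": "10.0.0.5"},
    {"type": "login", "src": "10.0.0.9"},
    {"type": "login", "src": "10.0.0.5"},
]
bindings = match_scenario(scenario, log)
print(bindings)  # {'?host': '10.0.0.5'}
```

The shared variable is what makes multi-step correlation simple: the login from 10.0.0.9 is rejected because `?host` was already bound to 10.0.0.5 by the scan.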

Relevance:

10.00%

Publisher:

Abstract:

This document provides a review of international and national practices in investment decision support tools in road asset management. Efforts were concentrated on identifying the analytic frameworks, evaluation methodologies, and criteria adopted by current tools. Emphasis was also given to how current approaches support Triple Bottom Line decision-making. Benefit Cost Analysis and Multiple Criteria Analysis are the principal methodologies supporting decision-making in road asset management. The complexity of the applications shows significant differences in international practices. There is continuing discussion amongst practitioners and researchers regarding which one is more appropriate in supporting decision-making. It is suggested that the two approaches should be regarded as complementary rather than competing means. Multiple Criteria Analysis may be particularly helpful in early stages of project development, such as strategic planning. Benefit Cost Analysis is used most widely for project prioritisation and for selecting the final project from amongst a set of alternatives. The Benefit Cost Analysis approach is a useful tool for investment decision-making from an economic perspective. An extension of the approach, which includes social and environmental externalities, is currently used in supporting Triple Bottom Line decision-making in the road sector. However, attention should be given to several issues in its application. First of all, there is a need to reach a degree of commonality in considering social and environmental externalities, which may be achieved by aggregating the best practices. At different decision-making levels, the detail of consideration of the externalities should differ. It is intended to develop a generic framework to coordinate the range of existing practices. The standard framework will also be helpful in reducing double counting, which appears in some current practices. 
Caution should also be applied to the methods of determining the value of social and environmental externalities. A number of methods, such as market price, resource costs, and willingness to pay, are found in the review. The use of unreasonable monetisation methods in some cases has discredited Benefit Cost Analysis in the eyes of decision makers and the public. Some social externalities, such as employment and regional economic impacts, are generally omitted in current practices. This is due to the lack of information and credible models. It may be appropriate to consider these externalities in qualitative form in a Multiple Criteria Analysis. Consensus has been reached on considering noise and air pollution in international practices; however, Australian practices generally omit these externalities. Equity is an important consideration in road asset management. The considerations are either between regions or between social groups defined by income, age, gender, disability, etc. In current practice, there is no well-developed quantitative measure for equity issues, and more research is needed to target this issue. Although Multiple Criteria Analysis has been used for decades, there is no generally accepted framework for the choice of modelling methods and the treatment of various externalities. The result is that different analysts are unlikely to reach consistent conclusions about a policy measure. In current practices, some favour using methods which are able to prioritise alternatives, such as Goal Programming, Goal Achievement Matrix, and the Analytic Hierarchy Process. Others simply present the various impacts to decision-makers to characterise the projects. Weighting and scoring systems are critical in most Multiple Criteria Analyses. However, the processes of assessing weights and scores have been criticised as highly arbitrary and subjective. It is essential that the process be as transparent as possible. 
Obtaining weights and scores by consulting local communities is a common practice, but is likely to result in bias towards local interests. An interactive approach has the advantage of helping decision-makers elaborate their preferences; however, the computational burden may cause decision-makers to lose interest during the solution process of a large-scale problem, say a large state road network. Current practices tend to use cardinal or ordinal scales to measure non-monetised externalities. Distorted valuations can occur where variables measured in physical units are converted to scales. For example, if decibels of noise are converted to a scale of -4 to +4 with a linear transformation, the difference between 3 and 4 may represent a far greater increase in discomfort to people than the increase from 0 to 1. It is therefore suggested to assign different weights to individual scores. Due to overlapping goals, the problem of double counting also appears in some Multiple Criteria Analyses. The situation can be improved by carefully selecting and defining investment goals and criteria. Other issues, such as the treatment of time effects and the incorporation of risk and uncertainty, have been given scant attention in current practices. This report suggests establishing a common analytic framework to deal with these issues.
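A minimal weighted-sum Multiple Criteria Analysis of the kind discussed above might look like the sketch below. The projects, criteria, scores on the -4 to +4 scale, and weights are all hypothetical.

```python
# Illustrative criterion weights (must sum to 1) for comparing road projects.
weights = {"cost": 0.4, "safety": 0.35, "noise": 0.25}

# Hypothetical projects scored on a -4..+4 scale per criterion.
projects = {
    "bypass":   {"cost": -2, "safety": 3, "noise": 2},
    "widening": {"cost":  1, "safety": 1, "noise": -1},
}

def weighted_score(scores):
    """Weighted sum of criterion scores."""
    return sum(weights[c] * s for c, s in scores.items())

ranked = sorted(projects, key=lambda p: weighted_score(projects[p]), reverse=True)
for p in ranked:
    print(p, round(weighted_score(projects[p]), 2))
```

The sketch also shows why the weighting step is so contested: the bypass outranks the widening only because safety and noise together outweigh its cost penalty, so a modest change in the weights can reverse the ranking.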

Relevance:

10.00%

Publisher:

Abstract:

Major infrastructure assets are often governed by a mix of public and private organizations, each fulfilling a specific and separate role, i.e. policy, ownership, operation, or maintenance. This mix of entities is a legacy of the Public Choice Theory influenced NPM reforms of the late 20th century. The privatization of the public sector has resulted in agency theory based 'self-interest' relationships and governance arrangements for major infrastructure assets which emphasize economic efficiency but which do not advance non-economic public values and the collective public interest. The community now requires that governments fulfill their stewardship role of also satisfying non-economic public values such as sustainability and intergenerational responsibility. In the 21st century, governance arrangements have emerged which temper individual self-interest and also pursue the interests of other stakeholders. Relational contracts, Public-Private Partnerships (PPPs), and hybrid mixes of organizations from the state, market, and network modes (Keast et al 2006) provide options for governance which better meet the interests of contractors, government, and the community. There is also an emerging body of research which extends consideration beyond the immediate governance configuration to the metagovernance environment constituted by hierarchy, regulation, industry standards, trust, culture, and values. Stewardship theory has re-emerged as a valuable aid in understanding the features of governance configurations which establish relationships between principal and agent that maximize the agent acting in the interests of the principal, even to the detriment of the agent. This body of literature suggests that an improved stewardship outcome from infrastructure governance configurations can be achieved by applying the emerging options for both the immediate governance configuration and the surrounding metagovernance environment. 
Stewardship theory provides a framework for the design of the relationships within that total governance environment, focusing on the achievement of a better, complete stewardship outcome. This paper explores the directions future research might take in seeking to improve the understanding of the design of the governance of major, critical infrastructure assets.

Relevance:

10.00%

Publisher:

Abstract:

For a sustainable building industry, not only should the environmental and economic indicators of a building be evaluated but also the societal indicators. Current indicators can be in conflict with each other, so decision making is difficult and sustainability is hard to quantify and assess clearly. For a sustainable building, the objectives of decreasing both adverse environmental impact and cost are in conflict. In addition, even when both objectives are satisfied, building management may present other considerations, such as the convenience of occupants, the flexibility of the building, or technical maintenance, which are difficult to quantify as exact assessment data. These conflicting problems confronting building managers or planners make building management more difficult. This paper presents a methodology to evaluate a sustainable building considering the socio-economic and environmental characteristics of buildings, and is intended to assist decision making by building planners or practitioners. The suggested methodology employs three main concepts: linguistic variables, fuzzy numbers, and the analytic hierarchy process. The linguistic variables are used to represent the degree of appropriateness of qualitative indicators, which are vague or uncertain. These linguistic variables are then translated into fuzzy numbers to reflect their uncertainties and aggregated into a final fuzzy decision value using a hierarchical structure. Through a case study, the suggested methodology is applied to the evaluation of a building. The result demonstrates that the suggested approach can be a useful tool for evaluating a building for sustainability.
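The translation of linguistic ratings into triangular fuzzy numbers and their weighted aggregation into a final decision value, as outlined above, might be sketched as follows. The linguistic scale, indicator ratings, and weights are illustrative assumptions, not the paper's case-study data.

```python
# Triangular fuzzy numbers (l, m, u) standing in for linguistic
# "degree of appropriateness" ratings; this scale is invented.
LINGUISTIC = {
    "poor":      (0.0,  0.0,  0.25),
    "fair":      (0.25, 0.5,  0.75),
    "good":      (0.5,  0.75, 1.0),
    "excellent": (0.75, 1.0,  1.0),
}

def weighted_aggregate(ratings, weights):
    """Aggregate linguistic ratings into one triangular fuzzy number
    by a weighted sum of each component (l, m, u)."""
    l = sum(w * LINGUISTIC[r][0] for r, w in zip(ratings, weights))
    m = sum(w * LINGUISTIC[r][1] for r, w in zip(ratings, weights))
    u = sum(w * LINGUISTIC[r][2] for r, w in zip(ratings, weights))
    return (l, m, u)

def defuzzify(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    return sum(tfn) / 3

# Hypothetical building rated on three indicators, with weights
# assumed to come from an AHP-style comparison of the indicators.
ratings = ["good", "fair", "excellent"]
weights = [0.5, 0.3, 0.2]
agg = weighted_aggregate(ratings, weights)
score = defuzzify(agg)
```

The fuzzy result `agg` keeps the spread of the underlying judgements, while `defuzzify` collapses it to a single crisp score when a final ranking is needed.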

Relevance:

10.00%

Publisher:

Abstract:

This paper reports on a replication of earlier studies into a possible hierarchy of programming skills. In this study, the students from whom data was collected were at a university that had not provided data for the earlier studies. Also, the students were taught the programming language Python, which had not been used in the earlier studies. Thus this study serves as a test of whether the findings of the earlier studies were specific to certain institutions, student cohorts, and programming languages. In addition, we used a non-parametric approach to the analysis, rather than the linear approach of the earlier studies. Our results are consistent with the earlier studies: we found that students who cannot trace code usually cannot explain code, and that students who perform reasonably well at code-writing tasks have usually also acquired the ability to both trace and explain code.

Relevance:

10.00%

Publisher:

Abstract:

President’s Message. Hello fellow AITPM members. Well, I can’t believe it’s already October! My office is already organising its end-of-year function and looking to plan for 2010. Our whole School is moving to a different building next year, with the lovely L block eventually making way for a new shiny one. Those of you who have entered the Brisbane CBD from the south side, across the Captain Cook Bridge, would know L block as the big 9-storey brick and concrete Lego-block ode to 1970s functional architecture which greets you on the right-hand side. Onto traffic matters: an issue that has been tossing around in my mind of late is that of speed. I know I am growing older and may prematurely be becoming a “grumpy old man”, but everyone around me locally seems to be accelerating off from the stop line much faster than I was taught to for economical driving, both here and in the United States (yes, they made my wife and me resit our written and practical driving tests when we lived there). People here in Australia also seem to be driving right on top of the posted speed limit, on whichever part of the road hierarchy, whether urban or rural. I was also taught, on both sides of the planet, that the posted speed limit is a maximum legal speed, not the recommended driving speed. This message did seem to sink in with the American drivers around me when we lived in Oregon, where people did appear to drive more cautiously. Further, posted speed limits in Oregon were, and I presume still are, set more conservatively than Australian limits, by about 5 mph or 10 km/h, for any given part of the road hierarchy. Another excellent speed limit treatment used in Oregon was in school zones, where reduced speed limits applied “when children are present” rather than during prescribed hours on school days. This would be especially useful here in Australia, where a lot of extra-curricular activities take place around schools outside of the prescribed speed limit hours. 
Before and after hours school care is on the increase (with parents dropping off and collecting children near dawn and dusk in the winter), and many child-centred land uses are located adjacent to schools, such as Scouts/Guides halls, swimming pools, and parks. Consequently, I believe there needs to be some consideration of more public campaigning about economical driving and the real purpose of the speed limit, or perhaps even a rethink of the speed limit concept, if people really are driving on top of it and it’s not just me becoming grumpier (our industrial psychology friends at the research centres may be able to assist us here). The Queensland organising committee is now in full swing organising the 2010 AITPM National Conference, What’s New?, so please keep a lookout for related content. Best regards to all, Jon Bunker. PS: A cartoonist’s view of traffic engineers, which I thought you might enjoy: http://xkcd.com/277/

Relevance:

10.00%

Publisher:

Abstract:

In recent years, practitioners and researchers alike have turned their attention to knowledge management (KM) in order to increase organisational performance (OP). As a result, many different approaches and strategies have been investigated and suggested for how knowledge should be managed to make organisations more effective and efficient. However, most research has been undertaken in the for-profit sector, with only a few studies focusing on the benefits nonprofit organisations might gain by managing knowledge. This study broadly investigates the impact of knowledge management on the organisational performance of nonprofit organisations. Organisational performance can be evaluated through either financial or non-financial measurements. In order to evaluate knowledge management and organisational performance, non-financial measurements are argued to be more suitable, given that knowledge is an intangible asset which often cannot be expressed through financial indicators. Non-financial measurement concepts of performance, such as the balanced scorecard or the concept of Intellectual Capital (IC), are well accepted and used within the for-profit and nonprofit sectors to evaluate organisational performance. This study utilised the concept of IC as the method to evaluate KM and OP in the context of nonprofit organisations due to the close link between KM and IC. Indeed, KM is concerned with managing the KM processes of creating, storing, sharing, and applying knowledge, and the organisational KM infrastructure, such as organisational culture or organisational structure, that supports these processes. IC, on the other hand, measures the knowledge stocks at different ontological levels: at the individual level (human capital), at the group level (relational capital), and at the organisational level (structural capital). In other words, IC measures the value of the knowledge which has been managed through KM. 
As KM encompasses the different KM processes and the KM infrastructure facilitating these processes, previous research has investigated the relationship between KM infrastructure and KM processes. Organisational culture, organisational structure, and the level of IT support have been identified as the main factors of the KM infrastructure influencing the KM processes of creating, storing, sharing, and applying knowledge. Other research has focused on the link between KM and OP or organisational effectiveness. Based on the existing literature, a theoretical model was developed to enable the investigation of the relation between KM (encompassing KM infrastructure and KM processes) and IC. The model assumes an association between KM infrastructure and KM processes, as well as an association between KM processes and the various levels of IC (human capital, structural capital, and relational capital). As a result, five research questions (RQs) with respect to the various factors of the KM infrastructure and the relationship between KM infrastructure and IC were raised and included in the research model:
RQ 1: Do nonprofit organisations which have a Hierarchy culture have stronger IT support than nonprofit organisations which have an Adhocracy culture?
RQ 2: Do nonprofit organisations which have a centralised organisational structure have stronger IT support than nonprofit organisations which have a decentralised organisational structure?
RQ 3: Do nonprofit organisations which have stronger IT support have a higher value of Human Capital than nonprofit organisations which have less strong IT support?
RQ 4: Do nonprofit organisations which have stronger IT support have a higher value of Structural Capital than nonprofit organisations which have less strong IT support?
RQ 5: Do nonprofit organisations which have stronger IT support have a higher value of Relational Capital than nonprofit organisations which have less strong IT support?
In order to investigate the research questions, measurements for IC were developed which were linked to the main KM processes. The final KM/IC model contained four items for evaluating human capital, five items for evaluating structural capital, and four items for evaluating relational capital. The research questions were investigated through empirical research using a case study approach, with the focus on two nonprofit organisations providing trade promotion services through local offices worldwide. Data were collected via qualitative as well as quantitative research methods. The qualitative study included interviews with representatives of the two participating organisations as well as in-depth document research; its purpose was to investigate the factors of the KM infrastructure (organisational culture, organisational structure, IT support) of the organisations and how these factors were related to each other. The quantitative study, on the other hand, was carried out through an online survey amongst staff of the various local offices; its purpose was to investigate what impact the level of IT support, as the main instrument of the KM infrastructure, had on IC. Overall, several key themes were found as a result of the study:
• Knowledge Management and Intellectual Capital are complementary to each other, which should be expressed through measurements of IC based on KM processes.
• The various factors of the KM infrastructure (organisational culture, organisational structure, and level of IT support) are interdependent.
• IT was a primary instrument through which the different KM processes (creating, storing, sharing, and applying knowledge) were performed.
• A high level of IT support was evident where participants reported higher levels of IC (human capital, structural capital, and relational capital).
The study supported previous research in the field of KM and replicated the findings from other case studies in this area. It also contributed to theory by placing KM research within the nonprofit context and analysing the linkage between KM and IC. From the managerial perspective, the findings gave clear indications that would allow interested parties, such as nonprofit managers or consultants, to understand more about the implications of KM for OP and to use this knowledge to implement efficient and effective KM strategies within their organisations.

Relevance:

10.00%

Publisher:

Abstract:

The most frequently told story charting the rise of mass schooling should be fairly familiar to most of us. This story normally centres on the post-Enlightenment social changes of the late 18th and early 19th centuries, and details how society slowly became more caring and more humane, and how we all decided that, rather than simply being fodder for the mills, all children, including those from the working classes, had the right to an education. The more civilised we became, the more we pushed back the school leaving age, until we eventually developed schools which clearly reflected the values and ambitions of the wider community. After all, are schools not simply microcosms of society at large? In addition, the form that modern schooling takes is regarded as an unproblematic part of the same story. Of course we should organise our learning in the way we do, with the emphasis on formalised learning spaces, graded curricula, timetables of activities, various forms of assessment, and a clear hierarchy of authority. These features of contemporary education merely reflect the fact that this is self-evidently the best system available. After all, how else could education possibly be organised?

Relevance:

10.00%

Publisher:

Abstract:

Place branding has become a major focus of operations for destination marketing organizations (DMOs) striving for differentiation in cluttered markets. The topic of destination branding has only received attention in the tourism literature since the late 1990s, and there has been relatively little research reported in relation to analyzing destination brand effectiveness over time. This article reports an attempt to operationalize the concept of consumer-based brand equity (CBBE) for an emerging destination at two points in time. The purpose of the project was to track the effectiveness of the brand in 2007 against benchmarks that were established in a 2003 study at the commencement of a new destination brand campaign. The key finding was that there was no change in perceived performance for the destination across the brand's performance indicators and CBBE dimensions. Because of the common challenges faced by DMOs worldwide, it is suggested that the CBBE hierarchy provides destination marketers with a practical tool for evaluating brand performance over time.