174 results for Performance management systems


Relevance: 80.00%

Publisher:

Abstract:

Nowadays, Workflow Management Systems (WfMSs) and, more generally, Process Management Systems (PMSs), which are process-aware Information Systems (PAISs), are widely used to support many human organizational activities, ranging from well-understood, relatively stable and structured processes (supply chain management, postal delivery tracking, etc.) to processes that are more complicated, less structured and may exhibit a high degree of variation (health care, emergency management, etc.). Every aspect of a business process involves a certain amount of knowledge, which may be complex depending on the domain of interest. The adequate representation of this knowledge is determined by the modeling language used. Some processes behave in a way that is well understood, predictable and repeatable: the tasks are clearly delineated and the control flow is straightforward. Recent discussions, however, illustrate the increasing demand for solutions for knowledge-intensive processes, where these characteristics are less applicable. The actors involved in the conduct of a knowledge-intensive process have to deal with a high degree of uncertainty. Tasks may be hard to perform and the order in which they need to be performed may be highly variable. Modeling knowledge-intensive processes can be complex, as it may be hard to capture at design time what knowledge is available at run time. In realistic environments, for example, actors lack important knowledge at execution time, or this knowledge can become obsolete as the process progresses. Even if each actor (at some point) has perfect knowledge of the world, it may not be certain of its beliefs at later points in time, since tasks by other actors may change the world without those changes being perceived. Typically, a knowledge-intensive process cannot be adequately modeled by classical, state-of-the-art process/workflow modeling approaches.
In some respects there is a lack of maturity when it comes to the semantic aspects involved, both in terms of capturing them and of reasoning about them. The main focus of the 1st International Workshop on Knowledge-intensive Business Processes (KiBP 2012) was investigating how techniques from different fields, such as Artificial Intelligence (AI), Knowledge Representation (KR), Business Process Management (BPM), Service Oriented Computing (SOC), etc., can be combined with the aim of improving the modeling and the enactment phases of a knowledge-intensive process. The workshop was held as part of the program of the 2012 Knowledge Representation & Reasoning International Conference (KR 2012) in Rome, Italy, in June 2012. It was hosted by the Dipartimento di Ingegneria Informatica, Automatica e Gestionale Antonio Ruberti of Sapienza Università di Roma, with financial support of the University, through grant 2010-C26A107CN9 TESTMED, and of the EU Commission, through the projects FP7-25888 Greener Buildings and FP7-257899 Smart Vortex. This volume contains the 5 papers accepted and presented at the workshop. Each paper was reviewed by three members of the internationally renowned Program Committee. In addition, a further paper was invited for inclusion in the workshop proceedings and for presentation at the workshop. Two keynote talks, one by Marlon Dumas (Institute of Computer Science, University of Tartu, Estonia) on "Integrated Data and Process Management: Finally?" and the other by Yves Lesperance (Department of Computer Science and Engineering, York University, Canada) on "A Logic-Based Approach to Business Processes Customization", completed the scientific program.
We would like to thank all the Program Committee members for their valuable work in selecting the papers, Andrea Marrella for his work as publication and publicity chair of the workshop, and Carola Aiello and the consulting agency Consulta Umbria for the organization of this successful event.

Relevance: 80.00%

Publisher:

Abstract:

A building information model (BIM) is an electronic repository of structured, three-dimensional data that captures both the physical and dynamic functional characteristics of a facility. In addition to its more traditional function as a tool to aid design and construction, a BIM can be used throughout the life cycle of a facility, functioning as a living database that places resources contained within the building in their spatial and temporal context. Through its comprehension of spatial relationships, a BIM can meaningfully represent and integrate previously isolated control and management systems and processes, and thereby provide a more intuitive interface to users. By placing processes in a spatial context, decision-making can be improved, with positive flow-on effects for security and efficiency. In this article, we systematically analyse the authorization requirements involved in the use of BIMs. We introduce the concept of using a BIM as a graphical tool to support spatial access control configuration and management (including physical access control). We also consider authorization requirements for regulating access to the structured data that exists within a BIM as well as to external systems and data repositories that can be accessed via the BIM interface. With a view to addressing these requirements we present a survey of relevant spatiotemporal access control models, focusing on features applicable to BIMs and highlighting capability gaps. Finally, we present a conceptual authorization framework that utilizes BIMs.
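The spatiotemporal authorization requirements surveyed above can be reduced to a simple idea: a permission holds only for a given role, spatial zone and time window. The sketch below is an illustrative toy, not the article's actual model; all names and fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Permission:
    """One spatiotemporal authorization: a role may perform an action on a
    resource only while located in `zone` and within [start_hour, end_hour).
    Field names are illustrative assumptions, not the article's schema."""
    role: str
    action: str
    resource: str
    zone: str
    start_hour: int
    end_hour: int

def is_authorized(perms, role, action, resource, zone, hour):
    # Grant access if any permission matches the full spatiotemporal context.
    return any(
        p.role == role and p.action == action and p.resource == resource
        and p.zone == zone and p.start_hour <= hour < p.end_hour
        for p in perms
    )

# A technician may open the HVAC panel, but only from the plant room, 08:00-18:00.
perms = [Permission("technician", "open", "hvac_panel", "plant_room", 8, 18)]
ok = is_authorized(perms, "technician", "open", "hvac_panel", "plant_room", 10)
denied = is_authorized(perms, "technician", "open", "hvac_panel", "lobby", 10)
```

A BIM-backed implementation would resolve `zone` from the model's spatial hierarchy rather than from a flat string, which is where the graphical configuration interface described above comes in.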

Relevance: 80.00%

Publisher:

Abstract:

Facial expression is an important channel of human social communication. Facial expression recognition (FER) aims to perceive and understand the emotional states of humans based on information in the face. Building robust, high-performance FER systems that can work on real-world video is still a challenging task, due to the various unpredictable facial variations and complicated exterior environmental conditions, as well as the difficulty of choosing a suitable type of feature descriptor for extracting discriminative facial information. Facial variations caused by factors such as pose, age, gender, race and occlusion can exert a profound influence on robustness, while a suitable feature descriptor largely determines performance. Most attention in FER research has been paid to addressing variations in pose and illumination. No approach has been reported on handling face localization errors, and relatively few on overcoming facial occlusions, although the significant impact of these two variations on performance has been proved and highlighted in many previous studies. Many texture and geometric features have been previously proposed for FER. However, few comparison studies have been conducted to explore the performance differences between different features and to examine the performance improvement arising from the fusion of texture and geometry, especially on data with spontaneous emotions. The majority of existing approaches are evaluated on databases with posed or induced facial expressions collected in laboratory environments, whereas little attention has been paid to recognizing naturalistic facial expressions on real-world data. This thesis investigates techniques for building robust, high-performance FER systems based on a number of established feature sets. It comprises contributions towards three main objectives: (1) Robustness to face localization errors and facial occlusions.
An approach is proposed to handle face localization errors and facial occlusions using Gabor-based templates. Template extraction algorithms are designed to collect a pool of local template features, and template matching is then performed to convert these templates into distances, which are robust to localization errors and occlusions. (2) Improvement of performance through feature comparison, selection and fusion. A comparative framework is presented to compare the performance of different features and different feature selection algorithms, and to examine the performance improvement arising from the fusion of texture and geometry. The framework is evaluated for both discrete and dimensional expression recognition on spontaneous data. (3) Evaluation of performance in the context of real-world applications. A system is selected and applied to discriminating posed versus spontaneous expressions and recognizing naturalistic facial expressions. A database is collected from real-world recordings and is used to explore feature differences between standard database images and real-world images, as well as between real-world images and real-world video frames. The performance evaluations are based on the JAFFE, CK, Feedtum, NVIE, Semaine and self-collected QUT databases. The results demonstrate high robustness of the proposed approach to the simulated localization errors and occlusions. Texture and geometry make different contributions to the performance of discrete and dimensional expression recognition, as well as to posed versus spontaneous emotion discrimination. These investigations provide useful insights into enhancing the robustness and performance of FER systems and putting them into real-world applications.
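The core idea of converting templates into distances can be sketched in a few lines. This toy reduces the thesis's Gabor-based templates to plain feature vectors and Euclidean distance, purely to show the transformation from a feature vector to a distance vector; the actual features and matching procedure are more elaborate.

```python
import math

def template_distances(features, templates):
    """Map a feature vector to a vector of distances against a pool of
    local templates. A classifier then operates on the distance vector,
    which is the robustness mechanism described in the abstract.
    Euclidean distance on toy 2-D vectors stands in for the real features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return [dist(features, t) for t in templates]

# Two hypothetical templates; the query matches the first exactly.
templates = [[0.0, 0.0], [3.0, 4.0]]
d = template_distances([0.0, 0.0], templates)
```

Because each distance summarizes a local comparison, a mislocalized or occluded region perturbs only some entries of the distance vector rather than the whole representation.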

Relevance: 80.00%

Publisher:

Abstract:

Many substation applications require accurate time-stamping. The performance of systems such as Network Time Protocol (NTP), IRIG-B and one pulse per second (1-PPS) has been sufficient to date. However, new applications, including IEC 61850-9-2 process bus and phasor measurement, require accuracy of one microsecond or better. Furthermore, process bus applications are taking time synchronisation out into high voltage switchyards, where cable lengths may have an impact on timing accuracy. IEEE Std 1588, Precision Time Protocol (PTP), is the means preferred by the smart grid standardisation roadmaps (from both the IEC and the US National Institute of Standards and Technology) for achieving this higher level of performance, and it integrates well into Ethernet based substation automation systems. Significant benefits of PTP include automatic path length compensation, support for redundant time sources and the cabling efficiency of a shared network. This paper benchmarks the performance of established IRIG-B and 1-PPS synchronisation methods over a range of path lengths representative of a transmission substation. The performance of PTP using the same distribution system is then evaluated and compared to the existing methods to determine whether the performance justifies the additional complexity. Experimental results show that a PTP timing system maintains the synchronising performance of 1-PPS and IRIG-B timing systems when using the same fibre optic cables, and further meets the needs of process buses in large substations.
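The automatic path-length compensation mentioned above comes from PTP's delay request-response exchange: four timestamps let the slave solve for both its clock offset and the one-way path delay, assuming the path is symmetric. A minimal sketch of that arithmetic (the timestamps are hypothetical, in microseconds):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Estimate slave clock offset and one-way path delay from the four
    PTP timestamps: t1 = Sync sent (master clock), t2 = Sync received
    (slave clock), t3 = Delay_Req sent (slave), t4 = Delay_Req received
    (master). Assumes a symmetric network path, as basic PTP does."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0
    delay = ((t2 - t1) + (t4 - t3)) / 2.0
    return offset, delay

# A slave clock running 50 us fast over a 10 us path:
offset, delay = ptp_offset_and_delay(1000.0, 1060.0, 1100.0, 1060.0)
```

The delay term is what removes cable length from the error budget: a longer switchyard fibre raises `delay` but leaves `offset` unaffected, which 1-PPS and IRIG-B cannot do without manual calibration.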

Relevance: 80.00%

Publisher:

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and they are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing paradigms and the characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flows; then a data capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis efforts in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. This approach enables us to obtain highly informative estimates for change point parameters, since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and the magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared to a priori known causes, detected by control charts in monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then pursued for healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, at first, the Bayesian estimator is extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component and the results obtained highly recommend the developed Bayesian estimators as a strong alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed model are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also be extended to the industrial and business domains where quality monitoring was initially developed.
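The simplest member of the family of estimators described above is the step change in a Poisson process. The sketch below computes the posterior over the change point on a grid, under a uniform prior and with the before/after rates treated as known; the thesis instead estimates everything jointly with hierarchical models and MCMC, so treat this as an illustration of the idea only. The data and rates are made up.

```python
import math

def poisson_changepoint_posterior(counts, lam1, lam2):
    """Posterior over the change point tau (uniform prior), assuming the
    counts are Poisson(lam1) for t <= tau and Poisson(lam2) afterwards.
    Known rates keep the sketch to a 1-D grid over tau."""
    def loglik(tau):
        ll = 0.0
        for t, y in enumerate(counts):
            lam = lam1 if t <= tau else lam2
            ll += y * math.log(lam) - lam - math.lgamma(y + 1)
        return ll
    lls = [loglik(tau) for tau in range(len(counts) - 1)]
    m = max(lls)                      # subtract max for numerical stability
    w = [math.exp(ll - m) for ll in lls]
    s = sum(w)
    return [x / s for x in w]

# Synthetic monthly event counts with a step up after index 4:
counts = [4, 5, 3, 6, 4, 11, 9, 12, 10, 13]
post = poisson_changepoint_posterior(counts, 4.0, 11.0)
map_tau = max(range(len(post)), key=post.__getitem__)
```

Because the output is a full distribution over tau rather than a single point, credible intervals for the change time come for free, which is the "highly informative estimates" advantage claimed above.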

Relevance: 80.00%

Publisher:

Abstract:

Wandering is aimless and repetitive locomotion that may expose persons with dementia (PWD) to elopement, getting lost and death. This study is an Australian replication of a US study. Cross-disciplinary consensus-based analysis was applied to data from five focus groups (N = 47): cognitively intact LTC residents (5), carers of PWD (11), home care workers (13), allied health professionals and health-focused engineers (7), and RNs (11). Groups received a briefing about wandering monitoring and elopement management systems. Consistent with US attitudes, participants in all groups agreed on what a wandering technology should do, how it should do it, and the necessary technical specifications. Within each group, participants raised the need for a continuum of care for PWD and the imperative of early recognition of potentially dangerous wandering and of getting lost when they occur. Global Positioning System (GPS) elopement management was the preferred option. Interestingly, the prospective value of GPS in recovering a lost or eloped wanderer far outweighed privacy concerns, as in the US. A pervasive theme was that technologies need to augment, but cannot replace, an attentive, compassionate caregiver presence. A significant theme raised only by Australian carers of PWD was the potential for the development of implantable GPS technologies and the need for public debate about the attendant ethical issues. Given that 60% or more of the over 200,000 Australians and 4.5 million Americans with dementia will develop wandering, there is a pressing need to develop effective locator systems that may delay institutionalization, help allay carer concern and enhance PWD safety.

Relevance: 80.00%

Publisher:

Abstract:

There is a growing number of organizations and universities now utilising e-learning practices in their teaching and learning programs. These systems have allowed for knowledge sharing and provide opportunities for users to have access to learning materials regardless of time and place. However, while the uptake of these systems is quite high, there is little research into the effectiveness of such systems, particularly in higher education. This paper investigates the methods that are used to study the effectiveness of e-learning systems and the factors that are critical for the success of a learning management system (LMS). Five major success categories are identified in this study and explained in depth. These are the teacher, student, LMS design, learning materials and external support.

Relevance: 80.00%

Publisher:

Abstract:

The continuous growth of high-rise residential properties indicates that there is a need for an effective property management system to support sustainable high-rise residential property development. As intensive as existing studies are, they do not attempt to investigate the correlation between property management systems and the trends of Malaysian high-rise residential property development. By examining the trends and scenario of Malaysian high-rise residential property development, this paper aims to gain an understanding of the impacts of the effectiveness of property management in this area. Findings from this scoping paper will assist in providing a greater understanding of, and possible solutions for, the current Malaysian property management systems for the expanding high-rise residential unit market.

Relevance: 80.00%

Publisher:

Abstract:

A pressing cost issue facing construction is the procurement of off-site pre-manufactured assemblies. In order to encourage Australian adoption of off-site manufacture (OSM), a new approach to the underlying processes is required. The advent of object-oriented digital models for construction design assumes intelligent use of data. However, the construction production system relies on traditional methods and data sources and is expected to benefit from the application of well-established business process management techniques. The integration of the old and new data sources allows for the development of business process models which, by capturing typical construction processes involving OSM, provide insights into such processes. This integrative approach is the foundation of research into the use of OSM to increase construction productivity in Australia. The purpose of this study is to develop business process models capturing the procurement, resources and information flow of construction projects. For each stage of the construction value chain, a number of sub-processes are identified. Business Process Modelling Notation (BPMN), a mainstream business process modelling standard, is used to create baseline generic construction process models. These models identify OSM decision-making points that could provide cost reductions in procurement workflow and management systems. This paper reports on phase one of ongoing research aiming to develop a prototype workflow application that can provide semi-automated support to construction processes involving OSM and assist decision-making in the adoption of OSM, thus contributing to a sustainable built environment.

Relevance: 80.00%

Publisher:

Abstract:

Grasslands occupy approximately half of the ice-free land area of the world, make up about 70 percent of the world's agricultural area, and are an important agricultural resource, particularly in areas where people are among the most food insecure. Despite their significant potential for carbon (C) sequestration and emission reductions, they are currently not included in international agreements to reduce greenhouse gas (GHG) emissions. The chapters in this book have presented new data on management systems that could sequester C in the soil or biomass, assessed the policy and economic aspects of C sequestration in grassland soils, and evaluated limitations and those techniques required to capitalize on grassland C sequestration as a viable component of mitigation strategy.

Relevance: 80.00%

Publisher:

Abstract:

Laboratories and hands-on technical learning have always been a part of engineering and science-based university courses. They provide the interface where theory meets practice, and students may develop professional skills through interacting with real objects in an environment that models appropriate standards and systems. Laboratories in many countries are facing challenges to their sustainable operation and effectiveness. In some countries, such as Australia, significantly reduced funding and staff reductions are eroding a once strong base of technical infrastructure. Other countries, such as Thailand, are seeking to develop their laboratory infrastructure and are in need of staff skill development, and of management and staff structures, in technical areas. In this paper the authors address the need for technical development with reference to work undertaken in Thailand and Australia. The authors identify the roads which their respective university sectors are on and point out problems and opportunities. It is hoped that the crossroads where we meet will result in better directions for both.

Relevance: 80.00%

Publisher:

Abstract:

This Australian case study of futures methodologies in local government explores the development and implementation of the Logan 2026 City Directions project. As an innovative approach to strategic planning, and forming the city visioning umbrella for the Strategic Planning and Performance Management Framework of Council, Logan 2026 City Directions has facilitated greater engagement with the community and represents an opportunity for Council to explore and build on the organisation's foresight capacity and to enhance internal communications within the organisation. One significant by-product has been ongoing dialogue and actions of the workshop groups in Council seeking to address such issues as climate change.

Relevance: 80.00%

Publisher:

Abstract:

The communal nature of knowledge production predicts the importance of creating learning organisations where knowledge arises out of processes that are personal, social, situated and active. It follows that workplaces must provide both formal and informal learning opportunities for interaction with ideas and among individuals. This grounded theory for developing contemporary learning organisations harvests insights from the knowledge management, systems sciences, and educational learning literatures. The resultant hybrid theoretical framework informs practical application, as reported in a case study that harnesses the accelerated information exchange possibilities enabled through web 2.0 social networking and peer production technologies. Through complementary organisational processes, 'meaning making' is negotiated in formal face-to-face meetings supplemented by informal 'boundary spanning' dialogue. The organisational capacity building potential of this participatory and inclusive approach is illustrated through the example of the Dr. Martin Luther King, Jr. Library in San Jose, California, USA. As an outcome of the strategic planning process at this joint city-university library, communication, decision-making, and planning structures, processes, and systems were re-invented. An enterprise-level redesign is presented, which fosters contextualising information interactions for knowledge sharing and community building. Knowledge management within this context envisions organisations as communities where knowledge, identity, and learning are situated. This framework acknowledges the social context of learning - i.e., that knowledge is acquired and understood through action, interaction, and sharing with others. It follows that social networks provide peer-to-peer enculturation through intentional exchange of tacit information made explicit.
This, in turn, enables a dynamic process experienced as a continuous spiral that perpetually elevates collective understanding and enables knowledge creation.

Relevance: 80.00%

Publisher:

Abstract:

In competitive environments, agility is emerging as an important determinant of success. Despite the widely accepted importance of agility, there has been a paucity of research on this construct, especially on the customer's perspective of agility. The rise of digital natives, together with the growth of ubiquitous information systems, has changed the way firms engage with their customers. Firms are finding it difficult to establish sustained loyalty and hence a long-term sustained advantage over the competition. Hence, firms are increasingly investing substantial resources in dynamic Customer Relationship Management systems, such as mobile-CRMS, to better engage with customers and to sense and respond quickly (the firm's agility) to their demands. This paper investigates a firm's customer agility from the customer's perspective and proposes a model to understand it from the customer's point of view. The proposed model is derived from previous conceptions of agility and expectation confirmation theory (ECT). This paper reports the initial findings of this study, obtained through a pilot test. The findings demonstrate that the customer's viewpoint on a firm's customer agility is an important determinant of achieving success through sustained competitive advantage.

Relevance: 80.00%

Publisher:

Abstract:

While there are many similarities between the languages of the various workflow management systems, there are also significant differences. One particular area of difference is caused by the fact that different systems impose different syntactic restrictions. In such cases, business analysts have to choose between either conforming to the language in their specifications or transforming these specifications afterwards. The latter option is preferable, as it allows for a separation of concerns. In this paper we investigate to what extent such transformations are possible in the context of various syntactic restrictions (the most restrictive of which will be referred to as structured workflows). We also provide deep insight into the consequences, particularly in terms of expressive power, of imposing such restrictions.
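One common syntactic restriction behind the "structured workflow" notion above is that every split must be closed by a matching join of the same type, with proper nesting. A minimal sketch of checking that one condition, treating a workflow as a flat token sequence (the token vocabulary and flat encoding are simplifying assumptions; real workflow languages are graphs):

```python
def is_structured(tokens):
    """Check one necessary condition for structuredness: splits and joins
    of the same type must be balanced and properly nested, exactly like
    matched brackets. Returns False on a mismatched or unclosed split."""
    opens = {"and-split": "and-join", "xor-split": "xor-join"}
    stack = []
    for tok in tokens:
        if tok in opens:
            stack.append(opens[tok])          # expect this join later
        elif tok in opens.values():
            if not stack or stack.pop() != tok:
                return False                  # wrong join type or no open split
    return not stack                          # every split must be closed

# Properly nested: xor block wrapping an and block.
good = is_structured(["task", "xor-split", "task", "and-split",
                      "task", "and-join", "xor-join"])
# Interleaved (overlapping) blocks are exactly what structuredness forbids.
bad = is_structured(["xor-split", "task", "and-split",
                     "xor-join", "and-join"])
```

The `bad` sequence is a workflow that many unrestricted languages accept but that cannot be expressed as-is under the structured restriction; whether it can be *transformed* into an equivalent structured form is the question the paper investigates.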