211 results for Management techniques


Relevance: 30.00%

Abstract:

Navigational collisions are one of the major safety concerns for many seaports. The continuing growth of shipping traffic, in both volume and vessel size, is likely to increase the number of traffic movements, which in turn could raise the risk of collisions in these restricted waters. This growing safety concern warrants a comprehensive technique for modeling collision risk in port waters, particularly for modeling the probability of collision events and the associated consequences (i.e., injuries and fatalities). A number of techniques have been used to model the risk qualitatively, semi-quantitatively and quantitatively. These traditional techniques mostly rely on historical collision data, often in conjunction with expert judgment. However, they are hampered by several shortcomings: the randomness and rarity of collisions yield too few collision counts for sound statistical analysis, they explain collision causation poorly, and they take a reactive approach to safety. A promising alternative that overcomes these shortcomings is the navigational traffic conflict technique (NTCT), which uses traffic conflicts in place of collisions to model the probability of collision events quantitatively. This article explores the existing techniques for modeling collision risk in port waters. In particular, it identifies the advantages and limitations of the traditional techniques and highlights the potential of the NTCT to overcome those limitations. Building on the principles of the NTCT, a structured method for managing collision risk is proposed. This risk management method allows safety analysts to diagnose safety deficiencies proactively, and consequently has great potential for managing collision risk in a fast, reliable and efficient manner.
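
The conflict-based idea lends itself to a simple quantitative illustration. The sketch below is a minimal, hypothetical example, not the article's actual model: it assumes collisions follow a homogeneous Poisson process whose rate is the observed conflict rate scaled by an assumed conflict-to-collision ratio, and converts that rate into a probability of at least one collision over a planning horizon. All figures are invented.

```python
import math

def collision_probability(conflicts_per_year: float,
                          conflict_to_collision_ratio: float,
                          horizon_years: float) -> float:
    """Probability of at least one collision over the horizon.

    Illustrative simplification (not the NTCT model itself):
    collisions follow a homogeneous Poisson process whose rate is
    the observed conflict rate scaled by an assumed ratio.
    """
    collision_rate = conflicts_per_year * conflict_to_collision_ratio
    return 1.0 - math.exp(-collision_rate * horizon_years)

# Hypothetical figures: 120 observed conflicts/year, 1 collision
# per 2,000 serious conflicts, 5-year planning horizon.
print(f"P(collision) = {collision_probability(120, 1 / 2000, 5):.3f}")
```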

Relevance: 30.00%

Abstract:

Traditionally, navigational safety analyses rely on historical collision data, an approach often hampered by low collision counts, limited ability to explain collision causation, and a reactive stance toward safety. A promising alternative that overcomes these problems is to use navigational traffic conflicts, or near-misses, in place of collision data. This book discusses how traffic conflicts can be used effectively in modeling port water collision risks. Techniques for measuring and predicting collision risks in fairways, intersections, and anchorages are presented using advanced statistical models. Risk measurement models, which quantitatively measure collision risks in waterways, are discussed. To predict risks, a hierarchical statistical modeling technique is described that identifies the factors influencing the risks. The modeling techniques are illustrated using Singapore port data. Results showed that traffic conflicts are an ethically appealing alternative to collision data for fast, reliable and effective safety assessment, and thus possess great potential for managing collision risks in port waters.
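
To give a flavour of how factors influencing conflict counts can be identified statistically, the sketch below fits a Poisson regression of conflict counts on traffic covariates. It uses synthetic data and generic covariate names (traffic volume, crossing movements), so it stands in for, rather than reproduces, the book's hierarchical models and the Singapore data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Synthetic, illustrative data: conflict counts per fairway segment
# as a function of traffic volume and number of crossing movements.
n = 200
traffic = rng.uniform(50, 500, n)      # vessel movements per day
crossings = rng.integers(0, 20, n)     # crossing movements per day
lam = np.exp(-3.0 + 0.004 * traffic + 0.05 * crossings)
conflicts = rng.poisson(lam)

# Poisson GLM: each coefficient estimates a factor's effect on risk.
X = sm.add_constant(np.column_stack([traffic, crossings]))
model = sm.GLM(conflicts, X, family=sm.families.Poisson()).fit()
print(model.summary())
```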

Relevance: 30.00%

Abstract:

Background: Foot ulcers are a frequent reason for diabetes-related hospitalisation. Clinical training is known to have a beneficial impact on foot ulcer outcomes. Clinical training using simulation techniques has rarely been used in the management of diabetes-related foot complications or chronic wounds. Simulation can be defined as a device or environment that attempts to replicate the real world. The few non-web-based foot-related simulation courses have focused solely on training for a single skill or “part task” (for example, practicing ingrown toenail procedures on models). This pilot study primarily aimed to investigate the effect of a training program using multiple methods of simulation on participants’ clinical confidence in the management of foot ulcers. Methods: Sixteen podiatrists participated in a two-day Foot Ulcer Simulation Training (FUST) course. The course included prerequisite web-based learning modules, practice of individual foot ulcer management part tasks (for example, debriding a model foot ulcer), and participation in replicated clinical consultation scenarios (for example, treating a standardised patient (actor) with a model foot ulcer). The primary outcome measure was participants’ completion of pre- and post-course confidence surveys using a five-point Likert scale (1 = unacceptable, 5 = proficient). Participants’ knowledge, satisfaction, and perception of the relevance and fidelity (realism) of a range of course elements were also investigated. Parametric statistics were used to analyse the data: Pearson’s r for correlation, ANOVA for testing differences between groups, and a paired-sample t-test to assess the significance of changes between pre- and post-workshop scores. A minimum significance level of p < 0.05 was used. Results: An overall 42% improvement in clinical confidence was observed following completion of FUST (mean scores 3.10 versus 4.40, p < 0.05). The lack of an overall significant change in knowledge scores reflected the participants’ high baseline knowledge and prerequisite completion of the web-based modules. Satisfaction, relevance and fidelity of all course elements were rated highly. Conclusions: This pilot study suggests simulation training programs can improve participants’ clinical confidence in the management of foot ulcers. The approach has the potential to enhance clinical training in diabetes-related foot complications and chronic wounds in general.
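
The primary analysis, a paired-sample t-test on pre- and post-course confidence scores, is straightforward to reproduce. The sketch below uses invented Likert scores, not the study's data, purely to show the calculation.

```python
from scipy import stats

# Invented five-point Likert confidence scores for illustration;
# the study's own data are not reproduced here.
pre  = [3.2, 2.8, 3.5, 3.0, 2.9, 3.4, 3.1, 2.7]
post = [4.5, 4.2, 4.6, 4.3, 4.1, 4.7, 4.4, 4.0]

t_stat, p_value = stats.ttest_rel(post, pre)  # paired-sample t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"mean pre = {sum(pre)/len(pre):.2f}, "
      f"mean post = {sum(post)/len(post):.2f}")
```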

Relevance: 30.00%

Abstract:

Value Management (VM) is a proven methodology that provides a structured framework, with supporting tools and techniques, to facilitate effective decision-making in many types of projects and thereby achieve ‘best value’ for clients. It offers a robust approach to exploring the need for, and function of, projects so that they align with clients’ objectives. The functional analysis and creativity phases of VM are crucial, as they focus on using innovative thinking to understand the objectives of clients’ projects and to provide value-adding solutions at the early, discovery stages of projects. There is, however, a perception of VM as just another cost-cutting tool; this has overshadowed the method’s fundamental benefits and limited both its influence and its wider use in the construction industry. This paper describes findings from a series of case studies conducted at the project and corporate levels of current publicly funded infrastructure projects in Malaysia. The study investigates the VM processes practised by the project client organisation and evaluates the effects of project team involvement in VM workshops during the design stage of these projects. The focus is on how issues related to ‘upstream’ infrastructure design, aimed at improving ‘downstream’ construction processes on site, are resolved through multidisciplinary team consideration and decision-making. Findings from the case studies indicate that the mix of disciplines among project team members at a design-stage VM workshop has minimal influence on improving construction processes. However, the degree of interaction, institutionalised thinking, cultural dimensions, and the visualisation aids adopted have a significant impact on maximising creativity among project team members during a VM workshop. The case studies focused on infrastructure projects that used a traditional VM workshop as the client’s chosen VM methodology to review and develop designs; document reviews and semi-structured interviews with project teams were used as data-collection techniques. The outcomes of this research are expected to offer alternative perspectives for construction professionals and clients, helping to minimise constraints and strengthen strategies for implementing VM on future projects.

Relevance: 30.00%

Abstract:

The management and improvement of business processes is a core topic of the information systems discipline. The persistent demand across all industry sectors for increased operational efficiency and innovation, a maturing set of established and evaluated methods, tools and techniques, and a quickly growing body of academic and professional knowledge are all indicative of the standing that Business Process Management (BPM) enjoys today. Over recent decades, intensive research has been conducted on the design, implementation, execution, and monitoring of business processes. Comparatively little attention, however, has been paid to organizational questions such as the adoption, usage, implications, and overall success of BPM approaches, technologies, and initiatives. This research gap motivated us to edit a corresponding special focus issue of the journal BISE/WIRTSCHAFTSINFORMATIK. We are pleased to present a selection of three research papers and a state-of-the-art paper in the scientific section of the issue at hand. As these papers differ in the topics they investigate, the research methods they apply, and the theoretical foundations they build on, the diversity within the BPM field becomes evident. The academic papers are complemented by an interview with Phil Gilbert, IBM’s Vice President for Business Process and Decision Management, who reflects on the relationship between business processes and the data flowing through them, the need to establish a process context for decision making, and the calibration of BPM efforts toward executives who see processes as a means to an end rather than a first-order concept in their own right.

Relevance: 30.00%

Abstract:

Secure communications in distributed Wireless Sensor Networks (WSN) operating under adversarial conditions necessitate efficient key management schemes. In the absence of a priori knowledge of the post-deployment network configuration, and because of the limited resources at sensor nodes, key management schemes cannot be based on post-deployment computations. Instead, a list of keys, called a key-chain, is distributed to each sensor node before deployment. For secure communication, either two nodes must have a key in common in their key-chains, or they must establish a key through a secure path on which every link is secured with a key. We first provide a comparative survey of well-known key management solutions for WSN. Probabilistic, deterministic and hybrid key management solutions are presented and compared on the basis of their security properties and resource usage. We provide a taxonomy of solutions and identify trade-offs among them, concluding that there is no one-size-fits-all solution. Second, we design and analyze deterministic and hybrid techniques to distribute pair-wise keys to sensor nodes before deployment. We present novel deterministic and hybrid approaches based on combinatorial design theory and graph theory for deciding how many, and which, keys to assign to each key-chain before the sensor network is deployed. The performance and security of the proposed schemes are studied both analytically and computationally. Third, we address the key establishment problem in WSN, in which key agreement algorithms without authentication must be executed over a secure path. The length of the secure path affects the power consumption and the initialization delay of a WSN before it becomes operational. We formulate key establishment as a constrained bi-objective optimization problem, break it into two sub-problems, and show that both are NP-Hard and MAX-SNP-Hard. Having established these inapproximability results, we focus on the authentication problem that prevents key agreement algorithms from being used directly over a wireless link. We present a fully distributed algorithm in which each pair of nodes can establish an authenticated key by using their neighbors as witnesses.
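
As a concrete instance of the probabilistic class of schemes surveyed here, the sketch below implements the classic random key predistribution idea: each node draws a key-chain at random from a common pool, and two nodes can communicate directly iff their chains intersect. The pool and chain sizes are illustrative assumptions, and this is a generic textbook scheme rather than the thesis's combinatorial-design constructions.

```python
import math
import random

POOL_SIZE = 1000   # size of the global key pool (illustrative)
CHAIN_SIZE = 50    # keys pre-loaded on each node (illustrative)

def make_key_chain() -> set:
    """Draw a random key-chain from the common pool."""
    return set(random.sample(range(POOL_SIZE), CHAIN_SIZE))

def share_key(a: set, b: set) -> bool:
    """Two nodes can communicate directly iff their chains intersect."""
    return bool(a & b)

# Analytical probability that two random chains share at least one key.
p_share = 1 - (math.comb(POOL_SIZE - CHAIN_SIZE, CHAIN_SIZE)
               / math.comb(POOL_SIZE, CHAIN_SIZE))

# Empirical check over many simulated node pairs.
trials = 10_000
hits = sum(share_key(make_key_chain(), make_key_chain())
           for _ in range(trials))
print(f"analytical P(share) = {p_share:.3f}, "
      f"empirical = {hits / trials:.3f}")
```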

Relevance: 30.00%

Abstract:

A theoretical framework for a construction management decision evaluation system for project selection is developed by means of a literature review. The theory is developed by examining the major factors in the project selection decision from a deterministic viewpoint, in which the decision-maker is assumed to possess 'perfect knowledge' of all the aspects involved. Four fundamental project characteristics are identified, together with three meaningful outcome variables. The relationships within and between these variables are considered, together with some possible solution techniques. The theory is then extended to the time-related, dynamic aspects of the problem, leading to the implications of imperfect knowledge and a non-deterministic model. A solution technique is proposed in which Gottinger's sequential machines are used to model the decision process.

Relevance: 30.00%

Abstract:

Morris' (1986) analysis of the factors affecting project success and failure is considered in relation to the psychology of judgement under uncertainty. A model is proposed whereby project managers may identify the specific circumstances in which human decision-making is prone to systematic error, and hence may apply a number of de-biasing techniques.

Relevance: 30.00%

Abstract:

The increased adoption of business process management approaches, tools and practices has led organizations to accumulate large collections of business process models. These collections can easily contain hundreds to thousands of models, especially in multinational corporations or as a result of organizational mergers and acquisitions. A concrete problem is thus how to maintain these large repositories so that their complexity does not hamper their practical usefulness as a means to describe and communicate business operations. This paper proposes a technique to automatically infer suitable names for business process models and fragments thereof. The technique is useful in model abstraction scenarios, for instance when user-specific views of a repository are required, or as part of a refactoring initiative aimed at simplifying the repository’s complexity. It is grounded in an adaptation of the theory of meaning to the realm of business process models. We implemented the technique in a prototype tool and conducted an extensive evaluation using three process model collections from practice and a case study involving process modelers with different levels of experience.
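
The paper's technique rests on an adaptation of the theory of meaning; the sketch below is a deliberately naive, hypothetical stand-in that conveys only the basic intuition of inferring a fragment name from its activity labels, by pairing the most frequent leading verb with the most frequent business object. Function name and labels are invented.

```python
from collections import Counter

def infer_fragment_name(activity_labels: list[str]) -> str:
    """Naive name inference: pair the most common leading verb with
    the most common trailing business object. A hypothetical stand-in
    for the paper's meaning-based technique, not an implementation of it."""
    verbs = Counter(label.split()[0].lower() for label in activity_labels)
    objects = Counter(label.split()[-1].lower() for label in activity_labels)
    name = f"{verbs.most_common(1)[0][0]} {objects.most_common(1)[0][0]}"
    return name.capitalize()

labels = ["Check invoice", "Approve invoice",
          "Check purchase order", "Archive invoice"]
print(infer_fragment_name(labels))  # -> "Check invoice"
```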

Relevance: 30.00%

Abstract:

This paper aims to contribute to an understanding of what actually takes place during consulting engagements. It draws on data from a qualitative case study of eight engagements by a niche consultancy in Australia to describe how consultants actively engage in boundary-crossing processes to address the knowledge boundaries encountered during formal interactions with clients. While consultants actively managed knowledge boundary processes during interactions, for example by evoking an ‘ideal state’ for clients, the engagements also yielded many missed opportunities for knowledge transformation.

Relevance: 30.00%

Abstract:

Cloud computing is an emerging computing paradigm in which IT resources are provided over the Internet as a service to users. One such service offered through the Cloud is Software as a Service, or SaaS. SaaS can be delivered in composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. SaaS is receiving substantial attention today from both software providers and users, and analyst firms predict a positive future market for it. This raises new challenges for SaaS providers, especially in large-scale data centres such as the Cloud. One such challenge is managing Cloud resources for SaaS in a way that maintains SaaS performance while optimising resource use. Extensive research on the resource optimisation of Cloud services has not yet addressed the challenges of managing resources for composite SaaS. This research addresses that gap by focusing on three new problems for composite SaaS: placement, clustering and scalability. The overall aim is to develop efficient and scalable mechanisms that facilitate the delivery of high-performance composite SaaS while optimising the resources used. All three problems are highly constrained, large-scale and complex combinatorial optimisation problems; therefore, evolutionary algorithms are adopted as the main solution technique. The first problem concerns how a composite SaaS is placed onto Cloud servers to optimise its performance while satisfying the SaaS resource and response time constraints. Existing research on this problem often ignores the dependencies between components and considers the placement of only a homogeneous type of component. A precise formulation of the composite SaaS placement problem is presented, and a classical genetic algorithm and two versions of cooperative co-evolutionary algorithms are designed to manage the placement of heterogeneous types of SaaS components together with their dependencies, requirements and constraints. Experimental results demonstrate the efficiency and scalability of these new algorithms. In the second problem, SaaS components are assumed to be already running on Cloud virtual machines (VMs); however, owing to the dynamic Cloud environment, the current placement may need to be modified. Existing techniques focus mostly on the infrastructure level rather than the application level. This research addresses the problem at the application level by clustering suitable components onto VMs to optimise the resources used while maintaining SaaS performance. Two versions of grouping genetic algorithms (GGAs) are designed to cater for the structural grouping of a composite SaaS: the first GGA uses a repair-based method and the second a penalty-based method to handle the problem constraints. The experimental results confirm that the GGAs consistently produce a better reconfiguration placement plan than a common heuristic for clustering problems. The third problem deals with the replication or deletion of SaaS instances to cope with the SaaS workload. Determining a scaling plan that minimises the resources used while maintaining SaaS performance is a critical task, and the constraints and interdependencies between components make solutions even more difficult to find. A hybrid genetic algorithm (HGA) was developed to solve this problem, exploring the search space through its genetic operators and fitness function to determine the SaaS scaling plan. The HGA also uses the problem's domain knowledge to ensure that solutions meet the constraints and achieve the objectives. The experimental results demonstrate that the HGA consistently outperforms a heuristic algorithm, achieving a lower-cost scaling and placement plan. This research has thus identified three significant new problems for composite SaaS in the Cloud and developed various evolutionary algorithms to address them, contributing to the evolutionary computation field. The algorithms provide solutions for efficient resource management of composite SaaS in the Cloud, resulting in a low total cost of ownership for users while guaranteeing SaaS performance.
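
To make the penalty-based approach concrete, the sketch below is a minimal, hypothetical genetic algorithm for the clustering problem, far simpler than the thesis's GGAs: a chromosome assigns each SaaS component to a VM, fitness counts the VMs used plus a heavy penalty for VM capacity violations, and standard tournament selection, one-point crossover and mutation evolve the assignment. Component loads, VM capacity and GA parameters are invented for illustration.

```python
import random

COMPONENT_LOAD = [4, 3, 7, 2, 5, 6, 3, 4]  # illustrative resource demands
VM_CAPACITY = 10                           # illustrative VM capacity
N_VMS = len(COMPONENT_LOAD)                # upper bound: one VM per component
PENALTY = 100                              # weight for capacity violations

def fitness(assign: list) -> float:
    """VMs used plus a heavy penalty per unit of capacity overflow."""
    loads = {}
    for comp, vm in enumerate(assign):
        loads[vm] = loads.get(vm, 0) + COMPONENT_LOAD[comp]
    overflow = sum(max(0, load - VM_CAPACITY) for load in loads.values())
    return len(loads) + PENALTY * overflow  # lower is better

def evolve(pop_size=60, generations=300, mut_rate=0.1):
    pop = [[random.randrange(N_VMS) for _ in COMPONENT_LOAD]
           for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        for _ in range(pop_size):
            p1 = min(random.sample(pop, 3), key=fitness)  # tournament
            p2 = min(random.sample(pop, 3), key=fitness)
            cut = random.randrange(1, len(COMPONENT_LOAD))
            child = p1[:cut] + p2[cut:]                   # one-point crossover
            if random.random() < mut_rate:                # move one component
                child[random.randrange(len(child))] = random.randrange(N_VMS)
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

best = evolve()
print("assignment:", best, "fitness:", fitness(best))
```

The penalty term lets infeasible assignments remain in the population while steering the search toward feasible, low-VM-count plans, which is the essence of the penalty-based constraint handling mentioned above.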

Relevance: 30.00%

Abstract:

Organisations constantly seek cost-effective improvements to their business processes. Business process management (BPM) provides organisations with a range of methods, techniques and tools for analysing, managing, and optimising their business operations. However, BPM initiatives within organisations tend to focus on time and resource utilisation inefficiencies rather than directly on cost inefficiencies. As a result, high-level cost-based managerial decisions are still made separately from process-related decisions. This position paper describes a research agenda that envisages a holistic approach to managing the cost of business operations in a structured manner, by making an explicit link between cost and processes in all phases of the business process management life cycle. We discuss a number of research challenges that must be addressed to realise such an approach, as well as findings from some of the initial research outcomes. It is envisioned that the research outcomes will enable organisations to make operational and strategic decisions with confidence, based on accurate and real-time cost information about their operations.

Relevance: 30.00%

Abstract:

There is growing awareness worldwide of the significance of social media for communication in times of both natural and human-created disasters and crises. While the media have long been used to broadcast messages to communities in times of crisis (bushfires, floods, earthquakes, and so on), the ability of social media to enable many-to-many communication through ubiquitous networked computing and mobile media devices is becoming increasingly important in the fields of disaster and emergency management. This paper analyses the uses made of social media during two recent natural disasters: the January 2011 floods in Brisbane and South-East Queensland, Australia, and the February 2011 earthquake in Christchurch, New Zealand. It is part of a wider project undertaken by a research team based at the Queensland University of Technology in Brisbane, Australia, working with the Queensland Department of Community Safety (DCS) and the EIDOS Institute, and funded by the Australian Research Council (ARC) through its Linkages program. The project combines large-scale, quantitative social media tracking and analysis techniques with qualitative cultural analysis of communication efforts by citizens and officials, to enable both emergency management authorities and news media organisations to develop, implement, and evaluate new social media strategies for emergency communication.

Relevance: 30.00%

Abstract:

Recent advances in the understanding of the pathogenesis of ovarian cancer have helped to address issues in diagnosis, prognosis and management. The study of ovarian tumours with novel techniques such as immunohistochemistry, fluorescent in situ hybridisation, comparative genomic hybridisation and the polymerase chain reaction, together with new tumour markers, has aided the evaluation and application of new concepts in clinical practice. The correlation of novel, tumour-specific surrogate features with response to treatment and patient outcome has defined prognostic factors that may allow the future design of tailored therapy based on the molecular profile of the tumour. These features have also been used to design new approaches to therapy, such as antibody targeting and gene therapy. The delineation of the roles of c-erbB2, c-fms and other novel receptor kinases in the pathogenesis of ovarian cancer initially led to the development of anti-c-erbB2 monoclonal antibody therapy. The discovery of the BRCA1 and BRCA2 genes will have an impact on the diagnosis and prevention of familial ovarian cancer. The important role played by recessive genes such as p53 in cancer has raised the possibility of restoring gene function through gene therapy. Although the pathological diagnosis of ovarian cancer is still confirmed principally on morphological features, newer investigations will be increasingly useful in addressing difficult diagnostic problems. The rapid pace of discovery of disease-relevant genes makes it imperative that their contribution to the pathogenesis of ovarian cancer be evaluated swiftly, thus improving the overall management of patients and their outcomes.

Relevance: 30.00%

Abstract:

This paper presents a comparative study of the response of a buried tunnel to surface blast using the arbitrary Lagrangian-Eulerian (ALE) and smoothed particle hydrodynamics (SPH) techniques. Since explosive tests with real physical models are extremely risky and expensive, the results of a centrifuge test were used to validate the numerical techniques. The numerical study shows that the ALE predictions were faster and closer to the experimental results than those from the SPH simulations, which over-predicted the strains. These findings demonstrate the superiority of ALE modelling for the present study, and they provide a comprehensive understanding of the preferred ALE modelling techniques for investigating the surface blast response of underground tunnels.
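
For readers unfamiliar with the meshfree side of the comparison, the sketch below shows the core SPH operation: estimating a field value (here, density) as a kernel-weighted sum over neighbouring particles, using the standard cubic spline kernel in one dimension. It is a generic illustration of the SPH method on a uniform bar of particles, not the paper's blast model; the particle spacing, mass and smoothing length are invented.

```python
import numpy as np

def cubic_spline_kernel(r: np.ndarray, h: float) -> np.ndarray:
    """Standard 1-D cubic spline SPH kernel with smoothing length h."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1-D normalisation constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

# Uniformly spaced particles of equal mass along a unit-length bar.
x = np.linspace(0.0, 1.0, 101)
mass = 1.0 / len(x)
h = 2.0 * (x[1] - x[0])

# SPH density estimate: rho_i = sum_j m_j * W(x_i - x_j, h)
rho = np.array([np.sum(mass * cubic_spline_kernel(x - xi, h)) for xi in x])
print(f"interior density = {rho[50]:.3f} (uniform bar, expected near 1.0)")
```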