909 results for Constraint handling
Abstract:
A novel m-ary tree based approach is presented to solve asset management decisions that are combinatorial in nature. The approach introduces a new dynamic constraint-based control mechanism capable of excluding infeasible solutions from the solution space. The approach also addresses the challenges associated with the ordering of asset decisions.
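A minimal sketch of the general idea, assuming a depth-first walk of an m-ary decision tree in which each level assigns one option to one asset and a constraint check prunes infeasible branches as soon as they arise; the budget constraint, asset names and cost figures below are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical data: each asset can receive one of m maintenance options.
ASSETS = ["pump", "valve", "motor"]
OPTIONS = {"do_nothing": 0, "repair": 40, "replace": 100}   # option -> cost
BUDGET = 150                                                 # illustrative constraint

def feasible(partial_cost):
    """Dynamic constraint check: prune any branch that already exceeds the budget."""
    return partial_cost <= BUDGET

def enumerate_plans(level=0, plan=(), cost=0):
    """Depth-first traversal of the m-ary decision tree with constraint-based pruning."""
    if not feasible(cost):
        return                      # infeasible subtree is excluded from the solution space
    if level == len(ASSETS):
        yield plan, cost            # a complete, feasible decision vector
        return
    for option, option_cost in OPTIONS.items():
        yield from enumerate_plans(level + 1, plan + (option,), cost + option_cost)

if __name__ == "__main__":
    for plan, cost in enumerate_plans():
        print(dict(zip(ASSETS, plan)), "total cost:", cost)
```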
Abstract:
Buildings are one of the most significant infrastructures in modern societies. The construction and operation of modern buildings consume a considerable amount of energy and materials and therefore contribute significantly to climate change. In order to reduce the environmental impact of buildings, various green building rating tools have been developed. In this paper, energy use of the building sector in Australia and around the world is first reviewed. This is followed by a discussion of the development and scope of various green building rating tools, with a particular focus on the Green Star rating scheme developed in Australia. It is shown that Green Star has significant implications for almost every aspect of the design of HVAC systems, including the selection of air handling and distribution systems, fluid handling systems, refrigeration systems, heat rejection systems and building control systems.
Abstract:
A hospital consists of a number of wards, units and departments that provide a variety of medical services and interact on a day-to-day basis. Nearly every department within a hospital schedules patients for the operating theatre (OT) and most wards receive patients from the OT following post-operative recovery. Because of the interrelationships between units, disruptions and cancellations within the OT can have a flow-on effect to the rest of the hospital. This often results in dissatisfied patients, nurses and doctors, escalating waiting lists, inefficient resource usage and undesirable waiting times. The objective of this study is to use Operational Research methodologies to enhance the performance of the operating theatre by improving elective patient planning using robust scheduling and improving the overall responsiveness to emergency patients by solving the disruption management and rescheduling problem. OT scheduling considers two types of patients: elective and emergency. Elective patients are selected from a waiting list and scheduled in advance based on resource availability and a set of objectives. This type of scheduling is referred to as ‘offline scheduling’. Disruptions to this schedule can occur for various reasons, including variations in length of treatment, equipment restrictions or breakdown, unforeseen delays and the arrival of emergency patients, which may compete for resources. Emergency patients consist of acute patients requiring surgical intervention or in-patients whose conditions have deteriorated. These may or may not be urgent and are triaged accordingly. Most hospitals reserve theatres for emergency cases, but when these or other resources are unavailable, disruptions to the elective schedule result, such as delays in surgery start time, elective surgery cancellations or transfers to another institution. Scheduling of emergency patients and the handling of schedule disruptions is an ‘online’ process typically handled by OT staff. This means that decisions are made ‘on the spot’ in a ‘real-time’ environment. There are three key stages to this study: (1) Analyse the performance of the operating theatre department using simulation. Simulation is used as a decision support tool and involves changing system parameters and elective scheduling policies and observing the effect on the system’s performance measures; (2) Improve the viability of elective schedules by making offline schedules more robust to differences between expected and actual treatment times, using robust scheduling techniques. This will improve access to care and the responsiveness to emergency patients; (3) Address the disruption management and rescheduling problem (which incorporates emergency arrivals) using innovative robust reactive scheduling techniques. The robust schedule will form the baseline schedule for the online robust reactive scheduling model.
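A minimal sketch of the robust-scheduling idea described in stage (2), assuming slack is added to each elective case in proportion to the variability of its treatment time so that the offline schedule absorbs overruns without cascading delays; the case names, durations and the buffer rule are hypothetical illustrations, not the thesis model.

```python
# Hypothetical elective cases: (name, expected duration in minutes, standard deviation).
CASES = [("hip replacement", 120, 30), ("hernia repair", 60, 15), ("cataract", 30, 5)]
SLACK_FACTOR = 1.0          # buffer = SLACK_FACTOR * std dev (illustrative rule)
SESSION_START = 8 * 60      # theatre session opens at 08:00 (minutes since midnight)

def build_robust_schedule(cases, slack_factor=SLACK_FACTOR, start=SESSION_START):
    """Sequence cases and insert a per-case buffer so actual overruns are absorbed."""
    schedule, clock = [], start
    for name, expected, std in cases:
        buffer = slack_factor * std
        schedule.append({"case": name,
                         "planned_start": clock,
                         "planned_end": clock + expected + buffer})
        clock += expected + buffer      # the next case starts after the buffered slot
    return schedule

if __name__ == "__main__":
    for slot in build_robust_schedule(CASES):
        print(f"{slot['case']:>16}: {slot['planned_start']/60:5.2f}h -> {slot['planned_end']/60:5.2f}h")
```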
Abstract:
Recent studies have started to explore context-awareness as a driver in the design of adaptable business processes. The emerging challenge of identifying and considering contextual drivers in the environment of a business process is well understood; however, typical methods used in business process modeling do not yet consider this additional contextual information in their process designs. In this chapter, we describe our research towards innovative and advanced process modeling methods that include mechanisms to incorporate relevant contextual drivers and their impacts on business processes in process design models. We report on our ongoing work with an Australian insurance provider and describe the design science approach we employed to develop these innovative and useful artifacts as part of a context-aware method framework. We discuss the utility of these artifacts in an application in the claims handling process at the case organization.
Abstract:
It is not uncommon for enterprises today to be faced with the demand to integrate and incorporate many different and possibly heterogeneous systems, which are generally independently designed and developed, to allow seamless access. In effect, the integration of these systems results in one large whole system that must be able, at the same time, to maintain the local autonomy and to continue working as an independent entity. This problem has introduced a new distributed architecture called federated systems. The most challenging issue in federated systems is to find answers to the question of how to cooperate efficiently while preserving their autonomous characteristic, especially security autonomy. This thesis intends to address this issue. The thesis reviews the evolution of the concept of federated systems and discusses the organisational characteristics as well as remaining security issues with the existing approaches. The thesis examines how delegation can be used as a means to achieve better security, especially authorisation, while maintaining autonomy for the participating members of the federation. A delegation taxonomy is proposed as one of the main contributions. The major contribution of this thesis is to study and design a mechanism to support delegation within and between multiple security domains with constraint management capability. A novel delegation framework is proposed, including two modules: a Delegation Constraint Management module and a Policy Management module. The first module is designed to effectively create, track and manage delegation constraints, especially for delegation processes which require re-delegation (indirect delegation). The first module employs two algorithms to trace the root authority of a delegation constraint chain and to prevent potential conflicts when creating a delegation constraint chain if necessary. The first module is designed for conflict prevention, not conflict resolution. The second module is designed to support the first module via its policy comparison capability. The major function of this module is to provide the delegation framework with the capability to compare policies and constraints (written in the format of a policy). The module is an extension of Lin et al.'s work on policy filtering and policy analysis. Throughout the thesis, some case studies are used as examples to illustrate the discussed concepts. These two modules are designed to capture one of the most important aspects of the delegation process: the relationships between the delegation transactions and the involved constraints, which are not very well addressed by the existing approaches. This contribution is significant because the relationships provide the information needed to track and enforce the involved delegation constraints and, therefore, play a vital role in maintaining and enforcing security for transactions across multiple security domains.
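A minimal sketch of the root-authority tracing idea, assuming each delegation constraint records who imposed it and which earlier delegation it derives from, so that a re-delegation chain can be walked back to its root and a new link can be rejected if it conflicts with an inherited constraint; the class layout and the simple "no broadening of rights" conflict rule are hypothetical illustrations, not the thesis framework.

```python
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Delegation:
    """One link in a (possibly indirect) delegation constraint chain."""
    delegator: str
    delegatee: str
    rights: Set[str]
    parent: Optional["Delegation"] = None   # the delegation this one was derived from

def root_authority(link: Delegation) -> str:
    """Walk the chain back to the delegation that started it and return its delegator."""
    while link.parent is not None:
        link = link.parent
    return link.delegator

def conflicts(new_link: Delegation) -> bool:
    """Conflict prevention: a re-delegation may not grant rights beyond any ancestor's."""
    ancestor = new_link.parent
    while ancestor is not None:
        if not new_link.rights <= ancestor.rights:
            return True
        ancestor = ancestor.parent
    return False

if __name__ == "__main__":
    d1 = Delegation("domain.admin", "dr.alice", {"read", "write"})
    d2 = Delegation("dr.alice", "nurse.bob", {"read"}, parent=d1)
    d3 = Delegation("nurse.bob", "intern.carol", {"read", "write"}, parent=d2)  # broadens rights
    print(root_authority(d3))            # -> domain.admin
    print(conflicts(d2), conflicts(d3))  # -> False True
```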
Abstract:
In spite of significant research into the development of efficient algorithms for three carrier ambiguity resolution, the full performance potential of the additional frequency signals cannot be demonstrated effectively without actual triple frequency data. In addition, all the proposed algorithms have difficulties in reliably resolving the medium-lane and narrow-lane ambiguities in different long-range scenarios. In this contribution, we investigate the effects of various distance-dependent biases, identifying the tropospheric delay as the key limitation for long-range three carrier ambiguity resolution. In order to achieve reliable ambiguity resolution in regional networks with inter-station distances of hundreds of kilometers, a new geometry-free and ionosphere-free model is proposed to fix the integer ambiguities of the medium-lane or narrow-lane observables over just several minutes without a distance constraint. Finally, a semi-simulation method is introduced to generate the third frequency signals from dual-frequency GPS data and experimentally demonstrate the research findings of this paper.
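A minimal sketch of how a geometry-free, ionosphere-free combination of triple-frequency carrier phases can be formed, assuming phase observations in metres contain a non-dispersive term (geometry, clocks, troposphere), a first-order ionospheric term scaled by the squared frequency ratio, and an integer ambiguity; the coefficients are found by forcing the non-dispersive and ionospheric terms to cancel, with an arbitrary normalisation. This illustrates the general principle only, not the specific model proposed in the paper.

```python
import numpy as np

C = 299_792_458.0                                  # speed of light, m/s
F = np.array([1575.42e6, 1227.60e6, 1176.45e6])    # GPS L1, L2, L5 frequencies
LAM = C / F                                        # carrier wavelengths, m

# Solve for coefficients a = (a1, a2, a3) such that
#   sum(a_i) = 0               -> non-dispersive terms (geometry, troposphere) cancel
#   sum(a_i * (f1/f_i)^2) = 0  -> first-order ionosphere cancels
#   a1 = 1                     -> arbitrary normalisation
A = np.array([np.ones(3), (F[0] / F) ** 2, [1.0, 0.0, 0.0]])
a = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))

# Synthetic phase observations (metres): non-dispersive delay + ionosphere + ambiguity.
rho, iono_l1 = 22_345_678.9, 3.2                   # illustrative values
N = np.array([5.0, -12.0, 7.0])                    # ambiguities (cycles)
phi = rho - iono_l1 * (F[0] / F) ** 2 + LAM * N

combined = a @ phi
ambiguity_only = a @ (LAM * N)
print(a)                                           # combination coefficients
print(np.isclose(combined, ambiguity_only))        # True: only the ambiguity term remains
```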
Abstract:
ICT is becoming a prominent part of healthcare delivery but brings with it information privacy concerns for patients and competing concerns from caregivers. A proper balance between these issues must be established in order to fully utilise ICT capabilities in healthcare. Information accountability is a fairly new concept in computer science that focuses on the fair use of information. In this paper we investigate the different issues that need to be addressed when applying information accountability principles to manage healthcare information. We briefly introduce an information accountability framework for handling electronic health records (eHR). We focus in particular on digital rights management, considering data in eHRs as digital assets, and on how privacy policies and data usage policies can be represented, as these are key factors in accountability systems.
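A minimal sketch of the accountability idea, assuming each data-usage policy attached to an eHR record states who may use it and for what purpose, and every access is written to an audit log so misuse can be detected after the fact rather than blocked up front; the policy fields and log format are hypothetical illustrations, not the framework described in the paper.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class UsagePolicy:
    """Data-usage policy attached to an eHR item, treated as a digital asset."""
    record_id: str
    permitted_roles: List[str]
    permitted_purposes: List[str]

AUDIT_LOG: list = []   # accountability: every access attempt is recorded

def access_record(policy: UsagePolicy, role: str, purpose: str) -> bool:
    """Record the access in the audit log and report whether it complied with the policy."""
    compliant = role in policy.permitted_roles and purpose in policy.permitted_purposes
    AUDIT_LOG.append({"time": datetime.utcnow().isoformat(),
                      "record": policy.record_id,
                      "role": role, "purpose": purpose,
                      "compliant": compliant})
    return compliant

if __name__ == "__main__":
    policy = UsagePolicy("ehr-001", ["treating_physician"], ["treatment"])
    access_record(policy, "treating_physician", "treatment")   # compliant use
    access_record(policy, "researcher", "marketing")           # non-compliant, but logged
    for entry in AUDIT_LOG:
        print(entry)
```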
Abstract:
Nursing personnel are consistently identified as one of the occupational groups most at risk of work-related musculoskeletal disorders. During the moving and handling of bariatric patients, the weight of the patient combined with atypical body mass contributes to a significant risk of injury to the care provider and patient. This is further compounded by the shape, mobility and co-operation of the patient. The aim of this study was to determine user experiences and design requirements for mobile hoists with bariatric patients. Structured interviews were conducted with six experienced injury management staff from the Manual Task Services department of three hospitals in Adelaide, South Australia. All staff had experience in patient handling, the use of patient handling equipment and the provision of patient handling training. A series of open-ended questions were structured around five main themes: 1) patient factors; 2) building/vehicle space and design; 3) equipment and furniture; 4) communication; and 5) staff issues. Questions focussed on the use of mobile hoists for lifting and transferring bariatric patients. Interviews were supplemented with a walk-through of the hospital to view the types of mobile hoists used, and the location and storage of equipment. Across the three hospitals there were differing classification systems to define bariatric patients. Ensuring patient dignity, respect and privacy were viewed as important in the management and rehabilitation of bariatric patients. Storage and space constraints were considered factors restricting the use of mobile floor hoists, with ceiling hoists being the preferred method for patient transfers. When using mobile floor hoists, the forces required to push, pull and manoeuvre the hoist, as well as its sudden unstable movements, were considered important risk factors for injury to the care provider. Record keeping and purchasing policies appeared to inhibit the effective use of patient handling equipment. The moving and handling of bariatric patients presents complex and challenging issues. A co-ordinated and collaborative approach for moving and handling bariatric patients is needed across the range of care providers. Designers must consider both user and patient requirements.
Abstract:
Flow-oriented process modeling languages have a long tradition in the area of Business Process Management and are widely used for capturing activities with their behavioral and data dependencies. Individual events were introduced for triggering process instantiation and activities. However, real-world business cases drive the need to also cover complex event patterns as they are known in the field of Complex Event Processing. Therefore, this paper puts forward a catalog of requirements for handling complex events in process models, which can be used as a reference framework for assessing process definition languages and systems. An assessment of BPEL and BPMN is provided.
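A minimal sketch of the kind of complex event pattern the paper argues process languages need to handle, assuming a "conjunction within a time window" pattern that instantiates a process only once both constituent events have been observed close enough together; the event names and the window length are hypothetical illustrations, not part of the BPEL/BPMN assessment.

```python
from collections import deque
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=10)   # illustrative correlation window

class ConjunctionPattern:
    """Instantiate a process when both event types occur within the time window."""
    def __init__(self, type_a: str, type_b: str, window: timedelta = WINDOW):
        self.types = (type_a, type_b)
        self.window = window
        self.buffer = deque()       # recent (event_type, timestamp) pairs

    def on_event(self, event_type: str, ts: datetime):
        # Drop events that have fallen out of the window, then look for the partner type.
        while self.buffer and ts - self.buffer[0][1] > self.window:
            self.buffer.popleft()
        if event_type not in self.types:
            return
        partner = self.types[1] if event_type == self.types[0] else self.types[0]
        if any(t == partner for t, _ in self.buffer):
            print(f"instantiate process at {ts.isoformat()}")   # complex trigger fires
            self.buffer.clear()
        else:
            self.buffer.append((event_type, ts))

if __name__ == "__main__":
    p = ConjunctionPattern("claim_received", "payment_flagged")
    now = datetime(2024, 1, 1, 9, 0)
    p.on_event("claim_received", now)
    p.on_event("payment_flagged", now + timedelta(minutes=3))   # both seen -> instantiate
```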
Abstract:
Design for Manufacturing (DFM) is a highly integral methodology in product development, starting from the concept development phase, with the aim of improving manufacturing productivity. It is used to reduce manufacturing costs in complex production environments while maintaining product quality. While Design for Assembly (DFA) focuses on the elimination of parts or their combination with other components, which in most cases means performing a function and a manufacturing operation in a simpler way, DFM follows a more holistic approach. Common considerations for DFM are standard components, manufacturing tool inventory and capability, material compatibility with the production process, part handling, logistics, tool wear and process optimization, quality control complexity and Poka-Yoke design. During DFM, the considerable background work required for the conceptual phase is compensated for by a shortening of later development phases. Current DFM projects normally apply an iterative step-by-step approach and eventually transfer to the developer team. This study introduces a new, knowledge-based approach to DFM, eliminating steps of DFM and showing its implications for the work process. Furthermore, a concurrent engineering process via a transparent interface between the manufacturing engineering and product development systems is brought forward.
Abstract:
Neoproterozoic glacigenic formations are preserved in the Kimberley region and northwestern Northern Territory of northern Australia. They are distributed in the west Kimberley adjacent to the northern margins of the King Leopold Orogen, the Mt Ramsay area at the junction of the King Leopold and Halls Creek Orogens, and the east Kimberley, adjacent to the eastern margin of the Halls Creek Orogen. Small outlier glacigenic deposits are preserved in the Litchfield Province, Northern Territory (Uniya Formation) and Georgina Basin, western Queensland (Little Burke Formation). Glacigenic strata comprise diamictite, conglomerate, sandstone and pebbly mudstone and characterize the Walsh, Landrigan and Fargoo/Moonlight Valley formations. Thin units of laminated dolomite sit conformably at the top of the Walsh, Landrigan and Moonlight Valley formations. Glacigenic units are also interbedded with the carbonate platform deposits of the Egan Formation and Boonall Dolomite. δ13C data are available for all carbonate units. There is no direct chronological constraint on these successions. Dispute over regional correlation of the Neoproterozoic succession has been largely resolved through biostratigraphic, chemostratigraphic and lithostratigraphic analysis. However, palaeomagnetic results from the Walsh Formation are inconsistent with sedimentologically based correlations. Two stratigraphically defined glaciations are preserved in northwestern Australia: the ‘Landrigan Glaciation’, characterized by southwest-directed continental ice-sheet movement and correlated with late Cryogenian glaciation elsewhere in Australia and the world; and the ‘Egan Glaciation’, a more localized glaciation of the Ediacaran Period. Future research should focus on chronology, palaeomagnetic constraints and tectonostratigraphic controls on deposition.
Abstract:
Large margin learning approaches, such as support vector machines (SVM), have been successfully applied to numerous classification tasks, especially automatic facial expression recognition. The risk of such approaches, however, is their sensitivity to large margin losses due to the influence of noisy training examples and outliers, which is a common problem in the area of affective computing (i.e., manual coding at the frame level is tedious, so coarse labels are normally assigned). In this paper, we leverage the relaxation of the parallel-hyperplanes constraint and propose the use of modified correlation filters (MCF). The MCF is similar in spirit to SVMs and correlation filters, but with the key difference of optimizing only a single hyperplane. We demonstrate the superiority of MCF over current techniques in a battery of experiments.
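A minimal sketch of the single-hyperplane idea in the same spirit as correlation filters, assuming a regularised least-squares objective so the hyperplane has a closed-form solution; this is a generic illustration of fitting one hyperplane to labelled feature vectors, not the MCF objective proposed in the paper.

```python
import numpy as np

def fit_single_hyperplane(X: np.ndarray, y: np.ndarray, lam: float = 1e-2) -> np.ndarray:
    """Regularised least squares: w = (X^T X + lam*I)^-1 X^T y, a single hyperplane."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy two-class data standing in for facial-expression features (hypothetical).
    pos = rng.normal(loc=+1.0, scale=1.0, size=(100, 5))
    neg = rng.normal(loc=-1.0, scale=1.0, size=(100, 5))
    X = np.vstack([pos, neg])
    y = np.concatenate([np.ones(100), -np.ones(100)])

    w = fit_single_hyperplane(X, y)
    predictions = np.sign(X @ w)
    print("training accuracy:", (predictions == y).mean())
```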
Abstract:
Retrofit projects are different from newly-built projects in many respects. A retrofit project involves an existing building, which imposes constraints on the owners, designers, operators and constructors throughout the project process. Retrofit projects are risky, complex, less predictable and difficult to plan well, and they need greater coordination. For office building retrofit projects, further restrictions apply as these buildings are often located in CBD areas and most have to remain operational during the progression of project work. Issues such as site space, material storage and handling, and noise and dust need to be considered and well addressed. In this context, waste management is even more challenging, with small spaces for waste handling, uncertainties in waste control, and the impact of waste management activities on project delivery and building occupants. The current literature on waste management in office building retrofit projects focuses on increasing the waste recovery rate through project planning, monitoring and stakeholders’ collaboration. However, previous research has not produced knowledge of the particular retrofit processes and their impact on waste generation and management. This paper discusses the interim results of continuing research on new strategies for waste management in office building retrofit projects. Based on the literature review, it first summarizes the unique characteristics of office building retrofit projects and their influence on waste management, and an assumption on waste management strategies is formed. Semi-structured interviews were then conducted with industry practitioners and the findings are presented in the paper. The assumption of the research was validated in the interviews by the opinions and experiences of the respondents. Finally, the research develops a process model for waste management in office building retrofit projects. It introduces two different waste management strategies. For the dismantling phase, waste is generated quickly as the work progresses, so integrated planning for project delivery and waste generation is needed in order to organize prompt handling and treatment. For the fit-out phase, the work is similar to new construction. Factors which are particularly linked to generating waste on site need to be controlled and monitored. Continuing research in this space will help improve the practice of waste management in office building retrofit projects. The new strategies will help promote the practicality of project waste planning and management and stakeholders’ capability of coordinating waste management and project delivery.
Abstract:
This paper illustrates robust fixed-order power oscillation damper design for mitigating power system oscillations. From an implementation and tuning point of view, such a low, fixed-order structure is common practice for most practical applications, including power systems. However, conventional techniques of optimal and robust control theory cannot handle the fixed-order constraint as such, since it is, in general, impossible to ensure a target closed-loop transfer function with a controller of any given order. This paper deals with the problem of synthesizing or designing a feedback controller of fixed dynamic order for a linear time-invariant plant, both for a fixed plant and for an uncertain family of plants containing parameter uncertainty, so that stability, robust stability and robust performance are attained. The desired closed-loop specifications considered here are given in terms of a target performance vector representing a desired closed-loop design. The performance of the designed controller is validated through non-linear simulations for a range of contingencies.
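A minimal sketch of tuning a fixed-structure controller against a target closed-loop response, assuming a discretised first-order plant, a PI controller whose order is fixed in advance, and a coarse grid search that minimises the deviation of the closed-loop step response from a desired reference trajectory; the plant model, gain ranges and target are hypothetical, and the method here is plain simulation-based tuning rather than the synthesis procedure of the paper.

```python
import numpy as np

DT, STEPS = 0.01, 500                    # simulation step (s) and horizon
A_P, B_P = 0.98, 0.02                    # discretised first-order plant: y+ = A_P*y + B_P*u
TARGET_TC = 0.5                          # desired closed-loop time constant, seconds

def closed_loop_step(kp: float, ki: float) -> np.ndarray:
    """Simulate the unit-step response of the plant under a fixed-order PI controller."""
    y, integ, out = 0.0, 0.0, np.empty(STEPS)
    for k in range(STEPS):
        e = 1.0 - y
        integ += e * DT
        u = kp * e + ki * integ
        y = A_P * y + B_P * u
        out[k] = y
    return out

def tuning_cost(kp: float, ki: float) -> float:
    """Squared deviation from the target first-order reference response."""
    t = np.arange(STEPS) * DT
    target = 1.0 - np.exp(-t / TARGET_TC)
    return float(np.sum((closed_loop_step(kp, ki) - target) ** 2))

if __name__ == "__main__":
    grid = [(kp, ki) for kp in np.linspace(0.5, 5.0, 10) for ki in np.linspace(0.5, 5.0, 10)]
    best = min(grid, key=lambda g: tuning_cost(*g))
    print("best (Kp, Ki):", best, "cost:", round(tuning_cost(*best), 4))
```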
Abstract:
In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
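A minimal sketch of the two rate models being compared, assuming a small tree stored as a parent array: under the autocorrelated model each branch rate is drawn lognormally around its parent branch's rate, while under the uncorrelated model each branch rate is drawn independently from an exponential distribution. The tree shape, mean rate and variance parameter are hypothetical, and this only illustrates the rate-simulation setup, not the phylogenetic inference itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy rooted tree as a parent array: node 0 is the root, others point to their parent.
PARENT = [-1, 0, 0, 1, 1, 2, 2]          # hypothetical 7-node tree
MEAN_RATE = 1e-3                          # substitutions/site/Myr (illustrative)
NU = 0.2                                  # variance parameter of the autocorrelated model

def autocorrelated_rates(parent, mean_rate=MEAN_RATE, nu=NU):
    """Each branch rate is lognormally distributed around its parent branch's rate."""
    rates = np.empty(len(parent))
    rates[0] = mean_rate
    for node in range(1, len(parent)):
        rates[node] = rng.lognormal(mean=np.log(rates[parent[node]]), sigma=np.sqrt(nu))
    return rates

def uncorrelated_rates(parent, mean_rate=MEAN_RATE):
    """Each branch rate is an independent exponential draw; no inheritance along the tree."""
    return rng.exponential(scale=mean_rate, size=len(parent))

if __name__ == "__main__":
    print("autocorrelated:", np.round(autocorrelated_rates(PARENT), 5))
    print("uncorrelated:  ", np.round(uncorrelated_rates(PARENT), 5))
```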