950 results for THRESHOLD CONTACT PROCESS


Relevance:

20.00%

Publisher:

Abstract:

Many industrial processes and systems can be modelled mathematically by a set of Partial Differential Equations (PDEs). Finding a solution to such a PDE model is essential for system design, simulation, and process control purposes. However, major difficulties appear when solving PDEs with singularities. Traditional numerical methods, such as finite difference, finite element, and polynomial-based orthogonal collocation, not only have limitations in fully capturing the process dynamics but also demand enormous computational power, due to the large number of elements or mesh points needed to accommodate sharp variations. To tackle this challenging problem, wavelet-based approaches and high-resolution methods have recently been developed, with successful applications to a fixed-bed adsorption column model. Our investigation has shown that recent advances in wavelet-based approaches and high-resolution methods have the potential to be adopted for solving more complicated dynamic system models. This chapter highlights the successful application of these new methods to complex models of simulated-moving-bed (SMB) chromatographic processes. An SMB process is a distributed parameter system and can be mathematically described by a set of partial/ordinary differential equations and algebraic equations. These equations are highly coupled, exhibit wave propagation with steep fronts, and require significant numerical effort to solve. To demonstrate the numerical computing power of the wavelet-based approaches and high-resolution methods, a single-column chromatographic process modelled by a linear Transport-Dispersive-Equilibrium model is investigated first. Numerical solutions from the upwind-1 finite difference, wavelet-collocation, and high-resolution methods are evaluated by quantitative comparison with the analytical solution for a range of Peclet numbers. After that, the advantages of the wavelet-based approaches and high-resolution methods are further demonstrated through application to a dynamic SMB model for an enantiomer separation process. This research has revealed that for a PDE system with a low Peclet number, all existing numerical methods work well, but the upwind finite difference method consumes the most time for the same degree of solution accuracy. The high-resolution method provides an accurate numerical solution for a PDE system with a medium Peclet number. The wavelet-collocation method is capable of capturing steep changes in the solution, and thus can be used for solving PDE models with high singularity. For the complex SMB system models under consideration, both the wavelet-based approaches and high-resolution methods are good candidates in terms of computational demand and prediction accuracy at the steep fronts. The high-resolution methods showed better stability in reaching steady state in the specific case studied in this chapter.
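
To make the baseline scheme concrete, here is a minimal sketch of an upwind-1 finite difference solver for a linear convection-dispersion column model of the kind compared above. All parameter values and the step-injection inlet are assumptions for illustration; none are taken from the chapter.

```python
# Minimal sketch only: upwind-1 finite differences for dc/dt + u*dc/dz = D*d2c/dz2,
# a stand-in for the linear Transport-Dispersive-Equilibrium column model.
import numpy as np

def upwind1_solve(L=1.0, u=1.0, Pe=100.0, nz=200, t_end=1.2, cfl=0.4):
    D = u * L / Pe                                   # dispersion from the Peclet number
    dz = L / nz
    dt = cfl * min(dz / u, dz**2 / (2 * D))          # explicit stability limit
    c = np.zeros(nz + 1)                             # initially clean column
    t = 0.0
    while t < t_end:
        c[0] = 1.0                                   # assumed step injection at the inlet
        conv = -u * (c[1:-1] - c[:-2]) / dz          # first-order upwind convection
        disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dz**2   # central dispersion
        c[1:-1] += dt * (conv + disp)
        c[-1] = c[-2]                                # zero-gradient outlet
        t += dt
    return c

print(f"outlet concentration: {upwind1_solve(Pe=100.0)[-1]:.3f}")
```

The first-order upwind term adds numerical diffusion, so resolving steep fronts at high Peclet numbers forces very fine grids, which is consistent with the finding above that the upwind method consumes the most time for a given accuracy.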

Relevance:

20.00%

Publisher:

Abstract:

As organizations reach higher levels of Business Process Management maturity, they tend to collect numerous business process models. Such models may be linked with each other or mutually overlap, supersede one another, and evolve over time. Moreover, they may be represented at different abstraction levels depending on the target audience and modeling purpose, and may be available in multiple languages (e.g. due to company mergers). Thus, it is common for organizations to struggle with keeping track of their process models. This demonstration introduces AProMoRe (Advanced Process Model Repository), which aims to facilitate the management of (large) process model collections.

Relevance:

20.00%

Publisher:

Abstract:

Research investigating the transactional approach to the work stressor-employee adjustment relationship has described many negative main effects between perceived stressors in the workplace and employee outcomes. A considerable amount of literature, theoretical and empirical, also describes potential moderators of this relationship. Organizational identification has been established as a significant predictor of employee job-related attitudes. To date, research has neglected investigation of the potential moderating effect of organizational identification in the work stressor-employee adjustment relationship. On the basis of the identity, subjective fit, and sense-of-belonging literatures, it was predicted that higher perceptions of identification at multiple levels of the organization would mitigate the negative effect of work stressors on employee adjustment. It was further expected that more proximal, lower-order identifications would be more prevalent and potent as buffers of stressors on strain. Predictions were tested with an employee sample from five organizations (N = 267). Hierarchical moderated multiple regression analyses revealed some support for the stress-buffering effects of identification in the prediction of job satisfaction and organizational commitment, particularly for more proximal (i.e., work unit) identification. These positive stress-buffering effects, however, were present for low identifiers in some situations. The present study represents an extension of the application of organizational identity theory by identifying the effects of organizational and workgroup identification on employee outcomes in the nonprofit context. Our findings will contribute to a better understanding of the dynamics in nonprofit organizations and therefore to the development of strategy and interventions for dealing with identity-based issues in nonprofits.
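
For readers unfamiliar with the analytic approach, the following is a small synthetic sketch of a hierarchical moderated multiple regression of the kind described above. The variable names, simulated values, and effect sizes are invented for illustration; this is not the study's data or exact model.

```python
# Synthetic illustration of moderated regression: test whether identification
# buffers the stressor-adjustment relationship via a stressor x identification
# interaction entered after the main effects.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 267
df = pd.DataFrame({"stressor": rng.normal(size=n), "ident": rng.normal(size=n)})
# Simulated buffering: the stressor hurts satisfaction less when ident is high.
df["satisfaction"] = (-0.4 * df["stressor"] + 0.3 * df["ident"]
                      + 0.2 * df["stressor"] * df["ident"]
                      + rng.normal(scale=0.8, size=n))

step1 = smf.ols("satisfaction ~ stressor + ident", data=df).fit()   # main effects
step2 = smf.ols("satisfaction ~ stressor * ident", data=df).fit()   # add interaction
print(f"R2 change: {step2.rsquared - step1.rsquared:.3f}")
print(f"interaction: b={step2.params['stressor:ident']:.3f}, "
      f"p={step2.pvalues['stressor:ident']:.4f}")
```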

Relevance:

20.00%

Publisher:

Abstract:

Business process model repositories capture precious knowledge about an organization or a business domain. In many cases, these repositories contain hundreds or even thousands of models, representing several man-years of effort. Over time, process model repositories tend to accumulate duplicate fragments, as new process models are created by copying and merging fragments from other models. This calls for methods to detect duplicate fragments in process models, which can then be refactored into separate subprocesses to increase readability and maintainability. This paper presents an indexing structure to support the fast detection of clones in large process model repositories. Experiments show that the algorithm scales to repositories with hundreds of models. The experimental results also show that a significant number of non-trivial clones can be found in process model repositories taken from industrial practice.
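
As a hedged illustration of the indexing idea, the sketch below canonicalizes each process-model fragment and hashes it, so that duplicate fragments from different models collide in the same bucket. The fragment representation (a set of labelled edges) is a simplifying assumption, not the paper's actual index structure.

```python
# Toy clone index: identical fragments hash to the same bucket.
import hashlib
from collections import defaultdict

def canonical_key(edges):
    """Order-independent key for a fragment given as (source, target) label pairs."""
    canon = "|".join(sorted(f"{s}->{t}" for s, t in edges))
    return hashlib.sha1(canon.encode("utf-8")).hexdigest()

index = defaultdict(list)          # fragment hash -> [(model id, fragment id), ...]

def add_fragment(model_id, fragment_id, edges):
    index[canonical_key(edges)].append((model_id, fragment_id))

def clone_groups():
    return [bucket for bucket in index.values() if len(bucket) > 1]

add_fragment("m1", "f1", [("Check order", "Ship goods"), ("Ship goods", "Send invoice")])
add_fragment("m2", "f7", [("Check order", "Ship goods"), ("Ship goods", "Send invoice")])
print(clone_groups())              # [[('m1', 'f1'), ('m2', 'f7')]]
```

Each clone group found this way is a candidate for refactoring into a shared subprocess.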

Relevance:

20.00%

Publisher:

Abstract:

As organizations reach higher levels of Business Process Management maturity, they tend to accumulate large collections of process models. These repositories may contain thousands of activities and be managed by different stakeholders with varying skills and responsibilities. However, while being of great value, these repositories incur high management costs. Thus, it becomes essential to keep track of the various model versions as they mutually overlap, supersede one another, and evolve over time. We propose an innovative versioning model and associated storage structure, specifically designed to maximize sharing across process model versions and to automatically handle change propagation. The focal point of this technique is to version single process model fragments, rather than entire process models. Indeed, empirical evidence shows that real-life process model repositories contain numerous duplicate fragments. Experiments on two industrial datasets confirm the usefulness of our technique.
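
The core storage idea can be sketched as content-addressed fragment storage: each fragment is stored once under its content hash, and a model version is simply a list of hashes, so unchanged fragments are shared across versions. This is a hedged simplification, not the paper's actual schema; change propagation and fragment decomposition are omitted.

```python
# Toy fragment-level version store: shared fragments are stored only once.
import hashlib

fragment_store = {}    # content hash -> fragment body (deduplicated)
model_versions = {}    # (model id, version) -> ordered fragment hashes

def put_fragment(body: str) -> str:
    h = hashlib.sha1(body.encode("utf-8")).hexdigest()
    fragment_store.setdefault(h, body)
    return h

def commit_version(model_id: str, version: int, fragments: list[str]) -> None:
    model_versions[(model_id, version)] = [put_fragment(f) for f in fragments]

commit_version("claims", 1, ["register claim", "assess claim", "pay out"])
commit_version("claims", 2, ["register claim", "assess claim", "reject or pay"])
print(len(fragment_store))   # 4: only the changed fragment is stored anew
```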

Relevance:

20.00%

Publisher:

Abstract:

Recent years have seen an increased uptake of business process management technology in industry. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models; finding and adapting these models may thus be more effective and less error-prone than developing them from scratch. Since process model repositories may be large, query evaluation may be time-consuming. Hence, we investigate the use of indexes to speed up this evaluation process. To make our approach more widely applicable, we take the semantic similarity between labels into account. Experiments are conducted to demonstrate that our approach is efficient.
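
To illustrate the flavour of similarity-aware querying, the sketch below keeps an inverted index over activity labels and answers a query by accepting labels whose similarity to the query exceeds a threshold. A plain string-similarity ratio stands in for the semantic (e.g. synonym-based) label similarity used in the approach, and the threshold value is an assumption.

```python
# Toy similarity-aware label query over a model collection.
from collections import defaultdict
from difflib import SequenceMatcher

label_index = defaultdict(set)     # activity label -> model ids containing it

def index_model(model_id, labels):
    for label in labels:
        label_index[label.lower()].add(model_id)

def query(label, threshold=0.8):
    hits = set()
    for indexed, models in label_index.items():
        if SequenceMatcher(None, label.lower(), indexed).ratio() >= threshold:
            hits |= models
    return hits

index_model("m1", ["Verify invoice", "Approve payment"])
index_model("m2", ["Check invoice", "Approve payments"])
print(query("Approve payment"))    # {'m1', 'm2'}: exact and near-identical labels
```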

Relevance:

20.00%

Publisher:

Abstract:

Business process models are becoming available in large numbers due to their popular use in many industrial applications, such as enterprise and quality engineering projects. On the one hand, this raises a challenge as to their proper management: how can it be ensured that the proper process model is always available to the interested stakeholder? On the other hand, the richness of a large set of process models also offers opportunities, for example with respect to the re-use of existing model parts in new models. This paper describes the functionalities and architecture of an advanced process model repository, named APROMORE. This tool brings together a rich set of features for the analysis, management and usage of large sets of process models, drawing on state-of-the-art research in the field of process modeling. A prototype of the platform is presented, demonstrating its feasibility, together with an outlook on the further development of APROMORE.

Relevance:

20.00%

Publisher:

Abstract:

Climate change, which is largely caused by human activities, is becoming increasingly apparent; among those activities are the asset management processes, from planning to disposal, applied to property and infrastructure. One essential component of the asset management process is asset identification. The aims of the study are to identify the information needed for asset identification and inventory, as one step of the public asset management process, in addressing climate change; and to examine its deliverability in the local governments of developing countries. To achieve these aims, this study employs a case study in Indonesia, covering a single medium-sized provincial government. Information was gathered through interviews with local government representatives in South Sulawesi Province, Indonesia, and through analysis of documents provided by the interview participants. The study found that, for local government, improving the system for managing assets is one of the biggest emerging challenges. Having the right information in the right place at the right time is a critical factor in responding to this challenge. Asset identification, as the frontline step in a public asset management system, therefore plays an important and critical role. Furthermore, an asset identification system should be developed to support the mainstreaming of adaptation to climate change vulnerability and to help local government officers be environmentally sensitive. Finally, the findings from this study provide useful input for policy makers, scholars and asset management practitioners in developing an asset inventory system as part of a public asset management process that addresses climate change.

Relevance:

20.00%

Publisher:

Abstract:

The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices store data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that progress in this form of data storage is approaching fundamental limits; the main limitation relates to the smallest size an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify them slightly for higher-density storage. Alternatively, three-dimensional optical data storage is a promising field for the information storage needs of the future, offering very high-density, high-speed memory. There are two ways in which data may be recorded in a three-dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three-dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but few commercial products are presently available. Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two-dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium has its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal lithium niobate (LiNbO3). First, the experimental methods for storing two-dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask containing a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low-intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing.
Since the charge carriers in the medium are still sensitive to the readout light field, the readout beam erases the stored information. One method of avoiding this is thermal fixing, in which the photorefractive medium is heated to temperatures above 150 °C; this process forms an ionic grating in the medium. The ionic grating is insensitive to the readout beam, so the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, so that by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could readily be applied where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature at various locations in the crystal can be monitored simultaneously; this could be used to deduce temperature gradients in the medium. It is shown that the three-dimensional nature of the recording medium causes interesting degradation effects when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times the process can result in the complete deterioration of the information in the medium. It is shown that, simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower-magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered. The degradation is shown to be reversed during the recovery process, and a simple relationship is found relating the times at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any smaller size results in incomplete recovery. The degradation and recovery process could find application in image scrambling or cryptography for optical information storage. A two-dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular to the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest-quality pattern storage would be achieved with a thin 0.5 mm medium; however, such a medium would also eliminate the degradation of the patterns and the subsequent recovery process.
To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three-dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage process, particularly the degradation and recovery dynamics, and confirms that recovery of the degraded patterns is possible because the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, a detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam greater than 150 µm result in the formation of different types of refractive index changes compared with stripes of smaller widths. As a result, the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm for accurate and reliable pattern storage.
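
As a rough illustration of what an FD-BPM calculation involves, the sketch below propagates a beam through a medium carrying a weak induced refractive index change, using a Crank-Nicolson step for the paraxial wave equation plus a split-step phase term. All parameter values (wavelength, background index, the profile and magnitude of dn) are assumptions for illustration; the thesis model, and Devaux's fully three-dimensional photorefractive model, are far more detailed.

```python
# Simplified FD-BPM sketch: Crank-Nicolson diffraction step for the paraxial
# equation dE/dz = (i/2k) d2E/dx2, plus a phase step from an assumed index
# perturbation dn(x). Illustrative only.
import numpy as np

wl, n0 = 633e-9, 2.2                     # assumed wavelength and LiNbO3-like index
k = 2 * np.pi * n0 / wl
nx, dx, dz = 200, 0.5e-6, 1e-6
x = (np.arange(nx) - nx // 2) * dx

E = np.exp(-(x / 20e-6) ** 2).astype(complex)   # Gaussian input beam
dn = -1e-4 * (np.abs(x) < 15e-6)                # assumed stripe-like index change

alpha = 1j * dz / (4 * k * dx**2)
off = alpha * np.ones(nx - 1)
A = np.diag(np.full(nx, 1 + 2 * alpha)) - np.diag(off, 1) - np.diag(off, -1)
B = np.diag(np.full(nx, 1 - 2 * alpha)) + np.diag(off, 1) + np.diag(off, -1)

for _ in range(200):                      # propagate 0.2 mm in 1 um steps
    E = np.linalg.solve(A, B @ E)         # implicit diffraction part of the step
    E *= np.exp(1j * (2 * np.pi / wl) * dn * dz)   # phase from the index change

print(f"on-axis intensity after propagation: {abs(E[nx // 2])**2:.3f}")
```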

Relevance:

20.00%

Publisher:

Abstract:

Aim Australian residential aged care does not have a system of quality assessment related to clinical outcomes, or comprehensive quality benchmarking. The Residential Care Quality Assessment was developed to fill this gap, and this paper discusses the process by which preliminary benchmarks representing high and low quality were developed for it. Methods Data were collected from all residents (n = 498) of nine facilities. Numerator–denominator analysis of clinical outcomes was carried out at the facility level, with rank-ordered results circulated to an expert panel. The panel identified threshold scores to indicate excellent and questionable care quality, and refined these through a Delphi process. Results Clinical outcomes varied both within and between facilities; agreed thresholds for excellent and poor outcomes were finalised after three Delphi rounds. Conclusion Use of the Residential Care Quality Assessment provides a concrete means of monitoring care quality and allows benchmarking across facilities; its regular use could contribute to improved care outcomes within residential aged care in Australia.
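
As a small illustration of the numerator–denominator step, the sketch below computes and rank-orders facility-level rates for a single hypothetical indicator. The facility names and counts are invented; in the study, the ranked list would then go to the expert panel for threshold setting.

```python
# Toy numerator-denominator analysis: facility-level outcome rates, rank-ordered.
counts = {                 # facility -> (residents with the outcome, residents assessed)
    "Facility A": (6, 52),
    "Facility B": (14, 61),
    "Facility C": (3, 48),
}

rates = {name: num / den for name, (num, den) in counts.items()}
for name, rate in sorted(rates.items(), key=lambda kv: kv[1]):
    print(f"{name}: {rate:.1%}")
```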

Relevance:

20.00%

Publisher:

Abstract:

Peer-to-Patent Australia will initially run as a 12-month pilot project designed to test whether an open community of reviewers can effectively locate prior art that might not otherwise be located by the patent office during a typical examination. Patent applications will be made available for peer review for a period of 6 months, followed by a 6-month period of joint qualitative and quantitative assessment of the pilot project by IP Australia and QUT. The objective of Peer-to-Patent Australia is to improve the patent examination process and the quality of issued patents by utilising the knowledge and skills of experts in the broader community. It is a way of linking the scientific and technical expertise of anyone with an Internet connection to the expertise of a patent examiner. Community participation consists of members of the public reviewing patent applications and contributing relevant prior art references and comments within a web-based forum. The aim is to bring to light prior art, particularly non-patent prior art, that might otherwise not be identified by patent examiners. The better the prior art resources a patent examiner has at his or her disposal, the more likely it is that a patent application will be assessed properly in terms of novelty and inventive step. The role of Peer-to-Patent Australia in this regard is to act as both a facilitator of discussion and a collector of prior art submissions. Peer-to-Patent Australia collects relevant prior art references on behalf of the reviewing community and forwards that prior art to IP Australia. Section 27 of the Patents Act 1990 (Cth) allows the Commissioner of Patents to receive submissions of prior art by third parties relevant to the novelty and inventiveness of a particular patent application.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: The aim of this study was to investigate the capabilities of laser scanning confocal microscopy (LSCM) for undertaking qualitative and quantitative investigations of the response of the bulbar conjunctiva to contact lens wear. Methods: LSCM was used to observe and measure morphological characteristics of the bulbar conjunctiva of 11 asymptomatic soft contact lens wearers and 11 healthy volunteers (controls). Results: The appearance of the bulbar conjunctiva was consistent with the known histology of this tissue based on light and electron microscopy. The bulbar conjunctival epithelium of lens wearers (30.9 ± 1.1 μm) was thinner than that of controls (32.9 ± 1.1 μm) (P < 0.0001). Superficial and basal bulbar conjunctival epithelial cell densities in contact lens wearers were 91% and 79% higher, respectively, than those in controls (P < 0.0001). No difference was observed in goblet or Langerhans cell density between lens wearers and controls. Conjunctival microcysts were observed in greater numbers, and were larger in size, in lens wearers compared with controls. Conclusions: The effects of contact lens wear on the human bulbar conjunctiva can be investigated effectively at the cellular level using LSCM. The observations in this study suggest that contact lens wear can induce changes in the bulbar conjunctiva such as epithelial thinning, increased epithelial cell density, and accelerated formation and enlargement of microcysts, but has no impact on goblet or Langerhans cell density.

Relevance:

20.00%

Publisher:

Abstract:

Process modeling is a central element in any approach to Business Process Management (BPM). However, what hinders both practitioners and academics is the lack of support for assessing the quality of process models, let alone for realizing high-quality process models. Existing frameworks are either highly conceptual or too general. At the same time, various techniques, tools, and research results are available that cover fragments of the issue at hand. This chapter presents the SIQ framework, which on the one hand integrates concepts and guidelines from existing frameworks, and on the other hand links these concepts to current research in the BPM domain. Three different types of quality are distinguished, and for each of these, concrete metrics, available tools, and guidelines are provided. While the basis of the SIQ framework is thought to be rather robust, its external pointers can be updated with newer insights as they emerge.

Relevance:

20.00%

Publisher:

Abstract:

In response to the growing proliferation of Business Process Management (BPM) in industry and the demand this creates for BPM expertise, universities across the globe are at various stages of incorporating BPM knowledge and skills into their teaching offerings. However, there are still only a handful of institutions that offer specialized education in BPM in a systematic and in-depth manner. This article is based on a global educators' panel discussion held at the 2009 European Conference on Information Systems in Verona, Italy. The article presents the BPM programs of five universities from Australia, Europe, Africa, and North America, describing the BPM content covered, program and course structures, and challenges and lessons learned. The article also provides a comparative content analysis of the BPM education programs, illustrating a heterogeneous view of BPM. The examples presented demonstrate how different courses and programs can be developed to meet the educational goals of a university department, program, or school. This article contributes insights on how best to continuously sustain and reshape BPM education so that it remains dynamic, responsive, and sustainable in light of ever-changing marketplace demands for BPM expertise.

Relevance:

20.00%

Publisher:

Abstract:

Aim/hypothesis Immune mechanisms have been proposed to play a role in the development of diabetic neuropathy. We employed in vivo corneal confocal microscopy (CCM) to quantify the presence and density of Langerhans cells (LCs) in relation to the extent of corneal nerve damage in Bowman's layer of the cornea in diabetic patients. Methods 128 diabetic patients aged 58±1 yrs with differing severity of neuropathy based on the Neuropathy Deficit Score (NDS 4.7±0.28) and 26 control subjects aged 53±3 yrs were examined. Subjects underwent a full neurological evaluation, evaluation of corneal sensation with non-contact corneal aesthesiometry (NCCA), and assessment of corneal nerve morphology using CCM. Results The proportion of individuals with LCs was significantly increased in diabetic patients (73.8%) compared to control subjects (46.1%), P=0.001. Furthermore, LC density (no./mm²) was significantly increased in diabetic patients (17.73±1.45) compared to control subjects (6.94±1.58), P=0.001, and there was a significant correlation with age (r=0.162, P=0.047) and severity of neuropathy (r=−0.202, P=0.02). There was a progressive decrease in corneal sensation with increasing severity of neuropathy assessed using the NDS in the diabetic patients (r=0.414, P=0.000). Corneal nerve fibre density (P<0.001), branch density (P<0.001) and length (P<0.001) were significantly decreased, whilst tortuosity (P<0.01) was increased, in diabetic patients with increasing severity of diabetic neuropathy. Conclusion Utilising in vivo corneal confocal microscopy, we have demonstrated increased LCs in diabetic patients, particularly in the earlier phases of corneal nerve damage, suggestive of an immune-mediated contribution to corneal nerve damage in diabetes.