974 results for PARTIAL-FILLING TECHNIQUE


Relevance:

20.00%

Publisher:

Abstract:

Technologies and languages for integrated processes are a relatively recent innovation. Over that short period, many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and to stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model. This can lead to an inability to reason about long-standing conversations at the process layer. Technologies and languages for process integration generally lack formality. This has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and these can lead to customer dissatisfaction and the failure of integration projects to reach their potential. Standards for process integration share fundamental flaws with these languages and technologies. Standards are also in direct competition with one another, causing a lack of clarity. Thus the area of greatest risk in a BPM project remains process integration, despite major advancements in the technology base. This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner. This way, process modelling can be conceptually complete without becoming stuck in a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware. They provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling, a formal modelling technique, was used to define the syntax of a proposed process integration language. This thesis provides several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware. This framework provides a conceptual foundation upon which a process integration language could be built. The thesis defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. It provides a comprehensive set of process integration patterns, which constitute a benchmark for the kinds of problems a process integration language must support. The thesis also proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital, based on ideas in this thesis, is briefly described at the end.
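
To make the Coloured Petri net idea concrete, here is a minimal sketch (not the thesis' formalism) of giving queue-based middleware a token-game semantics in Python: a channel is a place holding coloured tokens, and send/receive are transitions that fire only when enabled. The place name, colours and payloads are illustrative assumptions.

```python
# Minimal sketch: a queue-style middleware channel as a tiny coloured Petri net.
from collections import deque

class Channel:
    """A 'place' holding coloured tokens (messages) in FIFO order,
    mimicking queue-based middleware."""
    def __init__(self):
        self.tokens = deque()

def send(channel, colour, payload):
    # 'send' transition: adds a coloured token to the channel place.
    channel.tokens.append((colour, payload))

def receive(channel, colour):
    # 'receive' transition: enabled only if a token of the requested colour
    # is at the head of the place; otherwise the transition cannot fire.
    if channel.tokens and channel.tokens[0][0] == colour:
        return channel.tokens.popleft()
    return None  # transition not enabled

ch = Channel()
send(ch, "request", {"order": 42})
print(receive(ch, "request"))   # ('request', {'order': 42})
print(receive(ch, "reply"))     # None: transition not enabled
```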

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an ultrasonic velocity measurement method for investigating the possible effects of high-voltage, high-frequency pulsed power on the elasticity of cortical bone material. Before applying a pulsed power signal to live bone, it is essential to determine, non-destructively, the safe parameters of pulsed power applied to bone. Therefore, the possible changes in cortical bone material elasticity due to a specified pulsed power excitation have been investigated. A controllable positive buck-boost converter with adjustable output voltage and frequency was used to generate high-voltage pulses (500 V magnitude at 10 kHz). To determine bone elasticity, ultrasonic velocity measurements were conducted on two groups of cortical bone samples: a control group (unexposed to pulsed power but kept in the same environmental conditions) and a group exposed to pulsed power. Young’s modulus of the cortical bone samples was determined and compared before and after applying the pulsed power signal. After applying the high-voltage pulses, no significant variation in the elastic properties of the cortical bone specimens was found compared to the control. The results show that pulsed power with the nominated parameters can be applied to cortical bone tissue without any considerable negative effect on the elasticity of the bone material.
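
As background to the method, ultrasonic elasticity testing of this kind commonly rests on the bar-wave relation E = ρv², with the velocity obtained from the specimen thickness and the pulse-echo transit time. The sketch below uses that relation with purely illustrative sample values, not the paper's measurements.

```python
# Hedged sketch of the elasticity calculation behind ultrasonic velocity
# testing: for a longitudinal wave in a slender specimen, E ~ rho * v^2
# (bar-wave approximation). Sample values are illustrative only.
def youngs_modulus(thickness_m, transit_time_s, density_kg_m3):
    velocity = 2.0 * thickness_m / transit_time_s  # pulse-echo round trip
    return density_kg_m3 * velocity**2             # E = rho * v^2 (Pa)

# e.g. a 5 mm sample, ~2.5 us round trip, density ~1900 kg/m^3 (all assumed)
E_before = youngs_modulus(5e-3, 2.50e-6, 1900)
E_after  = youngs_modulus(5e-3, 2.52e-6, 1900)
change = (E_after - E_before) / E_before * 100
print(f"E before: {E_before/1e9:.1f} GPa, after: {E_after/1e9:.1f} GPa "
      f"({change:+.1f}% change)")
```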

Relevance:

20.00%

Publisher:

Abstract:

This research is one of several ongoing studies conducted within the IT Professional Services (ITPS) research programme at Queensland University of Technology (QUT). In 2003, ITPS introduced the IS-Impact model, a measurement model for measuring information systems success from the viewpoint of multiple stakeholders. The model, along with its instrument, is robust, simple, yet generalisable, and yields results that are comparable across time, stakeholders, different systems and system contexts. The IS-Impact model is defined as “a measure at a point in time, of the stream of net benefits from the Information System (IS), to date and anticipated, as perceived by all key-user-groups”. The model represents four dimensions: ‘Individual Impact’, ‘Organizational Impact’, ‘Information Quality’ and ‘System Quality’. The two Impact dimensions measure the up-to-date impact of the evaluated system, while the remaining two Quality dimensions act as proxies for probable future impacts (Gable, Sedera & Chan, 2008). To fulfil the ITPS goal of developing “the most widely employed model”, this research re-validates and extends the IS-Impact model in a new context. This method/context-extension research aims to test the generalisability of the model by addressing its known limitations. One of these limitations concerns the extent of the model’s external validity. In order to gain wide acceptance, a model should be consistent and work well in different contexts. The IS-Impact model, however, was only validated in the Australian context, with packaged software chosen as the IS under study. Thus, this study is concerned with whether the model can be applied in a different context. Aiming for a robust and standardised measurement model that can be used across different contexts, this research re-validates and extends the IS-Impact model and its instrument to public sector organisations in Malaysia. The overarching research question (managerial question) of this research is “How can public sector organisations in Malaysia measure the impact of information systems systematically and effectively?” With two main objectives, the managerial question is broken down into two specific research questions. The first research question addresses the applicability (relevance) of the dimensions and measures of the IS-Impact model in the Malaysian context, as well as the completeness of the model in the new context. Initially, this research assumes that the dimensions and measures of the IS-Impact model are sufficient for the new context. However, some IS researchers suggest that the selection of measures needs to be made purposefully for different contextual settings (DeLone & McLean, 1992; Rai, Lang & Welker, 2002). Thus, the first research question is: “Is the IS-Impact model complete for measuring the impact of IS in Malaysian public sector organisations?” [RQ1]. The IS-Impact model is a multidimensional model that consists of four dimensions or constructs. Each dimension is represented by formative measures or indicators. Formative measures are known as composite variables because these measures make up, or form, the construct, or, in this case, the dimension in the IS-Impact model. These formative measures define different aspects of the dimension; thus, a measurement model of this kind needs to be tested not just on the structural relationships between the constructs but also on the validity of each measure.
In a previous study, the IS-Impact model was validated using formative validation techniques, as proposed in the literature (e.g., Diamantopoulos & Winklhofer, 2001; Diamantopoulos & Siguaw, 2006; Petter, Straub & Rai, 2007). However, there is potential for improving the validation testing of the model by adding more criterion or dependent variables, including identifying a consequence of the IS-Impact construct for the purpose of validation. Moreover, a different approach is employed in this research, whereby the validity of the model is tested using the Partial Least Squares (PLS) method, a component-based structural equation modelling (SEM) technique. Thus, the second research question addresses the construct validation of the IS-Impact model: “Is the IS-Impact model valid as a multidimensional formative construct?” [RQ2]. This study employs two rounds of surveys, each having a different and specific aim. The first is qualitative and exploratory, aiming to investigate the applicability and sufficiency of the IS-Impact dimensions and measures in the new context. This survey was conducted in a state government in Malaysia. A total of 77 valid responses were received, yielding 278 impact statements. The results from the qualitative analysis demonstrate the applicability of most of the IS-Impact measures. The analysis also shows that a significant new measure emerged from the context; this was added as one of the System Quality measures. The second survey is a quantitative survey that aims to operationalise the measures identified from the qualitative analysis and rigorously validate the model. This survey was conducted in four state governments (including the state government involved in the first survey). A total of 254 valid responses were used in the data analysis. Data were analysed using structural equation modelling techniques, following the guidelines for formative construct validation, to test the validity and reliability of the constructs in the model. This study is the first to extend the complete IS-Impact model to a new context that differs in nationality, language and the type of information system (IS). The main contribution of this research is a comprehensive, up-to-date IS-Impact model that has been validated in the new context. The study has accomplished its purpose of testing the generalisability of the IS-Impact model and continues IS evaluation research by extending it into the Malaysian context. A further contribution is a validated Malaysian-language IS-Impact measurement instrument. It is hoped that the validated Malaysian IS-Impact instrument will encourage related IS research in Malaysia, and that the demonstrated model validity and generalisability will encourage a cumulative tradition of research previously not possible.
The study entailed several methodological improvements on prior work, including: (1) new criterion measures for the overall IS-Impact construct, employed in ‘identification through measurement relations’; (2) a stronger, multi-item ‘Satisfaction’ construct, employed in ‘identification through structural relations’; (3) an alternative version of the main survey instrument in which items are randomised (rather than blocked), for comparison with the main survey data in attention to possible common method variance (no significant differences between the two survey instruments were observed); (4) a demonstrated validation process for formative indexes of a multidimensional, second-order construct (existing examples mostly involve unidimensional constructs); (5) testing for suppressor effects that influence the significance of some measures and dimensions in the model; and (6) a demonstration of the effect of an imbalanced number of measures within a construct on the contribution power of each dimension in a multidimensional model.
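
As a minimal illustration of the kind of formative-construct checks described above (not the thesis' actual analysis), the sketch below runs two standard tests on simulated data: variance inflation factors for indicator collinearity, and a bootstrap of PLS indicator weights against a criterion variable. The indicator count, sample size and data are assumptions.

```python
# Sketch: two standard checks for formative indicators on simulated data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 254                              # mirrors the second survey's sample size
X = rng.normal(size=(n, 4))          # 4 formative indicators (simulated)
y = X @ [0.5, 0.3, 0.2, 0.1] + rng.normal(scale=0.5, size=n)  # criterion

# 1) Collinearity: formative indicators should not be redundant (VIF < ~3-5).
for j in range(X.shape[1]):
    print(f"indicator {j}: VIF = {variance_inflation_factor(X, j):.2f}")

# 2) Weight stability: bootstrap the PLS indicator weights.
weights = []
for _ in range(500):
    idx = rng.integers(0, n, n)
    wk = PLSRegression(n_components=1).fit(X[idx], y[idx]).x_weights_[:, 0]
    weights.append(wk * np.sign(wk[0]))   # fix PLS sign indeterminacy
w = np.array(weights)
print("mean weights:", w.mean(axis=0).round(3))
print("bootstrap SE:", w.std(axis=0).round(3))
```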

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a model-based technique for lowering the entrance barrier for service providers to register services with a marketplace broker, such that the service is rapidly configured to utilize the broker's local service delivery management components. Specifically, it uses process modeling to support the execution steps of a service and shows how service delivery functions (e.g. payment points) "local" to a service broker can be correctly configured into the process model. By formalizing the different operations in a service delivery function (like payment or settlement) and their allowable execution sequences (full payments must follow partial payments), including cross-function dependencies, it shows how, through tool support, a non-technical user can quickly configure service delivery functions in a consistent and complete way.
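
A toy sketch of the formalisation idea: the allowable execution sequences of a payment function expressed as a small finite-state machine that a configuration tool could use to reject inconsistent models. The states and operations are illustrative assumptions, not the paper's actual model.

```python
# Allowable operation orderings of a payment function as a finite-state machine.
ALLOWED = {
    "initiated":      {"partial_payment", "full_payment"},
    "partially_paid": {"partial_payment", "full_payment"},
    "fully_paid":     {"settlement"},
    "settled":        set(),
}
NEXT = {
    "partial_payment": "partially_paid",
    "full_payment":    "fully_paid",
    "settlement":      "settled",
}

def configure(sequence, state="initiated"):
    """Validate a modeller-chosen operation sequence against allowed orderings."""
    for op in sequence:
        if op not in ALLOWED[state]:
            raise ValueError(f"'{op}' not allowed in state '{state}'")
        state = NEXT[op]
    return state

print(configure(["partial_payment", "full_payment", "settlement"]))  # settled
try:
    configure(["settlement"])        # violates: settlement before full payment
except ValueError as err:
    print("rejected:", err)
```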

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To determine likely errors in estimating retinal shape using partial coherence interferometric instruments when no allowance is made for optical distortion. Method: Errors were estimated using Gullstrand’s No. 1 schematic eye and variants, which included a 10 D axial myopic eye, an emmetropic eye with a gradient-index lens, and a 10.9 D accommodating eye with a gradient-index lens. Performance was simulated for two commercial instruments, the IOLMaster (Carl Zeiss Meditec) and the Lenstar LS 900 (Haag-Streit AG). The incident beam was directed towards either the centre of curvature of the anterior cornea (corneal-direction method) or the centre of the entrance pupil (pupil-direction method). Simple trigonometry was used with the corneal intercept and the incident beam angle to estimate the retinal contour. Conics were fitted to the estimated contours. Results: The pupil-direction method gave estimates of retinal contour that were much too flat. The corneal-direction method gave similar results for the IOLMaster and Lenstar approaches. The steepness of the retinal contour was slightly overestimated, the exact effects varying with the refractive error, gradient index and accommodation. Conclusion: These theoretical results suggest that, for field angles ≤ 30°, partial coherence interferometric instruments are of use in estimating retinal shape by the corneal-direction method under the assumptions of a regular retinal shape and no optical distortion. It may be possible to improve on these estimates out to larger field angles by using optical modeling to correct for distortion.
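
For illustration, the intercept-and-conic-fit step can be sketched as follows, assuming (as the simple-trigonometry approach above does) no refraction or distortion. A standard conicoid sag equation x² = 2Rs - (1+Q)s² is fitted by linear least squares; the retinal radius, vertex depth and field angles are toy values, not the paper's schematic-eye results.

```python
# Sketch: fit a conic to simulated retinal intercepts (toy spherical retina).
import numpy as np

R_true, z_vertex = 12.0, 24.0                    # mm (illustrative values)
phi = np.radians(np.arange(-30, 31, 5))          # field angles up to +/- 30 deg
x = R_true * np.sin(phi)                         # transverse intercept
z = (z_vertex - R_true) + R_true * np.cos(phi)   # axial intercept

# Fit the conicoid x^2 = 2*R*s - (1+Q)*s^2, where s is the sag from the vertex.
s = z_vertex - z
A = np.column_stack([2 * s, -s**2])
(R_fit, p_fit), *_ = np.linalg.lstsq(A, x**2, rcond=None)
print(f"vertex radius R = {R_fit:.2f} mm, asphericity Q = {p_fit - 1:.2f}")
# -> R = 12.00 mm, Q = 0.00 for the spherical toy data
```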

Relevance:

20.00%

Publisher:

Abstract:

As civil infrastructure such as bridges ages, there is a concern for safety and a need for cost-effective and reliable monitoring tools. Different diagnostic techniques are available nowadays for structural health monitoring (SHM) of bridges. Acoustic emission is one such technique, with the potential to predict failure. The phenomenon of rapid release of energy within a material through crack initiation or growth, in the form of stress waves, is known as acoustic emission (AE). The AE technique involves recording the stress waves by means of sensors and subsequently analysing the recorded signals, which convey information about the nature of the source. AE can be used as a local SHM technique to monitor specific regions with a visible presence of cracks or crack-prone areas, such as welded regions and joints with bolted connections, or as a global technique to monitor the whole structure. The strength of the AE technique lies in its ability to detect active crack activity, thus helping to prioritise maintenance work by focusing on active rather than dormant cracks. In spite of being a promising tool, some challenges still stand in the way of successful application of the AE technique. One is the large amount of data generated during testing; hence effective data analysis and management are necessary, especially for long-term monitoring. Complications also arise because a number of spurious sources can give AE signals; therefore, different source discrimination strategies are necessary to distinguish genuine signals from spurious ones. Another major challenge is the quantification of damage level by appropriate analysis of data. Intensity analysis using severity and historic indices, as well as b-value analysis, are important methods; they are discussed and applied to laboratory experimental data in this paper.
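
Of the damage-quantification methods named above, the b-value analysis admits a compact sketch: AE hit amplitudes in dB are converted to Gutenberg-Richter-style magnitudes (m = A/20) and the b-value is estimated with Aki's maximum-likelihood formula. The data below are simulated, and the amplitude-to-magnitude scaling is the conventional AE adaptation rather than anything specific to this paper.

```python
# Sketch: AE b-value from hit peak amplitudes via Aki's MLE on m = A/20.
import numpy as np

rng = np.random.default_rng(1)
amplitudes_db = 40 + rng.exponential(scale=12.0, size=500)  # simulated AE hits

def ae_b_value(amps_db, threshold_db=40.0):
    m = amps_db[amps_db >= threshold_db] / 20.0   # amplitude -> magnitude
    m_min = threshold_db / 20.0
    return np.log10(np.e) / (m.mean() - m_min)    # Aki (1965) MLE

print(f"b-value = {ae_b_value(amplitudes_db):.2f}")
# Falling b-values over successive windows typically signal the transition
# from distributed microcracking to localised macrocrack growth.
```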

Relevance:

20.00%

Publisher:

Abstract:

A lectin detected in haemolymph from the Australian spiny lobster Panulirus cygnus agglutinated human ABO Group A cells to a higher titre than Group O or B. The lectin also agglutinated rat and sheep erythrocytes, with reactivity with rat erythrocytes strongly enhanced by treatment with the proteolytic enzyme papain, an observation consistent with reactivity via a glycolipid. The lectin, purified by affinity chromatography on fixed rat-erythrocyte stroma, was inhibited equally by N-acetylglucosamine and N-acetylgalactosamine. Comparison of data from gel filtration of haemolymph (behaving as a 1,800,000 Da macromolecule) and polyacrylamide gel electrophoresis of the purified lectin (a single 67,000 Da band) suggested that in haemolymph the lectin was a multimer. The purified anti-A lectin autoprecipitated unless the storage solution contained chaotropic inhibitors (125 mmol/L sucrose; 500 mmol/L urea). The properties of this anti-A lectin and other similar lectins are consistent with a role in innate immunity in these invertebrates.

Relevance:

20.00%

Publisher:

Abstract:

Background: In-depth investigations of crash risks inform prevention and safety promotion programmes. Traditionally, such investigations are conducted using exposure-controlled or case-control methodology. However, these studies need either observational data for control cases or exogenous exposure data, like vehicle-kilometres travelled, entry flow, or the product of conflicting flows for a particular traffic location or site. These data are not readily available and often require extensive data collection on a system-wide basis. Aim: The objective of this research is to propose an alternative methodology for investigating the crash risks of a road user group in different circumstances using readily available traffic police crash data. Methods: This study employs a combination of a log-linear model and the quasi-induced exposure technique to estimate the crash risks of a road user group. While the log-linear model reveals the significant interactions, and thus the prevalence of crashes of a road user group under various sets of traffic, environmental and roadway factors, the quasi-induced exposure technique estimates the relative exposure of that road user under the same set of explanatory variables. Therefore, the combination of these two techniques provides relative measures of crash risk under various influences of roadway, environmental and traffic conditions. The proposed methodology is illustrated using five years of Brisbane motorcycle crash data. Results: Interpretation of the results for different combinations of interacting factors shows that the poor conspicuity of motorcycles is a predominant cause of motorcycle crashes. The inability of other drivers to correctly judge the speed and distance of an oncoming motorcyclist is also evident in right-of-way violation motorcycle crashes at intersections. Discussion and Conclusions: The combination of a log-linear model and the quasi-induced exposure technique is a promising methodology that can be applied to better estimate the crash risks of other road users. This study also highlights the importance of considering interaction effects to better understand hazardous situations. A further study comparing the proposed methodology with the case-control method would be useful.
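
The quasi-induced exposure step can be illustrated with a toy calculation: not-at-fault crash involvements proxy a group's exposure, so the relative crash involvement ratio compares a group's share of at-fault crashes with its not-at-fault share. The counts below are hypothetical, not Brisbane data.

```python
# Sketch of quasi-induced exposure with made-up crash counts.
at_fault     = {"motorcycle": 120, "car": 880}   # hypothetical counts
not_at_fault = {"motorcycle":  60, "car": 940}   # proxy for exposure

def relative_involvement_ratio(group):
    af_share  = at_fault[group] / sum(at_fault.values())
    naf_share = not_at_fault[group] / sum(not_at_fault.values())
    return af_share / naf_share   # >1 => over-involved relative to exposure

for g in at_fault:
    print(f"{g}: RCIR = {relative_involvement_ratio(g):.2f}")
# motorcycle: RCIR = 2.00 -> over-represented given its exposure
```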

Relevance:

20.00%

Publisher:

Abstract:

Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated through the quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.
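
One common proactive measure of a critical vessel interaction (offered here as an illustrative sketch, not necessarily the paper's exact risk model) is the closest point of approach, computed from two vessels' positions and velocities; a small DCPA at a near TCPA flags a critical interaction. All values below are illustrative.

```python
# Sketch: closest point of approach for two vessels on straight-line courses.
import numpy as np

def cpa(p_own, v_own, p_other, v_other):
    """Return (TCPA in hours, DCPA in nautical miles)."""
    r = np.asarray(p_other, float) - np.asarray(p_own, float)  # relative position
    v = np.asarray(v_other, float) - np.asarray(v_own, float)  # relative velocity
    if np.allclose(v, 0):
        return np.inf, float(np.linalg.norm(r))    # no relative motion
    tcpa = -np.dot(r, v) / np.dot(v, v)
    dcpa = float(np.linalg.norm(r + v * max(tcpa, 0.0)))
    return tcpa, dcpa

# Two vessels 2-3 nm apart on crossing courses at 10 and 12 knots.
tcpa, dcpa = cpa((0, 0), (10, 0), (2, 2), (0, -12))
print(f"TCPA = {tcpa*60:.1f} min, DCPA = {dcpa:.2f} nm")  # small DCPA => critical
```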

Relevance:

20.00%

Publisher:

Abstract:

In this study, magnetohydrodynamic natural convection boundary layer flow of an electrically conducting, viscous, incompressible fluid along a heated vertical flat plate with uniform heat and mass flux in the presence of a strong cross magnetic field has been investigated. For smooth integration, the boundary layer equations are transformed into a convenient dimensionless form using both the stream function formulation and the free variable formulation. The nonsimilar parabolic partial differential equations are integrated numerically for Pr ≪ 1, appropriate for liquid metals, against the local Hartmann parameter ξ. Further, asymptotic solutions are obtained near the leading edge using the regular perturbation method for smaller values of ξ. Solutions for ξ ≫ 1 are also obtained by employing the matched asymptotic technique. The results obtained for the small, large and all-ξ regimes are examined in terms of the shear stress, τw, the rate of heat transfer, qw, and the rate of mass transfer, mw, for the important physical parameters. Attention has been given to the influence of the Schmidt number, Sc, the buoyancy ratio parameter, N, and the local Hartmann parameter, ξ, on the velocity, temperature and concentration distributions; it is noted that the velocity and temperature of the fluid approach their asymptotic profiles for Sc ≥ 10.0.

Relevance:

20.00%

Publisher:

Abstract:

Numerical simulations of mixed convection of a micropolar fluid in an open-ended arc-shape cavity have been carried out in this study. Computation is performed using the Alternating Direction Implicit (ADI) method together with the Successive Over-Relaxation (SOR) technique for the solution of the governing partial differential equations. The flow phenomenon is examined for a range of values of the Rayleigh number, 10² ≤ Ra ≤ 10⁶, Prandtl number, 7 ≤ Pr ≤ 50, and Reynolds number, 10 ≤ Re ≤ 100. The study is mainly focused on how the micropolar fluid parameters affect the fluid properties in the flow domain. It was found that, despite the reduction of flow in the core region, the heat transfer rate increases, whereas the skin friction and microrotation decrease, with increase in the vortex viscosity parameter, Δ.
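
The SOR component of such a solver is easy to sketch. The toy example below applies SOR to a model Poisson problem on a unit square; the arc-shaped cavity geometry, ADI splitting and micropolar terms of the actual study are omitted, so the grid size, relaxation factor and source term are all illustrative.

```python
# Sketch: Successive Over-Relaxation for -laplacian(u) = f on a unit square.
import numpy as np

n, omega, tol = 41, 1.8, 1e-6     # grid points, relaxation factor, tolerance
h = 1.0 / (n - 1)
u = np.zeros((n, n))              # unknown field, Dirichlet u = 0 on the walls
f = np.ones((n, n))               # source term

for sweep in range(10_000):
    max_change = 0.0
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            gs = 0.25 * (u[i+1, j] + u[i-1, j] + u[i, j+1] + u[i, j-1]
                         + h * h * f[i, j])           # Gauss-Seidel value
            new = (1 - omega) * u[i, j] + omega * gs  # over-relaxation
            max_change = max(max_change, abs(new - u[i, j]))
            u[i, j] = new
    if max_change < tol:
        break
print(f"converged in {sweep + 1} sweeps, centre value = {u[n//2, n//2]:.5f}")
```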

Relevance:

20.00%

Publisher:

Abstract:

The Six Sigma technique is one of the quality management strategies utilised for improving quality and productivity in the manufacturing process. It is inspired by Deming’s Plan-Do-Check-Act (PDCA) cycle and comprises two major project methodologies, DMAIC and DMADV, each consisting of five phases. The DMAIC project methodology is used comprehensively in this research. In brief, DMAIC is utilised for improving an existing manufacturing process and involves the phases Define, Measure, Analyse, Improve, and Control. The mask industry has become significant in today’s society since the outbreak of serious diseases such as Severe Acute Respiratory Syndrome (SARS), bird flu, influenza, swine flu and hay fever. Protecting the respiratory system has therefore become a fundamental requirement for preventing respiratory diseases. The mask is the most appropriate protective product inasmuch as it is effective in protecting the respiratory tract and resisting virus infection through the air. In order to satisfy various customers’ requirements, thousands of mask products are designed for the market. Moreover, masks are also widely used in industries including the medical, semiconductor, food, traditional manufacturing, and metal industries. Since masks are used to prevent dangerous diseases and safeguard people, their quality is a priority, and quality improvement techniques are of very high significance in the mask industry. The purpose of this research project is firstly to investigate the current quality control practices in a mask industry, then to explore the feasibility of using the Six Sigma technique in that industry, and finally to implement the Six Sigma technique in the case company to develop and evaluate the product quality process. This research mainly investigates the quality problems of the mask industry and the effectiveness of the Six Sigma technique in that industry, with the United Excel Enterprise Corporation (UEE) Company as the case company. The DMAIC project methodology of the Six Sigma technique is adopted and developed in this research. This research makes a significant contribution to knowledge. First, the main results contribute to discovering the root causes of quality problems in a mask industry. Secondly, the company was able to increase not only its acceptance rate but also its quality level by utilising the Six Sigma technique; hence, utilising the Six Sigma technique could increase the production capacity of the company. Thirdly, the Six Sigma technique needs to be extensively modified to improve quality control in the mask industry. The impact of the Six Sigma technique on the overall performance of the business organisation should be further explored in future research.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we consider the variable-order time fractional diffusion equation. We adopt the Coimbra variable-order (VO) time fractional operator, which defines a consistent method for VO differentiation of physical variables. The Coimbra variable-order fractional operator can also be viewed as a Caputo-type definition. Although this definition is the most appropriate for physical modeling, having the fundamental characteristics desirable for that purpose, numerical methods for fractional partial differential equations using this definition have not yet appeared in the literature. Here an approximate scheme is first proposed. The stability, convergence and solvability of this numerical scheme are discussed via the technique of Fourier analysis. Numerical examples are provided to show that the numerical method is computationally efficient.
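
For a flavour of such a scheme, the sketch below implements an implicit L1-type discretisation of a time-fractional diffusion equation in which the order α is simply re-evaluated at each time level. This is a simplification of the Coimbra variable-order operator, offered under stated assumptions (toy order function, homogeneous Dirichlet boundaries) rather than as the paper's actual scheme.

```python
# Sketch: implicit L1 scheme for D_t^alpha(t) u = K u_xx, u = 0 on boundaries.
import numpy as np
from math import gamma

nx, nt, K = 50, 100, 1.0                 # space steps, time steps, diffusivity
h, tau = 1.0 / nx, 1.0 / nt
x = np.linspace(0.0, 1.0, nx + 1)
alpha = lambda t: 0.8 - 0.1 * t          # illustrative variable order in (0,1)

m = nx - 1                               # interior points
A = (np.diag(-2.0 * np.ones(m)) + np.diag(np.ones(m - 1), 1)
     + np.diag(np.ones(m - 1), -1)) / h**2   # second-difference operator

U = [np.sin(np.pi * x[1:-1])]            # initial condition u(x,0) = sin(pi x)
for n in range(1, nt + 1):
    a = alpha(n * tau)
    sigma = tau**(-a) / gamma(2.0 - a)
    b = [(k + 1)**(1.0 - a) - k**(1.0 - a) for k in range(n)]  # L1 weights
    hist = np.zeros(m)                   # history part of the L1 quadrature
    for k in range(1, n):
        hist += b[k] * (U[n - k] - U[n - k - 1])
    rhs = sigma * (U[n - 1] - hist)
    U.append(np.linalg.solve(sigma * np.eye(m) - K * A, rhs))  # implicit step

print(f"solution peak at t = 1: {U[-1].max():.4f}")  # decays from 1.0
```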

Relevance:

20.00%

Publisher:

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called “omics” disciplines of the biological sciences. Such variability is uncovered by the implementation of multivariable data mining techniques, which fall into two primary categories: machine learning strategies and statistically based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n ≪ p constraint and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This is a problem that might be solved using a statistical model-based approach, where not only is the importance of the individual proteins explicit, but they are also combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
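
The comparison described above can be sketched with standard tooling on synthetic n ≪ p data: PCA-based dimension reduction feeding an interpretable statistical classifier, an SVM on the raw features, and PLS weights as per-feature importances. The dataset shape, component counts and classifiers are illustrative choices, not the paper's exact protocol.

```python
# Sketch: dimension reduction + statistical classifier vs SVM on n << p data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=60, n_features=1000, n_informative=20,
                           random_state=0)      # n << p, as in proteomics

# PCA + LDA: unsupervised reduction, then an interpretable statistical rule.
pca_lda = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())

# SVM on raw features: the machine-learning benchmark.
svm = SVC(kernel="linear")

for name, model in [("PCA+LDA", pca_lda), ("SVM", svm)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: CV accuracy = {acc:.2f}")

# PLS is supervised; its weights give per-protein importances.
pls = PLSRegression(n_components=2).fit(X, y)
top = np.argsort(np.abs(pls.x_weights_[:, 0]))[-5:]
print("most influential features (PLS):", top)
```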