966 results for measurement systems


Relevance: 30.00%

Abstract:

In this paper, we report a systematic investigation of the dependence of both temperature and strain sensitivities on the fiber Bragg grating (FBG) type, including the well-known Type I and Type IIA gratings and a new type which we have designated Type IA, using both hydrogen-free and hydrogenated B/Ge co-doped fibers. We have identified distinct sensitivity characteristics for each grating type and have utilised them to implement a novel dual-grating, dual-parameter sensor device. Three dual-grating sensing schemes with different combinations of grating types have been constructed and compared. The Type IA-Type IIA combination exhibits the best performance, superior to that of previously reported grating-based structures. The characteristics of the measurement errors in such dual-grating sensor systems are also presented in detail.
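The dual-grating, dual-parameter scheme relies on the two grating types having distinct temperature and strain sensitivities, so the two measured wavelength shifts can be inverted through a 2×2 sensitivity matrix. A minimal sketch, with illustrative coefficients that are not taken from the paper:

```python
import numpy as np

# Dual-grating recovery: two Bragg wavelength shifts respond to temperature
# and strain with distinct sensitivities, so both measurands are recovered
# by inverting a 2x2 sensitivity matrix. Coefficients are illustrative only.
K = np.array([[10.0e-3, 1.20e-3],   # grating 1: [nm/degC, nm/microstrain]
              [ 7.0e-3, 0.95e-3]])  # grating 2: [nm/degC, nm/microstrain]

delta_lambda = np.array([0.16, 0.118])  # measured wavelength shifts in nm

temp_strain = np.linalg.solve(K, delta_lambda)
print(temp_strain)  # [delta T in degC, delta strain in microstrain]
```

The scheme works only when the two sensitivity rows are sufficiently different; a near-singular matrix amplifies measurement error, which is presumably why the error characteristics of the dual-grating schemes matter.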

Relevance: 30.00%

Abstract:

Purpose ‐ This study provides empirical evidence for the contextuality of marketing performance assessment (MPA) systems. It aims to introduce a taxonomical classification of MPA profiles based on the relative emphasis placed on different dimensions of marketing performance in different companies and business contexts. Design/methodology/approach ‐ The data used in this study (n=1,157) were collected using a web-based questionnaire, targeted to top managers in Finnish companies. Two multivariate data analysis techniques were used to address the research questions. First, dimensions of marketing performance underlying the current MPA systems were identified through factor analysis. Second, a taxonomy of different profiles of marketing performance measurement was created by clustering respondents based on the relative emphasis placed on the dimensions and characterizing them vis-à-vis contextual factors. Findings ‐ The study identifies nine broad dimensions of marketing performance that underlie the MPA systems in use and five MPA profiles typical of companies of varying sizes in varying industries, market life cycle stages, and competitive positions associated with varying levels of market orientation and business performance. The findings support the previously conceptual notion of contextuality in MPA and provide empirical evidence for the factors that affect MPA systems in practice. Originality/value ‐ The paper presents the first field study of current MPA systems focusing on combinations of metrics in use. The findings of the study provide empirical support for the contextuality of MPA and form a classification of existing contextual systems suitable for benchmarking purposes. Limited evidence for performance differences between MPA profiles is also provided.
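The two-step analysis described (factor analysis, then clustering of respondents on their factor scores) can be sketched as follows on synthetic data. The 9-factor/5-cluster solution is taken from the abstract, but the data, item count, and library choices are placeholders:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

# Step 1: factor analysis recovers dimensions underlying metric-usage
# responses; step 2: clustering on factor scores groups respondents into
# profiles. The survey data below are random stand-ins, not the study's.
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(200, 30)).astype(float)  # 200 managers x 30 items

fa = FactorAnalysis(n_components=9, random_state=0)
scores = fa.fit_transform(responses)          # factor scores per respondent

profiles = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(profiles))                  # respondents per MPA profile
```

In the study itself, the clusters would then be characterised against contextual factors (size, industry, life cycle stage, competitive position), which random data cannot reproduce.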

Relevance: 30.00%

Abstract:

In this paper a new framework is applied to the design of controllers, one that encompasses nonlinearity, hysteresis and arbitrary density functions of forward models and inverse controllers. Using mixture density networks, probabilistic models of both the forward and inverse dynamics are estimated such that they depend on the state and the control input. The optimal control strategy is then derived so as to minimize the uncertainty of the closed-loop system. In the absence of reliable plant models, the proposed control algorithm incorporates uncertainties in model parameters, observations, and latent processes. The local stability of the closed-loop system is established. The efficacy of the control algorithm is demonstrated on two nonlinear stochastic control examples with additive and multiplicative noise.
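The core control idea, choosing the input that minimises the uncertainty of the closed loop, can be illustrated with a deliberately simple stand-in for the mixture density network: a hand-written Gaussian forward model with input-dependent (multiplicative) noise, where the expected squared tracking error trades mean error against predictive variance. Everything below is an invented toy, not the paper's model:

```python
from scipy.optimize import minimize_scalar

# Probabilistic one-step forward model p(y | x, u): state- and
# input-dependent mean and variance (a stand-in for a trained MDN).
def forward_model(x, u):
    mean = 0.8 * x + 0.5 * u            # illustrative linear plant
    var = 0.05 + 0.1 * u**2             # multiplicative (input-dependent) noise
    return mean, var

# Expected squared tracking error: E[(y - target)^2] = (mean - target)^2 + var.
# Large inputs reduce mean error but inflate predictive variance.
def expected_cost(u, x, target):
    mean, var = forward_model(x, u)
    return (mean - target) ** 2 + var

x, target = 1.0, 1.5
res = minimize_scalar(lambda u: expected_cost(u, x, target), bounds=(-5, 5), method="bounded")
print(round(res.x, 3))  # optimal input under the uncertainty trade-off
```

With a genuine mixture density the predictive distribution is multimodal and the cost no longer has this closed quadratic form, but the same variance-penalised objective applies.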

Relevance: 30.00%

Abstract:

This thesis is a study of performance management of Complex Event Processing (CEP) systems. CEP systems have characteristics distinct from other well-studied computer systems, such as batch and online transaction processing systems and database-centric applications, and these characteristics introduce new challenges and opportunities for their performance management. The methodologies used to benchmark CEP systems in many performance studies focus on scaling the injected load without considering the impact of the functional capabilities of the CEP system. This thesis proposes evaluating the performance of CEP engines' functional behaviours on events and develops a benchmark platform for CEP systems: CEPBen. The CEPBen benchmark platform explores the fundamental functional performance of event processing systems: filtering, transformation and event pattern detection. It is also designed to provide a flexible environment for exploring new metrics and influential factors for CEP systems and for evaluating their performance. Studies on factors and new metrics are carried out with the CEPBen platform running on Esper. Different measurement points for response time in the performance management of CEP systems are discussed, and the response time of targeted events is proposed as a quality-of-service metric to be used alongside the traditional response time. Maximum query load is proposed as a capacity indicator with respect to query complexity, and the number of live objects in memory as a performance indicator with respect to memory management. Query depth is studied as a factor that influences CEP system performance.
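As a toy illustration of the "response time of targeted events" idea (not CEPBen itself), one can timestamp events on injection, pass them through a filtering step, and record latency only for the events matching the targeted condition. The threshold and the pattern here are invented:

```python
import time

# Latency is recorded only for events that pass the filter, i.e. only for
# "targeted" events, rather than averaging over everything injected.
def process(events, threshold=50):
    latencies = []
    for value, injected_at in events:
        if value > threshold:                                    # filtering step
            latencies.append(time.perf_counter() - injected_at)  # targeted-event latency
    return latencies

events = [(v, time.perf_counter()) for v in (10, 60, 80, 30)]
lat = process(events)
print(len(lat))  # two events exceed the threshold
```

A real CEP engine such as Esper would express the filter as a continuous query and the platform would aggregate these latencies into the proposed quality-of-service metric.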

Relevance: 30.00%

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica, and their use to characterise several properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system from phantom images and to improve the quality of the OCT images; the characterisation methods include measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive: standard central processing unit (CPU) based processing may take several minutes to a few hours, making data processing a significant bottleneck. An alternative is expensive hardware-based processing such as field programmable gate arrays (FPGAs); more recently, however, graphics processing unit (GPU) based methods have been developed to minimise the processing and rendering time. These include standard processing methods, a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time, with throughput limited by the camera capture rate. The OCT-phantoms have been used extensively for the qualitative characterisation and fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and for accelerating OCT data processing using GPUs. In developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was gained, leading to research that is relevant beyond OCT: extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the fabrication of phase masks, waveguides and microfluidic channels, and the acceleration of data processing with GPUs is also useful in other fields.
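The FD-OCT "standard processing" chain reduces, at its simplest, to background subtraction and an inverse FFT of the detected interference spectrum. The sketch below illustrates this on a synthetic fringe; windowing, k-linearisation and dispersion compensation, which a real pipeline (GPU-accelerated or otherwise) would include, are omitted, and all numbers are illustrative:

```python
import numpy as np

# A synthetic interference spectrum encoding one reflector: a cosine fringe
# whose frequency corresponds to the reflector's depth index. Subtracting
# the background (DC) and taking the inverse FFT yields the A-scan, whose
# magnitude peaks at that depth.
n = 1024
k = np.arange(n)
depth_index = 100
spectrum = 1.0 + 0.5 * np.cos(2 * np.pi * depth_index * k / n)  # detected fringe

a_scan = np.abs(np.fft.ifft(spectrum - spectrum.mean()))
print(int(np.argmax(a_scan[: n // 2])))  # → 100, the reflector's depth index
```

Only the first half of the transform is kept because the spectrum is real-valued, so the A-scan is conjugate-symmetric about the midpoint.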

Relevance: 30.00%

Abstract:

The purpose is to develop expert systems in which reasoning by analogy is used. Knowledge "closeness" problems are known to emerge frequently in such systems when knowledge is represented by different production rules. To determine a degree of closeness for production rules, a distance between predicates is introduced. Different types of distances between two predicate value distribution functions are considered for the case when the predicates are "true". Asymptotic features and interrelations of the distances are studied. Predicate value distribution functions are estimated by empirical distribution functions, and a procedure is proposed for this purpose. The adequacy of the obtained distribution functions is tested on the basis of the statistical χ²-criterion, and a testing mechanism is discussed. For parametric distribution function families, a theorem is proved by which the predicate closeness determination is replaced by a simple procedure measuring Euclidean distances between distribution function parameters. The proposed distance measurement apparatus may be applied in expert systems whose reasoning proceeds by analogy.
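The two kinds of distance discussed, a distance between predicate value distribution functions and, for parametric families, a Euclidean distance between their parameters, can be sketched on synthetic data as follows. The sup-norm (Kolmogorov-type) distance is one plausible choice of distribution-function distance, not necessarily the one used in the paper:

```python
import numpy as np

# Synthetic predicate value samples for two production rules.
rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, 500)
b = rng.normal(0.4, 1.0, 500)

# Distance between the empirical distribution functions (sup norm).
grid = np.linspace(-4, 4, 801)
ecdf = lambda s: (s[:, None] <= grid).mean(axis=0)   # empirical CDF on a grid
sup_dist = np.max(np.abs(ecdf(a) - ecdf(b)))

# For a parametric (here Gaussian) family, the theorem's substitute:
# Euclidean distance between the fitted parameter vectors.
param_dist = np.linalg.norm([a.mean() - b.mean(), a.std() - b.std()])
print(round(sup_dist, 3), round(param_dist, 3))
```

The appeal of the parametric substitute is visible here: it needs only the fitted parameters, avoiding any function-space computation.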

Relevance: 30.00%

Abstract:

A dual-parameter optical sensor has been realized by UV-writing a long-period grating and a Bragg grating structure in D-fiber. The hybrid configuration permits the detection of temperature from the Bragg grating response and of the external refractive index from the long-period grating response. The use of D-fiber allows an effective modification and enhancement of the device sensitivity by cladding etching. The grating sensor has been used to measure the concentrations of aqueous sugar solutions, demonstrating the potential capability to detect concentration changes as small as 0.01%.

Relevance: 30.00%

Abstract:

High-precision manufacturers continuously seek out disruptive technologies to improve the quality, cost, and delivery of their products. With advances in machine tool and measurement technology, many companies are ready to capitalise on the opportunity of on-machine measurement (OMM). Given the business case, manufacturing engineers are now questioning whether OMM can soon eliminate the need for post-process inspection systems. Metrologists will, however, argue that the machining environment is too hostile and that numerous process variables need consideration before traceable measurement on the machine can be achieved. In this paper we test the measurement capability of five new multi-axis machine tools enabled as OMM systems via on-machine probing. All systems are tested under various operating conditions in order to better understand the effects of potentially significant variables. This investigation found that key process variables such as machine tool warm-up and tool-change cycles can affect machine tool measurement repeatability. The new data presented here are important to the many manufacturers who are considering using their high-precision multi-axis machine tools for both the creation and the verification of their products.
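As a toy illustration of the repeatability comparison described above (not the paper's data), repeated on-machine probing of the same feature before and after warm-up can be reduced to a standard deviation per run:

```python
import statistics

# Repeated probed measurements of one feature, in mm; all values invented.
cold = [10.003, 10.007, 10.001, 10.006, 10.004]   # before warm-up cycle
warm = [10.004, 10.005, 10.004, 10.005, 10.004]   # after warm-up cycle

# Repeatability taken as the sample standard deviation of each run; the
# tighter warm-state spread mirrors the warm-up effect reported above.
print(round(statistics.stdev(cold), 4), round(statistics.stdev(warm), 4))
```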

Relevance: 30.00%

Abstract:

Aircraft assembly is the most important part of aircraft manufacturing. A large number of assembly fixtures must be used to ensure assembly accuracy in the aircraft assembly process. Traditional fixed assembly fixtures cannot accommodate changes in aircraft type, so digital flexible assembly fixtures were developed and are gradually being applied in aircraft assembly. Digital flexible assembly technology has thus become one of the research directions in the field of aircraft manufacturing. Aircraft flexible assembly can be divided into three stages: component-level flexible assembly, large-component-level flexible assembly, and large-component alignment and joining. This article introduces the architecture of flexible assembly systems and the principles of the three types of flexible assembly fixtures. The key technologies of digital flexible assembly are also discussed. The digital metrology system provides the basis for accurate digital flexible assembly; aircraft flexible assembly systems mainly use laser tracking metrology systems and indoor Global Positioning System metrology systems. With the development of flexible assembly technology, digital flexible assembly systems will be widely used in aircraft manufacturing.

Relevance: 30.00%

Abstract:

Five-axis machine tools are becoming more common as customers demand more complex machined parts. In high value manufacturing, machine tools are essential to producing high-accuracy products. High-accuracy manufacturing requires producing parts repeatably and in compliance with the defined design specifications. The performance of machine tools is often affected by geometric errors with a variety of causes, including incorrect tool offsets, errors in the centres of rotation and thermal growth; as a consequence, it can be difficult to produce highly accurate parts consistently. It is therefore essential to verify machine tools in terms of their geometric and positioning accuracy. Once machine tools are verified, the resulting numerical values of positional accuracy and process capability can be used to define design-for-verification rules and algorithms, so that machined parts can be produced without scrap and with little or no post-process measurement. In this paper the benefits of machine tool verification are listed, and a case study demonstrates the implementation of robust machine tool performance measurement and diagnostics using a ballbar system.
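A ballbar test of the kind used in the case study samples the radial deviation of the actual circular tool path from the nominal ballbar length. A minimal sketch of the resulting circularity figure, with invented readings:

```python
# Nominal ballbar length and radial readings at sampled angles, in mm;
# all values invented for illustration.
nominal_r = 150.0
readings = [150.004, 150.006, 149.997, 149.995, 150.002, 150.005]

deviations = [r - nominal_r for r in readings]
circularity = max(readings) - min(readings)      # max minus min radial reading
print(round(circularity * 1000, 1))  # circularity error in micrometres → 11.0
```

Diagnostic software goes further, decomposing the deviation trace against angle into signatures of backlash, squareness and servo mismatch, which this sketch does not attempt.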

Relevance: 30.00%

Abstract:

Measurement assisted assembly (MAA) has the potential to facilitate a step change in assembly efficiency for large structures such as airframes, through the reduction of rework, manually intensive processes and expensive monolithic assembly tooling. It is shown how MAA can enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved aerodynamic tolerances. These advances will require the development of automated networks of measurement instruments, model-based thermal compensation, the automatic integration of 'live' measurement data into variation simulation, and algorithms to generate cutting paths for predictive shimming and drilling processes. This paper sets out an architecture for the digital systems that will enable this integrated approach to variation management. © 2013 The Authors.

Relevance: 30.00%

Abstract:

Metrology processes contribute to entire manufacturing systems, and coordinate measuring systems can represent a considerable financial investment. However, today's industry lacks generic methodologies to quantify their economic value. To address this problem, a mathematical model is proposed in this paper by statistical deductive reasoning, defining the relationships between the Process Capability Index, measurement uncertainty and the tolerance band. The correctness of the mathematical model is demonstrated by a case study. Finally, several comments and suggestions on evaluating and maximizing the benefits of metrology investment are given.
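One common way to formalise the kind of relationship the model describes is a guard-banded capability calculation, in which the expanded measurement uncertainty U narrows the usable tolerance band. The sketch below assumes the simple convention T_eff = T − 2U; this is illustrative and not necessarily the paper's exact model:

```python
# Process capability from a tolerance band T and process sigma:
# Cp = T / (6 * sigma). Measurement uncertainty U guard-bands the
# tolerance (T_eff = T - 2U, an assumed convention), lowering the
# demonstrable capability. All numbers are invented, in mm.
def cp(tolerance_band, sigma):
    return tolerance_band / (6 * sigma)

T, U, sigma = 0.10, 0.005, 0.012   # tolerance band, expanded uncertainty, process sigma

print(round(cp(T, sigma), 3))          # capability ignoring measurement uncertainty
print(round(cp(T - 2 * U, sigma), 3))  # capability with guard-banded tolerance
```

The gap between the two figures is one way to express the economic value of a better (lower-U) coordinate measuring system.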

Relevance: 30.00%

Abstract:

Indicators are widely used by organizations as a way of evaluating, measuring and classifying organizational performance. As part of performance evaluation systems, indicators are often shared or compared across internal sectors or with other organizations. However, indicators can be vague and imprecise, and can lack semantics, making comparisons with other indicators difficult. This paper therefore presents a knowledge model, based on an ontology, that can be used to represent indicators semantically and generically, dealing with their imprecision and vagueness and thus facilitating better comparison. Semantic technologies are shown to be suitable for this solution, being able to represent the complex data involved in indicator comparison.
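As a minimal illustration of the ontology-based idea (in practice an RDF/OWL toolkit and a published vocabulary would be used), indicators can be represented as triples and compared through shared semantics rather than label matching. All terms below are invented:

```python
# Indicators as subject-predicate-object triples in plain Python;
# the "ex:" vocabulary is a made-up stand-in for a real ontology.
triples = {
    ("ex:OnTimeDelivery", "rdf:type", "ex:Indicator"),
    ("ex:OnTimeDelivery", "ex:measures", "ex:DeliveryPerformance"),
    ("ex:OnTimeDelivery", "ex:unit", "ex:Percent"),
    ("ex:DeliveryRate", "rdf:type", "ex:Indicator"),
    ("ex:DeliveryRate", "ex:measures", "ex:DeliveryPerformance"),
    ("ex:DeliveryRate", "ex:unit", "ex:Ratio"),
}

def comparable(graph, a, b, prop="ex:measures"):
    """Two indicators are comparable here when they share a value for prop."""
    values = lambda s: {o for (s2, p, o) in graph if s2 == s and p == prop}
    return bool(values(a) & values(b))

# Different labels and units, but the same measured concept:
print(comparable(triples, "ex:OnTimeDelivery", "ex:DeliveryRate"))  # → True
```

Semantic alignment of what is measured, rather than of names, is what makes cross-organization comparison possible despite vague or differing labels.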

Relevance: 30.00%

Abstract:

In this paper, a new approach to resource allocation and scheduling that reflects the user's Quality of Experience is presented. The proposed scheduling algorithm is examined in the context of the 3GPP Long Term Evolution (LTE) system. Pause Intensity (PI), an objective, no-reference quality assessment metric, is employed to represent user satisfaction in the scheduler of the eNodeB. PI is in effect a measurement of discontinuity in the service. The performance of the proposed scheduling method is compared with two extreme cases, the maxCI and Round Robin scheduling schemes, which correspond to efficiency- and fairness-oriented mechanisms, respectively. Our work reveals that the proposed method is able to operate between the fairness and efficiency requirements, steering user satisfaction towards the desired level. © VDE VERLAG GMBH.
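As a toy reading of Pause Intensity as a measure of service discontinuity, a playback trace can be reduced to the fraction of the observation window spent stalled. This simplified pause-duration ratio is illustrative and not necessarily the exact PI definition used in the paper:

```python
# Stall (pause) intervals observed during playback, in seconds; the trace
# and the observation window are invented for illustration.
pauses = [(2.0, 2.4), (5.0, 5.3), (9.0, 9.5)]  # (start, end) of each stall
observation_window = 10.0                       # seconds observed

pause_time = sum(end - start for start, end in pauses)
pi = pause_time / observation_window            # simplified pause-duration ratio
print(round(pi, 2))  # → 0.12
```

A scheduler using such a metric allocates resources to hold each user's PI below a target, which is how the proposed method trades off against pure maxCI efficiency or pure Round Robin fairness.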
