909 results for measurement and metrology


Relevance: 90.00%

Abstract:

We experimentally demonstrated a highly sensitive twist sensor system based on a 45° and an 81° tilted fibre grating (TFG). The 81°-TFG exhibits a pair of dual peaks arising from the birefringence induced by its extremely tilted structure. When the 81°-TFG is subjected to twist, the coupling to the two peaks interchanges, providing a mechanism to measure and monitor the twist. We investigated the performance of the sensor system using three interrogation methods (spectral, power measurement and voltage measurement). The experimental results clearly show that the 81°-TFG and the 45°-TFG can be combined to form an all-fibre twist sensor system capable of not only measuring the magnitude but also recognising the direction of the applied twist.
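
A minimal sketch of how a power-measurement interrogation of the two peaks could be reduced to a twist estimate. The differential-power rule and the calibration constant are illustrative assumptions, not values or formulas taken from the paper.

```python
import numpy as np

def twist_from_dual_peak_power(p1_mw, p2_mw, k_deg_per_unit=90.0):
    """Illustrative twist estimate from the optical powers coupled to the
    two birefringence-split peaks of an 81-degree TFG.

    The normalised differential power is assumed (hypothetically) to vary
    monotonically with twist; its sign then indicates the twist direction
    and its magnitude, scaled by a placeholder calibration constant, the
    twist angle.
    """
    p1, p2 = np.asarray(p1_mw, float), np.asarray(p2_mw, float)
    ratio = (p1 - p2) / (p1 + p2)            # dimensionless, in [-1, 1]
    angle = k_deg_per_unit * np.abs(ratio)   # calibration constant is a placeholder
    direction = np.sign(ratio)               # +1 / -1 for the two twist senses
    return angle, direction

# Example: peak 1 carries more power than peak 2 -> positive twist sense.
angle, direction = twist_from_dual_peak_power(0.8, 0.5)
print(f"twist ~ {angle:.1f} deg, direction {direction:+.0f}")
```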

Relevance: 90.00%

Abstract:

This work is concerned with the development of techniques for the evaluation of large-scale highway schemes, with particular reference to the assessment of their costs and benefits in the context of the current transport planning (T.P.P.) process. It has been carried out in close cooperation with West Midlands County Council, although its approach and results are applicable elsewhere. The background to highway evaluation and its development in recent years has been described, and the emergence of a number of deficiencies in current planning practice noted. One deficiency in particular stood out: that stemming from inadequate methods of scheme generation. The research has therefore concentrated upon improving this stage of appraisal, to ensure that subsequent stages of design, assessment and implementation are based upon a consistent and responsive foundation. Deficiencies of scheme evaluation were found to stem from inadequately developed appraisal methodologies, which suffer from difficulties in the valuation, measurement and aggregation of the disparate variables that characterise highway evaluation. A failure to respond to local policy priorities was also noted. A 'problem' rather than 'goals' based approach to scheme generation was taken, as it represented the current and foreseeable resource allocation context more realistically. Techniques with potential for problem-based highway scheme generation that would work within a series of practical and theoretical constraints were reviewed and assessed; multivariate analysis, and classical factor analysis in particular, was selected because it offered considerable applicability to the existing difficulties of valuation, measurement and aggregation. Computer programs were written to adapt classical factor analysis to the requirements of T.P.P. highway evaluation, using it to derive a limited number of factors that describe the extensive quantity of highway problem data. From this, a series of composite problem scores for 1979 was derived for a case study area of south Birmingham, based upon the factorial solutions, and used to assess highway sites in terms of local policy issues. The methodology was assessed in the light of its ability to describe highway problems in both aggregate and disaggregate terms, to guide scheme design, to coordinate with current scheme evaluation methods, and in general to improve upon current appraisal. The results were analysed both in subjective, 'common-sense' terms and using statistical methods to assess the changes in problem definition, distribution and priorities that emerged. Overall, the technique was found to improve upon current scheme generation methods in all respects, and in particular in overcoming the problems of valuation, measurement and aggregation without recourse to unsubstantiated and questionable assumptions. A number of remaining deficiencies are outlined, and a series of research priorities described which need to be reviewed in the light of current and future evaluation needs.
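
A minimal sketch of the kind of analysis described above: factor analysis applied to a matrix of highway problem indicators, with the factor scores combined into a composite problem score per site. The indicator names, weights and data are illustrative placeholders, not the study's own, and sklearn's FactorAnalysis stands in for the bespoke programs mentioned in the abstract.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical highway problem indicators for a handful of sites
# (columns could be accidents, delay, noise, severance); values are made up.
rng = np.random.default_rng(0)
indicators = rng.gamma(shape=2.0, scale=1.0, size=(8, 4))

# Standardise, then reduce the indicator set to a small number of factors.
z = StandardScaler().fit_transform(indicators)
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(z)                 # per-site factor scores

# Composite problem score: weighted sum of factor scores; equal weights here,
# whereas in practice the weights would reflect local policy priorities.
weights = np.array([0.5, 0.5])
composite = scores @ weights

ranking = np.argsort(composite)[::-1]        # highest-problem sites first
print("site ranking by composite problem score:", ranking)
```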

Relevance: 90.00%

Abstract:

The main purpose of this research is to develop and deploy an analytical framework for measuring the environmental performance of manufacturing supply chains. This work's theoretical bases combine and reconcile three major areas: supply chain management, environmental management and performance measurement. Researchers have suggested many empirical criteria for green supply chain (GSC) performance measurement and proposed both qualitative and quantitative frameworks. However, these are mainly operational in nature and specific to the focal company. This research develops an innovative GSC performance measurement framework by integrating supply chain processes (supplier relationship management, internal supply chain management and customer relationship management) with organisational decision levels (both strategic and operational). Environmental planning, environmental auditing, management commitment, environmental performance, economic performance and operational performance are the key constructs. The proposed framework is then applied to three selected manufacturing organisations in the UK. Their GSC performance is measured and benchmarked using the analytic hierarchy process (AHP), a multiple-attribute decision-making technique. The AHP-based framework offers an effective way to measure and benchmark organisations’ GSC performance. This study has both theoretical and practical implications. Theoretically it contributes holistic constructs for designing a GSC and managing it for sustainability; practically it helps industry practitioners to measure and improve the environmental performance of their supply chain. © 2013 Taylor and Francis Group, LLC.

CORRIGENDUM (DOI 10.1080/09537287.2012.751186): In the article ‘Green supply chain performance measurement using the analytic hierarchy process: a comparative analysis of manufacturing organisations’ by Prasanta Kumar Dey and Walid Cheffi, Production Planning & Control, DOI 10.1080/09537287.2012.666859, a third author, Breno Nunes, is added who was not included in the paper as it originally appeared.
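
A minimal sketch of the AHP step mentioned above: deriving priority weights from a pairwise comparison matrix via its principal eigenvector and checking consistency. The three criteria and the comparison values are illustrative, not the study's data.

```python
import numpy as np

# Hypothetical pairwise comparisons (Saaty 1-9 scale) of three GSC criteria,
# e.g. environmental planning vs. environmental auditing vs. management commitment.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority vector = normalised principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); CR = CI / RI,
# with RI the random index (0.58 for n = 3); CR < 0.1 is conventionally acceptable.
n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)
CR = CI / 0.58

print("priority weights:", np.round(w, 3))
print("consistency ratio:", round(CR, 3))
```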

Relevance: 90.00%

Abstract:

Computational Fluid Dynamics (CFD) has found great acceptance among the engineering community as a tool for research and design of processes that are practically difficult or expensive to study experimentally. One of these processes is biomass gasification in a Circulating Fluidized Bed (CFB). Biomass gasification is the thermo-chemical conversion of biomass at high temperature and a controlled oxygen amount into fuel gas, also sometimes referred to as syngas. A circulating fluidized bed is a type of reactor in which it is possible to maintain a stable and continuous circulation of solids in a gas-solid system. The main objectives of this thesis are fourfold: (i) to develop a three-dimensional predictive model of biomass gasification in a CFB riser using advanced CFD; (ii) to experimentally validate the developed hydrodynamic model using conventional and advanced measuring techniques; (iii) to study the complex hydrodynamics, heat transfer and reaction kinetics through modelling and simulation; and (iv) to study the CFB gasifier performance through parametric analysis and identify the optimum operating conditions to maximise the product gas quality. Two different and complementary experimental techniques were used to validate the hydrodynamic model, namely pressure measurement and particle tracking. Pressure measurement is a very common and widely used technique in fluidized bed studies, while particle tracking using PEPT (positron emission particle tracking), originally developed for medical imaging, is a relatively new technique in the engineering field; it is relatively expensive and only available at a few research centres around the world. This study started with a simple poly-dispersed single solid phase and then moved to binary solid phases. The single solid phase was used for primary validations and for eliminating unnecessary options and steps in building the hydrodynamic model. The outcomes from the primary validations were then applied to the secondary validations of the binary mixture to avoid time-consuming computations. Studies of binary solid mixture hydrodynamics are rarely reported in the literature. In this study the binary solid mixture was modelled and validated using experimental data from both techniques mentioned above, and good agreement was achieved with both. Following the general gasification steps, the developed model is separated into three main gasification stages: drying; devolatilization and tar cracking; and partial combustion and gasification. Drying was modelled as mass transfer from the solid phase to the gas phase. The devolatilization and tar cracking model consists of two steps: devolatilization of the biomass, represented as a single reaction generating the biomass gases from the volatile material, and tar cracking, also modelled as a single reaction generating gases with fixed mass fractions. The first reaction is classified as heterogeneous, while the second is homogeneous. The partial combustion and gasification model consists of carbon combustion reactions and carbon and gas-phase reactions. Partial combustion was considered for C, CO, H2 and CH4. The carbon gasification reactions used in this study are the Boudouard reaction with CO2, the steam gasification reaction with H2O, and the methanation reaction to generate methane.
The other gas-phase reactions considered in this study are the water-gas shift reaction, modelled as a reversible reaction, and the methane steam reforming reaction. The developed gasification model was validated against experimental data from the literature over a wide range of operating conditions. Good agreement was observed, confirming the model's ability to predict biomass gasification in a CFB with good accuracy. The developed model was then used to carry out sensitivity and parametric analyses. The sensitivity analysis examined the effect of including the various combustion reactions and the effect of radiation on the gasification reactions. The parametric analysis varied the following gasifier operating conditions: fuel/air ratio; biomass flow rate; sand (heat carrier) temperature; sand flow rate; sand and biomass particle sizes; gasifying agent (pure air or pure steam); pyrolysis model used; and steam/biomass ratio. Finally, based on these parametric and sensitivity analyses, a final model was recommended for the simulation of biomass gasification in a CFB riser.
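
A minimal sketch of how a reversible water-gas shift rate of the kind mentioned above can be expressed: a forward Arrhenius rate with the reverse rate tied to the equilibrium constant so the net rate vanishes at equilibrium. The kinetic parameters are illustrative placeholders and the equilibrium correlation is a commonly used empirical fit; neither is taken from the thesis.

```python
import numpy as np

R = 8.314  # J/(mol K)

def k_eq_wgs(T):
    """Equilibrium constant of CO + H2O <-> CO2 + H2 (Moe-type empirical fit)."""
    return np.exp(4577.8 / T - 4.33)

def wgs_net_rate(T, c_co, c_h2o, c_co2, c_h2, A=2.78e3, Ea=12.6e3):
    """Net reversible water-gas shift rate [mol/(m^3 s)].

    Forward rate follows an Arrhenius law with placeholder pre-exponential
    factor A and activation energy Ea; the reverse rate is recovered from
    the equilibrium constant. Concentrations in mol/m^3, temperature in K.
    """
    kf = A * np.exp(-Ea / (R * T))
    return kf * (c_co * c_h2o - c_co2 * c_h2 / k_eq_wgs(T))

# Example: at 1100 K the net rate drives the mixture towards equilibrium.
print(wgs_net_rate(1100.0, c_co=2.0, c_h2o=3.0, c_co2=0.5, c_h2=0.5))
```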

Relevance: 90.00%

Abstract:

Purpose: This paper aims to explore the practices and technologies that successfully servitised manufacturers employ in the delivery of advanced services. Design/methodology/approach: A case study methodology is applied across four manufacturing organisations successful in servitization. Through interviews with personnel across host manufacturers, their partners, and key customers, extensive data are collected about service delivery systems. Analyses identify convergence in their practices and technologies. Findings: Six distinct technologies and practices are revealed: facilities and their location, micro-vertical integration and supplier relationships, information and communication technologies (ICTs), performance measurement and value demonstration, people deployment and their skills, and business processes and customer relationships. These are then combined in an integrative framework that illustrates how operations are configured to successfully deliver advanced services. Research limitations/implications: The analyses are reductive and rationalising. Future studies could identify other technologies and practices. Case study as a method is inherently limited in the extent to which findings can be generalised. Practical implications: Awareness of and interest in servitization are growing, yet adoption of a servitization strategy requires particular organisational capabilities on the part of the manufacturer. This study identifies technologies and practices that underpin these capabilities. Originality/value: This paper contributes to the understanding of the servitization process and, in particular, its implications for the broader operations of the firm. © Emerald Group Publishing Limited.

Relevance: 90.00%

Abstract:

The profusion of performance measurement models suggested by the Management Accounting literature in the 1990s is one illustration of the substantial changes in Management Accounting teaching materials since the publication of “Relevance Lost” in 1987. At the same time, in the general context of increasing competition and globalisation, it is widely thought that national cultural differences are tending to disappear, meaning that the management techniques used in large companies, including performance measurement and management instruments (PMS), tend to be the same irrespective of the company's nationality or location. North American management practice is traditionally described as a contractually based model, mainly focused on financial performance information and measures (FPMs) and more shareholder-focused than that of French companies. Within France, the literature has historically defined performance as broadly multidimensional, driven by the idea that there are no universal rules of management and that efficient management takes into account local culture and traditions. Unlike their North American counterparts, French companies are pressured more by the financial institutions that fund them than by capital markets. Therefore, they pay greater attention to the long term because they are not subject to quarterly capital market objectives. Hence, management in France would be expected to rely on longer-term, more qualitative and multidimensional, and less financial, data to assess performance than their North American counterparts. The objective of this research is to investigate whether the practices of large French and US companies have changed in the way the textbooks have changed with regard to performance measurement and management, or whether cultural differences are still driving differences in performance measurement and management between them. The research findings support the idea that large US and French companies share the same PMS features, influenced by ‘universal’ PM models.

Relevance: 90.00%

Abstract:

This paper extends the smooth transition conditional correlation model by studying for the first time the impact that illiquidity shocks have on stock market return comovement. We show that firms that experience shocks that increase illiquidity are less liquid than firms that experience shocks that decrease illiquidity. Shocks that increase illiquidity have no statistical impact on comovement. However, shocks that reduce illiquidity lead to a fall in comovement, a pattern that becomes stronger as the illiquidity of the firm increases. This discovery is consistent with increased transparency and an improvement in price efficiency. We find that a small number of firms experience a double illiquidity shock. For these firms, at the first shock, a rise in illiquidity reduces comovement while a fall in illiquidity raises comovement. The second shock partly reverses these changes, as a rise in illiquidity is associated with a rise in comovement and a fall in illiquidity is associated with a fall in comovement. These results have important implications for portfolio construction, and also for the measurement and evolution of market beta and the cost of capital, as they suggest that investors can achieve higher returns for the same amount of market risk because of the greater diversification benefits that exist. We also find that illiquidity, friction, firm size and the pre-shock correlation are all associated with the magnitude of the correlation change. © 2013 Elsevier B.V.
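
A minimal sketch of the smooth transition conditional correlation idea that the paper extends: the conditional correlation moves between two regime values according to a logistic transition function of an observed transition variable (here, hypothetically, a standardised illiquidity shock). All parameter values are illustrative.

```python
import numpy as np

def stcc_correlation(s, rho_low=0.2, rho_high=0.6, gamma=5.0, c=0.0):
    """Smooth transition conditional correlation.

    rho_t = rho_low + (rho_high - rho_low) * G(s_t), where
    G(s) = 1 / (1 + exp(-gamma * (s - c))) is a logistic transition function
    of the transition variable s_t; gamma controls the speed of transition
    and c its location.
    """
    G = 1.0 / (1.0 + np.exp(-gamma * (np.asarray(s, float) - c)))
    return rho_low + (rho_high - rho_low) * G

# Example: correlation implied for illiquidity shocks of different signs.
for shock in (-1.0, 0.0, 1.0):
    print(shock, round(float(stcc_correlation(shock)), 3))
```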

Relevance: 90.00%

Abstract:

Objective: There is evidence to suggest a beneficial role for growth factors, including vascular endothelial growth factor (VEGF), in tissue repair and proliferation after injury within the lung. Whether this effect is mediated predominantly by actions on endothelial cells or epithelial cells is unknown. This study tested the hypothesis that VEGF acts as an autocrine trophic factor for human adult alveolar epithelial cells and that under situations of pro-apoptotic stress, VEGF reduces cell death. Design: In vitro cell culture study looking at the effects of 0.03% H2O2 on both A549 and primary distal lung epithelial cells. Measurement and Main Results: Primary adult human distal lung epithelial cells express both the soluble and membrane-associated VEGF isoforms and VEGF receptors 1 and 2. At physiologically relevant doses, soluble VEGF isoforms stimulate wound repair and have a proliferative action. Specific receptor ligands confirmed that this effect was mediated by VEGF receptor 1. In addition to proliferation, we demonstrate that VEGF reduces A549 and distal lung epithelial cell apoptosis when administered after 0.03% H2O2 injury. This effect occurs due to reduced caspase-3 activation and is phosphatidylinositol 3′-kinase dependent. Conclusion: In addition to its known effects on endothelial cells, VEGF acts as a growth and anti-apoptotic factor on alveolar epithelial cells. VEGF treatment may have potential as a rescue therapy for diseases associated with alveolar epithelial damage such as acute respiratory distress syndrome.

Relevance: 90.00%

Abstract:

As a new medium for questionnaire delivery, the internet has the potential to revolutionise the survey process. Online (web-based) questionnaires provide several advantages over traditional survey methods in terms of cost, speed, appearance, flexibility, functionality, and usability [1, 2]. For instance, delivery is faster, responses are received more quickly, and data collection can be automated or accelerated [1-3]. Online-questionnaires can also provide many capabilities not found in traditional paper-based questionnaires: they can include pop-up instructions and error messages; they can incorporate links; and it is possible to encode difficult skip patterns, making such patterns virtually invisible to respondents. Like many new technologies, however, online-questionnaires face criticism despite their advantages. Typically, such criticisms focus on the vulnerability of online-questionnaires to the four standard survey error types: namely, coverage, non-response, sampling, and measurement errors. Although, like all survey errors, coverage error (“the result of not allowing all members of the survey population to have an equal or nonzero chance of being sampled for participation in a survey” [2, pg. 9]) also affects traditional survey methods, it is currently exacerbated in online-questionnaires as a result of the digital divide. That said, many developed countries have reported substantial increases in computer and internet access and/or are targeting this as part of their immediate infrastructural development [4, 5]. These trends indicate that familiarity with information technologies is increasing and suggest that coverage error will rapidly diminish to an acceptable level (for the developed world at least) in the near future, and in so doing positively reinforce the advantages of online-questionnaire delivery. The second error type – the non-response error – occurs when individuals fail to respond to the invitation to participate in a survey or abandon a questionnaire before it is completed. Given today’s societal trend towards self-administration [2], the former is inevitable, irrespective of delivery mechanism. Conversely, non-response as a consequence of questionnaire abandonment can be relatively easily addressed. Unlike traditional questionnaires, the delivery mechanism for online-questionnaires makes estimation of questionnaire length and time required for completion difficult, thus increasing the likelihood of abandonment. By incorporating a range of features into the design of an online questionnaire, it is possible to facilitate such estimation – and indeed, to provide respondents with context-sensitive assistance during the response process – and thereby reduce abandonment while eliciting feelings of accomplishment [6]. For online-questionnaires, sampling error (“the result of attempting to survey only some, and not all, of the units in the survey population” [2, pg. 9]) can arise when all but a small portion of the anticipated respondent set is alienated (and so fails to respond) as a result of, for example, disregard for varying connection speeds, bandwidth limitations, browser configurations, monitors, hardware, and user requirements during the questionnaire design process. Similarly, measurement errors (“the result of poor question wording or questions being presented in such a way that inaccurate or uninterpretable answers are obtained” [2, pg. 11]) will lead to respondents becoming confused and frustrated.
Sampling, measurement, and non-response errors are likely to occur when an online-questionnaire is poorly designed. Individuals will answer questions incorrectly, abandon questionnaires, and may ultimately refuse to participate in future surveys; thus, the benefit of online questionnaire delivery will not be fully realized. To prevent errors of this kind, and their consequences, it is extremely important that practical, comprehensive guidelines exist for the design of online questionnaires. Many design guidelines exist for paper-based questionnaire design (e.g. [7-14]); the same is not true for the design of online questionnaires [2, 15, 16]. The research presented in this paper is a first attempt to address this discrepancy. Section 2 describes the derivation of a comprehensive set of guidelines for the design of online-questionnaires and briefly (given space restrictions) outlines the essence of the guidelines themselves. Although online-questionnaires reduce traditional delivery costs (e.g. paper, mail out, and data entry), set-up costs can be high given the need to either adopt and acquire training in questionnaire development software or secure the services of a web developer. Neither approach, however, guarantees a good questionnaire (often because the person designing the questionnaire lacks relevant knowledge in questionnaire design). Drawing on existing software evaluation techniques [17, 18], we assessed the extent to which current questionnaire development applications support our guidelines; Section 3 describes the framework used for the evaluation, and Section 4 discusses our findings. Finally, Section 5 concludes with a discussion of further work.

Relevance: 90.00%

Abstract:

Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation. BIA offers a convenient alternative to standard techniques such as MRI, CT scan or DEXA scan for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive estimates. To overcome these underlying limitations of BIA, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation. Key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and the measured impedance was compared against body composition estimates obtained by DEXA scan. This enabled the development of new techniques for making BIA predictions. To add a ‘visual aspect’ to BIA, volunteers were scanned in 3D using an inexpensive scattered light gadget (Xbox Kinect controller), and 3D volumes of their limbs were compared with BIA measurements to further improve BIA predictions. A three-stage digital filtering scheme was also implemented to enable extraction of heart-rate data from recorded bio-electrical signals. Additionally, modifications were introduced to measure the change in bio-impedance with motion, which could be adapted to further improve the accuracy and reliability of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition. In particular, using bio-impedance to predict limb volumes would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
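
A minimal sketch of a three-stage digital filtering scheme of the kind described for extracting heart rate from a recorded bio-electrical signal: detrending, band-pass filtering around typical cardiac frequencies, and peak detection. The stages, cut-off frequencies and sampling rate are assumptions for illustration, not the thesis's actual design.

```python
import numpy as np
from scipy.signal import butter, filtfilt, detrend, find_peaks

def heart_rate_from_bioimpedance(signal, fs=100.0):
    """Estimate heart rate (bpm) from a bio-impedance time series.

    Stage 1: remove slow baseline drift (linear detrend).
    Stage 2: zero-phase Butterworth band-pass, 0.8-3 Hz (~48-180 bpm).
    Stage 3: detect cardiac peaks and convert the mean peak spacing to bpm.
    """
    x = detrend(np.asarray(signal, float))                     # stage 1
    b, a = butter(4, [0.8 / (fs / 2), 3.0 / (fs / 2)], btype="band")
    x = filtfilt(b, a, x)                                      # stage 2
    peaks, _ = find_peaks(x, distance=int(0.3 * fs))           # stage 3
    if len(peaks) < 2:
        return float("nan")
    return 60.0 * fs / np.mean(np.diff(peaks))

# Example with a synthetic 1.2 Hz (72 bpm) cardiac component plus drift and noise.
fs = 100.0
t = np.arange(0, 30, 1 / fs)
sig = 0.05 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * t / 30 + 0.01 * np.random.randn(t.size)
print(round(heart_rate_from_bioimpedance(sig, fs), 1), "bpm")
```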

Relevance: 90.00%

Abstract:

Defining 'effectiveness' in the context of community mental health teams (CMHTs) has become increasingly difficult under the current pattern of provision required in National Health Service mental health services in England. The aim of this study was to establish the characteristics of multi-professional team working effectiveness in adult CMHTs to develop a new measure of CMHT effectiveness. The study was conducted between May and November 2010 and comprised two stages. Stage 1 used a formative evaluative approach based on the Productivity Measurement and Enhancement System to develop the scale with multiple stakeholder groups over a series of qualitative workshops held in various locations across England. Stage 2 analysed responses from a cross-sectional survey of 1500 members in 135 CMHTs from 11 Mental Health Trusts in England to determine the scale's psychometric properties. Based on an analysis of its structural validity and reliability, the resultant 20-item scale demonstrated good psychometric properties and captured one overall latent factor of CMHT effectiveness comprising seven dimensions: improved service user well-being, creative problem-solving, continuous care, inter-team working, respect between professionals, engagement with carers and therapeutic relationships with service users. The scale will be of significant value to CMHTs and healthcare commissioners both nationally and internationally for monitoring, evaluating and improving team functioning in practice.
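
A minimal sketch of the kind of reliability check involved in establishing a scale's psychometric properties: Cronbach's alpha computed over item-level responses. The data here are randomly generated placeholders, not the survey's responses, and the study's own analysis is not reproduced.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Placeholder data: 200 simulated respondents answering a 20-item scale on a
# 1-5 Likert scale, with a shared latent component to induce internal consistency.
rng = np.random.default_rng(1)
latent = rng.normal(size=(200, 1))
responses = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(200, 20))), 1, 5)

print("alpha =", round(cronbach_alpha(responses), 2))
```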

Relevance: 90.00%

Abstract:

A method for measurement and visualization of the complex transmission coefficient of 2-D micro-objects is proposed. The method is based on calculation of the transmission coefficient from the diffraction pattern and the illumination aperture function for monochromatic light. A phase-stepping method was used for diffracted light phase determination.
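
A minimal sketch of the standard four-step phase-stepping calculation (intensity frames recorded with reference phase shifts of 0°, 90°, 180° and 270°) that recovers the phase of the diffracted field; the specific stepping scheme used in the paper is not stated, so this is an assumption for illustration.

```python
import numpy as np

def phase_from_four_steps(I0, I90, I180, I270):
    """Recover the optical phase from four intensity frames of the form
    I_k = a + b*cos(phi + delta_k), with delta_k = 0, 90, 180, 270 degrees:
        phi = atan2(I270 - I90, I0 - I180)
    Returned phase is wrapped to (-pi, pi]."""
    return np.arctan2(np.asarray(I270, float) - I90,
                      np.asarray(I0, float) - I180)

# Example: synthesise the four frames for a known phase map and recover it.
phi_true = np.linspace(-1.0, 1.0, 5)            # radians
a, b = 1.0, 0.5                                 # background and modulation depth
frames = [a + b * np.cos(phi_true + d) for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
print(np.round(phase_from_four_steps(*frames), 3))   # ~ phi_true
```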

Relevance: 90.00%

Abstract:

The main focus of this paper is on mathematical theory and methods which have a direct bearing on problems involving multiscale phenomena. Modern technology is refining measurement and data collection to spatio-temporal scales on which observed geophysical phenomena are displayed as intrinsically highly variable and intermittent hierarchical structures, e.g. rainfall, turbulence, etc. The hierarchical structure is reflected in the occurrence of a natural separation of scales which collectively manifest at some basic unit scale. Thus proper data analysis and inference require a mathematical framework that couples the variability over multiple decades of scale and within which basic theoretical benchmarks can be identified and calculated. This continues the main theme of the research in this area of applied probability over the past twenty years.
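
A minimal illustration of coupling variability across multiple decades of scale: estimating a moment-scaling exponent by aggregating a series over dyadic scales and fitting a log-log slope. The synthetic data and the choice of statistic are illustrative assumptions, not drawn from the paper.

```python
import numpy as np

def scaling_exponent(x, n_scales=8, q=2.0):
    """Estimate the scaling exponent of the q-th moment of block averages.

    The series is aggregated over dyadic scales 1, 2, 4, ..., and the slope of
    log E[|block mean|^q] against log(scale) is returned; for a multiscaling
    process this slope varies nonlinearly with q.
    """
    x = np.asarray(x, float)
    scales, moments = [], []
    for k in range(n_scales):
        s = 2 ** k
        n_blocks = x.size // s
        if n_blocks < 8:
            break
        blocks = x[: n_blocks * s].reshape(n_blocks, s).mean(axis=1)
        scales.append(s)
        moments.append(np.mean(np.abs(blocks) ** q))
    slope, _ = np.polyfit(np.log(scales), np.log(moments), 1)
    return slope

# Example on a synthetic heavy-tailed, intermittent series (illustrative only).
rng = np.random.default_rng(0)
series = rng.lognormal(mean=0.0, sigma=1.0, size=4096) * rng.choice([0, 1], size=4096, p=[0.7, 0.3])
print(round(scaling_exponent(series), 3))
```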