35 results for Telemetry of process variables in Aston University Research Archive
Abstract:
Interpolated data are an important part of environmental information exchange, as many variables can only be measured at discrete sampling locations. Spatial interpolation is a complex operation that has traditionally required expert treatment, making automation a serious challenge. This paper presents a few lessons learnt from INTAMAP, a project that is developing an interoperable web processing service (WPS) for the automatic interpolation of environmental data using advanced geostatistics, adopting a Service Oriented Architecture (SOA). The “rainbow box” approach we followed provides access to the functionality at a whole range of different levels. We show here how the integration of open standards, open source and powerful statistical processing capabilities allows us to automate a complex process while offering users a level of access and control that best suits their requirements. This facilitates benchmarking exercises as well as the regular reporting of environmental information without requiring remote users to have specialized skills in geostatistics.
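As a minimal sketch of what "access at different levels" can look like for a remote user, the snippet below shows how a client might invoke an automatic-interpolation process through a standard OGC WPS Execute request. The endpoint URL, process identifier and input references are hypothetical placeholders, not the actual INTAMAP service.

```python
# Minimal sketch of a client calling an automatic-interpolation process
# exposed through an OGC Web Processing Service (WPS).
# The endpoint URL and the process identifier are hypothetical and stand in
# for whatever a deployed INTAMAP-style service actually advertises.
import requests

WPS_ENDPOINT = "https://example.org/wps"          # hypothetical service URL
PROCESS_ID = "spatial.interpolation.automatic"    # hypothetical identifier

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": PROCESS_ID,
    # DataInputs in the standard WPS key-value encoding:
    # observations plus the prediction grid, both passed by reference.
    "datainputs": "observations=@xlink:href=https://example.org/obs.xml;"
                  "predictionLocations=@xlink:href=https://example.org/grid.xml",
}

response = requests.get(WPS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])  # WPS Execute response document (XML)
```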
Abstract:
This thesis is developed from a real-life application of performance evaluation of small and medium-sized enterprises (SMEs) in Vietnam. The thesis presents two main methodological developments on the evaluation of the impact of dichotomous environmental variables on technical efficiency. Taking selection bias into account, the thesis proposes a revised frontier separation approach for the seminal Data Envelopment Analysis (DEA) model developed by Charnes, Cooper, and Rhodes (1981). The revised frontier separation approach is based on nearest-neighbour propensity score matching, pairing treated SMEs with their counterfactuals on the propensity score. The thesis also develops an order-m frontier conditioned on the propensity score, building on the conditional order-m approach proposed by Cazals, Florens, and Simar (2002) and advocated by Daraio and Simar (2005). This development allows the conditional order-m approach to be applied with a dichotomous environmental variable while taking into account the self-selection problem of impact evaluation. Monte Carlo-style simulations were built to examine the effectiveness of these developments. The methodological developments are applied in empirical studies to evaluate the impact of training programmes on the performance of food processing SMEs and the impact of exporting on the technical efficiency of textile and garment SMEs in Vietnam. The analysis shows that training programmes have no significant impact on the technical efficiency of food processing SMEs. Moreover, the analysis confirms the conclusion of the export literature that exporters self-select into the sector. The thesis finds no significant impact of exporting activities on the technical efficiency of textile and garment SMEs; however, a large bias is eliminated by the proposed approach. The results of the empirical studies contribute to the understanding of the impact of different environmental variables on the performance of SMEs and help policy makers design appropriate policies to support the development of Vietnamese SMEs.
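To illustrate the matching step described above, here is a short sketch with simulated data (not the thesis's code or dataset): it estimates propensity scores with a logistic regression and pairs each treated SME with the untreated SME whose score is nearest. The variable names are illustrative only.

```python
# Illustrative nearest-neighbour propensity score matching (simulated data):
# pair each treated unit with the untreated unit whose estimated propensity
# score is closest. Efficiency comparisons (DEA or order-m) would then be
# run on the matched pairs.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))            # observed covariates (e.g. size, age, capital)
treated = rng.integers(0, 2, size=n)   # dichotomous environment variable (e.g. training)

# Step 1: estimate the propensity score P(treated = 1 | X).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: for each treated unit, find the untreated unit with the nearest score.
treated_idx = np.where(treated == 1)[0]
control_idx = np.where(treated == 0)[0]
matches = {
    i: control_idx[np.argmin(np.abs(ps[control_idx] - ps[i]))]
    for i in treated_idx
}
print(f"matched {len(matches)} treated units to controls")
```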
Abstract:
There is an increasing need for a model for the process-based performance measurement of multispecialty tertiary care hospitals for quality improvement. The analytic hierarchy process (AHP) is used in this study to develop such a model. Each step in the model was derived through group discussions and brainstorming sessions among experienced clinicians and managers. The tool was applied to two tertiary care teaching hospitals, in Barbados and India. The model enabled the identification of specific areas in which neither hospital performed very well, and helped to suggest recommendations for improving those areas. AHP is recommended as a valuable tool for measuring the process-based performance of multispecialty tertiary care hospitals. © Emerald Group Publishing Limited.
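For readers unfamiliar with AHP, the following minimal sketch shows the core computation behind such a model: priority weights for criteria derived from a pairwise comparison matrix via the principal eigenvector, plus Saaty's consistency check. The judgements are invented for illustration and are not the study's data.

```python
# Illustrative AHP step (invented judgements): derive priority weights for
# three performance criteria from a pairwise comparison matrix via the
# principal eigenvector, and compute Saaty's consistency ratio.
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # normalised priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
print("weights:", weights.round(3), "CR:", round(ci / ri, 3))
```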
Abstract:
Pearson's correlation coefficient (‘r’) is one of the most widely used of all statistics. Nevertheless, care is needed in interpreting the results because, with large numbers of observations, quite small values of ‘r’ become significant and the X variable may account for only a small proportion of the variance in Y. Hence, ‘r squared’ should always be calculated and included in any discussion of the significance of ‘r’. The use of ‘r’ also assumes that the data follow a bivariate normal distribution (see Statnote 17), and this assumption should be examined prior to the study. If the data do not conform to such a distribution, the use of a non-parametric correlation coefficient should be considered. A significant correlation should not be interpreted as indicating ‘causation’, especially in observational studies, in which the two variables may be correlated because of their mutual correlations with other confounding variables.
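The caution about large samples can be made concrete with a short simulated example: a very weak association yields a small but "significant" r, while r squared shows that X explains almost none of the variance in Y; Spearman's coefficient is included as the non-parametric alternative mentioned above. The data are simulated, not taken from the statnote.

```python
# Simulated illustration: with many observations a small r is "significant",
# yet r squared shows that X explains almost none of the variance in Y.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 10_000
x = rng.normal(size=n)
y = 0.03 * x + rng.normal(size=n)     # very weak true association

r, p = stats.pearsonr(x, y)
print(f"r = {r:.3f}, p = {p:.3g}, r^2 = {r**2:.4f}")   # p may be < 0.05, r^2 near 0.001

# Non-parametric alternative if the bivariate-normal assumption is doubtful:
rho, p_s = stats.spearmanr(x, y)
print(f"Spearman rho = {rho:.3f}, p = {p_s:.3g}")
```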
Abstract:
The INTAMAP FP6 project has developed an interoperable framework for real-time automatic mapping of critical environmental variables by extending spatial statistical methods and employing open, web-based data exchange protocols and visualisation tools. This paper gives an overview of the underlying problem and of the project, and discusses which problems it has solved and which open problems appear most relevant to address next. The interpolation problem that INTAMAP solves is the generic problem of spatial interpolation of environmental variables without user interaction, based on measurements of, for example, PM10, rainfall or gamma dose rate, at arbitrary locations or over a regular grid covering the area of interest. It deals with problems of varying spatial resolution of measurements, the interpolation of averages over larger areas, and providing information on the interpolation error to the end-user. In addition, monitoring network optimisation is addressed in a non-automatic context.
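To make the generic interpolation task concrete, here is a toy sketch with invented observations: it predicts values on a regular grid from scattered point measurements using simple inverse-distance weighting, a deliberately simplified stand-in for the geostatistical methods (e.g. kriging) that INTAMAP actually automates.

```python
# Toy version of the generic problem (invented data; INTAMAP itself uses
# geostatistical methods, not the simple inverse-distance weighting shown
# here): predict a variable on a regular grid from scattered measurements.
import numpy as np

obs_xy = np.array([[0.1, 0.2], [0.8, 0.3], [0.4, 0.9], [0.7, 0.7]])  # station coordinates
obs_z = np.array([12.0, 15.5, 9.8, 14.1])                            # e.g. PM10 values

gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])

# Inverse-distance weighting: each grid cell is a distance-weighted average.
d = np.linalg.norm(grid[:, None, :] - obs_xy[None, :, :], axis=2)
w = 1.0 / np.maximum(d, 1e-9) ** 2
pred = (w * obs_z).sum(axis=1) / w.sum(axis=1)
pred_grid = pred.reshape(gx.shape)   # 50 x 50 interpolated surface
print(pred_grid.min(), pred_grid.max())
```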
Abstract:
This investigation is in two parts: theory and experimental verification. (1) Theoretical study. In this study it is, for obvious reasons, necessary to analyse the concept of formability first. For the purpose of the present investigation it is sufficient to define the four aspects of formability as follows: (a) the formability of the material at a critical section, (b) the formability of the material in general, (c) process efficiency, and (d) proportional increase in surface area. A method of quantitative assessment is proposed for each of the four aspects of formability. The theoretical study also includes the distinction between coaxial and non-coaxial strains, which occur, respectively, in axisymmetrical and unsymmetrical forming processes, and the inadequacy of the circular grid system for the assessment of formability is explained in the light of this distinction. (2) Experimental study. As one of the bases of the experimental work, the determination of the end point of a forming process, which sets the limit to the formability of the work material, is discussed. The effects of three process parameters on draw-in are shown graphically. The delay of fracture in sheet metal forming resulting from draw-in is then analysed in kinematical terms, namely through the radial displacements, the radial and circumferential strains, and the projected thickness of the workpiece. Through the equilibrium equation of the membrane stresses, the effect on the shape of the unsupported region of the workpiece, and hence on the position of the critical section, is explained. The effect of draw-in on the four aspects of formability is then discussed. The triangular coordinate system is used to present and analyse the triaxial strains involved. This coordinate system has the advantage of showing all three principal strains in a material simultaneously, as well as representing clearly the many types of strain involved in sheet metal work.
Abstract:
This investigation has been concerned with the behaviour of a solid internal lubricant during mixing, compaction, ejection, dewaxing and sintering of iron powder compacts. Zinc stearate (0.01%-4.0%) was added to irregular iron powder by admixing or by precipitation from solution. Pressure/density relationships, determined by continuous compaction, and loose packed densities were used to show that small additions of zinc stearate reduced interparticle friction during loose packing and at low compaction pressures. Large additions decreased particle/die-wall friction during compaction and ejection but also caused compaction inhibition. Transverse rupture strengths were determined on compacts containing various stearate-based lubricants, and it was found that green strength was reduced by the interposition of a thin lubricant layer within interparticle contacts. Only materials much finer than the iron powder were able to form such layers. Investigations were undertaken to determine the effect of the decomposition of these lubricants on the development of mechanical properties in dewaxed or sintered compacts. Both physical and chemical influences on tensile strength were observed. Decomposition of lubricants was associated with reductions in strength caused by the physical effects of pressure increases and the removal of lubricant from interparticle contacts. There were also chemical effects associated with the influence of gaseous decomposition products and solid residues on sintering mechanisms. Thermogravimetry was used to study the decomposition behaviour of various lubricants, both as free compounds and within compacts. The influence of process variables such as atmosphere type, flow rate and compact density was investigated. In a reducing atmosphere the decomposition of these lubricants was characterised by two stages. The first involved the rapid decomposition of the hydrocarbon radical. The second stage, at higher temperatures, depended on lubricant type and involved solid residues. The removal of lubricant could also markedly affect dimensional change.
Abstract:
The use of high-chromium cast irons for abrasive wear resistance is restricted by their poor fracture toughness properties. An attempt was made to improve the fracture characteristics by altering the distribution, size and shape of the eutectic carbide phase without sacrificing their excellent wear resistance. This was achieved by additions of molybdenum or tungsten followed by high-temperature heat treatments. The absence of these alloying elements, or their replacement with vanadium or manganese, did not show any significant effect, and the continuous eutectic carbide morphology remained the same after application of high-temperature heat treatments. The fracture characteristics of the alloys with these metallurgical variables were evaluated for both sharp cracks and blunt notches. The results were used in conjunction with metallographic and fractographic observations to establish possible failure mechanisms. The fracture mechanism of the austenitic alloys was found to be controlled not only by the volume per cent of the eutectic carbides but also to be greatly influenced by their size and distribution. On the other hand, the fracture mechanism of the martensitic alloys was independent of the eutectic carbide morphology. The uniformity of the secondary carbide precipitation during hardening heat treatments was shown to be the reason why consistent fracture toughness results were obtained with this series of alloys even though their eutectic carbide morphologies were different. The collected data were applied to a model which incorporated the microstructural parameters and correlated them with the experimentally obtained valid stress intensity factors. The stress intensity coefficients of different short-bar fracture toughness test specimens were evaluated from analytical and experimental compliance studies. The validity and applicability of this non-standard testing technique for determining the fracture toughness of high-chromium cast irons were investigated. The results obtained correlated well with the valid results from standard fracture toughness tests.
Abstract:
Purpose – Research on the relationship between customer satisfaction and customer loyalty has advanced to a stage that requires a more thorough examination of moderator variables. Limited research shows how moderators influence the relationship between customer satisfaction and customer loyalty in a service context; this article aims to present empirical evidence of the conditions in which the satisfaction-loyalty relationship becomes stronger or weaker.
Design/methodology/approach – Using a sample of more than 700 customers of DIY retailers and multi-group structural equation modelling, the authors examine the moderating effects of several firm-related variables, variables that result from firm/employee-customer interactions, and individual-level variables (i.e. loyalty cards, critical incidents, customer age, gender, income, expertise).
Findings – The empirical results suggest that not all of the moderators considered influence the satisfaction-loyalty link. Specifically, critical incidents and income are important moderators of the relationship between customer satisfaction and customer loyalty.
Practical implications – Several of the moderator variables considered in this study are manageable variables.
Originality/value – This study should prove valuable to academic researchers as well as service and retailing managers. It systematically analyses the moderating effect of firm-related and individual-level variables on the relationship between customer satisfaction and loyalty, and shows the differential effect of different types of moderator variables on the satisfaction-loyalty link.
Abstract:
The surface composition of food powders created by spray-drying solutions containing various ratios of sodium caseinate, maltodextrin and soya oil has been analysed by Electron Spectroscopy for Chemical Analysis. The results show significant enrichment of oil at the surface of the particles compared with the bulk phase and, when only the non-oil components are considered, a significant surface enrichment of sodium caseinate as well. The study found evidence of high levels (80%) of surface fat even on particles of food industry grade (92.5%) sodium caseinate containing only 1% fat.
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT
Abstract:
The article presents abstracts of papers for a conference on research methods, including "On the Folly of Rewarding A While Hoping for B: A Critical Assessment of Theory Development," "All That Jazz: A Methodological Story of Stories," and "An Accounting of Counting: Universalism, Particularism, and the Counting of Qualitative Data."
Abstract:
Inference and optimization in sparse graphs with real variables are studied using methods of statistical mechanics. Efficient distributed algorithms for the resource allocation problem are devised. Numerical simulations show excellent performance and full agreement with the theoretical results. © Springer-Verlag Berlin Heidelberg 2006.
Abstract:
Listening is typically the first language skill to develop in first language (L1) users and has been recognized as a basic and fundamental tool for communication. Despite the importance of listening, aural abilities are often taken for granted, and many people overlook their dependency on listening and the complexities that combine to enable this multi-faceted skill. When second language (L2) students are learning their new language, listening is crucial, as it provides access to oral input and facilitates social interaction. Yet L2 students find listening challenging, and L2 teachers often lack sufficient pedagogy to help learners develop listening abilities that they can use in and beyond the classroom. In an effort to provide a pedagogic alternative to more traditional and limited L2 listening instruction, this thesis investigated the viability of listening strategy instruction (LSI) over three semesters at a private university in Japan through a qualitative action research (AR) intervention. An LSI program was planned and implemented with six classes over the course of three AR phases. Two teachers used the LSI with 121 learners throughout the project. Following each AR phase, student and teacher perceptions of the methodology were investigated via questionnaires and interviews, which were the primary data collection methods. Secondary research methods (class observations, pre/post-semester test scores, and a research journal) supplemented the primary methods. Data were analyzed and triangulated for emerging themes related to participants' perceptions of LSI and its viability. These data showed consistently positive perceptions of LSI on the part of both learners and teachers, although some aspects of LSI required additional refinement. The project provided insights on LSI specific to the university context in Japan and also produced principles for LSI program planning and implementation that can inform the broader L2 education community.
Abstract:
Context: Many large organizations juggle an application portfolio that contains different applications fulfilling similar tasks in the organization. In an effort to reduce operating costs, they are attempting to consolidate such applications. Before consolidating applications, the work that is done with these applications must be harmonized. This is also known as process harmonization.
Objective: The increased interest in process harmonization calls for measures to quantify the extent to which processes have been harmonized. These measures should also uncover the factors that are of interest when harmonizing processes. Currently, such measures do not exist. Therefore, this study develops and validates a measurement model to quantify the level of process harmonization in an organization.
Method: The measurement model was developed by means of a literature study and structured interviews. Subsequently, it was validated through a survey, using factor analysis and correlations with known related constructs.
Results: A valid and reliable measurement model was developed. The factors found to constitute process harmonization are: the technical design of the business process and its data, the resources that execute the process, and the information systems that are used in the process. In addition, strong correlations were found between process harmonization and process standardization, and between process complexity and process harmonization.
Conclusion: The measurement model can be used by practitioners, because it shows them the factors that must be taken into account when harmonizing processes and provides them with a means to quantify the extent to which they have succeeded in harmonizing their processes. At the same time, it can be used by researchers to conduct further empirical research in the area of process harmonization.