944 results for distance measurement systems
Abstract:
This paper contributes to the rigor vs. relevance debate in the Information Systems (IS) discipline. Using the Action Research methodology, this study evaluates the relevance of a rigorously validated IS evaluation model in practice. The study captures observations of operational end-users employing a market-leading Enterprise System application for procurement and order fulfillment in an organization. The analysis of the observations demonstrates the broad relevance of the measurement instrument. More importantly, the study identifies several improvements and possible confusions in applying the instrument in practice.
Abstract:
Motivation is a major driver of project performance. Even when team members have the ability to deliver successful project outcomes, performance will be constrained if they are not positively motivated to pursue joint project goals. One approach to improving the motivation of project organizations is to offer a financial reward for the achievement of set performance standards above a minimum required level. However, little investigation has been undertaken into the features of successful incentive systems as part of an overall delivery strategy. With input from the organizational management literature, and drawing on the literature covering psychological and economic theories of motivation, this paper presents an integrated framework that project organizations can use to assess the impact of financial reward systems on motivation in construction projects. The integrated framework offers four motivation indicators which reflect key theoretical concepts across both psychological and economic disciplines. The indicators are: (1) Goal Commitment, (2) Distributive Justice, (3) Procedural Justice, and (4) Reciprocity. The paper also interprets the integrated framework against the results of a successful Australian social infrastructure project case study and identifies key learnings for project organizations to consider when designing financial reward systems. Case study results suggest that motivation directed towards the achievement of incentive goals is influenced not only by the value placed on the financial reward for commercial benefit, but also by the strength of the project initiatives that encourage just and fair dealings, supporting the establishment of trust and positive reciprocal behavior across a project team. The strength of the project relationships was found to be influenced by how attractive the achievement of the goal is to the incentive recipient and how likely they are to push for its achievement.
Interestingly, findings also suggested that contractor motivation is influenced by the fairness of the performance measurement process and by contractors' perception of the trustworthiness and transparency of their client. These findings provide the basis for future research on the impact of financial reward systems on motivation in construction projects. It is anticipated that such research will shed new light on this complex topic and further define how reward systems should be designed to promote project team motivation. Due to the unique nature of construction projects, with high levels of task complexity and interdependence, results are expected to vary in comparison to previous studies based on individuals or single-entity organizations.
Abstract:
Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost. To reach these goals, they need good-quality components from suppliers at optimum price and lead time. This has forced companies to adopt improvement practices such as lean manufacturing, Just in Time (JIT) and effective supply chain management. Applying new improvement techniques and tools causes higher establishment costs and more Information Delay (ID). On the other hand, these new techniques may reduce the risk of stock-outs and improve supply chain flexibility, giving better overall performance. But practitioners are unable to measure the overall effects of these improvement techniques without a standard evaluation model, so an effective overall supply chain performance evaluation model is essential for suppliers as well as manufacturers to assess their companies under different supply chain strategies. However, the literature on lean supply chain performance evaluation is comparatively limited. Moreover, most models assume random values for performance variables. The purpose of this paper is to propose an effective supply chain performance evaluation model using triangular linguistic fuzzy numbers and to recommend optimum ranges for performance variables for lean implementation. The model first considers all the supply chain performance criteria (input, output and flexibility), converts the values to triangular linguistic fuzzy numbers and evaluates overall supply chain performance under different situations. Results show that, with the proposed performance measurement model, the improvement area for each variable can be accurately identified.
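The conversion step the abstract describes, from linguistic ratings to triangular fuzzy numbers and back to a crisp score, can be sketched as follows. The linguistic scale, its numeric anchors, and the centroid defuzzification are illustrative textbook choices, not the paper's actual model:

```python
# Minimal sketch: linguistic ratings -> triangular fuzzy numbers -> crisp score.
# Scale values below are assumed for illustration only.
SCALE = {
    "poor":      (0.0, 0.0, 0.25),
    "fair":      (0.0, 0.25, 0.5),
    "good":      (0.25, 0.5, 0.75),
    "very good": (0.5, 0.75, 1.0),
    "excellent": (0.75, 1.0, 1.0),
}

def aggregate(ratings):
    """Average a list of linguistic ratings component-wise (l, m, u)."""
    triples = [SCALE[r] for r in ratings]
    n = len(triples)
    return tuple(sum(t[i] for t in triples) / n for i in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number: (l + m + u) / 3."""
    return sum(tfn) / 3

ratings = ["good", "very good", "fair"]
tfn = aggregate(ratings)      # component-wise mean
score = defuzzify(tfn)        # crisp overall performance score
```

A real model would aggregate per criterion (input, output, flexibility) with weights before defuzzifying; the mechanics are the same.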
Abstract:
Experimental results for a reactive non-buoyant plume of nitric oxide (NO) in a turbulent grid flow doped with ozone (O3) are presented. The Damköhler number (Nd) for the experiment is of order unity, indicating that the turbulence and chemistry have similar timescales and both affect the chemical reaction rate. Continuous measurements of two components of velocity using hot-wire anemometry and of the two reactants using chemiluminescent analysers have been made. A spatial resolution for the reactants of four Kolmogorov scales has been possible because of the novel design of the experiment. Measurements at this resolution for a reactive plume are not found in the literature. The experiment has been conducted relatively close to the grid, in the region where self-similarity of the plume has not yet developed. Statistics of a conserved scalar, deduced from both reactive and non-reactive scalars by conserved scalar theory, are used to establish the mixing field of the plume, which is found to be consistent with theoretical considerations and with those found by other investigators in non-reactive flows. Where appropriate, the reactive species means and higher moments, probability density functions, joint statistics and spectra are compared with their respective frozen, equilibrium and reaction-dominated limits deduced from conserved scalar theory. The theoretical limits bracket reactive scalar statistics where this should be so according to conserved scalar theory. Both reactants approach their equilibrium limits with greater distance downstream. In the region of measurement, the plume reactant behaves as the reactant not in excess and the ambient reactant behaves as the reactant in excess. The reactant covariance lies outside its frozen and equilibrium limits for this value of Nd. The reaction rate closure of Toor (1969) is compared with the measured reaction rate. The gradient model is used to obtain turbulent diffusivities from turbulent fluxes.
Diffusivity of a non-reactive scalar is found to be close to that measured in non-reactive flows by others.
Abstract:
Fiber Bragg grating (FBG) sensor technology has been attracting substantial industrial interest for the last decade. FBG sensors have seen increasing acceptance and widespread use for structural sensing and health monitoring applications in composites, civil engineering, aerospace, marine, oil & gas, and smart structures. One transportation system that has benefited tremendously from this technology is railways, where it is of the utmost importance to understand the structural and operating conditions of rails as well as of freight and passenger service cars to ensure safe and reliable operation. Fiber-optic sensors, mostly in the form of FBGs, offer various important characteristics, such as EMI/RFI immunity, multiplexing capability, and very long-range interrogation (up to 230 km between FBGs and measurement unit), over conventional electrical sensors for the distinctive operational conditions in railways. FBG sensors are unique among fiber-optic sensors in that the measured information is wavelength-encoded, which provides self-referencing and renders their signals less susceptible to intensity fluctuations. In addition, FBGs are reflective sensors that can be interrogated from either end, providing redundancy to FBG sensing networks. These two unique features are particularly important for the railway industry, where safe and reliable operations are the major concerns. Furthermore, FBGs are very versatile, and transducers based on FBGs can be designed to measure a wide range of parameters such as acceleration and inclination. Consequently, a single interrogator can deal with a large number of FBG sensors to measure a multitude of parameters at different locations that span over a large area.
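The wavelength encoding the abstract highlights follows from the Bragg condition. A minimal sketch of that relation, and of inverting a wavelength shift back to strain, is below; the photo-elastic coefficient 0.22 is a typical textbook value for silica fibre, assumed here, and temperature is held constant:

```python
# Sketch: wavelength-encoded FBG measurement.
# Bragg condition: lambda_B = 2 * n_eff * grating period.
# Under axial strain, delta_lambda / lambda_B ~ (1 - p_e) * strain.
P_E = 0.22  # effective photo-elastic coefficient (typical silica value, assumed)

def bragg_wavelength(n_eff, grating_period_nm):
    """Reflected Bragg wavelength in nm from the grating geometry."""
    return 2.0 * n_eff * grating_period_nm

def strain_from_shift(delta_lambda_nm, lambda_b_nm):
    """Invert the strain-shift relation at constant temperature."""
    return (delta_lambda_nm / lambda_b_nm) / (1.0 - P_E)

lam = bragg_wavelength(1.45, 535.0)       # a grating reflecting near 1551.5 nm
eps = strain_from_shift(0.0012, 1550.0)   # strain causing a 1.2 pm shift
```

Because the measurand is the wavelength itself, attenuation along the fibre changes signal strength but not the reading, which is the self-referencing property described above.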
Abstract:
Rats are superior to the most advanced robots when it comes to creating and exploiting spatial representations. A wild rat can have a foraging range of hundreds of meters, possibly kilometers, and yet the rodent can unerringly return to its home after each foraging mission, and return to profitable foraging locations at a later date (Davis, et al., 1948). The rat runs through undergrowth and pipes with few distal landmarks, along paths where the visual, textural, and olfactory appearance constantly change (Hardy and Taylor, 1980; Recht, 1988). Despite these challenges, the rat builds, maintains, and exploits internal representations of large areas of the real world throughout its two- to three-year lifetime. While algorithms exist that allow robots to build maps, the questions of how to maintain those maps and how to handle change in appearance over time remain open. The robotic approach to map building has been dominated by algorithms that optimise the geometry of the map based on measurements of distances to features. In a robotic approach, measurements of distance to features are taken with range-measuring devices such as laser range finders or ultrasound sensors, and in some cases estimates of depth from visual information. The features are incorporated into the map based on previous readings of other features in view and estimates of self-motion. The algorithms explicitly model the uncertainty in measurements of range and the measurement of self-motion, and use probability theory to find optimal solutions for the geometric configuration of the map features (Dissanayake, et al., 2001; Thrun and Leonard, 2008). Some of the results from the application of these algorithms have been impressive, ranging from three-dimensional maps of large urban structures (Thrun and Montemerlo, 2006) to natural environments (Montemerlo, et al., 2003).
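The probabilistic fusion at the heart of these map-building algorithms can be reduced to one dimension for intuition: combine a predicted feature position with a noisy range reading, weighting each by its uncertainty. This is a scalar Kalman-style update, a deliberately simplified stand-in for the full multi-feature estimators cited above:

```python
# Sketch: uncertainty-weighted fusion of a predicted feature position
# with a noisy range measurement (scalar Kalman update).

def fuse(pred, pred_var, meas, meas_var):
    """Minimum-variance combination of prediction and measurement."""
    k = pred_var / (pred_var + meas_var)   # gain: trust the less uncertain source
    est = pred + k * (meas - pred)
    est_var = (1.0 - k) * pred_var         # fused estimate is more certain
    return est, est_var

# A confident measurement pulls the estimate strongly toward itself:
est, var = fuse(pred=10.0, pred_var=4.0, meas=12.0, meas_var=1.0)
# est = 11.6, var = 0.8
```

Full SLAM systems apply the same weighting jointly over every feature and the robot pose, which is what makes the geometric optimisation tractable under noisy range and self-motion measurements.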
Abstract:
DeLone and McLean (1992, p. 16) argue that the concept of “system use” has suffered from a “too simplistic definition.” Despite decades of substantial research on system use, the concept is yet to receive strong theoretical scrutiny. Many measures of system use, and the processes by which they were developed, have often been idiosyncratic and lack credibility or comparability. This paper reviews various attempts at conceptualization and measurement of system use and then proposes a re-conceptualization of it as “the level of incorporation of an information system within a user’s processes.” The definition is supported with the theory of work systems, system, and Key-User-Group considerations. We then go on to develop the concept of a Functional-Interface-Point (FIP) and four dimensions of system usage: extent, the proportion of the FIPs used by the business process; frequency, the rate at which FIPs are used by the participants in the process; thoroughness, the level of use of information/functionality provided by the system at an FIP; and attitude towards use, a set of measures that assess the level of comfort, degree of respect and the challenges set forth by the system. The paper argues that the automation level, that is, the proportion of the business process encoded by the information system, has a mediating impact on system use. The article concludes with a discussion of some implications of this re-conceptualization and areas for follow-on research.
Abstract:
Is it possible to control identities using performance management systems (PMSs)? This paper explores the theoretical fusion of management accounting and identity studies, providing a synthesised view of control, PMSs and identification processes. It argues that the effective use of PMSs generates a range of obtrusive mechanistic and unobtrusive organic controls that mediate identification processes to achieve a high level of identity congruency between individuals and collectives—groups and organisations. This paper contends that mechanistic control of PMSs provides sensebreaking effects and also creates structural conditions for sensegiving in top-down identification processes. These processes encourage individuals to continue the bottom-up processes of sensemaking, enacting identity and constructing identity narratives. Over time, PMS activities and conversations periodically mediate several episodes of identification to connect past, current and future identities. To explore this relationship, the dual locus of control—collectives and individuals—is emphasised to explicate their interplay. This multidisciplinary approach contributes to explaining the multidirectional effects of PMSs in obtrusive as well as unobtrusive ways, in order to control the nature of collectives and individuals in organisations.
Abstract:
Since users have become the focus of product/service design in the last decade, the term User eXperience (UX) has been frequently used in the field of Human-Computer Interaction (HCI). Research on UX facilitates a better understanding of the various aspects of the user’s interaction with the product or service. Mobile video, as a new and promising service and research field, has attracted great attention. Due to the significance of UX in the success of mobile video (Jordan, 2002), many researchers have centered on this area, examining users’ expectations, motivations, requirements, and usage context. As a result, many influencing factors have been explored (Buchinger, Kriglstein, Brandt & Hlavacs, 2011; Buchinger, Kriglstein & Hlavacs, 2009). However, a general framework for the specific mobile video service is lacking for structuring such a great number of factors. To measure user experience of multimedia services such as mobile video, quality of experience (QoE) has recently become a prominent concept. In contrast to the traditionally used concept of quality of service (QoS), QoE not only involves objectively measuring the delivered service but also takes into account the user’s needs and desires when using the service, emphasizing the user’s overall acceptability of the service. Many QoE metrics are able to estimate the user-perceived quality or acceptability of mobile video, but may not be accurate enough for overall UX prediction due to the complexity of UX. Only a few QoE frameworks have addressed more aspects of UX for mobile multimedia applications, but they need to be transformed into practical measures. The challenge of optimizing UX remains adapting to resource constraints (e.g., network conditions, mobile device capabilities, and heterogeneous usage contexts) as well as meeting complicated user requirements (e.g., usage purposes and personal preferences).
In this chapter, we investigate the existing important UX frameworks, compare their similarities and discuss some important features that fit the mobile video service. Based on previous research, we propose a simple UX framework for mobile video applications by mapping a variety of influencing factors of UX onto a typical mobile video delivery system. Each component and its factors are explored with comprehensive literature reviews. The proposed framework may benefit user-centred design of mobile video, by taking complete consideration of UX influences, and improvement of mobile video service quality, by adjusting the values of certain factors to produce a positive user experience. It may also facilitate related research by locating important issues to study, clarifying research scopes, and setting up proper study procedures. We then review a great deal of research on UX measurement, including QoE metrics and QoE frameworks for mobile multimedia. Finally, we discuss how to achieve an optimal quality of user experience by focusing on the issues of various aspects of UX of mobile video. In the conclusion, we suggest some open issues for future study.
Abstract:
Deep Raman spectroscopy has been utilized for the standoff detection of concealed chemical threat agents from a distance of 15 meters under real-life background illumination conditions. By using combined time- and space-resolved measurements, various explosive precursors hidden in opaque plastic containers were identified non-invasively. Our results confirm that combined time- and space-resolved Raman spectroscopy leads to higher selectivity towards the sub-layer over the surface layer, as well as enhanced rejection of fluorescence from the container surface, when compared to standoff spatially offset Raman spectroscopy. Raman spectra that have minimal interference from the packaging material and good signal-to-noise ratio were acquired within 5 seconds of measurement time. A new combined time- and space-resolved Raman spectrometer has been designed with nanosecond laser excitation and gated detection, making it of lower cost and complexity than picosecond-based laboratory systems.
Abstract:
Future vehicle navigation for safety applications requires seamless positioning at an accuracy of sub-meter or better. However, standalone Global Positioning System (GPS) or Differential GPS (DGPS) suffers from solution outages in restricted areas such as high-rise urban areas and tunnels due to the blockage of satellite signals. Smoothed DGPS can provide sub-meter positioning accuracy, but cannot meet the seamless requirement. Traditional onboard navigation aids such as Dead Reckoning and Inertial Measurement Units are either not accurate enough, due to error accumulation, or too expensive to be acceptable to mass-market vehicle users. One alternative is to use wireless infrastructure installed along the roadside to locate vehicles in regions where Global Navigation Satellite System (GNSS) signals are not available (for example, inside tunnels, urban canyons and large indoor car parks). Examples of roadside infrastructure which could potentially be used for positioning purposes include Wireless Local Area Network (WLAN)/Wireless Personal Area Network (WPAN) based positioning systems, Ultra-wideband (UWB) based positioning systems, Dedicated Short Range Communication (DSRC) devices, Locata’s positioning technology, and accurate road surface height information over selected road segments such as tunnels. This research reviews and compares the wireless technologies that could be installed along the roadside for positioning purposes. Models and algorithms for integrating different positioning technologies are also presented. Various simulation schemes are designed to examine the performance benefits of combined GNSS and roadside infrastructure for vehicle positioning. The results from these experimental studies have shown a number of useful findings.
It is clear that in the open road environment, where sufficient satellite signals can be obtained, the roadside wireless measurements contribute very little to the improvement of positioning accuracy at the sub-meter level, especially in the dual-constellation cases. In restricted outdoor environments where only a few GPS satellites, such as those with elevation angles above 45 degrees, can be received, the roadside distance measurements can help improve both positioning accuracy and availability to the sub-meter level. When the vehicle is travelling in tunnels with known heights of tunnel surfaces and roadside distance measurements, sub-meter horizontal positioning accuracy is also achievable. Overall, simulation results have demonstrated that roadside infrastructure indeed has the potential to provide sub-meter vehicle position solutions for certain road safety applications if properly deployed roadside measurements are obtainable.
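The core geometric idea behind positioning from roadside distance measurements can be sketched as a minimal 2D trilateration: with ranges to three known beacons, subtracting one circle equation from the others linearises the problem. Beacon positions and ranges below are invented for illustration:

```python
# Sketch: 2D position fix from range (distance) measurements to known
# roadside beacons, via linearised least squares.

def trilaterate(beacons, ranges):
    """Solve for (x, y) from >= 3 beacons: subtract the first circle
    equation to linearise, then solve the 2x2 normal equations."""
    (x0, y0), r0 = beacons[0], ranges[0]
    A, b = [], []
    for (xi, yi), ri in zip(beacons[1:], ranges[1:]):
        A.append((2.0 * (xi - x0), 2.0 * (yi - y0)))
        b.append(r0**2 - ri**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Normal equations (A^T A) p = A^T b, solved with Cramer's rule
    s11 = sum(a[0] * a[0] for a in A)
    s12 = sum(a[0] * a[1] for a in A)
    s22 = sum(a[1] * a[1] for a in A)
    t1 = sum(a[0] * bi for a, bi in zip(A, b))
    t2 = sum(a[1] * bi for a, bi in zip(A, b))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)

beacons = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
ranges = [((bx - 30.0)**2 + (by - 40.0)**2) ** 0.5 for bx, by in beacons]
x, y = trilaterate(beacons, ranges)  # recovers the true position (30, 40)
```

Real systems fold such range equations into a combined GNSS/infrastructure estimator rather than solving them in isolation, but the geometry is the same.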
Abstract:
Enterprise architecture (EA) management has become an intensively discussed approach to manage enterprise transformations. Despite the popularity and potential of EA, both researchers and practitioners lament a lack of knowledge about the realization of benefits from EA. To determine the benefits from EA, we explore the various dimensions of EA benefit realization and report on the development of a validated and robust measurement instrument. In this paper, we test the reliability and construct validity of the EA benefit realization model (EABRM), which we have designed based on the DeLone & McLean IS success model and findings from exploratory interviews. A confirmatory factor analysis confirms the existence of an impact of five distinct and individually important dimensions on the benefits derived from EA: EA artefact quality, EA infrastructure quality, EA service quality, EA culture, and EA use. The analysis presented in this paper shows that the EA benefit realization model is an instrument that demonstrates strong reliability and validity.
Abstract:
Process-aware information systems, ranging from generic workflow systems to dedicated enterprise information systems, use work-lists to offer so-called work items to users. In real scenarios, users can be confronted with a very large number of work items that stem from multiple cases of different processes. In this jungle of work items, users may find it hard to choose the right item to work on next. The system cannot autonomously decide which is the right work item, since the decision is also dependent on conditions that are somehow outside the system. For instance, what is “best” for an organisation should be mediated with what is “best” for its employees. Current work-list handlers show work items as a simple sorted list and therefore do not provide much decision support for choosing the right work item. Since the work-list handler is the dominant interface between the system and its users, it is worthwhile to provide an intuitive graphical interface that uses contextual information about work items and users to provide suggestions about prioritisation of work items. This paper uses the so-called map metaphor to visualise work items and resources (e.g., users) in a sophisticated manner. Moreover, based on distance notions, the work-list handler can suggest the next work item by considering different perspectives. For example, urgent work items of a type that suits the user may be highlighted. The underlying map and distance notions may be of a geographical nature (e.g., a map of a city or office building), but may also be based on process designs, organisational structures, social networks, due dates, calendars, etc. The framework proposed in this paper is generic and can be applied to any process-aware information system. 
Moreover, in order to show its practical feasibility, the paper discusses a full-fledged implementation developed in the context of the open-source workflow environment YAWL, together with two real examples stemming from two very different scenarios. The results of an initial usability evaluation of the implementation are also presented, which provide a first indication of the validity of the approach.
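The distance-based suggestion mechanism described above can be illustrated with a toy ranking: several distance notions (due-date urgency, role fit, position on the underlying map) are combined into one score, and the "closest" work item is suggested. All names, fields, and weights are illustrative assumptions, not YAWL's actual interface:

```python
# Sketch: suggesting the next work item by combining distance notions.
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    hours_to_deadline: float   # temporal distance (due dates)
    role: str                  # organisational distance (role fit)
    location: tuple            # geographical distance on the "map"

def distance(item, user_role, user_loc, weights=(0.5, 0.3, 0.2)):
    """Smaller is better: urgent, role-matching, nearby items rank first."""
    urgency = item.hours_to_deadline
    role_gap = 0.0 if item.role == user_role else 1.0
    dx = item.location[0] - user_loc[0]
    dy = item.location[1] - user_loc[1]
    geo = (dx * dx + dy * dy) ** 0.5
    w_u, w_r, w_g = weights
    return w_u * urgency + w_r * role_gap + w_g * geo

def suggest(items, user_role, user_loc):
    """Highlight the work item at minimum combined distance."""
    return min(items, key=lambda it: distance(it, user_role, user_loc))

items = [
    WorkItem("approve order", 2.0, "clerk", (0.0, 0.0)),
    WorkItem("audit invoice", 48.0, "auditor", (5.0, 5.0)),
]
best = suggest(items, "clerk", (0.0, 0.0))  # the urgent, role-matching item
```

Swapping the map and distance definitions (process design, social network, calendar) changes the ranking without changing the mechanism, which is what makes the framework generic.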
Abstract:
Hybrid system representations have been exploited in a number of challenging modelling situations, including situations where the original nonlinear dynamics are too complex (or too imprecisely known) to be directly filtered. Unfortunately, the question of how best to design suitable hybrid system models has not yet been fully addressed, particularly in situations involving model uncertainty. This paper proposes a novel joint state-measurement relative entropy rate based approach for the design of hybrid system filters in the presence of (parameterised) model uncertainty. We also present a design approach suitable for suboptimal hybrid system filters. The benefits of our proposed approaches are illustrated through design examples and simulation studies.
Abstract:
Data quality has become a major concern for organisations. The rapid growth in the size and technology of databases and data warehouses has brought significant advantages in accessing, storing, and retrieving information. At the same time, great challenges arise in maintaining high data quality under rapid data throughput and heterogeneous accesses. Yet, despite the importance of data quality, the literature has usually condensed data quality into detecting and correcting poor data such as outliers and incomplete or inaccurate values. As a result, organisations are unable to efficiently and effectively assess data quality. Having an accurate and proper data quality assessment method will enable users to benchmark their systems and monitor their improvement. This paper introduces a granule-mining approach for measuring the random degree of error in data, which will enable decision makers to conduct accurate quality assessment and locate the most severe data errors, thereby providing an accurate estimation of the human and financial resources required for quality improvement tasks.
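The record-level "detect and correct" view that this abstract contrasts with can be made concrete with a simple column profiler that flags missing, out-of-range, and statistically outlying values. This is an illustrative baseline only; the paper's granule-mining method goes beyond such per-value checks:

```python
# Sketch of the conventional detect-and-correct view of data quality:
# flag missing, out-of-range, and outlier values in a single column.

def profile_column(values, low, high, z_thresh=3.0):
    """Return indices of missing, out-of-range, and z-score outlier values."""
    present = [(i, v) for i, v in enumerate(values) if v is not None]
    missing = [i for i, v in enumerate(values) if v is None]
    out_of_range = [i for i, v in present if not (low <= v <= high)]
    xs = [v for _, v in present]
    mean = sum(xs) / len(xs)
    std = (sum((v - mean) ** 2 for v in xs) / len(xs)) ** 0.5
    outliers = [i for i, v in present
                if std > 0 and abs(v - mean) / std > z_thresh]
    return {"missing": missing, "out_of_range": out_of_range,
            "outliers": outliers}

report = profile_column([1.0, 2.0, None, 200.0], low=0.0, high=100.0)
```

Such a report identifies individual bad records but says nothing about the overall random degree of error across a dataset, which is the gap the proposed assessment method targets.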