68 results for Hilbert symbol
in Queensland University of Technology - ePrints Archive
Abstract:
Process modeling grammars are used to create models of business processes. In this paper, we discuss how different routing symbol designs affect an individual's ability to comprehend process models. We conduct an experiment with 154 students to ascertain which visual design principles influence process model comprehension. Our findings suggest that design principles related to perceptual discriminability and pop out improve comprehension accuracy. Furthermore, semantic transparency and aesthetic design of symbols lower the perceived difficulty of comprehension. Our results inform important principles about notational design of process modeling grammars and the effective use of process modeling in practice.
Abstract:
Complex numbers are a fundamental aspect of the mathematical formalism of quantum physics. Quantum-like models developed outside physics have often overlooked the role of complex numbers; in particular, previous models in Information Retrieval (IR) ignored them. We argue that to advance the use of quantum models in IR, one has to lift the constraint of real-valued representations of the information space and package more information within the representation by means of complex numbers. As a first attempt, we propose a complex-valued representation for IR that explicitly uses complex-valued Hilbert spaces, in which terms, documents and queries are represented as complex-valued vectors. The proposal integrates distributional semantics evidence within the real component of a term vector, while ontological information is encoded in the imaginary component. Our proposal has the merit of lifting the role of complex numbers from a computational byproduct of the model to the very mathematical texture that unifies different levels of semantic information. An empirical instantiation of our proposal is tested in the TREC Medical Record task of retrieving cohorts for clinical studies.
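The general scheme described in this abstract can be sketched numerically. In this minimal, hypothetical illustration (the vocabulary, weights and scoring rule are assumptions, not the authors' exact model), each term carries a complex weight whose real part holds distributional evidence and whose imaginary part holds ontological evidence, and documents are scored against queries with a Hermitian inner product so that both components contribute jointly:

```python
import numpy as np

# Hypothetical sketch of a complex-valued IR representation: real part =
# distributional evidence, imaginary part = ontological evidence.
# Vocabulary, weights and the scoring rule are illustrative assumptions.

vocab = ["cohort", "diabetes", "glucose"]

def term_vector(distributional, ontological):
    """Pack the two evidence sources into one complex weight per term."""
    return np.array(distributional) + 1j * np.array(ontological)

# Document and query as complex-valued vectors over the vocabulary.
doc = term_vector([0.8, 0.1, 0.4], [0.2, 0.9, 0.3])
query = term_vector([0.5, 0.7, 0.0], [0.1, 0.6, 0.2])

# Score with the modulus of the Hermitian inner product <q, d> = q^H d,
# which mixes the real and imaginary components of every term weight.
score = np.abs(np.vdot(query, doc))
print(round(score, 4))  # → 1.2034
```

A purely real-valued model would have to concatenate or average the two evidence sources; here they live in one vector and interact through the algebra of the inner product.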
Abstract:
Background: Heatwaves can cause excess deaths ranging from tens to thousands within a couple of weeks in a local area. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality expected under ‘normal’ conditions from the historical daily mortality records. Calculating the excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in mean level (i.e., non-stationarity) and (b) a non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed in signal processing for analysing non-linear and non-stationary time series; however, it has not been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data.

Methods: Special R functions were developed to implement the HHT algorithm and decompose a daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is then calculated directly from the resulting non-trend component series.

Results: Daily mortality time series from Brisbane (Queensland, Australia) and Chicago (United States) were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. For the July 1995 Chicago heatwave, the algorithm needed to handle the mode-mixing issue, and it estimated 510 excess deaths. To exemplify potential applications, the HHT decomposition results were used as input for a subsequent regression analysis of the Brisbane data, investigating the association between excess mortality and different risk factors.

Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
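The core bookkeeping step — subtract the trend component, then sum the non-trend residual over the event window — can be sketched on synthetic data. Note the assumptions: the paper uses the HHT (empirical mode decomposition) for the trend; here a simple moving average stands in for it, purely to show how excess deaths are read off the residual. The numbers are synthetic, not the Brisbane or Chicago series:

```python
import numpy as np

# Illustrative sketch of the excess-mortality calculation. The paper
# decomposes the series with the Hilbert-Huang Transform (EMD); a
# centred moving average stands in here for the trend component, and
# all data are synthetic.

rng = np.random.default_rng(0)
days = 365
baseline = 20 + 2 * np.sin(2 * np.pi * np.arange(days) / 365)  # seasonal mean
deaths = rng.poisson(baseline).astype(float)
deaths[40:47] += 8  # a synthetic one-week "heatwave" spike

def moving_average(x, window=29):
    """Centred moving average as a stand-in trend component."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

trend = moving_average(deaths)
non_trend = deaths - trend          # analogue of the HHT non-trend component
excess = non_trend[40:47].sum()     # excess deaths over the event window
print(round(excess, 1))
```

A moving average will leak some of the spike into the trend, which is exactly the kind of bias the adaptive, data-driven HHT decomposition is meant to avoid on non-stationary series.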
Abstract:
Fleck and Johnson (Int. J. Mech. Sci. 29 (1987) 507) and Fleck et al. (Proc. Inst. Mech. Eng. 206 (1992) 119) have developed foil rolling models which allow for large deformations in the roll profile, including the possibility that the rolls flatten completely. However, these models require computationally expensive iterative solution techniques. A new approach to the approximate solution of the Fleck et al. (1992) Influence Function Model has been developed using both analytic and approximation techniques. The numerical difficulties arising from solving an integral equation in the flattened region have been reduced by applying an inverse Hilbert transform to obtain an analytic expression for the pressure. The method described in this paper is applicable whether or not a flat region is present.
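For readers unfamiliar with the operator being inverted, here is a generic numeric sketch of a discrete Hilbert transform on a periodic grid, built from the FFT. This is a standard textbook construction and an assumption-laden stand-in — it is not the authors' closed-form inversion for the flattened roll region:

```python
import numpy as np

# Generic FFT-based discrete Hilbert transform for a periodic signal.
# In the frequency domain H multiplies each component by -i*sign(omega);
# this is the operator family the rolling model inverts analytically.

def hilbert_transform(x):
    """Discrete Hilbert transform H[x] of a periodic real signal."""
    n = len(x)
    X = np.fft.fft(x)
    sign = np.zeros(n)
    sign[1:(n + 1) // 2] = 1.0   # positive frequencies
    sign[n // 2 + 1:] = -1.0     # negative frequencies (Nyquist/DC -> 0)
    return np.real(np.fft.ifft(-1j * sign * X))

# Sanity check against the classical identity H[cos] = sin.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
print(np.allclose(hilbert_transform(np.cos(t)), np.sin(t)))  # → True
```

The analytic inversion in the paper serves the same role as inverting this multiplier, but yields a closed-form pressure expression instead of a numerical solve over the flattened region.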
Abstract:
This paper presents an efficient, low-complexity clipping noise compensation scheme for PAR-reduced orthogonal frequency division multiple access (OFDMA) systems. Conventional clipping noise compensation schemes proposed for OFDM systems are decision-directed schemes that use demodulated data symbols. These schemes therefore fail to deliver the expected performance in OFDMA systems, where multiple users share a single OFDM symbol and a specific user may know only his or her own modulation scheme. The proposed clipping noise estimation and compensation scheme does not require knowledge of the demodulated symbols of the other users, making it very promising for OFDMA systems. It uses the equalized output and the reserved tones to reconstruct the signal by compensating for the clipping noise. Simulation results show that the proposed scheme can significantly improve system performance.
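The setting can be made concrete with a toy sketch of amplitude clipping and its effect on the peak-to-average power ratio (PAR). The subcarrier count, constellation and clipping threshold below are illustrative assumptions; the paper's tone-reservation and estimation steps are not reproduced:

```python
import numpy as np

# Toy sketch: clip an OFDM time-domain signal and observe the PAR drop.
# The residual (x_clipped - x) is the clipping noise that a receiver-side
# scheme, like the one described above, must estimate and compensate.

rng = np.random.default_rng(1)
n_subcarriers = 256
# Random QPSK data on the subcarriers; OFDM time signal via the IFFT.
data = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n_subcarriers)
x = np.fft.ifft(data) * np.sqrt(n_subcarriers)

def par_db(signal):
    """Peak-to-average power ratio in dB."""
    p = np.abs(signal) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip(signal, threshold):
    """Clip the envelope at `threshold`, preserving the phase."""
    mag = np.abs(signal)
    scale = np.minimum(1.0, threshold / np.maximum(mag, 1e-12))
    return signal * scale

x_clipped = clip(x, threshold=1.6)
clipping_noise = x_clipped - x        # distortion to be estimated/compensated
print(par_db(x) > par_db(x_clipped))  # → True: clipping lowers the PAR
```

Clipping buys a lower PAR (and thus cheaper power amplification) at the cost of exactly this in-band distortion, which is why the compensation scheme operates on the equalized output rather than on other users' demodulated symbols.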
Abstract:
The literature on corporate identity management suggests that managing corporate identity is a strategically complex task embracing the shaping of a range of dimensions of organisational life. The performance measurement literature and its applications likewise emphasise an organisation's ability to incorporate various dimensions, considering both financial and non-financial performance measures when assessing success. The inclusion of these soft, non-financial measures challenges organisations to quantify intangible aspects of performance such as corporate identity, transforming unmeasurables into measurables. This paper explores the regulatory roles of the balanced scorecard in shaping key dimensions of corporate identities in a public sector shared service provider in Australia. The case study employs qualitative interviews of senior managers and employees, secondary data and participant observation. The findings suggest that use of the balanced scorecard has the potential to support identity construction, as an organisational symbol, a communication tool of vision, and as strategy, by creating conversations that self-regulate behaviour. The development of an integrated performance measurement system, the balanced scorecard, becomes an expression of a desired corporate identity, while the performance measures and continuous process provide the resource for interpreting actual corporate identities. Through this process of understanding and mobilising the interaction, it may be possible to create a less obtrusive and more subtle way to control “what an organisation is”. The case study also suggests that the theoretical and practical fusion of disciplinary knowledge around corporate identities and performance measurement systems could contribute to understanding and shaping corporate identities.
Abstract:
As part of vital infrastructure and transportation networks, bridge structures must function safely at all times. However, due to heavier and faster-moving vehicular loads and changes of function, such as Busway accommodation, many bridges now operate at overloads beyond their design capacity. Additionally, the huge renovation and replacement costs are often difficult for infrastructure owners to bear. Structural health monitoring (SHM) is employed to assess the condition and foresee probable failures of designated bridges. Recently proposed SHM systems incorporate Vibration-Based Damage Detection (VBDD) techniques, statistical methods and signal processing techniques, and have been regarded as efficient and economical ways to solve the problem. This paper reviews recent developments in damage detection and condition assessment techniques based on VBDD and statistical methods. The VBDD methods discussed are based on changes in natural frequencies, curvature/strain modes, modal strain energy (MSE), dynamic flexibility and artificial neural networks (ANN) before and after damage, together with other signal processing methods such as wavelet techniques and empirical mode decomposition (EMD) / Hilbert spectrum methods.
Abstract:
This paper explores the possibility of including human factoring in a business process model. The importance of doing so is fourfold: (1) the organization becomes transparent in its processes, as all participants (humans, activities and events) are identifiable; (2) including human factoring allows organizations to hire according to process needs; (3) human factoring alleviates the work-related stress currently being encountered; and (4) it enables a quicker transition of newer employees into their job scope. This was made possible by including a human behaviour layer between pools within a process to depict human behaviour and feeling. Future work includes having a human thought symbol and a human interaction symbol included in the Business Process Modelling Notation (BPMN).
Abstract:
As an Aboriginal woman currently reviewing feminist literature in Australia, I have found that representations of Aboriginal women's gender have been generated predominantly by women anthropologists. Australian feminists utilise this literature in their writing and teaching and accept its truths without question; the most often quoted ethnographic text is Diane Bell's Daughters of the Dreaming (1983a). Feminists' lack of critical engagement with this literature implies that they are content to accept women anthropologists' representations because Aboriginal women are not central to their constructions of feminism. Instead the Aboriginal woman is positioned on the margins, a symbol of difference; a reminder that it is feminists who are the bearers of true womanhood.
Abstract:
This essay--part of a special issue on the work of Gunther Kress--uses the idea of affordances and constraints to explore the (im)possibilities of new environments for engaging with literature written for children (see Kress, 2003). In particular, it examines a festival of children's literature from an Australian education context that occurs online. The festival is part of a technologically mediated library space designated by the term libr@ry (Kapitzke & Bruce, 2006). The @ symbol (French word "arobase") inserted into the word library indicates that technological mediation has a history, an established set of social practices, and a political economy, which even chatrooms with "real" authors may alter but not fully supplant.