897 results for Performance Estimation
Abstract:
The importance of actively managing and analysing business processes is acknowledged more than ever in today's organisations. Business processes form an essential part of an organisation, and their application areas are manifold. Most organisations keep records of the activities they carry out for auditing purposes, but these records are rarely used for analysis. This paper describes the design and implementation of a process analysis tool that replays, analyses and visualises a variety of performance metrics from a process definition and its corresponding execution logs. A YAWL process model example demonstrates the replayer's capacity to support advanced language constructs.
Abstract:
This study demonstrates the feasibility of additively manufactured poly(ε-caprolactone)/silanized tricalcium phosphate (PCL/TCP(Si)) scaffolds coated with a carbonated hydroxyapatite (CHA)-gelatin composite for bone tissue engineering. To reinforce PCL/TCP scaffolds to match the mechanical properties of cancellous bone, TCP was modified with 3-glycidoxypropyl trimethoxysilane (GPTMS) and incorporated into PCL to synthesize a PCL/TCP(Si) composite. The successful modification is confirmed by X-ray photoelectron spectroscopy (XPS) and Fourier transform infrared spectroscopy (FTIR) analysis. Additively manufactured PCL/TCP(Si) scaffolds were fabricated using a screw extrusion system (SES). Compression testing demonstrates that both the compressive modulus and compressive yield strength of the developed PCL/TCP(Si) scaffolds fall within the lower range of mechanical properties for cancellous bone, at 6.0 times and 2.3 times those of PCL/TCP scaffolds, respectively. To enhance the osteoconductive properties of the developed PCL/TCP(Si) scaffolds, a CHA-gelatin composite was coated onto the scaffolds via a biomimetic co-precipitation process, verified by scanning electron microscopy (SEM) and XPS. Confocal laser microscopy and SEM images reveal the most uniform distribution of porcine bone marrow stromal cells (BMSCs) and cell-sheet accumulation on the CHA-gelatin composite coated PCL/TCP(Si) scaffolds. By day 10, the proliferation rate of BMSCs on the CHA-gelatin composite coated PCL/TCP(Si) scaffolds is 2.0 and 1.4 times that on PCL/TCP(Si) and CHA coated PCL/TCP(Si) scaffolds, respectively. Furthermore, reverse transcription polymerase chain reaction (RT-PCR) and western blot analyses reveal that CHA-gelatin composite coated PCL/TCP(Si) scaffolds stimulate osteogenic differentiation of BMSCs more strongly than the other scaffolds.
In vitro SEM, confocal microscopy and proliferation results also show that the GPTMS modification has no detrimental effect on the biocompatibility of the scaffolds.
Abstract:
In fast bowling, cricketers are expected to produce a range of delivery lines and lengths while maximising ball speed. From a coaching perspective, technique consistency has typically been associated with superior performance in these areas. However, although bowlers are required to bowl consistently, at the elite level they must also be able to vary line, length and speed to adapt to opposition batters' strengths and weaknesses. The relationship between technique and performance variability (and consistency) has not been investigated in previous fast bowling research. Consequently, the aim of this study was to quantify both technique (bowling action and coordination) and performance variability in elite fast bowlers from Australian Junior and National Pace Squads. Technique variability was analysed to investigate whether it could be classified as functional or dysfunctional in relation to speed and accuracy.
Abstract:
Picturebooks invite performance every time they are read. What happens to them when they are adapted for live performance? This ongoing practice-led research project (2008-) regenerates and transforms the picturebook The Empty City (Hachette/Livre 2007) by David Megarrity and Jonathon Oxlade into a live experience. In this rebuilding, the interanimation of text and illustration on the picturebook page suddenly opens up into a new and complex structure incorporating musical composition, animation, live action, projected image and performing objects. The presenter is the creator of the source text and the writer/composer of the adaptation, providing a unique vantage point that draws on sources from both within and outside the creative process, up to and including audience reception. From the foundations up, this paper focuses on the deep, muddy sites of development in the adaptation process, the treasures unearthed, and how perceptions of fear and safety push, sway and stress the building of a new performance work for children in content, form and process.
Abstract:
Numerous tools and techniques have been developed to eliminate or reduce waste and to carry lean concepts into the manufacturing environment. However, the appropriate lean tools need to be selected and implemented to fulfil manufacturers' needs within their budgetary constraints. It is therefore important to identify those needs and implement only the tools that contribute the greatest benefit towards them. In this research, a mathematical model is proposed for maximising the perceived value of manufacturer needs, and a step-by-step methodology is developed to select the best performance metrics along with appropriate lean strategies within the budgetary constraints. The proposed model and method are demonstrated with the help of a case study.
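The budget-constrained selection the abstract describes has the shape of a 0/1 knapsack problem: each candidate lean tool has an implementation cost and a perceived value towards the manufacturer's needs, and the goal is the highest-value subset within budget. A minimal sketch, with invented tool names, costs and values (not the paper's actual model or data):

```python
def select_tools(tools, budget):
    """0/1 knapsack over lean tools: maximise total perceived value
    without exceeding the budget. tools is a list of
    (name, cost, value) triples with integer costs."""
    best = {0: (0, [])}  # budget spent -> (best value, tools chosen)
    for name, cost, value in tools:
        # Snapshot the states so each tool can be chosen at most once.
        for spent, (val, chosen) in list(best.items()):
            s = spent + cost
            if s <= budget and (s not in best or best[s][0] < val + value):
                best[s] = (val + value, chosen + [name])
    return max(best.values())

# Hypothetical costs (in budget units) and perceived values.
tools = [("5S", 3, 4), ("Kanban", 5, 6), ("SMED", 4, 5), ("TPM", 6, 7)]
value, chosen = select_tools(tools, budget=10)
```

With these invented numbers the best plan within a budget of 10 is SMED plus TPM (cost 10, value 12); a real model would score each tool against weighted manufacturer needs rather than a single value.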
Abstract:
Many traffic situations require drivers to cross or merge into a stream with higher priority. Gap acceptance theory enables us to model such processes when analysing traffic operation. This discussion demonstrates that a numerical search, fine-tuned by statistical analysis, can be used to determine the most likely critical gap for a sample of drivers, based on each driver's largest rejected gap and accepted gap. The method shares some features with the Maximum Likelihood Estimation technique (Troutbeck 1992) but lends itself well to contemporary analysis tools such as spreadsheets and is particularly transparent analytically. It is considered not to bias critical gap estimation in the presence of very small or very large rejected gaps. However, it requires a sample large enough to give reasonable representation of largest rejected gap/accepted gap pairs within a fairly narrow highest-likelihood search band.
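As an illustration of the kind of numerical search described above, the sketch below assumes drivers' critical gaps are normally distributed (an assumption made here for illustration, not necessarily the paper's model) and grid-searches for the mean and standard deviation that maximise the probability that each driver's critical gap lies between their largest rejected gap and their accepted gap. The gap data are invented:

```python
import math

# Invented (largest rejected gap, accepted gap) pairs in seconds, one per driver.
pairs = [(2.1, 4.5), (3.0, 5.2), (2.8, 4.0), (1.9, 3.6), (3.4, 6.1)]

def norm_cdf(x, mu, sigma):
    """Normal CDF computed via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def log_likelihood(mu, sigma):
    """Log-probability that every driver's critical gap falls between
    their largest rejected gap and their accepted gap."""
    ll = 0.0
    for rejected, accepted in pairs:
        p = norm_cdf(accepted, mu, sigma) - norm_cdf(rejected, mu, sigma)
        ll += math.log(max(p, 1e-12))  # guard against log(0)
    return ll

# Transparent, spreadsheet-style grid search over mean and sd (0.1 s steps).
best = max(((m / 10.0, s / 10.0) for m in range(20, 61) for s in range(5, 21)),
           key=lambda ms: log_likelihood(*ms))
```

In practice the search band and the distributional assumption would be refined statistically, as the abstract notes; with only five invented pairs the estimate is indicative at best.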
Abstract:
Markov chain Monte Carlo (MCMC) estimation provides a solution to the complex integration problems faced in the Bayesian analysis of statistical problems. The implementation of MCMC algorithms is, however, code intensive and time consuming. We have developed a Python package called PyMCMC that aids in the construction of MCMC samplers, substantially reduces the likelihood of coding error, and helps to minimise repetitive code. PyMCMC contains classes for Gibbs, Metropolis Hastings, independent Metropolis Hastings, random walk Metropolis Hastings, orientational bias Monte Carlo and slice samplers, as well as specific modules for common models, such as a module for Bayesian regression analysis. PyMCMC is straightforward to optimise, taking advantage of the Python libraries Numpy and Scipy, and is readily extensible with C or Fortran.
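To make the samplers concrete: a random walk Metropolis Hastings kernel, one of the sampler types listed above, can be sketched in plain Python as follows. This is a generic illustration of the algorithm, not PyMCMC's actual API:

```python
import math
import random

def rw_metropolis(logpost, x0, step, n):
    """Random walk Metropolis Hastings sketch: propose x' = x + N(0, step)
    and accept with probability min(1, exp(logpost(x') - logpost(x)))."""
    x, lp = x0, logpost(x0)
    chain = []
    for _ in range(n):
        prop = x + random.gauss(0.0, step)
        lp_prop = logpost(prop)
        if lp_prop - lp >= 0 or random.random() < math.exp(lp_prop - lp):
            x, lp = prop, lp_prop  # accept; otherwise keep current state
        chain.append(x)
    return chain

# Example: sample from a standard normal target, logpost(x) = -x^2 / 2.
random.seed(0)
chain = rw_metropolis(lambda x: -0.5 * x * x, x0=0.0, step=1.0, n=20000)
mean = sum(chain) / len(chain)
var = sum((c - mean) ** 2 for c in chain) / len(chain)
```

A package such as PyMCMC wraps this kind of loop so that users supply only the log-posterior, which is where most of the repetitive, error-prone code is avoided.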
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only had an impact on traditional areas of electrical engineering, but has also had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the lecture notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, Discrete-time Fourier, and Discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II, which is suitable for an advanced signal processing course, considers selected signal processing systems and techniques. The core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
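As a small taste of the Part I material, discrete convolution, the first core topic listed, can be computed directly from its defining sum y[n] = Σ_k x[k]·h[n−k]; the sketch below convolves a short signal with a two-point moving-average filter:

```python
def convolve(x, h):
    """Direct-form discrete convolution: y[n] = sum_k x[k] * h[n - k]."""
    y = [0.0] * (len(x) + len(h) - 1)
    for k, xk in enumerate(x):
        for m, hm in enumerate(h):
            y[k + m] += xk * hm
    return y

print(convolve([1, 2, 3], [0.5, 0.5]))  # [0.5, 1.5, 2.5, 1.5]
```

The output length is len(x) + len(h) − 1, and each output sample is a cost-weighted sum of the input samples the filter currently overlaps.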
Abstract:
This paper investigates the factors that drive high levels of corporate sustainability performance (CSP), as proxied by membership of the Dow Jones Sustainability World Index. Using a stakeholder framework, we examine the incentives for US firms to invest in sustainability principles and develop a number of hypotheses that relate CSP to firm-specific characteristics. Our results indicate that leading CSP firms are significantly larger, have higher levels of growth and a higher return on equity than conventional firms. Contrary to our predictions, leading CSP firms do not have greater free cash flows or lower leverage than other firms.
Abstract:
The aim of the study is to establish optimum building aspect ratios and south window sizes for residential buildings from a thermal performance point of view. The effects of six different building aspect ratios, with eight different south window sizes for each aspect ratio, are analyzed for apartments located at the intermediate floors of buildings, with the aid of the computer-based thermal analysis program SUNCODE-PC, in five cities of Turkey: Erzurum, Ankara, Diyarbakir, Izmir, and Antalya. The results are evaluated in terms of annual energy consumption and the optimum values are derived. A comparison of the optimum values and the total energy consumption rates is made among the analyzed cities.
Abstract:
Human facial expression is a complex process characterized by dynamic, subtle and regional emotional features. State-of-the-art approaches to facial expression recognition (FER) have not fully utilized these features to improve recognition performance. This paper proposes an approach that overcomes this limitation using patch-based ‘salient’ Gabor features. A set of 3D patches is extracted to represent the subtle and regional features, and then input into patch-matching operations that capture the dynamic features. Experimental results show a significant performance improvement of the proposed approach due to the use of the dynamic features. Performance comparison with previous work also confirms that the proposed approach achieves the highest correct recognition rate (CRR) reported to date on the JAFFE database and top-level performance on the Cohn-Kanade (CK) database.