960 results for Tracking performance
Abstract:
The importance of actively managing and analyzing business processes is acknowledged more than ever in organizations today. Business processes form an essential part of an organization, and their application areas are manifold. Most organizations keep records of the activities they carry out for auditing purposes, but these records are rarely used for analysis. This paper describes the design and implementation of a process analysis tool that replays, analyzes and visualizes a variety of performance metrics using a process definition and its execution logs. Performance analysis of existing and planned process models gives organizations an effective way to detect bottlenecks within their processes and to make better-informed process improvement decisions. Our technique is applied to processes modeled in the YAWL language. Execution logs of process instances are compared against the corresponding YAWL process model and replayed in a robust manner, taking into account any noise in the logs. Finally, performance characteristics obtained from replaying the log in the model are projected onto the model.
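To make the replay idea concrete, here is a minimal Python sketch of accumulating per-activity timing statistics while walking a trace through a simplified model. The tuple layout and the dict-based stand-in for a YAWL net are illustrative assumptions, not the tool's actual engine.

```python
from collections import defaultdict

def replay(log, model):
    """log: list of traces; each trace is a list of (activity, start, end)
    tuples with timestamps in seconds. model: dict mapping each activity
    to the set of activities allowed to follow it (a simplified stand-in
    for a YAWL net)."""
    durations = defaultdict(list)
    waits = defaultdict(list)
    for trace in log:
        prev_act, prev_end = None, None
        for act, start, end in trace:
            if prev_act is not None and act not in model.get(prev_act, set()):
                continue  # noisy event: not enabled at this point, skip it
            durations[act].append(end - start)
            if prev_end is not None:
                waits[act].append(start - prev_end)  # queueing time
            prev_act, prev_end = act, end
    return durations, waits
```

Per-activity waiting times of this kind, projected onto the model, are exactly the sort of metric that exposes bottlenecks.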
Abstract:
In this article Caroline Heim explores an avenue for the audience's contribution to the theatrical event that has become increasingly important over the past decade: post-performance discussions. With the exception of theatres that actively encourage argument, such as the Staatstheater Stuttgart, most extant audience discussions in Western mainstream theatres privilege the voice of the theatre expert. Heim presents case studies of post-performance discussions held after performances of Anne of the Thousand Days and Who's Afraid of Virginia Woolf? which trialled a new model of audience co-creation. An audience text which informs the theatrical event was created, and a new role, that of audience critic, was established in the process.
Abstract:
A practical approach for identifying solution robustness is proposed for situations where parameters are uncertain. The approach is based upon the interpretation of a probability density function (pdf) and the definition of three parameters that describe how significant changes in the performance of a solution are deemed to be. The pdf is constructed by interpreting the results of simulations. The number of simulations is kept to a minimum by updating the mean, variance, skewness and kurtosis of the sample using computationally efficient recursive equations; once these criteria have converged, no further simulations are needed. A case study involving several no-intermediate-storage flow shop scheduling problems demonstrates the effectiveness of the approach.
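The abstract does not reproduce the recursions themselves; the standard one-pass (Welford/Pébay) moment updates below are a plausible sketch of the kind of computationally efficient recursion described, with the convergence test left to the caller.

```python
import math

class RunningMoments:
    """One-pass tracking of mean, variance, skewness and kurtosis."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.M2 = 0.0  # running sums of centred powers
        self.M3 = 0.0
        self.M4 = 0.0

    def update(self, x):
        n1 = self.n
        self.n += 1
        delta = x - self.mean
        delta_n = delta / self.n
        delta_n2 = delta_n * delta_n
        term1 = delta * delta_n * n1
        self.mean += delta_n
        self.M4 += (term1 * delta_n2 * (self.n * self.n - 3 * self.n + 3)
                    + 6 * delta_n2 * self.M2 - 4 * delta_n * self.M3)
        self.M3 += term1 * delta_n * (self.n - 2) - 3 * delta_n * self.M2
        self.M2 += term1

    @property
    def variance(self):
        return self.M2 / (self.n - 1) if self.n > 1 else float("nan")

    @property
    def skewness(self):  # assumes a non-degenerate sample (M2 > 0)
        return math.sqrt(self.n) * self.M3 / self.M2 ** 1.5

    @property
    def kurtosis(self):  # excess kurtosis
        return self.n * self.M4 / (self.M2 * self.M2) - 3.0
```

A driver loop would call update() after each simulation and stop once successive values of all four moments change by less than a chosen tolerance.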
Abstract:
While researchers strive to improve automatic face recognition performance, the relationship between image resolution and face recognition performance has received little attention. This relationship is examined systematically, and a framework is developed within which results from super-resolution techniques can be compared. Three super-resolution techniques are compared using the Eigenface and Elastic Bunch Graph Matching face recognition engines, and the parameter ranges over which these techniques provide better recognition performance than interpolated images are determined.
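A hedged sketch of such a comparison framework: downsample full-quality probes, restore them with either plain interpolation or a super-resolution method, and score verification accuracy. The matcher, threshold and pair format are illustrative assumptions; OpenCV supplies the resizing.

```python
import cv2

FACTOR = 4  # resolution reduction under test (illustrative)

def downsample(img):
    """Simulate a low-resolution capture by area-decimating the image."""
    h, w = img.shape[:2]
    return cv2.resize(img, (w // FACTOR, h // FACTOR),
                      interpolation=cv2.INTER_AREA)

def bicubic(img):
    """Baseline restoration: plain bicubic interpolation to full size."""
    return cv2.resize(img, None, fx=FACTOR, fy=FACTOR,
                      interpolation=cv2.INTER_CUBIC)

def accuracy(pairs, restore, verify, threshold=0.5):
    """pairs: (probe_img, gallery_img, same_person) tuples. restore:
    bicubic or a super-resolution method. verify: a face matcher
    returning a similarity score in [0, 1] (assumed available, e.g. an
    Eigenface distance mapped to a score)."""
    correct = sum(int((verify(restore(downsample(p)), g) > threshold) == same)
                  for p, g, same in pairs)
    return correct / len(pairs)
```

Sweeping FACTOR and plotting accuracy(pairs, sr_method, ...) against accuracy(pairs, bicubic, ...) yields the parameter ranges the abstract refers to.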
Abstract:
The time-consuming and labour-intensive task of identifying individuals in surveillance video is often hampered by poor resolution and the sheer volume of stored video. Faces or identifying marks such as tattoos are often too coarse for direct matching by machine or human vision. Object tracking and super-resolution can be combined to facilitate the automated detection and enhancement of areas of interest. The object tracking process enables the automatic detection of people of interest, greatly reducing the amount of data passed to super-resolution. Smaller regions such as faces can also be tracked, and a number of instances of such regions can then be combined to obtain a super-resolved version for matching. Performance improvement from super-resolution is demonstrated using a face verification task: a consistent improvement of approximately 7% in verification accuracy is shown, using both Eigenface and Elastic Bunch Graph Matching approaches for automatic face verification, starting from faces with an eye-to-eye distance of 14 pixels. Visual improvement in image fidelity of super-resolved images over low-resolution and interpolated images is demonstrated on a small database. Current research and future directions in this area are also summarized.
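As an illustration of fusing multiple tracked instances, here is a naive shift-and-add scheme, a rough stand-in for the (unspecified) super-resolution method: register each tracked face crop to the first, upscale, and average. It assumes single-channel (grayscale) crops of identical size and uses OpenCV's phase correlation for registration.

```python
import cv2
import numpy as np

def shift_and_add(frames, factor=4):
    """frames: list of same-sized grayscale face crops from a track.
    Returns a factor-times upscaled fused image."""
    ref = frames[0]
    acc = cv2.resize(ref, None, fx=factor, fy=factor,
                     interpolation=cv2.INTER_CUBIC).astype(np.float64)
    for frame in frames[1:]:
        # Translation of this crop relative to the reference (sign
        # convention per OpenCV's phaseCorrelate).
        (dx, dy), _ = cv2.phaseCorrelate(np.float64(ref), np.float64(frame))
        up = cv2.resize(frame, None, fx=factor, fy=factor,
                        interpolation=cv2.INTER_CUBIC)
        # Pull the crop back onto the reference grid before accumulating.
        M = np.float32([[1, 0, -dx * factor], [0, 1, -dy * factor]])
        acc += cv2.warpAffine(up, M, (acc.shape[1], acc.shape[0]))
    return (acc / len(frames)).astype(np.uint8)
```

Real multi-frame super-resolution adds sub-pixel registration and deconvolution, but even this crude fusion illustrates why several tracked instances beat a single interpolated frame.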
Abstract:
This letter presents a technique for assessing the overall network performance of sampled-value process buses based on IEC 61850-9-2, using measurements from a single location in the network. The method is based upon the use of Ethernet cards with externally synchronized time stamping, together with characteristics of the process bus protocol. The application and utility of the method are demonstrated by measuring the latency introduced by Ethernet switches. Network latency can be measured from a single set of captures, rather than by comparing source and destination captures. Absolute latency measurements will greatly assist the design, testing, commissioning and maintenance of these critical data networks.
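A heavily hedged reading of the single-capture idea: because IEC 61850-9-2 frames carry a sample counter (smpCnt) and the merging unit is locked to the same clock as the capture card, each frame implies a nominal transmission instant, so latency falls out of one timestamped capture. The sketch below is illustrative under that assumption, not the letter's actual procedure; all parameter names are hypothetical.

```python
SAMPLES_PER_S = 4000  # 9-2 LE protection stream: 80 samples/cycle at 50 Hz

def latency_us(capture_ts, smp_cnt, second_start):
    """capture_ts: hardware timestamp of the captured frame, in seconds,
    from an externally time-synchronized NIC. smp_cnt: smpCnt field
    decoded from the sampled-value frame. second_start: the epoch second
    in which smpCnt restarted from zero."""
    nominal_tx = second_start + smp_cnt / SAMPLES_PER_S
    return (capture_ts - nominal_tx) * 1e6  # microseconds of network delay
```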
Abstract:
To support intelligent transportation system (ITS) road safety applications such as collision avoidance, lane departure warnings and lane keeping, a Global Navigation Satellite System (GNSS) based vehicle positioning system has to provide lane-level (0.5 to 1 m) or even in-lane-level (0.1 to 0.3 m) accurate and reliable positioning information to vehicle users. However, current vehicle navigation systems equipped with a single-frequency GPS receiver can only provide road-level accuracy of 5-10 meters. Positioning accuracy can be improved to sub-meter level or better with augmented GNSS techniques such as Real Time Kinematic (RTK) and Precise Point Positioning (PPP), which have traditionally been used in land surveying and other slowly moving environments. In these techniques, GNSS correction data generated from a local, regional or global network of GNSS ground stations are broadcast to users via various communication data links, mostly 3G cellular networks and communication satellites. This research investigated the performance of precise positioning systems operating in high-mobility environments. This involved evaluating the performance of both the RTK and PPP techniques using: i) a state-of-the-art dual-frequency GPS receiver; and ii) a low-cost single-frequency GNSS receiver. Additionally, this research evaluated the effectiveness of several operational strategies for reducing the load that correction data transmission places on data communication networks, which may be problematic for future wide-area ITS service deployment. These strategies include the use of different data transmission protocols, different correction data format standards, and correction data transmission at less frequent intervals. A series of field experiments was designed and conducted for each research task. First, the performance of the RTK and PPP techniques was evaluated in both static and kinematic (highway driving at speeds exceeding 80 km/h) experiments. RTK solutions achieved RMS precisions of 0.09 to 0.2 m in the static tests and 0.2 to 0.3 m in the kinematic tests, while PPP achieved 0.5 to 1.5 m in the static tests and 1 to 1.8 m in the kinematic tests using the RTKLIB software. These RMS precision values could be further improved if better RTK and PPP algorithms were adopted. The test results also showed that RTK may be more suitable for lane-level-accuracy vehicle positioning. The professional-grade (dual-frequency) and mass-market-grade (single-frequency) GNSS receivers were then tested for their RTK performance in static and kinematic modes. The analysis showed that mass-market-grade receivers provide good solution continuity, although their overall positioning accuracy is worse than that of professional-grade receivers. In an attempt to reduce the load on the data communication network, we first evaluated the use of different correction data format standards, namely the RTCM version 2.x and RTCM version 3.0 formats. A 24-hour transmission test was conducted to compare network throughput; the results showed that a 66% reduction in network throughput can be achieved by using the newer RTCM version 3.0 format instead of the older RTCM version 2.x format. Secondly, experiments were conducted to examine the use of two data transmission protocols, TCP and UDP, for correction data transmission over the Telstra 3G cellular network.
The performance of each transmission method was analysed in terms of packet transmission latency, packet dropout, packet throughput and packet retransmission rate. The overall network throughput and latency of UDP data transmission were 76.5% and 83.6% of those of TCP data transmission, while the overall accuracy of the positioning solutions remained at the same level. Additionally, due to the nature of UDP transmission, 0.17% of UDP packets were lost during the kinematic tests, but this loss did not lead to a significant reduction in the quality of the positioning results. The experimental results from the static and kinematic field tests also showed that the mobile network connection may be blocked for a couple of seconds, but the positioning solutions can be kept at the required accuracy level by an appropriate setting of the Age of Differential parameter. Finally, we investigated the effect of using less frequent correction data (transmitted at 1, 5, 10, 15, 20, 30 and 60 second intervals) on the precise positioning system. As the transmission interval increases, the percentage of ambiguity-fixed solutions gradually decreases, while the positioning error increases from 0.1 to 0.5 m. The results showed that positioning accuracy can still be kept at the in-lane level (0.1 to 0.3 m) with correction data transmitted at intervals of up to 20 seconds.
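For flavour, a minimal sketch of timing a UDP correction stream of the sort compared above; the destination address, pacing and frame source are placeholders (the real tests used RTCM 2.x/3.0 streams over Telstra's 3G network).

```python
import socket
import time

DEST = ("203.0.113.10", 2101)  # placeholder address (TEST-NET-3) and port

def send_corrections(frames, interval_s=1.0):
    """frames: iterable of bytes objects (encoded RTCM messages).
    Sends one frame per epoch and returns the achieved throughput
    in bytes per second."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sent = 0
    t0 = time.monotonic()
    for frame in frames:
        sock.sendto(frame, DEST)  # fire-and-forget: no retransmission
        sent += len(frame)
        time.sleep(interval_s)    # raise to 5-60 s to cut network load
    return sent / (time.monotonic() - t0)
```

The TCP variant simply swaps SOCK_DGRAM for SOCK_STREAM with connect()/send(), at the cost of the retransmission and head-of-line-blocking overhead the thesis quantifies.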
Abstract:
Modelling activities in crowded scenes is very challenging, as object tracking is not robust in complicated scenes and optical flow does not capture long-range motion. We propose a novel approach to analysing activities in crowded scenes using a “bag of particle trajectories”. Particle trajectories are extracted from foreground regions within short video clips using particle video, which estimates long-range motion, in contrast to optical flow, which captures only inter-frame motion. Our applications include temporal video segmentation and anomaly detection, and we evaluate our approach on several real-world datasets containing complicated scenes, showing that it achieves state-of-the-art performance on both tasks.
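A minimal sketch of the bag-of-words stage, assuming particle trajectories have already been extracted: each trajectory is summarised by its displacement vectors, quantised against a k-means codebook, and histogrammed per clip. The feature length, codebook size and scikit-learn clustering are illustrative choices, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans

def trajectory_features(trajectories, length=10):
    """trajectories: list of (T, 2) arrays of (x, y) particle positions.
    Returns fixed-length descriptors built from displacement vectors."""
    feats = []
    for traj in trajectories:
        d = np.diff(traj, axis=0)[:length]  # frame-to-frame displacements
        if len(d) == length:
            feats.append(d.ravel())
    return np.array(feats)

def build_codebook(all_feats, k=100):
    """Cluster descriptors from training clips into k visual words."""
    return KMeans(n_clusters=k, n_init=10).fit(all_feats)

def clip_histogram(codebook, feats):
    """Normalised bag-of-trajectories descriptor for one clip."""
    words = codebook.predict(feats)
    hist = np.bincount(words, minlength=codebook.n_clusters)
    return hist / max(hist.sum(), 1)
```

Clip histograms can then be compared for temporal segmentation, or scored against a model of normal clips for anomaly detection.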
Abstract:
The Pomegranate Cycle is a practice-led enquiry consisting of a creative work and an exegesis. This project investigates the potential of self-directed, technologically mediated composition as a means of reconfiguring gender stereotypes within the operatic tradition. This practice confronts two primary stereotypes: the positioning of female performing bodies within narratives of violence and the absence of women from authorial roles that construct and regulate the operatic tradition. The Pomegranate Cycle redresses these stereotypes by presenting a new narrative trajectory of healing for its central character, and by placing the singer inside the role of composer and producer. During the twentieth and early twenty-first centuries, operatic and classical music institutions have resisted incorporating works of living composers into their repertory. Consequently, the canon’s historic representations of gender remain unchallenged. Historically and contemporarily, men have almost exclusively occupied the roles of composer, conductor, director and critic, and therefore men have regulated the pedagogy, performance practices, repertoire and organisations that sustain classical music. In this landscape, women are singers, and few have the means to challenge the constructions of gender they are asked to reproduce. The Pomegranate Cycle uses recording technologies as the means of driving change because these technologies have already challenged the regulation of the classical tradition by changing people’s modes of accessing, creating and interacting with music. Building on the work of artists including Phillips and van Veen, Robert Ashley and Diamanda Galas, The Pomegranate Cycle seeks to broaden the definition of what opera can be. This work examines the ways in which the operatic tradition can be hybridised with contemporary musical forms such as ambient electronica, glitch, spoken word and concrete sounds as a way of bringing the form into dialogue with contemporary music cultures. The utilisation of other sound cultures within the context of opera enables women’s voices and stories to be presented in new ways, while also providing a point of friction with opera’s traditional storytelling devices. The Pomegranate Cycle simulates aesthetics associated with Western art music genres by drawing on contemporary recording techniques, virtual instruments and sound-processing plug-ins. Through such simulations, the work disrupts the way virtuosic human craft has been used to generate authenticity and regulate access to the institutions that protect and produce Western art music. The DIY approach to production, recording, composition and performance of The Pomegranate Cycle demonstrates that an opera can be realised by a single person. Access to the broader institutions that regulate the tradition is not necessary. In short, The Pomegranate Cycle establishes that a singer can be more than a voice and a performing body. She can be her own multimedia storyteller. Her audience can be anywhere.
Abstract:
Sandwich shells have recently emerged as aesthetically pleasing, efficient and economical structural systems with a number of applications. They combine the advantages of sandwich layer technology with those of shell action. With the different materials and thicknesses used in the sandwich layers, their performance characteristics remain largely unquantified, and no design guidelines currently exist. This paper provides verification, through finite element modeling and testing, of the application of this technology to dome-styled dwellings, with research currently being conducted into further applications to roofing and floor structures.
Abstract:
A ground-based tracking camera and a co-aligned slitless spectrograph were used to measure the spectral signature of visible radiation emitted by the Hayabusa capsule as it entered the Earth's atmosphere in June 2010. Good quality spectra were obtained that showed the presence of radiation from the heat shield of the vehicle and from the shock-heated air in front of the vehicle. An analysis of the black-body nature of the radiation concluded that the peak average surface temperature was about (3100±100) K.
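As a sanity check on a temperature of that order (not the paper's fitting procedure, which is not detailed here), Wien's displacement law places the spectral peak of a 3100 K black body just beyond the visible range:

$$
\lambda_{\max} = \frac{b}{T} = \frac{2.898 \times 10^{-3}\ \mathrm{m\,K}}{3100\ \mathrm{K}} \approx 935\ \mathrm{nm},
$$

which is consistent with strong, broadly sloped emission across the visible band recorded by the tracking camera.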
Abstract:
The volcanic succession on Montserrat provides an opportunity to examine the magmatic evolution of island arc volcanism over a ∼2.5 Ma period, extending from the andesites of the Silver Hills center to the currently active Soufrière Hills volcano (February 2010). Here we present high-precision double-spike Pb isotope data, combined with trace element and Sr-Nd isotope data, spanning this period of Montserrat's volcanic evolution. We demonstrate that each volcanic center (South Soufrière Hills (SSH), Soufrière Hills, Centre Hills and Silver Hills) can be clearly discriminated using trace element and isotopic parameters. Variations in these parameters suggest there have been systematic and episodic changes in the subduction input. The SSH center, in particular, has a greater slab fluid signature, as indicated by low Ce/Pb, but less sediment addition than the other volcanic centers, which have higher Th/Ce. Pb isotope data from Montserrat fall along two trends: the Silver Hills, Centre Hills and Soufrière Hills centers lie on the general trend of the Lesser Antilles volcanics, whereas the SSH volcanics define a separate trend. The Soufrière Hills and SSH volcanic centers erupted at approximately the same time but retain distinctive isotopic signatures, suggesting that the SSH magmas have a different source from the other volcanic centers. We hypothesize that this rapid change in magmatic source was controlled by the regional transtensional regime, which allowed the SSH magma to be extracted from a shallower source. The Pb isotopes indicate an interplay between subduction-derived components and a MORB-like mantle wedge influenced by a Galapagos plume-like source.
Abstract:
The quick detection of abrupt (unknown) parameter changes in an observed hidden Markov model (HMM) is important in several applications. Motivated by the recent application of relative entropy concepts to the robust sequential change detection problem (and the related model selection problem), this paper proposes a sequential unknown-change detection algorithm based on a relative-entropy-based HMM parameter estimator. The proposed approach overcomes the lack of knowledge of post-change parameters, and is shown to perform similarly to the popular cumulative sum (CUSUM) algorithm (which requires knowledge of the post-change parameter values) when examined, on both simulated and real data, in a vision-based aircraft manoeuvre detection problem.
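For reference, the CUSUM baseline the paper compares against can be sketched in a few lines; note that this classical form needs the post-change likelihood, which is precisely the knowledge the relative-entropy estimator avoids. The function signature is illustrative.

```python
def cusum(log_lik_pre, log_lik_post, obs, threshold):
    """obs: observation sequence. log_lik_pre / log_lik_post: per-sample
    log-likelihood functions under the pre- and post-change models.
    Returns the first index at which the statistic crosses the threshold,
    or None if no change is declared."""
    s = 0.0
    for t, x in enumerate(obs):
        # Accumulate log-likelihood ratio, clipped at zero from below.
        s = max(0.0, s + log_lik_post(x) - log_lik_pre(x))
        if s > threshold:
            return t  # change declared at sample t
    return None
```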