937 results for Virtual Performance
Abstract:
Increasingly, semiconductor manufacturers are exploring opportunities for virtual metrology (VM) enabled process monitoring and control as a means of reducing non-value-added metrology and achieving ever more demanding wafer fabrication tolerances. However, developing robust, reliable and interpretable VM models can be very challenging due to the highly correlated input space often associated with the underpinning data sets. A particularly pertinent example is etch rate prediction of plasma etch processes from multichannel optical emission spectroscopy data. This paper proposes a novel input-clustering based forward stepwise regression methodology for VM model building in such highly correlated input spaces. Max Separation Clustering (MSC) is employed as a pre-processing step to identify a reduced set of well-conditioned, representative variables that can then be used as inputs to state-of-the-art model building techniques such as Forward Selection Regression (FSR), Ridge Regression, LASSO and Forward Selection Ridge Regression (FSRR). The methodology is validated on a benchmark semiconductor plasma etch dataset and the results obtained are compared with those achieved when the state-of-the-art approaches are applied directly to the data without the MSC pre-processing step. Significant performance improvements are observed when MSC is combined with FSR (13%) and FSRR (8.5%), but not with Ridge Regression (-1%) or LASSO (-32%). The optimal VM results are obtained using the MSC-FSR and MSC-FSRR generated models. © 2012 IEEE.
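The clustering-then-selection pipeline described above can be sketched as follows. Since MSC has no off-the-shelf implementation, scikit-learn's feature agglomeration stands in for it here, and the data are synthetic; both are assumptions for illustration only, not the paper's method or dataset.

```python
import numpy as np
from sklearn.cluster import FeatureAgglomeration
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 40))  # stand-in for multichannel OES data
y = X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.1, size=100)  # "etch rate"

# Step 1: group correlated channels and keep one representative per cluster
# (a stand-in for MSC's well-conditioned representative variables).
agglo = FeatureAgglomeration(n_clusters=8).fit(X)
reps = [int(np.where(agglo.labels_ == k)[0][0]) for k in range(8)]

# Step 2: forward selection regression over the representatives only.
fsr = SequentialFeatureSelector(
    LinearRegression(), n_features_to_select=3, direction="forward"
).fit(X[:, reps], y)
model = LinearRegression().fit(fsr.transform(X[:, reps]), y)
```

The point of the pre-clustering step is that the forward search then operates on a small, well-conditioned candidate set rather than the full correlated input space.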
Abstract:
Virtual metrology (VM) aims to predict metrology values using sensor data from production equipment and physical metrology values of preceding samples. VM is a promising technology for the semiconductor manufacturing industry as it can reduce the frequency of in-line metrology operations and provide supportive information for other operations such as fault detection, predictive maintenance and run-to-run control. The prediction models for VM can be drawn from a large variety of linear and nonlinear regression methods, and the selection of a proper regression method for a specific VM problem is not straightforward, especially when the candidate predictor set is high-dimensional, correlated and noisy. Using process data from a benchmark semiconductor manufacturing process, this paper evaluates the performance of four typical regression methods for VM: multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), neural networks (NN) and Gaussian process regression (GPR). It is observed that GPR performs the best among the four methods and that, remarkably, the performance of linear regression approaches that of GPR as the subset of selected input variables is increased. The observed competitiveness of high-dimensional linear regression models, which does not hold true in general, is explained in the context of extreme learning machines and functional link neural networks.
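A minimal sketch of such a four-way comparison on synthetic data; the data, hyperparameters and error metric below are illustrative assumptions, not the paper's benchmark setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))                      # stand-in sensor data
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(scale=0.05, size=200)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# The four method families compared in the abstract, with default-ish settings.
models = {
    "MLR": LinearRegression(),
    "LASSO": Lasso(alpha=0.01),
    "NN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    "GPR": GaussianProcessRegressor(),
}
scores = {name: mean_squared_error(yte, m.fit(Xtr, ytr).predict(Xte))
          for name, m in models.items()}
```

On real fab data the relative ordering of these methods is, as the abstract notes, the empirical question being studied; this sketch only shows the evaluation harness.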
Abstract:
Building Information Modelling (BIM) is growing apace, not only in design and construction stages, but also in the analysis of facilities throughout their life cycle. With this continued growth and utilisation of BIM processes comes the possibility of adopting such procedures to accurately measure the energy efficiency of buildings and estimate their energy usage. To this end, the aim of this research is to investigate whether BIM energy performance assessment, in the form of software analysis, provides accurate results when compared with actual recorded energy consumption. Through selective sampling, three domestic case studies are scrutinised, with baseline figures taken from existing energy providers and the results compared with calculations provided by two separate BIM energy analysis software packages. Of the numerous software packages available, criterion sampling is used to select two of the most prominent platforms on the market today: Integrated Environmental Solutions - Virtual Environment (IES-VE) and Green Building Studio (GBS). The results indicate that IES-VE estimated energy use to within approximately ±8% in two out of three case studies, while GBS estimated usage to within approximately ±5%. The findings indicate that BIM energy performance assessment, using proprietary software analysis, is a viable alternative to manual calculation of building energy use, mainly due to the accuracy and speed of assessing even the most complex models. Given the surge in accurate and detailed BIM models and the importance placed on the continued monitoring and control of buildings' energy use within today's environmentally conscious society, this provides an alternative means by which to assess a building's energy usage accurately, quickly and cost-effectively.
Abstract:
To intercept a moving object, one needs to be in the right place at the right time, which requires picking up and using perceptual information that specifies the time of arrival of the object at an interception point. In the present study, we examined the ability to intercept a laterally moving virtual sound object by controlling the displacement of a sliding handle, and tested whether and how the interaural time difference (ITD) could serve as the main source of perceptual information for successfully intercepting the virtual object. The results revealed that, in order to accomplish the task, one might need to vary the duration of the movement and control the hand velocity and the time to reach peak velocity (speed coupling), while adjustment of movement initiation did not facilitate performance. Furthermore, overall performance was more successful when subjects employed a time-to-contact (tau) coupling strategy. This result shows that prospective information is available in sound for guiding goal-directed actions.
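For background, the tau coupling strategy named above refers, in general tau theory, to keeping the tau of one motion gap (the gap divided by its rate of closure) in constant proportion to that of another. This formulation is assumed background, not spelled out in the abstract:

```latex
\tau(t) = \frac{x(t)}{\dot{x}(t)}, \qquad
\tau_{\text{hand}}(t) = k\,\tau_{\text{object}}(t), \quad k \text{ constant}
```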
Abstract:
Virtual metrology (VM) aims to predict metrology values using sensor data from production equipment and physical metrology values of preceding samples. VM is a promising technology for the semiconductor manufacturing industry as it can reduce the frequency of in-line metrology operations and provide supportive information for other operations such as fault detection, predictive maintenance and run-to-run control. Methods with minimal user intervention are required to perform VM in a real-time industrial process. In this paper, we propose extreme learning machines (ELMs) as a competitive alternative to popular methods such as lasso and ridge regression for developing VM models. In addition, we propose a new way to choose the hidden-layer weights of ELMs that leads to an improvement in their prediction performance.
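The core ELM idea, a randomly fixed hidden layer with output weights solved in closed form by least squares, can be sketched as follows. This is the generic textbook construction, not the improved weight-selection scheme the paper proposes; data and sizes are illustrative.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, seed=0):
    """Extreme learning machine: random hidden layer, least-squares output."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # input weights, never trained
    b = rng.normal(size=n_hidden)                 # hidden biases, never trained
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only beta is fitted
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 5))
y = np.sin(X[:, 0]) + 0.3 * X[:, 1]
W, b, beta = elm_fit(X, y)
pred = elm_predict(X, W, b, beta)
```

Because only the linear output weights are solved for, training reduces to one least-squares problem, which is what makes ELMs attractive for low-intervention, real-time VM deployment.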
Abstract:
Although technology can facilitate improvements in performance by allowing us to understand, monitor and evaluate performance, improvements must ultimately come from within the athlete. The first part of this article will focus on understanding how perception and action relate to performance from two different theoretical viewpoints. The first is a predominantly cognitive or indirect approach, which suggests that expertise and decision-making processes are mediated by athletes accruing large knowledge bases built up through practice and experience. The second, alternative approach advocates a more 'direct' solution, where the athlete learns to 'tune' into the relevant information that is embedded in their relationship with the surrounding environment and unfolding action. The second part of the article will attempt to show how emerging virtual reality technology is revealing new evidence that helps us understand elite performance. Possibilities for how new types of training could be developed from this technology will also be discussed. © 2014 Crown Copyright.
Abstract:
This chapter focuses on the relationship between improvisation and indeterminacy. We discuss the two practices by referring to play theory and game studies and situate them in recent network music performance. We develop a parallel with game theory in which indeterminacy is seen as a way of articulating situations where structural decisions are left to the discernment of the performers, and we discuss improvisation as a method of play. The improvisation-indeterminacy relationship is discussed in the context of network music performance, which employs digital networks in the exchange of data between performers and hence relies on topological structures with varying degrees of openness and flexibility. Artists such as Max Neuhaus and The League of Automatic Music Composers initiated the development of a multitude of practices and technologies exploring the network as an environment for music making. Even though the technologies behind "the network" have shifted dramatically since Neuhaus' use of radio in the 1960s, a preoccupation with the distribution and sharing of artistic agency has remained at the centre of networked practices. Gollo Föllmer, after undertaking an extensive review of network music initiatives, produced a typology that comprises categories as diverse as remix lists, sound toys, real/virtual space installations and network performances. For Föllmer, "the term 'Net music' comprises all formal and stylistic kinds of music upon which the specifics of electronic networks leave considerable traces, whereby the electronic networks strongly influence the process of musical production, the musical aesthetic, or the way music is received" (2005: 185).
Abstract:
How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. The question is addressed by proposing a six-step benchmarking methodology in which a user provides a set of four weights indicating how important each of four attribute groups (memory, processor, computation and storage) is to the application that needs to be executed on the cloud. The weights, along with cloud benchmarking data, are used to generate a ranking of VMs that can maximise the performance of the application. The rankings are validated through an empirical analysis using two case study applications: the first is a financial risk application and the second a molecular dynamics simulation, both representative of workloads that can benefit from execution on the cloud. Both case studies validate the feasibility of the methodology and highlight that maximum performance can be achieved on the cloud by selecting the top-ranked VMs produced by the methodology.
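The weighted-ranking step can be illustrated with a toy sketch. The VM names, benchmark scores and weights below are all made up, and the paper's full six-step methodology is not reproduced here; only the final "weighted sum of normalised group scores" idea is shown.

```python
# Hypothetical normalised benchmark scores per attribute group, per VM:
# [memory, processor, computation, storage]
benchmarks = {
    "m1.small":  [0.4, 0.3, 0.2, 0.6],
    "c1.medium": [0.5, 0.9, 0.8, 0.5],
    "r1.large":  [0.9, 0.6, 0.5, 0.7],
}
weights = [0.2, 0.4, 0.3, 0.1]  # user-supplied importance of each group

# Rank VMs by the weighted sum of their group scores, best first.
ranking = sorted(
    benchmarks,
    key=lambda vm: sum(w * s for w, s in zip(weights, benchmarks[vm])),
    reverse=True,
)
```

With these made-up numbers, a processor/computation-heavy weighting puts the compute-oriented VM at the top of the ranking.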
Abstract:
With the availability of a wide range of cloud Virtual Machines (VMs) it is difficult to determine which VMs can maximise the performance of an application. Benchmarking is commonly used to this end to capture the performance of VMs. Most cloud benchmarking techniques are heavyweight: time-consuming processes that have to benchmark the entire VM in order to obtain accurate benchmark data. Such benchmarks cannot be used in real-time on the cloud and incur extra costs even before an application is deployed.
In this paper, we present lightweight cloud benchmarking techniques that execute quickly and can be used in near real-time on the cloud. The exploration of lightweight benchmarking techniques is facilitated by the development of DocLite - Docker Container-based Lightweight Benchmarking. DocLite is built on Docker container technology, which allows a user-defined portion (such as memory size and number of CPU cores) of the VM to be benchmarked. DocLite operates in two modes. In the first mode, containers are used to benchmark a small portion of the VM to generate performance ranks. In the second mode, historic benchmark data is used along with the first mode, as a hybrid, to generate VM ranks. The generated ranks are evaluated against three scientific high-performance computing applications. The proposed techniques are up to 91 times faster than a heavyweight technique that benchmarks the entire VM. It is observed that the first mode can generate ranks with over 90% and 86% accuracy for sequential and parallel execution of an application, respectively. The hybrid mode improves the correlation slightly, but the first mode is sufficient for benchmarking cloud VMs.
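The idea of benchmarking only a user-defined slice of a VM via container resource limits can be sketched as follows. The benchmark image name is hypothetical; only the standard `docker run` resource flags are assumed, and the command is composed rather than executed.

```python
def benchmark_cmd(image="bench/sysbench", cpus=2, mem_mb=512):
    """Build a docker run command that benchmarks only part of the VM.

    Limiting CPUs and memory means the benchmark exercises a small,
    user-defined portion of the VM, which is what makes it lightweight.
    """
    return [
        "docker", "run", "--rm",
        f"--cpus={cpus}",        # cap the CPU cores visible to the benchmark
        f"--memory={mem_mb}m",   # cap the memory visible to the benchmark
        image,
    ]

# e.g. benchmark a 1-core, 256 MB slice of the VM
cmd = benchmark_cmd(cpus=1, mem_mb=256)
```

Running such a constrained container completes far faster than benchmarking the whole VM, at the cost of extrapolating the results to the full machine.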
Abstract:
Research in the field of sports performance is constantly developing new technology to help extract meaningful data to aid understanding in a multitude of areas, such as improving technical or motor performance. Video playback has previously been used extensively for exploring anticipatory behaviour. However, when using such systems, perception is not active: key information is lost that only emerges from the dynamics of the action unfolding over time and the active perception of the observer. Virtual reality (VR) may be used to overcome such issues. This paper presents the architecture and initial implementation of a novel VR cricket simulator, utilising state-of-the-art motion capture technology (21 Vicon cameras capturing the kinematic profiles of elite bowlers) and emerging VR technology (Intersense IS-900 tracking combined with Qualisys motion capture cameras, with visual display via a Sony HMZ-T1 head-mounted display), applied in a cricket scenario to examine varying components of decision and action for cricket batters. This provided an experience with a high level of presence, allowing a real-time egocentric viewpoint to be presented to participants. Cyclical user-testing was carried out, utilising both qualitative and quantitative approaches, with users reporting a positive experience in use of the system.
Abstract:
In recent years, a new paradigm for communication called cooperative communications has been proposed, for which initial information-theoretic studies have shown the potential for improvements in capacity over traditional multi-hop wireless networks. Extensive research has been done to mitigate the impact of fading in wireless networks, mostly focused on Multiple-Input Multiple-Output (MIMO) systems. Recently, cooperative relaying techniques have been investigated to increase the performance of wireless systems by using the diversity created by different single-antenna devices, aiming to reach the same level of performance as MIMO systems with low-cost devices. Cooperative communication is a promising method to achieve high spectrum efficiency and improve transmission capacity for wireless networks: the general idea is to pool the resources of distributed nodes to improve the overall performance of the network. In cooperative networks the nodes cooperate to help each other; a cooperative node offering help acts like a middle man or proxy and can convey messages from source to destination, exploiting the broadcast nature of the wireless medium to form virtual antenna arrays out of independent single-antenna network nodes for transmission. This research aims to contribute to the field of cooperative wireless networks, focusing on relay-based Medium Access Control (MAC) protocols. Specifically, I provide a framework for cooperative relaying called RelaySpot, which comprises opportunistic relay selection, cooperative relay scheduling and relay switching. RelaySpot-based solutions are expected to minimize signaling exchange, remove estimation of channel conditions, and improve the utilization of spatial diversity, minimizing outage and increasing reliability.
Abstract:
Thesis (Ph. D.)--University of Washington, 1987
Abstract:
Doctoral thesis, General Surgery (Medicine), Universidade de Lisboa, Faculdade de Medicina, 2014
Abstract:
Supervised Teaching Practice report, Master's in Informatics Teaching, Universidade de Lisboa, 2014
Abstract:
The article This is Live this is now (2011) contextualises the performance Under the Covers (2009) by Zoo Indigo. The journal article is written by Ildiko Rippel on behalf of the Zoo Indigo theatre company (Rosie Garton and Ildiko Rippel) and is published in the online Body, Space and Technology Journal: