906 results for "Experimental performance metrics"


Relevance: 80.00%

Abstract:

The concept of measurement-enabled production is based on integrating metrology systems into production processes and has generated significant interest in industry due to its potential to increase process capability and accuracy, which in turn reduces production times and eliminates defective parts. One of the most promising methods of integrating metrology into production is the use of external metrology systems to compensate machine tool errors in real time. The development and experimental performance evaluation of a low-cost, laser-tracker-assisted prototype three-axis machine tool are described in this paper. Real-time correction of the machine tool's absolute volumetric error has been achieved. As a result, significant increases in static repeatability and accuracy have been demonstrated, allowing the low-cost three-axis machine tool to reliably reach static positioning accuracies below 35 μm throughout its working volume without any prior calibration or error mapping. This is a significant technical development that demonstrates the feasibility of the proposed methods and can have wide-scale industrial applications by enabling low-cost machine tools of modest structural rigidity, deployed flexibly as end-effectors of robotic automation, to achieve positional accuracies that were previously the preserve of large, high-precision machine tools.
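
The compensation method described here is, at heart, a feedback loop: an external laser tracker measures the actual tool position, and the deviation from the commanded position is fed back to the controller as an axis offset. The sketch below illustrates only that loop shape; the function names, gain, and deadband are assumptions made for this example, not parameters from the paper.

```python
# Hypothetical sketch of one cycle of laser-tracker-assisted error
# compensation. The controller/tracker interfaces are assumed; only the
# feedback arithmetic is shown.
import numpy as np

DEADBAND_MM = 0.005   # ignore corrections below 5 um to avoid dithering on noise
GAIN = 0.8            # proportional gain < 1 for stable convergence

def compensation_step(commanded_mm: np.ndarray, measured_mm: np.ndarray) -> np.ndarray:
    """Return an XYZ offset nudging the tool toward the commanded point."""
    error = commanded_mm - measured_mm          # volumetric error seen by the tracker
    error[np.abs(error) < DEADBAND_MM] = 0.0    # suppress noise-level corrections
    return GAIN * error

# Example: the tracker reports the tool 20 um low in Z at a commanded point.
commanded = np.array([100.0, 250.0, 50.0])
measured = np.array([100.002, 249.998, 49.980])
print(compensation_step(commanded, measured))   # -> [0. 0. 0.016]
```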

Relevance: 80.00%

Abstract:

Purpose - Despite many Maintenance, Repair and Overhaul (MRO) organisations attributing their positive business performance to the adoption of Lean initiatives, there is a paucity of literature that directly validates this assertion. The purpose of this paper is therefore to establish empirically, via an industry-wide survey, the extent of Lean adoption and to verify its suitability for mitigating prevalent MRO challenges. Design/methodology/approach - The empirical study is facilitated by an industry-wide survey collecting data from several firms across the MRO spectrum. The analysed responses from industry leaders, professionals and executives, synthesised with existing literature, were used to ascertain the extent of Lean adoption within the operational framework of the industry. Findings - The empirical study validated the suitability of Lean in the MRO context. However, it was also observed that the focus of its application was skewed towards production-orientated functions more than service-orientated functions. Nonetheless, this paper presents evidence of the positive influence of Lean in the MRO context. Research limitations/implications - The empirical study was carried out within a framework of key operational characteristics. Although this approach is sufficient for assessing the industry's Lean status, further assessment could also be made in the context of relevant performance metrics, which was not included in this paper. Practical implications - By exploring the industry's Lean status in the context of operational characteristics, this study gives MRO practitioners greater awareness of some of the critical factors required for successful holistic Lean realisation. Social implications - The state of the art of Lean within the aviation MRO context established through this research also contributes to the wider product-centric service environment by providing a platform for strategy development that supports Lean success in this environment. Originality/value - Apart from validating the suitability of Lean in MRO contexts, by establishing the extent of Lean adoption within the operational framework, this paper provides clearer insight into how successful Lean implementation can be achieved via a holistic implementation strategy balanced between the product-centric and service-centric aspects of the industry.

Relevance: 80.00%

Abstract:

This study focuses on empirical investigations and draws implications by utilizing three different methodologies to test various aspects of trader behavior. The first methodology utilizes Prospect Theory to determine trader behavior during periods of extreme wealth contraction. Secondly, a threshold model is formulated to examine the sentiment variable, and thirdly the contagion effect and trader behavior are studied. The connection between consumers' sense of financial well-being, or sentiment, and stock market performance has been studied at length. However, without data on actual versus experimental performance, implications based on this relationship are meaningless. The empirical agenda included examining a proprietary file of daily trader activities over a five-year period. Overall, during periods of extreme wealth-altering conditions, traders "satisfice" rather than choose the "best" alternative. A trader's degree of loss aversion depends on his or her prior investment performance. A model that explains the behavior of traders during periods of turmoil is developed; Prospect Theory and the data file informed its design. Additional research included testing a model that permitted the data to signal the crisis through a threshold model. The third empirical study investigated the existence of contagion caused by declining global wealth effects, using evidence from the mining industry in Canada. Contagion, where a financial crisis begins locally and subsequently spreads elsewhere, has been studied in terms of correlations among similar regions. The results provide support for Prospect Theory in two of the three empirical studies. The dissertation emphasizes the need for specifying precise, testable models of investors' expectations by providing tools to identify paradoxical behavior patterns. True advances in this field must include empirical research utilizing reliable data sources to mitigate data-mining problems and to allow researchers to distinguish between expectations-based and risk-based explanations of behavior. Through this type of research, it may be possible to systematically exploit "irrational" market behavior.
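
The loss-aversion behavior invoked here is usually formalized with the Kahneman-Tversky value function, shown below in its standard textbook form with the 1992 median parameter estimates; this is the general Prospect Theory formulation, not an equation taken from the dissertation.

```latex
% Prospect Theory value function: concave over gains, convex and steeper
% over losses; \lambda > 1 encodes loss aversion.
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \\
-\lambda\,(-x)^{\beta}, & x < 0
\end{cases}
\qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25
```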

Relevance: 80.00%

Abstract:

Lake Okeechobee, Florida, located in the middle of the larger Kissimmee River-Lake Okeechobee-Everglades ecosystem in South Florida, serves a variety of ecosystem and water-management functions, including fish and wildlife habitat, flood control, water supply, and source water for environmental restoration. As a result, the ecological status of Lake Okeechobee plays a significant role in defining the overall success of the greater Everglades ecosystem restoration initiative. One of the major ecological indicators of Lake Okeechobee's condition focuses on the near-shore and littoral-zone regions as characterized by the distribution and abundance of submerged aquatic vegetation (SAV) and giant bulrush (Scirpus californicus (C.A. Mey.) Steud.). The objective of this study is to present a stoplight restoration report card communication system, common to all 11 indicators noted in this special journal issue, as a means to convey the status of SAV and bulrush in Lake Okeechobee. The report card can be used by managers, policy makers, scientists and the public to evaluate and distill information about ecological status in South Florida. Our assessment of the areal distribution of SAV in Lake Okeechobee is based on a combination of empirical SAV monitoring and output from an SAV habitat suitability model. Bulrush status in the lake is related to a suitability index linked to adult survival and seedling establishment metrics. Overall, presentation of these performance metrics in a stoplight format enables an evaluation of how the status of two major components of Lake Okeechobee relates to the South Florida restoration program, and of how the status of the lake influences restoration efforts in South Florida.
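
A stoplight report card reduces a continuous suitability index to a three-colour status. A minimal sketch of that mapping follows; the 0-1 index scale and the 0.4/0.7 break-points are illustrative assumptions, not the thresholds used in the Lake Okeechobee assessment.

```python
def stoplight(suitability: float, red_max: float = 0.4, yellow_max: float = 0.7) -> str:
    """Map a 0-1 habitat suitability index to a stoplight status.

    Thresholds here are invented; real report cards calibrate them
    against the restoration targets for each indicator.
    """
    if not 0.0 <= suitability <= 1.0:
        raise ValueError("suitability index must lie in [0, 1]")
    if suitability <= red_max:
        return "red"      # substantial deviation from the restoration target
    if suitability <= yellow_max:
        return "yellow"   # intermediate condition
    return "green"        # meets or exceeds the target condition

print(stoplight(0.82))  # -> 'green'
```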

Relevance: 80.00%

Abstract:

This dissertation presents the development and use of radiofrequency pulses simultaneously modulated in frequency, amplitude and phase (Strongly Modulated Pulses, SMP) to create initial states and execute unitary operations that serve as basic building blocks for quantum information processing with Nuclear Magnetic Resonance (NMR). The experimental implementations were carried out in a 3-qubit system consisting of Cesium-133 nuclear spins (nuclear spin 7/2) in a liquid-crystal sample in the nematic phase. The SMPs were constructed theoretically using a program developed specifically for this purpose, based on the Nelder-Mead Simplex numerical optimization method. Through this program, the SMPs were optimized to execute the desired logic operations with durations considerably shorter than those obtained with the usual NMR procedure, i.e., sequences of pulses and free evolutions. This has the advantage of reducing the decoherence effects arising from the relaxation of the system. The theoretical concepts involved in the creation of SMPs are presented, and the main difficulties (experimental and theoretical) that can arise from the use of these procedures are discussed. As application examples, the pseudo-pure states used as initial states for logic operations in NMR were produced, as well as logic operations that were subsequently applied to them. Using SMPs it was also possible to experimentally implement the Grover and Deutsch-Jozsa quantum algorithms for 3 qubits. The fidelity of the experimental implementations was determined from experimental density matrices obtained with a previously developed density-matrix tomography method.
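
The pulse-design step described above is a numerical search: segment parameters are adjusted by the Nelder-Mead simplex until the resulting propagator matches the target gate. A toy sketch of that idea follows, using a single spin-1/2 and a pi rotation as the target instead of the dissertation's Cs-133 spin-7/2 system; the segment count, durations, and convergence settings are illustrative assumptions.

```python
# Toy Nelder-Mead pulse optimization: search segment (amplitude, phase)
# pairs so the concatenated propagator matches a target unitary.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

SX = np.array([[0, 1], [1, 0]], dtype=complex) / 2    # spin-1/2 operators
SY = np.array([[0, -1j], [1j, 0]], dtype=complex) / 2
TARGET = expm(-1j * np.pi * SX)                       # pi rotation about x (a NOT gate)
N_SEG, DT = 4, 1.0                                    # assumed segment count and duration

def propagator(params: np.ndarray) -> np.ndarray:
    """Concatenate the propagators of N_SEG piecewise-constant segments."""
    amps, phases = params[:N_SEG], params[N_SEG:]
    U = np.eye(2, dtype=complex)
    for a, p in zip(amps, phases):
        H = a * (np.cos(p) * SX + np.sin(p) * SY)     # RF Hamiltonian, rotating frame
        U = expm(-1j * H * DT) @ U
    return U

def infidelity(params: np.ndarray) -> float:
    """1 - |Tr(U_target^dagger U)| / d, insensitive to a global phase."""
    return 1.0 - abs(np.trace(TARGET.conj().T @ propagator(params))) / 2.0

result = minimize(infidelity, x0=np.full(2 * N_SEG, 0.5), method="Nelder-Mead",
                  options={"maxiter": 5000, "xatol": 1e-10, "fatol": 1e-12})
print(f"final infidelity: {result.fun:.2e}")
```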

Relevance: 80.00%

Abstract:

The Internet and the Web have changed the way companies communicate with their publics, improving relations between them and providing substantial benefits to organizations. This has led small and medium enterprises (SMEs) to develop corporate sites to establish relationships with their audiences. This paper, applying the methodology of content analysis, analyzes the main factors and tools that make Websites usable and intuitive and that thereby promote better relations between SMEs and their audiences. It also develops an index to measure the effectiveness of Websites from the perspective of usability. The results indicate that the Websites analyzed have, in general, appropriate levels of usability.
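
A usability index of this kind is typically a weighted aggregate of per-criterion scores. The sketch below illustrates only that shape; the criteria names and weights are invented for the example, since the paper's actual factor list is not reproduced in the abstract.

```python
# Illustrative usability index: a weighted average of criterion scores
# in [0, 1]. Criteria and weights are hypothetical stand-ins for the
# paper's content-analysis factors.
CRITERIA_WEIGHTS = {
    "navigation_clarity": 0.25,
    "load_time": 0.20,
    "content_readability": 0.25,
    "contact_visibility": 0.15,
    "search_availability": 0.15,
}

def usability_index(scores: dict[str, float]) -> float:
    """Aggregate criterion scores (0-1) into a single 0-1 index."""
    total = sum(CRITERIA_WEIGHTS.values())
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS) / total

site = {"navigation_clarity": 0.9, "load_time": 0.7, "content_readability": 0.8,
        "contact_visibility": 1.0, "search_availability": 0.5}
print(f"usability index: {usability_index(site):.2f}")  # -> 0.79
```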

Relevance: 80.00%

Abstract:

This paper describes the evolution of a 'Design-Build-Fly' (DBF) approach to the delivery and assessment of a Stage Three Aircraft Design module. It focuses on the primary learning outcomes around the design and manufacturing functions associated with the development of a remotely controlled aircraft. The work covers a six-year period from 2011 to the present, mapping the transformation of the module from report-based assessment to a more hands-on approach resulting in a fully functioning remotely controlled aircraft. Results show that both the staff and student experience improved across key performance metrics, including student feedback, learning and competency development. Challenges remain in the methods of placing students within teams and in maintaining technical rigour in reporting as students develop vocational skills and more reflective writing styles.

Relevance: 80.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 80.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 80.00%

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on the basis of a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances on the basis of noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
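
The realized measures these models ingest are built from intraday returns; the canonical estimator is the realized variance, the sum of squared high-frequency returns over the day. A minimal sketch follows, using synthetic 5-minute returns rather than the noise-robust realized-kernel estimators the chapters actually employ.

```python
# Minimal realized-variance sketch: RV_t = sum of squared intraday
# log-returns. Synthetic data stands in for cleaned tick data.
import numpy as np

rng = np.random.default_rng(42)
n_intervals = 78                                  # 5-minute bars in a 6.5-hour session
true_daily_vol = 0.015                            # assumed 1.5% daily volatility
log_returns = rng.normal(0.0, true_daily_vol / np.sqrt(n_intervals), n_intervals)

realized_variance = np.sum(log_returns ** 2)      # RV_t for the day
realized_vol = np.sqrt(realized_variance)         # scale by sqrt(252) to annualize

print(f"realized daily volatility: {realized_vol:.4%}")
```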

Relevance: 80.00%

Abstract:

SQL Injection Attack (SQLIA) remains a technique used by computer network intruders to pilfer an organisation's confidential data. An intruder re-crafts web form inputs and query strings used in web requests with malicious intent to compromise the security of the confidential data stored in the back-end database. The database is the most valuable data source, and intruders are therefore unrelenting in evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAF) to mitigate SQLIA. There is therefore a need for an automated, scalable methodology for pre-processing SQLIA features fit for a supervised learning model. However, obtaining a ready-made, scalable dataset whose items are feature-engineered into numerical attributes for training Artificial Neural Network (ANN) and Machine Learning (ML) models is a known obstacle to applying artificial intelligence to ever-evolving novel SQLIA signatures. The proposed approach applies a numerical-attribute encoding ontology to encode features (both legitimate web requests and SQLIA) as numerical data items, extracting a scalable dataset for input to a supervised learning model and moving towards an ML SQLIA detection and prevention model. In the numerical encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA), combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. In developing a solution to SQLIA, this model allows web requests processed at the proxy and deemed to contain an injected query string to be blocked from reaching the target back-end database. This paper evaluates the performance metrics of a dataset obtained by numerically encoding the feature ontology in Microsoft Azure Machine Learning (MAML) studio using a Two-Class Support Vector Machine (TCSVM) binary classifier; this methodology then forms the subject of the empirical evaluation.
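
The core data-preparation idea is to turn each web request into a vector of numerical attributes and hand it to a binary classifier. The sketch below illustrates only that shape: the three hand-picked features, the regex token list, and the in-line examples are invented for illustration, and a linear SVM from scikit-learn stands in for the paper's ontology-driven encoding and MAML/TCSVM setup.

```python
# Toy pipeline: encode request strings as numerical attributes, then
# train a binary SVM. Features and data are invented for the sketch.
import re
import numpy as np
from sklearn.svm import LinearSVC

SQL_TOKENS = re.compile(r"(?i)\bunion\b|\bselect\b|\bor\b|\bdrop\b|\binsert\b|--|#")

def encode(request: str) -> list:
    """Map a query string to numerical attributes: length, quote count, SQL-token count."""
    return [len(request),
            request.count("'") + request.count('"'),
            len(SQL_TOKENS.findall(request))]

requests = ["id=42", "name=alice", "page=3",
            "id=1' OR '1'='1", "q=x' UNION SELECT password FROM users --",
            "id=5; DROP TABLE users"]
labels = [0, 0, 0, 1, 1, 1]        # 0 = legitimate, 1 = SQLIA

clf = LinearSVC(dual=False).fit(np.array([encode(r) for r in requests]), labels)
print(clf.predict(np.array([encode("user=bob' OR 1=1 --")])))  # -> [1]
```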

Relevance: 80.00%

Abstract:

Image (video) retrieval is the problem of retrieving images (videos) similar to a query. Images (videos) are represented in an input (feature) space, and similar images (videos) are obtained by finding nearest neighbors in that representation space. Numerous input representations, in both real-valued and binary spaces, have been proposed for conducting faster retrieval. In this thesis, we present techniques that obtain improved input representations for retrieval in both supervised and unsupervised settings for images and videos.

Supervised retrieval is the well-known problem of retrieving images of the same class as the query. In the first part, we address the practical aspects of achieving faster retrieval with binary codes as input representations for the supervised setting, where binary codes are used as addresses into hash tables. In practice, using binary codes as addresses does not guarantee fast retrieval, as similar images are not mapped to the same binary code (address). We address this problem by presenting an efficient supervised hashing (binary encoding) method that aims to explicitly map all images of the same class to, ideally, a unique binary code. We refer to the binary codes of the images as 'semantic binary codes' and to the unique code for all same-class images as the 'class binary code'. We also propose a new class-based Hamming metric that dramatically reduces retrieval times for larger databases, where the Hamming distance is computed only to the class binary codes. We further propose a deep semantic binary code model, replacing the output layer of a popular convolutional neural network (AlexNet) with the class binary codes, and show that the hashing functions learned in this way outperform the state of the art while providing fast retrieval times.

In the second part, we address supervised retrieval while taking into account the relationships between classes. For a given query image, we want to retrieve images that preserve the relative order: all same-class images first, then related-class images, before different-class images. We learn such relationship-aware binary codes by minimizing the discrepancy between the inner products of the binary codes and the similarities between the classes. We calculate class similarities using output embedding vectors, which are vector representations of classes. Our method deviates from other supervised binary encoding schemes in being the first to use output embeddings for learning hashing functions. We also introduce new performance metrics that take related-class retrieval results into account and show significant gains over the state of the art.

High-dimensional descriptors like Fisher Vectors or Vectors of Locally Aggregated Descriptors have been shown to improve the performance of many computer vision applications, including retrieval. In the third part, we discuss an unsupervised technique for compressing high-dimensional vectors into high-dimensional binary codes to reduce storage complexity. In this approach, we deviate from traditional hyperplane hashing functions and instead learn hyperspherical hashing functions. The proposed method overcomes the computational challenges of directly applying the spherical hashing algorithm, which is intractable for compressing high-dimensional vectors. A practical hierarchical model that uses divide-and-conquer techniques, via the Random Select and Adjust (RSA) procedure, to compress such high-dimensional vectors is presented. We show that our proposed high-dimensional binary codes outperform those obtained using traditional hyperplane methods at higher compression ratios.

In the last part of the thesis, we propose a retrieval-based solution to the zero-shot event classification problem, a setting in which no training videos are available for the event. To do this, we learn a generic set of concept detectors and represent both videos and query events in the concept space. We then compute the similarity between the query event and each video in the concept space, and videos similar to the query event are classified as belonging to the event. We show that concept features from other modalities significantly boost performance.
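
Retrieval with binary codes is fast because Hamming distance reduces to XOR plus a popcount, and the class-based metric compares the query against one code per class rather than one per database item. A minimal sketch follows, with invented 16-bit codes standing in for the learned semantic binary codes (the thesis's trained models are not reproduced here).

```python
# Minimal Hamming-distance retrieval over binary codes: XOR the query
# against stored codes and count differing bits. Codes are invented;
# in the thesis they are learned (e.g. by a modified AlexNet).
import numpy as np

def hamming_distances(query: int, codes: np.ndarray) -> np.ndarray:
    """Popcount of XOR between the query code and each stored code."""
    xor = np.bitwise_xor(codes, query)
    return np.array([bin(int(v)).count("1") for v in xor])

class_codes = np.array([0b1010101010101010,   # class 0
                        0b1111000011110000,   # class 1
                        0b0000111100001111])  # class 2
query_code = 0b1111000011110010              # one bit away from class 1

nearest_class = int(np.argmin(hamming_distances(query_code, class_codes)))
print(f"nearest class: {nearest_class}")      # -> 1
```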

Relevance: 80.00%

Abstract:

Part 17: Risk Analysis

Relevance: 80.00%

Abstract:

Thesis (Doctorate)—Universidade de Brasília, Faculdade de Economia, Administração e Contabilidade, Programa de Pós-Graduação em Administração, 2016.