4 results for Metrics of management

in Glasgow Theses Service


Relevance:

90.00%

Publisher:

Abstract:

Conventional Si complementary metal-oxide-semiconductor (CMOS) scaling is fast approaching its limits. Extending the logic device roadmap for future enhancements in transistor performance requires non-Si materials and new device architectures. III-V materials, owing to their superior electron transport properties, are well poised to replace Si as the channel material beyond the 10nm technology node, mitigating the performance loss that Si transistors suffer as supply voltages are reduced to minimise power dissipation in logic circuits. However, several key challenges need to be addressed before III-Vs can be employed in CMOS: a high-quality dielectric/III-V gate stack, a low-resistance source/drain (S/D) technology, heterointegration onto a Si platform, and a viable III-V p-channel metal-oxide-semiconductor field-effect transistor (MOSFET).

This thesis specifically addressed the development and demonstration of planar III-V p-MOSFETs to complement the n-MOSFET, thereby enabling an all-III-V CMOS technology to be realised. The work explored the InGaAs and InGaSb material systems as the channel, in conjunction with Al2O3/metal gate stacks, for p-MOSFET development based on the buried-channel flatband device architecture. The body of work undertaken comprised material development, process module development, and integration into a robust fabrication flow for the demonstration of p-channel devices.

The parameter space in the design of the device layer structure, based around the III-V channel/barrier material options of InxGa1-xAs (x≥0.53)/In0.52Al0.48As and InxGa1-xSb (x≥0.1)/AlSb, was systematically examined to improve hole channel transport. A mobility of 433 cm^2/Vs, the highest room-temperature hole mobility of any InGaAs quantum-well channel reported to date, was obtained for the In0.85Ga0.15As (2.1% strain) structure.

S/D ohmic contacts were developed based on thermally annealed Au/Zn/Au metallisation and validated using transmission line model (TLM) test structures. The effects of metallisation thickness, diffusion barriers and de-oxidation conditions were examined; contacts to InGaSb-channel structures were found to be sensitive to de-oxidation conditions.
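To illustrate how TLM test structures validate such contacts, the following minimal sketch extracts contact and sheet resistance from a linear fit of total resistance versus pad spacing, the standard TLM analysis; the dimensions and resistance values are hypothetical, not measurements from this work.

```python
import numpy as np

# Transmission line model (TLM): total resistance between two pads is
#   R_total(d) = 2*Rc + (R_sheet / W) * d
# where d is the pad gap, W the contact width, Rc the contact resistance.

W = 100e-6                                           # contact width in metres (hypothetical)
gaps = np.array([2, 4, 8, 16, 32]) * 1e-6            # pad spacings (m)
r_total = np.array([11.2, 14.1, 20.3, 32.6, 56.9])   # measured resistances (ohm), hypothetical

slope, intercept = np.polyfit(gaps, r_total, 1)      # linear least-squares fit
r_contact = intercept / 2.0                          # ohm: the y-intercept is 2*Rc
r_sheet = slope * W                                  # ohm/square: the slope is R_sheet / W

print(f"Rc ≈ {r_contact:.2f} Ω, R_sheet ≈ {r_sheet:.1f} Ω/sq")
```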
A fabrication process, based on a lithographically aligned double ohmic patterning approach, was realised for deep-submicron scaling of the gate-to-source/drain gap (Lside) to minimise the access resistance, thereby mitigating the effects of parasitic S/D series resistance on transistor performance. The developed process yielded gaps as small as 20nm.

For high-k integration on GaSb, ex-situ ammonium sulphide ((NH4)2S) treatments, in the range 1%-22%, for 10min at 295K were systematically explored for improving the electrical properties of the Al2O3/GaSb interface. Electrical and physical characterisation indicated the 1% treatment to be most effective, with interface trap densities in the range of 4-10×10^12 cm^-2 eV^-1 in the lower half of the bandgap. An extended study, comprising additional immersion times at each sulphide concentration, was further undertaken to determine the surface roughness and the etching nature of the treatments on GaSb.

A number of p-MOSFETs, based on the III-V channels with the most promising hole transport and integrating the developed process modules, were successfully demonstrated in this work. Although the non-inverted InGaAs-channel devices showed good current modulation and switch-off characteristics, several aspects of performance were non-ideal: depletion-mode operation, modest drive current (Id,sat=1.14mA/mm), double-peaked transconductance (gm=1.06mS/mm), high subthreshold swing (SS=301mV/dec) and high on-resistance (Ron=845kΩ·μm). Despite demonstrating substantial improvement in the on-state metrics of Id,sat (11×), gm (5.5×) and Ron (5.6×), inverted devices did not switch off. Scaling the gate-to-source/drain gap (Lside) from 1μm down to 70nm improved Id,sat (72.4mA/mm) by a factor of 3.6 and gm (25.8mS/mm) by a factor of 4.1 in inverted InGaAs-channel devices. Well-controlled current modulation and good saturation behaviour were observed for InGaSb-channel devices. In the on-state, In0.3Ga0.7Sb-channel (Id,sat=49.4mA/mm, gm=12.3mS/mm, Ron=31.7kΩ·μm) and In0.4Ga0.6Sb-channel (Id,sat=38mA/mm, gm=11.9mS/mm, Ron=73.5kΩ·μm) devices outperformed the InGaAs-channel devices; however, these devices could not be switched off. These findings indicate that III-V p-MOSFETs based on InGaSb, rather than InGaAs, channels are better suited as the p-channel option for post-Si CMOS.

Relevance:

30.00%

Publisher:

Abstract:

Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability.

In Chapter 2 we first consider a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals, for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis period, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of ten, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries.

In Chapter 3 we look closely at the role of time variation in parameters, and of other sources of uncertainty, in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond one month. At shorter horizons, however, our methods fail to forecast better than the RW. We identify uncertainty in the estimation of coefficients, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability.

Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the one-month horizon, and outperforms alternative methods, including Bayesian, bagging, and standard forecast combinations.
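To make the bumping idea concrete, the sketch below selects, among candidate predictors fitted on bootstrap replications, the model that scores best on the original sample; it is a minimal illustration on synthetic data, not the thesis's actual specification or predictor set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: y is driven by the first of three candidate fundamentals.
n = 200
X = rng.normal(size=(n, 3))                   # candidate predictors
y = 0.5 * X[:, 0] + rng.normal(size=n)

def fit_ols(x, y):
    """Slope and intercept of a univariate OLS fit."""
    A = np.column_stack([x, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

best_err, best_model = np.inf, None
for _ in range(100):                          # bootstrap replications
    idx = rng.integers(0, n, size=n)          # resample with replacement
    for j in range(X.shape[1]):               # one candidate model per predictor
        coef = fit_ols(X[idx, j], y[idx])
        # Bumping: score the bootstrap-fitted model on the ORIGINAL sample.
        pred = coef[0] * X[:, j] + coef[1]
        err = np.mean((y - pred) ** 2)
        if err < best_err:
            best_err, best_model = err, (j, coef)

print("selected predictor:", best_model[0], "mse:", round(best_err, 3))
```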
Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices receive high posterior support. The chapter also introduces the random walk Metropolis-Hastings technique as a new tool for estimating MIDAS regressions.
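The following sketch shows a generic random walk Metropolis-Hastings sampler applied to the slope of a simple regression under a Gaussian likelihood and a flat prior; the data and step size are assumptions for illustration, not the MIDAS specification estimated in the chapter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: y = 2*x + noise; we sample the posterior of the slope beta.
n = 150
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

def log_post(beta):
    """Log posterior: Gaussian likelihood (sigma = 1) with a flat prior on beta."""
    resid = y - beta * x
    return -0.5 * np.sum(resid ** 2)

draws, beta, step = [], 0.0, 0.2
for _ in range(5000):
    prop = beta + step * rng.normal()          # random walk proposal
    # Accept with probability min(1, posterior ratio).
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    draws.append(beta)

print("posterior mean of beta ≈", round(float(np.mean(draws[1000:])), 3))
```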

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates how web search evaluation can be improved using historical interaction data. Modern search engines combine offline and online evaluation approaches in a sequence of steps that a tested change needs to pass through to be accepted as an improvement and subsequently deployed. We refer to such a sequence of steps as an evaluation pipeline, and consider it to contain three sequential steps: an offline evaluation step, an online evaluation scheduling step, and an online evaluation step. We show that historical user interaction data can aid in improving the accuracy or efficiency of each of these steps, and that as a result the overall efficiency of the entire evaluation pipeline is increased.

Firstly, we investigate how user interaction data can be used to build accurate offline evaluation methods for query auto-completion mechanisms. We propose a family of offline evaluation metrics for query auto-completion that represent the effort the user has to spend in order to submit their query. The parameters of our proposed metrics are trained against a set of user interactions recorded in the search engine's query logs. From our experimental study, we observe that our proposed metrics are significantly more correlated with an online user satisfaction indicator than the metrics proposed in the existing literature. Hence, fewer changes will pass the offline evaluation step only to be rejected after the online evaluation step, which increases the efficiency of the entire pipeline.

Secondly, we state the problem of the optimised scheduling of online experiments. We tackle this problem with a greedy scheduler that prioritises the evaluation queue according to the predicted likelihood of success of each experiment. This predictor is trained on a set of past online experiments and uses a diverse set of features to represent an experiment. Our study demonstrates that deploying such a scheduler on the second step of the evaluation pipeline yields a higher number of successful experiments per unit of time; consequently, the efficiency of the evaluation pipeline is increased.

Next, to improve the efficiency of the online evaluation step, we propose the Generalised Team Draft interleaving framework. Generalised Team Draft treats both the interleaving policy (how often a particular combination of results is shown) and click scoring (how important each click is) as parameters in a data-driven optimisation of interleaving sensitivity. Further, Generalised Team Draft is applicable beyond domains with a list-based representation of results, e.g. in domains with a grid-based representation, such as image search. Our study, using datasets of interleaving experiments performed in both document and image search domains, demonstrates that Generalised Team Draft achieves the highest sensitivity. A higher sensitivity means that interleaving experiments can be deployed for a shorter period of time or on a smaller sample of users. Importantly, Generalised Team Draft optimises the interleaving parameters with respect to historical interaction data recorded in interleaving experiments.
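For intuition, below is a minimal sketch of the classic Team Draft interleaving step that Generalised Team Draft builds on: two rankings are merged by alternating coin-flip drafts, and clicks are credited to the team that contributed the clicked result. This is the standard algorithm from the interleaving literature, not the generalised, data-optimised variant proposed in the thesis.

```python
import random

def team_draft_interleave(ranking_a, ranking_b, rng=random):
    """Team Draft: merge two rankings, remembering which 'team' drafted each doc."""
    interleaved, team_a, team_b = [], set(), set()
    while len(interleaved) < len(set(ranking_a) | set(ranking_b)):
        # The side with fewer drafts picks next; ties are broken by a coin flip.
        a_turn = len(team_a) < len(team_b) or (
            len(team_a) == len(team_b) and rng.random() < 0.5)
        source, team = (ranking_a, team_a) if a_turn else (ranking_b, team_b)
        # Draft the highest-ranked document not yet shown.
        doc = next((d for d in source if d not in interleaved), None)
        if doc is None:  # this side is exhausted; the other side drafts instead
            source, team = (ranking_b, team_b) if a_turn else (ranking_a, team_a)
            doc = next(d for d in source if d not in interleaved)
        interleaved.append(doc)
        team.add(doc)
    return interleaved, team_a, team_b

def credit(clicked, team_a, team_b):
    """Score an impression: +1 if side A won the clicks, -1 if B, 0 for a tie."""
    a = sum(1 for d in clicked if d in team_a)
    b = sum(1 for d in clicked if d in team_b)
    return (a > b) - (a < b)

merged, a_team, b_team = team_draft_interleave(["d1", "d2", "d3"], ["d2", "d4"])
print(merged, credit(["d2"], a_team, b_team))
```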
Finally, we propose applying sequential testing methods to reduce the mean deployment time of interleaving experiments. We adapt two sequential tests for interleaving experimentation and demonstrate that they achieve a significant decrease in experiment duration. The highest efficiency is achieved by the sequential tests that adjust their stopping thresholds using historical interaction data recorded in diagnostic experiments. A further experimental study demonstrates that cumulative gains in online experimentation efficiency can be achieved by combining the interleaving sensitivity optimisation approaches, including Generalised Team Draft, with the sequential testing approaches.

Overall, the central contributions of this thesis are approaches that improve the accuracy or efficiency of the individual steps of the evaluation pipeline: offline evaluation frameworks for query auto-completion, an approach for the optimised scheduling of online experiments, a general framework for efficient online interleaving evaluation, and a sequential testing approach for online search evaluation. The experiments in this thesis are based on massive real-life datasets obtained from Yandex, a leading commercial search engine, and demonstrate the potential of the proposed approaches to improve the efficiency of the evaluation pipeline.
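As an illustration of the kind of sequential stopping rule involved, the sketch below implements Wald's sequential probability ratio test on per-impression interleaving outcomes; this is the textbook SPRT under assumed Bernoulli outcomes, not the specific tests adapted in the thesis.

```python
import math

def sprt(outcomes, p0=0.5, p1=0.6, alpha=0.05, beta=0.05):
    """Wald's SPRT on Bernoulli outcomes (1 = system A wins an impression).

    Tests H0: p = p0 against H1: p = p1, stopping as soon as the cumulative
    log-likelihood ratio crosses a decision boundary.
    """
    upper = math.log((1 - beta) / alpha)     # accept H1 above this boundary
    lower = math.log(beta / (1 - alpha))     # accept H0 below this boundary
    llr = 0.0
    for n, x in enumerate(outcomes, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept H1", n            # stop early: A is better
        if llr <= lower:
            return "accept H0", n            # stop early: no improvement
    return "continue", len(outcomes)         # boundaries never crossed

print(sprt([1, 1, 0, 1, 1, 1, 0, 1, 1, 1] * 20))
```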

Relevance:

30.00%

Publisher:

Abstract:

Maintaining accessibility to, and understanding of, digital information over time is a complex challenge that often requires contributions and interventions from a variety of individuals and organisations. The processes of preservation planning and evaluation are fundamentally implicit and share similar complexity: both demand comprehensive knowledge and understanding of every aspect of to-be-preserved content and of the contexts within which preservation is undertaken. Consequently, means are required for the identification, documentation and association of those properties of data, representation and management mechanisms that in combination lend value, facilitate interaction and influence the preservation process. These properties may be almost limitless in their diversity, but are integral to the establishment of classes of risk exposure and to the planning and deployment of appropriate preservation strategies.

We explore several research objectives within the course of this thesis. Our main objective is the conception of an ontology for risk management of digital collections. Incorporated within this are our aims to survey the contexts within which preservation has been undertaken successfully, to develop an appropriate methodology for risk management, to evaluate existing preservation evaluation approaches and metrics, to structure best-practice knowledge and, lastly, to demonstrate a range of tools that utilise our findings. We describe a mixed methodology that uses interviews and surveys, extensive content analysis, practical case studies, and iterative software and ontology development, building on a robust foundation: the Digital Repository Audit Method Based on Risk Assessment.

We summarise the extent of the challenge facing the digital preservation community (and, by extension, users and creators of digital materials from many disciplines and operational contexts) and present the case for a comprehensive and extensible knowledge base of best practice. These challenges are manifested in the scale of data growth, in increasing complexity, and in the increasing onus on communities with no formal training to offer assurances of data management and sustainability. Collectively they demand an intuitive and adaptable means of evaluating digital preservation efforts; the need for individuals and organisations to validate the legitimacy of their own efforts is particularly prioritised.

We introduce our approach, based on risk management. Risk is an expression of both the likelihood of a negative outcome and the impact of such an occurrence. We describe how risk management may be considered synonymous with preservation activity: a persistent effort to negate the dangers posed to information availability, usability and sustainability. Risks can be characterised according to associated goals, activities, responsibilities and policies, in terms of both their manifestation and their mitigation; they can be deconstructed into atomic units, and responsibility for their resolution delegated appropriately. We go on to describe how the manifestation of risks typically spans an entire organisational environment, and how taking risk as the focus of our analysis safeguards against the omissions that may occur when pursuing functional, departmental or role-based assessment. We discuss the importance of relating risk factors, through the risks themselves or through associated system elements.
Doing so will yield the preservation best-practice knowledge base that is conspicuously lacking within the international digital preservation community. As research outcomes we present an encapsulation of preservation practice (and explicitly defined best practice) as a series of case studies, in turn distilled into atomic, related information elements. We conduct our analyses through the formal evaluation of memory institutions in the UK, the US and continental Europe. Furthermore, we showcase a series of applications that use the fruits of this research as their intellectual foundation, and document our results in a range of technical reports and conference and journal articles.

We present evidence of preservation approaches and infrastructures from a series of case studies conducted in a range of international preservation environments. We then aggregate this evidence into a linked data structure entitled PORRO, an ontology relating preservation repository, object and risk characteristics, intended to support preservation decision making and evaluation. The methodology leading to this ontology is outlined, and lessons are drawn by revisiting legacy studies and exposing the resource and its associated applications to evaluation by the digital preservation community.
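As a toy illustration of the risk characterisation described above (likelihood combined with impact, deconstructed into atomic units with delegated responsibility), the sketch below models a risk register entry; the fields, values and scoring convention are illustrative assumptions, not the actual structure of the PORRO ontology.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """An atomic risk register entry, in the spirit of the model described above."""
    name: str
    likelihood: float   # probability of the negative outcome, 0..1
    impact: int         # severity of the outcome, e.g. on a 1..5 scale
    owner: str          # who is responsible for mitigation

    def exposure(self) -> float:
        # A common scoring convention: exposure = likelihood x impact.
        return self.likelihood * self.impact

risks = [
    Risk("Obsolete rendering format", likelihood=0.4, impact=4, owner="curator"),
    Risk("Storage media failure", likelihood=0.1, impact=5, owner="sysadmin"),
]
for r in sorted(risks, key=Risk.exposure, reverse=True):
    print(f"{r.name}: exposure {r.exposure():.1f}, owner {r.owner}")
```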