995 results for supersymmetric affine Toda models
Abstract:
This paper proposes the use of empirical modeling techniques for building microarchitecture sensitive models for compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines and radial basis function networks. We use the generated models to a) predict program performance at arbitrary compiler/microarchitecture configurations, b) quantify the significance of complex interactions between optimizations and the microarchitecture, and c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average error in prediction) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
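As a minimal sketch of the empirical-modeling idea described above (the flag weights, the interaction term, and the toy runtime function are invented for illustration and are not from the paper), the simplest of the three techniques, plain linear regression, could be used like this: fit the model on a handful of "measured" configurations, then search the flag space with model predictions instead of further simulations:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n_flags = 4

# Hypothetical per-flag runtime effects (negative = flag speeds things up).
true_w = np.array([-2.0, 1.5, -0.5, 0.8])

def measure(flags):
    # Toy "simulator": base runtime plus per-flag effects and one
    # flag-flag interaction that a purely linear model cannot capture.
    return 100.0 + flags @ true_w + 0.5 * flags[0] * flags[1]

# "Measure" performance at a small number of sampled configurations.
X = rng.integers(0, 2, size=(12, n_flags)).astype(float)
y = np.array([measure(f) for f in X])

# Ordinary least squares fit (intercept via an appended ones column).
A = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Model-based exhaustive search over all 2^n_flags flag settings:
# cheap, because it queries the fitted model rather than the simulator.
best = min(product([0.0, 1.0], repeat=n_flags),
           key=lambda f: float(np.append(f, 1.0) @ w))
print(best)
```

The paper's spline and RBF models would additionally capture the flag interaction that this linear sketch deliberately leaves unmodeled.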
Abstract:
Within Australia, there have been many attempts to pass voluntary euthanasia (VE) or physician-assisted suicide (PAS) legislation. From 16 June 1993 until the date of writing, 51 Bills have been introduced into Australian parliaments dealing with legalising VE or PAS. Despite these numerous attempts, the only successful Bill was the Rights of the Terminally Ill Act 1995 (NT), which was enacted in the Northern Territory but overturned a short time later by the controversial Euthanasia Laws Act 1997 (Cth). Yet, in stark contrast to the significant political opposition, for decades Australian public opinion has overwhelmingly supported law reform legalising VE or PAS. While there is ongoing debate in Australia, both through public discourse and scholarly publications, about the merits and dangers of reform in this field, there has been remarkably little analysis of the numerous legislative attempts to reform the law, and the context in which those reform attempts occurred. The aim of this article is to better understand the reform landscape in Australia over the past two decades. The information provided in this article will better equip Australians, both politicians and the general public, to have a more nuanced understanding of the political context in which the euthanasia debate has been and is occurring. It will also facilitate a more informed debate in the future.
Abstract:
This article presents and evaluates Quantum Inspired models of Target Activation using Cued-Target Recall Memory Modelling over multiple sources of Free Association data. Two components were evaluated: whether Quantum Inspired models of Target Activation would provide a better framework than their classical psychological counterparts, and how robust these models are across the different sources of Free Association data. In previous work, a formal model of cued-target recall did not exist and as such Target Activation could not be assessed directly. Further, the data source used was suspected of suffering from temporal and geographical bias. As a consequence, Target Activation was measured against cued-target recall data as an approximation of performance. Since then, a formal model of cued-target recall (PIER3) has been developed [10], with alternative sources of data also becoming available. This allowed us to directly model target activation in cued-target recall with human cued-target recall pairs and to use multiple sources of Free Association data. Featural characteristics known to be important to Target Activation were measured for each of the data sources to identify any major differences that may explain variations in performance for each of the models. Each activation model was used in the PIER3 memory model with each data source and benchmarked against cued-target recall pairs provided by the University of South Florida (USF). Two methods were used to evaluate performance. The first involved measuring the divergence between the sets of results using the Kullback-Leibler (KL) divergence, with the second utilizing a previous statistical analysis of the errors [9]. Of the three sources of data, two were sourced from human subjects: the USF Free Association Norms and the University of Leuven (UL) Free Association Networks.
The third was sourced from a new method put forward by Galea and Bruza, 2015, in which pseudo Free Association Networks (Corpus Based Association Networks - CANs) are built using co-occurrence statistics on large text corpora. It was found that the Quantum Inspired models of Target Activation not only outperformed the classical psychological model but were also more robust across a variety of data sources.
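The KL-divergence comparison described above can be sketched as follows (the distributions are made up for illustration; the abstract's actual recall data is not reproduced here):

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) for discrete distributions; q must be nonzero
    wherever p is."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p=0 contribute nothing to the sum
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

observed = [0.5, 0.3, 0.2]     # hypothetical human recall proportions
model_a  = [0.48, 0.32, 0.20]  # e.g. a quantum-inspired activation model
model_b  = [0.30, 0.40, 0.30]  # e.g. a classical baseline

# The model with the smaller divergence matches the data more closely.
print(kl_divergence(observed, model_a) < kl_divergence(observed, model_b))
```

A lower divergence against the USF cued-target recall pairs would, on this kind of measure, favour the better-fitting activation model.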
Abstract:
Discoveries at the LHC will soon set the physics agenda for future colliders. This report of a CERN Theory Institute includes the summaries of Working Groups that reviewed the physics goals and prospects of LHC running with 10 to 300 fb^-1 of integrated luminosity, of the proposed sLHC luminosity upgrade, of the ILC, of CLIC, of the LHeC and of a muon collider. The four Working Groups considered possible scenarios for the first 10 fb^-1 of data at the LHC in which (i) a state with properties that are compatible with a Higgs boson is discovered, (ii) no such state is discovered either because the Higgs properties are such that it is difficult to detect or because no Higgs boson exists, (iii) a missing-energy signal beyond the Standard Model is discovered as in some supersymmetric models, and (iv) some other exotic signature of new physics is discovered. In the contexts of these scenarios, the Working Groups reviewed the capabilities of the future colliders to study in more detail whatever new physics may be discovered by the LHC. Their reports provide the particle physics community with some tools for reviewing the scientific priorities for future colliders after the LHC produces its first harvest of new physics from multi-TeV collisions.
Abstract:
This paper presents the results of shaking table tests on geotextile-reinforced wrap-faced soil-retaining walls. Construction of model retaining walls in a laminar box mounted on a shaking table, instrumentation, and results from the shaking table tests are discussed in detail. The base motion parameters, surcharge pressure and number of reinforcing layers are varied in different model tests. It is observed from these tests that the response of the wrap-faced soil-retaining walls is significantly affected by the base acceleration levels, frequency of shaking, quantity of reinforcement and magnitude of surcharge pressure on the crest. The effects of these different parameters on acceleration response at different elevations of the retaining wall, horizontal soil pressures and face deformations are also presented. The results obtained from this study are helpful in understanding the relative performance of reinforced soil-retaining walls under different test conditions used in the experiments.
Abstract:
This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account. In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it.
Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests and, in a rather restricted special case, for the normality test. In Chapter 4 the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
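The basic quantile-residual construction underlying the tests above can be sketched as follows (a toy illustration, not the thesis's code: each observation is pushed through the model's conditional CDF and then through the inverse standard normal CDF, so that a correctly specified model yields approximately i.i.d. N(0,1) residuals):

```python
from statistics import NormalDist

std_normal = NormalDist()

def quantile_residuals(ys, cdf):
    """cdf(y) is the model's conditional CDF evaluated at observation y;
    the composition with the inverse normal CDF gives the residuals."""
    return [std_normal.inv_cdf(cdf(y)) for y in ys]

# Sanity check: if the model IS the true distribution N(3, 2), the
# quantile residuals of its own 10%/50%/90% quantiles are exactly the
# corresponding standard normal quantiles.
model = NormalDist(mu=3.0, sigma=2.0)
ys = [model.inv_cdf(u) for u in (0.1, 0.5, 0.9)]
res = quantile_residuals(ys, model.cdf)
print(res)
```

Misspecification tests like those in Chapters 2-4 then check these residuals for non-normality, autocorrelation, and conditional heteroscedasticity.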
Abstract:
This paper develops a model for military conflicts where the defending forces have to determine an optimal partitioning of available resources to counter attacks from an adversary in two different fronts. The Lanchester attrition model is used to develop the dynamical equations governing the variation in force strength. Three different allocation schemes - Time-Zero-Allocation (TZA), Allocate-Assess-Reallocate (AAR), and Continuous Constant Allocation (CCA) - are considered and the optimal solutions are obtained in each case. Numerical examples are given to support the analytical results.
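For a single front, the Lanchester attrition dynamics mentioned above can be sketched by Euler integration (an illustrative toy, not the paper's code; the attrition coefficients and force sizes are made up):

```python
# Aimed-fire Lanchester equations: dx/dt = -b*y, dy/dt = -a*x,
# where a and b are the opposing forces' attrition coefficients.
def lanchester(x0, y0, a, b, dt=0.01, t_end=20.0):
    x, y = x0, y0
    t = 0.0
    while t < t_end and x > 0 and y > 0:
        # simultaneous (forward Euler) update using old values of x, y
        x, y = x - b * y * dt, y - a * x * dt
        t += dt
    return x, y

# Lanchester's square law: with equal coefficients, the larger force
# wins with roughly sqrt(x0^2 - y0^2) survivors (here about 60).
x, y = lanchester(x0=100.0, y0=80.0, a=0.1, b=0.1)
print(x, y)
```

An allocation scheme such as TZA would split the defender's x0 between two such fronts at time zero and compare outcomes.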
Abstract:
Field placements provide social work students with the opportunity to integrate their classroom learning with the knowledge and skills used in various human service programs. The supervision structure that has most commonly been used is the intensive one-to-one, clinical teaching model. However, this model is being challenged by significant changes in educational and industry sectors, which have led to an increased use of alternative fieldwork structures and supervision arrangements, including task supervision, group supervision, external supervision, and shared supervisory arrangements. This study focuses on identifying models of supervision and student satisfaction with their learning experiences and the supervision received on placement. The study analysed responses to a questionnaire administered to 263 undergraduate social work students enrolled in three different campuses in Australia after they had completed their first or final field placement. The study identified that just over half of the placements used the traditional one student to one social work supervisor model. A number of “emerging” models were also identified, where two or more social workers were involved in the professional supervision of the student. High levels of dissatisfaction were reported by those students who received external social work supervision. Results suggest that students are more satisfied across all aspects of the placement where there is a strong on-site social work presence.
Abstract:
This thesis studies binary time series models and their applications in empirical macroeconomics and finance. In addition to previously suggested models, new dynamic extensions are proposed to the static probit model commonly used in the previous literature. In particular, we are interested in probit models with an autoregressive model structure. In Chapter 2, the main objective is to compare the predictive performance of the static and dynamic probit models in forecasting the U.S. and German business cycle recession periods. Financial variables, such as interest rates and stock market returns, are used as predictive variables. The empirical results suggest that the recession periods are predictable and that dynamic probit models, especially models with the autoregressive structure, outperform the static model. Chapter 3 proposes a Lagrange Multiplier (LM) test for the usefulness of the autoregressive structure of the probit model. The finite sample properties of the LM test are considered with simulation experiments. Results indicate that the two alternative LM test statistics have reasonable size and power in large samples. In small samples, a parametric bootstrap method is suggested to obtain approximately correct size. In Chapter 4, the predictive power of dynamic probit models in predicting the direction of stock market returns is examined. The novel idea is to use the recession forecast (see Chapter 2) as a predictor of the stock return sign. The evidence suggests that the signs of the U.S. excess stock returns over the risk-free return are predictable both in and out of sample. The new "error correction" probit model yields the best forecasts and it also outperforms other predictive models, such as ARMAX models, in terms of statistical and economic goodness-of-fit measures. Chapter 5 generalizes the analysis of univariate models considered in Chapters 2-4 to the case of a bivariate model.
A new bivariate autoregressive probit model is applied to predict the current state of the U.S. business cycle and growth rate cycle periods. Evidence of predictability of both cycle indicators is obtained and the bivariate model is found to outperform the univariate models in terms of predictive power.
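The autoregressive structure discussed above can be sketched as a recursion on the latent probit index (a hedged illustration: the coefficient values and the predictor series are invented, and this is only the filtering step, not the estimation):

```python
from statistics import NormalDist

Phi = NormalDist().cdf  # standard normal CDF

def autoregressive_probit(x, omega=-0.5, alpha=0.6, beta=-1.0):
    """Return P(y_t = 1) for each t: the latent index carries its own
    lag (the autoregressive structure), and the event probability is
    the normal CDF of the index."""
    probs, index = [], 0.0
    for x_t in x:
        index = omega + alpha * index + beta * x_t  # latent AR index
        probs.append(Phi(index))
    return probs

# A falling term spread pushes the model-implied recession
# probability up over time in this toy parameterization.
spread = [1.5, 1.0, 0.2, -0.5, -0.8]
probs = autoregressive_probit(spread)
print(probs)
```

In estimation, the coefficients would be chosen to maximize the probit likelihood over observed recession indicators.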
Abstract:
We present the results of a search for Higgs bosons predicted in two-Higgs-doublet models, in the case where the Higgs bosons decay to tau lepton pairs, using 1.8 inverse fb of integrated luminosity of proton-antiproton collisions recorded by the CDF II experiment at the Fermilab Tevatron. Studying the observed mass distribution in events where one or both tau leptons decay leptonically, no evidence for a Higgs boson signal is observed. The result is used to infer exclusion limits in the two-dimensional parameter space of tan beta versus m(A).
Abstract:
Inflation is a period of accelerated expansion in the very early universe, which has the appealing aspect that it can create primordial perturbations via quantum fluctuations. These primordial perturbations have been observed in the cosmic microwave background, and they also function as the seeds of all large-scale structure in the universe. Curvaton models are simple modifications of the standard inflationary paradigm, where inflation is driven by the energy density of the inflaton, but another field, the curvaton, is responsible for producing the primordial perturbations. The curvaton decays after inflation has ended, whereupon the isocurvature perturbations of the curvaton are converted into adiabatic perturbations. Since the curvaton must decay, it must have some interactions. Additionally, realistic curvaton models typically have some self-interactions. In this work we consider self-interacting curvaton models, where the self-interaction is a monomial in the potential, suppressed by the Planck scale, and thus very weak. Nevertheless, since the self-interaction makes the equations of motion non-linear, it can modify the behaviour of the model drastically. The most intriguing aspect of this behaviour is that the final properties of the perturbations become highly dependent on the initial values. Departures from a Gaussian distribution are important observables of the primordial perturbations. Due to the non-linearity of the self-interacting curvaton model and its sensitivity to initial conditions, it can produce significant non-Gaussianity in the primordial perturbations. In this work we investigate the non-Gaussianity produced by the self-interacting curvaton, and demonstrate that the non-Gaussianity parameters do not obey the analytically derived approximate relations often cited in the literature. Furthermore, we also consider a self-interacting curvaton with a mass at the TeV scale.
Motivated by realistic particle physics models such as the Minimal Supersymmetric Standard Model, we demonstrate that a curvaton in this mass range can be responsible for the observed perturbations if it decays late enough.
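As a hedged illustration of the class of models described above (the exponent n and the normalisation are generic assumptions, not values taken from the thesis), a quadratic curvaton with a Planck-suppressed monomial self-interaction can be written as:

```latex
% Generic quadratic curvaton potential with a Planck-suppressed
% monomial self-interaction: n > 4 an even integer, \lambda a
% dimensionless coupling, M_P the Planck mass.
V(\sigma) = \tfrac{1}{2} m^2 \sigma^2
          + \lambda \, \frac{\sigma^{n}}{M_P^{\,n-4}}
```

Because V'(sigma) then contains a sigma^(n-1) term, the curvaton's equation of motion sigma'' + 3H sigma' + V'(sigma) = 0 is non-linear, which is the source of the strong sensitivity to initial field values noted above.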