Abstract:
This paper presents the results of shaking table tests on model reinforced soil retaining walls in the laboratory. The influence of backfill relative density on the seismic response was studied through a series of laboratory model tests on retaining walls. Construction of model retaining walls in a laminar box mounted on a shaking table, instrumentation, and results from the shaking table tests are described in detail. Three types of walls (wrap-faced and rigid-faced reinforced soil walls, and unreinforced rigid-faced walls) constructed to different densities were tested for a relatively small excitation. Wrap-faced walls were further tested at higher base excitation, at different frequencies and relative densities. It is observed from these tests that the effect of backfill density on the seismic performance of reinforced retaining walls is pronounced only at very low relative density and at the higher base excitation. The walls constructed with higher backfill relative density showed smaller face deformations and greater acceleration amplification compared to the walls constructed with lower densities when tested at higher base excitation. The response of wrap- and rigid-faced retaining walls is not much affected by the backfill relative density when tested at smaller base excitation. The effects of facing rigidity were evaluated to a limited extent. Displacements in wrap-faced walls are many times higher than in rigid-faced walls. The results obtained from this study are helpful in understanding the relative performance of reinforced soil retaining walls when subjected to smaller and higher base excitation for the range of relative density employed in the testing program. (C) 2007 Elsevier Ltd. All rights reserved.
Abstract:
This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled cycle of simulation-to-data analysis. Typically, a Geant4 computer experiment is used to understand test beam measurements. Thus, another aspect of this thesis is a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, a full CMS detector description, and event reconstruction. Using the ROOT data analysis framework we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
Abstract:
Cosmological inflation is the dominant paradigm in explaining the origin of structure in the universe. According to the inflationary scenario, there has been a period of nearly exponential expansion in the very early universe, long before nucleosynthesis. Inflation is commonly considered a consequence of some scalar field or fields whose energy density starts to dominate the universe. The inflationary expansion converts the quantum fluctuations of the fields into classical perturbations on superhorizon scales, and these primordial perturbations are the seeds of the structure in the universe. Moreover, inflation also naturally explains the high degree of homogeneity and spatial flatness of the early universe. The real challenge of inflationary cosmology lies in trying to establish a connection between the fields driving inflation and theories of particle physics. In this thesis we concentrate on inflationary models at scales well below the Planck scale. The low scale allows us to seek candidates for the inflationary matter within extensions of the Standard Model but typically also implies fine-tuning problems. We discuss a low scale model where inflation is driven by a flat direction of the Minimally Supersymmetric Standard Model. The relation between the potential along the flat direction and the underlying supergravity model is studied. The low inflationary scale requires an extremely flat potential, but we find that in this particular model the associated fine-tuning problems can be solved in a rather natural fashion in a class of supergravity models. For this class of models, the flatness is a consequence of the structure of the supergravity model and is insensitive to the vacuum expectation values of the fields that break supersymmetry.
Another low scale model considered in the thesis is the curvaton scenario, where the primordial perturbations originate from quantum fluctuations of a curvaton field, which is different from the fields driving inflation. The curvaton gives a negligible contribution to the total energy density during inflation, but its perturbations become significant in the post-inflationary epoch. The separation between the fields driving inflation and the fields giving rise to primordial perturbations opens up new possibilities to lower the inflationary scale without introducing fine-tuning problems. The curvaton model typically gives rise to a relatively large level of non-gaussian features in the statistics of primordial perturbations. We find that the level of non-gaussian effects is heavily dependent on the form of the curvaton potential. Future observations that provide more accurate information on the non-gaussian statistics can therefore place constraining bounds on the curvaton interactions.
Abstract:
This paper proposes the use of empirical modeling techniques for building microarchitecture-sensitive models for compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics, and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines, and radial basis function networks. We use the generated models to a) predict program performance at arbitrary compiler/microarchitecture configurations, b) quantify the significance of complex interactions between optimizations and the microarchitecture, and c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average error in prediction) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
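The workflow described above (measure a small sample of configurations, fit a model, then search the full configuration space using only the model) can be sketched as follows. This is a minimal illustration with synthetic data and plain least-squares regression; it is not the paper's setup, which also evaluates regression splines and radial basis function networks.

```python
import numpy as np

# Illustration only: learn a linear model relating binary optimization-flag
# settings to measured runtime, then search for the model-optimal setting.
rng = np.random.default_rng(0)

n_flags = 5
n_samples = 40  # a small number of "measured" configurations

# Random flag configurations (1 = flag on, 0 = off).
X = rng.integers(0, 2, size=(n_samples, n_flags)).astype(float)

# Synthetic "true" runtimes: a baseline plus per-flag effects and noise.
true_effects = np.array([-0.8, 0.3, -0.5, 0.1, -0.2])
y = 10.0 + X @ true_effects + rng.normal(0, 0.05, n_samples)

# Fit by ordinary least squares (intercept via a column of ones).
A = np.hstack([np.ones((n_samples, 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(flags):
    """Predicted runtime at an arbitrary flag configuration."""
    return coef[0] + np.array(flags) @ coef[1:]

# Exhaustively search all 2^5 settings using the model alone (no new runs).
best = min((tuple(map(int, f"{i:05b}")) for i in range(32)), key=predict)
print("model-optimal flags:", best)
```

The point of the sketch is that the expensive step (measuring performance) happens only 40 times, while the search over all 32 configurations is done against the cheap fitted model.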
Abstract:
Within Australia, there have been many attempts to pass voluntary euthanasia (VE) or physician-assisted suicide (PAS) legislation. From 16 June 1993 until the date of writing, 51 Bills have been introduced into Australian parliaments dealing with legalising VE or PAS. Despite these numerous attempts, the only successful Bill was the Rights of the Terminally Ill Act 1995 (NT), which was enacted in the Northern Territory, but a short time later overturned by the controversial Euthanasia Laws Act 1997 (Cth). Yet, in stark contrast to the significant political opposition, for decades Australian public opinion has overwhelmingly supported law reform legalising VE or PAS. While there is ongoing debate in Australia, both through public discourse and scholarly publications, about the merits and dangers of reform in this field, there has been remarkably little analysis of the numerous legislative attempts to reform the law, and the context in which those reform attempts occurred. The aim of this article is to better understand the reform landscape in Australia over the past two decades. The information provided in this article will better equip Australians, both politicians and the general public, to have a more nuanced understanding of the political context in which the euthanasia debate has been and is occurring. It will also facilitate a more informed debate in the future.
Abstract:
This article presents and evaluates Quantum Inspired models of Target Activation using Cued-Target Recall Memory Modelling over multiple sources of Free Association data. Two components were evaluated: whether Quantum Inspired models of Target Activation would provide a better framework than their classical psychological counterparts, and how robust these models are across the different sources of Free Association data. In previous work, a formal model of cued-target recall did not exist, and as such Target Activation could not be assessed directly. Further, the data source used was suspected of suffering from temporal and geographical bias. As a consequence, Target Activation was measured against cued-target recall data as an approximation of performance. Since then, a formal model of cued-target recall (PIER3) has been developed [10], with alternative sources of data also becoming available. This allowed us to directly model target activation in cued-target recall with human cued-target recall pairs and to use multiple sources of Free Association data. Featural characteristics known to be important to Target Activation were measured for each of the data sources to identify any major differences that may explain variations in performance for each of the models. Each of the activation models was used in the PIER3 memory model for each of the data sources and benchmarked against cued-target recall pairs provided by the University of South Florida (USF). Two methods were used to evaluate performance. The first involved measuring the divergence between the sets of results using the Kullback-Leibler (KL) divergence, with the second utilizing a previous statistical analysis of the errors [9]. Of the three sources of data, two were sourced from human subjects, being the USF Free Association Norms and the University of Leuven (UL) Free Association Networks.
The third was sourced from a new method put forward by Galea and Bruza (2015), in which pseudo Free Association Networks (Corpus Based Association Networks - CANs) are built using co-occurrence statistics on a large text corpus. It was found that the Quantum Inspired models of Target Activation not only outperformed the classical psychological model but were also more robust across a variety of data sources.
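The first evaluation method above, scoring a model by the Kullback-Leibler divergence between its predicted recall distribution and the observed one, can be sketched as follows. The distributions here are made up for illustration; only the divergence computation reflects the method named in the abstract.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions given as equal-length lists.

    A small epsilon guards against log(0) when a probability is zero.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

observed = [0.50, 0.30, 0.15, 0.05]   # hypothetical human recall proportions
model_a  = [0.48, 0.32, 0.14, 0.06]   # a model close to the data
model_b  = [0.25, 0.25, 0.25, 0.25]   # a uniform baseline

# The model with the smaller divergence from the observed data wins.
print(kl_divergence(observed, model_a) < kl_divergence(observed, model_b))
```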
Abstract:
This paper presents the results of shaking table tests on geotextile-reinforced wrap-faced soil-retaining walls. Construction of model retaining walls in a laminar box mounted on a shaking table, instrumentation, and results from the shaking table tests are discussed in detail. The base motion parameters, surcharge pressure and number of reinforcing layers are varied in different model tests. It is observed from these tests that the response of the wrap-faced soil-retaining walls is significantly affected by the base acceleration levels, frequency of shaking, quantity of reinforcement and magnitude of surcharge pressure on the crest. The effects of these different parameters on acceleration response at different elevations of the retaining wall, horizontal soil pressures and face deformations are also presented. The results obtained from this study are helpful in understanding the relative performance of reinforced soil-retaining walls under different test conditions used in the experiments.
Abstract:
This thesis studies quantile residuals and uses different methodologies to develop test statistics that are applicable in evaluating linear and nonlinear time series models based on continuous distributions. Models based on mixtures of distributions are of special interest because it turns out that for those models traditional residuals, often referred to as Pearson's residuals, are not appropriate. As such models have become more and more popular in practice, especially with financial time series data, there is a need for reliable diagnostic tools that can be used to evaluate them. The aim of the thesis is to show how such diagnostic tools can be obtained and used in model evaluation. The quantile residuals considered here are defined in such a way that, when the model is correctly specified and its parameters are consistently estimated, they are approximately independent with a standard normal distribution. All the tests derived in the thesis are pure significance type tests and are theoretically sound in that they properly take the uncertainty caused by parameter estimation into account. In Chapter 2, a general framework based on the likelihood function and smooth functions of univariate quantile residuals is derived that can be used to obtain misspecification tests for various purposes. Three easy-to-use tests aimed at detecting non-normality, autocorrelation, and conditional heteroscedasticity in quantile residuals are formulated. It also turns out that these tests can be interpreted as Lagrange Multiplier or score tests, so that they are asymptotically optimal against local alternatives. Chapter 3 extends the concept of quantile residuals to multivariate models. The framework of Chapter 2 is generalized, and tests aimed at detecting non-normality, serial correlation, and conditional heteroscedasticity in multivariate quantile residuals are derived based on it.
Score test interpretations are obtained for the serial correlation and conditional heteroscedasticity tests, and in a rather restricted special case for the normality test. In Chapter 4, the tests are constructed using the empirical distribution function of quantile residuals. The so-called Khmaladze's martingale transformation is applied in order to eliminate the uncertainty caused by parameter estimation. Various test statistics are considered so that critical bounds for histogram-type plots as well as Quantile-Quantile and Probability-Probability type plots of quantile residuals are obtained. Chapters 2, 3, and 4 contain simulations and empirical examples which illustrate the finite sample size and power properties of the derived tests and also how the tests and related graphical tools based on residuals are applied in practice.
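The defining property stated above, that quantile residuals of a correctly specified model are approximately iid standard normal, comes from the probability integral transform: r = Phi^{-1}(F(y | theta)), where F is the fitted model's conditional CDF and Phi^{-1} the standard normal quantile function. A minimal sketch with a hypothetical two-component normal mixture as the fitted model (the mixture weights and parameters are made up, and the normal quantile is inverted by simple bisection):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def normal_quantile(u, lo=-10.0, hi=10.0):
    """Invert the standard normal CDF by bisection (simple but adequate)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if normal_cdf(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def quantile_residual(y, model_cdf):
    """r = Phi^{-1}(F(y)); approximately N(0,1) if the model is correct."""
    return normal_quantile(model_cdf(y))

# Hypothetical fitted model: a two-component normal mixture, the case where
# Pearson's residuals are noted above to be inappropriate.
def mixture_cdf(y):
    return 0.7 * normal_cdf(y, 0.0, 1.0) + 0.3 * normal_cdf(y, 3.0, 2.0)

r = quantile_residual(1.5, mixture_cdf)
```

Applying `quantile_residual` to each observation of a series (with `model_cdf` conditioned on the past at each step) yields the residual series on which the normality, autocorrelation, and heteroscedasticity tests operate.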
Abstract:
This paper describes a concept for a collision avoidance system for ships, which is based on model predictive control. A finite set of alternative control behaviors is generated by varying two parameters: offsets to the guidance course angle commanded to the autopilot and changes to the propulsion command ranging from nominal speed to full reverse. Using simulated predictions of the trajectories of the obstacles and ship, compliance with the Convention on the International Regulations for Preventing Collisions at Sea and collision hazards associated with each of the alternative control behaviors are evaluated on a finite prediction horizon, and the optimal control behavior is selected. Robustness to sensing error, predicted obstacle behavior, and environmental conditions can be ensured by evaluating multiple scenarios for each control behavior. The method is conceptually and computationally simple and yet quite versatile, as it can account for the dynamics of the ship, the dynamics of the steering and propulsion system, forces due to wind and ocean current, and any number of obstacles. Simulations show that the method is effective and can manage complex scenarios with multiple dynamic obstacles and uncertainty associated with sensors and predictions.
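The enumerate-simulate-select loop described above can be sketched as follows. All numbers, the cost weights, and the constant-velocity obstacle prediction are illustrative assumptions; the sketch scores only collision hazard and deviation from the nominal plan, whereas the paper additionally scores COLREGs compliance and uses full ship and actuator dynamics.

```python
import math

# Finite set of candidate behaviors: course offsets and propulsion factors.
COURSE_OFFSETS = [-90, -45, -15, 0, 15, 45, 90]   # degrees
SPEED_FACTORS = [1.0, 0.5, 0.0]                   # nominal, slow, stop

def simulate(own_pos, own_heading, speed, offset, factor, obstacles,
             horizon=60, dt=1.0):
    """Return the minimum distance to any obstacle over the horizon,
    with straight-line own-ship motion and constant-velocity obstacles."""
    x, y = own_pos
    hdg = math.radians(own_heading + offset)
    vx, vy = speed * factor * math.cos(hdg), speed * factor * math.sin(hdg)
    min_dist = float("inf")
    for step in range(1, int(horizon / dt) + 1):
        t = step * dt
        sx, sy = x + vx * t, y + vy * t
        for (px, py), (pvx, pvy) in obstacles:
            d = math.hypot(sx - (px + pvx * t), sy - (py + pvy * t))
            min_dist = min(min_dist, d)
    return min_dist

def choose_behavior(own_pos, own_heading, speed, obstacles, safe_dist=50.0):
    """Evaluate every candidate behavior and return the lowest-cost one."""
    best, best_cost = None, float("inf")
    for off in COURSE_OFFSETS:
        for f in SPEED_FACTORS:
            d = simulate(own_pos, own_heading, speed, off, f, obstacles)
            # Penalize collision hazard first, then deviation from plan.
            cost = max(0.0, safe_dist - d) * 100 + abs(off) + (1 - f) * 30
            if cost < best_cost:
                best, best_cost = (off, f), cost
    return best

# Head-on encounter: an obstacle dead ahead, closing at a combined 10 m/s.
behavior = choose_behavior((0, 0), 0.0, 5.0, [((300, 0), (-5.0, 0.0))])
print(behavior)
```

Evaluating multiple scenarios per behavior, as the paper suggests for robustness, amounts to calling `simulate` with perturbed obstacle states and aggregating the costs before the comparison.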
Abstract:
This paper develops a model for military conflicts where the defending forces have to determine an optimal partitioning of available resources to counter attacks from an adversary on two different fronts. The Lanchester attrition model is used to develop the dynamical equations governing the variation in force strength. Three different allocation schemes - Time-Zero-Allocation (TZA), Allocate-Assess-Reallocate (AAR), and Continuous Constant Allocation (CCA) - are considered, and the optimal solutions are obtained in each case. Numerical examples are given to support the analytical results.
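The Lanchester attrition dynamics referenced above, in their square-law form dx/dt = -b*y, dy/dt = -a*x, can be sketched with a simple Euler integration. The force sizes, attrition coefficients, and the two allocations compared are illustrative, not taken from the paper; the example merely shows why the time-zero partitioning matters.

```python
def lanchester(x0, y0, a, b, dt=0.01, t_max=10.0):
    """Integrate dx/dt = -b*y, dy/dt = -a*x (Lanchester square law)
    until one side is annihilated or time runs out; return survivors."""
    x, y, t = float(x0), float(y0), 0.0
    while x > 0 and y > 0 and t < t_max:
        x, y = x - b * y * dt, y - a * x * dt  # simultaneous Euler step
        t += dt
    return max(x, 0.0), max(y, 0.0)

# A defender splits 100 units across two fronts facing attacks of 60 and 40.
# Splitting evenly loses front 1; weighting front 1 wins it outright.
for allocation in [(50, 50), (70, 30)]:
    d1, _ = lanchester(allocation[0], 60, a=1.0, b=1.0)
    d2, _ = lanchester(allocation[1], 40, a=1.0, b=1.0)
    print(allocation, "surviving defenders:", round(d1 + d2, 1))
```

Under the square law the survivor count of the winning side is roughly sqrt(|x0^2 - y0^2|), which is why concentrating force on one front can beat an even split even though the same total is committed.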
Abstract:
Field placements provide social work students with the opportunity to integrate their classroom learning with the knowledge and skills used in various human service programs. The supervision structure that has most commonly been used is the intensive one-to-one, clinical teaching model. However, this model is being challenged by significant changes in educational and industry sectors, which have led to an increased use of alternative fieldwork structures and supervision arrangements, including task supervision, group supervision, external supervision, and shared supervisory arrangements. This study focuses on identifying models of supervision and student satisfaction with their learning experiences and the supervision received on placement. The study analysed responses to a questionnaire administered to 263 undergraduate social work students enrolled in three different campuses in Australia after they had completed their first or final field placement. The study identified that just over half of the placements used the traditional one student to one social work supervisor model. A number of “emerging” models were also identified, where two or more social workers were involved in the professional supervision of the student. High levels of dissatisfaction were reported by those students who received external social work supervision. Results suggest that students are more satisfied across all aspects of the placement where there is a strong on-site social work presence.