950 results for Scale space
Abstract:
A major weakness of the loading models proposed in recent years for pedestrians walking on flexible structures is the number of uncorroborated assumptions made in their development. This applies to the spatio-temporal characteristics of pedestrian loading and to the nature of multi-object interactions. To alleviate this problem, a framework for the determination of localised pedestrian forces on full-scale structures is presented using a wireless attitude and heading reference system (AHRS). An AHRS comprises a triad of tri-axial accelerometers, gyroscopes and magnetometers managed by a dedicated data processing unit, allowing motion in three-dimensional space to be reconstructed. A pedestrian loading model based on a single-point inertial measurement from an AHRS is derived and shown to perform well against benchmark data collected on an instrumented treadmill. Unlike other models, the current model does not take any predefined form, nor does it require any extrapolation as to the timing and amplitude of pedestrian loading. To correctly assess the influence of a moving pedestrian on the behaviour of a structure, an algorithm for tracking the point of application of the pedestrian force is developed, based on data from a single AHRS attached to a foot. A set of controlled walking tests with a single pedestrian was conducted on a real footbridge for validation purposes. A remarkably good match between the measured and simulated bridge response is found, confirming the applicability of the proposed framework.
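The core of a single-point inertial loading model is Newton's second law applied at the body's centre of mass: the vertical ground reaction force follows directly from the measured acceleration. A minimal sketch of that idea, assuming the AHRS acceleration has already been resolved into the vertical direction (the mass, pacing rate, and signal below are illustrative, not figures from the paper):

```python
import numpy as np

def pedestrian_force(mass_kg, a_vertical, g=9.81):
    # Newton's second law at the body centre of mass: the vertical
    # ground reaction force is m * (g + measured vertical acceleration).
    return mass_kg * (g + np.asarray(a_vertical))

# Illustrative 2 s record at a ~2 Hz pacing rate (synthetic signal)
t = np.linspace(0.0, 2.0, 201)
a_v = 3.0 * np.sin(2 * np.pi * 2.0 * t)   # m/s^2
F = pedestrian_force(75.0, a_v)           # 75 kg walker (assumed)
```

Averaged over whole pacing cycles, the force recovers the walker's static weight, while the oscillating part carries the dynamic loading that excites the structure.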
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Variability management is one of the major challenges in software product line adoption, since variability needs to be managed efficiently at the various levels of the software product line development process (e.g., requirements analysis, design, implementation). One of the main challenges within variability management is the handling and effective visualisation of large-scale (industry-size) models, which in many projects can reach the order of thousands of variability points, along with the dependency relationships that exist among them. These have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped identify the major limitations of existing tools. Based on the findings, a novel approach to managing variability was created that employs two main principles for supporting scalability. First, the separation-of-concerns principle was employed by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees were used to visualise models (compared to the Euclidean-space trees traditionally used). The result is an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on established usability assessment best practices and standards. The framework was then used with several case studies to benchmark the performance of this work against other existing tools.
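The scalability argument behind hyperbolic-tree layouts can be made concrete: in the Poincaré disk model, a node at hyperbolic distance d from the root is drawn at Euclidean radius tanh(d/2) < 1, so each additional level of a large variability model consumes exponentially less screen space. A minimal sketch (the per-level step size is an arbitrary illustrative choice, not a value from this work):

```python
import math

def poincare_radius(depth, step=0.5):
    # Map a node at tree depth `depth` (hyperbolic distance
    # depth * step from the root) to its Euclidean radius in the
    # Poincare disk: r = tanh(d / 2). The radius approaches 1 but
    # never reaches it, so arbitrarily deep models stay on screen.
    return math.tanh(depth * step / 2)

radii = [poincare_radius(d) for d in range(6)]
```

Successive levels are squeezed into ever-thinner annuli near the disk boundary, which is why thousands of variability points remain visible at once while the focused subtree near the centre stays readable.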
Abstract:
Studies on hacking have typically focused on motivational aspects and general personality traits of the individuals who engage in hacking; little systematic research has been conducted on predispositions that may be associated not only with the choice to pursue a hacking career but also with performance in either naïve or expert populations. Here, we test the hypotheses that two traits that are typically enhanced in autism spectrum disorders—attention to detail and systemizing—may be positively related to both the choice of pursuing a career in information security and skilled performance in a prototypical hacking task (i.e., crypto-analysis or code-breaking). A group of naïve participants and a group of ethical hackers completed the Autism Spectrum Quotient, including an attention to detail scale, and the Systemizing Quotient (Baron-Cohen et al., 2001, 2003). They were also tested with behavioral tasks involving code-breaking and a control task involving security X-ray image interpretation. Hackers reported significantly higher systemizing and attention to detail than non-hackers. We found a positive relation between self-reported systemizing (but not attention to detail) and code-breaking skills in both hackers and non-hackers, whereas attention to detail (but not systemizing) was related to performance in the X-ray screening task in both groups, as previously reported with naïve participants (Rusconi et al., 2015). We discuss the theoretical and translational implications of our findings.
Abstract:
This thesis presents detailed observational studies of the extended distributions of gas, galaxies, and dark matter around hyperluminous quasars (HLQSOs) at high redshift. Taken together, these works aim to coherently describe the relationships between these massive, accreting black holes and their environments: the nature of the regions that give rise to such massive black holes, the effect of HLQSO radiation on their surrounding galaxies and gas, and the ability of both galaxies and black holes to shed new light on the formation and evolution of the other.
Chapter 2 focuses on the continuum-color-selected galaxies drawn from the Keck Baryonic Structure Survey (KBSS). The KBSS is a uniquely deep spectroscopic survey of star-forming galaxies in the same volumes of space as 15 HLQSOs at 2.5 <
Chapter 3 describes the first results from a new survey (KBSS-Lyα) conducted for this thesis. The KBSS-Lyα survey uses narrowband imaging to identify Lyα-emitters (LAEs) in the ~Mpc regions around eight of the KBSS HLQSOs. Many of these LAEs show the effect of reprocessed HLQSO radiation in their emission through the process known as Lyα fluorescence. In this chapter, these fluorescent LAEs are used to generate a coarse map of the average HLQSO ionizing emission on Mpc scales, thereby placing the first direct constraints on the lifetime and angular distribution of activity for a population of these uniquely luminous black holes.
Chapter 4 contains a more detailed description of the KBSS-Lyα survey itself and the detailed properties of the star-forming and fluorescent objects selected therein. Using imaging and spectroscopic data covering rest-frame UV and optical wavelengths, including spectra from the new near-infrared spectrometer MOSFIRE, we characterize this population of nascent galaxies in terms of their kinematics, enrichment, gas properties, and luminosity distribution while comparing and contrasting them with previously-studied populations of continuum-selected galaxies and LAEs far from the effects of HLQSO emission.
At the conclusion of this thesis, I briefly present future directions for the continuation of this research. In Appendix A, I provide background information on the instrumentation used in this thesis, including my own contributions to MOSFIRE.
Abstract:
This study presents a computational parametric analysis of DME steam reforming in a large-scale Circulating Fluidized Bed (CFB) reactor. The Computational Fluid Dynamics (CFD) model used, which is based on an Eulerian-Eulerian dispersed flow formulation, was developed and validated in Part I of this study [1]. The effects of the reactor inlet configuration, gas residence time, inlet temperature and steam-to-DME ratio on the overall reactor performance and products have all been investigated. The results show that the use of a double-sided solid feeding system gives a remarkable improvement in flow uniformity, but has a limited effect on the reactions and products. The temperature was found to play a dominant role in increasing the DME conversion and the hydrogen yield. Based on the parametric analysis, it is recommended to run the CFB reactor at around 300 °C inlet temperature, a steam-to-DME molar ratio of 5.5, a gas residence time of 4 s and a space velocity of 37,104 ml g_cat⁻¹ h⁻¹. At these conditions, the DME conversion and the hydrogen molar concentration in the product gas were both found to be around 80%.
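The recommended operating point couples two standard reactor quantities: mean gas residence time (τ = V/Q) and space velocity per unit catalyst mass. A minimal sketch of the two defining relations, where the reactor volume, feed rate, and catalyst mass are hypothetical figures chosen only so that the numbers land near the recommended values, not data from the paper:

```python
def residence_time_s(reactor_volume_ml, gas_flow_ml_per_s):
    # Mean gas residence time: tau = V / Q
    return reactor_volume_ml / gas_flow_ml_per_s

def space_velocity(gas_flow_ml_per_h, catalyst_mass_g):
    # Space velocity per gram of catalyst, in ml gcat^-1 h^-1
    return gas_flow_ml_per_h / catalyst_mass_g

# Hypothetical figures (not from the paper): a 2 L riser fed at 0.5 L/s
tau = residence_time_s(2000.0, 500.0)       # 4.0 s
sv = space_velocity(500.0 * 3600.0, 48.5)   # about 37,100 ml gcat^-1 h^-1
```

For a fixed feed rate, residence time is set by the reactor volume while space velocity is set by the catalyst inventory, so the two recommendations constrain different design choices.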
Abstract:
We measured the distribution in absolute magnitude - circular velocity space for a well-defined sample of 199 rotating galaxies from the Calar Alto Legacy Integral Field Area Survey (CALIFA) using their stellar kinematics. Our aim in this analysis is to avoid subjective selection criteria and to take volume and large-scale structure factors into account. Using stellar velocity fields instead of gas emission line kinematics allows the inclusion of rapidly rotating early-type galaxies. Our initial sample contains 277 galaxies with available stellar velocity fields and growth-curve r-band photometry. After rejecting 51 velocity fields that could not be modelled because of a low number of bins, foreground contamination, or significant interaction, we performed Markov chain Monte Carlo modelling of the velocity fields, from which we obtained the rotation curve and kinematic parameters together with realistic uncertainties. We applied an extinction correction and calculated the circular velocity v_circ, accounting for the pressure support of a given galaxy. The resulting galaxy distribution on the M_r - v_circ plane was then modelled as a mixture of two distinct populations, allowing robust and reproducible rejection of outliers, a significant fraction of which are slow rotators. The selection effects are understood well enough that we were able to correct for the incompleteness of the sample. The 199 galaxies were weighted by volume and large-scale structure factors, which enabled us to fit a volume-corrected Tully-Fisher relation (TFR). More importantly, we also provide the volume-corrected distribution of galaxies in the M_r - v_circ plane, which can be compared with cosmological simulations. The joint distribution of the luminosity and circular velocity space densities, representative over the range -20 > M_r > -22 mag, can place more stringent constraints on galaxy formation and evolution scenarios than linear TFR fit parameters or the luminosity function alone.
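A volume-corrected TFR fit amounts to a weighted linear regression of absolute magnitude on log circular velocity, with each galaxy carrying its volume and large-scale-structure weight. A minimal sketch on synthetic data, where the slope, zero-point, scatter, and weights are illustrative choices, not the CALIFA values:

```python
import numpy as np

def fit_tfr(M_r, v_circ, weights):
    # Weighted least squares for M_r = a * log10(v_circ) + b.
    # np.polyfit minimises sum((w_i * residual_i)^2), so passing
    # sqrt(weights) weights each squared residual by `weights`.
    x = np.log10(v_circ)
    a, b = np.polyfit(x, M_r, deg=1, w=np.sqrt(weights))
    return a, b

# Synthetic sample obeying an assumed relation M_r = -7.5 log10(v) - 4
rng = np.random.default_rng(0)
v = rng.uniform(80.0, 300.0, 200)                       # km/s
M = -7.5 * np.log10(v) - 4.0 + rng.normal(0, 0.1, 200)  # + scatter
w = rng.uniform(0.5, 2.0, 200)                          # volume/LSS weights
a, b = fit_tfr(M, v, w)
```

The same weights can be binned over the M_r - v_circ plane to build the volume-corrected two-dimensional distribution the abstract emphasises, of which the linear fit is only a summary.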
Abstract:
In this paper, a space fractional diffusion equation (SFDE) with non-homogeneous boundary conditions on a bounded domain is considered. A new matrix transfer technique (MTT) for solving the SFDE is proposed. The method is based on a matrix representation of the fractional-in-space operator, and the novelty of this approach is that a standard discretisation of the operator leads to a system of linear ODEs with the matrix raised to the same fractional power. Analytic solutions of the SFDE are derived. Finally, some numerical results are given to demonstrate that the MTT is a computationally efficient and accurate method for solving the SFDE.
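A minimal sketch of the matrix transfer idea in one dimension, assuming homogeneous Dirichlet boundary conditions (the non-homogeneous case treated in the paper requires an additional boundary correction): discretise -d²/dx² as the standard tridiagonal matrix A, raise A to the power α/2 via its eigendecomposition, and the semi-discrete system u' = -A^(α/2) u is then solved exactly in time, mode by mode.

```python
import numpy as np

def mtt_solve(u0, alpha, t, L=1.0):
    # Matrix transfer technique on [0, L] with homogeneous Dirichlet
    # BCs: the fractional-in-space operator is represented by the
    # standard Laplacian matrix raised to the power alpha/2.
    n = len(u0)
    h = L / (n + 1)
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    lam, V = np.linalg.eigh(A)           # A is symmetric positive definite
    frac = lam ** (alpha / 2.0)          # eigenvalues of A^(alpha/2)
    c = V.T @ u0                         # expand u0 in eigenmodes
    return V @ (np.exp(-frac * t) * c)   # each mode decays independently

n = 63
x = np.linspace(0.0, 1.0, n + 2)[1:-1]   # interior grid points
u0 = np.sin(np.pi * x)                   # an exact discrete eigenmode
u = mtt_solve(u0, alpha=1.6, t=0.01)
```

Because sin(πx) is an exact eigenvector of the tridiagonal Laplacian, the solution is a uniform decay of the initial profile; setting α = 2 recovers the classical diffusion equation.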