919 results for physics.data-an
Abstract:
Senior thesis written for Oceanography 445
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-04
Abstract:
The data available during the drug discovery process is vast in amount and diverse in nature. To gain useful information from such data, an effective visualisation tool is required. To provide better visualisation facilities to domain experts (screening scientists, biologists, chemists, etc.), we developed a software tool based on recently developed principled visualisation algorithms such as Generative Topographic Mapping (GTM) and Hierarchical Generative Topographic Mapping (HGTM). The software also supports conventional visualisation techniques such as Principal Component Analysis (PCA), NeuroScale, PhiVis, and Locally Linear Embedding (LLE), and provides global and local regression facilities. It supports regression algorithms such as the Multilayer Perceptron (MLP), Radial Basis Function networks (RBF), Generalised Linear Models (GLM), Mixture of Experts (MoE), and the newly developed Guided Mixture of Experts (GME). This user manual gives an overview of the purpose of the software tool, highlights some of the issues to take care of while creating a new model, and provides information about how to install and use the tool. The manual does not require readers to be familiar with the algorithms it implements; basic computing skills are enough to operate the software.
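As an illustrative sketch of one of the conventional techniques the tool supports, a PCA projection for 2-D visualisation can be written in a few lines; the data here are random stand-ins for compound descriptors, not the tool's own implementation:

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project rows of X onto the top principal components (illustrative sketch)."""
    Xc = X - X.mean(axis=0)                  # centre the data
    # SVD of the centred data: rows of Vt are the principal directions
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T          # low-dimensional coordinates

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))               # hypothetical compound descriptors
Y = pca_project(X)
print(Y.shape)                               # (100, 2)
```

The first projected coordinate carries the largest variance by construction, which is why PCA is a common baseline before the nonlinear methods (GTM, LLE) mentioned above.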
Abstract:
The modelling of mechanical structures using finite element analysis has become an indispensable stage in the design of new components and products. Once the theoretical design has been optimised, a prototype may be constructed and tested. What can the engineer do if the measured and theoretically predicted vibration characteristics of the structure are significantly different? This thesis considers the problem of changing the parameters of the finite element model to improve the correlation between a physical structure and its mathematical model. Two new methods are introduced to perform the systematic parameter updating. The first uses the measured modal model to derive the parameter values with the minimum variance. The user must provide estimates for the variance of the theoretical parameter values and the measured data. Previous authors using similar methods have assumed that the estimated parameters and measured modal properties are statistically independent. This will generally be the case during the first iteration but will not be the case subsequently. The second method updates the parameters directly from the frequency response functions. The order of the finite element model of the structure is reduced as a function of the unknown parameters. A method related to a weighted equation error algorithm is used to update the parameters. After each iteration the weighting changes so that on convergence the output error is minimised. The suggested methods are extensively tested using simulated data. An H-frame is then used to demonstrate the algorithms on a physical structure.
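A minimal numerical sketch of a minimum-variance updating step of the kind described above; the sensitivity matrix, covariances, and values are hypothetical, and the independence assumption noted above is retained:

```python
import numpy as np

def min_variance_update(theta, V_theta, z_meas, z_model, G, V_z):
    """One minimum-variance updating step: theta holds the FE parameters,
    z the modal quantities, G = dz/dtheta the sensitivity matrix.
    Assumes parameter and measurement errors are independent."""
    S = G @ V_theta @ G.T + V_z              # covariance of the residual
    K = V_theta @ G.T @ np.linalg.inv(S)     # gain weighting the residual
    return theta + K @ (z_meas - z_model)    # updated parameter estimates

# Toy 2-parameter, 3-measurement example (all values hypothetical)
theta = np.array([1.0, 2.0])
V_theta = np.diag([0.1, 0.1])
G = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z_model = G @ theta
z_meas = z_model + np.array([0.05, -0.02, 0.01])
theta_new = min_variance_update(theta, V_theta, z_meas, z_model, G, np.eye(3) * 0.01)
print(theta_new)
```

In an iterative scheme this step would be repeated with G and z_model re-evaluated at each new parameter estimate, which is where the independence assumption breaks down after the first iteration.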
Abstract:
The northern Antarctic Peninsula is one of the fastest changing regions on Earth. The disintegration of the Larsen-A Ice Shelf in 1995 caused tributary glaciers to adjust by speeding up, surface lowering, and overall increased ice-mass discharge. In this study, we investigate the temporal variation of these changes at the Dinsmoor-Bombardier-Edgeworth glacier system by analyzing dense time series from various spaceborne and airborne Earth observation missions, covering precollapse ice shelf conditions and subsequent adjustments through 2014. Our results show a response of the glacier system some months after the breakup, reaching maximum surface velocities at the glacier front of up to 8.8 m/d in 1999 and a subsequent decrease to ~1.5 m/d in 2014. Using a dense time series of interferometrically derived TanDEM-X digital elevation models and photogrammetric data, an exponential function was fitted to the decrease in surface elevation. Elevation changes in areas below 1000 m a.s.l. amounted to at least 130±15 m between 1995 and 2014, with change rates of ~3.15 m/a between 2003 and 2008. Current change rates (2010-2014) are in the range of 1.7 m/a. Mass imbalances were computed with different scenarios of boundary conditions. The most plausible results amount to -40.7±3.9 Gt. The contribution to sea level rise was estimated to be 18.8±1.8 Gt, corresponding to a 0.052±0.005 mm sea level equivalent, for the period 1995-2014. Our analysis and scenario considerations revealed that major uncertainties still exist due to insufficiently accurate ice-thickness information. The second largest uncertainty in the computations was the glacier surface mass balance, which is still poorly known. Our time series analysis facilitates improved comparison with GRACE data and provides input to modeling of glacio-isostatic uplift in this region. The study contributes to a better understanding of how glacier systems adjust to ice shelf disintegration.
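The exponential fitting of the post-collapse change can be sketched with synthetic numbers (illustrative values only, not the TanDEM-X measurements), here for a decaying elevation-change rate via a log-linear fit:

```python
import numpy as np

# Sketch: fit an exponential decay to post-collapse elevation-change rates,
# dh/dt(t) = r0 * exp(-t / tau); r0 and tau below are hypothetical.
t = np.arange(0.0, 19.0)                 # years since the 1995 collapse
true_rate = 4.0 * np.exp(-t / 12.0)      # hypothetical r0 = 4 m/a, tau = 12 a
rng = np.random.default_rng(2)
rate = true_rate * np.exp(rng.normal(scale=0.02, size=t.size))  # noisy "data"

# Log-linearise: log(rate) = log(r0) - t / tau, then a straight-line fit
slope, intercept = np.polyfit(t, np.log(rate), 1)
tau, r0 = -1.0 / slope, np.exp(intercept)
print(round(tau, 1), round(r0, 2))
```

The fitted tau and r0 recover the synthetic inputs; applied to real data, the fit summarizes how fast the change rates relax after the ice-shelf breakup.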
Abstract:
Systematic, high-quality observations of the atmosphere, oceans and terrestrial environments are required to improve understanding of climate characteristics and the consequences of climate change. The overall aim of this report is to carry out a comparative assessment of approaches taken to addressing the state of European observations systems and related data analysis by some leading actors in the field. This research reports on approaches to climate observations and analyses in Ireland, Switzerland, Germany, The Netherlands and Austria and explores options for a more coordinated approach to national responses to climate observations in Europe. The key aspects addressed are: an assessment of approaches to develop GCOS and provision of analysis of GCOS data; an evaluation of how these countries are reporting development of GCOS; highlighting best practice in advancing GCOS implementation including analysis of Essential Climate Variables (ECVs); a comparative summary of the differences and synergies in terms of the reporting of climate observations; an overview of relevant European initiatives and recommendations on how identified gaps might be addressed in the short to medium term.
Abstract:
Aerial observations of light pollution can fill an important gap between ground based surveys and nighttime satellite data. Terrestrially bound surveys are labor intensive and are generally limited to a small spatial extent, and while existing satellite data cover the whole world, they are limited to coarse resolution. This paper describes the production of a high resolution (1 m) mosaic image of the city of Berlin, Germany at night. The dataset is spatially analyzed to identify the major sources of light pollution in the city based on urban land use data. An area-independent 'brightness factor' is introduced that allows direct comparison of the light emission from differently sized land use classes, and the percentage area with values above average brightness is calculated for each class. Using this methodology, lighting associated with streets has been found to be the dominant source of zenith directed light pollution (31.6%), although other land use classes have much higher average brightness. These results are compared with other urban light pollution quantification studies. The minimum resolution required for an analysis of this type is found to be near 10 m. Future applications of high resolution datasets such as this one could include: studies of the efficacy of light pollution mitigation measures, improved light pollution simulations, economic and energy use, the relationship between artificial light and ecological parameters (e.g. circadian rhythm, fitness, mate selection, species distributions, migration barriers and seasonal behavior), or the management of nightscapes. To encourage further scientific inquiry, the mosaic data is freely available at Pangaea.
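A toy sketch of the area-independent comparison described above, with hypothetical pixel values and classes rather than the Berlin mosaic: each class's mean brightness is divided by the citywide mean, and the fraction of class pixels above that mean is reported.

```python
import numpy as np

# Hypothetical per-pixel brightness values and their land-use labels
brightness = np.array([5.0, 1.0, 3.0, 8.0, 2.0, 7.0])
land_use = np.array(['street', 'park', 'street', 'industry', 'park', 'industry'])

city_mean = brightness.mean()
for cls in np.unique(land_use):
    px = brightness[land_use == cls]
    factor = px.mean() / city_mean           # area-independent 'brightness factor'
    above = 100.0 * (px > city_mean).mean()  # % of class pixels above average
    print(cls, round(factor, 2), above)
```

Normalizing by the citywide mean is what makes small but intensely lit classes comparable to large, dimmer ones such as streets.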
Abstract:
This paper discusses some aspects of hunter-gatherer spatial organization in southern South Patagonia during the last 10,000 cal yr BP. Various methods of spatial analysis, implemented within a Geographic Information System (GIS), were applied to the distributional pattern of archaeological sites with radiocarbon dates. Shifts in the distributional pattern of chronological information were assessed in conjunction with other lines of evidence within a biogeographic framework. Accordingly, the varying degrees of occupation and integration of coastal and interior spaces in human spatial organization are explained in association with the adaptive strategies hunter-gatherers have used over time. Both are part of the same human response to changes in risk and uncertainty in the region, in terms of resource availability and environmental dynamics.
Abstract:
The protein folding problem has been one of the most challenging subjects in biological physics due to its complexity. Energy landscape theory based on statistical mechanics provides a thermodynamic interpretation of the protein folding process. We have been working to answer fundamental questions about protein-protein and protein-water interactions, which are very important for describing the energy landscape surface of proteins correctly. First, we present a new method for computing protein-protein interaction potentials of solvated proteins directly from SAXS data. An ensemble of proteins was modeled by Metropolis Monte Carlo and Molecular Dynamics simulations, and the global X-ray scattering of the whole model ensemble was computed at each snapshot of the simulation. The interaction potential model was optimized iteratively by a Levenberg-Marquardt algorithm. Secondly, we report that terahertz spectroscopy directly probes hydration dynamics around proteins and determines the size of the dynamical hydration shell. We also present the sequence and pH dependence of the hydration shell and the effect of hydrophobicity. In addition, kinetic terahertz absorption (KITA) spectroscopy is introduced to study the refolding kinetics of ubiquitin and its mutants. KITA results are compared to small-angle X-ray scattering, tryptophan fluorescence, and circular dichroism results. We propose that KITA monitors the rearrangement of hydrogen bonding during secondary structure formation. Finally, we present the development of the automated single molecule operating system (ASMOS) for a high-throughput single molecule detector, which levitates a single protein molecule in a 10 µm diameter droplet by laser guidance. I have also performed supporting calculations and simulations with my own program codes.
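The ensemble modelling mentioned above relies on Metropolis Monte Carlo sampling. A minimal, generic sketch of a Metropolis sampler follows, with a toy harmonic potential standing in for the protein energy function (in the actual method, X-ray scattering would be computed per accepted snapshot):

```python
import numpy as np

def metropolis(energy, x0, steps, beta=1.0, step_size=0.5, rng=None):
    """Minimal 1-D Metropolis Monte Carlo sampler (illustrative sketch)."""
    rng = rng or np.random.default_rng(0)
    x, e = x0, energy(x0)
    samples = []
    for _ in range(steps):
        xp = x + rng.normal(scale=step_size)     # propose a random move
        ep = energy(xp)
        # Accept downhill moves always, uphill moves with Boltzmann probability
        if ep <= e or rng.random() < np.exp(-beta * (ep - e)):
            x, e = xp, ep
        samples.append(x)
    return np.array(samples)

# Harmonic toy potential: equilibrium variance should approach 1/beta
s = metropolis(lambda x: 0.5 * x**2, x0=0.0, steps=5000)
print(round(s[2500:].var(), 2))
```

Discarding the first half of the chain as burn-in is a common, if crude, convergence precaution.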
Abstract:
Ethernet connections, which are widely used in many computer networks, can suffer from electromagnetic interference. Typically, a degradation of the data transmission rate is perceived, as electromagnetic disturbances lead to corruption of data frames on the network media. In this paper a software-based measuring method is presented which allows a direct assessment of the effects on the link layer. The results can be linked directly to the physical interaction, without the influence of software-related effects on higher protocol layers. This provides a simple tool for a quantitative analysis of the disturbance of an Ethernet connection based on time-domain data. An example shows how the data can be used for further investigation of interference mechanisms and for the detection of intentional electromagnetic attacks. © 2015 Author(s).
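As a generic sketch (not the authors' tool), a link-layer error rate can be estimated by sampling a monotonic frame-error counter twice; the counter source is platform-specific, e.g. `/sys/class/net/<iface>/statistics/rx_errors` on Linux:

```python
import time

def frame_error_rate(read_counter, interval=1.0):
    """Estimate link-layer frame errors per second from two samples of a
    monotonic error counter (sketch; the counter source is platform-specific)."""
    first = read_counter()
    time.sleep(interval)
    second = read_counter()
    return (second - first) / interval

# Demo with a fake counter standing in for the NIC statistics:
# 50 new errors over 0.01 s gives 5000 errors/s
samples = iter([100, 150])
rate = frame_error_rate(lambda: next(samples), interval=0.01)
print(rate)
```

Time-stamping each sample instead of assuming a fixed interval would make the estimate robust against scheduling jitter, which matters at the short intervals used for interference studies.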
Abstract:
Doctorate in Economics
Abstract:
To exploit the full potential of radio measurements of cosmic-ray air showers at MHz frequencies, a detector timing synchronization within 1 ns is needed. Large distributed radio detector arrays such as the Auger Engineering Radio Array (AERA) rely on timing via the Global Positioning System (GPS) for the synchronization of individual detector station clocks. Unfortunately, GPS timing is expected to have an accuracy no better than about 5 ns. In practice, in particular in AERA, the GPS clocks exhibit drifts on the order of tens of ns. We developed a technique to correct for the GPS drifts, and an independent method is used to cross-check that indeed we reach a nanosecond-scale timing accuracy by this correction. First, we operate a "beacon transmitter" which emits defined sine waves detected by AERA antennas recorded within the physics data. The relative phasing of these sine waves can be used to correct for GPS clock drifts. In addition to this, we observe radio pulses emitted by commercial airplanes, the position of which we determine in real time from Automatic Dependent Surveillance Broadcasts intercepted with a software-defined radio. From the known source location and the measured arrival times of the pulses we determine relative timing offsets between radio detector stations. We demonstrate with a combined analysis that the two methods give a consistent timing calibration with an accuracy of 2 ns or better. Consequently, the beacon method alone can be used in the future to continuously determine and correct for GPS clock drifts in each individual event measured by AERA.
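The beacon phasing idea can be sketched as follows: the phase of a known-frequency sine in a station's trace yields the clock offset modulo one beacon period. The frequencies and the offset below are hypothetical, not AERA's actual beacon parameters:

```python
import numpy as np

def beacon_time_offset(signal, fs, f_beacon):
    """Estimate a station clock offset from the phase of a known beacon sine
    in the recorded trace (sketch; returns seconds, modulo one beacon period)."""
    t = np.arange(signal.size) / fs
    # Project the trace onto quadrature references at the beacon frequency
    i = np.sum(signal * np.cos(2 * np.pi * f_beacon * t))
    q = np.sum(signal * np.sin(2 * np.pi * f_beacon * t))
    phase = np.arctan2(q, i)
    return phase / (2 * np.pi * f_beacon)

fs, f_beacon = 200e6, 50e6      # hypothetical sampling and beacon frequencies
delay = 4e-9                    # 4 ns clock offset to recover
t = np.arange(1000) / fs
sig = np.cos(2 * np.pi * f_beacon * (t - delay))
offset = beacon_time_offset(sig, fs, f_beacon)
print(offset)
```

Because the phase wraps, the method tracks clock drifts relative to the beacon rather than absolute time, which is exactly what correcting the GPS drifts requires.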
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Recurrences are close returns of a given state in a time series and can be used to identify different dynamical regimes and other related phenomena, being particularly suited for analyzing experimental data. In this work, we use recurrence quantification analysis to investigate dynamical patterns in scalar data series obtained from measurements of floating potential and ion saturation current at the plasma edge of the Tokamak Chauffage Alfvén Brésilien [R. M. O. Galvão, Plasma Phys. Controlled Fusion 43, 1181 (2001)]. We consider plasma discharges with and without the application of radial electric bias, and also with two different regimes of current ramp. Our results indicate that biasing improves confinement by destroying highly recurrent regions within the plasma column that enhance particle and heat transport.
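A recurrence analysis of a scalar series starts from the recurrence matrix; a minimal sketch with toy data (no time-delay embedding, which a full analysis would include):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix R[i, j] = 1 when |x_i - x_j| < eps
    (sketch for a scalar series; embedding is omitted for brevity)."""
    d = np.abs(x[:, None] - x[None, :])      # pairwise distances
    return (d < eps).astype(int)

x = np.sin(np.linspace(0, 8 * np.pi, 200))   # toy periodic series
R = recurrence_matrix(x, eps=0.1)
recurrence_rate = R.mean()                   # fraction of recurrent pairs
print(round(recurrence_rate, 3))
```

The recurrence rate is the simplest of the recurrence quantifiers; measures such as determinism and laminarity are then computed from diagonal and vertical line structures in R.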
Abstract:
Traditionally, chronotype classification is based on the Morningness-Eveningness Questionnaire (MEQ). It is implicit in the classification that intermediate individuals give intermediate answers to most of the MEQ questions. However, a small group of individuals has a different pattern of answers: in some questions they answer as "morning-types" and in others as "evening-types," resulting in an intermediate total score. "Evening-type" and "morning-type" answers were set as A(1) and A(4), respectively; intermediate answers were set as A(2) and A(3). The following algorithm was applied: Bimodality Index = (ΣA(1) × ΣA(4))² − (ΣA(2) × ΣA(3))². Neither-types with positive bimodality scores were classified as bimodal. If our hypothesis is validated by objective data, an update of chronotype classification will be required. (Author correspondence: brunojm@ymail.com)
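The algorithm above translates directly into code; a short sketch, assuming answers are coded 1 to 4 and each Σ denotes the count of answers in that category:

```python
def bimodality_index(answers):
    """Bimodality Index from MEQ answer categories: each answer is coded
    1 (evening-type) through 4 (morning-type); a positive score in a
    neither-type respondent flags a bimodal answer pattern."""
    n = [answers.count(k) for k in (1, 2, 3, 4)]
    return (n[0] * n[3]) ** 2 - (n[1] * n[2]) ** 2

# A respondent mixing extreme morning and evening answers scores positive:
print(bimodality_index([1, 1, 4, 4, 2, 3]))   # (2*2)^2 - (1*1)^2 = 15
# A consistently intermediate respondent scores negative:
print(bimodality_index([2, 2, 3, 3, 2, 3]))   # (0*0)^2 - (3*3)^2 = -81
```

The product structure makes the score positive only when both extreme categories are populated, which is exactly the bimodal pattern the authors want to isolate.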