983 results for Digital Rock Physics
Abstract:
X-ray Raman scattering and x-ray emission spectroscopies were used to study the electronic properties and phase transitions in several condensed matter systems. The experimental work, carried out at the European Synchrotron Radiation Facility, was complemented by theoretical calculations of the x-ray spectra and of the electronic structure. The electronic structure of MgB2 at the Fermi level is dominated by the boron σ and π bands. The high density of states provided by these bands is the key feature of the electronic structure contributing to the high critical temperature of superconductivity in MgB2. The electronic structure of MgB2 can be modified by atomic substitutions, which introduce extra electrons or holes into the bands. X-ray Raman scattering was used to probe the σ- and π-band hole states in pure and aluminum-substituted MgB2. A method for determining the final-state density of electron states from experimental x-ray Raman scattering spectra was examined and applied to the experimental data on both pure MgB2 and Mg(0.83)Al(0.17)B2. The extracted final-state density of electron states for the pure and aluminum-substituted samples revealed clear substitution-induced changes in the σ and π bands. The experimental work was supported by theoretical calculations of the electronic structure and x-ray Raman spectra. X-ray emission at the metal Kβ line was applied to studies of pressure- and temperature-induced spin-state transitions in transition metal oxides. The experimental studies were complemented by cluster multiplet calculations of the electronic structure and emission spectra. In LaCoO3, evidence for the appearance of an intermediate spin state was found and the presence of a pressure-induced spin transition was confirmed. Pressure-induced changes in the electronic structure of transition metal monoxides were studied experimentally and were analyzed using the cluster multiplet approach.
The effects of hybridization, bandwidth and crystal-field splitting in stabilizing the high-pressure spin state were discussed. Emission spectroscopy at the Kβ line was also applied to FeCO3, and a pressure-induced iron spin-state transition was discovered.
Abstract:
The concept of digital citizenship has become increasingly important to our understanding of the relationship between media and political action, and the possibilities for democratization, decentralization, and diversification of power offered by the Internet. In the era of the digitalization of just about everything, citizenship and its related concepts are in processes of technology-driven transformation, with important implications for the global future of democratic culture. Isin and Ruppert’s book is a timely engagement with these questions, and with the emerging notion of the “digital subject.”
Abstract:
A large proportion of our knowledge about the surfaces of atmosphereless solar-system bodies is obtained through remote-sensing measurements. The measurements can be carried out either as ground-based telescopic observations or as space-based observations from orbiting spacecraft. In both cases, the measurement geometry normally varies during the observations due to the orbital motion of the target body, the spacecraft, etc. As a result, the data are acquired over a variety of viewing and illumination angles. Surfaces of planetary bodies are usually covered with a layer of loose, broken-up rock material called the regolith, whose physical properties affect the directional dependence of remote-sensed measurements. Understanding the processes behind this directional dependence is of utmost importance for the correct interpretation of remote-sensed data. In the thesis, the multi-angular effects that the physical properties of the regolith have on remote-sensing measurements are studied in two regimes of electromagnetic radiation, visible to near-infrared and soft X-rays. These effects are here termed generally the regolith effects in remote sensing. Although the physical mechanisms that are important in these regions are largely different, notable similarities arise in the methodology that is used in the study of the regolith effects, including the characterization of the regolith both in experimental studies and in numerical simulations. Several novel experimental setups have been constructed for the thesis. Alongside the experimental work, theoretical modelling has been carried out, and results from both approaches are presented. Modelling of the directional behaviour of light scattered from a regolith is utilized to obtain shape and spin-state information of several asteroids from telescopic observations and to assess the surface roughness and single-scattering properties of lunar maria from spacecraft observations.
One of the main conclusions is that the azimuthal direction is an important factor in detailed studies of planetary surfaces. In addition, even a single parameter, such as porosity, can alter the light scattering properties of a regolith significantly. Surface roughness of the regolith is found to alter the elemental fluorescence line ratios of a surface obtained through planetary soft X-ray spectrometry. The results presented in the thesis are among the first to report this phenomenon. Regolith effects need to be taken into account in the analysis of remote-sensed data, providing opportunities for retrieving physical parameters of the surface through inverse methods.
Abstract:
Instability in conventional haptic rendering destroys the perception of rigid objects in virtual environments. Inherent limitations in the conventional haptic loop restrict the maximum stiffness that can be rendered. In this paper we present a method to render virtual walls that are much stiffer than those achieved by conventional techniques. By removing the conventional digital haptic loop and replacing it with a part-continuous and part-discrete-time hybrid haptic loop, we were able to render stiffer walls. The control loop is implemented as a combinational logic circuit on a field-programmable gate array. We compared the performance of the conventional haptic loop and our hybrid haptic loop on the same haptic device, and present mathematical analysis to show the limit of stability of our device. Our hybrid method removes the compute-intensive haptic loop from the CPU; this can free a significant amount of resources that can be used for other purposes such as graphical rendering and physics modeling. It is our hope that, in the future, similar designs will lead to a haptics processing unit (HPU).
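The stiffness ceiling of a sampled haptic loop is often illustrated with the classic passivity condition for a discrete-time virtual wall, K < 2B/T. The sketch below uses that standard textbook condition, not the paper's own stability analysis, and the damping value and loop rates are hypothetical numbers chosen only to show why a faster (FPGA-speed) loop can render a much stiffer wall:

```python
# Illustrative sketch (standard passivity condition, not the paper's analysis):
# a sampled virtual wall of stiffness K remains passive only if K < 2*B/T,
# where B is the physical damping and T is the loop period.

def max_passive_stiffness(damping_b, period_t):
    """Upper bound on virtual-wall stiffness (N/m) for a sampled haptic loop."""
    return 2.0 * damping_b / period_t

# Hypothetical comparison: 1 kHz software loop vs. a 1 MHz hardware loop,
# both with 1 N*s/m of physical damping.
k_cpu = max_passive_stiffness(1.0, 1e-3)   # 1 kHz loop -> 2,000 N/m
k_fpga = max_passive_stiffness(1.0, 1e-6)  # 1 MHz loop -> 2,000,000 N/m
```

Under these assumed numbers, shortening the loop period by three orders of magnitude raises the renderable stiffness bound by the same factor, which is the intuition behind moving the loop off the CPU.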
Abstract:
Information technology and its use in organizational transformation present unprecedented opportunities and risks. Governance of Enterprise Information and Technology (GEIT) competency is increasingly needed in the boardroom and the executive. Whether your organization is small or large; public, private or not-for-profit; or in an industry not considered high-tech, IT is impacting your sector – no exceptions. But there is a skill shortage in boards: GEIT capability is concerningly low. This capability is urgently needed across the board, including among directors who come from finance, legal, marketing, operations and HR backgrounds. Digital disruption also affects all occupations. Putting a vision in place will help ensure that emergency responses meet technology-related duty-of-care responsibilities. When GEIT-related forward thinking and planning is carried out at the same time that you put your business strategy and plan in place, your organization has a significantly increased chance of not only surviving, but thriving into the future. Organizations that don't build GEIT capability risk joining the growing list of once-leading firms left behind in the digital 'cloud of smoke'. Those that do will be better placed to reap the benefits and hedge against the risks of a digital world. This chapter provides actionable, research-based considerations and processes that boards can use to build awareness, knowledge and skills in governing technology-related organization strategy, risk and value creation.
Abstract:
A simple analog instrumentation system for Electrical Impedance Tomography (EIT) is developed and calibrated using practical phantoms. A constant current injector, consisting of a modified Howland voltage-controlled current source fed by a voltage-controlled oscillator, is developed to inject a constant current at the phantom boundary. An instrumentation amplifier, a 50 Hz notch filter and a narrow band-pass filter are developed and used for signal conditioning. Practical biological phantoms are developed and the forward problem is studied to calibrate the EIT instrumentation. An array of sixteen stainless steel electrodes is developed and placed inside the phantom tank filled with KCl solution. A 1 mA, 50 kHz sinusoidal current is injected at the phantom boundary using the adjacent current injection protocol. The differential potentials developed at the voltage electrodes are measured for sixteen current injections. The differential voltage signal is passed through an instrumentation amplifier and a filtering block and measured by a digital multimeter. A forward solver is developed using the Finite Element Method in MATLAB 7.0 for solving the EIT governing equation. Differential potentials are numerically calculated using the forward solver with a simulated current and bathing-solution conductivity. The measured potential data are compared with the calculated differential potentials to calibrate the instrumentation, so that the acquired voltage data are suitable for better image reconstruction.
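The adjacent current injection protocol named in the abstract has a simple combinatorial structure: current is driven through each neighbouring electrode pair in turn, and differential voltages are read from the remaining neighbouring pairs that do not touch a current-carrying electrode. A minimal sketch of that measurement pattern for the sixteen-electrode array (the function name and structure are illustrative, not from the paper):

```python
# Hedged sketch of the adjacent (neighbouring) EIT measurement protocol:
# drive each adjacent electrode pair, measure all adjacent pairs that
# share no electrode with the drive pair.

N = 16  # electrodes on the phantom boundary, as in the abstract

def adjacent_protocol(n=N):
    frames = []
    for i in range(n):                    # current injected at pair (i, i+1)
        drive = (i, (i + 1) % n)
        reads = []
        for j in range(n):                # candidate measurement pair (j, j+1)
            pair = (j, (j + 1) % n)
            if set(pair) & set(drive):    # skip current-carrying electrodes
                continue
            reads.append(pair)
        frames.append((drive, reads))
    return frames

frames = adjacent_protocol()
total = sum(len(reads) for _, reads in frames)  # 16 injections x 13 pairs = 208
```

Each injection yields 13 usable differential voltages (three of the 16 adjacent pairs touch a drive electrode), giving 208 raw measurements per frame.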
Abstract:
The magnetic field of the Earth is 99 % of internal origin and is generated in the outer liquid core by the dynamo principle. In the 19th century, Carl Friedrich Gauss proved that the field can be described by a sum of spherical harmonic terms. Presently, this theory is the basis of, e.g., the IGRF models (International Geomagnetic Reference Field), which are the most accurate description available of the geomagnetic field. On average, the dipole forms 3/4 and non-dipolar terms 1/4 of the instantaneous field, but the temporal mean of the field is assumed to be a pure geocentric axial dipolar field. The validity of this GAD (Geocentric Axial Dipole) hypothesis has been estimated using several methods. In this work, the testing rests on the frequency dependence of inclination with respect to latitude. Each combination of dipole (GAD), quadrupole (G2) and octupole (G3) produces a distinct inclination distribution. These theoretical distributions have been compared with those calculated from empirical observations from different continents and, last, from the entire globe. Only data from Precambrian rocks (over 542 million years old) have been used in this work. The basic assumption is that during the long-term course of drifting continents, the globe is sampled adequately. There were 2823 observations altogether in the paleomagnetic database of the University of Helsinki. The effects of the quality of observations, as well as of the age and rock type, have been tested. For comparison between theoretical and empirical distributions, chi-square testing has been applied. In addition, spatiotemporal binning has effectively been used to remove the errors caused by multiple observations. The modelling from igneous rock data indicates that the average magnetic field of the Earth is best described by a combination of a geocentric dipole and a very weak octupole (less than 10 % of GAD).
Filtering and binning gave the distributions a more GAD-like appearance, but deviation from GAD increased as a function of the age of the rocks. The distribution calculated from so-called keypoles, the most reliable determinations, behaves almost like GAD, having a zero quadrupole and an octupole 1 % of GAD. No earlier study of rocks older than 400 Ma has given a result so close to GAD, but low inclinations have been prominent, especially in the sedimentary data. Despite these results, a greater amount of high-quality data and a proof of the long-term randomness of the Earth's continental motions are needed to confirm that the dipole model holds true.
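The inclination-versus-latitude dependence that the test rests on has a closed form in the pure-GAD case: tan(I) = 2 tan(λ), where I is the inclination and λ the latitude. A short sketch of that relation (the actual work also folds in quadrupole and octupole terms, which are not modelled here):

```python
import math

# Illustrative sketch: inclination predicted by a pure geocentric axial
# dipole (GAD) at geographic latitude lam follows tan(I) = 2 * tan(lam).
# Theoretical inclination distributions built from this relation (plus G2
# and G3 terms in the actual study) are compared against observed ones.

def gad_inclination(latitude_deg):
    """Inclination (degrees) of a pure GAD field at the given latitude."""
    lam = math.radians(latitude_deg)
    return math.degrees(math.atan(2.0 * math.tan(lam)))

print(gad_inclination(0.0))   # horizontal field at the equator, 0 degrees
print(gad_inclination(60.0))  # steep field at high latitude, about 73.9 degrees
```

Because the field steepens faster than latitude, an excess of shallow observed inclinations (as noted for the sedimentary data) pulls the fitted model away from pure GAD.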
Abstract:
This article discusses the physics programme of the TOTEM experiment at the LHC. A new special beam optics with beta* = 90 m, enabling measurements of the total cross-section, elastic pp scattering and diffractive phenomena already in early LHC runs, is explained. For this and the various other TOTEM running scenarios, the acceptances of the leading-proton detectors and of the forward tracking stations for some physics processes are described.
Abstract:
Two algorithms are outlined, each of which has interesting features for modeling the spatial variability of rock depth. In this paper, the reduced level of rock at Bangalore, India, is obtained from 652 borehole data points in an area covering 220 sq. km. Support vector machine (SVM) and relevance vector machine (RVM) models have been utilized to predict the reduced level of rock in the subsurface of Bangalore and to study the spatial variability of the rock depth. The SVM, which is firmly based on statistical learning theory, uses a regression technique built on an epsilon-insensitive loss function. The RVM is a probabilistic model similar to the widespread SVM, but one whose training takes place in a Bayesian framework. Prediction results show the ability of these learning machines to build accurate models of the spatial variability of rock depth with strong predictive capabilities. The paper also highlights the capability of the RVM over the SVM model.
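The epsilon-insensitive loss at the heart of the SVM regression named above penalizes nothing inside a tolerance band of width epsilon and only the excess outside it. A minimal pure-Python sketch with hypothetical borehole numbers (the data and epsilon value below are illustrative, not from the paper):

```python
# Hedged sketch of the epsilon-insensitive loss used by support vector
# regression: residuals smaller than eps cost nothing; larger residuals
# cost only their excess over eps.

def eps_insensitive_loss(y_true, y_pred, eps=0.5):
    return sum(max(0.0, abs(t - p) - eps) for t, p in zip(y_true, y_pred))

# Hypothetical reduced rock levels (m) at four boreholes vs. model predictions.
observed = [10.0, 12.5, 9.0, 11.0]
predicted = [10.3, 12.0, 9.9, 11.1]
# Residuals are 0.3, 0.5, 0.9, 0.1; only the 0.9 exceeds eps, costing ~0.4.
loss = eps_insensitive_loss(observed, predicted)
```

This tolerance band is what makes SVM regression robust to small scatter in the borehole data, since points within epsilon of the fitted surface do not affect the solution.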
Abstract:
A search for new physics using three-lepton (trilepton) data collected with the CDF II detector and corresponding to an integrated luminosity of 976 pb-1 is presented. The standard model predicts a low rate of trilepton events, which makes some supersymmetric processes, such as chargino-neutralino production, measurable in this channel. The mu+mu+l signature is investigated, where l is an electron or a muon, with the additional requirement of large missing transverse energy. In this analysis, the lepton transverse momenta with respect to the beam direction (pT) are as low as 5 GeV/c, a selection that improves the sensitivity to particles that are light as well as to those that result in leptonically decaying tau leptons. At the same time, this low-pT selection presents additional challenges due to the non-negligible heavy-quark background at low lepton momenta. This background is measured with an innovative technique using experimental data. Several dimuon and trilepton control regions are investigated, and good agreement between experimental results and standard-model predictions is observed. In the signal region, we observe one three-muon event and expect 0.4 +/- 0.1 mu+mu+l events.
Abstract:
A detailed study is presented of the expected performance of the ATLAS detector. The reconstruction of tracks, leptons, photons, missing energy and jets is investigated, together with the performance of b-tagging and the trigger. The physics potential for a variety of interesting physics processes, within the Standard Model and beyond, is examined. The study comprises a series of notes based on simulations of the detector and physics processes, with particular emphasis given to the data expected from the first years of operation of the LHC at CERN.
Abstract:
A hybrid computer for structure factor calculations in X-ray crystallography is described. The computer can calculate three-dimensional structure factors of up to 24 atoms in a single run and can generate the scatter functions of well over 100 atoms using the Vand et al. or the Forsyth and Wells approximations. The computer is essentially a digital computer with analog function generators, thus combining to advantage the economic data storage of digital systems with the simple computing circuitry of analog systems. The digital part serially selects the data, computes and feeds the arguments into specially developed high-precision digital-analog function generators; their outputs, being d.c. voltages, are further processed by analog circuits, and finally the sequential adder, which employs a novel digital voltmeter circuit, converts them back into digital form and accumulates them in a dekatron counter that displays the final result. The computer is also capable of carrying out 1-, 2- or 3-dimensional Fourier summation, although in this case the lack of sufficient storage space for the large number of coefficients involved is a serious limitation at present.
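The quantity this machine accumulated is the standard crystallographic structure-factor sum, F(hkl) = Σ_j f_j exp(2πi(hx_j + ky_j + lz_j)) over the atoms of the unit cell. A short modern sketch of that sum (the atom list below is hypothetical, a body-centred pair chosen to show systematic extinction):

```python
import cmath

# Sketch of the computation the hybrid machine performed:
# F(hkl) = sum_j f_j * exp(2*pi*i*(h*x_j + k*y_j + l*z_j))
# over atoms at fractional coordinates (x, y, z) with scatter factor f_j.

def structure_factor(hkl, atoms):
    """atoms: list of (f_j, (x, y, z)) in fractional unit-cell coordinates."""
    h, k, l = hkl
    return sum(f * cmath.exp(2j * cmath.pi * (h * x + k * y + l * z))
               for f, (x, y, z) in atoms)

# Hypothetical body-centred pair of identical atoms.
atoms = [(1.0, (0.0, 0.0, 0.0)), (1.0, (0.5, 0.5, 0.5))]
print(round(abs(structure_factor((1, 1, 0), atoms)), 6))  # in phase: 2.0
print(round(abs(structure_factor((1, 0, 0), atoms)), 6))  # extinct: 0.0
```

Each term of the sum is exactly the argument-to-d.c.-voltage conversion the analog function generators performed, one atom at a time, before the sequential adder accumulated the result.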
Abstract:
We investigate the effects of new physics scenarios containing a high-mass vector resonance on top pair production at the LHC, using the polarization of the produced top. In particular, we use kinematic distributions of the secondary lepton coming from top decay, which depend on top polarization; it has been shown that the angular distribution of the decay lepton is insensitive to the anomalous tbW vertex and hence is a pure probe of new physics in top quark production. Spin-sensitive variables involving the decay lepton are used to probe top polarization. Some sensitivity is found to the new couplings of the top.
Abstract:
This research is connected with an education development project for the four-year officer education program at the National Defence University. In this curriculum, physics was studied in two alternative course plans, namely scientific and general. Observations connected to the latter, e.g. student feedback and learning outcomes, gave indications that action was needed to support the course. The reform work was focused on the production of aligned, course-related instructional material. The learning material project produced a customized textbook set for the students of the general basic physics course. The research adapts phases that are typical of Design-Based Research (DBR). The research analyses the feature requirements for a physics textbook aimed at a specific sector, examines frames supporting instructional material development, and summarizes the experiences gained in the learning material project when the selected frames were applied. The quality of instructional material is an essential part of qualified teaching. The goal of instructional material customization is to increase the product's customer-centric nature and to enhance its function as a support medium for the learning process. Textbooks are still one of the core elements in physics teaching. The idea of a textbook will remain, but its form and appearance may change according to the prevailing technology. The work deals with substance-connected frames (demands on a physics textbook according to the PER viewpoint, quality thinking in educational material development), frames of university pedagogy, and instructional material production processes. A wide knowledge and understanding of the different frames is useful in development work, if they are to be utilized to aid inspiration without limiting new reasoning and new kinds of models. Applying customization even in the frame utilization supports creative and situation-aware design and diminishes the gap between theory and practice.
Generally, physics teachers produce their own supplementary instructional material. Even though customization thinking is not unknown, the threshold to produce an entire textbook might be high. Although the observations here are from the general physics course at the NDU, the research also gives tools for development in other discipline-related educational contexts. This research is an example of instructional material development work, together with the questions it uncovers, and presents thoughts on when textbook customization is rewarding. At the same time, the research aims to further creative customization thinking in instruction and development. Key words: physics textbook, PER (Physics Education Research), instructional quality, customization, creativity