940 results for Detector alignment and calibration methods (lasers, sources, particle-beams)
Abstract:
Senior thesis written for Oceanography 445
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
A new theory of particle discharge in high tension roll (HTR) separation is presented. The discharge dynamics of an isolated charged particle resting on a conducting surface are studied first. The analysis is then extended to particle discharge in a homogeneous particle bed. Finally, the paper examines the more realistic scenario of particle discharge in a non-homogeneous particle bed. The consequences of the resulting theory for HTR separation are discussed, and predictions from the new theory are tested against experimental HTR separations at the pilot scale. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
A platinum (Pt) on pure ceria (CeO2) anode supported on carbon black (CB) was synthesized by a combined precipitation and co-impregnation process. The electrochemical activity of the methanol oxidation reaction on the synthesized Pt-CeO2/CB anodes was investigated by cyclic voltammetry and chronoamperometry. To improve the Pt-CeO2/CB anode, the influence of CeO2 particle morphology and particle size on anode properties was examined. The morphology and particle size of the pure CeO2 particles could be controlled by changing the preparation conditions. The anode properties (i.e., peak current density and onset potential for methanol oxidation) were improved by using nanosized CeO2 particles, indicating that a larger surface area and higher activity at the CeO2 surface improve the anode properties. The particle morphology of CeO2 had only a small influence on anode properties. The onset potential for the methanol oxidation reaction on Pt-CeO2/CB containing high-surface-area CeO2 was shifted to a lower potential compared with that on anodes containing low-surface-area CeO2. The onset potential on Pt-CeO2/CB at 60 degrees C became similar to that on a commercially available Pt-Ru/carbon anode. We suggest that the rate-determining steps of the methanol oxidation reaction on Pt-CeO2/CB and commercially available Pt-Ru/carbon anodes differ, which accounts for the difference in performance. In the reaction mechanism on Pt-CeO2/CB, we conclude that oxygen species released from the surface of the CeO2 particles contribute to the oxidation of CO species adsorbed on the Pt surface. This suggests that the Pt-CeO2/CB anode could improve the operation of direct methanol fuel cells at 80 degrees C through enhanced diffusion of oxygen species created at the surface of nanosized CeO2 particles. We therefore conclude that fabrication of nanosized CeO2 with a high surface area is a key factor in the development of a high-quality Pt-CeO2/CB anode for direct methanol fuel cells.
Abstract:
A strategy for the production and subsequent characterization of biofunctionalized silica particles is presented. The particles were engineered to produce a bifunctional material capable of both (a) the attachment of fluorescent dyes for particle encoding and (b) the sequential modification of the particle surface to couple oligonucleotide probes. A combination of microscopic and analytical methods was implemented to demonstrate that modification of the particles with 3-aminopropyl trimethoxysilane results in an even distribution of amine groups across the particle surface. Evidence is provided that there are negligible interactions between the bound fluorescent dyes and the attached biomolecules. A unique approach was adopted to provide direct quantification of the oligonucleotide probe loading on the particle surface through X-ray photoelectron spectroscopy, a technique that may have a major impact on current researchers and users of bead-based technologies. A simple hybridization assay showing high sequence specificity is included to demonstrate the applicability of these particles to DNA screening.
Abstract:
The importance of availability of comparable real income aggregates and their components to applied economic research is highlighted by the popularity of the Penn World Tables. Any methodology designed to achieve such a task requires the combination of data from several sources. The first is purchasing power parities (PPP) data available from the International Comparisons Project roughly every five years since the 1970s. The second is national level data on a range of variables that explain the behaviour of the ratio of PPP to market exchange rates. The final source of data is the national accounts publications of different countries which include estimates of gross domestic product and various price deflators. In this paper we present a method to construct a consistent panel of comparable real incomes by specifying the problem in state-space form. We present our completed work as well as briefly indicate our work in progress.
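The state-space formulation the abstract mentions can be illustrated with a minimal example. The sketch below is not the paper's actual specification (which links PPPs, exchange rates and national accounts data); it is a hedged illustration of filtering one latent series in state-space form, here a local-level model, with the function name `kalman_local_level` and all observation values invented for the example.

```python
# Minimal Kalman filter for a local-level state-space model:
# latent level follows a random walk, observed with noise.
def kalman_local_level(ys, q=0.1, r=1.0):
    """Filter observations ys; q = state noise variance, r = observation noise variance."""
    level, var = ys[0], 1.0       # initialize at the first observation
    estimates = []
    for y in ys:
        var += q                  # predict: level_t = level_{t-1} + state noise
        k = var / (var + r)       # Kalman gain
        level += k * (y - level)  # update with the current observation
        var *= (1 - k)
        estimates.append(level)
    return estimates

# Invented yearly ratio-type series, smoothed by the filter.
smooth = kalman_local_level([1.0, 1.2, 0.9, 1.1, 1.4, 1.3])
```

Each filtered estimate is a convex combination of the previous estimate and the new observation, so the output stays within the range of the data while damping year-to-year noise.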
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
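As a hedged illustration of the naive mean field method the blurb refersates to, the sketch below iterates the self-consistency equations m_i = tanh(beta * (sum_j J_ij m_j + h_i)) for a small Ising-type model; the couplings, field values and function name are invented for the example.

```python
import math

# Naive mean-field approximation: replace the joint distribution by a product
# of independent marginals and iterate the self-consistency equations
# m_i = tanh(beta * (sum_j J_ij m_j + h_i)) to a fixed point.
def mean_field_magnetizations(J, h, beta=1.0, iters=200, tol=1e-10):
    n = len(h)
    m = [0.1] * n  # small symmetry-breaking initialization
    for _ in range(iters):
        new_m = [math.tanh(beta * (sum(J[i][j] * m[j] for j in range(n)) + h[i]))
                 for i in range(n)]
        if max(abs(a - b) for a, b in zip(new_m, m)) < tol:
            m = new_m
            break
        m = new_m
    return m

# Three ferromagnetically coupled spins in a weak positive field.
J = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]
h = [0.1, 0.1, 0.1]
m = mean_field_magnetizations(J, h)
```

The exact marginals would require summing over all 2^n spin configurations; the mean-field fixed-point iteration costs only O(n^2) per sweep, which is the computational saving the book's methods generalize and improve upon.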
Abstract:
The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach places greater emphasis on the potential pollution sources than the traditional approach, where assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated from the same distribution as historically occurring events or from an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as a risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch processing file. GIS software was employed to produce the input files and to present the results. The functionality of the method, as well as its sensitivity to model grid size, contaminant loading rates, length of stress periods, and the historical frequency of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of applying the method are presented in the form of tables, graphs and spatial maps.
Varying the model grid size indicates no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. The migration of the contaminant plume also advances faster with coarse grids than with finer grids. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases nonlinearly with the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution is a major strength of the method and a significant advantage over contemporary risk and vulnerability methods.
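The Monte Carlo risk loop the abstract describes can be sketched in a drastically simplified form. The real method couples MODFLOW-2000 and MT3DMS for the flow and transport simulation; in the hedged sketch below a first-order decay factor stands in for the transport model, and every parameter value and name is invented for illustration.

```python
import random

# Simplified Monte Carlo risk loop: each day, a random number below the event
# probability triggers a synthetic contaminant source term; risk is the
# fraction of realizations in which concentration ever exceeds the threshold.
def pollution_risk(p_event=0.05, mass_per_event=10.0, decay=0.9,
                   threshold=15.0, days=365, realizations=500, seed=42):
    rng = random.Random(seed)
    exceed = 0
    for _ in range(realizations):
        conc = 0.0
        hit = False
        for _ in range(days):
            if rng.random() < p_event:   # random number < probability of occurrence
                conc += mass_per_event   # synthetic contaminant source term
            conc *= decay                # crude stand-in for advection/dispersion losses
            if conc > threshold:
                hit = True
        if hit:
            exceed += 1
    return exceed / realizations
```

Computed at every monitoring point, this exceedance fraction is what the thesis maps spatially as risk; increasing the event probability can only raise it, which mirrors the reported nonlinear growth of contaminant mass with event frequency.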
Abstract:
The efficiency literature, using both parametric and non-parametric methods, has focused mainly on cost efficiency analysis rather than on profit efficiency. In for-profit organisations, however, the measurement of profit efficiency and its decomposition into technical and allocative efficiency is particularly relevant. In this paper a newly developed method is used to measure profit efficiency and to identify the sources of any shortfall in profitability (technical and/or allocative inefficiency). The method is applied to a set of Portuguese bank branches, first assuming a long-run and then a short-run profit maximisation objective. In the long run, most of the scope for profit improvement of bank branches lies in becoming more allocatively efficient. In the short run, most of the profit gain can be realised through higher technical efficiency. © 2003 Elsevier B.V. All rights reserved.
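The decomposition the abstract refers to can be shown with invented numbers. The sketch below uses one common multiplicative decomposition (overall = technical x allocative); the paper's own method may differ in detail, and all profit figures are hypothetical.

```python
# Hedged sketch: decompose overall profit efficiency into a technical
# component (reach the frontier at the current input/output mix) and an
# allocative component (then choose the profit-maximising mix).
def profit_efficiency(actual_profit, technical_profit, max_profit):
    overall = actual_profit / max_profit          # actual vs. maximum attainable profit
    technical = actual_profit / technical_profit  # shortfall removable by technical improvement
    allocative = technical_profit / max_profit    # remaining shortfall from the wrong mix
    return overall, technical, allocative

# Invented branch: earns 60, could earn 80 at its current mix, 100 at the best mix.
overall, technical, allocative = profit_efficiency(60.0, 80.0, 100.0)
```

Here the overall ratio (0.6) factors exactly as technical (0.75) times allocative (0.8), so a branch can see whether its shortfall is mainly operational or mainly a matter of mix, echoing the paper's long-run versus short-run findings.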
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared to conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage arising from the absence of charge-exchange complications is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies of the interaction between low-energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToFFASS), was developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument; it allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions and provided the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts.
This is particularly true for helium scattering, where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation, the ToFFASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
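As a hedged aside, the basic relation a time-of-flight instrument of this kind relies on converts a measured flight time into a scattered-particle energy, E = (1/2) m (L/t)^2. The flight path and timing values below are illustrative only, not taken from the thesis.

```python
# Convert a neutral particle's flight time over a known path length into
# kinetic energy in electronvolts: E = (1/2) m (L/t)^2.
def tof_to_energy_eV(t_s, L_m, mass_kg):
    J_PER_EV = 1.602176634e-19   # exact SI definition of the electronvolt
    v = L_m / t_s                # flight speed from path length and time
    return 0.5 * mass_kg * v * v / J_PER_EV

HE_MASS_KG = 6.6464731e-27       # mass of a helium-4 atom
# Illustrative: a helium atom traversing a 1 m flight path in 5 microseconds.
E = tof_to_energy_eV(t_s=5e-6, L_m=1.0, mass_kg=HE_MASS_KG)
```

Because neutrals carry no charge, energy analysis must come from timing rather than electrostatic analysers, which is why the pulsed source and ToF detection are central to the instrument described.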
Abstract:
Suitable methods for assessing the effect of freeze-thaw action upon ceramic tiles have been determined. The results obtained have been shown to be reproducible, with some further work in this area still warranted. The analysis of Whichford Pottery clays via a variety of analytical techniques has shown them to be a complex mix of both clay and non-clay minerals. 57Fe Mössbauer spectroscopy has highlighted the presence of both small- and large-particle α-Fe2O3, removable via acid washing. 19F MAS NMR has demonstrated that the raw Whichford Pottery clays examined have negligible fluorine content, which is unlikely to be detrimental to ceramic wares during the heating process. A unique technique was used for the identification of fluorine in solid-state systems. The exchange of various cations into Wyoming Bentonite clay by microwave methodology did not show the appearance of five-coordinate aluminium when examined by 27Al MAS NMR. The appearance of Q0 silicate was linked to an increase in the amount of tetrahedrally bound aluminium in the silicate framework, formed as a result of the heating process. The analysis of two Chinese clays and two Chinese clay raw materials has highlighted a possible link between them; these have also been shown to be a mix of both clay and non-clay minerals. Layered double hydroxides formed by conventional and microwave methods exhibited interesting characteristics. The main differences between the samples examined were found to be attributable not solely to the differences between microwave and conventional methods but more to the different experimental conditions used.