930 results for Large detector systems for particle and astroparticle physics
Abstract:
Geographic Information Systems (GIS) are an emerging information technology (IT) which promises to have large-scale influence on how spatially distributed resources are managed. GIS has been applied to issues as diverse as recovering from the disaster of Hurricane Andrew and aiding military operations in Desert Storm. Implementation of GIS is an important issue because setting such systems up involves high costs and long lead times. An important component of the implementation problem is the "meaning" that the different groups of people influencing the implementation give to the technology. The theoretical stance of the research was based on the "Social Construction of Knowledge," which assumes that knowledge systems are subject to sociological analysis both in usage and in content. An interpretive research approach was adopted to inductively derive a model which explains how the "meanings" of a GIS are socially constructed. The research design entailed a comparative case analysis of two county sites which were using the same GIS for a variety of purposes. A total of 75 in-depth interviews were conducted to elicit interpretations of GIS. Results indicate that differences in how geographers and data-processors view the technology lead to different implementation patterns in the two sites.
Abstract:
We calculate near-threshold bound states and Feshbach resonance positions for atom–rigid-rotor models of the highly anisotropic systems Li+CaH and Li+CaF. We perform statistical analysis on the resonance positions to compare with the predictions of random matrix theory. For Li+CaH with total angular momentum J=0 we find fully chaotic behavior in both the nearest-neighbor spacing distribution and the level number variance. However, for J>0 we find different behavior due to the presence of a nearly conserved quantum number. Li+CaF (J=0) also shows apparently reduced levels of chaotic behavior despite its stronger effective coupling. This may indicate the development of another good quantum number relating to a bending motion of the complex. However, continuously varying the rotational constant over a wide range shows unexpected structure in the degree of chaotic behavior, including a dramatic reduction around the rotational constant of CaF. This demonstrates the complexity of the relationship between coupling and chaotic behavior.
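The spectral statistics referred to above can be illustrated with a short script. The following is a minimal sketch (the function names are illustrative, not taken from the paper): it normalizes a list of level positions to unit mean spacing and provides the two reference distributions the abstract compares against, the GOE Wigner surmise expected for fully chaotic spectra and the Poisson distribution expected for regular ones.

```python
import numpy as np

def nearest_neighbor_spacings(levels):
    """Sorted level positions -> nearest-neighbor spacings normalized to unit mean."""
    s = np.diff(np.sort(np.asarray(levels, dtype=float)))
    return s / s.mean()

def wigner_surmise(s):
    """GOE Wigner surmise P(s) = (pi/2) s exp(-pi s^2 / 4): level repulsion, chaotic case."""
    return (np.pi / 2) * s * np.exp(-np.pi * s**2 / 4)

def poisson(s):
    """P(s) = exp(-s): uncorrelated levels, regular (integrable) case."""
    return np.exp(-s)
```

A histogram of `nearest_neighbor_spacings(resonance_positions)` can then be compared against the two reference curves; intermediate cases (such as the J>0 spectra with a nearly conserved quantum number) fall between them.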
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
This paper presents various techniques relating to large-scale systems. First, large-scale systems are explained and their differences from traditional systems are outlined. Next, possible hardware and software specifications and requirements are listed. Finally, examples of large-scale systems are presented.
Abstract:
Permeability of a rock is a dynamic property that varies spatially and temporally. Fractures provide the most efficient channels for fluid flow and thus directly contribute to the permeability of the system. Fractures usually form as a result of a combination of tectonic stresses, gravity (i.e. lithostatic pressure) and fluid pressures. High pressure gradients alone can cause fracturing, a process termed hydrofracturing, which can determine caprock (seal) stability or reservoir integrity. Fluids also transport mass and heat, and are responsible for the formation of veins by precipitating minerals within open fractures. Veining (healing) thus directly influences the rock's permeability. Upon deformation these closed fractures (veins) can refracture and the cycle starts again. This fracturing-healing-refracturing cycle is a fundamental part of studying the deformation dynamics and permeability evolution of rock systems. Such study is generally accompanied by fracture network characterization, focusing on the network topology that determines network connectivity. Fracture characterization allows quantitative and qualitative data on fractures to be acquired and forms an important part of reservoir modeling. This thesis highlights the importance of fracture healing and of the veins' mechanical properties for the deformation dynamics. It shows that permeability varies spatially and temporally, and that healed systems (veined rocks) should not be treated as fractured systems (rocks without veins). Field observations also demonstrate the influence of contrasting mechanical properties, in addition to the complexities of vein microstructures that can form in low-porosity, low-permeability layered sequences. The thesis also presents graph theory as a characterization method to obtain statistical measures on evolving network connectivity, and proposes what measures a good reservoir should have to exhibit potentially large permeability and robustness against healing.
The results presented in the thesis have applications in hydrocarbon and geothermal reservoir exploration, the mining industry, underground waste disposal, CO2 injection and groundwater modeling.
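The graph-theoretic characterization mentioned above can be sketched in a few lines. In this hypothetical example (not the thesis's own implementation), fracture intersections are nodes and connecting fracture segments are edges; a union-find structure then yields two simple connectivity measures, the number of connected components and the average node degree.

```python
class DisjointSet:
    """Union-find over n nodes, with path halving for near-constant-time finds."""
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

def connectivity_summary(n_nodes, edges):
    """Return (number of connected components, average node degree) of a fracture graph."""
    ds = DisjointSet(n_nodes)
    degree = [0] * n_nodes
    for a, b in edges:
        ds.union(a, b)
        degree[a] += 1
        degree[b] += 1
    components = len({ds.find(i) for i in range(n_nodes)})
    avg_degree = sum(degree) / n_nodes if n_nodes else 0.0
    return components, avg_degree
```

As healing closes segments, edges drop out of the edge list and the component count rises; tracking these measures over time is one way to quantify "robustness against healing".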
Abstract:
To tackle the challenges of circuit-level and system-level VLSI and embedded system design, this dissertation proposes several novel algorithms to explore efficient solutions. At the circuit level, a new reliability-driven minimum-cost Steiner routing and layer assignment scheme is proposed, along with the first transceiver insertion algorithmic framework for optical interconnect. At the system level, a reliability-driven task scheduling scheme for multiprocessor real-time embedded systems is proposed, which optimizes system energy consumption under stochastic fault occurrences. Embedded system design is also widely used in the smart-home area for improving health, wellbeing and quality of life. The proposed scheduling scheme for multiprocessor embedded systems is therefore extended to handle energy consumption scheduling for smart homes: the extended scheme schedules household appliances so as to minimize a customer's monetary expense under a time-varying pricing model.
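The appliance-scheduling idea in the last sentence can be reduced to a toy example (the function and price vector below are illustrative; the dissertation's actual scheme is not reproduced here): given per-slot electricity prices under a time-varying tariff and an appliance that must run for a contiguous number of slots, choose the start slot with the lowest total cost.

```python
def cheapest_start(prices, duration):
    """Return (start_slot, total_cost) minimizing cost of a contiguous run.

    prices   -- electricity price per time slot (e.g. hourly tariff)
    duration -- number of consecutive slots the appliance must run
    """
    if duration > len(prices):
        raise ValueError("appliance run is longer than the pricing horizon")
    costs = [sum(prices[i:i + duration]) for i in range(len(prices) - duration + 1)]
    best = min(range(len(costs)), key=costs.__getitem__)
    return best, costs[best]
```

For example, with an overnight price dip `[5, 1, 1, 5]` and a two-slot appliance, the run is placed at slot 1 for a total cost of 2. A realistic scheme would add deadlines, appliance interdependencies and power limits on top of this.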
Abstract:
PRISM (Polarized Radiation Imaging and Spectroscopy Mission) was proposed to ESA in May 2013 as a large-class mission for investigating, within the framework of the ESA Cosmic Vision program, a set of important scientific questions that require high resolution, high sensitivity, full-sky observations of the sky emission at wavelengths ranging from millimeter-wave to the far-infrared. PRISM's main objective is to explore the distant universe, probing cosmic history from very early times until now as well as the structures, distribution of matter, and velocity flows throughout our Hubble volume. PRISM will survey the full sky in a large number of frequency bands in both intensity and polarization and will measure the absolute spectrum of sky emission more than three orders of magnitude better than COBE FIRAS. The data obtained will allow us to precisely measure the absolute sky brightness and polarization of all the components of the sky emission in the observed frequency range, separating the primordial and extragalactic components cleanly from the galactic and zodiacal light emissions.
The aim of this Extended White Paper is to provide a more detailed overview of the highlights of the new science that will be made possible by PRISM, which include: (1) the ultimate galaxy cluster survey using the Sunyaev-Zeldovich (SZ) effect, detecting approximately 10^6 clusters extending to large redshift, including a characterization of the gas temperature of the brightest ones (through the relativistic corrections to the classic SZ template) as well as a peculiar velocity survey using the kinetic SZ effect that comprises our entire Hubble volume; (2) a detailed characterization of the properties and evolution of dusty galaxies, where most of the star formation in the universe took place, the faintest population of which constitute the diffuse CIB (Cosmic Infrared Background); (3) a characterization of the B modes from primordial gravity waves generated during inflation and from gravitational lensing, as well as the ultimate search for primordial non-Gaussianity using CMB polarization, which is less contaminated by foregrounds on small scales than the temperature anisotropies; (4) a search for distortions from a perfect blackbody spectrum, which include some nearly certain signals and others that are more speculative but more informative; and (5) a study of the role of the magnetic field in star formation and its interaction with other components of the interstellar medium of our Galaxy. These are but a few of the highlights presented here, along with a description of the proposed instrument.
Abstract:
In geophysics and seismology, raw data need to be processed to generate useful information that researchers can turn into knowledge. The number of sensors acquiring raw data is increasing rapidly. Without good data management systems, more time can be spent querying and preparing datasets for analysis than acquiring raw data. Moreover, a lot of good-quality data acquired at great effort can be lost forever if they are not stored correctly; local and international cooperation will probably be reduced, and much of the data will never become scientific knowledge. For this reason, the Seismological Laboratory of the Institute of Astronomy, Geophysics and Atmospheric Sciences at the University of Sao Paulo (IAG-USP) has concentrated on its data management system. This report describes the efforts of IAG-USP to set up a seismology data management system that facilitates local and international cooperation.
Abstract:
In this work, a study of the role of the long-range term of excess Gibbs energy models in the modeling of aqueous systems containing polymers and salts is presented. Four different approaches to accounting for the presence of the polymer in the long-range term were considered, and simulations were conducted for aqueous solutions of three different salts. Analysis of the water activity curves showed that, in all cases, a liquid-phase separation may be introduced by the sole presence of the polymer in the long-range term, regardless of how it is taken into account. The results lead to the conclusion that there is no single exact solution to this problem, and that any of the approaches may introduce inconsistencies.
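For context, the long-range electrostatic term in such excess Gibbs energy models is typically of Debye-Hückel type. The sketch below shows only the textbook Debye-Hückel limiting law (the function names are illustrative, and the paper's four polymer-accounting approaches are not reproduced); the modeling question in the abstract is how the solvent over which this term is evaluated should or should not include the polymer.

```python
import math

A_DH = 0.509  # Debye-Hückel constant for water at 25 °C, (kg/mol)^0.5

def ionic_strength(molalities, charges):
    """I = (1/2) * sum(m_i * z_i^2), in mol per kg of solvent."""
    return 0.5 * sum(m * z**2 for m, z in zip(molalities, charges))

def log10_gamma_pm(z_plus, z_minus, ionic_str):
    """Debye-Hückel limiting law: log10(gamma±) = -A |z+ z-| sqrt(I)."""
    return -A_DH * abs(z_plus * z_minus) * math.sqrt(ionic_str)
```

For a 0.1 molal 1:1 salt this gives I = 0.1 mol/kg; whether the polymer mass counts toward the "kg of solvent" (and toward the solvent properties inside A) is precisely the kind of choice the study shows can introduce an artificial liquid-phase separation.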
Abstract:
We evaluated the ability of microemulsions containing medium-chain glycerides as penetration enhancers to increase the transdermal delivery of lipophilic (progesterone) and hydrophilic (adenosine) model drugs, as well as the effects of an increase in surfactant blend concentration on transdermal drug delivery. Microemulsions composed of polysorbate 80, medium-chain glycerides, and propylene glycol (1:1:1, w/w/w) as the surfactant blend, myvacet oil as the oily phase, and water were developed. Two microemulsions containing different concentrations of surfactant blend but similar water/oil ratios were chosen; ME-lo contained a smaller concentration of surfactant than ME-hi (47:20:33 and 63:14:23 surfactant/oil/water, w/w/w). Although in vitro progesterone and adenosine release from ME-lo and ME-hi was similar, their transdermal delivery was differently affected. ME-lo significantly increased the flux of progesterone and adenosine delivered across porcine ear skin (4-fold or higher, p < 0.05) compared to a progesterone solution in oil (0.05 ± 0.01 μg/cm²/h) or adenosine in water (no drug was detected in the receptor phase). The transdermal flux of adenosine, but not of progesterone, was further increased (2-fold) by ME-hi, suggesting that increases in surfactant concentration represent an interesting strategy to enhance the transdermal delivery of hydrophilic, but not lipophilic, compounds. The relative safety of the microemulsions was assessed in cultured fibroblasts. The cytotoxicity of ME-lo and ME-hi was significantly smaller than that of sodium lauryl sulfate (considered a moderate-to-severe irritant) at the same concentrations (up to 50 μg/mL), but similar to that of propylene glycol (regarded as safe), suggesting the safety of these formulations.
Abstract:
Systems approaches can help to evaluate and improve the agronomic and economic viability of nitrogen application in frequently water-limited environments. This requires a sound understanding of crop physiological processes and well-tested simulation models. Thus, this experiment on spring wheat aimed to better quantify water × nitrogen effects on wheat by deriving some key crop physiological parameters that have proven useful in simulating crop growth. For spring wheat grown in Northern Australia under four levels of nitrogen (0 to 360 kg N ha⁻¹) and either entirely on stored soil moisture or under full irrigation, kernel yields ranged from 343 to 719 g m⁻². Yield increases were strongly associated with increases in kernel number (9150-19950 kernels m⁻²), indicating the sensitivity of this parameter to water and N availability. Total water extraction under a rain shelter was 240 mm, with a maximum extraction depth of 1.5 m. A substantial amount of mineral nitrogen available deep in the profile (below 0.9 m) was taken up by the crop. This was the source of the nitrogen uptake observed after anthesis. Under dry conditions this late uptake accounted for approximately 50% of total nitrogen uptake and resulted in high (>2%) kernel nitrogen percentages even when no nitrogen was applied. Anthesis LAI values were reduced by 63% under sub-optimal water supply and by 50% under sub-optimal nitrogen supply. Radiation use efficiency (RUE) based on total incident short-wave radiation was 1.34 g MJ⁻¹ and did not differ among treatments. The conservative nature of RUE was the result of the crop reducing leaf area rather than leaf nitrogen content (which would have affected photosynthetic activity) under these moderate levels of nitrogen limitation. The transpiration efficiency coefficient was also conservative and averaged 4.7 Pa in the dry treatments. Kernel nitrogen percentage varied from 2.08 to 2.42%.
The study provides a data set and a basis to consider ways to improve simulation capabilities of water and nitrogen effects on spring wheat. (C) 1997 Elsevier Science B.V.