886 results for Cloud Fraction
Abstract:
In this work, a simple correlation incorporating the mixture velocity, the drift velocity, and the correction factor of Farooqi and Richardson was proposed to predict the void fraction of gas/non-Newtonian intermittent flow in upward inclined pipes. The correlation was based on 352 data points covering a wide range of flow rates for different CMC solutions at diverse inclination angles. Good agreement was obtained between the predicted and experimental results, substantiating the general validity of the model for gas/non-Newtonian two-phase intermittent flows.
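The abstract names the ingredients of the correlation (mixture velocity, drift velocity, Farooqi-Richardson correction) but not its fitted form. The sketch below shows the standard drift-flux structure on which such correlations are typically built; the parameter values and the function name are illustrative placeholders, not the authors' fitted correlation.

```python
# Illustrative drift-flux sketch; c0, u_d, and psi are hypothetical
# placeholders, NOT the fitted values from the paper.

def void_fraction_drift_flux(u_sg, u_sl, c0=1.2, u_d=0.25, psi=1.0):
    """Gas void fraction: alpha = psi * U_sg / (C0 * U_m + U_d).

    u_sg, u_sl : superficial gas and liquid velocities [m/s]
    c0         : distribution parameter (flow-regime dependent)
    u_d        : drift velocity [m/s] (inclination dependent)
    psi        : correction factor for non-Newtonian (power-law) liquids
    """
    u_m = u_sg + u_sl  # mixture velocity
    return psi * u_sg / (c0 * u_m + u_d)

# Example: moderate gas rate in a CMC solution with an assumed correction
alpha = void_fraction_drift_flux(u_sg=0.5, u_sl=1.0, psi=0.9)
```

Fitting a correlation of this shape to the 352 data points would amount to choosing c0, u_d, and psi per inclination angle and fluid rheology.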
Abstract:
The present work investigates the average void fraction of gas/non-Newtonian fluid flow in downward inclined pipes. The influence of the pipe inclination angle on the average void fraction was studied experimentally. A simple correlation, which incorporates the method of Vlachos et al. for gas/Newtonian horizontal flow, the correction factor of Farooqi and Richardson, and the pipe inclination angle, was proposed to predict the average void fraction of gas/non-Newtonian power-law stratified flow in downward inclined pipes. The correlation was based on 470 data points covering a wide range of flow rates for different systems at diverse angles. Good agreement was obtained between theory and data, and the fit describes the majority of the experimental data to within ±20%.
Abstract:
The relentlessly increasing demand for network bandwidth, driven primarily by Internet-based services such as mobile computing, cloud storage, and video-on-demand, calls for more efficient utilization of the available communication spectrum, such as that afforded by resurgent DSP-powered coherent optical communications. Encoding information in the phase of the optical carrier, using multilevel phase modulation formats, and employing coherent detection at the receiver allow for enhanced spectral efficiency and thus increased network capacity. The distributed feedback (DFB) semiconductor laser has served as the near-exclusive light source powering the fiber-optic long-haul network for over 30 years. The transition to coherent communication systems, however, is pushing the DFB laser to the limits of its abilities. This is due to its limited temporal coherence, which directly limits the number of distinct phases that can be imparted to a single optical pulse and thus the data capacity. Temporal coherence, most commonly quantified by the spectral linewidth Δν, is limited by phase noise, a result of quantum-mandated spontaneous emission of photons arising from random recombination of carriers in the active region of the laser.
In this work we develop a fundamentally new type of semiconductor laser with the requisite coherence properties. We demonstrate electrically driven lasers characterized by a quantum-noise-limited spectral linewidth as low as 18 kHz. This narrow linewidth is the result of a new laser design philosophy that separates the functions of photon generation and photon storage, enabled by a hybrid Si/III-V integration platform. Photons generated in the active region of the III-V material are stored in the low-loss Si that hosts the bulk of the laser field, thereby enabling high-Q photon storage. The large number of stored coherent quanta acts as an optical flywheel whose inertia reduces the effect of spontaneous-emission-induced phase perturbations on the laser field, while the enhanced photon lifetime reduces the emission rate of incoherent quanta into the lasing mode. Narrow linewidths are obtained over a wavelength range spanning the entire optical communication C-band (1530-1575 nm) at only a fraction of the input power required by conventional DFB lasers. The results presented in this thesis hold great promise for the large-scale integration of lithographically tuned, high-coherence laser arrays for use in coherent communications, enabling Tb/s-scale data capacities.
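The linewidth-narrowing argument above can be made quantitative with one common form of the modified Schawlow-Townes formula, in which the cold-cavity linewidth Δν_c = ν/Q enters squared, so the quantum-limited linewidth scales as 1/Q². The sketch below uses this textbook form with illustrative parameter values (n_sp, α_H, output power); it is not the thesis's own model or measured numbers.

```python
import math

# Common modified Schawlow-Townes linewidth (one textbook form):
#   dnu = pi * h * nu * dnu_c^2 * n_sp * (1 + alpha_H^2) / P_out,
# with dnu_c = nu / Q, so the linewidth falls as 1/Q^2. Parameter
# values here are illustrative assumptions, not measured values.

def st_linewidth(freq_hz, q_total, p_out_w, n_sp=2.0, alpha_h=3.0):
    h = 6.62607015e-34            # Planck constant [J*s]
    dnu_c = freq_hz / q_total     # cold-cavity linewidth [Hz]
    return math.pi * h * freq_hz * dnu_c**2 * n_sp * (1 + alpha_h**2) / p_out_w

nu = 193.4e12  # optical carrier near 1550 nm [Hz]
# Storing the field in a cavity with 10x higher total Q narrows the
# quantum-limited linewidth by 100x at the same output power:
ratio = st_linewidth(nu, 1e5, 1e-3) / st_linewidth(nu, 1e6, 1e-3)
```

This 1/Q² scaling is what makes moving the stored field into low-loss Si, rather than the lossy III-V gain medium, so effective.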
Abstract:
This thesis describes a compositional framework for developing situation awareness applications: applications that provide ongoing information about a user's changing environment. It describes how the framework is used to develop a situation awareness application for earthquakes. The applications are implemented as cloud computing services connected to sensors and actuators. The architecture and design of the cloud services are described, and measurements of performance metrics are provided. The thesis includes the results of earthquake-monitoring experiments conducted over a year. The applications developed with the framework are (1) the Community Seismic Network (CSN), which uses relatively low-cost sensors deployed by members of the community, and (2) the Situation Awareness Framework (SAF), which integrates data from multiple sources, including the CSN; the California Integrated Seismic Network (CISN), a network of high-quality seismometers carefully deployed by professionals of the CISN organization across Southern California; and prototypes of multi-sensor platforms that include carbon monoxide, methane, dust, and radiation sensors.
Abstract:
This thesis is the culmination of field and laboratory studies aimed at assessing processes that affect the composition and distribution of atmospheric organic aerosol. An emphasis is placed on measurements conducted using compact and high-resolution Aerodyne Aerosol Mass Spectrometers (AMS). The first three chapters summarize results from aircraft campaigns designed to evaluate anthropogenic and biogenic impacts on marine aerosol and clouds off the coast of California. Subsequent chapters describe laboratory studies intended to evaluate gas and particle-phase mechanisms of organic aerosol oxidation.
The 2013 Nucleation in California Experiment (NiCE) was a campaign designed to study environments impacted by nucleated and/or freshly formed aerosol particles. Terrestrial biogenic aerosol with more than 85% organic mass was observed to reside in the free troposphere above marine stratocumulus. This biogenic organic aerosol (BOA) originated from the Northwestern United States and was transported to the marine atmosphere during periodic cloud-clearing events. Spectra recorded by a cloud condensation nuclei counter demonstrated that BOA is CCN active. BOA enhancements at latitudes north of San Francisco, CA, coincided with enhanced cloud water concentrations of organic species such as acetate and formate.
Airborne measurements conducted during the 2011 Eastern Pacific Emitted Aerosol Cloud Experiment (E-PEACE) were aimed at evaluating the contribution of ship emissions to the properties of marine aerosol and clouds off the coast of central California. In one study, analysis of organic aerosol mass spectra during periods of enhanced shipping activity yielded unique tracers indicative of cloud-processed ship emissions (m/z 42 and 99). The variation of their organic fraction (f42 and f99) was found to coincide with periods of heavy (f42 > 0.15; f99 > 0.04), moderate (0.05 < f42 < 0.15; 0.01 < f99 < 0.04), and negligible (f42 < 0.05; f99 < 0.01) ship influence. Application of these conditions to all measurements conducted during E-PEACE demonstrated that a large fraction of cloud droplet (72%) and dry aerosol mass (12%) sampled in the California coastal study region was heavily or moderately influenced by ship emissions. Another study investigated the chemical and physical evolution of a controlled organic plume emitted from the R/V Point Sur. Under sunny conditions, nucleated particles composed of oxidized organic compounds contributed nearly an order of magnitude more cloud condensation nuclei (CCN) than less oxidized particles formed under cloudy conditions. The processing time necessary for particles to become CCN active was short (< 1 hr) compared to the time needed for particles to become hygroscopic at sub-saturated humidity (> 4 hr).
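The f42/f99 classification quoted above is a simple threshold rule, and can be stated directly as code. The function name and the handling of the gaps between categories are our own; only the numeric thresholds come from the abstract.

```python
# Ship-influence classification from the E-PEACE tracer analysis, using
# the f42/f99 thresholds quoted in the abstract. Function name and the
# "unclassified" fallback are our own additions.

def classify_ship_influence(f42, f99):
    """Classify a spectrum by its m/z 42 and m/z 99 organic fractions."""
    if f42 > 0.15 and f99 > 0.04:
        return "heavy"
    if 0.05 < f42 < 0.15 and 0.01 < f99 < 0.04:
        return "moderate"
    if f42 < 0.05 and f99 < 0.01:
        return "negligible"
    # The two tracers disagree; the abstract does not cover this case.
    return "unclassified"

assert classify_ship_influence(0.20, 0.05) == "heavy"
assert classify_ship_influence(0.10, 0.02) == "moderate"
assert classify_ship_influence(0.02, 0.005) == "negligible"
```

Applying such a rule spectrum-by-spectrum over the campaign is what yields the 72% (cloud droplet) and 12% (dry aerosol mass) figures cited above.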
Laboratory chamber experiments were also conducted to evaluate particle-phase processes influencing aerosol phase and composition. In one study, ammonium sulfate seed was coated with a layer of secondary organic aerosol (SOA) from toluene oxidation followed by a layer of SOA from α-pinene oxidation. The system exhibited different evaporative properties than ammonium sulfate seed initially coated with α-pinene SOA followed by a layer of toluene SOA. This behavior is consistent with a shell-and-core model and suggests limited mixing among different SOA types. Another study investigated the reactive uptake of isoprene epoxy diols (IEPOX) onto non-acidified aerosol. It was demonstrated that particle acidity has limited influence on organic aerosol formation onto ammonium sulfate seed, and that the chemical system is limited by the availability of nucleophiles such as sulfate.
Flow tube experiments were conducted to examine the role of iron in the reactive uptake and chemical oxidation of glycolaldehyde. Aerosol particles doped with iron and hydrogen peroxide were mixed with gas-phase glycolaldehyde and photochemically aged in a custom-built flow reactor. Compared to particles free of iron, iron-doped aerosols significantly enhanced the oxygen to carbon (O/C) ratio of accumulated organic mass. The primary oxidation mechanism is suggested to be a combination of Fenton and photo-Fenton reactions which enhance particle-phase OH radical concentrations.
Abstract:
STEEL, the nonlinear large-displacement analysis software created at Caltech, is currently used by a large number of researchers at Caltech. However, due to its complexity and its lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models with this software was difficult. SteelConverter was created to facilitate model creation through the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently converts model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that once took several days to create and verify now take several hours or less, greatly increasing both the productivity of the researcher and the level of confidence in the model being analyzed.
It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL, it was difficult for researchers or engineers at other universities to conduct analyses. While SteelConverter helped researchers at Caltech improve their research, distributing SteelConverter and its documentation to other universities was less than ideal: issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferable. This is where the idea for Caltech VirtualShaker was born. By creating a centralized website where users can log in, submit, analyze, and process models in the cloud, all of the major concerns associated with SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles that save the defaults for their most commonly run models, and to submit multiple jobs to an online virtual server for analysis and post-processing. The website not only allows more rapid distribution of the tool, but also gives engineers and researchers without access to powerful computer clusters a means to run computationally intensive analyses without the excessive cost of building and maintaining a cluster.
In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was conducted between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through free-vibration analysis, as well as more complex structural properties, such as overall structural capacity through pushover analysis. These analyses showed very strong agreement between the two software packages on every aspect of each analysis. However, they also showed that the STEEL analysis algorithm converges at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS comparisons, it was decided to repeat them in Perform, a software package more capable of highly nonlinear analysis. These analyses again showed very strong agreement between the two packages in every aspect of each analysis up to the point of instability. However, due to some limitations in Perform, free-vibration analyses for the three-story one-bay chevron brace frame, the two-bay chevron brace frame, and the twenty-story moment frame could not be conducted. With the current trend toward ultimate-capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.
Following this, a final study was conducted on Hall's U20 structure [1], which was analyzed in all three software packages and the results compared. The pushover curves from each package were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn about the effectiveness of each analysis tool when analyzing structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, the analysis failed to converge after the onset of inelastic behavior. However, for the small number of time steps over which the ETABS analysis did converge, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate package for analyzing a structure through the point of collapse when fiber elements are used throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in its material model produced a pushover curve that did not exactly match that of STEEL, particularly post-collapse. Such problems could be alleviated by choosing a simpler material model.
Abstract:
Migrating to cloud computing is one of today's enterprise challenges. This technology provides a new paradigm based on on-demand payment for information and communication technologies. In this sense, small and medium-sized enterprises are expected to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even though its characteristics and capabilities have been widely discussed, entry into the cloud still lacks practical, real-world frameworks. This paper aims to fill this gap by presenting a real tool, already implemented and tested, that can be used to support cloud computing adoption decisions. The tool uses a diagnosis based on specific questions to gather the required information and subsequently provides the user with valuable information for deploying the business in the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows decision makers to generate their particular Cloud Road. A pilot study was carried out with enterprises at the local level with a two-fold objective: to ascertain the degree of knowledge of cloud computing, and to identify the most promising business areas and their related tools for this technology. As expected, the results show high interest in and low knowledge of this subject, and the tool presented aims to redress this mismatch insofar as possible. Copyright: © 2015 Bildosola et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Abstract:
Cloud chambers were essential devices in early nuclear and particle physics research. Although superseded by more modern detectors in current research, they remain very interesting pedagogical apparatuses. This thesis attempts to give a global view of the topic. To that end, it reviews the physical foundations of the diffusion cloud chamber, in which an alcohol vapour is supersaturated by cooling it against a thermal reservoir. The main results of the review are then applied to analyse the working conditions inside the chamber. The analysis highlights the importance of using an appropriate alcohol, such as isopropanol, as well as a strong cooling system, which for isopropanol must reach −40 °C. The theoretical study is complemented with experimental tests performed on the usual design of a home-made cloud chamber. An effective setup is established, highlighting details such as grazing illumination, direct contact with the cooling reservoir through a wide metal plate, and the importance of avoiding vapour removal. Video results of the different phenomena that a cloud chamber allows one to observe are also presented. Overall, the aim is to provide the physical insight that pedagogical papers usually lack.
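The mechanism behind the supersaturation can be sketched with a back-of-the-envelope model: vapour diffuses from the warm, nearly saturated top toward the cold plate, so the vapour partial pressure falls roughly linearly with depth while the saturation pressure falls exponentially with temperature, leaving a supersaturated (S > 1) layer above the plate. The sketch below assumes a 20 °C saturated top, a −40 °C plate, linear temperature and vapour-pressure profiles, and a Clausius-Clapeyron saturation curve with an assumed enthalpy of vaporisation of about 44 kJ/mol for isopropanol; all numbers are illustrative, not the thesis's results.

```python
import math

# Back-of-the-envelope diffusion cloud chamber model. Assumptions:
# saturated isopropanol vapour at the warm top (20 C), a -40 C plate,
# linear temperature and vapour-pressure profiles in between, and a
# Clausius-Clapeyron saturation curve with an ASSUMED enthalpy of
# vaporisation (~44 kJ/mol). Illustrative only.

R = 8.314        # gas constant [J/(mol K)]
L_VAP = 44.0e3   # assumed enthalpy of vaporisation [J/mol]

def p_sat(t_k, t_ref=293.15):
    """Saturation pressure relative to its 20 C value (Clausius-Clapeyron)."""
    return math.exp(-L_VAP / R * (1.0 / t_k - 1.0 / t_ref))

def saturation_ratio(x):
    """Saturation ratio S at fractional height x (0 = cold plate, 1 = top)."""
    t_top, t_plate = 293.15, 233.15
    t = t_plate + x * (t_top - t_plate)                          # linear T profile
    p_v = p_sat(t_plate) + x * (p_sat(t_top) - p_sat(t_plate))   # linear vapour profile
    return p_v / p_sat(t)

# Mid-height the vapour is strongly supersaturated (S > 1), which is the
# sensitive layer where passing charged particles nucleate droplet trails:
s_mid = saturation_ratio(0.5)
```

This is also why the cooling requirement is so demanding: with a warmer plate the saturation curve flattens and the supersaturated layer largely disappears.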