17 results for High performance computing.
in Digital Commons at Florida International University
Abstract:
The contributions of this dissertation are the development of two new, interrelated approaches to video data compression: (1) a level-refined motion estimation and subband compensation method for effective motion estimation and motion compensation; and (2) a shift-invariant sub-decimation decomposition method that overcomes the deficiency of the decimation process in estimating motion, which stems from the shift-variant property of the wavelet transform. The enormous volume of data generated by digital video creates an intense need for efficient compression techniques to conserve storage space and minimize bandwidth utilization. The main idea of video compression is to reduce the interpixel redundancies within and between video frames by applying motion estimation and motion compensation (ME/MC) in combination with spatial transform coding. To locate the global minimum of the matching criterion function reliably, hierarchical motion estimation with coarse-to-fine resolution refinements using the discrete wavelet transform is applied, owing to its intrinsic multiresolution and scalability. Because most of the energy is concentrated in the low-resolution subbands and diminishes in the high-resolution subbands, a new approach called the level-refined motion estimation and subband compensation (LRSC) method is proposed. It exploits the possible intrablocks in the subbands for lower-entropy coding while keeping the low computational load of the level-refined method, thus achieving both temporal compression quality and computational simplicity. Since circular convolution is applied in the wavelet transform to obtain the decomposed subframes without coefficient expansion, a symmetric-extended wavelet transform is designed for the finite-length frame signals, enabling more accurate motion estimation without discontinuous boundary distortions. Although wavelet-transformed coefficients still contain spatial-domain information, motion estimation in the wavelet domain is not as straightforward as in the spatial domain because of the shift variance of the decimation process. A new approach called the sub-decimation decomposition method is proposed, which maintains motion consistency between the original frame and the decomposed subframes, consequently improving wavelet-domain video compression through shift-invariant motion estimation and compensation.
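To make the coarse-to-fine strategy concrete, here is a minimal sketch of hierarchical block motion estimation, with a plain 2x2-average pyramid standing in for the wavelet approximation subbands. The block size, search range, and SAD matching criterion are illustrative assumptions, not the dissertation's exact LRSC method.

```python
import numpy as np

def downsample(frame):
    """Halve resolution by 2x2 averaging (a stand-in for a low-pass subband)."""
    h, w = frame.shape
    return frame[:h//2*2, :w//2*2].reshape(h//2, 2, w//2, 2).mean(axis=(1, 3))

def block_motion(ref, cur, block=8, search=2, init=None):
    """SAD full search around an initial motion field; returns (dy, dx) per block."""
    h, w = cur.shape
    by, bx = h // block, w // block
    mv = np.zeros((by, bx, 2), dtype=int) if init is None else init
    for i in range(by):
        for j in range(bx):
            y0, x0 = i * block, j * block
            blk = cur[y0:y0 + block, x0:x0 + block]
            best, best_v = np.inf, mv[i, j].copy()
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = y0 + mv[i, j, 0] + dy, x0 + mv[i, j, 1] + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cost = np.abs(ref[y:y + block, x:x + block] - blk).sum()
                        if cost < best:
                            best, best_v = cost, mv[i, j] + [dy, dx]
            mv[i, j] = best_v
    return mv

def hierarchical_me(ref, cur, levels=3, block=8, search=2):
    """Estimate at the coarsest level first, then refine level by level.
    Frame dimensions are assumed divisible by block * 2**(levels - 1)."""
    pyramid = [(ref, cur)]
    for _ in range(levels - 1):
        r, c = pyramid[-1]
        pyramid.append((downsample(r), downsample(c)))
    mv = None
    for r, c in reversed(pyramid):                    # coarse -> fine
        if mv is not None:                            # scale field to finer grid
            mv = 2 * np.repeat(np.repeat(mv, 2, axis=0), 2, axis=1)
        mv = block_motion(r, c, block, search, mv)
    return mv
```

A small search window at each level suffices because the coarse-level estimate already places the search near the global minimum, which is the computational advantage the abstract refers to.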
Abstract:
The unique electrical and mechanical properties of single-walled carbon nanotubes (SWNTs) have made them one of the most promising candidates for next-generation nanoelectronics. Efficient utilization of the exceptional properties of SWNTs requires controlling their growth direction (e.g., vertical, horizontal) and morphology (e.g., straight, junction, coiled). In this dissertation, the catalytic effect on the branching of SWNTs into Y-shaped SWNTs (Y-SWNTs) was investigated. The formation of Y-shaped branches was found to depend on the composition of the catalysts: easier carbide formers have a strong tendency to attach to the sidewall of SWNTs and thus enhance the degree of branching. Y-SWNT-based field-effect transistors (FETs) were fabricated and modulated by the metallic branch of the Y-SWNTs, exhibiting ambipolar characteristics at room temperature. A subthreshold swing of 700 mV/decade and an on/off ratio of 10^5 with a low off-state current of 10^-13 A were obtained. The transport phenomena associated with Y- and cross-junction configurations reveal that the conduction mechanism in the SWNT junctions is governed by thermionic emission at T > 100 K and by tunneling at T < 100 K. Furthermore, horizontally aligned SWNTs were synthesized by the controlled modification of external fields and forces. High-performance carbon nanotube FETs (CNTFETs) and logic circuits were demonstrated utilizing the aligned SWNTs. It was found that the hysteresis in CNTFETs can be eliminated by removing adsorbed water molecules at the CNT/SiO2 interface through vacuum annealing, hydrophobic surface treatment, and surface passivation. SWNT "serpentines" were synthesized by exploiting the interplay between the drag force from gas flow and the van der Waals force with the substrate. The curvature of bent SWNTs could be tailored by adjusting the gas flow rate and by changing the gas flow direction with respect to the step-edges on a single-crystal quartz substrate. The resistivity of bent SWNTs was observed to increase with curvature, which can be attributed to local deformations and a possible chirality shift at the curved parts. Our results show the successful synthesis of SWNTs with controllable morphologies and directionality. The capability of tailoring the electrical properties of SWNTs makes it possible to build an all-nanotube device by integrating SWNTs having different functionalities into complex circuits.
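For context, the two FET figures of merit quoted above are read directly off a measured transfer curve (drain current versus gate voltage). A small sketch of that calculation follows; the sample data are invented for illustration, not measurements from the dissertation.

```python
import numpy as np

V_g = np.array([-2.0, -1.5, -1.0, -0.5, 0.0])      # gate voltage (V), illustrative
I_d = np.array([1e-8, 1e-9, 1e-10, 1e-11, 1e-13])  # drain current (A), illustrative

# Subthreshold swing: gate voltage needed for one decade of current change,
# taken from the steepest part of the log10(I_d)-V_g curve.
slope = np.gradient(np.log10(I_d), V_g)            # decades per volt
swing_mV_per_decade = 1000.0 / np.abs(slope).max()

# On/off ratio: highest over lowest current in the sweep.
on_off = I_d.max() / I_d.min()

print(f"S = {swing_mV_per_decade:.0f} mV/decade, on/off = {on_off:.0e}")
```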
Abstract:
Existing instrumental techniques must be adaptable to the analysis of novel explosives if science is to keep up with the practices of terrorists and criminals. The focus of this work has been the development of analytical techniques for the analysis of two types of novel explosives: ascorbic acid-based propellants and improvised mixtures of concentrated hydrogen peroxide/fuel. In recent years, the use of these explosives in improvised explosive devices (IEDs) has increased. It is therefore important to develop methods that permit identification of the nature of the original explosive from post-blast residues. Ascorbic acid-based propellants are low explosives that employ an ascorbic acid fuel source with a nitrate/perchlorate oxidizer. A method utilizing ion chromatography with indirect photometric detection was optimized for the analysis of intact propellants, and post-burn and post-blast residues of these propellants were analyzed. The ascorbic acid fuel and nitrate oxidizer could be detected in intact propellants as well as in the post-burn and post-blast residues; degradation products of the nitrate and perchlorate oxidizers were also detected. With a quadrupole time-of-flight mass spectrometer (QToFMS), exact mass measurements are possible. When an HPLC instrument is coupled to a QToFMS, the combination of retention time with accurate mass measurements, mass spectral fragmentation information, and isotopic abundance patterns allows for the unequivocal identification of a target analyte. An optimized HPLC-ESI-QToFMS method was applied to the analysis of ascorbic acid-based propellants. Exact mass measurements were collected for the fuel and oxidizer anions and their degradation products. Ascorbic acid was detected in the intact samples and in half of the propellants subjected to open burning; the intact fuel molecule was not detected in any of the post-blast residues. Two methods were optimized for the analysis of trace levels of hydrogen peroxide: HPLC with fluorescence detection (HPLC-FD) and HPLC with electrochemical detection (HPLC-ED). Both techniques were extremely selective for hydrogen peroxide. Both methods were applied to the analysis of post-blast debris from improvised mixtures of concentrated hydrogen peroxide/fuel; hydrogen peroxide was detected on a variety of substrates, as well as in the post-blast residues of the improvised explosives TATP and HMTD.
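The exact-mass check that underpins QToFMS identification amounts to comparing a measured m/z against the theoretical monoisotopic mass and expressing the error in parts per million. A minimal sketch follows, using the ascorbate anion [C6H7O6]- as an example target; the measured value is invented for illustration.

```python
# Monoisotopic atomic masses (u), from standard tables.
MASS = {"C": 12.0, "H": 1.00782503, "O": 15.9949146}
ELECTRON = 0.00054858

def monoisotopic(formula):
    """formula as {element: count}, e.g. {'C': 6, 'H': 7, 'O': 6}."""
    return sum(MASS[el] * n for el, n in formula.items())

# [M-H]- anion of ascorbic acid: add one electron mass for the negative charge.
theoretical = monoisotopic({"C": 6, "H": 7, "O": 6}) + ELECTRON
measured = 175.0245  # hypothetical instrument reading

error_ppm = (measured - theoretical) / theoretical * 1e6
print(f"theoretical m/z = {theoretical:.4f}, error = {error_ppm:+.1f} ppm")
```

An error within a few ppm, combined with retention time, fragmentation, and isotope pattern, is what allows the unequivocal identification described above.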
Abstract:
The application of advanced materials in infrastructure has grown rapidly in recent years, mainly because of their potential to ease construction, extend service life, and improve the performance of structures. Ultra-high performance concrete (UHPC) is one such material, considered a novel alternative to conventional concrete. The material microstructure in UHPC is optimized to significantly improve its material properties, including compressive and tensile strength, modulus of elasticity, durability, and damage tolerance. Fiber-reinforced polymer (FRP) composite is another novel construction material, with excellent properties such as high strength-to-weight and stiffness-to-weight ratios and good corrosion resistance. Considering the exceptional properties of UHPC and FRP, many advantages can result from the combined application of these two advanced materials, which is the subject of this research. The confinement behavior of UHPC was studied for the first time in this research. The stress-strain behavior of a series of UHPC-filled FRP tubes with different fiber types and thicknesses was tested under uniaxial compression. The FRP confinement was shown to significantly enhance both the ultimate strength and the ultimate strain of UHPC. It was also shown that existing confinement models are incapable of predicting the behavior of FRP-confined UHPC; therefore, new stress-strain models for FRP-confined UHPC were developed through an analytical study. In the other part of this research, a novel steel-free UHPC-filled FRP tube (UHPCFFT) column system was developed and its cyclic behavior was studied. The proposed steel-free UHPCFFT column showed much higher strength and stiffness, with reasonable ductility, compared to its conventional reinforced concrete (RC) counterpart. Using the results of the first phase of column tests, a second series of UHPCFFT columns was made and studied under pseudo-static loading to examine the effect of column parameters on their cyclic behavior. Strong correlations were noted between the initial stiffness and the stiffness index, and between the moment capacity and the reinforcement index. Finally, a thorough analytical study was carried out to investigate the seismic response of the proposed steel-free UHPCFFT columns, which showed their superior earthquake resistance compared to their RC counterparts.
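For orientation, most existing FRP-confinement models (the ones the study found inadequate for UHPC) build on a Richart-type relation: a confining pressure from hoop equilibrium of the tube, and an ultimate strength linear in that pressure. The sketch below shows that generic form; it is NOT the dissertation's model, and the coefficient k1 = 3.3 follows a commonly cited calibration for conventional concrete.

```python
def lateral_pressure(n_layers, t_ply_mm, f_frp_MPa, D_mm):
    """Confining pressure f_l = 2 n t f_frp / D from hoop equilibrium of the tube."""
    return 2.0 * n_layers * t_ply_mm * f_frp_MPa / D_mm

def confined_strength(f_co_MPa, f_l_MPa, k1=3.3):
    """Ultimate confined strength f'cc = f'co + k1 * f_l (generic Richart form)."""
    return f_co_MPa + k1 * f_l_MPa

# Illustrative example: a 150 mm UHPC cylinder (f'co = 150 MPa) in a 2-ply carbon tube.
f_l = lateral_pressure(n_layers=2, t_ply_mm=0.34, f_frp_MPa=3500, D_mm=150)
print(f"f_l = {f_l:.1f} MPa, f'cc ~ {confined_strength(150, f_l):.0f} MPa")
```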
Abstract:
The present study measured a chemotherapy drug, etoposide, in pig cerebrospinal fluid after intraventricular administrations made directly into the fourth ventricle of the brain; cytotoxic concentrations were maintained for a twenty-four-hour period after the infusions. The analytical method developed supports the potential treatment of malignant brain tumors. The increase in serum carotenoid concentration in 30 healthy individuals was measured after supplementation with lutein. HPLC analysis of serum carotenoid levels showed an increase in the concentration of lutein and constant concentrations of the other major serum carotenoids. An initial attempt to measure the enthalpy of aggregation of xanthophylls was conducted using ultraviolet-visible spectroscopy. The enthalpy of lutein aggregation and the ΔH range for the disordering of zeaxanthin aggregates are reported. The monomethyl ether of lutein did not aggregate in any of the aqueous solutions.
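Aggregation enthalpies from temperature-dependent UV-vis data are commonly extracted with a van 't Hoff analysis: an equilibrium constant K (e.g., from aggregate/monomer band intensities) is measured at several temperatures, and the slope of ln K versus 1/T gives -ΔH/R. A minimal sketch under that assumption follows; the data are invented for illustration, not values from the study.

```python
import numpy as np

R = 8.314                                     # gas constant, J/(mol K)
T = np.array([283.0, 293.0, 303.0, 313.0])    # temperatures (K), illustrative
K = np.array([4.2, 2.6, 1.7, 1.1])            # hypothetical equilibrium constants

# van 't Hoff: ln K = -(dH/R) * (1/T) + dS/R, so the slope gives -dH/R.
slope, _ = np.polyfit(1.0 / T, np.log(K), 1)
dH_kJ = -slope * R / 1000.0
print(f"van 't Hoff enthalpy of aggregation: {dH_kJ:.1f} kJ/mol")
```

With these numbers K falls as T rises, yielding a negative ΔH, i.e. exothermic aggregation, which is the physically expected sign.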
Abstract:
A comprehensive forensic investigation of sensitive ecosystems in the Everglades area is presented. Assessing the background levels of contamination in these ecosystems provides a vital resource for building the forensic evidence required to prosecute future environmental crimes within the studied areas. This investigation presents the development and validation of a fractionation and isolation method for two families of herbicides commonly applied in the vicinity of the study area: phenoxy acids such as 2,4-D, MCPA, and silvex, as well as the most common triazine-based herbicides such as atrazine, prometryne, and simazine, and related metabolites such as DIA and DEA. Accelerated solvent extraction (ASE) and solid-phase extraction (SPE) were used to isolate the analytes from abiotic matrices containing large amounts of organic material. Atmospheric-pressure ionization (API), with electrospray ionization in negative mode (ESI-) and chemical ionization in positive mode (APCI+), was used to characterize the herbicides of interest.
Abstract:
The drugs studied in this work have reportedly been used to commit drug-facilitated sexual assault (DFSA), commonly known as "date rape." Detection of the drugs was performed using high-performance liquid chromatography with ultraviolet detection (HPLC/UV), and identification was performed with high-performance liquid chromatography-mass spectrometry (HPLC/MS) using selected ion monitoring (SIM). The objective of this study was to develop a single HPLC method for the simultaneous detection, identification, and quantitation of these drugs. The following drugs were simultaneously analyzed: gamma-hydroxybutyrate (GHB), scopolamine, lysergic acid diethylamide, ketamine, flunitrazepam, and diphenhydramine. The results showed increased sensitivity with electrospray (ES) ionization versus atmospheric-pressure chemical ionization (APCI) in HPLC/MS: HPLC/ES/MS was approximately six times more sensitive than HPLC/APCI/MS and about fifty times more sensitive than HPLC/UV. A limit of detection (LOD) of 100 ppb was achieved for drug analysis using this method. The average squared linear regression correlation coefficient (r^2) was 0.933 for HPLC/UV and 0.998 for HPLC/ES/MS. The detection limits achieved by this method allowed for the detection of drug dosages used in beverage tampering, so the method can be used to screen beverages suspected of drug tampering. The results of this study also demonstrated that solid-phase microextraction (SPME) did not improve sensitivity as an extraction technique when compared to direct injection of the drug standards.
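The r^2 and LOD figures quoted above come from a standard calibration workflow: fit a linear calibration curve, report its goodness of fit, and estimate a detection limit from the blank noise. A small sketch follows; the concentrations, responses, and the 3-sigma LOD convention are illustrative assumptions, not the study's validation data.

```python
import numpy as np

conc = np.array([100, 250, 500, 1000, 2000], dtype=float)  # standards (ppb)
resp = np.array([1.05e3, 2.61e3, 5.18e3, 1.04e4, 2.09e4])  # peak areas, illustrative

# Linear calibration curve and its r^2.
slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
ss_res = ((resp - pred) ** 2).sum()
ss_tot = ((resp - resp.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot

# LOD as 3 x standard deviation of the blank over the slope (one common convention).
blank_sd = 35.0                      # hypothetical blank noise (peak-area units)
lod_ppb = 3.0 * blank_sd / slope

print(f"r^2 = {r2:.4f}, LOD ~ {lod_ppb:.0f} ppb")
```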
Abstract:
Concrete substructures are often subjected to environmental deterioration, such as sulfate and acid attack, which leads to severe damage and causes structural degradation or even failure. To improve the durability of concrete, high performance concrete (HPC), in which cement is partially replaced with pozzolanic materials, has become widely used. However, HPC degradation mechanisms in sulfate and acidic environments are not completely understood. It is therefore important to evaluate the performance of HPC under such conditions and to predict concrete service life by establishing degradation models. This study began with a review of available environmental data for the State of Florida. A total of seven bridges were inspected; concrete cores were taken from the bridge piles and subjected to microstructural analysis using a scanning electron microscope (SEM). Ettringite was found to be the product of sulfate attack under sulfate and acidic conditions. To quantitatively analyze the level of concrete deterioration, an image processing program was designed in MATLAB, with the crack percentage (A_crack/A_surface) used as the measure of deterioration. Correlation analysis was then performed between five related variables and concrete deterioration: environmental sulfate concentration and bridge age were found to be positively correlated with deterioration, while environmental pH level was negatively correlated. Besides environmental conditions, a concrete property factor, derived from laboratory testing data, was also included in the equation. Experimental tests were carried out implementing an accelerated expansion test under a controlled environment, with specimens of eight different mix designs, and the effect of the pozzolanic replacement rate was taken into consideration in the empirical equation. The empirical equation was validated against existing bridges: the proposed equations compared well with field test results, with a maximum deviation of ±20%. Two examples showing how to use the proposed equations are provided to guide practical implementation. In conclusion, the proposed approach of relating microcracks to deterioration is a better method than existing diffusion and sorption models, since sulfate attack causes cracking in concrete. The imaging technique provided in this study can also be used to quantitatively analyze concrete samples.
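The crack-percentage metric A_crack/A_surface reduces to counting segmented crack pixels over total image area. The study used MATLAB; the following is an equivalent Python/numpy outline in which the intensity threshold and the synthetic stand-in image are illustrative assumptions.

```python
import numpy as np

def crack_percentage(gray_image, threshold=60):
    """Fraction of pixels darker than `threshold`, taken as crack area
    over surface area (A_crack / A_surface)."""
    crack_pixels = (gray_image < threshold).sum()
    return crack_pixels / gray_image.size

# Synthetic stand-in for an SEM image: bright matrix with a dark "crack" band.
img = np.full((512, 512), 180, dtype=np.uint8)
img[250:260, :] = 30
print(f"crack percentage = {crack_percentage(img):.2%}")
```

In practice the threshold would be chosen per imaging setup (or replaced by adaptive segmentation), but the deterioration metric itself is exactly this area ratio.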
Abstract:
Fueled by the increasing human appetite for high computing performance, semiconductor technology has now marched into the deep sub-micron era. As transistor size keeps shrinking, more and more transistors are integrated into a single chip. This has tremendously increased the power consumption and heat generation of IC chips. The rapidly growing heat dissipation greatly increases packaging/cooling costs and adversely affects the performance and reliability of a computing system. In addition, it reduces the processor's life span and may even crash the entire computing system. Dynamic thermal management (DTM) is therefore becoming a critical problem in modern computer system design. Extensive theoretical research has been conducted to study the DTM problem; however, most of it is based on theoretically idealized assumptions or simplified models. While these models and assumptions help to greatly simplify a complex problem and make it theoretically manageable, practical computer systems and applications must deal with many practical factors and details beyond them. The goal of our research was to develop a test platform that can be used to validate theoretical results on DTM under well-controlled conditions, to identify the limitations of existing theoretical results, and to develop new and practical DTM techniques. This dissertation details the background and our research efforts in this endeavor. Specifically, we first developed a customized test platform based on an Intel desktop. We then tested a number of related theoretical works and examined their limitations under a practical hardware environment. With these limitations in mind, we developed a new reactive thermal management algorithm for single-core computing systems to optimize throughput under a peak temperature constraint. We further extended our research to a multicore platform and developed an effective proactive DTM technique for throughput maximization on multicore processors based on task migration and dynamic voltage and frequency scaling. The significance of our research lies in the fact that it complements the current extensive theoretical research in dealing with increasingly critical thermal problems, enabling the continuous evolution of high performance computing systems.
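To illustrate what "reactive thermal management under a peak temperature constraint" means operationally, here is a toy simulation of that class of policy: throttle the core's speed when temperature reaches the peak constraint and restore it once the chip cools past a lower threshold. The first-order thermal model (power ~ speed^3, Newtonian cooling) and all constants are illustrative assumptions, not the parameters of the dissertation's algorithm.

```python
def simulate_reactive_dtm(t_peak=80.0, t_low=75.0, steps=2000, dt=0.01,
                          a=40.0, b=0.5, s_hi=1.0, s_lo=0.6, t_amb=45.0):
    """Toy reactive DTM loop; returns (work done, final temperature)."""
    temp, speed, work = t_amb, s_hi, 0.0
    for _ in range(steps):
        power = speed ** 3                              # dynamic power ~ s^3
        temp += (a * power - b * (temp - t_amb)) * dt   # first-order thermal model
        if temp >= t_peak:
            speed = s_lo                                # reactive throttling
        elif temp <= t_low:
            speed = s_hi                                # restore full speed
        work += speed * dt                              # throughput proxy
    return work, temp

work, temp = simulate_reactive_dtm()
print(f"completed work = {work:.1f}, final temperature = {temp:.1f} C")
```

The policy makes temperature oscillate between the two thresholds; tuning the speed pair and thresholds trades throughput against thermal headroom, which is the optimization the abstract describes.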
Abstract:
Catering to society's demand for high performance computing, billions of transistors are now integrated on IC chips to deliver unprecedented performance. With increasing transistor density, power consumption and power density are growing exponentially. The increasing power consumption translates directly into high chip temperature, which not only raises packaging/cooling costs but also degrades the performance, reliability, and life span of computing systems. Moreover, high chip temperature greatly increases leakage power consumption, which is becoming more and more significant with the continuous scaling of transistor size. As the semiconductor industry continues to evolve, power and thermal challenges have become the most critical challenges in the design of new generations of computing systems. In this dissertation, we addressed the power/thermal issues from the system-level perspective. Specifically, we sought to employ real-time scheduling methods to optimize the power/thermal efficiency of real-time computing systems, with the leakage/temperature dependency taken into consideration. In our research, we first explored the fundamental principles of how to employ dynamic voltage scaling (DVS) techniques to reduce the peak operating temperature when running a real-time application on a single-core platform. We further proposed a novel real-time scheduling method, "M-Oscillations," to reduce the peak temperature when scheduling a hard real-time periodic task set, and developed three checking methods to guarantee the feasibility of a periodic real-time schedule under a peak temperature constraint. We then extended our research from the single-core platform to multi-core platforms: we investigated the energy estimation problem on multi-core platforms and developed a lightweight and accurate method to calculate the energy consumption for a given voltage schedule. Finally, we concluded the dissertation with elaborated discussions of future extensions of our research.
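The energy-estimation problem the abstract mentions amounts to integrating dynamic power plus temperature-dependent leakage over a piecewise-constant voltage schedule. The sketch below shows that structure; the linear leakage-temperature approximation P_leak ~ V*(alpha + beta*T) is a common simplification in this literature, and all constants are illustrative, not the dissertation's calibrated values.

```python
# (voltage V, frequency GHz, duration s) segments of a voltage schedule
schedule = [(1.1, 2.0, 0.3), (0.9, 1.4, 0.5), (1.1, 2.0, 0.2)]

C_EFF = 1.2               # effective switched capacitance (illustrative units)
ALPHA, BETA = 0.2, 0.01   # linear leakage-temperature coefficients (illustrative)
T_AMB, R_TH = 45.0, 8.0   # ambient temperature (C), thermal resistance (C/W)

energy, temp = 0.0, T_AMB
for v, f, dt in schedule:
    p_dyn = C_EFF * v * v * f              # dynamic power ~ C * V^2 * f
    p_leak = v * (ALPHA + BETA * temp)     # leakage grows with temperature
    p_total = p_dyn + p_leak
    temp = T_AMB + R_TH * p_total          # steady-state temperature per segment
    energy += p_total * dt

print(f"estimated energy = {energy:.2f} J (toy units)")
```

A tighter estimate would iterate power and temperature to a fixed point within each segment, since the two are mutually dependent; the one-pass update above keeps the sketch short.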
Abstract:
The phenomenal growth of the Internet has connected us to a vast amount of computation and information resources around the world. However, making use of these resources is difficult due to the unparalleled massiveness, high communication latency, shared-nothing architecture, and unreliable connections of the Internet. In this dissertation, we present a distributed software agent approach, which brings a new distributed problem-solving paradigm to Internet computing research, with an enhanced client-server scheme, inherent scalability, and heterogeneity. Our study discusses the role of a distributed software agent in Internet computing and classifies it into three major categories by the objects it interacts with: computation agent, information agent, and interface agent. The discussion of the problem domain and the deployment of the computation agent and the information agent is presented along with the analysis, design, and implementation of experimental systems in high performance Internet computing and in scalable Web searching. In the computation agent study, we show that high performance Internet computing can be achieved with our proposed Java massive computation agent (JAM) model. We analyzed the JAM computing scheme and built a brute-force ciphertext decryption prototype. In the information agent study, we discuss the scalability problem of existing Web search engines and design an approach to Web searching with distributed collaborative index agents. This approach can be used to construct a more accurate, reusable, and scalable solution to deal with the growth of the Web and of the information on the Web. Our research reveals that with the deployment of distributed software agents in Internet computing, we gain a more cost-effective approach to making better use of the gigantic network of computation and information resources on the Internet. The case studies in our research show that we are now able to solve many practically hard or previously unsolvable problems caused by the inherent difficulties of Internet computing.
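The work-partitioning idea behind a brute-force decryption prototype of this kind is that the key space splits into independent ranges that computation agents can search in parallel. Here is a toy sketch of that decomposition; a trivial single-byte XOR "cipher" stands in for the real cryptosystem, and the use of local processes (rather than Java agents across the Internet) is purely for illustration.

```python
from concurrent.futures import ProcessPoolExecutor

def xor_decrypt(ciphertext: bytes, key: int) -> bytes:
    """Toy cipher: XOR every byte with a single-byte key."""
    return bytes(b ^ key for b in ciphertext)

def search_range(args):
    """One agent's share of the key space: try every key in [lo, hi)."""
    ciphertext, known_plaintext, lo, hi = args
    for key in range(lo, hi):
        if xor_decrypt(ciphertext, key) == known_plaintext:
            return key
    return None

if __name__ == "__main__":
    plaintext = b"attack at dawn"
    ciphertext = xor_decrypt(plaintext, 173)      # secret key = 173
    n_agents, key_space = 4, 256
    chunk = key_space // n_agents
    tasks = [(ciphertext, plaintext, i * chunk, (i + 1) * chunk)
             for i in range(n_agents)]
    with ProcessPoolExecutor(max_workers=n_agents) as pool:
        found = [k for k in pool.map(search_range, tasks) if k is not None]
    print(f"recovered key: {found[0]}")
```

Because the ranges share nothing, the scheme scales with the number of agents, which is exactly the property that makes such embarrassingly parallel workloads a natural fit for Internet computing.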
Abstract:
Today, databases have become an integral part of information systems. In the past two decades, we have seen different database systems being developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches; however, no single, generally accepted methodology has emerged in academia or industry that provides ubiquitous intelligent data access from heterogeneous, autonomous, distributed information sources. This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty of resolving semantic heterogeneity, that is, identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis work include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for semantic heterogeneity resolution that investigates the extents of the meta-data constructs of component schemas, shown to be correct, complete, and unambiguous; (iii) a semi-automated technique for identifying semantic relations, which form the basis of the semantic knowledge for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process, acting as the interface between the integration and query processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of our work.
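The ontology-mediated matching idea in contribution (iii) can be pictured as follows: attributes of component schemas are mapped to concepts of a shared ontology, and two attributes are proposed as semantically related when they map to the same concept. The sketch below is a deliberately simplified illustration of that idea; the schemas and ontology are invented, and the thesis's actual technique operates on Sem-ODM meta-data constructs and their extents, not flat name sets.

```python
# Shared ontology: concept -> known synonyms / local attribute names.
ontology = {
    "employee_name": {"name", "emp_name", "full_name"},
    "salary": {"salary", "pay", "wage"},
}

# Attribute sets of two hypothetical component schemas.
schema_a = {"emp_name", "wage", "dept"}
schema_b = {"full_name", "salary", "office"}

def semantic_relations(s1, s2, onto):
    """Pairs of attributes from s1 x s2 that share an ontology concept."""
    pairs = []
    for concept, names in onto.items():
        left, right = s1 & names, s2 & names
        pairs += [(a, b, concept) for a in left for b in right]
    return pairs

for a, b, concept in semantic_relations(schema_a, schema_b, ontology):
    print(f"{a} <-> {b}  (concept: {concept})")
```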
Abstract:
Since the 1990s, scholars have paid special attention to public management's role in theory and research under the assumption that effective management is one of the primary means for achieving superior performance. To some extent, this was influenced by the popular business writings of the 1980s as well as the reinventing-government literature of the 1990s. A number of case studies, but only limited quantitative research, have been published showing that management matters to the performance of public organizations. My study used quantitative techniques to examine whether management capacity increased organizational performance. The specific research problem analyzed was whether significant differences existed between high- and average-performing public housing agencies on select criteria identified in the Government Performance Project (GPP) management capacity model, and whether this model could predict outcome performance measures in a statistically significant manner while controlling for exogenous influences. My model included two of the four GPP management subsystems (human resources and information technology), the integration and alignment of subsystems, and an overall managing-for-results framework. It also included environmental and client control variables hypothesized to affect performance independent of management action. Descriptive results of survey responses showed that high-performing agencies had better scores on most high-performance dimensions of the individual criteria, suggesting support for the model; however, quantitative analysis found limited statistically significant differences between high and average performers and limited predictive power of the model. My analysis led to the following major conclusions: past performance was the strongest predictor of present performance; high unionization hurt performance; and budget-related criteria mattered more for high performance than other model factors. As to the specific research question, management capacity may be necessary, but it is not sufficient, to increase performance. The research suggested that managers may benefit from implementing best practices identified through the GPP model. The usefulness of the model could be improved by adding direct service delivery to it, which may also improve its predictive power. Finally, abundant tested concepts and tools designed to improve system performance are available to practitioners seeking to improve management subsystem support of direct service delivery.