4 results for "New high"
in Digital Commons at Florida International University
Abstract:
An implementation of Sem-ODB, a database management system based on the Semantic Binary Model, is presented. A metaschema of the Sem-ODB database as well as the top-level architecture of the database engine is defined. A new benchmarking technique is proposed which allows databases built on different database models to compete fairly. This technique is applied to show that Sem-ODB has excellent efficiency compared to a relational database on a certain class of database applications. A new semantic benchmark is designed which allows evaluation of the performance of the features characteristic of semantic database applications. The application used in the benchmark represents a class of problems requiring databases with sparse data, complex inheritance and many-to-many relations. Such databases can be naturally accommodated by the semantic model. A fixed predefined implementation is not enforced, allowing the database designer to choose the most efficient structures available in the DBMS tested. The results of the benchmark are analyzed.

A new high-level querying model for semantic databases is defined. It is proven adequate to serve as an efficient native semantic database interface, and it has several advantages over the existing interfaces. It is optimizable and parallelizable, and it supports the definition of semantic userviews and the interoperability of semantic databases with other data sources such as the World Wide Web, relational databases, and object-oriented databases. The query is structured as a semantic database schema graph with interlinking conditionals. The query result is a mini-database, accessible in the same way as the original database. The paradigm supports and utilizes the rich semantics and inherent ergonomics of semantic databases.

The analysis and high-level design of a system that exploits the superiority of the Semantic Database Model over other data models in expressive power and ease of use, in order to allow uniform access via a common query interface to heterogeneous data sources such as semantic databases, relational databases, web sites, and ASCII files, is also presented. The Sem-ODB engine is used to control all the data sources combined under a unified semantic schema. A particular application of the system, providing an ODBC interface to the WWW as a data source, is discussed.
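The following is a minimal sketch of the query-as-schema-graph idea described in this abstract, under simplified assumptions: the SemDB and GraphQuery structures, the run function, and the student/course example are hypothetical illustrations, not the Sem-ODB API. Objects are grouped into categories, binary relations link them, a query is a sub-schema graph whose nodes carry conditionals, and the result is itself a mini-database of the same shape.

    # Hypothetical sketch (not the Sem-ODB API): a query is a sub-schema graph
    # whose nodes carry conditionals; evaluating it against a tiny in-memory
    # "semantic" database of objects and binary relations yields a mini-database
    # of the same shape, so the result can be traversed like the original.
    from dataclasses import dataclass, field

    @dataclass
    class SemDB:
        objects: dict = field(default_factory=dict)    # category -> set of object ids
        relations: dict = field(default_factory=dict)  # relation -> set of (from, to) pairs

    @dataclass
    class GraphQuery:
        node_conditions: dict  # category -> predicate over an object id
        edges: list            # (relation, (source category, target category))

    def run(db: SemDB, q: GraphQuery) -> SemDB:
        """Keep objects satisfying the node conditionals, then keep only the
        relation pairs that interlink the kept objects."""
        kept = {
            cat: {o for o in db.objects.get(cat, set()) if cond(o)}
            for cat, cond in q.node_conditions.items()
        }
        kept_rel = {}
        for rel, (src_cat, dst_cat) in q.edges:
            kept_rel[rel] = {
                (a, b) for a, b in db.relations.get(rel, set())
                if a in kept.get(src_cat, set()) and b in kept.get(dst_cat, set())
            }
        return SemDB(objects=kept, relations=kept_rel)

    # Toy usage: students enrolled in courses, restricted to one course.
    db = SemDB(
        objects={"Student": {"s1", "s2"}, "Course": {"c1", "c2"}},
        relations={"enrolled_in": {("s1", "c1"), ("s2", "c2")}},
    )
    query = GraphQuery(
        node_conditions={"Student": lambda o: True, "Course": lambda o: o == "c1"},
        edges=[("enrolled_in", ("Student", "Course"))],
    )
    mini = run(db, query)
    print(mini.objects["Course"], mini.relations["enrolled_in"])

Because run returns another SemDB, the result can be queried again with the same machinery, which is the sense in which the abstract says the query result is "accessible in the same way as the original database".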
Abstract:
Small devices, in the range of nanometers, are playing a major role in today's technology. The field of nanotechnology is concerned with materials and systems whose structures and components exhibit novel and significantly improved physical, chemical and biological properties, phenomena and processes due to their small, nanoscale size. Researchers are increasingly finding that structural features in the range of about 1 to 100 nanometers behave quite differently from isolated molecules (1 nanometer) or bulk materials. For comparison, a 10 nanometer structure is 1000 times smaller than the diameter of a human hair. The virtues of working in the nanodomain are increasingly recognized by the scientific community and discussed in the popular press. The use of such devices is expected to revolutionize our industries and lives.

This work mainly focuses on the fabrication, characterization and discovery of new nanostructured thin films. The research consists of the design of a new high-deposition-rate nanoparticle machine for depositing nanostructured films from beams of nanoparticles and an investigation of the films' unique optical and physical properties.

A high-deposition-rate nanoparticle machine was designed, built and successfully tested. Different nanostructured thin films were deposited from copper, gold, iron and zirconium targets, with grain sizes between 1 and 20 nm, under different conditions. Transmission electron microscopy (TEM), atomic force microscopy (AFM), and X-ray diffraction (XRD) confirmed the nanoscale grain-size structure of the deposited films. The optical properties of the nanostructured films deposited from copper, iron and zirconium targets were significantly different from the optical properties of bulk materials and thin films; the Zr, Cu and Fe films were transparent. Gold films revealed an epitaxial contact with the silicon substrate, with interesting crystal structures.

The new high-deposition-rate nanoparticle machine was able to deposit new nanostructured films with properties different from those of the bulk and thin films reported in the literature.
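For reference on how XRD peak broadening relates to grain size (the abstract does not state which estimator was used, so this is only a standard textbook relation, not necessarily the dissertation's method), crystallite size is commonly estimated from the width of a diffraction peak with the Scherrer relation:

    D = \frac{K \lambda}{\beta \cos\theta}

where D is the crystallite (grain) size, K a shape factor of roughly 0.9, \lambda the X-ray wavelength, \beta the peak width (FWHM, in radians), and \theta the Bragg angle; broader peaks therefore indicate smaller grains, consistent with the 1-20 nm sizes reported above.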
Abstract:
The E01-011 experiment at Jefferson Laboratory (JLab) studied light-to-medium mass Λ hypernuclei via the AZ + e → [special characters omitted] + e' + K+ electroproduction reaction. Precise measurement of hypernuclear ground-state masses and excitation energies provides information about the nature of hyperon-nucleon interactions. Until recently, hypernuclei were studied at accelerator facilities with intense π+ and K- meson beams. The poor quality of these beams limited the resolution of the hypernuclear excitation energy spectra to about 1.5 MeV (FWHM). This resolution is not sufficient for resolving the rich structure observed in the excitation spectra. By using a high-quality electron beam and employing a new high-resolution spectrometer system, this study aims to improve the resolution to a few hundred keV, with an absolute precision of about 100 keV for excitation energies. In this work the high-resolution excitation spectra of the [special characters omitted], and [special characters omitted] hypernuclei are presented. In an attempt to emphasize the presence of the core-excited states, we introduced a novel likelihood approach to particle identification (PID) to serve as an alternative to the commonly used standard hard-cut PID. The new method resulted in missing-mass spectra almost identical to those obtained by the standard approach. An energy resolution of approximately 400–500 keV (FWHM) has been achieved, an unprecedented value in hypernuclear reaction spectroscopy. For [special characters omitted] the core-excited configuration has been clearly observed with significant statistics. The embedded Λ hyperon increases the excitation energies of the 11B nuclear core by 0.5–1 MeV. The [special characters omitted] spectrum has been observed with significant statistics for the first time. The ground state is bound deeper by roughly 400 keV than currently predicted by theory. An indication of the core-excited doublet, which is unbound in the core itself, is observed. The measurement of [special characters omitted] provides the first study of a d-shell hypernucleus with sub-MeV resolution. Discrepancies of up to 2 MeV between measured and theoretically predicted binding energies are found. Similar disagreement exists when comparing to the [special characters omitted] mirror hypernucleus. Also, the core-excited structure observed between the major s-, p- and d-shell Λ orbits is not consistent with the available theoretical calculations. In conclusion, the discrepancies found in this study will provide valuable input for the further development of theoretical models.
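The following is a schematic sketch of the difference between hard-cut and likelihood-based particle identification mentioned in this abstract. The detector variables, Gaussian response parameters, cut windows and threshold are all invented for illustration and are not taken from the E01-011 analysis.

    # Hypothetical sketch (not the E01-011 analysis code): contrast a rectangular
    # hard-cut PID with a likelihood-ratio PID built from the same two variables.
    import math

    def gauss(x, mu, sigma):
        # Gaussian probability density, used as a toy detector response model.
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    # Illustrative (made-up) responses: time-of-flight and Cherenkov signal
    # for kaons versus pions, as (mean, sigma) per variable.
    KAON = {"tof": (1.0, 0.1), "cher": (0.2, 0.15)}
    PION = {"tof": (0.6, 0.1), "cher": (1.0, 0.3)}

    def hard_cut_is_kaon(tof, cher):
        # Rectangular cuts: accept only if every variable falls inside its window.
        return 0.8 < tof < 1.2 and cher < 0.5

    def likelihood_is_kaon(tof, cher, threshold=1.0):
        # Product of per-variable likelihoods; accept if the kaon/pion ratio
        # exceeds a threshold, so events near a cut edge are weighted, not dropped.
        lk = gauss(tof, *KAON["tof"]) * gauss(cher, *KAON["cher"])
        lp = gauss(tof, *PION["tof"]) * gauss(cher, *PION["cher"])
        return lk / lp > threshold

    # An event just outside the rectangular TOF window: rejected by the hard
    # cuts, accepted by the likelihood ratio.
    print(hard_cut_is_kaon(0.79, 0.3), likelihood_is_kaon(0.79, 0.3))

In this toy example the event fails one rectangular window by a hair and is discarded by the hard cuts, while the likelihood ratio still classifies it as a kaon; softening exactly this kind of boundary effect is the usual motivation for a likelihood PID.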
Abstract:
Engineering analysis on geometric models has been the main, if not the only, credible and reasonable tool used by engineers and scientists to resolve physical boundary problems. New high-speed computers have facilitated the accuracy and validation of the expected results. In practice, an engineering analysis is composed of two parts: the design of the model, and the analysis of the geometry with the boundary conditions and constraints imposed on it. Numerical methods are used to resolve a large number of physical boundary problems independently of the model geometry. The time expended in the computational process is related to the imposed boundary conditions and to how well conformed the geometry is. Any geometric model that contains gaps or open lines is considered an imperfect geometric model, and the major commercial solver packages are incapable of handling such inputs. Other packages apply various methods, such as patching or zippering, to resolve this problem, but the final resolved geometry may differ from the original geometry, and the changes may be unacceptable. The study proposed in this dissertation is based on a new technique for processing models with geometric imperfections without the need to repair or change the original geometry. An algorithm is presented that is able to analyze an imperfect geometric model, with the imposed boundary conditions, using a meshfree method and a distance-field approximation to the boundaries. Experiments are proposed to analyze the convergence of the algorithm on imperfect model geometries, and the results will be compared with those obtained for the same models with perfect geometries. Plotted results will be presented for further analysis and conclusions about the algorithm's convergence.
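The following is a minimal sketch of the distance-field idea described in this abstract, under simplified assumptions: the boundary is given only as a point sample (so gaps or open lines are tolerated), the distance field is a nearest-point approximation, and the meshfree part of the solution is replaced by a constant stand-in. All names and values are hypothetical and not the dissertation's implementation.

    # Hypothetical sketch: impose a Dirichlet condition through an approximate
    # distance field to a point-sampled (and deliberately gapped) boundary,
    # without repairing the geometry, via u(x, y) = g(x, y) + d(x, y) * phi(x, y).
    import math

    # Boundary of the unit square sampled as points, with a gap on the top edge
    # standing in for an "imperfect" geometry.
    boundary_pts = [(i / 20.0, 0.0) for i in range(21)] + \
                   [(i / 20.0, 1.0) for i in range(21) if not 8 <= i <= 12] + \
                   [(0.0, i / 20.0) for i in range(21)] + \
                   [(1.0, i / 20.0) for i in range(21)]

    def distance_field(x, y):
        # Approximate (unsigned) distance to the boundary: nearest sample point.
        return min(math.hypot(x - px, y - py) for px, py in boundary_pts)

    def g(x, y):
        # Prescribed Dirichlet value on the boundary (illustrative choice).
        return x + y

    def phi(x, y):
        # Stand-in for the meshfree part of the solution; a real solver would use
        # a sum of radial or moving-least-squares basis functions with unknown
        # coefficients determined by the governing equation.
        return 1.0

    def trial(x, y):
        # Equals g wherever the distance field vanishes, so the boundary
        # condition holds without a watertight, repaired geometry.
        return g(x, y) + distance_field(x, y) * phi(x, y)

    print(round(trial(0.5, 0.0), 3), round(trial(0.5, 0.5), 3))

The sketch only shows how the boundary condition can be enforced through the distance field; in an actual meshfree analysis the coefficients hidden inside phi would be solved for from the governing equation, which is the part whose convergence the proposed experiments compare between imperfect and perfect geometries.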