858 results for Electrical engineering|Computer science


Relevance:

100.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance:

100.00%

Publisher:

Abstract:

Defeasible reasoning is a simple but efficient approach to nonmonotonic reasoning that has recently attracted considerable interest and that has found various applications. Defeasible logic and its variants are an important family of defeasible reasoning methods. So far no relationship has been established between defeasible logic and mainstream nonmonotonic reasoning approaches. In this paper we establish close links to known semantics of logic programs. In particular, we give a translation of a defeasible theory D into a meta-program P(D). We show that under a condition of decisiveness, the defeasible consequences of D correspond exactly to the sceptical conclusions of P(D) under the stable model semantics. Without decisiveness, the result holds only in one direction (all defeasible consequences of D are included in all stable models of P(D)). If we wish a complete embedding for the general case, we need to use the Kunen semantics of P(D), instead.
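
Stated compactly (a restatement of the abstract's claims in the usual defeasible-provability notation, which may differ from the paper's own symbols), the correspondence under decisiveness reads

\[ D \vdash +\partial p \iff p \in \bigcap \{\, M \mid M \text{ is a stable model of } P(D) \,\}, \]

with only the left-to-right inclusion guaranteed in general; the full equivalence for arbitrary theories is recovered by reading P(D) under the Kunen semantics instead.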

Relevance:

100.00%

Publisher:

Abstract:

In 1972 the ionized cluster beam (ICB) deposition technique was introduced as a new method for thin film deposition. At that time the use of clusters was postulated to enhance film nucleation and adatom surface mobility, resulting in high quality films. Although a few researchers reported singly ionized clusters containing 10^2-10^3 atoms, others were unable to repeat their work. The consensus now is that the film effects seen in the early investigations were due to self-ion bombardment rather than clusters. Subsequently, in recent work (early 1992), synthesis of large zinc clusters without the use of a carrier gas was demonstrated by Gspann and repeated in our laboratory. The clusters resulted from very significant changes in two source parameters: crucible pressure was increased from the earlier 2 Torr to several thousand Torr, and a converging-diverging nozzle 18 mm long and 0.4 mm in diameter at the throat was used in place of the 1 mm x 1 mm nozzle of the early work. While this is practical for zinc and other high vapor pressure materials, it remains impractical for many materials of industrial interest such as gold, silver, and aluminum. The work presented here describes results using gold and silver at pressures of around 1 and 50 Torr in order to study the effect of pressure and nozzle shape. Significant numbers of large clusters were not detected. Deposited films were studied by atomic force microscopy (AFM) for roughness analysis, and by X-ray diffraction.

Nanometer-size islands of zinc deposited on flat silicon substrates by ICB were also studied by atomic force microscopy, and the number of atoms/cm^2 was calculated and compared to data from Rutherford backscattering spectrometry (RBS). To improve the agreement between the AFM and RBS data, convolution and deconvolution algorithms were implemented to study and simulate the interaction between tip and sample in atomic force microscopy. The deconvolution algorithm takes into account the physical volume occupied by the tip, resulting in an image that is a more accurate representation of the surface.

One method increasingly used to study deposited films, both during and after growth, is ellipsometry, a surface analytical technique used to determine the optical properties and thickness of thin films. In situ measurements can be made through the windows of a deposition chamber. A method was developed for determining the optical properties of a growing film that is sensitive only to that film and accommodates underlying interfacial layers, multiple unknown underlayers, and other unknown substrates. The method is carried out by making an initial ellipsometry measurement well past the real interface and by defining a virtual interface in the vicinity of this measurement.
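
As a rough illustration of the tip-sample interaction mentioned above, the following sketch uses the common morphological model of AFM imaging (grayscale dilation for tip convolution, erosion for deconvolution); it is a minimal stand-in, not the algorithms actually implemented in this work, and the tip and island shapes are invented for the example.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def simulate_afm_image(surface, tip):
    """Tip 'convolution': the measured image is modeled as the grayscale
    dilation of the true surface by the tip shape."""
    return grey_dilation(surface, structure=tip)

def deconvolve_afm_image(image, tip):
    """Tip 'deconvolution': grayscale erosion by the same tip yields a
    surface estimate that accounts for the tip's physical volume."""
    return grey_erosion(image, structure=tip)

# Toy example: a single flat-topped island scanned by a blunt parabolic tip.
x = np.linspace(-1.0, 1.0, 64)
X, Y = np.meshgrid(x, x)
surface = np.where(X**2 + Y**2 < 0.05, 5.0, 0.0)   # island, 5 units tall
tx = np.linspace(-0.25, 0.25, 17)
TX, TY = np.meshgrid(tx, tx)
tip = -40.0 * (TX**2 + TY**2)                       # apex at 0, opens downward

image = simulate_afm_image(surface, tip)            # broadened island
estimate = deconvolve_afm_image(image, tip)         # partially recovered surface
print(surface.max(), image.max(), estimate.max())
```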

Relevance:

100.00%

Publisher:

Abstract:

As time advances, technology has come under control in finer and finer detail. The microelectronics era is an example of this, with control down to the micrometer. Experts agree that we may be entering a new era, with control of technology down to the nanometer. One aspect of such control is making materials in the nanometer range, i.e. nanoparticles. For this purpose, a new magnetron-sputtering-gun, inert-gas-condensation nanoparticle source has been designed, built, and tested.

Films made from cobalt, nickel, tantalum, molybdenum, chromium, and aluminum have been investigated. Transmission Electron Microscope measurements done at the University of Illinois confirm that the thin films are nanostructured, as do Atomic Force Microscope measurements made at the F.I.U. Thin Film Laboratory.

Composition, optical, and magnetic properties have been measured. In most cases, unique properties have been found that differ significantly from bulk properties. Rutherford Backscattering measurements done at the University of Illinois determined significant percentages of oxygen and carbon in the samples, possibly due to interactions with air. Because of this, the optical properties are a composite of oxide, metal, and void properties. The magnetic materials were determined to have spin-glass properties below the irreversibility temperature and superparamagnetic properties above it. Indications of possible future uses for these nanostructured materials are discussed.
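
For reference on the magnetic behaviour mentioned above (a standard textbook relation, not a result of this work), superparamagnetic particles above the blocking/irreversibility temperature follow the Langevin law

\[ M(H,T) = M_s\, L\!\left(\frac{\mu H}{k_B T}\right), \qquad L(x) = \coth x - \frac{1}{x}, \]

where \mu is the moment of an individual nanoparticle; below the irreversibility temperature, history-dependent (spin-glass-like) behaviour appears instead.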

Relevance:

100.00%

Publisher:

Abstract:

Since multimedia data, such as images and videos, are far more expressive and informative than ordinary text-based data, people find it more attractive to communicate and express themselves with them. Additionally, with the rising popularity of social networking tools such as Facebook and Twitter, multimedia information retrieval can no longer be considered a solitary task; rather, people constantly collaborate with one another while searching for and retrieving information. But the very source of multimedia data's popularity, the large and varied amount of information a single data object can carry, makes its management a challenging task. Multimedia data are commonly represented as multidimensional feature vectors and carry high-level semantic information. These two characteristics make them very different from traditional alphanumeric data, so trying to manage them with frameworks and rationales designed for primitive alphanumeric data is inefficient. An index structure is the backbone of any database management system, yet the index structures present in existing relational database management frameworks cannot handle multimedia data effectively. Thus, in this dissertation, a generalized multidimensional index structure is proposed which accommodates, within a single framework, both the atypical multidimensional representation and the semantic information carried by different multimedia data. Additionally, the dissertation investigates the evolving relationships among multimedia data in a collaborative environment and how such information can help customize the design of the proposed index structure when it is used to manage multimedia data in a shared environment. Extensive experiments were conducted to demonstrate the usability and better performance of the proposed framework over current state-of-the-art approaches.
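
As a sketch of the kind of interface such a structure exposes, combining feature-vector similarity with high-level semantic labels, a brute-force toy version might look like the following; it is illustrative only and is not the generalized index structure proposed in the dissertation.

```python
import numpy as np

class MultimediaIndex:
    """Toy index: brute-force k-NN over multidimensional feature vectors,
    optionally filtered by semantic labels. A stand-in for illustration,
    not the dissertation's actual design."""

    def __init__(self):
        self.ids, self.vectors, self.labels = [], [], []

    def insert(self, obj_id, feature_vector, semantic_labels):
        self.ids.append(obj_id)
        self.vectors.append(np.asarray(feature_vector, dtype=float))
        self.labels.append(set(semantic_labels))

    def knn(self, query_vector, k=5, required_labels=None):
        q = np.asarray(query_vector, dtype=float)
        hits = [(float(np.linalg.norm(v - q)), obj_id)
                for obj_id, v, labs in zip(self.ids, self.vectors, self.labels)
                if not required_labels or set(required_labels) <= labs]
        return sorted(hits)[:k]

index = MultimediaIndex()
index.insert("img1", [0.1, 0.9, 0.3], {"beach", "sunset"})
index.insert("img2", [0.2, 0.8, 0.4], {"beach"})
print(index.knn([0.15, 0.85, 0.35], k=1, required_labels={"beach"}))
```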

Relevance:

100.00%

Publisher:

Abstract:

Type systems for secure information flow aim to prevent a program from leaking information from H (high) to L (low) variables. Traditionally, bisimulation has been the prevalent technique for proving the soundness of such systems. This work introduces a new proof technique based on stripping and fast simulation, and shows that it can be applied in a number of cases where bisimulation fails. We present a progressive development of this technique over a representative sample of languages, including a simple imperative language (core theory), a multiprocessing nondeterministic language, a probabilistic language, and a language with cryptographic primitives. In the core theory we illustrate the key concepts of this technique in a basic setting. A fast low simulation in the context of transition systems is a binary relation where simulating states can match the moves of simulated states while maintaining the equivalence of low variables; stripping is a function that removes high commands from programs. We show that we can prove secure information flow by arguing that the stripping relation is a fast low simulation. We then extend the core theory to an abstract distributed language under a nondeterministic scheduler. Next, we extend to a probabilistic language with a random assignment command; we generalize fast simulation to the setting of discrete-time Markov chains and prove approximate probabilistic noninterference. Finally, we introduce cryptographic primitives into the probabilistic language and prove computational noninterference, provided that the underlying encryption scheme is secure.
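
To make the stripping idea concrete, here is a minimal sketch on a toy imperative language in which commands are nested tuples and each assignment carries the security level of its target variable; this is only an illustration of "removing high commands", not the formal definition used in the work above.

```python
def strip(cmd):
    """Remove high commands: H-assignments become skip; compound commands
    are stripped recursively; everything else is left unchanged."""
    kind = cmd[0]
    if kind == "assign":
        _, level, _var, _expr = cmd
        return ("skip",) if level == "H" else cmd
    if kind == "seq":
        return ("seq", strip(cmd[1]), strip(cmd[2]))
    if kind == "if":
        return ("if", cmd[1], strip(cmd[2]), strip(cmd[3]))
    if kind == "while":
        return ("while", cmd[1], strip(cmd[2]))
    return cmd

prog = ("seq",
        ("assign", "H", "h", "h + 1"),
        ("assign", "L", "l", "l * 2"))
print(strip(prog))  # ('seq', ('skip',), ('assign', 'L', 'l', 'l * 2'))
```

Proving secure information flow then amounts to showing that relating each program to its stripped version is a fast low simulation.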

Relevance:

100.00%

Publisher:

Abstract:

The rapid growth of the Internet and advances in Web technologies have made it possible for users to access large amounts of on-line music data, including music acoustic signals, lyrics, style/mood labels, and user-assigned tags. This progress has made music listening more fun, but has raised the issue of how to organize this data and, more generally, how computer programs can assist users in their music experience. An important subject in computer-aided music listening is music retrieval, i.e., efficiently helping users locate the music they are looking for. Traditionally, songs were organized in a hierarchical structure such as genre->artist->album->track to facilitate navigation. However, the intentions of users are often hard to capture in such a simply organized structure: users may want to listen to music of a particular mood, style, or topic, and/or songs similar to some given music samples. This motivated us to work on a user-centric music retrieval system to improve users' satisfaction with the system. Traditional music information retrieval research was mainly concerned with classification, clustering, identification, and similarity search of acoustic music data by way of feature extraction algorithms and machine learning techniques. More recently, music information retrieval research has focused on utilizing other types of data, such as lyrics, user-access patterns, and user-defined tags, and on targeting non-genre categories for classification, such as mood labels and styles. This dissertation focused on investigating and developing effective data mining techniques for (1) organizing and annotating music data with styles, moods, and user-assigned tags; (2) performing effective analysis of music data with features from diverse information sources; and (3) recommending songs to users utilizing both content features and user-access patterns.
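
As an illustration of combining content features with user-access patterns, the toy hybrid recommender below blends cosine similarity of audio features with similarity of play-count vectors; the data, weighting, and function names are invented for the example and do not reflect the dissertation's actual models.

```python
import numpy as np

def recommend(seed, features, play_counts, alpha=0.5, k=3):
    """Toy hybrid recommender: score = alpha * content similarity
    + (1 - alpha) * user-access similarity."""
    def cosine(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

    scores = {}
    for song in features:
        if song == seed:
            continue
        content = cosine(features[seed], features[song])
        usage = cosine(play_counts[seed], play_counts[song])
        scores[song] = alpha * content + (1 - alpha) * usage
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical data: 3 songs, 4-dim audio features, per-user play counts.
rng = np.random.default_rng(0)
features = {s: rng.random(4) for s in ("s1", "s2", "s3")}
play_counts = {s: rng.integers(0, 5, size=6).astype(float)
               for s in ("s1", "s2", "s3")}
print(recommend("s1", features, play_counts))
```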

Relevance:

100.00%

Publisher:

Abstract:

Electronic noise has been investigated in AlxGa1−xN/GaN modulation-doped field-effect transistors (MODFETs) of submicron dimensions, grown for us by molecular beam epitaxy (MBE) at Virginia Commonwealth University by Dr. H. Morkoç and coworkers. Some 20 devices were grown on a GaN substrate, four of which have leads bonded to source (S), drain (D), and gate (G) pads, respectively. Conduction takes place in the quasi-2D layer of the junction (xy plane), which is perpendicular to the quantum well (z-direction) of average triangular width ∼3 nm. A non-doped intrinsic buffer layer of ∼5 nm separates the Si-doped donors in the AlxGa1−xN layer from the 2D transistor plane, which affords a very high electron mobility and thus enables high-speed devices. Since all contacts (S, D, and G) must reach through the AlxGa1−xN layer to connect internally to the 2D plane, parallel conduction through this layer is a feature of all modulation-doped devices. While this shunting effect may account for no more than a few percent of the current IDS, it is responsible for most excess noise over and above the thermal noise of the device.

The excess noise has been analyzed as a sum of Lorentzian spectra and 1/f noise. The Lorentzian noise has been ascribed to trapping of the carriers in the AlxGa1−xN layer. A detailed, multitrapping generation-recombination noise theory is presented, which shows that an exponential relationship exists for the time constants obtained from the spectral components as a function of 1/kT. The trap depths have been obtained from Arrhenius plots of log(τT^2) vs. 1000/T. Comparison with previous noise results for GaAs devices shows that (a) many more trapping levels are present in these nitride-based devices, and (b) the traps are deeper (farther below the conduction band) than for GaAs. Furthermore, the magnitude of the noise depends strongly on the level of depletion of the AlxGa1−xN donor layer, which can be altered by a negative or positive gate bias VGS.

Altogether, these frontier nitride-based devices are promising for bluish-light optoelectronic devices and lasers; however, the noise, though well understood, indicates that the purity of the constituent layers should be greatly improved for future technological applications.
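
In hedged form (standard generation-recombination notation, not necessarily the thesis's exact symbols), the decomposition and trap-depth extraction described above can be written as

\[ S(f) \;\approx\; \sum_i \frac{A_i \tau_i}{1 + (2\pi f \tau_i)^2} \;+\; \frac{C}{f}, \qquad \tau T^2 = \tau_0 \exp\!\left(\frac{E_T}{k_B T}\right), \]

so each Lorentzian component rolls off at f_i = 1/(2\pi\tau_i), and the trap depth E_T follows from the slope of \log(\tau T^2) versus 1000/T.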

Relevance:

100.00%

Publisher:

Abstract:

The promise of Wireless Sensor Networks (WSNs) is the autonomous collaboration of a collection of sensors to accomplish goals that a single sensor cannot. Sensor networking serves a range of applications by providing raw data as the basis for further analysis and action. Imprecision in the collected data can badly mislead the decision-making process of sensor-based applications, rendering the application objectives ineffective or causing them to fail. Because inherent WSN characteristics commonly corrupt raw sensor readings, many research efforts attempt to improve the accuracy of this corrupted, or "dirty", sensor data: the dirty data need to be cleaned or corrected. However, existing data cleaning solutions restrict themselves to static WSNs, in which deployed sensors rarely move during operation. Many emerging applications relying on WSNs now need sensor mobility to enhance application efficiency and usage flexibility, so the locations of deployed sensors are dynamic and each sensor functions independently and contributes its own resources; sensors mounted on vehicles for monitoring traffic conditions are one prospective example. Sensor mobility makes the network topology and the correlations among sensor streams transient. Because they rely on static relationships among sensors, existing methods for cleaning sensor data in static WSNs are invalid in such mobile scenarios. Therefore, a data cleaning solution that takes sensor movement into account is needed.

This dissertation aims to improve the quality of sensor data by considering the consequences of the various trajectory relationships of autonomous mobile sensors in the system. First, we address the dynamic network topology due to sensor mobility: the concept of a virtual sensor is presented and used for spatio-temporal selection of neighboring sensors to help clean sensor data streams, one of the first methods to clean data in mobile sensor environments. We also study the mobility pattern of moving sensors relative to the boundaries of sub-areas of interest, and develop a belief-based analysis to determine reliable sets of neighboring sensors that improve cleaning performance, especially when node density is relatively low. Finally, we design a novel sketch-based technique to clean data from internal sensors where spatio-temporal relationships among sensors cannot lead to data correlations among sensor streams.
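
As a rough sketch of spatio-temporal neighbour selection for a mobile sensor (not the dissertation's virtual-sensor, belief-based, or sketch-based methods; the distance and time thresholds are assumptions for illustration):

```python
import math

def clean_reading(target, readings, max_dist=50.0, max_age=10.0):
    """Replace a suspect reading with the average of readings from sensors
    that were recently close to the target. Each reading is a tuple
    (sensor_id, x, y, timestamp, value)."""
    _, tx, ty, tt, tvalue = target
    neighbours = [v for (_, x, y, t, v) in readings
                  if math.hypot(x - tx, y - ty) <= max_dist
                  and abs(t - tt) <= max_age]
    return sum(neighbours) / len(neighbours) if neighbours else tvalue

target = ("s0", 10.0, 20.0, 100.0, 99.0)           # suspiciously high value
readings = [("s1", 12.0, 21.0, 98.0, 24.5),
            ("s2", 15.0, 18.0, 101.0, 25.1),
            ("s3", 400.0, 400.0, 99.0, 30.0)]      # too far away; ignored
print(clean_reading(target, readings))             # -> 24.8
```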