954 results for context-aware applications
Abstract:
PREPARATION OF COATED MICROTOOLS FOR ELECTROCHEMICAL MACHINING APPLICATIONS Ajaya K. Swain, M.S. University of Nebraska, 2010 Advisor: K.P. Rajurkar
Coated tools have improved the performance of both traditional and nontraditional machining processes, resulting in higher material removal, better surface finish, and increased wear resistance. However, the performance of coated tools in micromachining has not yet been adequately studied. One possible reason is the difficulty of preparing coated microtools. Besides the technical requirements, the economic and environmental aspects of the material and of the coating technique also play a significant role in coating microtools; this, in fact, restricts the range of coating materials and the type of coating process. Handling is another major issue in the case of microtools, purely because of their miniature size. This research focuses on the preparation of coated microtools for pulse electrochemical machining (ECM) by electrodeposition. The motivation derives from the fact that, although improved machining has been reported with insulating coatings on ECM tools, particularly in ECM drilling operations, little literature was found on the use of metallic coating materials in other ECM process types. An ideal ECM tool should be a good thermal and electrical conductor, corrosion resistant, electrochemically stable, and stiff enough to withstand the electrolyte pressure. Tungsten has almost all the properties desired in an ECM tool material except electrochemical stability: it can be oxidized during machining, resulting in poor machining quality. The electrochemical stability of a tungsten ECM tool can be improved by electroplating it with nickel, which has superior electrochemical resistance. Moreover, a tungsten tool can be coated in situ, reducing the tool handling and breakage frequency.
The tungsten microtool was electroplated with nickel using both direct and pulse current. The effect of the various input parameters on the coating characteristics was studied, and the performance of the coated microtool was evaluated in pulse ECM. The coated tool removed about 28% more material than the uncoated tool under similar conditions and was more electrochemically stable. It was concluded that a nickel-coated tungsten microtool can improve pulse ECM performance.
Abstract:
Maximum-likelihood decoding is often the optimal decoding rule one can use, but it is very costly to implement in a general setting. Much effort has therefore been dedicated to finding efficient decoding algorithms that either achieve or approximate the error-correcting performance of the maximum-likelihood decoder. This dissertation examines two approaches to this problem. In 2003, Feldman and his collaborators defined the linear programming decoder, which operates by solving a linear programming relaxation of the maximum-likelihood decoding problem. As with many modern decoding algorithms, it is possible for the linear programming decoder to output vectors that do not correspond to codewords; such vectors are known as pseudocodewords. In this work, we completely classify the set of linear programming pseudocodewords for the family of cycle codes. For the case of the binary symmetric channel, another approximation of maximum-likelihood decoding was introduced by Omura in 1972. This decoder employs an iterative algorithm whose behavior closely mimics that of the simplex algorithm. We generalize Omura's decoder to operate on any binary-input memoryless channel, thus obtaining a soft-decision decoding algorithm. Further, we prove that the probability of the generalized algorithm returning the maximum-likelihood codeword approaches 1 as the number of iterations goes to infinity.
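As a hedged sketch of the relaxation Feldman's decoder solves (illustrative only: the tiny parity-check matrix below, the cycle code of a triangle, and all parameter choices are ours, not the dissertation's): maximum-likelihood decoding minimizes a log-likelihood-ratio cost over codewords, while the LP decoder minimizes the same linear cost over the polytope cut out by the standard odd-subset parity inequalities.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

def lp_decode(H, llr):
    """Feldman-style LP relaxation: minimize sum_i llr[i] * x[i] over the
    intersection of the local codeword polytopes of each parity check."""
    n = H.shape[1]
    A_ub, b_ub = [], []
    for row in H:
        nbrs = np.flatnonzero(row)
        # For every odd-sized subset S of the check's neighborhood N:
        #   sum_{i in S} x_i - sum_{i in N\S} x_i <= |S| - 1
        for k in range(1, len(nbrs) + 1, 2):
            for S in itertools.combinations(nbrs, k):
                a = np.zeros(n)
                a[list(S)] = 1.0
                a[[i for i in nbrs if i not in S]] = -1.0
                A_ub.append(a)
                b_ub.append(k - 1)
    res = linprog(llr, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0.0, 1.0)] * n, method="highs")
    return res.x

# Cycle code of a triangle (one check per vertex, one bit per edge);
# its only codewords are 000 and 111.
H = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])
print(lp_decode(H, np.array([1.0, 1.0, 1.0])))  # all evidence favors 0: all-zero codeword
```

A fractional optimum of this program would be precisely a pseudocodeword: a vertex of the relaxed polytope that does not correspond to any codeword.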
Abstract:
Control of burgeoning populations of white-tailed deer (Odocoileus virginianus) is a challenging endeavor under the best of circumstances. The challenge is further complicated when control programs are attempted within an urban or suburban area. Wildlife managers often consider management techniques and equipment which have a proven track record. New challenges require new and innovative techniques. The deer management program in Fairfax County, Virginia has employed thermal imaging technology in a variety of ways to better address these unique challenges. In addition to the more commonly used aircraft-mounted FLIR (forward looking infrared), this program utilizes vehicle-mounted and hand-held thermal imaging devices. Thermal imaging is used in determining herd densities, ensuring that control areas are free of humans, locating deer, assessing target attributes and recovering culled deer. These devices bring a higher level of safety, efficiency and efficacy to control programs operating within these difficult environs.
Abstract:
As the area of nanotechnology continues to grow, the development of new nanomaterials with interesting physical and electronic properties, together with improved characterization techniques, will remain vital for the continued improvement of devices and the understanding of nanoscale phenomena. In this dissertation, the chemical vapor deposition synthesis of rare earth (RE) compounds is described in detail. In general, the procedure involves the vaporization of a REClx (RE = Y, La, Ce, Pr, Nd, Sm, Gd, Tb, Dy, Ho) in the presence of hydride-phase precursors such as decaborane and ammonia at high temperatures and low pressures. The vapor-liquid-solid mechanism was used in combination with the chemical vapor deposition process to synthesize single-crystalline rare earth hexaboride nanostructures. The crystallographic orientation of as-synthesized rare earth hexaboride nanostructures and gadolinium nitride thin films was controlled by judicious choice of specific growth substrates and modeled by analyzing X-ray diffraction powder patterns and crystallographic models. The rare earth hexaboride nanostructures were then implemented into two existing technologies to enhance their characterization capabilities. First, the rare earth hexaboride nanowires were used as a test material for the development of a TEM-based local electrode atom probe tomography (LEAP) technique. This technique provided some of the first quantitative compositional information on the rare earth hexaboride systems. Second, owing to the rigidity and excellent conductivity of the rare earth hexaborides, nanostructures were grown onto tungsten wires for the development of robust, oxidation-resistant nanomanipulator electronic probes for semiconductor device failure analysis.
Abstract:
ABSTRACT: This thesis report illustrates the applications and potential of biogenic methane recovery in Nebraska’s agricultural and industrial sectors and as a means for increasing sustainable economic development in the state’s rural communities. As the nation moves toward a new green economy, biogenic methane recovery as a waste management strategy and renewable energy resource presents significant opportunities for Nebraska to be a national and world leader in agricultural and industrial innovation, advanced research and development of renewable energy technology, and the generation of new product markets. Nebraska’s agricultural economy gives the state a distinct advantage in supporting methane recovery operations that create long-term economic and environmental partnerships among producers, industry, and communities. These opportunities will serve to protect Nebraska’s agricultural producers from volatile energy input markets while creating new markets for Nebraska agricultural products. They will also provide quality education and employment opportunities for Nebraska students and businesses. Challenges and issues remain for the state to take advantage of its resource potential. There is a need for a comprehensive Nebraska biogenic methane potential study and a digital mapping system to identify high-potential producers, co-products, and markets. There is also a need to develop a web-based collection of consolidated information specific to Nebraska to aid in connecting producers, service providers, educators, and policy-makers.
Abstract:
Fibrous materials have morphological similarities to the natural cartilage extracellular matrix and have been considered as candidates for bone tissue engineering scaffolds. In this study, we evaluated a novel electrospun chitosan mat composed of oriented sub-micron fibers for its tensile properties and biocompatibility with chondrocytes (cell attachment, proliferation, and viability). Scanning electron microscope images showed that the fibers in the electrospun chitosan mats were indeed aligned and that there was slight cross-linking between the parent fibers. The electrospun mats have a significantly higher elastic modulus (2.25 MPa) than the cast films (1.19 MPa). Viability of cells on the electrospun mat was 69% of that on tissue-culture polystyrene (TCP control) after three days in culture, slightly higher than on the cast films (63% of the TCP control). Cells on the electrospun mat grew slowly during the first week, but the growth rate increased after that. By day 10, the cell number on the electrospun mat was almost 82% of the TCP control, higher than that of the cast films (56% of TCP). The electrospun chitosan mats thus have a higher Young’s modulus (P < 0.01) than cast films and provide good chondrocyte biocompatibility. The electrospun chitosan mats therefore have the potential to be further processed into three-dimensional scaffolds for cartilage tissue repair.
Abstract:
Technical evaluation of analytical data is of extreme relevance, considering it can be used for comparisons with environmental quality standards and for decision-making related to the management of the disposal of dredged sediments and the evaluation of salt and brackish water quality in accordance with CONAMA Resolution 357/05. It is therefore essential that the project manager discuss the environmental agency's technical requirements with the contracted laboratory, both for the follow-up of analyses underway and with a view to possible re-analysis when anomalous data are identified. The main technical requirements are: (1) method quantitation limits (QLs) should fall below environmental standards; (2) analyses should be carried out in laboratories whose analytical scope is accredited by the National Institute of Metrology (INMETRO) or qualified or accepted by a licensing agency; (3) a chain of custody should be provided in order to ensure sample traceability; (4) control charts should be provided to prove method performance; (5) certified reference material analysis or, if that is not available, matrix spike analysis should be undertaken; and (6) chromatograms should be included in the analytical report. Within this context, and with a view to helping environmental managers evaluate analytical reports, this work aims to discuss the limitations of applying SW-846 US EPA methods to marine samples and the consequences of reporting data based on method detection limits (MDLs) rather than sample quantitation limits (SQLs), and to present possible modifications of the principal methods applied by laboratories in order to comply with environmental quality standards.
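Requirement (1), that quantitation limits fall below the applicable standard, lends itself to a simple automated screen. A minimal sketch, with hypothetical analyte names and illustrative numbers (not figures from CONAMA 357/05):

```python
# Screen reported quantitation limits (QLs) against quality standards.
# All analytes and numeric values below are illustrative placeholders.
def screen_qls(quality_standards, reported_qls):
    """Return analytes whose reported QL is not below the standard,
    i.e. where a '< QL' result cannot demonstrate compliance."""
    return sorted(
        analyte for analyte, standard in quality_standards.items()
        if reported_qls.get(analyte, float("inf")) >= standard
    )

standards = {"cadmium": 0.005, "lead": 0.010, "benzo(a)pyrene": 0.00005}  # mg/L
lab_qls   = {"cadmium": 0.001, "lead": 0.020, "benzo(a)pyrene": 0.00001}  # mg/L

print(screen_qls(standards, lab_qls))  # → ['lead']
```

Any analyte returned by such a screen would require a method modification (or a different method) before the laboratory's reports could be used against the standard.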
Abstract:
Ubiquitous Computing promises seamless access to a wide range of applications and Internet-based services from anywhere, at any time, and using any device. In this scenario, new challenges for the practice of software development arise: applications and services must keep a coherent behavior and a proper appearance, and must adapt to a wide variety of contextual usage requirements and hardware aspects. In particular, due to its interactive nature, the interface content of Web applications must adapt to a large diversity of devices and contexts. In order to overcome such obstacles, this work introduces an innovative methodology for content adaptation of Web 2.0 interfaces. The basis of our work is to combine static adaptation, the implementation of static Web interfaces, with dynamic adaptation, the alteration at execution time of static interfaces so as to adapt them to different contexts of use. In this hybrid fashion, our methodology benefits from the advantages of both adaptation strategies. Along these lines, we designed and implemented UbiCon, a framework on which we tested our concepts through a case study and a development experiment. Our results show that the hybrid methodology over UbiCon leads to broader and more accessible interfaces and to faster and less costly software development. We believe that the UbiCon hybrid methodology can foster more efficient and accurate interface engineering in industry and academia.
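The static/dynamic combination can be sketched in miniature: a static template is selected per device class, then patched at run time with context-derived overrides. All names here (templates, context keys, thresholds) are hypothetical illustrations, not UbiCon's API:

```python
# Hybrid interface adaptation sketch: a statically implemented template is
# altered at execution time according to the current context of use.
# Every identifier and threshold below is a hypothetical placeholder.

STATIC_TEMPLATES = {
    "desktop": {"columns": 3, "font_px": 14, "images": "full"},
    "mobile":  {"columns": 1, "font_px": 18, "images": "thumbnail"},
}

def dynamic_overrides(context):
    """Derive run-time tweaks from the current context of use."""
    overrides = {}
    if context.get("bandwidth_kbps", 1000) < 100:
        overrides["images"] = "off"       # conserve bandwidth
    if context.get("ambient_lux", 500) < 50:
        overrides["theme"] = "dark"       # low-light environment
    return overrides

def adapt(device, context):
    ui = dict(STATIC_TEMPLATES[device])   # static adaptation
    ui.update(dynamic_overrides(context)) # dynamic adaptation
    return ui

print(adapt("mobile", {"bandwidth_kbps": 50, "ambient_lux": 30}))
```

The design choice mirrored here is that the static layer keeps the interface coherent per device class, while the dynamic layer only patches the few attributes the current context invalidates.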
Abstract:
Working with nuclear magnetic resonance (NMR) in quadrupolar spin systems, in this paper we transfer the concept of the atomic coherent state to the nuclear spin context, where it is referred to as the pseudo-nuclear-spin coherent state (pseudo-NSCS). Experimentally, we discuss the initialization of the pseudo-NSCSs and also their quantum control, implemented by polar and azimuthal rotations. Theoretically, we compute the geometric phases acquired by an initial pseudo-NSCS on undergoing three distinct cyclic evolutions: (i) the free evolution of the NMR quadrupolar system and, by analogy with it, those of (ii) single-mode and (iii) two-mode Bose-Einstein-condensate-like systems. By means of these analogies, we derive, through spin angular momentum operators, results equivalent to those presented in the literature for orbital angular momentum operators. The pseudo-NSCS description is a starting point for introducing spin squeezed states and quantum metrology into the nuclear spin systems of liquid crystals or solid matter.
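For background, the classic result such cyclic evolutions recover: a spin-j coherent state transported around a closed circuit on the sphere acquires a geometric phase proportional to the enclosed solid angle. Stated here as the standard textbook formula, not as the paper's specific derivation:

```latex
% Geometric (Berry) phase of a spin-j coherent state |j; \theta, \phi\rangle
% transported around a closed circuit C on the sphere:
\gamma_{\mathrm{geo}} = -\, j\, \Omega(C),
\qquad
\Omega(C) = \oint_{C} \bigl(1 - \cos\theta\bigr)\, d\phi ,
```

where $\Omega(C)$ is the solid angle subtended by the circuit at the center of the sphere.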
Abstract:
Chemistry can contribute in many different ways to solving the challenges we face in modifying our inefficient, fossil-fuel-based energy system. The present work was motivated by the search for efficient photoactive materials to be employed in the context of the energy problem: materials to be utilized in energy-efficient devices and in the production of renewable electricity and fuels. We presented a new class of copper complexes that could find application in lighting technologies by serving as luminescent materials in LEC, OLED, and WOLED devices. These technologies may provide substantial energy savings in the lighting sector. Moreover, copper complexes have recently been used as light-harvesting compounds in dye-sensitized photoelectrochemical solar cells, which offer a viable alternative to silicon-based photovoltaic technologies. We also presented a few supramolecular systems containing fullerene, e.g. dendrimers, dyads, and triads. The most complex among these arrays, which contain porphyrin moieties, are presented in the final chapter. They undergo photoinduced energy- and electron-transfer processes with long-lived charge-separated states, i.e. the fundamental processes that power artificial photosynthetic systems.
Abstract:
Ambient Intelligence (AmI) envisions a world where smart electronic environments are aware of and responsive to their context. People moving into these settings engage many computational devices and systems simultaneously, even if they are not aware of their presence. AmI stems from the convergence of three key technologies: ubiquitous computing, ubiquitous communication, and natural interfaces. The dependence on a large number of fixed and mobile sensors embedded in the environment makes Wireless Sensor Networks (WSNs) one of the most relevant enabling technologies for AmI. WSNs are complex systems made up of a number of sensor nodes: simple devices that typically embed a low-power computational unit (microcontrollers, FPGAs, etc.), a wireless communication unit, one or more sensors, and some form of energy supply (either batteries or energy scavenger modules). Low cost, low computational power, low energy consumption, and small size are characteristics that must be taken into consideration when designing and dealing with WSNs. In order to handle the large amount of data generated by a WSN, several multisensor data fusion techniques have been developed. The aim of multisensor data fusion is to combine data to achieve better accuracy and inferences than could be achieved by the use of a single sensor alone. In this dissertation we present our results in building several AmI applications suitable for a WSN implementation. The work can be divided into two main areas: multimodal surveillance and activity recognition. Novel techniques to handle data from a network of low-cost, low-power Pyroelectric InfraRed (PIR) sensors are presented. Such techniques allow the detection of the number of people moving in the environment, their direction of movement, and their position. We discuss how a mesh of PIR sensors can be integrated with a video surveillance system to increase its performance in people tracking.
Furthermore, we embed a PIR sensor within the design of a Wireless Video Sensor Node (WVSN) to extend its lifetime. Activity recognition is a fundamental block in natural interfaces. A challenging objective is to design an activity recognition system that is able to exploit a redundant but unreliable WSN. We present our work in building a novel activity recognition architecture for such a dynamic system. The architecture has a hierarchical structure in which simple nodes perform gesture classification and a high-level meta-classifier fuses a changing number of classifier outputs. We demonstrate the benefits of such an architecture in terms of increased recognition performance and robustness to faults and noise. Furthermore, we show how network lifetime can be extended through a performance-power trade-off. Smart objects can enhance the user experience within smart environments. We present our work in extending the capabilities of the Smart Micrel Cube (SMCube), a smart object used as a tangible interface within a tangible computing framework, through the development of a gesture recognition algorithm suitable for this device of limited computational power. Finally, the development of activity recognition techniques can greatly benefit from the availability of shared datasets. We report our experience in building a dataset for activity recognition. This dataset is freely available to the scientific community for research purposes and can be used as a testbench for developing, testing, and comparing different activity recognition techniques.
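The fusion step over a changing number of node-level classifiers can be sketched as a confidence-weighted vote over whichever nodes actually responded. This is a minimal illustration of the idea, not the thesis's architecture; the gesture labels and confidences are hypothetical:

```python
from collections import defaultdict

def fuse(node_outputs):
    """Meta-classifier sketch: fuse a variable number of node-level gesture
    classifications, each given as a (label, confidence) pair. Nodes that
    failed or were asleep simply do not appear in node_outputs, so the
    fusion degrades gracefully as the set of contributors changes."""
    scores = defaultdict(float)
    for label, confidence in node_outputs:
        scores[label] += confidence
    return max(scores, key=scores.get) if scores else None

# Three nodes answered this round; a fourth is down, and fusion still works.
print(fuse([("wave", 0.9), ("point", 0.4), ("wave", 0.6)]))  # → wave
```

Summing confidences rather than counting votes lets a single high-confidence node outweigh several uncertain ones, which is one simple way to tolerate noisy classifiers.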
Abstract:
This thesis investigates combinatorial and robust optimisation models for solving railway problems. Railway applications represent a challenging area for operations research. In fact, most problems in this context can be modelled as combinatorial optimisation problems, in which the number of feasible solutions is finite. Yet, despite the astonishing success in the field of combinatorial optimisation, the current state of algorithmic research faces severe difficulties with highly complex and data-intensive applications such as those dealing with optimisation issues in large-scale transportation networks. One of the main issues concerns imperfect information. The idea of Robust Optimisation, as a way to represent and handle mathematically systems with imprecisely known data, dates back to the 1970s. Unfortunately, none of those techniques has proved successfully applicable to one of the most complex and largest-scale (transportation) settings: that of railway systems. Railway optimisation deals with planning and scheduling problems over several time horizons. Disturbances are inevitable and severely affect the planning process. Here we focus on two compelling aspects of planning: robust planning and online (real-time) planning.
Abstract:
This thesis deals with inflation theory, focussing on the model of Jarrow & Yildirim, which is nowadays widely used for pricing inflation derivatives. After recalling the main results on short-rate and forward interest rate models, the dynamics of the main components of the market are derived. Then the most important inflation-indexed derivatives are explained (zero-coupon swap, year-on-year swap, cap, and floor), and their pricing is shown step by step. Calibration is explained and performed both with a common method and with a heuristic, non-standard one. The model is also enriched with credit risk, which makes it possible to take into account the possibility of bankruptcy of the counterparty to a contract. In this context, the general pricing method is derived with the introduction of defaultable zero-coupon bonds, and the Monte Carlo method is treated in detail and used to price a concrete example of a contract. Appendices: A: martingale measures, Girsanov's theorem, and the change of numeraire. B: some aspects of the theory of stochastic differential equations (SDEs); in particular, the solution of linear SDEs and the Feynman-Kac theorem, which shows the connection between SDEs and partial differential equations. C: some useful results about the normal distribution.
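The Monte Carlo step can be illustrated on the simplest of the products listed, the zero-coupon inflation swap, under a deliberately simplified model: a lognormal inflation index with constant drift and a flat nominal rate, not the full Jarrow-Yildirim dynamics. All parameter values are hypothetical:

```python
import math
import random

def mc_zc_inflation_swap(I0, drift, vol, r_nominal, K, T, notional,
                         n_paths, seed=0):
    """Monte Carlo price of the zero-coupon inflation swap payoff
        notional * [ I(T)/I(0) - (1 + K)^T ],
    with the inflation index I(t) modeled as geometric Brownian motion.
    A simplified stand-in for the Jarrow-Yildirim dynamics."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        I_T = I0 * math.exp((drift - 0.5 * vol**2) * T
                            + vol * math.sqrt(T) * z)
        payoff_sum += I_T / I0 - (1.0 + K) ** T
    # Discount the average payoff at the flat nominal rate.
    return math.exp(-r_nominal * T) * notional * payoff_sum / n_paths

price = mc_zc_inflation_swap(I0=100.0, drift=0.02, vol=0.01,
                             r_nominal=0.03, K=0.02, T=5.0,
                             notional=1_000_000, n_paths=50_000)
print(round(price, 2))
```

In this toy model the price has a closed form, exp(-rT) N (e^{drift T} - (1+K)^T), which makes it a convenient sanity check for the simulation before moving to dynamics with no analytic benchmark.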
Abstract:
The common thread of this thesis is the will to investigate the properties and behavior of assemblies. Groups of objects display peculiar properties, which can be very far from the simple sum of the properties of their components. This is all the more true the smaller the inter-object distance, i.e. the higher the density, and the smaller the container. “Confinement” is in fact a key concept in many of the topics explored and reported here. It can be conceived as a spatial limitation that nevertheless gives rise to unexpected processes and phenomena based on inter-object communication. Such phenomena eventually result in “non-linear properties”, responsible for the low predictability of large assemblies. Chapter 1 provides two insights into surface chemistry, namely (i) a supramolecular assembly based on orthogonal forces, and (ii) selective and sensitive fluorescent sensing in thin polymeric films. In chapters 2 to 4, the confinement of molecules plays a major role. Most of the work focuses on FRET within core-shell nanoparticles, investigated both through a simulation model and through experiments. Exciting results of great applicative interest are obtained, such as a method for tuning the emission wavelength at constant excitation and a way of overcoming self-quenching processes by setting up a competitive deactivation channel. We envisage applications of these materials as labels for multiplexing analysis and in all fields of fluorescence imaging where brightness coupled with biocompatibility and water solubility is required. Adducts of nanoparticles and molecular photoswitches are investigated in the context of super-resolution techniques for fluorescence microscopy. In chapter 5, a method is proposed to prepare a library of functionalized Pluronic F127, which gives access to a twofold “smart” nanomaterial, namely one that is both (i) luminescent and (ii) surface-functionalized (SCSSNPs). In chapter 6, the focus shifts to confinement effects on a larger size scale.
Moving from nanometers to micrometers, we investigate the interplay between microparticles flowing in microchannels, where a constriction affects the structure and dynamics of the colloidal paste at very long range.
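For reference, the FRET relation underlying the core-shell nanoparticle chapters, given here as the standard Förster expression (background, not the thesis's specific model):

```latex
% Forster resonance energy transfer efficiency for a donor-acceptor pair
% separated by distance r, with Forster radius R_0 (the distance at which
% the transfer efficiency is 50%):
E = \frac{1}{1 + \left( r / R_0 \right)^{6}} ,
```

The steep $r^{-6}$ dependence is what makes confinement inside a nanoparticle so effective: packing donors and acceptors within a few nanometers of each other drives the transfer efficiency close to unity.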
Abstract:
A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project was placed in the context of that effort. At the outset of our work, we gathered the industrial needs relevant to ESA and all the main European space stakeholders, and we consolidated a set of technical high-level requirements for their fulfillment. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling the high-level requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms with the semantics, assumptions, and constraints of the computational model; and (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility, and fitness of constituents (ii), (iii), and (iv) were already proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) rigorous separation of concerns, achieved with support for design views and careful allocation of concerns to dedicated software entities; (ii) support for the specification and model-based analysis of extra-functional properties; and (iii) the inclusion of space-specific concerns.
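The flavor of constituent (i) can be sketched in miniature: a component declares its provided and required interfaces explicitly, and extra-functional properties are attached as separate, statically inspectable annotations rather than buried in the implementation. All names and values below are hypothetical illustrations, not the thesis's component model:

```python
# Minimal sketch of a component with explicit provided/required ports and a
# separately declared extra-functional property (here a WCET budget), so a
# design-time tool can analyze the declaration without running any code.
# Every identifier below is a hypothetical placeholder.
from dataclasses import dataclass, field

@dataclass
class Port:
    name: str
    interface: str            # name of the operation contract on this port

@dataclass
class Component:
    name: str
    provides: list = field(default_factory=list)
    requires: list = field(default_factory=list)
    properties: dict = field(default_factory=dict)  # extra-functional annotations

telemetry = Component(
    name="TelemetryManager",
    provides=[Port("tm_out", "SendTelemetry")],
    requires=[Port("clock", "ReadTime")],
    properties={"wcet_ms": 2.5, "period_ms": 100},  # analyzable at design time
)
print(telemetry.properties["wcet_ms"])  # → 2.5
```

Keeping the functional contract (ports) and the extra-functional annotations in separate fields is one simple way to realize the separation of concerns the abstract describes.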