Abstract:
The synthesis, hydrogelation, and aggregation-induced emission switching of a phenylenedivinylene bis-N-octyl pyridinium salt are described. Hydrogelation occurs as a consequence of pi-stacking, van der Waals, and electrostatic interactions that lead to a high gel melting temperature and significant mechanical properties at a very low weight percentage of the gelator. A morphology transition from fiber to coil to tube was observed depending on the concentration of the gelator. Variation in the added salt type, salt concentration, or temperature profoundly influenced the order of aggregation of the gelator molecules in aqueous solution. Formation of a novel chromophore assembly in this way leads to an aggregation-induced switch of the emission colors. The emission color switches from sky blue to white to orange, depending upon the extent of aggregation, through the mere addition of external inorganic salts. Remarkably, the salt effect on the assembly of such cationic phenylenedivinylenes in water follows the behavior predicted by the well-known Hofmeister effects. Mechanistic insights into these aggregation processes were obtained through counterion exchange studies. The aggregation-induced emission switching, which leads to room-temperature white-light emission from a single chromophore in a single solvent (water), is highly promising for optoelectronic applications.
Abstract:
In this paper, we present a novel approach that makes use of topic models based on Latent Dirichlet Allocation (LDA) for generating single-document summaries. Our approach is distinguished from other LDA-based approaches in that we identify the summary topics which best describe a given document and extract sentences only from those paragraphs within the document which are highly correlated with the summary topics. This ensures that our summaries always highlight the crux of the document without depending on the grammar and structure of the documents. Finally, we evaluate our summaries on the DUC 2002 single-document summarization corpus using ROUGE measures. Our summaries had higher ROUGE values and better semantic similarity with the documents than the DUC summaries.
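As a rough illustration of the idea described above, the following sketch (not the authors' implementation; it uses scikit-learn's LDA and a simple topic-alignment score, both assumptions on our part) fits a topic model over a document's paragraphs, picks the dominant "summary" topic, and extracts the paragraphs most correlated with it:

```python
# Hedged sketch of LDA-based extractive summarization: fit LDA on paragraphs,
# find the most prominent topic, extract the paragraphs most aligned with it.
# scikit-learn and the scoring heuristic are illustrative choices, not the
# paper's pipeline.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def summarize(paragraphs, n_topics=3, n_paragraphs=2):
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(paragraphs)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    theta = lda.fit_transform(X)                  # paragraph-topic weights
    summary_topic = theta.sum(axis=0).argmax()    # dominant topic overall
    ranked = np.argsort(-theta[:, summary_topic]) # paragraphs by alignment
    chosen = sorted(ranked[:n_paragraphs])        # keep document order
    return " ".join(paragraphs[i] for i in chosen)
```

In practice one would extract individual sentences from the chosen paragraphs rather than whole paragraphs, as the abstract describes.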
Abstract:
Super-resolution microscopy has tremendously advanced our understanding of cellular biophysics and biochemistry. Specifically, the 4pi fluorescence microscopy technique stands out because of its axial super-resolution capability. All types of 4pi microscopy techniques work well in conjunction with deconvolution techniques to remove artifacts due to side-lobes. In this regard, we propose a technique based on a spatial filter in a 4pi-type-C confocal setup to remove these artifacts. Using a special spatial filter, we have reduced the depth-of-focus. Interference of two similar depth-of-focus beams in a 4pi geometry results in substantial reduction of side-lobes. Studies show a reduction of side-lobes by 46% and 76% for the single- and two-photon variants compared to the 4pi-type-C confocal system. This is remarkable considering the resolving capability of the existing 4pi-type-C confocal microscopy. Moreover, the main lobe is found to be 150 nm for the proposed spatial filtering technique as compared to 690 nm for the state-of-the-art confocal system. Reconstruction of experimentally obtained 2PE-4pi data of a green fluorescent protein (GFP)-tagged mitochondrial network shows near elimination of artifacts arising from side-lobes. The proposed technique may find interesting applications in fluorescence microscopy, nano-lithography, and cell biology. (C) 2013 AIP Publishing LLC.
Abstract:
Electrical Impedance Tomography (EIT) is a computerized medical imaging technique which reconstructs the electrical impedance images of a domain under test from the boundary voltage-current data measured by EIT electronic instrumentation, using an image reconstruction algorithm. Being a computed tomography technique, EIT injects a constant current into the patient's body through surface electrodes surrounding the domain to be imaged (Omega) and tries to calculate the spatial distribution of electrical conductivity or resistivity of the closed conducting domain using the potentials developed at the domain boundary (partial Omega). Practical phantoms are essentially required to study, test and calibrate a medical EIT system in order to certify the system before applying it to patients for diagnostic imaging. The EIT phantoms are therefore required to generate boundary data for studying and assessing the instrumentation and inverse solvers in EIT. For proper assessment of an inverse solver of a 2D EIT system, a perfect 2D practical phantom is required. As practical phantoms are assemblies of objects with 3D geometries, developing a practical 2D phantom is a great challenge, and therefore the boundary data generated from practical phantoms with 3D geometry are found to be inappropriate for assessing a 2D inverse solver. Furthermore, the boundary data errors contributed by the instrumentation are also difficult to separate from the errors introduced by the 3D phantoms. Hence, error-free boundary data are essential to assess the inverse solver in 2D EIT. In this direction, a MATLAB-based Virtual Phantom for 2D EIT (MatVP2DEIT) is developed to generate accurate boundary data for assessing 2D-EIT inverse solvers and the image reconstruction accuracy.
MatVP2DEIT is a MATLAB-based computer program which simulates a phantom in the computer and generates the boundary potential data as outputs, using combinations of different phantom parameters as inputs to the program. Phantom diameter, inhomogeneity geometry (shape, size and position), number of inhomogeneities, applied current magnitude, background resistivity and inhomogeneity resistivity are all set as phantom variables, which are provided as input parameters to MatVP2DEIT for simulating different phantom configurations. A constant current injection is simulated at the phantom boundary with different current injection protocols and the boundary potential data are calculated. Boundary data sets are generated for different phantom configurations obtained with different combinations of the phantom variables, and the resistivity images are reconstructed using EIDORS. Boundary data of virtual phantoms containing inhomogeneities with complex geometries are also generated for different current injection patterns using MatVP2DEIT and the resistivity imaging is studied. The effect of the regularization method on image reconstruction is also studied with the data generated by MatVP2DEIT. Resistivity images are evaluated by studying the resistivity parameters and contrast parameters estimated from the elemental resistivity profiles of the reconstructed phantom domain. Results show that MatVP2DEIT generates accurate boundary data for different types of single or multiple objects, which are efficient and accurate enough to reconstruct the resistivity images in EIDORS. The spatial resolution studies show that resistivity imaging conducted with the boundary data generated by MatVP2DEIT with 2048 elements can reconstruct two circular inhomogeneities placed with a minimum distance (boundary to boundary) of 2 mm.
It is also observed that, in MatVP2DEIT with 2048 elements, the boundary data generated for a phantom with a circular inhomogeneity of diameter less than 7% of that of the phantom domain can produce resistivity images in EIDORS with a 1968-element mesh. Results also show that MatVP2DEIT accurately generates boundary data for neighbouring, opposite, reference and trigonometric current patterns, which are very suitable for resistivity reconstruction studies. MatVP2DEIT-generated data are also found suitable for studying the effect of different regularization methods on the reconstruction process. By comparing the reconstructed image with the original geometry made in MatVP2DEIT, it becomes easier to study the resistivity imaging procedures as well as the inverse solver performance. Using the proposed MatVP2DEIT software with modified domains, the cross-sectional anatomy of a number of body parts can be simulated on a PC and the impedance image reconstruction of human anatomy can be studied.
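The core of any such virtual phantom is a forward model that maps a conductivity distribution plus an injected current to boundary potentials. The toy sketch below (a resistor-network finite-difference model on a square grid; not MatVP2DEIT itself, and all names and parameters are illustrative) shows the principle: build the weighted graph Laplacian from the conductivity map, inject current between two boundary nodes, and solve the discrete Laplace equation.

```python
# Toy 2D EIT forward model (illustrative, not MatVP2DEIT): nodes on an n x n
# grid, edge conductances averaged from the conductivity map, current injected
# at src and withdrawn at sink, nodal potentials from the grounded Laplacian.
import numpy as np

def boundary_potentials(sigma, src, sink, current=1.0):
    """sigma: (n, n) conductivity map; src/sink: flat indices of boundary nodes
    (neither may be node 0, which is used as the ground reference here)."""
    n = sigma.shape[0]
    N = n * n
    idx = lambda r, c: r * n + c
    L = np.zeros((N, N))
    for r in range(n):
        for c in range(n):
            i = idx(r, c)
            for dr, dc in ((0, 1), (1, 0)):       # right and down neighbours
                rr, cc = r + dr, c + dc
                if rr < n and cc < n:
                    j = idx(rr, cc)
                    g = 0.5 * (sigma[r, c] + sigma[rr, cc])  # edge conductance
                    L[i, i] += g; L[j, j] += g
                    L[i, j] -= g; L[j, i] -= g
    b = np.zeros(N)
    b[src], b[sink] = current, -current
    L[0, :] = 0.0; L[0, 0] = 1.0; b[0] = 0.0      # ground node 0 (fix V=0)
    v = np.linalg.solve(L, b)
    border = [idx(r, c) for r in range(n) for c in range(n)
              if r in (0, n - 1) or c in (0, n - 1)]
    return v[border]
```

A real phantom simulator would use a finite-element mesh and electrode models, but the data flow (conductivity in, boundary potentials out) is the same.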
Abstract:
With the premise that electronic noise dominates mechanical noise in micromachined accelerometers, we present here a method to enhance the sensitivity and resolution at kHz bandwidth using mechanical amplification. This is achieved by means of a Displacement-amplifying Compliant Mechanism (DaCM) that is appended to the usual sensing element comprising a proof-mass and a suspension. A differential comb-drive arrangement is used for capacitive sensing. The DaCM is designed to match the stiffness of the suspension so that there is substantial net amplification without compromising the bandwidth. A spring-mass-lever model is used to estimate the lumped parameters of the system. A DaCM-aided accelerometer and another without a DaCM, both occupying the same footprint, are compared to show that the former gives enhanced sensitivity: 8.7 nm/g vs. 1.4 nm/g displacement at the sensing combs under static conditions. A prototype of the DaCM-aided micromachined accelerometer was fabricated using bulk micromachining. It was tested at the die level and then packaged on a printed circuit board with an off-the-shelf integrated chip for measuring the change in capacitance. Under dynamic conditions, the measured amplification factor at the output of the DaCM was observed to be about 11 times the displacement of the proof-mass, thus validating the concept of enhancing the sensitivity of accelerometers using mechanical amplifiers. The measured first in-plane natural frequency of the fabricated accelerometer was 6.25 kHz. The packaged accelerometer with the DaCM was measured to have a sensitivity of 26.7 mV/g at 40 Hz.
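The lumped spring-mass-lever estimate mentioned above reduces to two elementary formulas. The sketch below (with illustrative parameter values, not the authors' design data) computes the static displacement per g at the sensing combs, scaled by an assumed DaCM amplification factor, and the first natural frequency:

```python
# Lumped-parameter accelerometer estimates (illustrative sketch, not the
# paper's actual spring-mass-lever model or design values).
import math

def static_sensitivity(mass_kg, stiffness_N_per_m, amplification=1.0):
    """Static displacement (m) per 1 g of acceleration at the sensing combs;
    amplification > 1 models the DaCM's mechanical gain."""
    g = 9.81
    return amplification * mass_kg * g / stiffness_N_per_m

def natural_frequency_hz(mass_kg, stiffness_N_per_m):
    """First natural frequency of the undamped spring-mass system."""
    return math.sqrt(stiffness_N_per_m / mass_kg) / (2 * math.pi)
```

The trade-off the abstract describes is visible here: increasing sensitivity by lowering stiffness also lowers the natural frequency, whereas a displacement amplifier raises sensitivity without that penalty, provided its own stiffness is matched to the suspension.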
Abstract:
This work is a follow-up to [2, FUN 2010], which initiated a detailed analysis of the popular game of UNO(R). We consider the solitaire version of the game, which was shown to be NP-complete. In [2], the authors also demonstrate an O(n^(c^2)) algorithm, where c is the number of colors across all the cards, which implies, in particular, that the problem is polynomial time when the number of colors is a constant. In this work, we propose a kernelization algorithm, a consequence of which is that the problem is fixed-parameter tractable when the number of colors is treated as a parameter. This removes the exponential dependence on c and answers the question stated in [2] in the affirmative. We also introduce a natural and possibly more challenging version of UNO that we call "All Or None UNO". For this variant, we prove that even the single-player version is NP-complete, and we show a single-exponential FPT algorithm, along with a cubic kernel.
Abstract:
Package-board co-design plays a crucial role in determining the performance of high-speed systems. Although several commercial solutions exist for electromagnetic analysis and verification, the lack of Computer Aided Design (CAD) tools for SI-aware design and synthesis leads to longer design cycles and non-optimal package-board interconnect geometries. In this work, the functional similarities between package-board design and radio-frequency (RF) imaging are explored. Consequently, qualitative methods common in the imaging community, such as Tikhonov Regularization (TR) and the Landweber method, are applied to solve multi-objective, multi-variable package design problems. In addition, a new hierarchical iterative piecewise-linear algorithm is developed as a wrapper over LBP for an efficient solution in the design space.
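For reference, the two inversion methods named above have compact forms for a generic linear problem A x = b. The sketch below is a generic NumPy illustration (not the paper's package-design formulation; A is a placeholder operator): Tikhonov regularization in closed form, and the iterative Landweber update.

```python
# Generic illustrations of Tikhonov regularization and the Landweber method
# for A x = b (placeholder operator, not the paper's design problem).
import numpy as np

def tikhonov(A, b, lam):
    # Closed form: x = (A^T A + lam * I)^{-1} A^T b
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def landweber(A, b, steps=500, tau=None):
    # Iteration: x_{k+1} = x_k + tau * A^T (b - A x_k);
    # converges for 0 < tau < 2 / ||A||_2^2.
    if tau is None:
        tau = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x = x + tau * (A.T @ (b - A @ x))
    return x
```

Both are regularizing in the ill-posed setting: Tikhonov through the penalty weight lam, Landweber through early stopping of the iteration count.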
Abstract:
We present in this paper a new algorithm based on Particle Swarm Optimization (PSO) for solving Dynamic Single Objective Constrained Optimization (DCOP) problems. We have modified several parameters of the original particle swarm optimization algorithm, introducing new types of particles for local search and for detecting changes in the search space. The algorithm is tested on a known benchmark set and the results are compared with those of other contemporary works. We demonstrate the convergence properties using convergence graphs and also illustrate changes to the current benchmark problems for a more realistic correspondence to practical real-world problems.
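For context, the baseline such modifications build on is standard global-best PSO. The sketch below (a generic implementation with a simple additive constraint penalty; not the paper's modified dynamic variant, and all parameter values are illustrative) shows the velocity and position update rules:

```python
# Standard global-best PSO with an additive constraint penalty (generic
# illustration, not the paper's dynamic variant).
import numpy as np

def pso(f, penalty, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n, dim))          # particle positions
    v = np.zeros((n, dim))                        # particle velocities
    cost = lambda p: f(p) + penalty(p)            # penalized objective
    pbest = x.copy()
    pcost = np.array([cost(p) for p in x])
    gbest = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        c = np.array([cost(p) for p in x])
        improved = c < pcost
        pbest[improved], pcost[improved] = x[improved], c[improved]
        gbest = pbest[pcost.argmin()].copy()
    return gbest, pcost.min()
```

A dynamic-environment variant additionally re-evaluates stored bests when a change is detected (the abstract's "change-detecting" particles serve exactly that purpose).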
Abstract:
The interaction of a single bubble with a single vortex ring in water has been studied experimentally. Measurements of both the bubble dynamics and vorticity dynamics have been made to help understand the two-way coupled problem. The circulation strength of the vortex ring (Gamma) has been systematically varied, while keeping the bubble diameter (D_b) constant, with the bubble volume to vortex core volume ratio (V_R) also kept fixed at about 0.1. The other important parameter in the problem is a Weber number based on the vortex ring strength, We = 0.87 rho (Gamma/(2 pi a))^2 / (sigma/D_b), where a is the vortex core radius and sigma is the surface tension; this is varied over a large range, We = 3-406. The interaction between the bubble and the ring for each of the We cases broadly falls into four stages. Stage I is before capture of the bubble by the ring, where the bubble is drawn into the low-pressure vortex core, while in stage II the bubble is stretched in the azimuthal direction within the ring and gradually broken up into a number of smaller bubbles. Following this, in stage III the bubble break-up is complete and the resulting smaller bubbles slowly move around the core, and finally in stage IV the bubbles escape. Apart from the effect of the ring on the bubble, the bubble is also shown to significantly affect the vortex ring, especially at low We (We ~ 3). In these low-We cases, the convection speed drops significantly compared to the base case without a bubble, while the core appears to fragment with a resultant large decrease in enstrophy of about 50%. In the higher-We cases (We > 100), there are some differences in convection speed and enstrophy, but the effects are relatively small. The most dramatic effects of the bubble on the ring are found for thicker-core rings at low We (We ~ 3), with the vortex ring almost stopping after interacting with the bubble, and the core fragmenting into two parts.
The present idealized experiments exhibit many phenomena also seen in bubbly turbulent flows such as reduction in enstrophy, suppression of structures, enhancement of energy at small scales and reduction in energy at large scales. These similarities suggest that results from the present experiments can be helpful in better understanding interactions of bubbles with eddies in turbulent flows.
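The Weber number defined above is a direct calculation. The sketch below evaluates We = 0.87 rho (Gamma/(2 pi a))^2 / (sigma/D_b) for illustrative water-like property values (the inputs in the usage example are assumptions, not the experimental parameters):

```python
# Vortex-ring Weber number from the definition in the abstract:
# We = 0.87 * rho * (Gamma / (2 pi a))^2 / (sigma / D_b)
import math

def weber_number(rho, gamma, a, sigma, d_b):
    """rho: liquid density (kg/m^3); gamma: ring circulation (m^2/s);
    a: vortex core radius (m); sigma: surface tension (N/m);
    d_b: bubble diameter (m)."""
    return 0.87 * rho * (gamma / (2 * math.pi * a)) ** 2 / (sigma / d_b)

# Illustrative water values: rho = 1000 kg/m^3, sigma = 0.072 N/m,
# with assumed Gamma, a and D_b chosen only to land inside the We = 3-406 range.
we = weber_number(rho=1000.0, gamma=0.01, a=0.005, sigma=0.072, d_b=0.004)
```

The numerator is the dynamic pressure scale of the core azimuthal velocity Gamma/(2 pi a); the denominator is the capillary pressure scale of the bubble, so large We means ring-induced stresses dominate surface tension, consistent with the stronger break-up seen at high We.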
Abstract:
In this article, a Field Programmable Gate Array (FPGA)-based hardware accelerator for 3D electromagnetic extraction using the Method of Moments (MoM) is presented. As the number of nets or ports in a system increases, leading to a corresponding increase in the number of right-hand-side (RHS) vectors, the computational cost of multiple matrix-vector products presents a time bottleneck in a linear-complexity fast solver framework. In this work, an FPGA-based hardware implementation is proposed based on a two-level parallelization scheme: (i) matrix-level parallelization for a single RHS and (ii) pipelining for multiple RHS. The method is applied to accelerate electrostatic parasitic capacitance extraction of multiple nets in a Ball Grid Array (BGA) package. The acceleration is shown to be linearly scalable with FPGA resources, and speed-ups of over 10x against an equivalent software implementation on a 2.4 GHz Intel Core i5 processor are achieved using a Virtex-6 XC6VLX240T FPGA on Xilinx's ML605 board, with the implemented design operating at a 200 MHz clock frequency. (c) 2016 Wiley Periodicals, Inc. Microwave Opt Technol Lett 58:776-783, 2016
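The multiple-RHS workload described above batches naturally: k separate matrix-vector products A b_i equal one matrix-matrix product A B with the RHS vectors as columns, so the system matrix is streamed once and reused across all RHS, the same reuse the pipelined FPGA scheme exploits. A NumPy analogue (a toy dense stand-in, not the MoM fast solver itself):

```python
# Toy demonstration that k matrix-vector products with a shared matrix are one
# matrix-matrix product (dense stand-in for the MoM system, sizes illustrative).
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((64, 64))   # shared system matrix
B = rng.standard_normal((64, 8))    # 8 RHS vectors, one per net/port

# One matvec per RHS: A is read from memory 8 times.
one_by_one = np.column_stack([A @ B[:, i] for i in range(B.shape[1])])

# Batched: a single pass over A serves all 8 RHS (better data reuse).
batched = A @ B
assert np.allclose(one_by_one, batched)
```

On hardware the payoff is memory bandwidth: each element of A fetched once can be multiplied against all RHS columns in the pipeline instead of being re-fetched per vector.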
Abstract:
We report here a novel technique for high-speed laser drilling of Printed Circuit Boards (PCBs). A CNC solid-state laser-based system is developed to drill through and blind vias as an alternative to mechanical drilling. The system employs an acousto-optic Q-switched Nd:YAG laser, a computer control system and an X-Y moving table which can handle PCBs up to 400 x 400 mm. With a specially designed cavity, the laser system works in pulsed operation to generate pulses with widths down to 0.5 microseconds and maximum peak power over 10 kW at a 10 kHz repetition rate. Delivered by an improved optical beam transforming system, the focused laser beam can drill holes, including blind vias, on PCBs with diameters in the range of 0.1-0.4 mm and at up to 300-500 vias per second (depending on the construction of the PCBs). By means of the CNC X-Y moving system, laser pulses with superior pulse-to-pulse repeatability can be fired at desired locations on a PCB with high accuracy. This alternative technology for drilling through or blind vias on PCBs or PWBs (printed wiring boards) will clearly enhance the capability of printed board manufacturing.
Abstract:
Increasing the field of view of a holographic display while maintaining adequate image size is a difficult task. To address this problem, we designed a system that tessellates several sub-holograms into one large hologram at the output. The sub-holograms we generate are similar to a kinoform but without the paraxial approximation during computation. The sub-holograms are loaded onto a single spatial light modulator consecutively and relayed to the appropriate position at the output through a combination of optics and scanning reconstruction light. We review the method of computer-generated holography and describe the working principles of our system. Results from our proof-of-concept system are shown to have an improved field of view and reconstructed image size. ©2009 IEEE.
Abstract:
Computation technology has dramatically changed the world around us; you can hardly find an area where cell phones have not saturated the market, yet there is a significant lack of breakthroughs in integrating computers with biological environments. This is largely the result of the incompatibility of the materials used in the two environments; biological environments and experiments tend to require aqueous conditions. To aid in this development, chemists, engineers, physicists and biologists have begun to develop microfluidics to help bridge this divide. Unfortunately, microfluidic devices have required large external support equipment to run. This thesis presents a series of microfluidic methods that can help integrate engineering and biology by exploiting nanotechnology, pushing the field of microfluidics back toward its intended purpose: small, integrated biological and electrical devices. I demonstrate this goal by developing different methods and devices to (1) separate membrane-bound proteins with the use of microfluidics, (2) use optical technology to make fiber-optic cables into protein sensors, (3) generate new fluidic devices using semiconductor material to manipulate single cells, and (4) develop a new genetic microfluidic-based diagnostic assay that works with current PCR methodology to provide faster and cheaper results. All of these methods and systems can be used as components to build a self-contained biomedical device.