947 results for Transmission Line Method
Abstract:
We investigate the application of time-reversed electromagnetic wave propagation to transmit energy in a wireless power transmission system. “Time reversal” is a signal-focusing method that exploits the time-reversal invariance of the lossless wave equation to direct signals onto a single point inside a complex scattering environment. In this work, we explore the properties of time-reversed microwave pulses in a low-loss ray-chaotic chamber. We measure the spatial profile of the collapsing wavefront around the target antenna and demonstrate that time reversal can be used to transfer energy to a receiver in motion. We also show how nonlinear elements can be controlled to selectively focus on one target out of a group. Finally, we discuss the design of a rectenna for use in a time-reversal system. We explore the implications of these results and how they may be applied in future technologies.
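As context for the focusing idea, here is a minimal, hypothetical numpy sketch (not the chamber experiment described above): a random, decaying impulse response stands in for the ray-chaotic channel, a recorded pulse is time-reversed and sent back through the same channel, and the peak-to-average power ratio of the result quantifies the temporal refocusing at the target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: the ray-chaotic cavity is represented by a long, random,
# exponentially decaying impulse response between source and target antenna.
n = 5000
h = rng.normal(size=n) * np.exp(-np.arange(n) / 1500.0)

# A short interrogation pulse emitted at the target and recorded at the source.
pulse = np.sin(2 * np.pi * 0.05 * np.arange(100))
recorded = np.convolve(pulse, h)

# Time-reverse the recording and send it back through the same (reciprocal,
# low-loss) channel: the energy recompresses into a sharp peak at the target.
refocused = np.convolve(recorded[::-1], h)

# Peak-to-average power ratio quantifies the quality of the refocusing.
papr = np.max(refocused ** 2) / np.mean(refocused ** 2)
print(f"peak-to-average power ratio after time reversal: {papr:.0f}")
```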
Abstract:
Background: The nitration of tyrosine residues in proteins is associated with nitrosative stress, resulting in the formation of 3-nitrotyrosine (3-NT). 3-NT levels in biological samples have been associated with numerous physiological and pathological conditions. For this reason, several attempts have been made to develop methods that accurately quantify 3-NT in biological samples. Chromatographic methods appear to be very accurate, showing very good sensitivity and specificity. However, accurate quantification of this molecule, which is present at very low concentrations in both physiological and pathological states, is always a complex task and a target of intense research. Objectives: We aimed to develop a simple, rapid, low-cost and sensitive 3-NT quantification method for use in medical laboratories as an additional tool for diagnosis and/or treatment monitoring of a wide range of pathologies. We also aimed to evaluate the performance of the HPLC-based method developed here in a wide range of biological matrices. Materials and methods: All experiments were performed on a Hitachi LaChrom Elite® HPLC system and separation was carried out using a Lichrocart® 250-4 Lichrospher 100 RP-18 (5 μm) column. The method was further validated according to ICH guidelines. The biological matrices tested were serum, whole blood, urine, the B16 F-10 melanoma cell line, growth medium conditioned with the same cell line, and bacterial and yeast suspensions. Results: Of all the protocols tested, the best results were obtained using 0.5% CH3COOH:MeOH:H2O (15:15:70) as the mobile phase, with detection at wavelengths of 215, 276 and 356 nm, at 25 °C, and using a flow rate of 1 mL/min. With this protocol, it was possible to obtain a linear calibration curve (correlation coefficient = 1), limits of detection and quantification on the order of ng/mL, and a short analysis time (<15 minutes per sample). Additionally, the developed protocol allowed the successful detection and quantification of 3-NT in all biological matrices tested, with detection at 356 nm. Conclusion: The method described in this study, successfully developed and validated for 3-NT quantification, is simple, cheap and fast, rendering it suitable for analysis in a wide range of biological matrices.
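For illustration, a short Python sketch of the calibration arithmetic implied above, using the ICH Q2(R1) relations LOD = 3.3·σ/S and LOQ = 10·σ/S (σ: residual standard deviation of the regression, S: slope); the concentrations and peak areas below are placeholders, not data from the study.

```python
import numpy as np

# Hypothetical 3-NT calibration standards (ng/mL) and peak areas at 356 nm;
# placeholder numbers, not measurements from the study.
conc = np.array([25, 50, 100, 250, 500, 1000], dtype=float)
area = np.array([1.2e3, 2.4e3, 4.9e3, 12.1e3, 24.3e3, 48.8e3])

# Linear calibration: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]

# ICH Q2(R1): LOD = 3.3 * sigma / S and LOQ = 10 * sigma / S.
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"r = {r:.4f}, LOD = {lod:.1f} ng/mL, LOQ = {loq:.1f} ng/mL")
```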
Abstract:
Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more chemically complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite the effectiveness of this tool, it can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometer resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the quality of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of the post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method was proposed, named dictionary learning electron tomography (DLET). DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) in an adaptive way and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, sparsity is imposed on overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulated and real ET experiments on several morphologies are performed with a variety of setups. Reconstruction results validate its efficiency in both noiseless and noisy cases and show that it yields an improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select or whether the images used strictly satisfy the pre-conditions of a certain transform (e.g. strictly piecewise constant for Total Variation minimisation). This also avoids artifacts that can be introduced by specific sparsifying transforms (e.g. the staircase artifacts that may result when using Total Variation minimisation).
Moreover, this thesis shows how reliable, elementally sensitive tomography using EELS is possible with the aid of both the appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy from nanoparticles of an industrially important material. Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
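A schematic Python sketch of an alternating reconstruction loop in the spirit of DLET is given below. It is not the thesis implementation: scikit-image's radon/iradon stand in for the (S)TEM projector, scikit-learn's MiniBatchDictionaryLearning provides the adaptive patch dictionary, and the data-consistency step is a simple SIRT-like correction; all parameters are illustrative.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, resize
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)

# Undersampled "tilt series": a coarse, limited-angle Radon transform stands
# in for STEM projections; the phantom stands in for the unknown tomogram slice.
truth = resize(shepp_logan_phantom(), (128, 128))
angles = np.linspace(-70, 70, 36)
sino = radon(truth, theta=angles, circle=False)

recon = iradon(sino, theta=angles, circle=False, output_size=128)

for outer in range(5):
    # Step 1: learn a patch dictionary from the current estimate and use
    # sparse coding to denoise it (the sparsifying transform is adapted to
    # this tomogram rather than fixed in advance).
    patches = extract_patches_2d(recon, (8, 8))
    shape = patches.shape
    flat = patches.reshape(shape[0], -1)
    mean = flat.mean(axis=1, keepdims=True)
    dico = MiniBatchDictionaryLearning(n_components=64, alpha=0.5,
                                       transform_algorithm='omp',
                                       transform_n_nonzero_coefs=4,
                                       random_state=0)
    codes = dico.fit_transform(flat - mean)
    denoised_patches = (codes @ dico.components_ + mean).reshape(shape)
    denoised = reconstruct_from_patches_2d(denoised_patches, recon.shape)

    # Step 2: restore consistency with the measured tilt series
    # (a crude SIRT-like correction in place of the thesis' solver).
    residual = sino - radon(denoised, theta=angles, circle=False)
    recon = denoised + 0.1 * iradon(residual, theta=angles, circle=False,
                                    output_size=128, filter_name=None)

print("final data misfit:",
      np.linalg.norm(sino - radon(recon, theta=angles, circle=False)))
```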
Abstract:
Ethernet connections, which are widely used in many computer networks, can suffer from electromagnetic interference. Typically, a degradation of the data transmission rate can be perceived, as electromagnetic disturbances lead to corruption of data frames on the network medium. In this paper a software-based measuring method is presented that allows a direct assessment of the effects on the link layer. The results can be linked directly to the physical interaction without the influence of software-related effects on higher protocol layers. This provides a simple tool for a quantitative analysis of the disturbance of an Ethernet connection based on time-domain data. An example shows how the data can be used for further investigation of the mechanisms and for the detection of intentional electromagnetic attacks.
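One plausible way to obtain such link-layer time-domain data on a Linux host (a sketch, not the authors' tool) is to sample the kernel's per-interface frame and error counters at a fixed rate; the interface name below is an assumption, and the exact set of sysfs counters exposed depends on the driver.

```python
import time
from pathlib import Path

IFACE = "eth0"                      # interface under test (assumption)
COUNTERS = ["rx_packets", "rx_errors", "rx_crc_errors", "rx_frame_errors"]
STATS = Path(f"/sys/class/net/{IFACE}/statistics")

def read_counters():
    """Read the kernel's per-interface link-layer counters from Linux sysfs."""
    return {name: int((STATS / name).read_text()) for name in COUNTERS}

def monitor(duration_s=60, interval_s=0.1):
    """Sample counter increments at a fixed rate, yielding a time series of
    frame/error counts that can be correlated with the applied disturbance."""
    prev, t0 = read_counters(), time.monotonic()
    series = []
    while time.monotonic() - t0 < duration_s:
        time.sleep(interval_s)
        now = read_counters()
        delta = {k: now[k] - prev[k] for k in COUNTERS}
        series.append((time.monotonic() - t0, delta))
        prev = now
    return series

if __name__ == "__main__":
    for t, delta in monitor(duration_s=5):
        if delta["rx_errors"] or delta["rx_crc_errors"]:
            print(f"{t:8.2f} s  {delta}")
```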
Abstract:
Most approaches to stereo visual odometry reconstruct the motion from the tracking of point features along a sequence of images. However, in low-textured scenes it is often difficult to find a large set of point features, or they may not be well distributed over the image, so the behavior of these algorithms deteriorates. This paper proposes a probabilistic approach to stereo visual odometry based on the combination of both point and line segment features that works robustly in a wide variety of scenarios. The camera motion is recovered through non-linear minimization of the projection errors of both point and line segment features. In order to effectively combine the two types of features, their associated errors are weighted according to their covariance matrices, computed from the propagation of Gaussian error distributions in the sensor measurements. The method is, of course, computationally more expensive than using only one type of feature, but it still runs in real time on a standard computer and provides interesting advantages, including straightforward integration into any probabilistic framework commonly employed in mobile robotics.
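The covariance-weighted combination of heterogeneous features reduces, at each iteration, to a weighted Gauss-Newton step. The sketch below illustrates that step with hypothetical residuals, Jacobians and covariances; it is not the paper's code.

```python
import numpy as np

def weighted_gauss_newton_step(residuals, jacobians, covariances):
    """One Gauss-Newton step for the 6-DoF motion increment, where each
    feature's reprojection error is weighted by the inverse of its
    covariance (points and line segments contribute in the same way)."""
    H = np.zeros((6, 6))
    b = np.zeros(6)
    for r, J, S in zip(residuals, jacobians, covariances):
        W = np.linalg.inv(S)              # information matrix of this feature
        H += J.T @ W @ J
        b += J.T @ W @ r
    return np.linalg.solve(H, -b)         # increment for the pose parameters

# Hypothetical example: 2-D reprojection errors for points, 2-D endpoint
# distances to the projected line for segments; both enter the solver as
# plain residual/Jacobian/covariance triples.
rng = np.random.default_rng(1)
residuals = [rng.normal(size=2) * 0.5 for _ in range(40)]
jacobians = [rng.normal(size=(2, 6)) for _ in range(40)]
covariances = [np.diag(rng.uniform(0.2, 2.0, size=2)) for _ in range(40)]
print("pose increment:",
      weighted_gauss_newton_step(residuals, jacobians, covariances))
```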
Abstract:
Bodies On the Line: Violence, Disposable Subjects, and the Border Industrial Complex explores the construction of identity and notions of belonging within an increasingly privatized and militarized Border Industrial Complex. Specifically, the project interrogates how discourses of Mexican migrants as racialized, gendered, and hypersexualized “deviants” normalize violence against border crossers. Starting at the Juárez/El Paso border, I follow the expanding border, interrogating the ways that Mexican migrants, regardless of sexual orientation, have been constructed and disciplined according to racialized notions of “sexual deviance.” I engage a queer of color critique to argue that sexual deviance becomes a justification for targeting and containing migrant subjects. By focusing on the economic and racially motivated violence that the Border Industrial Complex does to Mexican migrant communities, I expand the critiques that feminists of color have long leveled against systemic violence done to communities of color through the prison industrial system. Importantly, this project contributes to transnational feminist scholarship by contextualizing border violence within the global circuits of labor, capital, and ideology that shape perceptions of border insecurity. The project contributes an interdisciplinary perspective that uses a multi-method approach to understand how border violence is exercised against Mexicans at the Mexico-US border. I use archival methods to ask how historical records housed at the National Border Patrol Museum and Memorial Library serve as political instruments that reinforce the contemporary use of violence against Mexican migrants. I also use semi-structured interviews with nine frequent border crossers to consider the various ways crossers defined and aligned themselves at the border. Finally, I analyze the master narratives that come to surround specific cases of border violence. To that end, I consider mainstream media coverage, legal proceedings, and policy to better understand the racialized, gendered, and sexualized logics of the violence.
Abstract:
We propose a positive, accurate moment closure for linear kinetic transport equations based on a filtered spherical harmonic (FP_N) expansion in the angular variable. The FP_N moment equations are accurate approximations to linear kinetic equations, but they are known to suffer from the occurrence of unphysical, negative particle concentrations. The new positive filtered P_N (FP_N+) closure is developed to address this issue. The FP_N+ closure approximates the kinetic distribution by a spherical harmonic expansion that is non-negative on a finite, predetermined set of quadrature points. With an appropriate numerical PDE solver, the FP_N+ closure generates particle concentrations that are guaranteed to be non-negative. Under an additional, mild regularity assumption, we prove that as the moment order tends to infinity, the FP_N+ approximation converges, in the L2 sense, at the same rate as the FP_N approximation; numerical tests suggest that this assumption may not be necessary. Numerical experiments on the challenging line source benchmark problem confirm that the FP_N+ method indeed produces accurate and non-negative solutions. To apply the FP_N+ closure to problems at large temporal-spatial scales, we develop a positive asymptotic-preserving (AP) numerical PDE solver. We prove that the proposed AP scheme maintains stability and accuracy with standard mesh sizes at large temporal-spatial scales, whereas generic numerical schemes require excessive refinement of the temporal-spatial meshes. We also show that the proposed scheme preserves positivity of the particle concentration under a time step restriction. Numerical results confirm that the proposed AP scheme is capable of solving linear transport equations at large temporal-spatial scales for which a generic scheme could fail. Constrained optimization problems are involved in the formulation of the FP_N+ closure to enforce non-negativity of the FP_N+ approximation on the set of quadrature points. These optimization problems can be written as strictly convex quadratic programs (CQPs) with a large number of inequality constraints. To efficiently solve the CQPs, we propose a constraint-reduced variant of a Mehrotra predictor-corrector algorithm with a novel constraint selection rule. We prove that, under appropriate assumptions, the proposed optimization algorithm converges globally to the solution at a locally q-quadratic rate. We test the algorithm on randomly generated problems, and the numerical results indicate that the combination of the proposed algorithm and the constraint selection rule outperforms the other constraint-reduced algorithms compared, especially for problems with many more inequality constraints than variables.
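The non-negativity correction can be illustrated on a simplified 1-D (slab-geometry, Legendre P_N) analogue: find the coefficient vector closest to the filtered moments whose reconstruction is non-negative at the quadrature points, a strictly convex QP. The sketch below solves this small QP with SciPy's SLSQP rather than the constraint-reduced Mehrotra predictor-corrector solver developed in the thesis; the moment vector is a made-up example.

```python
import numpy as np
from numpy.polynomial import legendre
from scipy.optimize import minimize

N = 7                                  # moment order (1-D slab-geometry P_N)
mu, _ = legendre.leggauss(16)          # quadrature points on [-1, 1]

# Vandermonde-type matrix: A[q, l] = P_l(mu_q), so A @ c evaluates the
# expansion of the angular distribution at every quadrature point.
A = legendre.legvander(mu, N)

# A hypothetical (filtered) moment vector whose reconstruction dips negative.
c0 = np.zeros(N + 1)
c0[0], c0[1], c0[3] = 1.0, 1.5, 0.4
assert (A @ c0).min() < 0

# FP_N+-style correction: the closest coefficient vector (least squares)
# whose point values are non-negative -- a strictly convex QP:
#     min ||c - c0||^2   s.t.   A c >= 0.
res = minimize(lambda c: np.sum((c - c0) ** 2),
               x0=c0,
               jac=lambda c: 2.0 * (c - c0),
               constraints=[{'type': 'ineq', 'fun': lambda c: A @ c,
                             'jac': lambda c: A}],
               method='SLSQP')
c_plus = res.x
print("min point value before/after:", (A @ c0).min(), (A @ c_plus).min())
```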
Abstract:
Despite the wide swath of applications where multiphase fluid contact lines exist, there is still no consensus on an accurate and general simulation methodology. Most prior numerical work has imposed one of the many dynamic contact-angle theories at solid walls. Such approaches are inherently limited by the accuracy of the theory. In fact, when inertial effects are important, the contact angle may be history dependent and, thus, any single mathematical function is inappropriate. Given these limitations, the present work has two primary goals: 1) create a numerical framework that allows the contact angle to evolve naturally with appropriate contact-line physics, and 2) develop equations and numerical methods such that contact-line simulations may be performed on coarse computational meshes.
Fluid flows affected by contact lines are dominated by capillary stresses and require accurate curvature calculations. The level set method was chosen to track the fluid interfaces because it allows interface curvature to be calculated accurately. Unfortunately, level set reinitialization suffers from an ill-posed mathematical problem at contact lines: a “blind spot” exists. Standard techniques to handle this deficiency are shown to introduce parasitic velocity currents that artificially deform freely floating (non-prescribed) contact angles. As an alternative, a new relaxation-equation reinitialization is proposed to remove these spurious velocity currents, and the concept is further explored with level-set extension velocities.
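For background, the sketch below shows the standard signed-distance reinitialization equation in one dimension (the classical formulation whose contact-line deficiency is discussed above, not the relaxation-equation variant proposed in this work); grid, initial field and iteration counts are illustrative.

```python
import numpy as np

def reinitialize_1d(phi, dx, iters=400, dtau=None):
    """Standard level-set reinitialization (background only): march
    d(phi)/d(tau) = S(phi0) * (1 - |d(phi)/dx|) to steady state so that phi
    becomes a signed-distance function with the same zero crossing."""
    dtau = dtau or 0.5 * dx
    phi0 = phi.copy()
    S = phi0 / np.sqrt(phi0**2 + dx**2)          # smoothed sign function
    for _ in range(iters):
        # One-sided differences with Godunov upwinding in 1-D.
        dminus = np.diff(phi, prepend=phi[:1]) / dx
        dplus = np.diff(phi, append=phi[-1:]) / dx
        grad_p = np.sqrt(np.maximum(np.maximum(dminus, 0)**2,
                                    np.minimum(dplus, 0)**2))
        grad_m = np.sqrt(np.maximum(np.minimum(dminus, 0)**2,
                                    np.maximum(dplus, 0)**2))
        grad = np.where(S > 0, grad_p, grad_m)
        phi = phi - dtau * S * (grad - 1.0)
    return phi

# Example: a distorted level-set field whose zero crossing sits at x = 0.3;
# after reinitialization |d(phi)/dx| is close to 1 across the domain.
x = np.linspace(0.0, 1.0, 201)
phi = np.tanh(5.0 * (x - 0.3)) * (1.0 + 2.0 * x)   # not a distance function
phi_sd = reinitialize_1d(phi, dx=x[1] - x[0])
print("max |grad| deviation:", np.abs(np.abs(np.gradient(phi_sd, x)) - 1).max())
```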
To capture contact-line physics, two classical boundary conditions, the Navier-slip velocity boundary condition and a fixed contact angle, are implemented in direct numerical simulations (DNS). DNS are found to converge only if the slip length is well resolved by the computational mesh. Unfortunately, since the slip length is often very small compared to the fluid structures, these simulations are not computationally feasible for large systems. To address the second goal, a new methodology is proposed which relies on the volumetric-filtered Navier-Stokes equations. Two unclosed terms, an average curvature and a viscous shear (VS), are proposed to represent the missing microscale physics on a coarse mesh.
All of these components are then combined into a single framework and tested for a water droplet impacting a partially wetting substrate. Very good agreement is found between the experimental measurements and the numerical simulation for the evolution of the contact diameter in time. Such a comparison would not be possible with prior methods, since the Reynolds number Re and the capillary number Ca are large. Furthermore, the experimentally approximated slip length ratio is well outside of the range currently achievable by DNS. This framework is a promising first step towards simulating complex physics in capillary-dominated flows at a reasonable computational expense.
Abstract:
We present measurements of the transmission spectra of 87Rb atoms at 780 nm in the vicinity of a nanofiber. A uniform distribution of fixed atoms around a nanofiber should produce a spectrum that is broadened towards the red due to shifts from the van der Waals potential. If the atoms are free, this potential also produces an attractive force that accelerates them until they collide with the fiber, which depletes the steady-state density of near-surface atoms. It is for this reason that measurements of the van der Waals interaction are sparse. We confirm this by measuring the spectrum of cold atoms from a magneto-optical trap around the fiber, revealing a symmetric line shape with nearly the natural linewidth of the transition. When we use an auxiliary 750 nm laser we are able to controllably desorb a steady flux of atoms from the fiber that reside near the surface (less than 50 nm) long enough to feel the van der Waals interaction and produce an asymmetric spectrum. We quantify the spectral asymmetry as a function of 750 nm laser power and find a maximum. Our model, which takes into account the change in the density distribution, qualitatively explains the observations. In the future this can be used as a tool to study atom-surface interactions more comprehensively.
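A toy calculation of the expected asymmetry (illustrative only, with arbitrary units and a made-up density profile, not the model used in the experiment): each atom at distance r from the surface is red-shifted by delta(r) = -C3/r^3, and the spectrum is the density-weighted sum of the shifted Lorentzians.

```python
import numpy as np

# Arbitrary units: detunings and shifts are in units of the natural
# linewidth Gamma; C3 and the density profile are illustrative placeholders.
gamma = 1.0
C3 = 2.0                                 # gives a shift of -2 Gamma at r = 1
r = np.linspace(0.5, 5.0, 400)           # distances from the surface
density = r**2 / (1.0 + r**2)            # depleted density close to the fiber

detuning = np.linspace(-15, 5, 600)
shift = -C3 / r**3
lorentz = 1.0 / (1.0 + ((detuning[:, None] - shift[None, :]) / (gamma / 2))**2)
spectrum = (lorentz * density[None, :]).sum(axis=1)
spectrum /= spectrum.max()

# The profile peaks near zero detuning but carries a long red tail from the
# near-surface atoms -- the asymmetry quantified in the experiment.
red = spectrum[detuning < -1].sum()
blue = spectrum[detuning > 1].sum()
print(f"red/blue wing weight ratio: {red / blue:.2f}")
```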
Abstract:
Rainflow counting methods convert a complex load time history into a set of load reversals for use in fatigue damage modeling. Rainflow counting methods were originally developed to assess fatigue damage associated with mechanical cycling where creep of the material under load was not considered a significant contributor to failure. However, creep is a significant factor in some cyclic loading cases, such as solder interconnects under temperature cycling. In this case, fatigue life models require the dwell time to account for stress relaxation and creep. This study develops a new version of the multi-parameter rainflow counting algorithm that provides a range-based dwell time estimation for use with time-dependent fatigue damage models. To show its applicability, the method is used to calculate the life of solder joints under a complex thermal cycling regime and is verified by experimental testing. An additional algorithm is developed in this study to provide data reduction of the rainflow counting results. This algorithm uses a damage model and a statistical test to determine which of the resultant cycles are statistically insignificant to a given confidence level. This makes the resulting data file smaller and allows a simplified load history to be reconstructed.
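For reference, a compact three-point rainflow counter in the style of ASTM E1049 is sketched below; it reproduces only the standard cycle extraction and does not implement the range-based dwell-time estimation or the statistical data-reduction step described in this abstract.

```python
def turning_points(series):
    """Reduce a load-time history to its sequence of reversals."""
    tp = [series[0]]
    for prev, cur, nxt in zip(series, series[1:], series[2:]):
        if (cur - prev) * (nxt - cur) < 0:
            tp.append(cur)
    tp.append(series[-1])
    return tp

def rainflow(series):
    """Three-point rainflow counting in the style of ASTM E1049.
    Returns (range, mean, count) tuples; count is 1.0 for full cycles
    and 0.5 for the residual half cycles."""
    stack, cycles = [], []
    for p in turning_points(series):
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])          # most recent range
            y = abs(stack[-2] - stack[-3])          # previous range
            if x < y:
                break
            if len(stack) == 3:                     # y contains the start
                cycles.append((y, 0.5 * (stack[0] + stack[1]), 0.5))
                stack.pop(0)
            else:                                   # y is a full cycle
                cycles.append((y, 0.5 * (stack[-2] + stack[-3]), 1.0))
                last = stack.pop(); stack.pop(); stack.pop()
                stack.append(last)
    for a, b in zip(stack, stack[1:]):              # leftover half cycles
        cycles.append((abs(b - a), 0.5 * (a + b), 0.5))
    return cycles

# The classic ASTM E1049 example history of peaks and valleys.
print(rainflow([-2, 1, -3, 5, -1, 3, -4, 4, -2]))
```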
Abstract:
Structural characteristics of combustion-synthesized, calcined and densified pure and doped nanoceria with trivalent cations of Er, Y, Gd, Sm and Nd were analyzed by X-ray diffraction (XRD) and high-resolution transmission electron microscopy (HRTEM). The results showed that the as-synthesized and calcined nanopowders were mesoporous and that the calculated lattice parameters were close to those of the theoretical ion-packing model. The effects of the dopants on the elastic modulus, microhardness and fracture toughness of sintered pure and doped ceria were investigated. It was observed that trivalent cation dopants increased the hardness of the ceria, whereas the fracture toughness and elastic modulus decreased.
Abstract:
Poster presented at the 17th Annual International Meeting of the Institute of Human Virology. Baltimore, 27-30 September 2015
Abstract:
Copper complexes containing inorganic ligands were loaded on a functionalized titania (F-TiO2) to obtain drug delivery systems. The as-received copper complexes and those released from titania were tested as toxic agents on different cancer cell lines. The sol–gel method was used for the synthesis and surface functionalization of the titania, as well as for loading the copper complexes, all in a single step. The resulting Cu/F-TiO2 materials were characterized by several techniques. An in vitro release test was developed using an aqueous medium. Different concentrations (15.6–1000 µg mL−1) of each copper complex, those loaded on titania (Cu/F-TiO2), functionalized titania, and cis-Pt as a reference material were incubated with RG2, C6, U373, and B16 cancer cell lines for 24 h. The morphology of the functionalized titania and of the different Cu/F-TiO2 materials consists of aggregated nanoparticles, which generate mesopores. The amorphous phase (in dominant proportion) and the anatase phase were the structures identified in the X-ray diffraction profiles, in agreement with high-resolution transmission electron microscopy. Theoretical studies indicate that the copper compounds were released by a Fickian diffusion mechanism. It was found that, regardless of the copper complex and the cell line used, low concentrations of each copper compound were sufficient to kill almost 100 % of the cancer cells. When the cancer cells were treated with increasing concentrations of the Cu/F-TiO2 materials, the number of surviving cells decreased. Both the copper complexes alone and those loaded on TiO2 had a higher toxic effect than cis-Pt.
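A "Fickian diffusion mechanism" conclusion of this kind is typically drawn by fitting the early part of the release curve to the Korsmeyer-Peppas power law M_t/M_inf = k·t^n and checking that the exponent n lies near the Fickian limit for the carrier geometry (roughly 0.43-0.5). The sketch below does this on placeholder data, not the study's measurements.

```python
import numpy as np

# Hypothetical cumulative-release data (fraction of loaded copper complex
# released vs. time in hours); placeholder numbers, not the study's data.
t = np.array([1, 2, 4, 8, 12, 24], dtype=float)
released = np.array([0.10, 0.14, 0.20, 0.28, 0.34, 0.48])

# Korsmeyer-Peppas power law  M_t / M_inf = k * t**n, fitted on the early
# portion of the curve (M_t/M_inf < 0.6) via log-log linear regression.
mask = released < 0.6
n, log_k = np.polyfit(np.log(t[mask]), np.log(released[mask]), 1)
print(f"release exponent n = {n:.2f}, k = {np.exp(log_k):.3f}")
# An exponent near the Fickian limit for the carrier geometry is what a
# "Fickian diffusion mechanism" refers to.
```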
Abstract:
The change in the carbonaceous skeleton of nanoporous carbons during their activation has received limited attention, unlike the counterpart process in an inert atmosphere. Here we adopt a multi-method approach to elucidate this change in a poly(furfuryl alcohol)-derived carbon activated using cyclic application of oxygen saturation at 250 °C followed by its removal (with carbon) at 800 °C in argon. The methods used include helium pycnometry, synchrotron-based X-ray diffraction (XRD) and associated radial distribution function (RDF) analysis, transmission electron microscopy (TEM) and, uniquely, electron energy-loss spectroscopy spectrum-imaging (EELS-SI), electron nanodiffraction and fluctuation electron microscopy (FEM). Helium pycnometry indicates that the solid skeleton of the carbon densifies during activation from 78% to 93% of the density of graphite. RDF analysis, EELS-SI and FEM all suggest this densification comes through in-plane growth of sp2 carbon out to the medium range without a commensurate increase in order normal to the plane. This process could be termed ‘graphenization’. The exact way in which this process occurs is not clear, but TEM images of the carbon before and after activation suggest it may come through removal of the more reactive carbon, breaking constraining cross-links and creating space that allows the remaining carbon material to migrate in an annealing-like process.
Abstract:
Purpose: To optimize the extraction conditions of polysaccharides from Polygonum perfoliatum L. (PSDP) and to evaluate their anti-tumor activity on the A549 cell line. Methods: Extraction of PSDP was optimized using a Box-Behnken design (BBD). Three factors of the response surface methodology (RSM), namely extraction time, ratio of water to raw material, and number of extractions, were employed to optimize the yield of PSDP. The cytotoxic effect of PSDP on the human lung carcinoma A549 cell line was evaluated in vitro, while its effects on the expression of caspase-3, caspase-9, Bcl-2 and Bax were determined by western blot assay. Results: BBD was significant and applicable to PSDP extraction. Based on the contour plots, response surface plots and variance analysis, the predicted optimum conditions for PSDP extraction were: 1.58 h (extraction time), 30.18 mL/g (ratio of water to raw material), and 2.02 (number of extractions). PSDP had a significant inhibitory effect on the growth of A549 cells in a concentration- and time-dependent manner (p < 0.05). After treatment with PSDP, caspase-3, caspase-9 and Bax were significantly up-regulated (p < 0.05), whereas Bcl-2 was down-regulated, all in a concentration-dependent manner. Conclusion: RSM analysis is an appropriate method to optimize PSDP extraction. The results also indicate that PSDP has a significant anti-tumor effect against A549 cells, most likely via induction of mitochondria-mediated apoptosis.
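A minimal sketch of the RSM step (placeholder design points and yields, not the paper's data): fit a full second-order model to Box-Behnken data by least squares and locate the optimum inside the coded region.

```python
import numpy as np
from scipy.optimize import minimize

# Coded Box-Behnken design for three factors (x1: extraction time,
# x2: water-to-material ratio, x3: number of extractions) with hypothetical
# PSDP yields -- placeholders, not the paper's measurements.
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0], [0, 0, 0], [0, 0, 0]], dtype=float)
y = np.array([4.1, 4.6, 4.4, 4.9, 4.0, 4.5, 4.3, 4.8,
              4.2, 4.5, 4.4, 4.7, 5.2, 5.1, 5.2])

def design_matrix(X):
    """Full second-order RSM model: intercept, linear, interaction, square."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3,
                            x1**2, x2**2, x3**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
predict = lambda x: design_matrix(np.atleast_2d(x)) @ beta

# Locate the optimum inside the coded region [-1, 1]^3.
res = minimize(lambda x: -predict(x)[0], x0=np.zeros(3),
               bounds=[(-1, 1)] * 3)
print("optimum (coded units):", res.x,
      "predicted yield:", predict(res.x)[0])
```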