930 results for nanoparticle tracking analysis


Relevance: 100.00%

Abstract:

We consider a time-varying wireless fading channel equalized by an LMS decision feedback equalizer (DFE). We study how well this equalizer tracks the optimal MMSE-DFE (Wiener) equalizer. We model the channel by an autoregressive (AR) process. The LMS equalizer and the AR process are then jointly approximated by the solution of a system of ordinary differential equations (ODEs). Using these ODEs, we show via examples that the LMS equalizer moves close to the instantaneous Wiener filter after the initial transient. We also compare the LMS equalizer with the instantaneous optimal DFE (the commonly used Wiener filter) designed assuming perfect previous decisions and computed using a perfect channel estimate, which we call the IDFE. We show that the LMS equalizer outperforms the IDFE almost all the time after the initial transient.
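The tracking setup described above can be sketched in a few lines: a generic LMS filter following a channel whose taps drift as an AR(1) process. This is an illustrative toy (parameters and the single-filter structure are assumptions, not the paper's DFE configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch: an LMS filter tracking a slowly varying channel
# modelled as an AR(1) process (not the paper's exact DFE setup).
n_taps, n_samples, mu = 4, 5000, 0.05
h = np.zeros(n_taps)          # true time-varying channel taps
h[0] = 1.0
w = np.zeros(n_taps)          # LMS estimate of the channel
x = rng.standard_normal(n_samples + n_taps)

for k in range(n_samples):
    h = 0.999 * h + 0.01 * rng.standard_normal(n_taps)  # AR(1) channel drift
    u = x[k:k + n_taps]        # regressor (input window)
    d = h @ u                  # noiseless desired signal
    e = d - w @ u              # a-priori estimation error
    w = w + mu * e * u         # LMS weight update

# After the initial transient, w tracks the drifting h closely.
print(np.linalg.norm(w - h))
```

With a noiseless reference, the residual misalignment is dominated by the lag error of LMS behind the AR(1) drift, which is the quantity the ODE analysis in the abstract characterizes.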

Relevance: 100.00%

Abstract:

Interactions of turbulence, molecular transport, and energy transport, coupled with chemistry, play a crucial role in the evolution of flame surface geometry, propagation, annihilation, and the local extinction/re-ignition characteristics of intensely turbulent premixed flames. This study seeks to understand how these interactions affect flame surface annihilation in lean hydrogen-air premixed turbulent flames. Direct numerical simulations (DNS) are conducted at different parametric conditions with a detailed reaction mechanism and transport properties for hydrogen-air flames. A flame particle tracking (FPT) technique is used to follow specific flame surface segments. An analytical expression for the local displacement flame speed (S_d) of a temperature isosurface is considered, and the contributions of transport, chemistry, and kinematics to the displacement flame speed at different turbulence-flame interaction conditions are identified. In general, the displacement flame speed of the flame particles is found to increase with time for all conditions considered. This is because all flame surfaces, and the flame particles residing on them, eventually approach annihilation through reactant island formation at the end of the stretching and folding processes induced by turbulence. Statistics of the principal curvatures evolving in time, obtained using FPT, suggest that these islands are on average ellipsoids enclosing fresh reactants. Further examination shows that the increase in S_d is caused by the increasing negative curvature of the flame surface and the eventual homogenization of temperature gradients as these reactant islands shrink due to flame propagation and turbulent mixing. Finally, the evolution of the normalized, averaged displacement flame speed vs. stretch Karlovitz number is found to collapse onto a narrow band, suggesting that a unified description of flame speed dependence on stretch rate may be possible in the Lagrangian description. (C) 2015 The Combustion Institute. Published by Elsevier Inc. All rights reserved.
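For reference, the local displacement speed of a temperature isosurface is commonly written in the following standard form for a transported scalar (this is the textbook decomposition, not necessarily the paper's exact expression):

```latex
S_d = \frac{\dot{\omega}_T + \nabla \cdot \left( \rho \alpha \nabla T \right)}{\rho \, \lvert \nabla T \rvert}
```

where \(\dot{\omega}_T\) is the chemical (heat-release) source term and the divergence term is the molecular transport contribution; curvature effects enter through the tangential part of the diffusion term, which is how the negative-curvature mechanism in the abstract feeds into S_d.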

Relevance: 100.00%

Abstract:

The least-mean-fourth (LMF) algorithm is known for its fast convergence and low steady-state error, especially in sub-Gaussian noise environments. Recent work on normalised versions of the LMF algorithm has further enhanced its stability and performance in both Gaussian and sub-Gaussian noise environments. For example, the recently developed normalised LMF (XE-NLMF) algorithm is normalised by the mixed signal and error powers, weighted by a fixed mixed-power parameter. Unfortunately, the algorithm's performance depends on the selection of this mixing parameter. In this work, a time-varying mixed-power parameter technique is introduced to overcome this dependency. The convergence, transient, and steady-state behaviour of the proposed algorithm are analysed and verified through simulations. An enhancement in performance is obtained through the use of this technique in two different scenarios. Moreover, the tracking analysis of the proposed algorithm is carried out in the presence of two sources of nonstationarity: (1) carrier frequency offset between transmitter and receiver and (2) random variations in the environment. Close agreement between analysis and simulation results is obtained. The results show that, unlike in the stationary case, the steady-state excess mean-square error is not a monotonically increasing function of the step size. (c) 2007 Elsevier B.V. All rights reserved.
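The mixed-power normalisation can be sketched as follows. This is a schematic XE-NLMF-style update with a *fixed* mixing parameter gamma (the quantity the abstract proposes to make time-varying); all values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Schematic mixed-power-normalised LMF (XE-NLMF-style) system
# identification. gamma mixes input power and error power in the
# normalisation; the paper's time-varying gamma is not reproduced here.
n_taps, n_samples, mu, gamma, eps = 8, 30000, 0.02, 0.5, 1e-3
h = rng.standard_normal(n_taps)          # unknown system to identify
w = np.zeros(n_taps)
x = rng.standard_normal(n_samples + n_taps)

for k in range(n_samples):
    u = x[k:k + n_taps]
    v = 0.01 * (2.0 * rng.random() - 1.0)     # sub-Gaussian (uniform) noise
    e = (h @ u + v) - w @ u                   # a-priori error
    norm = gamma * (u @ u) + (1.0 - gamma) * e * e
    w = w + mu * (e ** 3) * u / (eps + norm)  # fourth-order (LMF) update

print(np.linalg.norm(w - h))
```

Because the error enters cubed, the normalisation by the mixed power `gamma*||u||^2 + (1-gamma)*e^2` is what keeps the update bounded for large errors, which is the stability benefit the abstract refers to.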

Relevance: 100.00%

Abstract:

Recent interest in the validation of general circulation models (GCMs) has been devoted to objective methods. A small number of authors have used direct synoptic identification of phenomena together with a statistical analysis to perform objective comparisons between various datasets. This paper describes a general method for performing the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing have been used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained with this method using data from a run of the Universities Global Atmospheric Modelling Project GCM.
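The trajectory-forming step above can be illustrated with a minimal greedy linker: feature points found at each time level are joined to the nearest point in the next frame, subject to a maximum displacement. This is a simple stand-in for the dynamic scene analysis technique in the paper (here, unmatched points simply terminate their trajectories):

```python
import numpy as np

def link_trajectories(frames, max_disp=1.0):
    """frames: list of (n_i, 2) arrays of feature-point coordinates,
    one array per time level. Returns a list of coordinate lists."""
    frames = [np.asarray(f, dtype=float) for f in frames]
    tracks = [[tuple(p)] for p in frames[0]]
    alive = list(range(len(tracks)))   # track index for each live point
    prev = frames[0]
    for pts in frames[1:]:
        next_alive, next_prev, used = [], [], set()
        for ti, p in zip(alive, prev):
            if len(pts) == 0:
                continue
            d = np.linalg.norm(pts - p, axis=1)   # distances to candidates
            j = int(np.argmin(d))
            if d[j] <= max_disp and j not in used:
                used.add(j)                        # greedy one-to-one match
                tracks[ti].append(tuple(pts[j]))
                next_alive.append(ti)
                next_prev.append(pts[j])
        alive, prev = next_alive, np.asarray(next_prev)
    return tracks

# Two features drifting in different directions over three time levels:
frames = [[(0.0, 0.0), (5.0, 5.0)],
          [(0.4, 0.0), (5.0, 5.4)],
          [(0.8, 0.0), (5.0, 5.8)]]
tracks = link_trajectories(frames)
print(tracks)
```

Real trackers replace the greedy nearest-neighbour rule with smoothness or velocity constraints, but the frame-to-frame linking structure is the same.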

Relevance: 100.00%

Abstract:

The increasing use of nanoparticles in the pharmaceutical industry is generating concomitant interest in developing nanomaterials that can rapidly penetrate into, and permeate through, biological membranes to facilitate drug delivery and improve the bioavailability of active pharmaceutical ingredients. Here, we demonstrate that the permeation of thiolated silica nanoparticles through porcine gastric mucosa can be significantly enhanced by their functionalization with either 5 kDa poly(2-ethyl-2-oxazoline) or poly(ethylene glycol). Nanoparticle diffusion was assessed using two independent techniques: nanoparticle tracking analysis and fluorescence microscopy. Our results show that poly(2-ethyl-2-oxazoline) and poly(ethylene glycol) have comparable abilities to enhance diffusion of silica nanoparticles in mucin dispersions and through the gastric mucosa. These findings provide a new strategy in the design of nanomedicines, by surface modification or nanoparticle core construction, for enhanced transmucosal drug delivery.
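The principle behind the nanoparticle tracking analysis used here is worth making concrete: the per-particle mean squared displacement (MSD) of a tracked trajectory yields a diffusion coefficient, and the Stokes-Einstein relation converts that to a hydrodynamic diameter. The values below are illustrative, not the paper's:

```python
import numpy as np

kB = 1.380649e-23          # Boltzmann constant, J/K

def hydrodynamic_diameter(msd_per_step, dt, T=298.15, eta=1.0e-3):
    """msd_per_step: 2-D MSD per frame interval (m^2); dt: frame time (s);
    eta: dynamic viscosity (Pa s). Returns the diameter in metres."""
    D = msd_per_step / (4.0 * dt)              # 2-D diffusion: MSD = 4 D t
    return kB * T / (3.0 * np.pi * eta * D)    # Stokes-Einstein relation

# A 100 nm particle in water at 25 C has D ~ 4.4e-12 m^2/s; feeding the
# corresponding MSD back in recovers the diameter:
D_true = kB * 298.15 / (3.0 * np.pi * 1.0e-3 * 100e-9)
d = hydrodynamic_diameter(4.0 * D_true * 0.033, 0.033)
print(d)
```

In a mucin dispersion the effective viscosity `eta` experienced by the particle is the unknown of interest, which is why enhanced diffusion of PEGylated particles translates directly into smaller apparent hindrance.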

Relevance: 100.00%

Abstract:

Department of Molecular Biophysics, NanoBioMedical Centre, Adam Mickiewicz University (UAM)

Relevance: 100.00%

Abstract:

Understanding nanoparticle diffusion within non-Newtonian biological and synthetic fluids is essential in designing novel formulations (e.g., nanomedicines for drug delivery, shampoos, lotions, coatings, paints, etc.), but is presently poorly defined. This study reports the diffusion of thiolated and PEGylated silica nanoparticles, characterized by small-angle neutron scattering, in solutions of various water-soluble polymers such as poly(acrylic acid) (PAA), poly(N-vinylpyrrolidone) (PVP), poly(ethylene oxide) (PEO), and hydroxyethylcellulose (HEC), probed using NanoSight nanoparticle tracking analysis. Results show that the diffusivity of nanoparticles is affected by their dimensions, medium viscosity, and, in particular, the specific interactions between nanoparticles and the macromolecules in solution; strong attractive interactions such as hydrogen bonding hamper diffusion. The water-soluble polymers retarded the diffusion of thiolated particles in the order PEO > PVP > PAA > HEC, whereas for PEGylated silica particles retardation followed the order PAA > PVP = HEC > PEO. In the absence of specific interactions with the medium, PEGylated nanoparticles exhibit enhanced mobility compared to their thiolated counterparts despite some increase in their dimensions.

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

BACKGROUND Cell-free foetal haemoglobin (HbF) has been shown to play a role in the pathology of preeclampsia (PE). In the present study, we aimed to further characterize the harmful effects of extracellular free haemoglobin (Hb) on the placenta. In particular, we investigated whether cell-free Hb affects the release of placental syncytiotrophoblast vesicles (STBMs) and their micro-RNA content. METHODS The dual ex-vivo perfusion system was used to perfuse isolated cotyledons from human placenta with medium alone (control) or medium supplemented with cell-free Hb. Perfusion medium from the maternal side of the placenta was collected at the end of all perfusion phases. The STBMs were isolated using ultracentrifugation at 10,000×g and 150,000×g (referred to as 10K and 150K STBMs). The STBMs were characterized using nanoparticle tracking analysis, identification of surface markers and transmission electron microscopy. RNA was extracted, and nine different micro-RNAs related to hypoxia, PE and Hb synthesis were selected for analysis by quantitative PCR. RESULTS All micro-RNAs investigated were present in the STBMs. MiR-517a, miR-141 and miR-517b were downregulated after Hb perfusion in the 10K STBMs. Furthermore, Hb was shown to be carried by the STBMs. CONCLUSION This study showed that Hb perfusion can alter the micro-RNA content of released STBMs. Of particular interest is the alteration of two placenta-specific micro-RNAs, miR-517a and miR-517b. We have also seen that STBMs may function as carriers of Hb into the maternal circulation.

Relevance: 100.00%

Abstract:

Methods based on visual estimation are still the most widely used for analysing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, with both datasets measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results for the 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered after 90 minutes was 10,012 m (standard deviation s = 1,024 m, coefficient of variation cv = 10.2%). A three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking, or jogging (5,537 ± 263 m), followed by moderate-speed running (1,731 ± 399 m), low-speed running (1,615 ± 351 m), high-speed running (691 ± 190 m) and sprinting (437 ± 171 m). The mean distance covered in the first half, 5,173 m (s = 394 m, cv = 7.6%), was significantly greater (p < 0.001) than the mean of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half. ©Journal of Sports Science and Medicine (2007).
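Given tracked positions, the distance and speed-band figures above reduce to simple arithmetic on frame-to-frame displacements. The sketch below shows the computation; the band thresholds and frame rate are illustrative assumptions, not DVideo's actual classification:

```python
import numpy as np

def distance_by_band(xy, fps, bands=(0.0, 3.0, 5.0, 7.0, np.inf)):
    """xy: (n, 2) player positions in metres; fps: frames per second.
    Returns (total distance, per-band distances) in metres."""
    xy = np.asarray(xy, dtype=float)
    step = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # metres per frame
    speed = step * fps                                  # instantaneous m/s
    total = step.sum()
    per_band = [step[(speed >= lo) & (speed < hi)].sum()
                for lo, hi in zip(bands[:-1], bands[1:])]
    return total, per_band

# Walking at 1 m/s for 4 frames, then sprinting at 8 m/s for 2 frames
# (sampled at 1 fps for readability):
xy = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0), (12, 0), (20, 0)]
total, per_band = distance_by_band(xy, fps=1.0)
print(total, per_band)
```

In practice the raw trajectories are smoothed before differencing, since pixel noise otherwise inflates the instantaneous speeds.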

Relevance: 90.00%

Abstract:

For the tracking of extrema associated with weather systems to be applied to a broad range of fields, it is necessary to remove a background field that represents the slowly varying large spatial scales. The sensitivity of the tracking analysis to the form of the background field removed is explored for the Northern Hemisphere winter storm tracks, for three contrasting fields from an integration of the U.K. Met Office's (UKMO) Hadley Centre Climate Model (HadAM3). Several methods for removing a background field are explored, from the simple subtraction of the climatology to the more sophisticated removal of the planetary scales. Two temporal filters are also considered, in the form of a 2-6-day Lanczos filter and a 20-day high-pass Fourier filter. The analysis indicates that simple subtraction of the climatology tends to change the nature of the systems to the extent that they are redistributed relative to the climatological background, resulting in very similar statistical distributions for positive and negative anomalies. The optimal planetary-wave filter removes total wavenumbers less than or equal to a number in the range 5-7, resulting in distributions more easily related to particular types of weather system. Of the temporal filters, the 2-6-day bandpass filter is found to have a detrimental impact on the individual weather systems, resulting in the storm tracks having a weak waveguide type of behavior. The 20-day high-pass temporal filter is less aggressive than the 2-6-day filter and produces results falling between those of the climatological and 2-6-day filters.
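A one-dimensional analogue makes the planetary-wave filtering concrete: remove all wavenumbers up to a cutoff from a periodic field, keeping the synoptic scales. The paper filters total spherical wavenumbers; this sketch uses a plain FFT on a periodic circle of longitudes, with an assumed cutoff of 5:

```python
import numpy as np

def remove_background(field, n_cut=5):
    """Zero the mean and wavenumbers 1..n_cut of a periodic 1-D field,
    returning the remaining (synoptic-scale) anomaly."""
    coeffs = np.fft.rfft(field)
    coeffs[:n_cut + 1] = 0.0          # zero mean + wavenumbers 1..n_cut
    return np.fft.irfft(coeffs, n=field.size)

lon = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
field = 10.0 * np.cos(3 * lon) + 2.0 * np.cos(8 * lon)  # planetary + synoptic
anom = remove_background(field, n_cut=5)
# The wavenumber-3 planetary wave is removed; wavenumber 8 survives.
print(np.max(np.abs(anom - 2.0 * np.cos(8 * lon))))
```

The abstract's finding that cutoffs in the range 5-7 work best corresponds to choosing `n_cut` just below the scales of the weather systems being tracked.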

Relevance: 90.00%

Abstract:

The hemocompatibility of nanoparticles is of critical importance for their systemic administration as drug delivery systems. Formulations of lipid-core nanocapsules, stabilized with polysorbate 80-lecithin and either uncoated or coated with chitosan (LNC and LNC-CS), were prepared and characterized by laser diffraction (D[4,3]: 129 and 134 nm), dynamic light scattering (119 nm and 133 nm), nanoparticle tracking (D50: 124 and 139 nm) and particle mobility (zeta potential: -15.1 mV and +9.3 mV) analyses. In vitro hemocompatibility studies were carried out with mixtures of nanocapsule suspensions in human blood at 2% and 10% (v/v). The prothrombin time showed no significant change, independently of the nanocapsule surface potential or its concentration in plasma. Regarding the activated partial thromboplastin time, both suspensions at 2% (v/v) in plasma did not influence the clotting time. Even though suspensions at 10% (v/v) in plasma decreased the clotting times (p < 0.05), the values were within the normal range. The ability of plasma to activate the coagulation system was maintained after the addition of the formulations. Suspensions at 2% (v/v) in blood showed no significant hemolysis or platelet aggregation. In conclusion, the lipid-core nanocapsules, uncoated or coated with chitosan, are hemocompatible, representing a potentially innovative nanotechnological formulation for intravenous administration. (C) 2012 Elsevier B.V. All rights reserved.

Relevance: 90.00%

Abstract:

It is well known that constant-modulus-based algorithms present a large mean-square error for high-order quadrature amplitude modulation (QAM) signals, which may compromise the switching to decision-directed algorithms. In this paper, we introduce a regional multimodulus algorithm for blind equalization of QAM signals that performs similarly to the supervised normalized least-mean-squares (NLMS) algorithm, independently of the QAM order. We find a theoretical relation between the coefficient vector of the proposed algorithm and the Wiener solution, and also provide theoretical models for the steady-state excess mean-square error in a nonstationary environment. The proposed algorithm, in conjunction with strategies to speed up its convergence and avoid divergence, can bypass the switching mechanism between the blind mode and the decision-directed mode. (c) 2012 Elsevier B.V. All rights reserved.
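For context, the classic multimodulus algorithm (MMA) that the paper refines matches the real and imaginary parts of the equalizer output to a dispersion constant R separately. The sketch below shows that baseline update only; the paper's *regional* refinement (adapting the modulus per decision region) is not reproduced, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def mma_equalize(x, n_taps=11, mu=1e-3, R=0.5):
    """x: received complex baseband samples.
    R = E[s_R^4] / E[s_R^2] for the real part s_R of the constellation."""
    w = np.zeros(n_taps, dtype=complex)
    w[n_taps // 2] = 1.0                  # centre-spike initialisation
    for k in range(n_taps, len(x) + 1):
        u = x[k - n_taps:k][::-1]         # regressor, most recent first
        y = w @ u                         # equalizer output
        # separate real/imaginary multimodulus error terms
        e = y.real * (R - y.real ** 2) + 1j * y.imag * (R - y.imag ** 2)
        w = w + mu * e * np.conj(u)       # stochastic-gradient update
    return w

# Sanity check: with a distortion-free channel and R matched to QPSK
# symbols (+-1 +-1j)/sqrt(2), the centre spike is already a stationary
# point of the cost, so the weights barely move:
s = (rng.choice([-1.0, 1.0], 2000) + 1j * rng.choice([-1.0, 1.0], 2000)) / np.sqrt(2)
w = mma_equalize(s)
```

Because the error is defined per quadrature component, MMA also corrects carrier phase, which is one reason it is preferred over the plain constant-modulus algorithm for QAM.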

Relevance: 80.00%

Abstract:

During the course of several natural disasters in recent years, Twitter has been found to play an important role as an additional medium for many-to-many crisis communication. Emergency services are successfully using Twitter to inform the public about current developments, and are increasingly also attempting to source first-hand situational information from Twitter feeds (such as relevant hashtags). The further study of the uses of Twitter during natural disasters relies, however, on the development of flexible and reliable research infrastructure for tracking and analysing Twitter feeds at scale and in close to real time. This article outlines two approaches to the development of such infrastructure: one which builds on the readily available open source platform yourTwapperkeeper to provide a low-cost, simple, and basic solution; and one which establishes a more powerful and flexible framework by drawing on highly scalable, state-of-the-art technology.