985 results for Particle method


Relevance: 30.00%

Abstract:

Energy transmission through a box-shaped floating breakwater (FB) is examined, under simplified conditions, by using the smoothed particle hydrodynamics (SPH) method, a mesh-free particle numerical approach. The efficiency of the structure is assessed in terms of the coefficient of transmission as a function of the wave period and the location of the floating breakwater relative to the zone to be protected. Preliminary results concerning wave energy transmission reveal a clear improvement of the efficiency as the wave period decreases and an important role of the bathymetry.
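
The transmission coefficient used here is the ratio of transmitted to incident wave height, K_t = H_t / H_i. A minimal sketch of how it could be evaluated from two wave-gauge records, one seaward and one leeward of the breakwater (the gauge data and the 4-sigma significant-wave-height estimator are illustrative assumptions, not taken from the study):

```python
import numpy as np

def transmission_coefficient(eta_incident, eta_transmitted):
    """Transmission coefficient K_t = H_t / H_i, with significant
    wave height estimated as 4 * std of the surface elevation."""
    h_i = 4.0 * np.std(eta_incident)     # significant wave height, seaward side
    h_t = 4.0 * np.std(eta_transmitted)  # significant wave height, leeward side
    return h_t / h_i

# Example: synthetic gauge records (hypothetical data)
t = np.linspace(0.0, 60.0, 6000)
eta_in = 0.5 * np.sin(2 * np.pi * t / 8.0)   # 8 s incident wave, 0.5 m amplitude
eta_out = 0.2 * np.sin(2 * np.pi * t / 8.0)  # attenuated transmitted wave
print(f"K_t = {transmission_coefficient(eta_in, eta_out):.2f}")  # ~0.40
```

A K_t close to 0 indicates an effective breakwater, while K_t = 1 means the structure transmits all incident wave energy.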

Relevance: 30.00%

Abstract:

Particle concentration is a principal factor that affects the erosion rate of solid surfaces under particle impact, such as at pipe bends in pneumatic conveyors; it is well known that a reduction in the specific erosion rate occurs at high particle concentrations, a phenomenon referred to as the "shielding effect". The cause of shielding is believed to be an increased likelihood of inter-particle collisions, the high collision probability between incoming and rebounding particles reducing the frequency and severity of particle impacts on the target surface. In this study, the effects of particle concentration on the erosion of a mild steel bend surface have been investigated in detail using three different particulate materials on an industrial-scale pneumatic conveying test rig. The materials were chosen so that two had the same particle density but very different particle sizes, whereas two had very similar particle sizes but very different particle densities. Experimental results confirm the shielding effect due to high particle concentration and show that particle density has a far more significant influence than particle size on the magnitude of the shielding effect. A new method of correcting for the change in erosivity of the particles during repeated handling, so as to take this factor out of the data, has been established and appears to be successful. Moreover, a novel empirical model of the shielding effect has been used, in terms of an erosion resistance which appears to decrease linearly as the particle concentration decreases. With the model it is possible to find the specific erosion rate as the particle concentration tends to zero, and conversely to predict how the specific erosion rate changes at finite values of particle concentration; this is critical to enable component life to be predicted from erosion tester results, as the variation of the shielding effect with concentration differs in these two scenarios. In addition, a previously unreported phenomenon has been recorded: a particulate material whose erosivity steadily increased during repeated impacts.
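
The linear-resistance model described above lends itself to a simple fitting procedure. A hedged sketch, with hypothetical numbers and units, of how the specific erosion rate at vanishing concentration could be extrapolated and finite-concentration rates predicted:

```python
import numpy as np

# Hypothetical measurements: particle concentration (kg particles / kg air)
# and specific erosion rate (mg eroded / kg of particles conveyed).
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
erosion = np.array([9.1, 8.3, 7.0, 5.3, 3.6])

# Model from the abstract: erosion *resistance* (reciprocal of the
# specific erosion rate) decreases linearly as concentration decreases,
# i.e. R(c) = a + b*c with b > 0 (the shielding contribution).
resistance = 1.0 / erosion
b, a = np.polyfit(conc, resistance, 1)

# Limit of zero concentration: no shielding, maximum specific erosion rate.
print(f"specific erosion rate as c -> 0: {1.0 / a:.1f} mg/kg")

# Prediction at a finite concentration, e.g. c = 3.0:
c = 3.0
print(f"predicted erosion rate at c = {c}: {1.0 / (a + b * c):.1f} mg/kg")
```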

Relevance: 30.00%

Abstract:

The lattice Boltzmann method is a popular approach for simulating hydrodynamic interactions in soft matter and complex fluids. The solvent is represented on a discrete lattice whose nodes are populated by particle distributions that propagate on the discrete links between the nodes and undergo local collisions. On large length and time scales, the microdynamics leads to a hydrodynamic flow field that satisfies the Navier-Stokes equation. In this thesis, several extensions to the lattice Boltzmann method are developed. In complex fluids, for example suspensions, Brownian motion of the solutes is of paramount importance. However, it cannot be simulated with the original lattice Boltzmann method because the dynamics is completely deterministic. It is possible, though, to introduce thermal fluctuations in order to reproduce the equations of fluctuating hydrodynamics. In this work, a generalized lattice gas model is used to systematically derive the fluctuating lattice Boltzmann equation from statistical mechanics principles. The stochastic part of the dynamics is interpreted as a Monte Carlo process, which is then required to satisfy the condition of detailed balance. This leads to an expression for the thermal fluctuations which implies that it is essential to thermalize all degrees of freedom of the system, including the kinetic modes. The new formalism guarantees that the fluctuating lattice Boltzmann equation is simultaneously consistent with both fluctuating hydrodynamics and statistical mechanics. This establishes a foundation for future extensions, such as the treatment of multi-phase and thermal flows. An important range of applications for the lattice Boltzmann method is formed by microfluidics. Fostered by the "lab-on-a-chip" paradigm, there is an increasing need for computer simulations which are able to complement the achievements of theory and experiment. Microfluidic systems are characterized by a large surface-to-volume ratio and, therefore, boundary conditions are of special relevance. On the microscale, the standard no-slip boundary condition used in hydrodynamics has to be replaced by a slip boundary condition. In this work, a boundary condition for lattice Boltzmann is constructed that allows the slip length to be tuned by a single model parameter. Furthermore, a conceptually new approach for constructing boundary conditions is explored, where the reduced symmetry at the boundary is explicitly incorporated into the lattice model. The lattice Boltzmann method is systematically extended to the reduced symmetry model. In the case of a Poiseuille flow in a plane channel, it is shown that a special choice of the collision operator is required to reproduce the correct flow profile. This systematic approach sheds light on the consequences of the reduced symmetry at the boundary and leads to a deeper understanding of boundary conditions in the lattice Boltzmann method. This can help to develop improved boundary conditions that lead to more accurate simulation results.
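
The stream-and-collide dynamics summarized above can be made concrete with a minimal single-relaxation-time (BGK) D2Q9 update. This is a generic textbook sketch, not the fluctuating or slip-boundary variants developed in the thesis:

```python
import numpy as np

# D2Q9 velocity set and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
tau = 0.8  # relaxation time; kinematic viscosity nu = (tau - 0.5) / 3

def equilibrium(rho, ux, uy):
    """Second-order Maxwellian equilibrium f_i^eq(rho, u)."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3 * cu + 4.5 * cu**2 - 1.5 * usq)

def lbm_step(f):
    # streaming: populations propagate along their lattice links
    for i, (cx, cy) in enumerate(c):
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    # collision: local BGK relaxation toward equilibrium
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    return f - (f - equilibrium(rho, ux, uy)) / tau

# initialize a quiescent 32x32 fluid and advance a few steps
f = equilibrium(np.ones((32, 32)), np.zeros((32, 32)), np.zeros((32, 32)))
for _ in range(10):
    f = lbm_step(f)
```

On top of this deterministic update, the thesis adds thermal noise to the collision step (constrained by detailed balance) and modified boundary rules; neither is shown here.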

Relevance: 30.00%

Abstract:

Most ocean-atmosphere exchanges take place in polar environments due to the low temperatures, which favor the absorption of atmospheric gases, in particular CO2. For this reason, alterations of the biogeochemical cycles in these areas can have a strong impact on the global climate. With the aim of contributing to the definition of the mechanisms regulating biogeochemical fluxes, we have analyzed the particles collected in the Ross Sea in different years (ROSSMIZE, BIOSESO 1 and 2, ROAVERRS and ABIOCLEAR projects) at two sites (moorings A and B). A more efficient method to prepare sediment trap samples for analysis has been developed. We have also processed satellite data of sea ice, chlorophyll-a and diatom concentration. At both sites, in each year considered, there was high seasonal and inter-annual variability of the biogeochemical fluxes, closely correlated with sea ice cover and primary productivity. The comparison between the samples collected at moorings A and B in 2008 highlighted the main differences between the two sites. Particle fluxes at mooring A, located in a polynya area, are higher than those at mooring B and occur about a month earlier. In the mooring B area it was possible to correlate the particle fluxes with sea ice concentration anomalies and with atmospheric changes in response to the El Niño-Southern Oscillation. In 1996 and 1999, La Niña years, sea ice concentrations in this area were lower than in 1998, an El Niño year. The inverse correlation was found for 2005 and 2008. In the mooring A area, significant differences in mass and biogenic fluxes were recorded between 2005 and 2008. This highlighted the high variability of lateral advection processes and connected them to the physical forcing.

Relevance: 30.00%

Abstract:

Reactive halogen compounds are known to play an important role in a wide variety of atmospheric processes, such as the atmospheric oxidation capacity and coastal new particle formation. In this work, novel analytical approaches combining diffusion denuder/impinger sampling techniques with gas chromatographic–mass spectrometric (GC–MS) determination are developed to measure activated chlorine compounds (HOCl and Cl2), activated bromine compounds (HOBr, Br2, BrCl, and BrI), activated iodine compounds (HOI and ICl), and molecular iodine (I2). The denuder/GC–MS methods have been applied to field measurements in the marine boundary layer (MBL). High mixing ratios (of the order of 100 ppt) of activated halogen compounds and I2 are observed in the coastal MBL in Ireland, which explains the ozone destruction observed. The emission of I2 is found to correlate inversely with tidal height and positively with the levels of O3 in the surrounding air. In addition, the release is found to be dominated by algae species composition and biomass density, which supports the "hot-spot" hypothesis of atmospheric iodine chemistry. The observations of elevated I2 concentrations substantially support the existence of higher concentrations of littoral iodine oxides and thus the connection to the strong ultra-fine particle formation events in the coastal MBL.

Relevance: 30.00%

Abstract:

In this thesis, the influence of composition changes on the glass transition behavior of binary liquids in two and three spatial dimensions (2D/3D) is studied in the framework of mode-coupling theory (MCT). The well-established MCT equations are generalized to isotropic and homogeneous multicomponent liquids in arbitrary spatial dimensions. Furthermore, a new method is introduced which allows a fast and precise determination of special properties of glass transition lines. The new equations are then applied to the following model systems: binary mixtures of hard disks/spheres in 2D/3D, binary mixtures of dipolar point particles in 2D, and binary mixtures of dipolar hard disks in 2D. Some general features of the glass transition lines are also discussed. The direct comparison of the binary hard disk/sphere models in 2D/3D shows similar qualitative behavior. In particular, for binary mixtures of hard disks in 2D the same four so-called mixing effects are identified as were found before by Götze and Voigtmann for binary hard spheres in 3D [Phys. Rev. E 67, 021502 (2003)]. For instance, depending on the size disparity, adding a second component to a one-component liquid may lead to a stabilization of either the liquid or the glassy state. The MCT results for the 2D system are in qualitative agreement with available computer simulation data. Furthermore, the glass transition diagram found for binary hard disks in 2D strongly resembles the corresponding random close packing diagram. Concerning dipolar systems, it is demonstrated that the experimental system of König et al. [Eur. Phys. J. E 18, 287 (2005)] is well described by binary point dipoles in 2D, through a comparison between the experimental partial structure factors and those from computer simulations. For such mixtures of point particles it is demonstrated that MCT always predicts a plasticization effect, i.e. a stabilization of the liquid state due to mixing, in contrast to binary hard disks in 2D or binary hard spheres in 3D. The predicted plasticization effect is in qualitative agreement with experimental results. Finally, a glass transition diagram for binary mixtures of dipolar hard disks in 2D is calculated. These results demonstrate that at higher packing fractions there is a competition between the mixing effects occurring for binary hard disks in 2D and those for binary point dipoles in 2D.

Relevance: 30.00%

Abstract:

The fabrication of polymer solar cells from aqueous solution represents an attractive alternative to conventional solvent-based formulation. The advantages of solar cells prepared from aqueous solution lie in particular in the environmentally friendly production process and in the possibility of generating printable optoelectronic devices. The processability of hydrophobic semiconductors in an aqueous medium is achieved by dispersing the materials in the form of nanoparticles. The semiconductors are transferred into a dispersion via the solvent evaporation method. The idea of using particle-based solar cells has been implemented before, but a precise characterization of the particles and a comprehensive understanding of the entire fabrication process were lacking. The goal of this work is therefore to gain detailed insight into the fabrication process of particle-based solar cells, to uncover possible weaknesses, and to eliminate them in order to improve future applications. For the preparation of solar cells from aqueous dispersions, poly(3-hexylthiophene-2,5-diyl)/[6,6]-phenyl-C61-butyric acid methyl ester (P3HT/PCBM) was used as the donor/acceptor system. The investigations focused on the particle morphology on the one hand and on the generation of a suitable particle layer on the other; both parameters affect the solar cell efficiency. The morphology was determined both spectroscopically via photoluminescence measurements and visually by electron microscopy. In this way the particle morphology could be fully elucidated, revealing parallels to the structure of solvent-based solar cells. In addition, a dependence of the morphology on the preparation temperature was observed, which allows simple control of the particle structure. For the formation of the particle layer, direct as well as interface-mediated coating methods were employed. Of these techniques, however, only spin coating proved to be a viable method for transferring the particles from the dispersion into a homogeneous film. Furthermore, the post-treatment of the particle layer by ethanol washing and thermal annealing was a focus of this work. Both measures had a positive effect on the efficiency of the solar cells and contributed decisively to improving the cells. Overall, the findings provide a detailed overview of the challenges that arise when using water-based dispersions. The requirements of particle-based solar cells were revealed, enabling the fabrication of a solar cell with an efficiency of 0.53%. This result does not yet represent the optimum, however, and leaves room for further improvement.

Relevance: 30.00%

Abstract:

This thesis deals with the development of a novel simulation technique for macromolecules in electrolyte solutions, with the aim of improving performance over current molecular-dynamics-based simulation methods. In solutions containing charged macromolecules and salt ions, it is the complex interplay of electrostatic interactions and hydrodynamics that determines the equilibrium and non-equilibrium behavior. However, the treatment of the solvent and dissolved ions makes up the major part of the computational effort, so efficient modeling of both components is essential for the performance of a method. With the novel method we treat the solvent in a coarse-grained fashion and replace the explicit-ion description by a dynamic mean-field treatment. Hence we combine particle- and field-based descriptions in a hybrid method and thereby effectively solve the electrokinetic equations. The developed algorithm is tested extensively in terms of accuracy and performance, and suitable parameter sets are determined. As a first application we study charged polymer solutions (polyelectrolytes) in shear flow, with a focus on their viscoelastic properties. Here we also include semidilute solutions, which are computationally demanding. Secondly, we study electro-osmotic flow on superhydrophobic surfaces, where we perform a detailed comparison to theoretical predictions.

Relevance: 30.00%

Abstract:

Since its discovery, the top quark has been one of the most investigated subjects in particle physics. The aim of this thesis is the reconstruction of hadronically decaying top quarks with high transverse momentum (boosted tops) using the Template Overlap Method (TOM). Because of the high energy, the decay products of boosted tops partially or totally overlap and are thus contained in a single large-radius jet (fat-jet). TOM compares the internal energy distribution of a candidate fat-jet to a sample of tops obtained from MC simulation (the templates). The algorithm is based on the definition of an overlap function, which quantifies the level of agreement between the fat-jet and the template, allowing an efficient discrimination of signal from background contributions. A working point has been chosen to obtain a signal efficiency close to 90% with a corresponding background rejection of 70%. TOM performance has been tested on MC samples in the muon channel and compared with previous methods in the literature. All the methods will be merged in a multivariate analysis to give a global top tagging, which will be included in the ttbar production differential cross-section measurement performed on the data acquired in 2012 at sqrt(s) = 8 TeV in the high phase-space region, where new physics processes could appear. Since its performance improves with pT, the Template Overlap Method will play a crucial role in the next data taking at sqrt(s) = 13 TeV, where almost all tops will be produced at high energy, making the standard reconstruction methods inefficient.
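
The overlap function at the heart of TOM is, in published formulations, a Gaussian-weighted comparison of the fat-jet energy found around each template parton with the template parton energies, maximized over all templates. A hedged sketch of this idea (cone size, resolution parameter, and data layout are illustrative assumptions):

```python
import numpy as np

def overlap(fatjet_constituents, templates, r_cone=0.2, sigma_frac=1/3):
    """fatjet_constituents: array of rows (pt, eta, phi);
    templates: list of arrays of rows (E, eta, phi).
    Returns the maximum overlap score over all templates."""
    best = 0.0
    for tmpl in templates:
        chi2 = 0.0
        for e_a, eta_a, phi_a in tmpl:
            # energy of fat-jet constituents inside a cone around parton a
            deta = fatjet_constituents[:, 1] - eta_a
            dphi = np.arctan2(np.sin(fatjet_constituents[:, 2] - phi_a),
                              np.cos(fatjet_constituents[:, 2] - phi_a))
            in_cone = deta**2 + dphi**2 < r_cone**2
            e_cone = fatjet_constituents[in_cone, 0].sum()
            sigma_a = sigma_frac * e_a  # per-parton resolution (assumed)
            chi2 += (e_cone - e_a)**2 / (2 * sigma_a**2)
        best = max(best, np.exp(-chi2))
    return best  # close to 1 for top-like fat-jets, near 0 for background
```

Cutting on this score at the working point quoted above is what separates signal-like fat-jets from the background.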

Relevance: 30.00%

Abstract:

Future experiments in nuclear and particle physics are moving towards the high-luminosity regime in order to access rare processes. In this framework, particle detectors require high rate capability together with excellent timing resolution for precise event reconstruction. To achieve this, the development of dedicated FrontEnd Electronics (FEE) for detectors has become increasingly challenging and expensive. Thus, a current trend in R&D is towards flexible FEE that can be easily adapted to a great variety of detectors without impairing the required high performance. This thesis reports on a novel FEE for two different detector types: imaging Cherenkov counters and plastic scintillator arrays. The former requires high sensitivity and precision for the detection of single-photon signals, while the latter is characterized by the slower and larger signals typical of scintillation processes. The FEE design was developed using high-bandwidth preamplifiers and fast discriminators which provide Time-over-Threshold (ToT). The use of discriminators allows for low power consumption, minimal dead time, and self-triggering capability, all fundamental aspects for high-rate applications. The output signals of the FEE are read out by a high-precision FPGA-based TDC system. A full characterization of the analogue signals under realistic conditions proved that the ToT information can be used in a novel way for charge measurements or walk corrections, thus improving the obtainable timing resolution. Detailed laboratory investigations proved the feasibility of the ToT method. The full readout chain was investigated in test experiments at the Mainz Microtron: counting rates of several MHz per channel were achieved, and a timing resolution of better than 100 ps was obtained after a walk correction based on ToT. Ongoing applications to fast Time-of-Flight counters and future developments of the FEE have also been investigated recently.
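
Walk correction with ToT works because the threshold-crossing time of a leading-edge discriminator shifts with pulse amplitude, and ToT tracks that amplitude. A hedged sketch of the calibrate-and-subtract step (the functional form and all numbers are illustrative assumptions, not the thesis' actual calibration):

```python
import numpy as np

# Calibration data: measured time offset vs ToT (hypothetical, in ns).
# Small ToT = small pulse = late threshold crossing = large walk.
tot_cal = np.array([4.0, 6.0, 8.0, 12.0, 16.0, 20.0])
dt_cal = np.array([0.95, 0.62, 0.44, 0.27, 0.18, 0.13])

# Fit the walk curve, here a 3rd-order polynomial in 1/ToT (assumed form).
coeff = np.polyfit(1.0 / tot_cal, dt_cal, 3)

def walk_corrected(t_meas, tot):
    """Subtract the amplitude-dependent walk predicted from ToT."""
    return t_meas - np.polyval(coeff, 1.0 / tot)

# Usage: two hits with the same true time but different pulse heights.
print(walk_corrected(np.array([10.95, 10.18]), np.array([4.0, 16.0])))
# -> both close to 10.0 ns after correction
```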

Relevance: 30.00%

Abstract:

The penetration, translocation, and distribution of ultrafine and nanoparticles in tissues and cells are challenging issues in aerosol research. This article describes a set of novel quantitative microscopic methods for evaluating particle distributions within sectional images of tissues and cells by addressing the following questions: (1) Is the observed distribution of particles between spatial compartments random? (2) Which compartments are preferentially targeted by particles? (3) Does the observed particle distribution shift between different experimental groups? Each of these questions can be addressed by testing an appropriate null hypothesis. The methods all require observed particle distributions to be estimated by counting the number of particles associated with each defined compartment. For studying preferential labeling of compartments, the size of each compartment must also be estimated, by counting the number of points of a randomly superimposed test grid that hit the different compartments. The latter provides information about the particle distribution that would be expected if the particles were randomly distributed, that is, the expected number of particles. From these data, we can calculate a relative deposition index (RDI) by dividing the observed number of particles by the expected number of particles. The RDI indicates whether the observed number of particles corresponds to that predicted solely by compartment size (for which RDI = 1). Within one group, the observed and expected particle distributions are compared by chi-squared analysis. The total chi-squared value indicates whether an observed distribution is random. If not, the partial chi-squared values help to identify those compartments that are preferential targets of the particles (RDI > 1). Particle distributions between different groups can be compared in a similar way by contingency table analysis. We first describe the preconditions and the way to implement these methods, then provide three worked examples, and finally discuss the advantages, pitfalls, and limitations of these methods.
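
The RDI and chi-squared computation described above is simple enough to sketch directly; a minimal example with hypothetical counts (compartment names and all numbers are invented for illustration):

```python
import numpy as np
from scipy.stats import chi2

observed = np.array([120, 45, 15])     # particles counted in compartments A, B, C
grid_hits = np.array([300, 500, 200])  # test-grid points hitting A, B, C

# Expected counts follow compartment size (grid-hit fractions).
expected = observed.sum() * grid_hits / grid_hits.sum()
rdi = observed / expected              # RDI = 1 -> random distribution

partial_chi2 = (observed - expected)**2 / expected
total_chi2 = partial_chi2.sum()
p_value = chi2.sf(total_chi2, df=len(observed) - 1)

for name, r, c in zip("ABC", rdi, partial_chi2):
    print(f"compartment {name}: RDI = {r:.2f}, partial chi2 = {c:.1f}")
print(f"total chi2 = {total_chi2:.1f}, p = {p_value:.2g}")
# An RDI > 1 with a large partial chi2 flags a preferentially targeted
# compartment; a small p-value rejects the random-distribution hypothesis.
```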

Relevance: 30.00%

Abstract:

This study develops an automated analysis tool by combining total internal reflection fluorescence microscopy (TIRFM), an evanescent-wave microscopic imaging technique used to capture time-sequential images, with corresponding Matlab image-processing code to identify the movements of single individual particles. The developed code enables us to examine the two-dimensional hindered tangential Brownian motion of nanoparticles with sub-pixel (nanoscale) resolution. The measured mean square displacements of the nanoparticles are compared with theoretical predictions to estimate particle diameters and fluid viscosity using a nonlinear regression technique. The estimated values are validated against the diameters and viscosities given by the manufacturers. The nanoparticles used in these experiments are yellow-green polystyrene fluorescent nanospheres (200 nm, 500 nm and 1000 nm in nominal diameter; 505 nm excitation and 515 nm emission wavelengths). The solutions used are de-ionized (DI) water, 10% d-glucose and 10% glycerol. Mean square displacements obtained near the surface show significant deviation from theoretical predictions, which is attributed to DLVO forces in that region, but they conform to theoretical predictions beyond ~125 nm. The proposed automated analysis tool can be employed in bio-application fields that require the examination of single protein (DNA and/or vesicle) tracking, drug delivery, and cyto-toxicity, unlike traditional measurement techniques that require fixing the cells. Furthermore, this tool can also be usefully applied in the microfluidic areas of non-invasive thermometry, particle tracking velocimetry (PTV), and non-invasive viscometry.
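
The diameter estimate rests on fitting the measured MSD to two-dimensional free diffusion, MSD(t) = 4Dt, and inverting the Stokes-Einstein relation D = k_B T / (3 pi eta d). A minimal sketch with hypothetical lag-time data (a simple linear fit stands in for the study's nonlinear regression):

```python
import numpy as np

kB, T = 1.380649e-23, 298.0   # Boltzmann constant (J/K), temperature (K)
eta = 0.89e-3                 # dynamic viscosity of water at ~25 C (Pa*s)

# Hypothetical tracking output: lag times (s) and 2D MSD values (m^2).
lag = np.array([0.01, 0.02, 0.03, 0.04, 0.05])
msd = np.array([0.88, 1.77, 2.63, 3.55, 4.40]) * 1e-13

D = np.polyfit(lag, msd, 1)[0] / 4.0   # slope / 4 for 2D free diffusion
d = kB * T / (3.0 * np.pi * eta * D)   # Stokes-Einstein diameter
print(f"D = {D:.3g} m^2/s, estimated diameter = {d * 1e9:.0f} nm")
# -> roughly 220 nm, consistent with a nominal 200 nm sphere
```

With the diameter known instead, the same relation can be inverted for eta, which is the basis of the non-invasive viscometry application mentioned above.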

Relevance: 30.00%

Abstract:

The tremendous application potential of nanosized materials stands in sharp contrast to a growing number of critical reports on their potential toxicity. Applications of in vitro methods to assess nanoparticles are severely limited by difficulties in exposing cells of the respiratory tract directly to airborne engineered nanoparticles. We present a completely new approach for exposing lung cells to particles generated in situ by flame spray synthesis. Cerium oxide nanoparticles were produced in a single run and simultaneously exposed to the surface of cultured lung cells inside a glovebox. Separately collected samples were used to measure the hydrodynamic particle size distribution, shape, and agglomerate morphology. Cell viability was not impaired by the conditions of the glovebox exposure. The tightness of the lung cell monolayer, the mean total lamellar body volume, and the generation of oxidative DNA damage revealed a dose-dependent cellular response to the airborne engineered nanoparticles. The direct combination of production and exposure allows particle toxicity to be studied in a simple and reproducible way under environmental conditions.

Relevance: 30.00%

Abstract:

In order to further study the long-range correlations ("ridge") observed recently in p+Pb collisions at sqrt(s_NN) = 5.02 TeV, the second-order azimuthal anisotropy parameter of charged particles, v_2, has been measured with the cumulant method using the ATLAS detector at the LHC. In a data sample corresponding to an integrated luminosity of approximately 1 microb^(-1), the parameter v_2 has been obtained using two- and four-particle cumulants over the pseudorapidity range |eta| < 2.5. The results are presented as a function of transverse momentum and of the event activity, defined in terms of the transverse energy summed over 3.1 < eta < 4.9 in the direction of the Pb beam. They are compared to results obtained with two-particle correlation methods, and to predictions from hydrodynamic models of p+Pb collisions. Despite the small transverse spatial extent of the p+Pb collision system, the large magnitude of v_2 and its similarity to hydrodynamic predictions provide additional evidence for the importance of final-state effects in p+Pb reactions.
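
The two- and four-particle cumulants mentioned here can be computed directly from per-event Q-vectors, avoiding explicit loops over particle pairs and quadruplets. A hedged sketch following the standard direct-cumulant (Q-cumulant) formulas of Bilandzic et al., applied to toy flow-modulated events rather than real data:

```python
import numpy as np

def v2_cumulants(events, n=2):
    """v2{2} and v2{4} from per-event Q-vectors, Q_n = sum_j exp(i*n*phi_j)."""
    two, four = [], []
    for phi in events:                  # phi: azimuthal angles of one event
        M = len(phi)
        Qn = np.exp(1j * n * phi).sum()
        Q2n = np.exp(2j * n * phi).sum()
        # <2> per event, self-correlations removed
        two.append((abs(Qn)**2 - M) / (M * (M - 1)))
        # <4> per event (standard Q-cumulant expression)
        num = (abs(Qn)**4 + abs(Q2n)**2 - 2 * (Q2n * np.conj(Qn)**2).real
               - 4 * (M - 2) * abs(Qn)**2 + 2 * M * (M - 3))
        four.append(num / (M * (M - 1) * (M - 2) * (M - 3)))
    c2_2 = np.mean(two)
    c2_4 = np.mean(four) - 2 * c2_2**2
    return np.sqrt(c2_2), (-c2_4)**0.25 if c2_4 < 0 else np.nan

# Toy events with input flow: dN/dphi ~ 1 + 2*v2*cos(2*(phi - psi)).
rng = np.random.default_rng(1)
events = []
for _ in range(2000):
    psi = rng.uniform(0, 2 * np.pi)          # random event plane
    phi = rng.uniform(0, 2 * np.pi, 100)
    keep = rng.uniform(0, 1.2, 100) < 1 + 2 * 0.1 * np.cos(2 * (phi - psi))
    events.append(phi[keep])
print(v2_cumulants(events))  # both estimates near the input v2 = 0.1
```

The four-particle cumulant suppresses few-particle ("nonflow") correlations, which is why comparing v2{2} and v2{4} is informative.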

Relevance: 30.00%

Abstract:

ATLAS measurements of the azimuthal anisotropy in lead–lead collisions at √sNN = 2.76 TeV are shown, using a dataset of approximately 7 μb−1 collected at the LHC in 2010. The measurements are performed for charged particles with transverse momenta 0.5 < pT < 20 GeV in the pseudorapidity range |η| < 2.5. The anisotropy is characterized by the Fourier coefficients, vn, of the charged-particle azimuthal angle distribution for n = 2–4. The Fourier coefficients are evaluated using multi-particle cumulants calculated with the generating function method. Results on the transverse momentum, pseudorapidity and centrality dependence of the vn coefficients are presented. The elliptic flow, v2, is obtained from the two-, four-, six- and eight-particle cumulants, while the higher-order coefficients, v3 and v4, are determined with two- and four-particle cumulants. Flow harmonics vn measured with four-particle cumulants are significantly reduced compared to the measurement involving two-particle cumulants. A comparison to vn measurements obtained using different analysis methods and previously reported by the LHC experiments is also shown. Results of measurements of flow fluctuations evaluated with multi-particle cumulants are shown as a function of transverse momentum and collision centrality. Models of the initial spatial geometry and its fluctuations fail to describe the flow fluctuation measurements.
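
The reduction of vn measured with four-particle cumulants relative to two-particle cumulants is itself a measure of flow fluctuations: to leading order in sigma_v/<v>, v2{2}^2 = <v2>^2 + sigma_v^2 while v2{4}^2 = <v2>^2 - sigma_v^2. A small sketch of extracting the mean flow and its fluctuation from the two estimates (input values are hypothetical, not the measured ones):

```python
import numpy as np

# Hypothetical cumulant harmonics for one centrality/pT bin.
v2_2, v2_4 = 0.095, 0.083

# Leading-order relations: v2{2}^2 = <v>^2 + s^2, v2{4}^2 = <v>^2 - s^2.
mean_v = np.sqrt((v2_2**2 + v2_4**2) / 2)
sigma_v = np.sqrt((v2_2**2 - v2_4**2) / 2)
print(f"<v2> ~ {mean_v:.3f}, sigma_v ~ {sigma_v:.3f}, "
      f"relative fluctuation ~ {sigma_v / mean_v:.2f}")
```

It is this kind of fluctuation observable that the initial-geometry models mentioned above fail to reproduce.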