927 results for Time correlation function


Relevance:

90.00%

Publisher:

Abstract:

In this work, simulations of liquids at the molecular level were carried out using different multiscale techniques. These allow an effective description of the liquid that requires less computing time and can therefore capture phenomena on longer time and length scales.

A key aspect is a simplified ("coarse-grained") model, which is obtained in a systematic procedure from simulations of the detailed model. In this procedure, selected properties of the detailed model (e.g. the pair correlation function, the pressure, etc.) are reproduced.

Algorithms were investigated that allow the simultaneous coupling of the detailed and the simplified model ("Adaptive Resolution Scheme", AdResS). Here, the detailed model is used in a predefined subvolume of the liquid (e.g. near a surface), while the rest is described with the simplified model.

For this purpose, a method ("thermodynamic force") was developed to make the coupling possible even when the models are in different thermodynamic states. In addition, a novel coupling algorithm (H-AdResS) was described, which formulates the coupling via a Hamiltonian. In this algorithm, a correction analogous to the thermodynamic force is possible at lower computational cost.

As an application of these fundamental techniques, path-integral molecular dynamics (MD) simulations of water were investigated. This method makes it possible to include nuclear quantum effects (delocalisation, zero-point energy) in the simulation. First, a multiscale technique ("force matching") was used to extract an effective interaction from a detailed simulation based on density functional theory. The path-integral MD simulation improves the description of the intramolecular structure in comparison with experimental data. The model is also suitable for simultaneous coupling within one simulation, in which a water molecule (described by 48 point particles in the path-integral MD model) is coupled to a simplified model (a single point particle). In this way, a water-vacuum interface could be simulated, with only the surface described by the path-integral model and the rest by the simplified model.
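One of the target properties named above is the pair correlation function of the detailed model. As a purely illustrative aside (not the thesis's actual tooling; the function name and arguments are hypothetical), a minimal Python sketch of how g(r) can be estimated from particle coordinates in a cubic periodic box:

```python
import numpy as np

def pair_correlation(positions, box_length, n_bins=100):
    """Illustrative estimate of the pair correlation function g(r) for particles
    in a cubic periodic box (minimum-image convention); not the thesis code."""
    n = len(positions)                      # positions: array of shape (n, 3)
    r_max = box_length / 2.0
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box_length * np.round(d / box_length)   # wrap into the nearest image
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r[r < r_max], bins=edges)[0]
    # normalise by the number of pairs an ideal gas would place in each shell
    shell_volume = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal_pairs = (n * (n - 1) / 2.0) * shell_volume / box_length ** 3
    r_mid = 0.5 * (edges[:-1] + edges[1:])
    return r_mid, counts / ideal_pairs
```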

Relevance:

90.00%

Publisher:

Abstract:

Radial velocities measured from near-infrared (NIR) spectra are a potential tool to search for extrasolar planets around cool stars. High-resolution infrared spectrographs now reach the high precision of visible instruments, with constant improvement over time. GIANO is an infrared echelle spectrograph and a powerful tool for providing high-resolution spectra for accurate radial velocity measurements of exoplanets and for chemical and dynamical studies of stellar or extragalactic objects. No other IR instrument has GIANO's capability to cover the entire NIR wavelength range. In this work we develop an ensemble of IDL procedures to measure high-precision radial velocities on a few GIANO spectra acquired during the commissioning run, using the telluric lines as wavelength reference. In Section 1.1 various exoplanet search methods are described; they exploit different properties of the planetary system. In Section 1.2 we describe the exoplanet population discovered through the different methods. In Section 1.3 we explain the motivations for NIR radial velocities and the main issue that has limited the pursuit of high-precision NIR radial velocities, namely the lack of a suitable calibration method. We briefly describe calibration methods in the visible and the solutions for IR calibration, for instance the use of telluric lines; the latter has advantages and problems, described in detail. In this work we use telluric lines as wavelength reference. In Section 1.4 the cross-correlation function (CCF) method, which is widely used to measure radial velocities, is described. In Section 1.5 we describe GIANO and its main science targets. In Chapter 2 the observational data obtained with the GIANO spectrograph are presented and the selection criteria are reported. In Chapter 3 we describe the details of the analysis and examine in depth the flow chart reported in Section 3.1. In Chapter 4 we give the radial velocities measured with our IDL procedure for all available targets. We obtain an rms scatter in radial velocity of about 7 m/s. Finally, we conclude that GIANO can be used to measure radial velocities of late-type stars with an accuracy close to or better than 10 m/s, using telluric lines as wavelength reference. In September 2014 GIANO became operative at the TNG for Science Verification, and more observational data will allow this analysis to be refined further.
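For context, the CCF approach amounts to shifting a template over a grid of trial velocities and locating the extremum of the correlation. A minimal, hedged Python sketch (the thesis used IDL procedures; the function name, velocity grid, and parabolic refinement here are illustrative assumptions):

```python
import numpy as np

C_KMS = 299792.458  # speed of light in km/s

def ccf_radial_velocity(wave, flux, template_wave, template_flux,
                        v_grid=np.arange(-50.0, 50.0, 0.25)):
    """Illustrative CCF: cross-correlate an observed spectrum against a template
    Doppler-shifted over a grid of trial velocities (km/s)."""
    ccf = np.empty_like(v_grid)
    for i, v in enumerate(v_grid):
        shifted = template_wave * (1.0 + v / C_KMS)          # shift the template
        resampled = np.interp(wave, shifted, template_flux)  # onto the observed grid
        ccf[i] = np.sum((flux - flux.mean()) * (resampled - resampled.mean()))
    k = np.argmax(ccf)
    # parabolic refinement around the peak for sub-step velocity precision
    if 0 < k < len(v_grid) - 1:
        y0, y1, y2 = ccf[k - 1], ccf[k], ccf[k + 1]
        frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
        return v_grid[k] + frac * (v_grid[1] - v_grid[0])
    return v_grid[k]
```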

Relevance:

90.00%

Publisher:

Abstract:

Free-space optical (FSO) communication links can experience extreme signal degradation due to atmospheric-turbulence-induced spatial and temporal irradiance fluctuations (scintillation) in the laser wavefront. In addition, turbulence can cause the laser beam centroid to wander, resulting in power fading and sometimes complete loss of the signal. Spreading of the laser beam and jitter are also artifacts of atmospheric turbulence. To accurately predict the signal fading that occurs in a laser communication system, and to get a true picture of how this affects crucial performance parameters like the bit error rate (BER), it is important to analyze the probability density function (PDF) of the integrated irradiance fluctuations at the receiver. In addition, it is desirable to find a theoretical distribution that accurately models these fluctuations under all propagation conditions. The PDF of integrated irradiance fluctuations is calculated from numerical wave-optics simulations of a laser after propagating through atmospheric turbulence, to investigate the evolution of the distribution as the aperture diameter is increased. The simulation data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes, from weak to very strong. Our results show that the gamma-gamma PDF provides a good fit to the simulated data distribution for all aperture sizes studied, from weak through moderate scintillation. In strong scintillation, the gamma-gamma PDF is a better fit to the distribution for point-like apertures, and the lognormal PDF is a better fit for apertures the size of the atmospheric spatial coherence radius ρ0 or larger. In addition, the PDF of received power from a Gaussian laser beam that has been adaptively compensated at the transmitter before propagation to the receiver of an FSO link in the moderate scintillation regime is investigated. The complexity of the adaptive optics (AO) system is increased in order to investigate the changes in the distribution of the received power and how this affects the BER. For the 10 km link, due to the non-reciprocal nature of the propagation path, the optimal beam to transmit is unknown. These results show that a low-order level of AO complexity provides a better estimate of the optimal beam to transmit than a higher order for non-reciprocal paths. For the 20 km link distance it was found that, although the improvement was minimal, all AO complexity levels provided an equivalent improvement in BER and that no AO complexity provided the correction needed for the optimal beam to transmit. Finally, the temporal power spectral density of received power from an FSO communication link is investigated. Simulated and experimental results for the coherence time calculated from the temporal correlation function are presented. Results for both simulation and experimental data show that the coherence time increases as the receiving aperture diameter increases. For finite apertures, the coherence time increases as the communication link distance is increased. We conjecture that this is due to the increasing speckle size within the pupil plane of the receiving aperture for an increasing link distance.
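For reference, the two candidate models compared above can be written down compactly. A small Python sketch of the gamma-gamma and lognormal PDFs of normalised (unit-mean) irradiance, assuming the standard parameterisations; comparing them with a histogram of simulated irradiance is then straightforward (illustrative only, not the study's code):

```python
import numpy as np
from scipy.special import gamma as gamma_fn, kv

def gamma_gamma_pdf(I, alpha, beta):
    """Gamma-gamma PDF of normalised irradiance (mean 1), a standard scintillation
    model from weak to strong turbulence; alpha, beta are effective numbers of
    large- and small-scale eddies. Illustrative sketch."""
    nu = alpha - beta
    coeff = (2.0 * (alpha * beta) ** ((alpha + beta) / 2.0)
             / (gamma_fn(alpha) * gamma_fn(beta)))
    return coeff * I ** ((alpha + beta) / 2.0 - 1.0) * kv(nu, 2.0 * np.sqrt(alpha * beta * I))

def lognormal_pdf(I, sigma2):
    """Lognormal PDF of unit-mean irradiance with log-irradiance variance sigma2."""
    return (1.0 / (I * np.sqrt(2.0 * np.pi * sigma2))
            * np.exp(-(np.log(I) + 0.5 * sigma2) ** 2 / (2.0 * sigma2)))
```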

Relevance:

90.00%

Publisher:

Abstract:

Advances in information technology and global data availability have opened the door for assessments of sustainable development at a truly macro scale. It is now fairly easy to conduct a study of sustainability using the entire planet as the unit of analysis; this is precisely what this work set out to accomplish. The study began by examining some of the best known composite indicator frameworks developed to measure sustainability at the country level today. Most of these were found to value human development factors and a clean local environment, but to gravely overlook consumption of (remote) resources in relation to nature's capacity to renew them, a basic requirement for a sustainable state. Thus, a new measuring standard is proposed, based on the Global Sustainability Quadrant approach. In a two-dimensional plot of nations' Human Development Index (HDI) vs. their Ecological Footprint (EF) per capita, the Sustainability Quadrant is defined by the area where both dimensions satisfy the minimum conditions of sustainable development: an HDI score above 0.8 (considered 'high' human development), and an EF below the fair Earth-share of 2.063 global hectares per person. After developing methods to identify those countries that are closest to the Quadrant in the present day and, most importantly, those that are moving towards it over time, the study tackled the question: what indicators of performance set these countries apart? To answer this, an analysis of raw data, covering a wide array of environmental, social, economic, and governance performance metrics, was undertaken. The analysis used country rank lists for each individual metric and compared them, using the Pearson product-moment correlation function, to the rank lists generated by the proximity/movement relative to the Quadrant measuring methods. The analysis yielded a list of metrics which are, with a high degree of statistical significance, associated with proximity to, and movement towards, the Quadrant, most notably:

- Favorable for sustainable development: use of contraception, high life expectancy, high literacy rate, and urbanization.
- Unfavorable for sustainable development: high GDP per capita, high language diversity, high energy consumption, and high meat consumption.
- A momentary gain, but a burden in the long run: high carbon footprint and debt.

These results could serve as a solid stepping stone for the development of more reliable composite index frameworks for assessing countries' sustainability.
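The comparison of rank lists described above is, in effect, a Pearson correlation computed on ranks. A minimal sketch of that step in Python, assuming each input is a per-country array aligned in the same country order (illustrative only, not the study's implementation):

```python
import numpy as np
from scipy.stats import pearsonr, rankdata

def rank_list_correlation(metric_values, quadrant_proximity):
    """Pearson product-moment correlation between two country rank lists: one
    ranking countries by a single performance metric, the other by proximity to
    (or movement towards) the Sustainability Quadrant. Illustrative sketch."""
    metric_ranks = rankdata(metric_values)
    proximity_ranks = rankdata(quadrant_proximity)
    r, p_value = pearsonr(metric_ranks, proximity_ranks)
    return r, p_value
```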

Relevance:

90.00%

Publisher:

Abstract:

Understanding the behavior of large outlet glaciers draining the Greenland Ice Sheet is critical for assessing the impact of climate change on sea level rise. The flow of marine-terminating outlet glaciers is partly governed by calving-related processes taking place at the terminus but is also influenced by the drainage of surface runoff to the bed through moulins, cracks, and other pathways. To investigate the extent of the latter effect, we develop a distributed surface-energy-balance model for Helheim Glacier, East Greenland, to calculate surface melt and thereby estimate runoff. The model is driven by data from an automatic weather station operated on the glacier during the summers of 2007 and 2008, and calibrated with independent measurements of ablation. Modeled melt varies over the deployment period by as much as 68% relative to the mean, with melt rates approximately 77% higher on the lower reaches of the glacier trunk than on the upper glacier. We compare melt variations during the summer season to estimates of surface velocity derived from global positioning system surveys. Near the front of the glacier, there is a significant correlation (at the >95% level) between variations in runoff (estimated from surface melt) and variations in velocity, with a 1-day delay in velocity relative to melt. Although the velocity changes are small compared to accelerations previously observed following some calving events, our findings suggest that the flow speed of Helheim Glacier is sensitive to changes in runoff. The response is most significant in the heavily crevassed, fast-moving region near the calving front. The delay in the peak of the cross-correlation function implies a transit time of 12-36 h for surface runoff to reach the bed.
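A lagged cross-correlation of this kind can be sketched in a few lines. The following Python snippet (illustrative, not the authors' analysis code; names and the lag convention are assumptions) correlates a daily runoff series with a daily velocity series over a range of lags and reports the lag of the peak:

```python
import numpy as np

def lagged_cross_correlation(runoff, velocity, max_lag):
    """Normalised cross-correlation between equally sampled runoff and velocity
    series for lags of -max_lag..+max_lag samples; the lag of the peak estimates
    the delay of the velocity response to surface melt. Illustrative sketch."""
    r = (runoff - runoff.mean()) / runoff.std()
    v = (velocity - velocity.mean()) / velocity.std()
    lags = np.arange(-max_lag, max_lag + 1)
    ccf = np.empty(len(lags))
    for i, lag in enumerate(lags):
        if lag >= 0:                     # velocity lags runoff by `lag` samples
            a, b = r[:len(r) - lag], v[lag:]
        else:                            # runoff lags velocity
            a, b = r[-lag:], v[:len(v) + lag]
        ccf[i] = np.mean(a * b)
    return lags, ccf, lags[np.argmax(ccf)]
```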

Relevance:

90.00%

Publisher:

Abstract:

In this thesis, we present the generation and study of a 87Rb Bose-Einstein condensate (BEC) perturbed by an oscillatory excitation. The atoms are trapped in a harmonic magnetic trap where, after an evaporative cooling process, we produce the BEC. In order to study the effect caused by oscillatory excitations, a time-oscillatory quadrupole magnetic field is superimposed on the trapping potential. Through this perturbation, collective modes were observed. The dipole mode is excited even for low excitation amplitudes, but a minimum excitation energy is needed to excite the quadrupole mode of the condensate. Observing the excited cloud in time-of-flight (TOF) expansion, we note that for excitation amplitudes at which the quadrupole mode is excited, the cloud expands without inverting its aspect ratio. Looking at these clouds after a long time of flight, it was possible to see vortices and, sometimes, a turbulent state in the condensed cloud. We calculated the momentum distribution of the perturbed BECs, and a power-law behavior, reminiscent of the Kolmogorov law for turbulence, was observed. Furthermore, we show that, using the method we developed to calculate the momentum distribution, the distribution curve (including the power-law exponent) depends on the quadrupole-mode oscillation of the cloud. The random distribution of peaks and depletions in the density-distribution image of an expanded turbulent BEC is reminiscent of the intensity profile of a speckle light beam. The analogy between matter-wave speckle and light speckle is justified by showing the similarities in the spatial propagation (or time expansion) of the waves. In addition, the second-order correlation function is evaluated, and the same dependence on distance is observed for both waves. This creates the possibility of understanding the properties of quantum matter in a disordered state. The propagation of a three-dimensional speckle field (such as the matter-wave speckle described here) creates an opportunity to investigate the speckle phenomenon in dimensions higher than 2D (the case of light speckle).
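Two of the quantitative steps mentioned above, fitting a power law to the momentum distribution and evaluating a second-order correlation function, can be sketched as follows (illustrative Python; the names, the fitting range, and the one-dimensional treatment are assumptions, not the authors' method):

```python
import numpy as np

def power_law_exponent(k, n_k, k_min, k_max):
    """Least-squares fit of n(k) ~ k**(-gamma) over an assumed inertial range
    [k_min, k_max] of a measured momentum distribution. Illustrative sketch."""
    sel = (k >= k_min) & (k <= k_max) & (n_k > 0)
    slope, intercept = np.polyfit(np.log(k[sel]), np.log(n_k[sel]), 1)
    return -slope

def g2(density_1d, max_shift):
    """Normalised second-order correlation g2(d) of a 1-D density profile, the
    kind of quantity used to compare matter-wave and optical speckle."""
    i = density_1d / density_1d.mean()
    return np.array([np.mean(i[:len(i) - d] * i[d:]) for d in range(max_shift)])
```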

Relevance:

90.00%

Publisher:

Abstract:

We review recent progress on the construction of determinant representations of correlation functions for integrable supersymmetric fermion models. The factorizing F-matrices (the so-called F-basis) play an important role in the construction. In the F-basis, the creation (and annihilation) operators and the Bethe states of the integrable models take completely symmetric forms. This leads to determinant representations of the scalar products of the Bethe states for these models. Based on the scalar products, determinant representations of the correlation functions may be obtained. As an example, in this review we give the determinant representation of the two-point correlation function for the U_q(gl(2|1)) (i.e. q-deformed) supersymmetric t-J model. The determinant representations are useful for analyzing physical properties of the integrable models in the thermodynamic limit.

Relevance:

90.00%

Publisher:

Abstract:

Objectives: The anti-inflammatory effect of macrolide antibiotics has been well established, as has their role in the treatment of certain disorders of chronic airway inflammation. Several studies have suggested that long-term, low-dose macrolides may be efficacious in the treatment of chronic rhinosinusitis; however, these studies have lacked a control group. To date, this effect has not been tested in a randomized, placebo-controlled study. Method: The authors conducted a double-blind, randomized, placebo-controlled clinical trial in 64 patients with chronic rhinosinusitis. Subjects received either 150 mg roxithromycin daily for 3 months or placebo. Outcome measures included the Sinonasal Outcome Test-20 (SNOT-20), measurements of peak nasal inspiratory flow, saccharine transit time, olfactory function, nasal endoscopic scoring, and nasal lavage assays for interleukin-8 (IL-8), fucose, and α2-macroglobulin. Results: There were statistically significant improvements in SNOT-20 score, nasal endoscopy, saccharine transit time, and IL-8 levels in lavage fluid (P < .05) in the macrolide group. A correlation was noted between improved outcome measures and low IgE levels. No significant improvements were noted for olfactory function, peak nasal inspiratory flow, or lavage levels of fucose and α2-macroglobulin. No improvement in any outcome was noted in the placebo-treated patients. Conclusion: These findings suggest that macrolides may have a beneficial role in the treatment of chronic rhinosinusitis, particularly in patients with low levels of IgE, and support the in vitro evidence of their anti-inflammatory activity. Additional studies are required to assess their place in clinical practice.

Relevance:

90.00%

Publisher:

Abstract:

The first clinically proven nicotine replacement product to obtain regulatory approval was Nicorette® gum. It provides a convenient way of delivering nicotine directly to the buccal cavity, thus circumventing 'first-pass' elimination following gastrointestinal absorption. Since its launch, Nicorette® gum has been investigated in numerous clinical studies, which are often difficult to compare due to large variations in study design and degree of sophistication. In order to standardise testing, in 2000 the European Pharmacopoeia introduced an apparatus to investigate the in vitro release of drug substances from medicated chewing gum. Using this chewing machine, the main aims of this project were to determine factors that could affect release from Nicorette® gum, to develop an in vitro-in vivo correlation (IVIVC), and to investigate the effect of formulation variables on the release of nicotine from gums. A standard in vitro test method was developed: the gum was placed in the chewing chamber with 40 mL of artificial saliva at 37 °C and chewed at 60 chews per minute. The chew rate, the type of dissolution medium used, and the pH, volume, temperature and ionic strength of the dissolution medium were altered to investigate the effects on release in vitro. It was found that increasing the temperature of the dissolution medium and the rate at which the gums were chewed resulted in a greater release of nicotine, whilst increasing the ionic strength of the dissolution medium to 80 mM resulted in a lower release. The addition of 0.1% sodium lauryl sulphate to the artificial saliva was found to double the release of nicotine compared with artificial saliva or water alone, although altering the dissolution volume and the starting pH did not affect the release. The increase in pH may be insufficient to provide optimal conditions for nicotine absorption, since the rate at which nicotine is transported through the buccal membrane was found to be higher at pH values greater than 8.6, where nicotine is predominantly unionised. Using a time mapping function, it was also possible to establish a level A in vitro-in vivo correlation: 4 mg Nicorette® gum was chewed at various chew rates in vitro and correlated to an in vivo chew-out study. All chew rates used in vitro could be used successfully for IVIVC purposes; statistically, however, chew rates of 10 and 20 chews per minute performed better than all other chew rates. Finally, a series of nicotine gums was made to investigate the effect of formulation variables on the release of nicotine from the gum. Using a directly compressible gum base, the gums crumbled when chewed in vitro in comparison with Nicorette®, resulting in a faster release of nicotine. To investigate the effect of altering the gum base, the concentration of sodium salts, the sugar syrup, the form of the active drug, the addition sequence and the incorporation of surfactant into the gum, the traditional manufacturing method was used to make a series of gum formulations. Results showed that the time of addition of the active drug, the incorporation of surfactants and the use of a different gum base all increased the release of nicotine from the gum. In contrast, reducing the concentration of sodium carbonate resulted in a lower release. Using a stronger nicotine ion-exchange resin delayed the release of nicotine from the gum, whilst altering the concentration of sugar syrup had little effect on the release but altered the texture of the gum.
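A level A IVIVC is a point-to-point relationship between the in vitro fraction released and the in vivo fraction absorbed, with a time-mapping function applied when the two time axes differ. A minimal illustrative Python sketch of such a comparison using a simple linear time-scaling factor (the actual time mapping function used in this work is not reproduced here; names and parameters are hypothetical):

```python
import numpy as np

def level_a_ivivc(t_vitro, frac_released, t_vivo, frac_absorbed, time_scale):
    """Illustrative level A comparison: scale the in vitro time axis by a linear
    time-mapping factor, resample the in vitro release onto the in vivo time
    points, and report the slope, intercept and correlation of the relationship."""
    frac_vitro_at_vivo_t = np.interp(t_vivo, t_vitro * time_scale, frac_released)
    slope, intercept = np.polyfit(frac_vitro_at_vivo_t, frac_absorbed, 1)
    r = np.corrcoef(frac_vitro_at_vivo_t, frac_absorbed)[0, 1]
    return slope, intercept, r
```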

Relevance:

90.00%

Publisher:

Abstract:

Isotropic scattering Raman spectra of liquid acetonitrile (AN) solutions of LiBF4 and NaI at various temperatures and concentrations have been investigated. For the first time, imaginary as well as real parts of the solvent vibrational correlation functions have been extracted from the spectra. Such imaginary parts are currently an important component of modern theories of vibrational relaxation in liquids. This investigation thus provides the first experimental data on imaginary parts of a correlation function in AN solutions. Using the fitting algorithm we recently developed, statistically confident models for the Raman spectra were deduced. The parameters of the band shapes of the ν2 AN vibration (CN stretching), with an additional correction, together with their confidence intervals, are also reported for the first time. It is shown that three distinct species, with lifetimes greater than ∼10⁻¹³ s, of the AN molecules can be detected in solutions containing Li+ and Na+. These species are attributed to AN molecules directly solvating cations; the single oriented and polarised molecules interleaving the cation and anion of a Solvent Shared Ion Pair (SShIP); and molecules solvating anions. These last are considered to be equivalent to the next layer of solvent molecules, because the CN end of the molecule is distant from the anion and thus less affected by the ionic charge compared with the anion situation. Calculations showed that at the concentrations employed, 1 and 0.3 M, there were essentially no other solvent molecules remaining that could be considered as bulk solvent. Calculations also showed that the internuclear distance in these solutions supported the proposal that the ionic entity dominating in solution was the SShIP, and other evidence was adduced that confirmed the absence of Contact Ion Pairs at these concentrations. The parameters of the shape of the vibrational correlation functions of all three species are reported. The parameters of intramolecular anharmonic coupling between the potential surfaces in AN and the dynamics of the intermolecular environment fluctuations and intermolecular energy transfer are presented. These results will assist investigations made at higher and lower concentrations, when additional species and interactions with AN molecules will be present.
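Vibrational correlation functions of this kind are obtained, in essence, by Fourier transforming the normalised isotropic band shape about its centre frequency, which yields both real and imaginary parts. A schematic Python version of that transform (illustrative only; the paper's actual fitting algorithm and band-shape corrections are not reproduced):

```python
import numpy as np

def vibrational_correlation(freq_cm, intensity, t_ps):
    """Illustrative complex vibrational correlation function C(t) from a normalised
    isotropic Raman band. Frequencies are in cm^-1, times in ps; the band centre is
    taken as the intensity-weighted mean frequency."""
    c = 2.99792458e10                                  # speed of light, cm/s
    omega = 2.0 * np.pi * c * freq_cm * 1e-12         # angular frequency, rad/ps
    norm = np.trapz(intensity, omega)
    omega0 = np.trapz(omega * intensity, omega) / norm # band centre
    return np.array([np.trapz(intensity * np.exp(-1j * (omega - omega0) * t), omega) / norm
                     for t in t_ps])
```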

Relevance:

90.00%

Publisher:

Abstract:

We present our approach to real-time service-oriented scheduling problems with the objective of maximizing the total system utility. Unlike traditional utility accrual scheduling problems, in which each task is associated with only a single time utility function (TUF), we associate two different TUFs, a profit TUF and a penalty TUF, with each task, to model real-time services that not only need to reward early completions but also need to penalize abortions or deadline misses. The scheduling heuristics we propose in this paper judiciously accept, schedule, and abort real-time services when necessary to maximize the accrued utility. Our extensive experimental results show that our proposed algorithms can significantly outperform traditional scheduling algorithms such as Earliest Deadline First (EDF), traditional utility accrual (UA) scheduling algorithms, and an earlier scheduling approach based on a similar model.
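To make the profit/penalty TUF idea concrete, here is a small, hedged Python sketch of the task model and a toy admission rule ordered by expected utility density; it only illustrates the concept and is not the heuristics proposed in the paper:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Task:
    name: str
    exec_time: float
    deadline: float
    profit_tuf: Callable[[float], float]   # utility accrued if finished at time t
    penalty_tuf: Callable[[float], float]  # (negative) utility if aborted or late at time t

def accrued_utility(task: Task, finish_time: float) -> float:
    """Profit TUF if the task meets its deadline, penalty TUF otherwise."""
    if finish_time <= task.deadline:
        return task.profit_tuf(finish_time)
    return task.penalty_tuf(finish_time)

def greedy_admit(ready: List[Task], now: float) -> List[Task]:
    """Toy admission/ordering rule (not the paper's algorithm): order ready tasks
    by expected utility per unit of execution time and keep only those whose
    projected completion still yields positive utility."""
    admitted, t = [], now
    for task in sorted(ready,
                       key=lambda x: accrued_utility(x, now + x.exec_time) / x.exec_time,
                       reverse=True):
        finish = t + task.exec_time
        if accrued_utility(task, finish) > 0.0:
            admitted.append(task)
            t = finish
    return admitted
```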

Relevance:

90.00%

Publisher:

Abstract:

The results of inductively coupled argon plasma (ICAP) chemical analyses carried out on some 300 core samples from Ocean Drilling Program Sites 834, 835, 838, and 839 are presented. These sites were drilled during Leg 135 in the Lau Basin. The data are compared with total gamma (SGR) wireline logs at Sites 834 and 835. Pliocene (Piacenzian) nannofossil Zone CN12, which has been identified at Sites 834 and 835, is examined in detail using spectral analyses on core and wireline logs. The potassium and calcium concentrations from the core material were used to calculate an objective depth-to-geological time stretching function, which improved the stratigraphic correlation between sites. The integrated use of chemical analyses, wireline-log data and paleomagnetic results improved confidence in the correlations obtained. Although no significant sedimentation periodicities were obtained from the two sites, a common concentration of energy between 30 and 60 k.y. was recorded.
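The depth-to-time stretching step can be illustrated with a short sketch: map each record from depth to age through tie points, resample onto a common age grid, and correlate records between sites. This Python fragment is illustrative only (names and the piecewise-linear mapping are assumptions, not the paper's procedure):

```python
import numpy as np

def stretch_to_time(depth, values, tie_depths, tie_ages, age_grid):
    """Map a downcore series from depth to geological time with a piecewise-linear
    depth-to-age (stretching) function defined by tie points, then resample onto a
    common age grid so records from different sites can be compared. Illustrative."""
    ages = np.interp(depth, tie_depths, tie_ages)   # depth -> age at each sample
    return np.interp(age_grid, ages, values)        # resample onto the shared grid

def intersite_correlation(series_a, series_b):
    """Pearson correlation between two records resampled onto the same age grid."""
    return np.corrcoef(series_a, series_b)[0, 1]
```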

Relevance:

90.00%

Publisher:

Abstract:

This work presents a computational code, called MOMENTS, developed to be used in process control to determine a characteristic transfer function for industrial units when radiotracer techniques are applied to study the unit's performance. The methodology is based on measuring the residence time distribution (RTD) function and calculating the first and second temporal moments of the tracer data obtained by two NaI scintillation detectors positioned to register the complete tracer movement inside the unit. A nonlinear regression technique was used to fit various mathematical models, and a statistical test was used to select the best result for the transfer function. Using the MOMENTS code, twelve different models can be used to fit a curve and calculate technical parameters for the unit.
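As an illustration of the moment calculation such a code performs, a minimal Python sketch of the first and second temporal moments of an RTD curve and their use for a unit's transfer function (illustrative; the twelve candidate models and the statistical selection test of MOMENTS are not reproduced here):

```python
import numpy as np

def temporal_moments(t, counts):
    """Mean residence time (first moment) and variance (second central moment) of a
    residence time distribution measured as detector counts versus time."""
    e = counts / np.trapz(counts, t)          # normalised RTD curve E(t)
    mean = np.trapz(t * e, t)                 # first temporal moment
    var = np.trapz((t - mean) ** 2 * e, t)    # second central moment
    return mean, var

def transfer_function_moments(t, inlet_counts, outlet_counts):
    """Moments of the unit's transfer function from inlet and outlet tracer curves:
    for a convolution, the unit's mean residence time and variance are the
    differences of the outlet and inlet moments. Illustrative sketch."""
    m_in, v_in = temporal_moments(t, inlet_counts)
    m_out, v_out = temporal_moments(t, outlet_counts)
    return m_out - m_in, v_out - v_in
```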