23 results for Stochastic Resonance

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 20.00%

Abstract:

Decisions taken in modern organizations are often multi-dimensional, involving multiple decision makers and several criteria measured on different scales. Multiple Criteria Decision Making (MCDM) methods are designed to analyze such situations and to give recommendations. Among the numerous MCDM methods, two large families are the multi-attribute utility theory based methods and the outranking methods. Traditionally both method families require exact values for technical parameters and criteria measurements, as well as for preferences expressed as weights. Often it is hard, if not impossible, to obtain exact values. Stochastic Multicriteria Acceptability Analysis (SMAA) is a family of methods designed to help in situations where exact values are not available. Different variants of SMAA allow handling all types of MCDM problems. They support defining the model through uncertain, imprecise, or completely missing values. The methods are based on simulation, which is applied to obtain descriptive indices characterizing the problem. In this thesis we present new advances in the SMAA methodology. We present and analyze algorithms for the SMAA-2 method and its extension to handle ordinal preferences. We then present an application of SMAA-2 to an area where MCDM models have not been applied before: planning elevator groups for high-rise buildings. Following this, we introduce two new methods to the family: SMAA-TRI, which extends ELECTRE TRI to sorting problems with uncertain parameter values, and SMAA-III, which extends ELECTRE III in a similar way. Efficient software implementing these two methods has been developed in conjunction with this work and is briefly presented in this thesis. The thesis closes with a comprehensive survey of the SMAA methodology, including the definition of a unified framework.
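As a rough illustration of the simulation idea underlying SMAA-2, the sketch below computes rank acceptability indices by Monte Carlo, assuming (purely for illustration) normally distributed criteria measurements, an additive utility model, and weights sampled uniformly from the weight simplex; the function and parameter names are hypothetical and not taken from the software developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def smaa2_rank_acceptability(crit_means, crit_stds, n_iter=10_000):
    """Monte Carlo rank acceptability indices (illustrative sketch only).

    crit_means, crit_stds: (m alternatives x n criteria) arrays; criteria
    measurements are assumed normally distributed and to-be-maximized,
    and weights are drawn uniformly from the weight simplex.
    """
    m, n = crit_means.shape
    counts = np.zeros((m, m))                      # counts[i, r] = times alternative i got rank r
    for _ in range(n_iter):
        x = rng.normal(crit_means, crit_stds)      # sample the uncertain measurements
        w = rng.dirichlet(np.ones(n))              # uniform weights on the simplex
        utility = x @ w                            # additive utility per alternative
        ranks = (-utility).argsort().argsort()     # 0 = best rank
        counts[np.arange(m), ranks] += 1
    return counts / n_iter                         # rank acceptability indices

# toy example: 3 alternatives, 2 criteria
b = smaa2_rank_acceptability(np.array([[1.0, 2.0], [1.5, 1.5], [2.0, 1.0]]),
                             np.full((3, 2), 0.3))
print(b)  # row i: estimated probability of each rank for alternative i
```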

Relevance: 20.00%

Abstract:

This work provides an overview of temporal and stochastic software reliability models and examines a few of the models in practice. The theoretical part of the work contains the key definitions and metrics used in describing and assessing software reliability, as well as the descriptions of the models themselves. Two groups of software reliability models are introduced. The first group consists of risk-based models. The second group comprises models based on fault seeding and significance. The empirical part of the work contains the descriptions and results of the experiments. The experiments were carried out using three models from the first group: the Jelinski-Moranda model, the first geometric model, and a simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects the performance of the models and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model proved to be the most sensitive to the distribution because of convergence problems, and the first geometric model the most sensitive to changes in the amount of data.
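The Jelinski-Moranda model mentioned above assumes that the failure intensity before the i-th failure is proportional to the number of faults still remaining. The sketch below shows one common way to fit it by maximum likelihood from interfailure times; it is a generic textbook-style implementation, not the procedure used in the experiments, and the sample data are invented.

```python
import numpy as np

def jelinski_moranda_fit(times, max_extra_faults=200):
    """Fit the Jelinski-Moranda model to interfailure times (illustrative sketch).

    times: observed interfailure times t_1..t_n. The model assumes the i-th
    failure intensity is phi * (N - i + 1); for a fixed total fault count N
    the MLE of phi has a closed form, so we scan candidate N values.
    """
    t = np.asarray(times, dtype=float)
    n = len(t)
    i = np.arange(1, n + 1)
    best = (None, None, -np.inf)
    for N in range(n, n + max_extra_faults + 1):
        remaining = N - i + 1                     # faults still present before failure i
        phi = n / np.sum(remaining * t)           # closed-form MLE of phi given N
        loglik = np.sum(np.log(phi * remaining) - phi * remaining * t)
        if loglik > best[2]:
            best = (N, phi, loglik)
    return best  # (estimated total faults, per-fault hazard, log-likelihood)

print(jelinski_moranda_fit([10, 12, 18, 25, 40, 65]))
```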

Relevance: 20.00%

Abstract:

Electrolyte solutions are of importance in a wide range of scientific contexts and as such have attracted considerable theoretical and experimental effort over many years. Nuclear magnetic resonance (NMR) provides a precise and versatile tool for the investigation of electrolyte solutions, both in water and in organic solvents. Many structural and dynamic properties can be obtained through NMR experiments. The solution of aluminum chloride in water was studied at different concentrations. The maximum line shift was shown to be independent of concentration and acidity. The six-coordinated structure of the solvation shell was confirmed by experiments on 1H and 27Al nuclei, and diffusion coefficients were studied. The solution of nickel chloride in methanol was also studied. Lines corresponding to coordinated and bulk methanol were found, and four-, five- and six-coordinated structures were found at different temperatures. The line for the coordinated -OD group of deuterated methanol was observed in the 2H spectrum for the first time. Partial deuteration of the CH3 group was detected, and the inability to observe the coordinated -OH group was explained.

Relevance: 20.00%

Abstract:

Fluorescence resonance energy transfer (FRET) is a non-radiative energy transfer from a fluorescent donor molecule to an appropriate acceptor molecule and a commonly used technique for developing homogeneous assays. If the emission spectrum of the donor overlaps with the excitation spectrum of the acceptor, FRET may occur. As a consequence, the emission of the donor is decreased and the emission of the acceptor (if fluorescent) is increased. Furthermore, the distance between the donor and the acceptor needs to be short enough, commonly 10-100 Å. Typically, the close proximity between the donor and the acceptor is achieved via bioaffinity interactions, e.g. an antibody binding its antigen. A large variety of donors and acceptors exists. The donor/acceptor pair should be selected based not only on the requirements of FRET but also on the expected performance and the objectives of the application. In this study, the exceptional fluorescence properties of lanthanide chelates were employed to develop two novel homogeneous immunoassays: a non-competitive hapten (estradiol) assay based on a single binder, and a dual-parametric total and free PSA assay. In addition, the quenching efficiencies and energy transfer properties of various donor/acceptor pairs were studied. The donors were either europium(III) or terbium(III) chelates, whereas several organic dyes (both fluorescent and quenchers) acted as acceptors. First, it was shown that if the interaction between the donor/acceptor complexes is of high quality (e.g. biotin-streptavidin), the fluorescence of the europium(III) chelate could be quenched rather efficiently. Furthermore, the quenching-based homogeneous non-competitive assay for estradiol had significantly better sensitivity (~67 times) than a corresponding homogeneous competitive assay using the same assay components. Second, if the acceptors were chosen to emit at the emission minima of the terbium(III) chelate, several acceptor emissions could be measured simultaneously without significant cross-talk from the other acceptors. Based on these results, appropriate acceptors were chosen for the dual-parameter assay. The developed homogeneous dual-parameter assay was able to measure both total and free PSA simultaneously using a simple mix-and-measure protocol. Its correlation with a heterogeneous single-parameter assay was excellent (above 0.99 for both) when spiked human plasma samples were used. However, due to interference from the sample material, the obtained concentrations were slightly lower with the homogeneous assay than with the heterogeneous assay, especially for free PSA. To conclude, two novel immunoassay principles were developed in this work, both of which are adaptable to other analytes. However, the hapten assay requires a rather good antibody with a low dissociation rate and high affinity, whereas the dual-parameter assay principle is applicable whenever two immunometric complexes can form simultaneously, provided that the requirements of FRET are fulfilled.
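The distance dependence described above is usually summarized by the textbook Förster efficiency relation E = 1/(1 + (r/R0)^6), where R0 is the Förster radius of the donor/acceptor pair. The small sketch below evaluates it for a few distances inside the typical 10-100 Å range; the R0 value is illustrative only and does not refer to the lanthanide chelate pairs used in the study.

```python
def fret_efficiency(r_angstrom, r0_angstrom=55.0):
    """Förster transfer efficiency E = 1 / (1 + (r/R0)^6) (textbook relation).

    r_angstrom: donor-acceptor distance; r0_angstrom: Förster radius, the
    distance at which transfer is 50% efficient (55 Å is an illustrative
    value, not a measured one from this work).
    """
    return 1.0 / (1.0 + (r_angstrom / r0_angstrom) ** 6)

for r in (20, 55, 90):           # distances inside the typical 10-100 Å FRET range
    print(r, round(fret_efficiency(r), 3))
```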

Relevance: 20.00%

Abstract:

In this thesis three experiments with atomic hydrogen (H) at low temperatures (T < 1 K) are presented. Experiments were carried out with two-dimensional (2D) and three-dimensional (3D) H gas, and with H atoms trapped in a solid H2 matrix. The main focus of this work is on interatomic interactions, which have certain specific features in the three systems considered. A common feature is the very high density of atomic hydrogen; the systems are close to quantum degeneracy. Short-range interactions in collisions between atoms are important in gaseous H. The system of H in H2 differs dramatically because the atoms remain fixed in the H2 lattice, and its properties are governed by long-range interactions with the solid matrix and with other H atoms. The main tools in our studies were the methods of magnetic resonance, with electron spin resonance (ESR) at 128 GHz used as the principal detection method. For the first time in experiments with H in high magnetic fields and at low temperatures, we combined ESR and NMR to perform electron-nuclear double resonance (ENDOR) as well as coherent two-photon spectroscopy. This allowed us to distinguish between different types of interactions in the magnetic resonance spectra. Experiments with 2D H gas utilized the thermal compression method in a homogeneous magnetic field, developed in our laboratory. In this work methods were developed for direct studies of 3D H at high density, and for creating high-density samples of H in H2. We measured magnetic resonance line shifts due to collisions in the 2D and 3D H gases. First we observed that the cold collision shift in 2D H gas composed of atoms in a single hyperfine state is much smaller than predicted by mean-field theory. This motivated us to carry out similar experiments with 3D H. In 3D H the cold collision shift was found to be an order of magnitude smaller for atoms in a single hyperfine state than for a mixture of atoms in two different hyperfine states. The collisional shifts were found to be in fair agreement with the theory, which takes into account symmetrization of the wave functions of the colliding atoms. The origin of the small shift in 2D H composed of single-hyperfine-state atoms is not yet understood. The measurement of the shift in 3D H provides an experimental determination of the difference of the scattering lengths of ground-state atoms. The experiment with H atoms captured in an H2 matrix at temperatures below 1 K originated from our work with H gas. We found that samples of H in H2 were formed during recombination of gas-phase H, enabling sample preparation at temperatures below 0.5 K. Alternatively, we created the samples by electron impact dissociation of H2 molecules in situ in the solid. By the latter method we reached the highest densities of H atoms reported so far, 3.5(5)×10^19 cm^-3. The H atoms were found to be stable for weeks at temperatures below 0.5 K. The observation of dipolar interaction effects provides a verification of the density measurement. Our results point to two different sites for H atoms in the H2 lattice. The steady-state nuclear polarizations of the atoms were found to be non-thermal. The possibility of further increasing the impurity H density is considered. At higher densities and lower temperatures it might be possible to observe phenomena related to quantum degeneracy in the solid.

Relevance: 20.00%

Abstract:

In any decision making under uncertainty, the goal is usually to minimize the expected cost. The minimization of cost under uncertainty is usually done by optimization. For simple models, the optimization can easily be done using deterministic methods. However, many practical models contain complex and varying parameters that cannot easily be taken into account using the usual deterministic methods of optimization. Thus, it is important to look for other methods that can give insight into such models. The Markov chain Monte Carlo (MCMC) method is one practical method that can be used for the optimization of stochastic models under uncertainty. The method is based on simulation and provides a general methodology that can be applied to nonlinear and non-Gaussian state models. The MCMC method is important for practical applications because it is a unified estimation procedure that simultaneously estimates both parameters and state variables: MCMC computes the distribution of the state variables and parameters given the data measurements. The MCMC method is also fast in terms of computing time compared to other optimization methods. This thesis discusses the use of MCMC methods for the optimization of stochastic models under uncertainty. The thesis begins with a short discussion of Bayesian inference, MCMC, and stochastic optimization methods. Then an example is given of how MCMC can be applied for maximizing production at a minimum cost in a chemical reaction process. It is observed that the method performs well in optimizing the given cost function with very high certainty.
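As a minimal sketch of how MCMC can be used in this setting, the code below implements a generic random-walk Metropolis sampler and applies it to a toy quadratic cost surface; the target, step size, and parameter values are invented for illustration and do not come from the chemical reaction example of the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def metropolis_sample(log_target, x0, n_steps=20_000, step=0.2):
    """Random-walk Metropolis sampler (generic sketch, not the thesis code).

    log_target: log of the (unnormalized) target density, e.g. a Bayesian
    posterior over parameters and state variables, or -cost when MCMC is
    used to explore a cost surface under uncertainty.
    """
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_steps, x.size))
    lp = log_target(x)
    for k in range(n_steps):
        prop = x + step * rng.standard_normal(x.size)   # symmetric Gaussian proposal
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept/reject
            x, lp = prop, lp_prop
        chain[k] = x
    return chain

# toy "cost under uncertainty": quadratic cost with its optimum near (1, -2)
cost = lambda x: 0.5 * ((x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2)
chain = metropolis_sample(lambda x: -cost(x), x0=[0.0, 0.0])
print(chain[10_000:].mean(axis=0))   # chain mean approximates the low-cost region
```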

Relevance: 20.00%

Abstract:

Resonance energy transfer (RET) is a non-radiative transfer of excitation energy from an initially excited luminescent donor to an acceptor. The requirements for resonance energy transfer are: i) spectral overlap between the donor emission spectrum and the acceptor absorption spectrum, ii) close proximity of the donor and the acceptor, and iii) suitable relative orientations of the donor emission and acceptor absorption transition dipoles. As a result of the RET process the donor luminescence intensity and the donor lifetime are decreased; if the acceptor is luminescent, a sensitized acceptor emission appears. The rate of RET depends strongly on the donor–acceptor distance (r) and is inversely proportional to r^6. The distance dependence of RET is utilized in binding assays. The proximity requirement and the selective detection of the RET-modified emission signal allow homogeneous separation-free assays. The term lanthanide-based RET is used when luminescent lanthanide compounds are used as donors. The long luminescence lifetimes, large Stokes' shifts, and intense, sharply spiked emission spectra of the lanthanide donors offer advantages over conventional organic donor molecules. Both organic lanthanide chelates and inorganic up-converting phosphor (UCP) particles have been used as donor labels in RET-based binding assays. In the present work lanthanide luminescence and lanthanide-based resonance energy transfer phenomena were studied. Luminescence lifetime measurements had an essential role in the research. Modular frequency-domain and time-domain luminometers were assembled and used successfully in the lifetime measurements. The frequency-domain luminometer operated in the low-frequency domain (100 kHz) and utilized a novel dual-phase lock-in detection of the luminescence. One of the studied phenomena was the recently discovered non-overlapping fluorescence resonance energy transfer (nFRET). The studied properties were the distance and temperature dependences of nFRET. The distance dependence was found to deviate from the Förster theory, and a clear temperature dependence was observed, whereas conventional RET was completely independent of temperature. Based on the experimental results, two thermally activated mechanisms were proposed for the nFRET process. The work with the UCP particles involved measurement of the luminescence properties of UCP particles synthesized in our laboratory. The goal of the UCP particle research is to develop UCP donor labels for binding assays. In the present work the effect of the dopant concentrations and the core–shell structure on the total up-conversion luminescence intensity, the red–green emission ratio, and the luminescence lifetime was studied. The non-radiative nature of the energy transfer from the UCP particle donors to organic acceptors was also demonstrated for the first time in an aqueous environment and with a controlled donor–acceptor distance.
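The r^6 dependence noted above corresponds to the standard Förster rate law, written here for reference (τ_D is the donor lifetime in the absence of the acceptor and R_0 the Förster radius of the pair; this is the textbook relation, not a result specific to this work):

```latex
k_{\mathrm{RET}}(r) \;=\; \frac{1}{\tau_D}\left(\frac{R_0}{r}\right)^{6},
\qquad
E \;=\; \frac{k_{\mathrm{RET}}}{k_{\mathrm{RET}} + 1/\tau_D}
   \;=\; \frac{1}{1 + (r/R_0)^{6}} .
```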

Relevance: 20.00%

Abstract:

Quite often, the construction of a pulp mill involves establishing the size of the tanks that will accommodate the material from the various processes, in which case estimating the right tank size a priori is vital. Hence, simulation of the whole production process is worthwhile. There is therefore a need to develop mathematical models that mimic the behavior of the output from the various production units of the pulp mill and can serve as simulators. Markov chain models, autoregressive moving average (ARMA) models, mean reversion models with ensemble interaction, and Markov regime switching models are proposed for that purpose.
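As one hedged illustration of the proposed model families, the sketch below simulates a mean-reverting process with Gaussian shocks as a stand-in for a production unit's output feeding a tank; the parameter values are invented and the function is not part of the thesis work.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_mean_reversion(x0, mu, kappa, sigma, dt, n_steps):
    """Euler simulation of a mean-reverting (Ornstein-Uhlenbeck type) process,
    one of the model families proposed above for mimicking a production
    unit's output (all parameter values below are purely illustrative)."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        drift = kappa * (mu - x[k]) * dt                   # pull back toward the mean
        shock = sigma * np.sqrt(dt) * rng.standard_normal()
        x[k + 1] = x[k] + drift + shock
    return x

path = simulate_mean_reversion(x0=100.0, mu=100.0, kappa=0.5, sigma=4.0,
                               dt=1.0, n_steps=500)
print(path.min(), path.max())   # the range hints at the buffer capacity a tank needs
```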

Relevance: 20.00%

Abstract:

Background: Approximately two percent of Finns have sequelae after traumatic brain injury (TBI), and many TBI patients are young or middle-aged. The high rate of unemployment after TBI has major economic consequences for society, and traumatic brain injury often has considerable personal consequences as well. Structural imaging is often needed to support the clinical TBI diagnosis. Accurate early diagnosis is essential for successful rehabilitation and, thus, may also influence the patient's outcome. Traumatic axonal injury and cortical contusions constitute the majority of traumatic brain lesions. Several studies have shown magnetic resonance imaging (MRI) to be superior to computed tomography (CT) in the detection of these lesions. However, traumatic brain injury often leads to persistent symptoms even in cases with few or no findings on conventional MRI. Aims and methods: The aim of this prospective study was to clarify the role of conventional MRI in the imaging of traumatic brain injury, and to investigate how to improve the radiologic diagnostics of TBI by using the more modern diffusion-weighted imaging (DWI) and diffusion tensor imaging (DTI) techniques. We estimated, in a longitudinal study, the visibility of contusions and other intraparenchymal lesions on conventional MRI at one week and one year after TBI. We used DWI-based measurements to look for changes in the diffusivity of the normal-appearing brain in a case-control study. DTI-based tractography was used in a case-control study to evaluate changes in the volume, diffusivity, and anisotropy of the long association tracts in symptomatic TBI patients with no visible signs of intracranial or intraparenchymal abnormalities on routine MRI. We further studied the reproducibility of different tools for identifying and measuring white-matter tracts by using a DTI sequence suitable for clinical protocols. Results: Both the number and the extent of visible traumatic lesions on conventional MRI diminished significantly with time. Slightly increased diffusion in the normal-appearing brain was a common finding at one week after TBI, but it was not significantly associated with injury severity. Fractional anisotropy values, which represent the integrity of the white-matter tracts, were significantly diminished in several tracts in TBI patients compared to the control subjects. Compared to the cross-sectional ROI method, the tract-based analyses had better reproducibility in identifying and measuring white-matter tracts of interest by means of DTI tractography. Conclusions: As conventional MRI is still applied in clinical practice, it should be carried out soon after the injury, at least in symptomatic patients with a negative CT scan. DWI-related brain diffusivity measurements may be used to improve the documentation of TBI. DTI tractography can be used to improve radiologic diagnostics in the symptomatic TBI sub-population with no findings on conventional MRI. The reproducibility of different tools for quantifying fibre tracts varies considerably, which should be taken into consideration in clinical DTI applications.
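For reference, the fractional anisotropy values discussed in the results are computed from the three eigenvalues of the diffusion tensor using the standard DTI definition; the sketch below shows that calculation with purely illustrative eigenvalues, not patient data from this study.

```python
import numpy as np

def fractional_anisotropy(eigvals):
    """Fractional anisotropy from the three diffusion-tensor eigenvalues
    (standard DTI definition; 0 = isotropic diffusion, 1 = fully anisotropic)."""
    l = np.asarray(eigvals, dtype=float)
    md = l.mean()                                   # mean diffusivity
    num = np.sqrt(((l - md) ** 2).sum())
    den = np.sqrt((l ** 2).sum())
    return np.sqrt(1.5) * num / den

print(fractional_anisotropy([1.7e-3, 0.3e-3, 0.3e-3]))   # white-matter-like tensor
print(fractional_anisotropy([0.9e-3, 0.9e-3, 0.9e-3]))   # isotropic tensor -> 0
```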

Relevance: 20.00%

Abstract:

Stochastic approximation methods for stochastic optimization are considered. The main methods of stochastic approximation are reviewed: the stochastic quasi-gradient (SQG) algorithm, the Kiefer-Wolfowitz algorithm together with adaptive rules for them, and the simultaneous perturbation stochastic approximation (SPSA) algorithm. A model and a solution of the retailer's profit optimization problem are suggested, and an application of the SQG algorithm to optimization problems with objective functions given in the form of an ordinary differential equation is considered.
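As a brief illustration of the SPSA algorithm mentioned above, the sketch below implements the standard two-measurement gradient approximation with Bernoulli ±1 perturbations and applies it to a toy noisy quadratic; the gain-sequence constants and the objective are illustrative choices, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

def spsa_minimize(loss, x0, n_iter=2000, a=0.1, c=0.1, alpha=0.602, gamma=0.101):
    """Simultaneous perturbation stochastic approximation (generic sketch):
    each iteration estimates the whole gradient from two noisy loss
    evaluations along a random +/-1 perturbation direction."""
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k ** alpha                       # step-size gain sequence
        ck = c / k ** gamma                       # perturbation gain sequence
        delta = rng.choice([-1.0, 1.0], size=x.size)
        g_hat = (loss(x + ck * delta) - loss(x - ck * delta)) / (2 * ck * delta)
        x = x - ak * g_hat                        # gradient-descent style update
    return x

# toy noisy objective with its minimum at (2, -1)
noisy_loss = lambda x: (x[0] - 2) ** 2 + (x[1] + 1) ** 2 + 0.01 * rng.standard_normal()
print(spsa_minimize(noisy_loss, x0=[0.0, 0.0]))   # approaches the minimizer
```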

Relevance: 20.00%

Abstract:

Information gained from the Human Genome Project and improvements in compound synthesis have increased the number of both therapeutic targets and potential lead compounds. This has created a need for better screening techniques with the capacity to screen numerous compound libraries against an increasing number of targets. Radioactivity-based assays have traditionally been used in drug screening, but fluorescence-based assays have become more popular in high-throughput screening (HTS) as they avoid the safety and waste problems associated with radioactivity. In comparison to conventional fluorescence, more sensitive detection is obtained with time-resolved luminescence, which has increased the popularity of time-resolved fluorescence resonance energy transfer (TR-FRET) based assays. To simplify the current TR-FRET based assay concept, a homogeneous luminometric assay technique utilizing a single label, Quenching Resonance Energy Transfer (QRET), was developed. The technique utilizes a soluble quencher to quench non-specifically the signal of the unbound fraction of the lanthanide-labeled ligand. A single labeling procedure and fewer manipulation steps in the assay concept save resources. The QRET technique is suitable for both biochemical and cell-based assays, as indicated in four studies: 1) a ligand screening study of the β2-adrenergic receptor (cell-based), 2) an activation study of Gs-/Gi-protein coupled receptors by measuring the intracellular concentration of cyclic adenosine monophosphate (cell-based), 3) an activation study of G-protein coupled receptors by observing the binding of guanosine-5’-triphosphate (cell membranes), and 4) an activation study of the small GTP-binding protein Ras (biochemical). Signal-to-background ratios were between 2.4 and 10, and coefficients of variation varied from 0.5 to 17%, indicating suitability for HTS use.

Relevance: 20.00%

Abstract:

A stochastic differential equation (SDE) is a differential equation in which some of the terms, and its solution, are stochastic processes. SDEs play a central role in modeling systems in fields such as finance, biology, and engineering. In the modeling process, the computation of the trajectories (sample paths) of solutions to SDEs is very important. However, the exact solution to an SDE is generally difficult to obtain due to the non-differentiability of realizations of Brownian motion. Approximation methods for the solutions of SDEs therefore exist. The solutions are continuous stochastic processes that represent diffusive dynamics, a common modeling assumption for financial, biological, physical, and environmental systems. This Master's thesis is an introduction to and survey of numerical solution methods for stochastic differential equations. Standard numerical methods, local linearization methods, and filtering methods are described. We compute the root mean square error for each method, from which we propose a better numerical scheme. Stochastic differential equations can also be formulated from given ordinary differential equations. In this thesis, we describe two kinds of formulations: parametric and non-parametric techniques. The formulation is based on the epidemiological SEIR model. These methods tend to increase the number of parameters in the constructed SDEs and hence require more data. We compare the two techniques numerically.
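As an example of the standard numerical schemes surveyed in the thesis, the sketch below implements the Euler-Maruyama method, the simplest of them, and applies it to a geometric Brownian motion; the drift, diffusion, and parameter values are illustrative, and the SEIR formulation of the thesis is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

def euler_maruyama(drift, diffusion, x0, t_end, n_steps):
    """Euler-Maruyama approximation of dX = drift(X, t) dt + diffusion(X, t) dW,
    stepping forward with Gaussian Brownian increments of variance dt."""
    dt = t_end / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    t = 0.0
    for k in range(n_steps):
        dw = np.sqrt(dt) * rng.standard_normal()          # Brownian increment
        x[k + 1] = x[k] + drift(x[k], t) * dt + diffusion(x[k], t) * dw
        t += dt
    return x

# geometric Brownian motion dX = 0.05 X dt + 0.2 X dW (illustrative parameters)
path = euler_maruyama(lambda x, t: 0.05 * x, lambda x, t: 0.2 * x,
                      x0=1.0, t_end=1.0, n_steps=1000)
print(path[-1])   # one simulated terminal value of the sample path
```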