956 results for General-method


Relevance:

60.00%

Publisher:

Abstract:

A general method for the synthesis of selenium- and tellurium-containing triazoles was accomplished via a CuAAC reaction between organic azides and a terminal triple bond generated by in situ deprotection of the silyl group. The reaction tolerates alkyl and aryl azides, with alkyl and aryl substituents bonded directly to the chalcogen atom. The products were readily functionalized by a nickel-catalyzed Negishi cross-coupling reaction, furnishing the aryl-heteroaryl products at the 4-position in good yields. (C) 2012 Elsevier Ltd. All rights reserved.

Relevance:

60.00%

Publisher:

Abstract:

This thesis deals with inflation theory, focusing on the Jarrow & Yildirim model, which is the model used nowadays for pricing inflation derivatives. After recalling the main results about short-rate and forward interest rate models, the dynamics of the main components of the market are derived. The most important inflation-indexed derivatives are then explained (zero-coupon swap, year-on-year swap, cap and floor), and their pricing procedure is shown step by step. Calibration is explained and performed both with a common method and with a heuristic, non-standard one. The model is also enriched with credit risk, which makes it possible to account for a potential default of the counterparty of a contract. In this context the general method of pricing is derived, with the introduction of defaultable zero-coupon bonds, and the Monte Carlo method is treated in detail and used to price a concrete example of a contract. Appendices: A: martingale measures, Girsanov's theorem and the change of numeraire. B: some aspects of the theory of stochastic differential equations (SDEs); in particular, the solution of linear SDEs and the Feynman-Kac theorem, which shows the connection between SDEs and partial differential equations. C: some useful results about the normal distribution.
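The final Monte Carlo pricing step can be sketched in a heavily simplified form. The snippet below is not the thesis's Jarrow & Yildirim implementation: it assumes a lognormal CPI with hypothetical drift and volatility and a flat nominal discount rate, and estimates the value of the inflation leg of a zero-coupon swap, I(T)/I(0) - 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical market inputs (illustrative, not calibrated values)
I0, T = 100.0, 5.0          # CPI today, maturity in years
drift, vol = 0.02, 0.01     # assumed CPI drift and volatility
r = 0.03                    # flat nominal discount rate
n_paths = 200_000

# Lognormal CPI at maturity: I(T) = I0 * exp((drift - vol^2/2) T + vol sqrt(T) Z)
z = rng.standard_normal(n_paths)
IT = I0 * np.exp((drift - 0.5 * vol**2) * T + vol * np.sqrt(T) * z)

# Discounted expected payoff of the inflation leg: I(T)/I(0) - 1
price = np.exp(-r * T) * np.mean(IT / I0 - 1.0)
print(round(price, 4))
```

In the actual Jarrow & Yildirim setup the nominal rate, real rate and CPI are simulated jointly under correlated dynamics rather than with a single lognormal index and a flat discount rate.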

Relevance:

60.00%

Publisher:

Abstract:

This thesis describes a generalized approach to the control of three-phase electric machines. The first part focuses on the development of a general modelling methodology, that is, one capable of describing, from a mathematical point of view, the behaviour of a generic electric machine, and therefore of embracing the salient characteristics of every specific type of electric machine. The next step is to build a control algorithm for electric machines that rests on the generalized theory and uses for its operation the quantities provided by the unified model of electric machines. The control strategy adopted is what is commonly known as field-oriented control (FOC), for which a number of measures were identified to improve its dynamic performance and the control of the delivered torque. Finally, a series of experimental tests is presented, with the aim of highlighting some crucial aspects of controlling electric machines with a field-oriented algorithm and, above all, of verifying the reliability of the generalized approach to three-phase electric machines. The experimental results confirm the applicability of the method to different types of machines (asynchronous and synchronous) and were verified under the most critical operating conditions: low speed, high speed, low load, slow dynamics and fast dynamics.
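Field-oriented control relies on projecting the three phase currents onto a rotating d-q reference frame. A minimal sketch of the standard Clarke and Park transforms (textbook formulas, not the generalized model developed in the thesis):

```python
import math

def clarke(ia, ib, ic):
    """Amplitude-invariant Clarke transform: three phase currents -> alpha-beta."""
    i_alpha = (2.0 / 3.0) * (ia - 0.5 * ib - 0.5 * ic)
    i_beta = (2.0 / 3.0) * (math.sqrt(3) / 2.0) * (ib - ic)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Park transform: rotate alpha-beta into the d-q frame aligned with angle theta."""
    i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
    i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
    return i_d, i_q

# Balanced sinusoidal currents with amplitude 10 A at rotor flux angle theta
theta = 0.7
ia = 10 * math.cos(theta)
ib = 10 * math.cos(theta - 2 * math.pi / 3)
ic = 10 * math.cos(theta + 2 * math.pi / 3)
i_d, i_q = park(*clarke(ia, ib, ic), theta)
print(i_d, i_q)  # the d-axis carries the full amplitude, the q-axis is ~0
```

With the frame aligned to the flux, torque can then be regulated through the q-axis current alone, which is what makes the scheme attractive for dynamic performance.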

Relevance:

60.00%

Publisher:

Abstract:

In this thesis we developed solutions to common issues affecting widefield microscopes, facing the problem of intensity inhomogeneity within an image and dealing with two strong limitations: the impossibility of acquiring highly detailed images representative of whole samples, and of acquiring deep 3D objects. First, we cope with the non-uniform distribution of the light signal inside a single image, known as vignetting. In particular, we proposed, for both light and fluorescence microscopy, non-parametric multi-image methods in which the vignetting function is estimated directly from the sample without requiring any prior information. After obtaining flat-field corrected images, we studied how to overcome the limited field of view of the camera, so as to be able to acquire large areas at high magnification. To this purpose, we developed mosaicing techniques capable of working on-line. Starting from a set of overlapping images acquired manually, we validated a fast registration approach to stitch the images together accurately. Finally, we worked to virtually extend the field of view of the camera in the third dimension, with the purpose of reconstructing a single, completely in-focus image of objects that have significant depth or lie in different focal planes. After studying the existing approaches for extending the depth of focus of the microscope, we proposed a general method that does not require any prior information. To compare the outcomes of existing methods, different standard metrics are commonly used in the literature; however, no metric is available to compare different methods on real cases. First, we validated a metric able to rank the methods as the Universal Quality Index does, but without needing any reference ground truth. Second, we proved that the approach we developed performs better in both synthetic and real cases.
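A crude illustration of multi-image vignetting estimation, not the method proposed in the thesis: assuming a stack of images with statistically similar content, the per-pixel mean (normalized to unit mean) serves as a non-parametric estimate of the vignetting function, and dividing by it yields flat-field corrected images.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stack: uniform 100-intensity scenes shaded by a radial vignetting
# function (darker towards the corners), plus sensor noise
h, w, n = 64, 64, 50
yy, xx = np.mgrid[0:h, 0:w]
r2 = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h * w)
vignette = 1.0 - 0.5 * r2
stack = 100.0 * vignette + rng.normal(0, 1, (n, h, w))

# Non-parametric multi-image estimate: per-pixel mean, normalized to unit mean
flat = stack.mean(axis=0)
flat /= flat.mean()

corrected = stack / flat   # flat-field corrected images
print(stack.mean(axis=0).std(), corrected.mean(axis=0).std())  # corrected is far flatter
```

Real methods must additionally cope with structured scene content, which is why the thesis estimates the vignetting function robustly from the sample rather than from a plain average.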

Relevance:

60.00%

Publisher:

Abstract:

Heusler materials have so far been studied mainly as bulk and thin-film samples because of their technological importance. This thesis reports experimental investigations of the chemical synthesis, structure and magnetic properties of ternary Heusler nanoparticles. The fundamental physics, chemistry and materials-science aspects of Heusler nanoparticles were examined. In addition, a silica-supported preparation method for carbon-coated, ternary intermetallic Co2FeGa nanoparticles was developed. The formation of the L21 Co2FeGa phase was confirmed by X-ray diffraction (XRD), extended X-ray absorption fine structure spectroscopy (EXAFS) and 57Fe Mössbauer spectroscopy. The dependence of the phase and the size of the Co2FeGa nanoparticles on the composition of the precursor and the silica was investigated. By combining the particle sizes obtained from transmission electron microscopy (TEM) with Mössbauer spectroscopy, the critical size for the transition from superparamagnetic to ferromagnetic behaviour of Co2FeGa nanoparticles was determined. The silica-supported chemical synthesis of Co2FeGa nanoparticles holds great potential as a general preparation method for Co-based Heusler nanoparticles. Furthermore, a chemical preparation method for metallic nanoparticles using synchrotron radiation was also investigated; the nanoparticles obtained in this way are promising materials for nanobiotechnology and nanomedicine.

Relevance:

60.00%

Publisher:

Abstract:

The abundance of alpha-fetoprotein (AFP), a natural protein produced by the fetal yolk sac during pregnancy, correlates with lower incidence of estrogen receptor positive (ER+) breast cancer. The pharmacophore region of AFP has been narrowed down to a four amino acid (AA) region in the third domain of the 591 AA peptide. Our computational study focuses on a 4-mer segment consisting of the amino acids threonine-proline-valine-asparagine (TPVN). We have run replica exchange molecular dynamics (REMD) simulations and used 120 configurational snapshots from the total trajectory as starting configurations for quantum chemical calculations. We optimized structures using semiempirical (PM3, PM6, PM6-D2, PM6-H2, PM6-DH+, PM6-DH2) and density functional methods (TPSS, PBE0, M06-2X). By comparing the accuracy of these methods against RI-MP2 benchmarks, we devised a protocol for calculating the lowest energy conformers of these peptides accurately and efficiently. This protocol screens out high-energy conformers using lower levels of theory and outlines a general method for predicting small peptide structures.
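The screening idea of the protocol, discarding conformers that a cheap level of theory already places far above the minimum before refining the rest at a higher level, can be sketched generically; the energies and the 5 kcal/mol window below are hypothetical, not values from the study.

```python
# Hypothetical relative energies (kcal/mol) for 8 conformers at a cheap level
# of theory (e.g. a PM6 variant); in practice these come from a QM package.
cheap = {1: 0.0, 2: 1.2, 3: 7.9, 4: 2.8, 5: 11.3, 6: 0.4, 7: 6.1, 8: 3.0}

CUTOFF = 5.0  # assumed screening window above the cheap-level minimum

e_min = min(cheap.values())
survivors = sorted(k for k, e in cheap.items() if e - e_min <= CUTOFF)
print(survivors)  # only these proceed to the expensive (e.g. RI-MP2) refinement
```

The saving comes from running the expensive benchmark level only on the surviving low-energy conformers instead of all 120 snapshots.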

Relevance:

60.00%

Publisher:

Abstract:

In many applications the observed data can be viewed as a censored, high-dimensional full-data random variable X. By the curse of dimensionality it is typically not possible to construct estimators that are asymptotically efficient at every probability distribution in a semiparametric censored-data model of such a high-dimensional censored data structure. We provide a general method for constructing one-step estimators that are efficient at a chosen submodel of the full-data model, are still well behaved off this submodel, and can be chosen to always improve on a given initial estimator. These one-step estimators rely on good estimators of the censoring mechanism and thus require a parametric or semiparametric model for the censoring mechanism. We present a general theorem that provides a template for proving the desired asymptotic results. We illustrate the general one-step estimation methods by constructing locally efficient one-step estimators of marginal distributions and regression parameters with right-censored data, current status data and bivariate right-censored data, in all models allowing the presence of time-dependent covariates. The conditions of the asymptotic theorem are rigorously verified in one of the examples, and the key condition of the general theorem is verified for all examples.

Relevance:

60.00%

Publisher:

Abstract:

This dissertation concerns the intersection of three areas of discrete mathematics: finite geometries, design theory, and coding theory. The central theme is the power of finite geometry designs, which are constructed from the points and t-dimensional subspaces of a projective or affine geometry. We use these designs to construct and analyze combinatorial objects which inherit their best properties from these geometric structures. A central question in the study of finite geometry designs is Hamada’s conjecture, which proposes that finite geometry designs are the unique designs with minimum p-rank among all designs with the same parameters. In this dissertation, we will examine several questions related to Hamada’s conjecture, including the existence of counterexamples. We will also study the applicability of certain decoding methods to known counterexamples. We begin by constructing an infinite family of counterexamples to Hamada’s conjecture. These designs are the first infinite class of counterexamples for the affine case of Hamada’s conjecture. We further demonstrate how these designs, along with the projective polarity designs of Jungnickel and Tonchev, admit majority-logic decoding schemes. The codes obtained from these polarity designs attain error-correcting performance which is, in certain cases, equal to that of the finite geometry designs from which they are derived. This further demonstrates the highly geometric structure maintained by these designs. Finite geometries also help us construct several types of quantum error-correcting codes. We use relatives of finite geometry designs to construct infinite families of q-ary quantum stabilizer codes. We also construct entanglement-assisted quantum error-correcting codes (EAQECCs) which admit a particularly efficient and effective error-correcting scheme, while also providing the first general method for constructing these quantum codes with known parameters and desirable properties. Finite geometry designs are used to give exceptional examples of these codes.
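The p-rank at the heart of Hamada's conjecture is the rank of a design's incidence matrix over GF(p). As a small illustration (not taken from the dissertation), the sketch below computes the 2-rank of the Fano plane, the point-line design of PG(2,2), by Gaussian elimination mod 2; the result, 4, is the dimension of the binary [7,4] Hamming code spanned by the lines.

```python
def rank_mod_p(rows, p):
    """Rank of an integer matrix over GF(p) via Gaussian elimination."""
    m = [[x % p for x in row] for row in rows]
    rank, n_rows, n_cols = 0, len(m), len(m[0])
    for col in range(n_cols):
        pivot = next((r for r in range(rank, n_rows) if m[r][col]), None)
        if pivot is None:
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        inv = pow(m[rank][col], -1, p)             # modular inverse of the pivot
        m[rank] = [(x * inv) % p for x in m[rank]]
        for r in range(n_rows):
            if r != rank and m[r][col]:
                f = m[r][col]
                m[r] = [(a - f * b) % p for a, b in zip(m[r], m[rank])]
        rank += 1
    return rank

# Incidence matrix of the Fano plane PG(2,2): 7 lines x 7 points
lines = [(1, 2, 3), (1, 4, 5), (1, 6, 7), (2, 4, 6), (2, 5, 7), (3, 4, 7), (3, 5, 6)]
A = [[1 if pt in ln else 0 for pt in range(1, 8)] for ln in lines]
print(rank_mod_p(A, 2))  # 2-rank of the Fano plane: 4
```

Comparing such ranks across designs with identical parameters is exactly how candidate counterexamples to Hamada's conjecture are tested.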

Relevance:

60.00%

Publisher:

Abstract:

The reproducibility of metabolite content determined by MR spectroscopy (MRS) is usually at best a few percent for the prominent singlets. When studying low-concentration metabolites, like phenylalanine (Phe), where tissue content can be <100 micromol/kg, better reproducibility is paramount, particularly in view of using MRS results for potential individual treatment advice. An optimized, targeted spectroscopy method was established at 1.5T and reproducibility was assessed in 21 patients with phenylketonuria (PKU), for whom three spectra were recorded in each of three independent sessions, two of which were in immediate succession to minimize physiologic variation. Intersession variation was found to be only 7 micromol/kg Phe for back-to-back repetition of sessions, in close agreement with the variation of 16 micromol/kg observed for single spectra within a session. Analysis of variance proved the individuality of the blood/brain Phe ratio, though this ratio seems to be influenced by physiologic factors that are not stable in time. The excellent reproducibility was achieved through optimization of various factors, including signal-to-noise ratio, repositioning, and prescan calibrations, but also by enforcing as much prior information as possible (e.g., lineshape and phase from reference scans, constant prior-knowledge-locked baseline). While the application of maximum general prior knowledge is a general method to reduce fluctuations, one should remember that it may introduce systematic errors.

Relevance:

60.00%

Publisher:

Abstract:

Spatial tracking is one of the most challenging and important parts of Mixed Reality environments. Many applications, especially in the domain of Augmented Reality, rely on the fusion of several tracking systems in order to optimize the overall performance. While the topic of spatial tracking sensor fusion has already seen considerable interest, most results only deal with the integration of carefully arranged setups as opposed to dynamic sensor fusion setups. A crucial prerequisite for correct sensor fusion is the temporal alignment of the tracking data from the individual sensors. Tracking sensors typically encountered in Mixed Reality applications are generally not synchronized. We present a general method to calibrate the temporal offset between different sensors using Time Delay Estimation, which can be used to perform on-line temporal calibration. By applying Time Delay Estimation to the tracking data, we show that the temporal offset between generic Mixed Reality spatial tracking sensors can be calibrated. To show the correctness and the feasibility of this approach, we examined different variations of our method and evaluated various combinations of tracking sensors. We furthermore integrated this time synchronization method into our UBITRACK Mixed Reality tracking framework to provide facilities for calibration and real-time data alignment.
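The idea behind cross-correlation-based Time Delay Estimation can be sketched on synthetic data. The delay, signal model (a broadband test signal rather than real tracking streams) and noise levels below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated sensor streams: sensor B reports the same broadband signal as
# sensor A, delayed by 7 samples; both carry a little measurement noise
n, true_delay = 500, 7
sig = rng.normal(0, 1, n + true_delay)
a = sig[true_delay:true_delay + n] + rng.normal(0, 0.05, n)
b = sig[:n] + rng.normal(0, 0.05, n)          # b(t) ~ a(t - 7)

# Time Delay Estimation: the lag maximizing the cross-correlation of the
# mean-removed signals estimates the temporal offset between the two sensors
a0, b0 = a - a.mean(), b - b.mean()
lag = int(np.argmax(np.correlate(b0, a0, mode="full"))) - (n - 1)
print(lag)
```

On real tracking data one would correlate a derived scalar quantity (e.g. the magnitude of motion) from each sensor, but the peak-picking principle is the same.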

Relevance:

60.00%

Publisher:

Abstract:

We present a general method for inserting proofs in Frege systems for classical logic that produces systems that can internalize their own proofs.

Relevance:

60.00%

Publisher:

Abstract:

Currently there is no general method to study the impact of population admixture within families on the assumptions of random mating and, consequently, on Hardy-Weinberg equilibrium (HWE) and linkage equilibrium (LE), or on the inference obtained from traditional linkage analysis. First, through simulation, the effect of admixture of two populations on the log-of-the-odds (LOD) score was assessed, using prostate cancer as the typical disease model. Comparisons between simulated mixed and homogeneous families were performed. LOD scores under both models of admixture (within families and within a data set of homogeneous families) were closest to the homogeneous-family scores of the population with the highest mixing proportion. Random sampling of families or ascertainment of families by disease affection status did not affect this observation, nor did the mode of inheritance (dominant/recessive) or the sample size. Second, after establishing the effect of admixture on the LOD score and the inference for linkage, the presence of disequilibria induced by population admixture within families was studied and an adjustment procedure was developed. The adjustment did not force all disequilibria to disappear, but because the families were adjusted for the population admixture, the replicates where the disequilibria exist are no longer affected by them in terms of maximization for linkage. Furthermore, the adjustment was able to exclude uninformative families, or families with such a high departure from HWE and/or LE that their LOD scores were not reliable. Together these observations imply that the presence of families of mixed population ancestry affects linkage analysis in terms of both the LOD score and the estimate of the recombination fraction.
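Departure from random mating is classically detected with a chi-square test against Hardy-Weinberg proportions. A minimal sketch with hypothetical genotype counts (here chosen to sit exactly at HWE, so the statistic is zero):

```python
# Hypothetical genotype counts at a biallelic marker (AA, Aa, aa)
n_AA, n_Aa, n_aa = 360, 480, 160
n = n_AA + n_Aa + n_aa

# Allele frequency of A, and expected genotype counts under HWE
p = (2 * n_AA + n_Aa) / (2 * n)
q = 1 - p
expected = [n * p * p, 2 * n * p * q, n * q * q]

# Pearson chi-square statistic (1 df); a large value signals departure from
# HWE, e.g. the kind induced by population admixture within families
observed = [n_AA, n_Aa, n_aa]
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(p, chi2)
```

Admixed samples tend to show a deficit of heterozygotes relative to the expected 2npq count (the Wahlund effect), which inflates this statistic.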

Relevance:

60.00%

Publisher:

Abstract:

Authors of experimental, empirical, theoretical and computational studies of two-sided matching markets have recognized the importance of correlated preferences. We develop a general method for the study of the effect of correlation of preferences on the outcomes generated by two-sided matching mechanisms. We then illustrate our method by using it to quantify the effect of correlation of preferences on satisfaction with the men-propose Gale-Shapley matching for a simple one-to-one matching problem.
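The men-propose Gale-Shapley mechanism referenced above can be sketched directly; the two-man, two-woman preference lists are an illustrative toy instance, not data from the paper.

```python
def gale_shapley(men_prefs, women_prefs):
    """Men-propose deferred acceptance; returns a stable matching {man: woman}."""
    rank = {w: {m: i for i, m in enumerate(p)} for w, p in women_prefs.items()}
    next_idx = {m: 0 for m in men_prefs}   # next woman each man will propose to
    engaged = {}                           # woman -> man
    free = list(men_prefs)
    while free:
        m = free.pop()
        w = men_prefs[m][next_idx[m]]
        next_idx[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:   # w prefers the new proposer
            free.append(engaged[w])
            engaged[w] = m
        else:
            free.append(m)                       # w rejects m; he proposes again
    return {m: w for w, m in engaged.items()}

men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
match = gale_shapley(men, women)
print(match)
```

Correlation of preferences enters such a study by generating the preference lists from a common component plus idiosyncratic noise and measuring how satisfaction with the resulting matching changes as the common component grows.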

Relevance:

60.00%

Publisher:

Abstract:

It is well known that an identification problem exists in the analysis of age-period-cohort data because of the relationship among the three factors (date of birth + age at death = date of death). There are numerous suggestions about how to analyze such data, but no single solution has been satisfactory. The purpose of this study is to provide another analytic method by extending Cox's life-table regression model with time-dependent covariates. The new approach has the following features: (1) It is based on the conditional maximum likelihood procedure using the proportional hazard function described by Cox (1972), treating the age factor as the underlying hazard in order to estimate the parameters for the cohort and period factors. (2) The model is flexible, so both the cohort and period factors can be treated as dummy or continuous variables, and parameter estimates can be obtained for numerous combinations of variables, as in a regression analysis. (3) The model is applicable even when the time periods are unequally spaced. Two specific models are considered to illustrate the new approach and applied to the U.S. prostate cancer data. We find that there are significant differences between all cohorts and a significant period effect for both whites and nonwhites. The underlying hazard increases exponentially with age, indicating that old people have a much higher risk than young people. A log transformation of relative risk shows that prostate cancer risk declined in recent cohorts under both models. However, prostate cancer risk declined 5 cohorts (25 years) earlier for whites than for nonwhites under the period-factor model (0 0 0 1 1 1 1). These latter results are similar to the previous study by Holford (1983). The new approach offers a general method to analyze age-period-cohort data without imposing any arbitrary constraint in the model.
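The identification problem stated at the outset is visible directly in a design matrix: because cohort + age = period, the three factors entered together as continuous covariates are perfectly collinear. A small sketch with made-up records:

```python
import numpy as np

# Age, period (year of death) and cohort (year of birth) for a few records;
# cohort + age = period, so the three effects are perfectly collinear
age = np.array([50, 55, 60, 65, 70])
period = np.array([1970, 1980, 1975, 1985, 1990])
cohort = period - age

# Design matrix with intercept and all three factors as continuous covariates
X = np.column_stack([np.ones_like(age), age, period, cohort]).astype(float)
print(np.linalg.matrix_rank(X))  # 3, not 4: the joint model is not identifiable
```

Absorbing age into the baseline hazard, as the proposed Cox-type model does, removes one of the three collinear factors and sidesteps the need for an arbitrary identifying constraint.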

Relevance:

60.00%

Publisher:

Abstract:

Publishing Linked Data is a process that involves several design decisions and technologies. Although some initial guidelines have already been provided by Linked Data publishers, these are still far from covering all the steps that are necessary (from data source selection to publication) or from giving enough detail about all these steps, technologies, intermediate products, etc. Furthermore, given the variety of data sources from which Linked Data can be generated, we believe that it is possible to have a single, unified method for publishing Linked Data, while relying on different techniques, technologies and tools for particular datasets of a given domain. In this paper we present a general method for publishing Linked Data and the application of the method to cover different sources from different domains.
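One step of such a pipeline, mapping tabular source records to RDF triples, can be sketched with plain string formatting; the example.org namespace, the resource identifiers and the population property URI below are placeholders, not a recommended vocabulary.

```python
# Map tabular records to RDF triples in N-Triples syntax (stdlib only; real
# pipelines would use an RDF library and a proper mapping language)
records = [
    {"id": "city/madrid", "name": "Madrid", "population": "3223000"},
    {"id": "city/bologna", "name": "Bologna", "population": "388000"},
]

BASE = "http://example.org/resource/"          # placeholder namespace
FOAF_NAME = "http://xmlns.com/foaf/0.1/name"   # FOAF 'name' property
POP = "http://example.org/ontology/population" # hypothetical property URI

def to_ntriples(rec):
    s = f"<{BASE}{rec['id']}>"
    yield f'{s} <{FOAF_NAME}> "{rec["name"]}" .'
    yield f'{s} <{POP}> "{rec["population"]}" .'

triples = [t for rec in records for t in to_ntriples(rec)]
print(len(triples))
```

Steps such as URI design, vocabulary selection and link generation would precede and follow this mapping in a complete publication method.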