882 results for exponential sum on elliptic curve
Abstract:
The important application of semistatic hedging in financial markets naturally leads to the notion of quasi-self-dual processes. The focus of our study is to give new characterizations of quasi-self-duality. We analyze quasi-self-dual Lévy-driven markets which do not admit arbitrage opportunities and derive a set of equivalent conditions for the stochastic logarithm of quasi-self-dual martingale models. Since for nonvanishing order parameter two martingale properties have to be satisfied simultaneously, there is a nontrivial relation between the order and shift parameters, the latter representing carrying costs in financial applications. This leads to an equation containing an integral term which has to be inverted in applications. We first discuss several important properties of this equation and, for some well-known Lévy-driven models, we derive a family of closed-form inversion formulae.
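For orientation, a hedged sketch of the classical self-duality relation that the quasi-self-dual (order-alpha) notion generalizes; the notation below is an illustration and does not reproduce the authors' exact definition. A positive martingale price process $S$ with $S_0 = 1$ is called self-dual if, for all suitable payoff functions $f$,

\[
  \mathbb{E}\bigl[f(S_T)\bigr] \;=\; \mathbb{E}\bigl[S_T\, f(1/S_T)\bigr],
\]

which underlies put-call symmetry and hence semistatic hedging of barrier options. Quasi-self-duality of order $\alpha$ imposes an analogous relation on a suitably discounted power $S^\alpha$, and the additional martingale requirement on that power is what ties the order parameter to the shift (carrying-cost) parameter mentioned above.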
Abstract:
PURPOSE To systematically evaluate the dependence of intravoxel incoherent motion (IVIM) parameters on the b-value threshold separating the perfusion and diffusion compartments, and to implement and test an algorithm for the standardized computation of this threshold. METHODS Diffusion-weighted images of the upper abdomen were acquired at 3 Tesla in eleven healthy male volunteers with 10 different b-values and in two healthy male volunteers with 16 different b-values. Region-of-interest IVIM analysis was applied to the abdominal organs and skeletal muscle with a systematic increase of the b-value threshold for computing pseudodiffusion D*, perfusion fraction Fp, diffusion coefficient D, and the sum of squared residuals of the bi-exponential IVIM fit. RESULTS IVIM parameters strongly depended on the choice of the b-value threshold. The proposed algorithm successfully provided optimal b-value thresholds with the smallest residuals for all evaluated organs [s/mm²]: e.g., right liver lobe 20, spleen 20, right renal cortex 150, skeletal muscle 150. Mean D* [10⁻³ mm²/s], Fp [%], and D [10⁻³ mm²/s] values (± standard deviation) were: right liver lobe: 88.7 ± 42.5, 22.6 ± 7.4, 0.73 ± 0.12; right renal cortex: 11.5 ± 1.8, 18.3 ± 2.9, 1.68 ± 0.05; spleen: 41.9 ± 57.9, 8.2 ± 3.4, 0.69 ± 0.07; skeletal muscle: 21.7 ± 19.0, 7.4 ± 3.0, 1.36 ± 0.04. CONCLUSION IVIM parameters strongly depend upon the choice of the b-value threshold used for computation. The proposed algorithm may be used as a robust approach for IVIM analysis without organ-specific adaptation. Magn Reson Med, 2014.
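A minimal sketch of the threshold-scanning idea, assuming the standard bi-exponential IVIM signal model S(b)/S(0) = Fp·exp(−b·D*) + (1 − Fp)·exp(−b·D) and a segmented fit (D estimated from the high-b regime first, then Fp and D* with D fixed); the function names, b-values, and synthetic test signal below are illustrative and not the authors' implementation.

import numpy as np
from scipy.optimize import curve_fit

def ivim(b, Fp, Dstar, D):
    # Bi-exponential IVIM signal model, normalized to S(b = 0) = 1.
    return Fp * np.exp(-b * Dstar) + (1 - Fp) * np.exp(-b * D)

def fit_with_threshold(b, s, b_thr):
    # Segmented fit: estimate D mono-exponentially from the high-b regime,
    # then fit Fp and D* over all b-values with D held fixed.
    hi = b >= b_thr
    slope, _ = np.polyfit(b[hi], np.log(s[hi]), 1)
    D = -slope
    popt, _ = curve_fit(lambda bb, Fp, Dstar: ivim(bb, Fp, Dstar, D),
                        b, s, p0=[0.1, 0.05], bounds=([0.0, D], [1.0, 1.0]))
    Fp, Dstar = popt
    ssr = float(np.sum((s - ivim(b, Fp, Dstar, D)) ** 2))
    return Fp, Dstar, D, ssr

# Scan candidate thresholds and keep the one with the smallest residuals.
b_values = np.array([0, 10, 20, 40, 80, 150, 300, 500, 700, 900], dtype=float)  # s/mm^2
signal = ivim(b_values, 0.2, 0.05, 1.0e-3)   # synthetic noise-free example signal
results = [fit_with_threshold(b_values, signal, thr) + (thr,) for thr in b_values[1:-1]]
Fp, Dstar, D, ssr, thr = min(results, key=lambda r: r[3])
print("Fp=%.3f  D*=%.4f  D=%.5f  SSR=%.2e  at b-threshold %d s/mm^2" % (Fp, Dstar, D, ssr, thr))

Keeping the fit with the smallest sum of squared residuals mirrors the selection criterion described in the abstract.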
Abstract:
Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains, on a detailed mathematical level, the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate-and-fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when a noisy membrane potential is balanced around values close to the firing threshold, leads to a particularly simple, approximate relationship between the neuron's ISI distribution and its input current. The approximation quality depends on the frequency spectrum of the current and improves as the voltage baseline is increased towards threshold. Thus, the conceptually simpler leaky integrate-and-fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimates of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational theories about UP states during slow wave sleep and present possible extensions of the model in the context of spike-frequency adaptation.
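A minimal sketch of the exponential integrate-and-fire dynamics referred to above, simulated with Euler steps under a noisy drive held near threshold so that interspike intervals can be collected as samples; the parameter values and noise model are illustrative assumptions, not those of the study.

import numpy as np

# Exponential integrate-and-fire (EIF) neuron driven by a noisy input current.
# All parameter values (mV, ms) are illustrative, not those used in the study.
tau_m, E_L, V_T, Delta_T = 20.0, -65.0, -50.0, 2.0   # time constant, leak, threshold, slope factor
V_reset, V_cut = -60.0, -30.0                        # reset potential and numerical spike cutoff
mu, sigma = 13.0, 3.0                                # mean drive near rheobase, noise amplitude (mV)
dt, T = 0.05, 20000.0                                # time step and simulated duration (ms)

rng = np.random.default_rng(0)
V, last_spike, isis = E_L, 0.0, []

for i in range(int(T / dt)):
    # EIF membrane equation: leak + exponential sodium term + stochastic drive.
    drift = (-(V - E_L) + Delta_T * np.exp((V - V_T) / Delta_T) + mu) / tau_m
    V += drift * dt + sigma * np.sqrt(dt / tau_m) * rng.standard_normal()
    if V >= V_cut:                                   # spike: store the preceding ISI, then reset
        t = i * dt
        isis.append(t - last_spike)
        last_spike, V = t, V_reset

isis = np.array(isis)
print("spikes: %d, mean ISI: %.1f ms, ISI CV: %.2f"
      % (isis.size, isis.mean(), isis.std() / isis.mean()))

The recorded ISIs play the role of the samples discussed in the abstract; their empirical distribution can then be compared against the input current's statistics.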
Abstract:
We determine the mass of the bottom quark from high moments of the $b\bar b$ production cross section in $e^+e^-$ annihilation, which are dominated by the threshold region. On the theory side, next-to-next-to-next-to-leading order (NNNLO) calculations both for the resonances and for the continuum cross section are used for the first time. We find $m_b^{\rm PS}(2\,\mathrm{GeV}) = 4.532^{+0.013}_{-0.039}\,\mathrm{GeV}$ for the potential-subtracted mass and $\overline{m}_b(\overline{m}_b) = 4.193^{+0.022}_{-0.035}\,\mathrm{GeV}$ for the $\overline{\rm MS}$ bottom-quark mass.
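As a hedged sketch of the sum-rule machinery behind such a determination (conventional normalization assumed; the authors' exact moments and matching conditions are not reproduced here), one compares experimental and theoretical moments of the normalized $b\bar b$ cross section,

\[
  \mathcal{M}_n \;=\; \int \frac{\mathrm{d}s}{s^{\,n+1}}\, R_{b\bar b}(s),
  \qquad
  R_{b\bar b}(s) \;=\; \frac{\sigma\!\left(e^+e^- \to b\bar b + X\right)}{\sigma\!\left(e^+e^- \to \mu^+\mu^-\right)},
\]

where the experimental side sums the narrow resonances plus the continuum, and the theoretical side, here evaluated at NNNLO, depends sensitively on $m_b$ for high $n$, so that equating $\mathcal{M}_n^{\rm exp} = \mathcal{M}_n^{\rm th}(m_b)$ extracts the mass.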
Abstract:
OBJECTIVES In HIV-negative populations, light to moderate alcohol consumption is associated with lower cardiovascular morbidity and mortality than alcohol abstention. Whether the same holds true for HIV-infected individuals has not been evaluated in detail. DESIGN Cohort study. METHODS Adults on antiretroviral therapy in the Swiss HIV Cohort Study with follow-up after August 2005 were included. We categorized alcohol consumption into abstention, low (1-9 g/d), moderate (10-29 g/d in women and 10-39 g/d in men), and high intake. Cox proportional hazards models were used to describe the association between alcohol consumption and cardiovascular disease-free survival (combined endpoint), as well as cardiovascular disease events (CADE) and overall survival. Baseline and time-updated risk factors for CADE were included in the models. RESULTS Among the 9,741 individuals included, there were 788 events of major CADE or death during 46,719 person-years of follow-up, corresponding to an incidence of 1.69 events/100 person-years. Follow-up according to alcohol consumption level was 51% abstention, 20% low, 23% moderate, and 6% high intake. Compared with abstention, low (hazard ratio 0.79, 95% confidence interval 0.63-0.98) and moderate alcohol intake (0.78, 0.64-0.95) were associated with a lower incidence of the combined endpoint. There was no significant association between alcohol consumption and CADE. CONCLUSIONS Compared with abstention, low and moderate alcohol intake were associated with better CADE-free survival. However, this result was mainly driven by mortality, and the specific impact of drinking patterns and type of alcoholic beverage on this outcome remains to be determined.
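A sketch of the modelling set-up described above; the covariate coding is an assumption for illustration rather than the authors' exact specification. With abstention as the reference category, the Cox proportional hazards model for the combined endpoint reads

\[
  h(t \mid x) \;=\; h_0(t)\,
  \exp\!\bigl(\beta_1\,\mathrm{low}_i + \beta_2\,\mathrm{moderate}_i + \beta_3\,\mathrm{high}_i + \gamma^{\top} z_i(t)\bigr),
\]

where $z_i(t)$ collects the baseline and time-updated cardiovascular risk factors and $\exp(\beta_k)$ is the reported hazard ratio for intake category $k$ versus abstention (e.g., 0.79 for low intake).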
Abstract:
The currently proposed space debris remediation measures include the active removal of large objects and “just in time” collision avoidance by deviating the objects using, e.g., ground-based lasers. Both techniques require precise knowledge of the attitude state and state changes of the target objects: in the former case, to devise methods to grapple the target with a tug spacecraft; in the latter, to precisely propagate the orbits of potential collision partners, as disturbing forces like air drag and solar radiation pressure depend on the attitude of the objects. Non-resolving optical observations of the magnitude variations, so-called light curves, are a promising technique to determine rotation or tumbling rates and the orientation of the actual rotation axis of objects, as well as their temporal changes. The 1-meter telescope ZIMLAT of the Astronomical Institute of the University of Bern has been used to collect light curves of MEO and GEO objects for a considerable period of time. Recently, light curves of Low Earth Orbit (LEO) targets were acquired as well. We present different observation methods, including active tracking using a CCD subframe readout technique and the use of a high-speed scientific CMOS camera. Technical challenges when tracking objects with poor orbit predictions, as well as different data reduction methods, are addressed. Results from a survey of abandoned rocket upper stages in LEO, examples of abandoned payloads, and observations of high area-to-mass-ratio debris will be presented. Finally, first results of the analysis of these light curves are provided.
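The abstract does not specify how rotation periods are extracted from the light curves; a common choice is a Lomb-Scargle periodogram search, sketched below on a hypothetical unevenly sampled light curve (all arrays and values are illustrative).

import numpy as np
from astropy.timeseries import LombScargle

# Hypothetical, unevenly sampled light curve: epochs (s) and apparent magnitudes.
# Replace with magnitudes extracted from real ZIMLAT frames.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 600.0, 400))
mag = 12.0 + 0.4 * np.sin(2 * np.pi * t / 7.3) + 0.05 * rng.standard_normal(t.size)

# Lomb-Scargle periodogram over plausible spin periods (here 2 s to 300 s).
frequency, power = LombScargle(t, mag).autopower(minimum_frequency=1.0 / 300.0,
                                                 maximum_frequency=0.5)
best_period = 1.0 / frequency[np.argmax(power)]

# Symmetric or specular objects often show two brightness maxima per rotation,
# so the true spin period may be twice the photometric period.
print("photometric period: %.2f s (possible spin period: %.2f s)"
      % (best_period, 2.0 * best_period))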
Abstract:
Resistance in Neisseria gonorrhoeae to all available therapeutic antimicrobials has emerged, and new efficacious drugs for the treatment of gonorrhea are essential. The topoisomerase II inhibitor ETX0914 (also known as AZD0914) is a new spiropyrimidinetrione antimicrobial with a mechanism of action different from that of all previous and current gonorrhea treatment options. In this study, the N. gonorrhoeae resistance determinants for ETX0914 were further described, and the effects of ETX0914 on the growth of N. gonorrhoeae (ETX0914 wild-type strains, single-step-selected resistant mutants, and efflux pump mutants) were examined in a novel in vitro time-kill curve analysis to estimate pharmacodynamic parameters of the new antimicrobial. For comparison, ciprofloxacin, azithromycin, ceftriaxone, and tetracycline were also examined (separately and in combination with ETX0914). ETX0914 was rapidly bactericidal for all wild-type strains and had pharmacodynamic properties similar to those of ciprofloxacin. All selected resistant mutants contained mutations in amino acid codons D429 or K450 of GyrB, and inactivation of the MtrCDE efflux pump fully restored susceptibility to ETX0914. ETX0914 alone and in combination with azithromycin and ceftriaxone was highly effective against N. gonorrhoeae, and a synergistic interaction with ciprofloxacin, particularly for ETX0914-resistant mutants, was found. ETX0914, as monotherapy or in combination with azithromycin (to cover additional sexually transmitted infections), should be considered for phase III clinical trials and future gonorrhea treatment.
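The time-kill analysis itself is not detailed in the abstract; the sketch below shows one generic approach, assuming net kill rates are taken as slopes of log10 CFU/ml versus time at each concentration and then summarized with a sigmoidal Emax (Hill) model. All arrays, values, and parameter names are illustrative, not the authors' data or method.

import numpy as np
from scipy.optimize import curve_fit

def net_rate(times_h, log10_cfu):
    # Net growth/kill rate (log10 CFU/ml per hour) as the slope of a time-kill curve.
    slope, _ = np.polyfit(times_h, log10_cfu, 1)
    return slope

def hill(conc, k_growth, k_max, ec50, h):
    # Sigmoidal Emax (Hill) model: drug-free growth minus concentration-dependent kill.
    return k_growth - k_max * conc**h / (ec50**h + conc**h)

# Illustrative inputs: antimicrobial concentrations (multiples of the MIC) and net
# rates that would be obtained with net_rate() from hypothetical time-kill curves.
conc = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0])
rates = np.array([0.35, 0.30, 0.10, -0.40, -0.80, -0.95, -1.00])

popt, _ = curve_fit(hill, conc, rates, p0=[0.35, 1.3, 1.0, 2.0],
                    bounds=([0.0, 0.0, 0.01, 0.5], [2.0, 5.0, 20.0, 10.0]))
print("k_growth=%.2f  k_max=%.2f  EC50=%.2f xMIC  Hill=%.1f" % tuple(popt))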
Abstract:
This article analyzes the Jakobsonian classification of aphasias. It aims to show on the one hand the non-linguistic character of this classification and on the other hand its asymmetry, in spite of the fact that its author had conceived his structural construction as symmetrical. The non-linguistic character of Jakobson’s formulation is due to the absence of any definition of language, this absence being the main characteristic of Jakobsonian linguistics: concerning the aphasia problem, the Jakobsonian formulation is linguistic solely by virtue of its object, aphasia, which is already considered as a linguistic concern because it belongs to the field of “language”, but which is not defined as such (as linguistic). As for asymmetry, it demonstrates first the circularity of the Jakobsonian representation of language (the duality between structure and functioning), and secondly the non-linguistic character – in the Saussurean sense of the term – of the aphasia problem. Thus it appears that breaking (in the sense of Gaston Bachelard) with idiom is the prerequisite of a scientific apprehension of language, and therefore of any interdisciplinarity, this being one of Jakobson’s favorite topics but one that this linguist failed to render fruitful because he did not offer a real definition of language.
Abstract:
This dissertation examined body mass index (BMI) growth trajectories and the effects of gender, ethnicity, dietary intake, and physical activity (PA) on BMI growth trajectories among 3rd to 12th graders (9-18 years of age). Growth curve model analysis was performed using data from the Child and Adolescent Trial for Cardiovascular Health (CATCH) study. The study population included 2909 students who were followed up from grades 3-12. The main outcome was BMI at grades 3, 4, 5, 8, and 12. The results revealed that BMI growth differed across two distinct developmental periods of childhood and adolescence. The rate of BMI growth was faster in middle childhood (9-11 years old, or 3rd-5th grades) than in adolescence (11-18 years old, or 5th-12th grades). Students with higher BMI at 3rd grade (baseline) had faster rates of BMI growth. Three groups of students with distinct BMI growth trajectories were identified: high, average, and low. Black and Hispanic children were more likely to be in the groups with higher baseline BMI and faster rates of BMI growth over time. The effects of gender or ethnicity on BMI growth differed across the three groups. The effects of ethnicity on BMI growth weakened as the children aged. The effects of gender on BMI growth were attenuated in the groups with a large proportion of black and Hispanic children, i.e., the “high” or “average” BMI trajectory groups. After controlling for gender, ethnicity, and age at baseline, in the “high BMI trajectory” group the rate of yearly BMI growth in middle childhood increased by 0.102 for every 500 kcal increase in energy intake (p=0.049). No significant effects of the percentage of energy from total fat or saturated fat on BMI growth were found. Baseline BMI increased by 0.041 for every 30-minute increase in moderate-to-vigorous PA (MVPA) in the “low BMI trajectory” group, while baseline BMI decreased by 0.345 for every 30-minute increase in vigorous PA (VPA) in the “high BMI trajectory” group. Childhood overweight and obesity interventions should start at the earliest possible age, prior to 3rd grade, and continue through grade school. Interventions should focus on all children, but especially black and Hispanic children, who are more likely to be at highest risk. Promoting VPA earlier in childhood is important for preventing overweight and obesity among children and adolescents. Interventions should target total energy intake rather than only the percentage of energy from total fat or saturated fat.
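One common way to formalize the growth-curve analysis described above is a piecewise (two-slope) mixed-effects model; the knot placement and covariate coding below are assumptions for illustration, not the dissertation's exact specification:

\[
  \mathrm{BMI}_{ij} \;=\; \beta_{0i} \;+\; \beta_{1i}\,\min(\mathrm{age}_{ij},\,11)
  \;+\; \beta_{2i}\,\bigl(\mathrm{age}_{ij}-11\bigr)_{+} \;+\; \varepsilon_{ij},
  \qquad
  \beta_{ki} \;=\; \gamma_{k0} + \gamma_{k1}\,\mathrm{gender}_i + \gamma_{k2}\,\mathrm{ethnicity}_i + u_{ki},
\]

where $i$ indexes students and $j$ measurement occasions, the knot at age 11 (5th grade) separates middle childhood from adolescence, $(x)_+ = \max(x,0)$, and the random effects $u_{ki}$ allow baseline BMI and the two growth rates to vary between students, which is what permits identifying high, average, and low trajectory groups.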
Abstract:
Sizes and power of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto), and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40%, and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto, and Wilcoxon tests perform at nominal 5% size expectations, but the F, Score, and Mantel tests exceeded 5% size confidence limits for 1/3 of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in power characteristics of the tests within the classes of tests considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests depend on the scale parameter of the Weibull distribution. Wilcoxon-type tests appear more powerful than logrank tests in the case of late-crossing survival curves and less powerful for early-crossing survival curves. Guidelines for the appropriate selection of two-sample tests are given.
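As a minimal sketch of one cell of such a simulation (the parametric and modified tests studied here are not reproduced), the fragment below draws unequally censored exponential samples, with exponential censoring calibrated to a target censoring proportion, and estimates the empirical size and power of the logrank (Mantel) test via the lifelines package; sample size, censoring proportions, and hazard ratio are illustrative choices.

import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

def censored_exponential_sample(n, hazard, censor_prop):
    # Exponential event times with independent exponential censoring; the
    # censoring rate c = hazard * p / (1 - p) gives P(censored) = p.
    events = rng.exponential(1.0 / hazard, n)
    if censor_prop == 0.0:
        return events, np.ones(n, dtype=bool)
    c = hazard * censor_prop / (1.0 - censor_prop)
    censor = rng.exponential(1.0 / c, n)
    observed = events <= censor
    return np.minimum(events, censor), observed

def rejection_rate(n, hr, p1, p2, trials=1000, alpha=0.05):
    # Empirical size (hr = 1.0) or power (hr > 1.0) of the logrank test
    # under unequal censoring proportions p1 and p2.
    rejections = 0
    for _ in range(trials):
        t1, e1 = censored_exponential_sample(n, 1.0, p1)
        t2, e2 = censored_exponential_sample(n, hr, p2)
        res = logrank_test(t1, t2, event_observed_A=e1, event_observed_B=e2)
        rejections += res.p_value < alpha
    return rejections / trials

print("empirical size (HR = 1.0):", rejection_rate(16, 1.0, 0.20, 0.60))
print("empirical power (HR = 2.0):", rejection_rate(16, 2.0, 0.20, 0.60, trials=500))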