16 results for New Keynesian Phillips curve

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Both cointegration methods and non-cointegrated structural VARs, identified based on either long-run restrictions or a combination of long-run and sign restrictions, are used to explore the long-run trade-off between inflation and the unemployment rate in the post-WWII U.S., U.K., Euro area, Canada, and Australia. Overall, neither approach produces clear evidence of a non-vertical trade-off. The extent of uncertainty surrounding the estimates is, however, substantial, implying that a researcher holding alternative priors about what a reasonable slope of the long-run trade-off might be will likely not see her views falsified.
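
As a concrete illustration of the long-run identification scheme mentioned above, here is a minimal sketch of Blanchard-Quah-style identification of a bivariate VAR in inflation and unemployment. The simulated series, lag choice, and variable ordering are illustrative assumptions, not the paper's specification.

```python
# Minimal sketch: long-run (Blanchard-Quah-style) identification of a
# bivariate VAR. All data below are simulated placeholders.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T = 400
infl = np.zeros(T)
unemp = np.zeros(T)
for t in range(1, T):
    infl[t] = 0.7 * infl[t - 1] + rng.normal(scale=0.5)
    unemp[t] = 0.8 * unemp[t - 1] - 0.1 * infl[t - 1] + rng.normal(scale=0.3)

res = VAR(np.column_stack([infl, unemp])).fit(maxlags=4, ic='aic')
Sigma = res.sigma_u                   # reduced-form innovation covariance
A1 = res.coefs.sum(axis=0)            # sum of estimated lag coefficient matrices
C1 = np.linalg.inv(np.eye(2) - A1)    # cumulative long-run impact of innovations
# Lower-triangular factor imposes that the second structural shock has no
# long-run effect on the first variable; the ordering here is illustrative.
Theta1 = np.linalg.cholesky(C1 @ Sigma @ C1.T)
B = np.linalg.solve(C1, Theta1)       # impact matrix of the structural shocks
print("long-run effects of the structural shocks:\n", Theta1)
```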

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study is to develop a new simple method for analyzing one-dimensional transcranial magnetic stimulation (TMS) mapping studies in humans. Motor evoked potentials (MEP) were recorded from the abductor pollicis brevis (APB) muscle during stimulation at nine different positions on the scalp along a line passing through the APB hot spot and the vertex. Non-linear curve fitting according to the Levenberg-Marquardt algorithm was performed on the averaged amplitude values obtained at all points to find the best-fitting symmetrical and asymmetrical peak functions. Several peak functions could be fitted to the experimental data. Across all subjects, a symmetric, bell-shaped curve, the complementary error function (erfc), gave the best results. This function is characterized by three parameters giving its amplitude, position, and width. None of the mathematical functions tested with fewer or more than three parameters fitted better. The amplitude and position parameters of the erfc were highly correlated with the amplitude at the hot spot and with the location of the center of gravity of the TMS curve. In conclusion, non-linear curve fitting is an accurate method for the mathematical characterization of one-dimensional TMS curves. This is the first method that provides information on amplitude, position and width simultaneously.
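
The fitting step lends itself to a short sketch. Below, scipy's Levenberg-Marquardt implementation fits a symmetric, bell-shaped erfc curve with amplitude, position and width parameters to simulated MEP amplitudes at nine scalp positions; the particular erfc parametrization and the data are assumptions for illustration, not the study's exact model.

```python
# Minimal sketch: Levenberg-Marquardt fit of a three-parameter erfc peak.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

def erfc_peak(x, amplitude, position, width):
    # erfc of a squared, scaled distance gives a symmetric bell shape:
    # equal to `amplitude` at x == position, decaying toward 0 away from it.
    return amplitude * erfc(((x - position) / width) ** 2)

positions = np.linspace(-4, 4, 9)   # nine scalp sites along the mapping line
mep = erfc_peak(positions, 1.2, 0.5, 1.8) \
      + np.random.default_rng(1).normal(scale=0.05, size=9)

popt, pcov = curve_fit(erfc_peak, positions, mep,
                       p0=[1.0, 0.0, 1.0], method='lm')
amp, pos, wid = popt
print(f"amplitude={amp:.2f}, position={pos:.2f}, width={wid:.2f}")
```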

Relevance:

30.00%

Publisher:

Abstract:

Benign Prostatic Hyperplasia (BPH) is a common entity among the aging male population. Its prevalence increases with age and is around 80% among those over 80 years old. The androgen-estrogen ratio changes in favor of the estrogens, which leads to a growth of prostatic tissue, presenting histologically as hyperplasia. BPH can cause irritative or obstructive symptoms or both. Nowadays we speak of bladder storage or bladder voiding symptoms, summarised as LUTS (Lower Urinary Tract Symptoms). LUTS has a structural and a functional component, the structural being caused by the size of the adenoma itself, the functional depending on the muscle tone of the bladder neck and the prostatic urethra. To investigate LUTS, we use validated symptom scores, sonography for residual urine and, eventually, a urodynamic evaluation. There are 3 grades of BPH. The indication for an interventional therapy is relative in BPH II and absolute in BPH III. Prior to treatment, other diseases mimicking the same symptoms have to be ruled out and adequately treated. Electro-resection of the prostate (TUR-P) remains the standard therapy and the benchmark any new technology has to compete with. TUR-P has good short- and long-term results, but can be associated with a considerable perioperative morbidity, and the learning curve for the operator is long. The most promising of the newer techniques is the Holmium laser enucleation of the prostate (Laser-TUR-P), showing at least identical short- and medium-term results, but a lower perioperative morbidity than TUR-P. For several minimally invasive techniques, indications are limited. TUMT, TUNA, WIT and laser coagulation all produce a coagulation necrosis of the prostatic tissue by thermic damage with secondary tissue shrinking. Urodynamic results, however, are not comparable to TUR-P or Laser-TUR-P, and significantly more secondary interventions within 2 to 5 years are required. Minimally invasive techniques present a favorable alternative for younger patients without complications of BPH, and for older patients with relevant comorbidities, and can usually be performed under local anaesthesia. The morbidity is low and further therapies remain possible later, if necessary.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to assess the influence of the zero value subtraction on the performance of laser fluorescence (LFpen) for approximal caries detection. Three areas (cuspal, middle and cervical) of both mesial and distal buccal surfaces of 78 permanent molars were assessed using both wedge-shaped (WDG) and tapered wedge-shaped (TWDG) tips. One cut-off value for each area was obtained by averaging, and the performance was assessed. The areas under the receiver operating characteristic (ROC) curve, specificity, sensitivity and accuracy with and without the zero value subtraction were calculated. The McNemar test revealed a statistically significant difference for specificity at thresholds D(1), D(2) and D(3) (WDG) and D(1) and D(2) (TWDG) when the zero value subtraction was not performed. An influence of the zero value subtraction on the LFpen performance was thus observed for approximal caries detection. However, when modified cut-off values were used, the zero value subtraction could be eliminated.
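
A minimal sketch of how such a comparison can be computed follows: ROC area, sensitivity and specificity for simulated LFpen readings scored with and without subtraction of a per-surface zero (baseline) value. The data, the cut-off of 8.0, and all variable names are hypothetical.

```python
# Minimal sketch: diagnostic performance with vs. without baseline subtraction.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 78
caries = rng.integers(0, 2, size=n)            # 1 = lesion at histology (toy)
zero_value = rng.normal(3.0, 1.0, size=n)      # baseline reading per surface
reading = zero_value + np.where(caries,
                                rng.normal(15, 5, n),   # lesion readings
                                rng.normal(4, 2, n))    # sound readings

for label, score in [("raw", reading),
                     ("zero-subtracted", reading - zero_value)]:
    auc = roc_auc_score(caries, score)
    cutoff = 8.0                               # hypothetical threshold
    pred = score >= cutoff
    sens = (pred & (caries == 1)).sum() / (caries == 1).sum()
    spec = (~pred & (caries == 0)).sum() / (caries == 0).sum()
    print(f"{label}: AUC={auc:.2f} sensitivity={sens:.2f} specificity={spec:.2f}")
```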

Relevance:

30.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms using linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
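
The core of the approach, interpolating the integrated data rather than the data itself, can be sketched briefly. The snippet below uses scipy's monotone Hermite (PCHIP) interpolant as a stand-in for the paper's parametrized Hermitian curve: it suppresses overshoot entirely instead of exposing a tuning parameter, conserves the total integral by construction, and keeps re-binned values of positive data non-negative. The bin edges and values are made up.

```python
# Minimal sketch: integral-conserving re-binning via the cumulative integral.
# PCHIP stands in for the paper's parametrized Hermitian interpolation curve.
import numpy as np
from scipy.interpolate import PchipInterpolator

def rebin(edges, values, new_edges):
    """Re-sample histogrammed data to new bin edges, conserving the integral."""
    cum = np.concatenate([[0.0], np.cumsum(values * np.diff(edges))])
    F = PchipInterpolator(edges, cum)   # monotone => no over-/undershoot
    return np.diff(F(new_edges)) / np.diff(new_edges)

edges = np.linspace(0.0, 10.0, 11)      # 10 coarse bins
values = np.array([0., 1., 5., 9., 4., 2., 2., 1., 0.5, 0.])
new_edges = np.linspace(0.0, 10.0, 41)  # 40 finer bins
fine = rebin(edges, values, new_edges)

# The total integral is conserved up to floating-point error:
print(np.sum(values * np.diff(edges)), np.sum(fine * np.diff(new_edges)))
```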

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Development of an interpolation algorithm for re-sampling spatially distributed CT data with the following features: global and local integral conservation, avoidance of negative interpolation values for positively defined datasets, and the ability to control re-sampling artifacts. Method and Materials: The interpolation can be separated into two steps: first, the discrete CT data have to be continuously distributed by an analytic function considering the boundary conditions. Generally, this function is determined by piecewise interpolation. Instead of using linear or high-order polynomial interpolations, which do not fulfill all the above-mentioned features, a special form of Hermitian curve interpolation is used to solve the interpolation problem with respect to the required boundary conditions. A single parameter is determined by which the behavior of the interpolation function is controlled. Second, the interpolated data have to be re-distributed with respect to the requested grid. Results: The new algorithm was compared with commonly used interpolation functions based on linear and second-order polynomials. It is demonstrated that these interpolation functions may over- or underestimate the source data by about 10-20%, while the parameter of the new algorithm can be adjusted in order to significantly reduce these interpolation errors. Finally, the performance and accuracy of the algorithm were tested by re-gridding a series of X-ray CT images. Conclusion: Inaccurate sampling values may occur due to the lack of integral conservation. Re-sampling algorithms using high-order polynomial interpolation functions may result in significant artifacts in the re-sampled data. Such artifacts can be avoided by using the new algorithm based on Hermitian curve interpolation.
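
To make the overshoot artifact concrete, here is a tiny sketch contrasting a standard cubic spline with a monotonicity-preserving Hermite interpolant (again PCHIP, standing in for the parametrized Hermitian curve) on steeply varying positive data; the data are made up.

```python
# Minimal sketch: a cubic spline through steeply varying positive data can
# overshoot and turn negative; a monotone Hermite interpolant stays within
# the data range on each interval.
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

x = np.arange(8.0)
y = np.array([0.0, 0.0, 0.1, 8.0, 9.0, 0.2, 0.0, 0.0])  # positive, steep edges
xf = np.linspace(0.0, 7.0, 400)

print("cubic spline min:   ", CubicSpline(x, y)(xf).min())       # < 0: undershoot
print("Hermite (PCHIP) min:", PchipInterpolator(x, y)(xf).min()) # >= 0
```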

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

Palynology provides the opportunity to make inferences on changes in diversity of terrestrial vegetation over long time scales. The often coarse taxonomic level achievable in pollen analysis, differences in pollen production and dispersal, and the lack of pollen source boundaries hamper the application of diversity indices to palynology. Palynological richness, the number of pollen types at a constant pollen count, is the most robust and widely used diversity indicator for pollen data. However, this index is also influenced by the abundance distribution of pollen types in sediments. In particular, where the index is calculated by rarefaction analysis, information on taxonomic richness at low abundance may be lost. Here we explore the information that can be extracted from the accumulation of taxa over consecutive samples. The log-transformed taxa accumulation curve can be broken up into linear sections with different slope and intercept parameters, describing the accumulation of new taxa within each section. The breaking points may indicate changes in the species pool or in the abundance of high versus low pollen producers. Testing this concept on three pollen diagrams from different landscapes, we find that the break points in the taxa accumulation curves provide convenient zones for identifying changes in richness and evenness. The linear regressions over consecutive samples can be used to inter- and extrapolate to low or extremely high pollen counts, indicating evenness and richness in taxonomic composition within these zones. An evenness indicator, based on the rank-order abundance, is used to assist in the evaluation of the results and the interpretation of the fossil records. Two central European pollen diagrams show major changes in the taxa accumulation curves for the Lateglacial period and the time of human-induced land-use changes, while they do not indicate strong changes in the species pool with the onset of the Holocene. In contrast, a central Swedish pollen diagram shows comparatively little change, but high richness during the early Holocene forest establishment. Evenness and palynological richness are related for most periods in the three diagrams; however, sections before forest establishment and after forest clearance show high evenness that is not necessarily accompanied by high palynological richness, encouraging efforts to separate the two.
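
The break-point idea can be sketched in a few lines: accumulate distinct pollen types over consecutive samples, log-transform both axes, and search for the split that best describes the curve as two linear sections. The pollen counts below are simulated, and the single-break search is a simplification of fitting multiple linear sections.

```python
# Minimal sketch: break point in a log-transformed taxa accumulation curve.
import numpy as np

rng = np.random.default_rng(3)
pool_a, pool_b = np.arange(15), np.arange(40)   # species pool changes midway
samples = [rng.choice(pool_a, 80) for _ in range(30)] + \
          [rng.choice(pool_b, 80) for _ in range(30)]

seen, accum = set(), []
for s in samples:                               # taxa accumulated so far
    seen.update(s.tolist())
    accum.append(len(seen))

x = np.log(np.arange(1, len(accum) + 1))
y = np.log(accum)

def sse(xs, ys):
    # residual sum of squares of a least-squares line through (xs, ys)
    coef = np.polyfit(xs, ys, 1)
    return np.sum((ys - np.polyval(coef, xs)) ** 2)

best = min(range(3, len(x) - 3),
           key=lambda k: sse(x[:k], y[:k]) + sse(x[k:], y[k:]))
print("break point after sample", best)
```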

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Estimates of the size of the undiagnosed HIV-infected population are important to understand the HIV epidemic and to plan interventions, including "test-and-treat" strategies. METHODS We developed a multi-state back-calculation model to estimate HIV incidence, time between infection and diagnosis, and the undiagnosed population by CD4 count strata, using surveillance data on new HIV and AIDS diagnoses. The HIV incidence curve was modelled using cubic splines. The model was tested on simulated data and applied to surveillance data on men who have sex with men in The Netherlands. RESULTS The number of HIV infections could be estimated accurately using simulated data, with most values within the 95% confidence intervals of model predictions. When applying the model to Dutch surveillance data, 15,400 (95% confidence interval [CI] = 15,000, 16,000) men who have sex with men were estimated to have been infected between 1980 and 2011. HIV incidence showed a bimodal distribution, with peaks around 1985 and 2005 and a decline in recent years. Mean time to diagnosis was 6.1 (95% CI = 5.8, 6.4) years between 1984 and 1995 and decreased to 2.6 (2.3, 3.0) years in 2011. By the end of 2011, 11,500 (11,000, 12,000) men who have sex with men in The Netherlands were estimated to be living with HIV, of whom 1,750 (1,450, 2,200) were still undiagnosed. Of the undiagnosed men who have sex with men, 29% (22, 37) were infected for less than 1 year, and 16% (13, 20) for more than 5 years. CONCLUSIONS This multi-state back-calculation model will be useful to estimate HIV incidence, time to diagnosis, and the undiagnosed HIV epidemic based on routine surveillance data.
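
A single-state simplification of the back-calculation machinery can be sketched as follows (the paper's model additionally stratifies by CD4 count and estimates the diagnosis process): yearly incidence is the exponential of a cubic spline, convolved with an assumed time-to-diagnosis distribution, and fitted to yearly diagnosis counts by Poisson maximum likelihood. All numbers, knots, and the delay distribution are illustrative assumptions, not Dutch surveillance data.

```python
# Minimal single-state back-calculation sketch; the paper's model is
# multi-state (stratified by CD4 count). All inputs are simulated.
import numpy as np
from scipy.interpolate import BSpline
from scipy.optimize import minimize
from scipy.stats import gamma, poisson

years = np.arange(32.0)   # e.g. 1980..2011 on a yearly grid

# Assumed time-to-diagnosis distribution (toy gamma delay):
f = np.diff(gamma(a=2.0, scale=2.0).cdf(np.arange(len(years) + 1)))

# Simulated "observed" yearly diagnoses from a bimodal true incidence curve:
true_inc = 10 + 60 * np.exp(-((years - 5) / 4) ** 2) \
              + 80 * np.exp(-((years - 25) / 4) ** 2)
diagnoses = np.random.default_rng(4).poisson(
    np.convolve(true_inc, f)[:len(years)])

knots = np.concatenate([[0.0] * 4, [8.0, 16.0, 24.0], [31.0] * 4])  # cubic

def incidence(coefs):
    return np.exp(BSpline(knots, coefs, 3)(years))  # spline on the log scale

def negloglik(coefs):
    mu = np.convolve(incidence(coefs), f)[:len(years)]  # expected diagnoses
    return -np.sum(poisson.logpmf(diagnoses, mu))

res = minimize(negloglik, x0=np.full(7, np.log(50.0)), method='Nelder-Mead',
               options={'maxiter': 20000, 'xatol': 1e-4, 'fatol': 1e-4})
print("fitted yearly incidence:", np.round(incidence(res.x), 1))
```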

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND  Reducing the fraction of transmissions during recent human immunodeficiency virus (HIV) infection is essential for the population-level success of "treatment as prevention". METHODS  A phylogenetic tree was constructed with 19 604 Swiss sequences and 90 994 non-Swiss background sequences. Swiss transmission pairs were identified using 104 combinations of genetic distance (1%-2.5%) and bootstrap (50%-100%) thresholds, to examine the effect of those criteria. Monophyletic pairs were classified as recent or chronic transmission based on the time interval between estimated seroconversion dates. Logistic regression with adjustment for clinical and demographic characteristics was used to identify risk factors associated with transmission during recent or chronic infection. FINDINGS  Seroconversion dates were estimated for 4079 patients on the phylogeny, yielding between 71 (distance, 1%; bootstrap, 100%) and 378 (distance, 2.5%; bootstrap, 50%) transmission pairs. We found that 43.7% (range, 41%-56%) of the transmissions occurred during the first year of infection. A stricter phylogenetic definition of transmission pairs was associated with a higher recent-phase transmission fraction. Chronic-phase viral load area under the curve (adjusted odds ratio, 3; 95% confidence interval, 1.64-5.48) and time to antiretroviral therapy (ART) start (adjusted odds ratio, 1.4 per year; 1.11-1.77) were associated with chronic-phase transmission as opposed to recent transmission. Importantly, at least 14% of the chronic-phase transmission events occurred after the transmitter had interrupted ART. CONCLUSIONS  We demonstrate a high fraction of transmission during recent HIV infection, but also chronic-phase transmissions after interruption of ART, in Switzerland. Both represent key issues for treatment as prevention and underline the importance of early diagnosis and of early and continuous treatment.
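
The pair-selection rule itself reduces to two thresholds and can be sketched directly; the toy distance and bootstrap matrices below stand in for values read off a real phylogeny.

```python
# Minimal sketch: selecting potential transmission pairs by genetic-distance
# and bootstrap thresholds. Inputs are toy matrices, not real sequence data.
import itertools
import numpy as np

rng = np.random.default_rng(5)
n = 12
dist = rng.uniform(0.0, 0.05, size=(n, n))        # pairwise genetic distances
dist = (dist + dist.T) / 2
np.fill_diagonal(dist, 0.0)
boot = rng.integers(40, 101, size=(n, n))         # clade bootstrap supports
boot = np.maximum(boot, boot.T)

def transmission_pairs(dist, boot, max_dist, min_boot):
    return [(i, j) for i, j in itertools.combinations(range(len(dist)), 2)
            if dist[i, j] <= max_dist and boot[i, j] >= min_boot]

# The paper scans 104 threshold combinations; two examples:
for max_dist, min_boot in [(0.01, 100), (0.025, 50)]:
    pairs = transmission_pairs(dist, boot, max_dist, min_boot)
    print(f"distance<={max_dist}, bootstrap>={min_boot}: {len(pairs)} pairs")
```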