997 results for Threshold regression


Relevance: 100.00%

Abstract:

In this paper, by employing the threshold regression method, we estimate the average tariff equivalent of fixed costs for the use of a free trade agreement (FTA) among all existing FTAs in the world. It is estimated to be 3.2%. This global estimate serves as a reference rate in the evaluation of each FTA’s fixed costs.

Relevance: 60.00%

Abstract:

This Working Project studies five portfolios of currency carry trades formed with the G10 currencies. Performance varies among strategies, and the most basic one presents the worst results. I also study the equity and Pure FX risk factors that might explain the portfolios' returns: equity factors do not explain these returns, while the Pure FX factors do for some of the strategies. Downside risk measures indicate the importance of using regime indicators to avoid losses. I conclude that although VAR and threshold regression models with a variety of regime indicators do not reveal distinct regimes, a defined exogenous threshold on real exchange rates, an indicator of liquidity, and the volatilities of the spot exchange rates make it possible to increase the average returns and reduce the drawdowns of the carry trades.

Relevance: 60.00%

Abstract:

Background: The follow-up care for women with breast cancer requires an understanding of disease recurrence patterns, and the follow-up visit schedule should be determined according to the times when recurrences are most likely to occur, so that preventive measures can be taken to avoid or minimize recurrence. Objective: To model breast cancer recurrence as a stochastic process with the aim of generating a hazard function for determining a follow-up schedule. Methods: We modeled the process of disease progression as a time-transformed Wiener process and used the first hitting time as an approximation of the true failure time. A woman's recurrence-free survival time is modeled as the time it takes the Wiener process to cross a threshold value representing the recurrence event. Building on the first-hitting-time model, we explored a threshold regression model that accounts for covariates contributing to the prognosis of breast cancer. Using real data from SEER-Medicare, we proposed follow-up visit schedules based on a constant probability of disease recurrence between consecutive visits. Results: We demonstrated that threshold regression based on first-hitting-time modeling can provide useful predictive information about breast cancer recurrence. Our results suggest that the surveillance and follow-up schedule can be determined for women based on prognostic factors such as tumor stage: women with early-stage disease may be seen less frequently for follow-up visits than women with locally advanced stages. Our results from SEER-Medicare data support the idea of risk-controlled follow-up strategies for groups of women. Conclusion: The methodology proposed in this study allows one to determine individual follow-up scheduling based on a parametric hazard function that incorporates known prognostic factors.
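The hazard function for such a first-hitting-time model comes from the inverse Gaussian distribution, the law of the time at which a Wiener process with drift first crosses a fixed boundary. A minimal sketch in the standard mean-shape (mu, lam) parameterisation, which is a textbook choice rather than anything taken from the paper:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def ig_pdf(t, mu, lam):
    """Inverse-Gaussian density: first-hitting time of a drifted Wiener process."""
    return math.sqrt(lam / (2.0 * math.pi * t**3)) * \
        math.exp(-lam * (t - mu)**2 / (2.0 * mu**2 * t))

def ig_cdf(t, mu, lam):
    """Inverse-Gaussian CDF (probability the boundary is hit by time t)."""
    a = math.sqrt(lam / t)
    return norm_cdf(a * (t / mu - 1.0)) + \
        math.exp(2.0 * lam / mu) * norm_cdf(-a * (t / mu + 1.0))

def hazard(t, mu, lam):
    """Recurrence hazard at time t: density over survival."""
    return ig_pdf(t, mu, lam) / (1.0 - ig_cdf(t, mu, lam))
```

Follow-up visits could then be spaced so that the integrated hazard between consecutive visits is constant, matching the constant-recurrence-probability idea in the abstract.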

Relevance: 60.00%

Abstract:

Several studies have analyzed discretionary accruals to address earnings-smoothing behaviors in the banking industry. We argue that the characteristic link between accruals and earnings may be nonlinear, since both the incentives to manipulate income and the practical way to do so depend partially on the relative size of earnings. Given a sample of 15,268 US banks over the period 1996–2011, the main results in this paper suggest that, depending on the size of earnings, bank managers tend to engage in earnings-decreasing strategies when earnings are negative (“big-bath”), use earnings-increasing strategies when earnings are positive, and use provisions as a smoothing device when earnings are positive and substantial (“cookie-jar” accounting). This evidence, which cannot be explained by the earnings-smoothing hypothesis, is consistent with the compensation theory. Neglecting nonlinear patterns in the econometric modeling of these accruals may lead to misleading conclusions regarding the characteristic strategies used in earnings management.

Relevance: 60.00%

Abstract:

In the wake of the global financial crisis, several macroeconomic contributions have highlighted the risks of excessive credit expansion. In particular, too much finance can have a negative impact on growth. We examine the microeconomic foundations of this argument, positing a non-monotonic relationship between leverage and firm-level productivity growth in the spirit of the trade-off theory of capital structure. A threshold regression model estimated on a sample of Central and Eastern European countries confirms that TFP growth increases with leverage until the latter reaches a critical threshold beyond which leverage lowers TFP growth. This estimate can provide guidance to firms and policy makers on identifying "excessive" leverage. We find similar non-monotonic relationships between leverage and proxies for firm value. Our results are a first step in bridging the gap between the literature on optimal capital structure and the wider macro literature on the finance-growth nexus. © 2012 Elsevier Ltd.
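The single-threshold estimation itself can be sketched as a grid search: fit separate OLS lines below and above each candidate threshold, and keep the candidate that minimises the pooled sum of squared errors. The variable names, the 15% trimming, and the synthetic leverage data below are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def fit_threshold_regression(x, y, trim=0.15):
    """Grid-search a two-regime linear model
    y = a1 + b1*x (x <= tau) / y = a2 + b2*x (x > tau),
    returning the threshold tau with the smallest pooled SSE."""
    taus = np.sort(x)[int(len(x) * trim):int(len(x) * (1 - trim))]
    best_tau, best_sse = None, np.inf
    for tau in taus:
        sse = 0.0
        for mask in (x <= tau, x > tau):
            A = np.column_stack([np.ones(mask.sum()), x[mask]])
            coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
            resid = y[mask] - A @ coef
            sse += float(resid @ resid)
        if sse < best_sse:
            best_tau, best_sse = tau, sse
    return best_tau, best_sse

# synthetic leverage/TFP-growth data with a kink at leverage = 1.0
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0, 300)
y = np.where(x <= 1.0, 0.5 * x, 0.5 - 0.8 * (x - 1.0)) + rng.normal(0.0, 0.05, 300)
tau, sse = fit_threshold_regression(x, y)
```

The recovered `tau` sits near the true kink at 1.0, which is the sense in which such an estimate can flag "excessive" leverage.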

Relevance: 40.00%

Abstract:

The accumulated oxygen deficit (AOD) method assumes a linear VO<sub>2</sub>-power relationship for exercise intensities increasing from below the lactate threshold (BLT) to above the lactate threshold (ALT). Factors likely to affect the linearity of the VO<sub>2</sub>-power regression and the precision of the estimated total energy demand (ETED) were investigated. These included the slow component of VO<sub>2</sub> kinetics (SC), a forced resting y-intercept, and exercise intensities BLT and ALT. Criteria for linearity and precision were the Pearson correlation coefficient (PCC) of the VO<sub>2</sub>-power relationship, the length of the 95% confidence interval (95% CI) of the ETED, and the standard error of the predicted value (SEP), respectively. Eight trained male triathletes and one trained female triathlete completed the required cycling tests to establish the AOD when pedalling at 80 rev/min. The influence of the SC on the linear extrapolation of the ETED was reduced by measuring VO<sub>2</sub> after three min of exercise. Measuring VO<sub>2</sub> at this time provided a new linear extrapolation method consisting of ten regression points spread evenly from BLT to ALT. This method produced an ETED with increased precision compared with regression equations developed from intensities BLT with no forced y-intercept value (95% CI (L), 0.70±0.26 versus 1.85±1.10, P<0.01; SEP (L/Watt), 0.07±0.02 versus 0.28±0.17, P<0.01). Including a forced y-intercept value with five regression points either BLT or ALT increased the precision of estimating the total energy demand to the same level as when using ten regression points (5 points BLT + y-intercept versus 5 points ALT + y-intercept versus 10 points: 95% CI (L), 0.61±0.32, 0.87±0.40, 0.70±0.26; SEP (L/Watt), 0.07±0.03, 0.08±0.04, 0.07±0.02; P>0.05). The VO<sub>2</sub>-power regression can be designed using a reduced number of regression points... ABSTRACT FROM AUTHOR
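The extrapolation step of the AOD method (estimating total energy demand by pushing the sub-threshold VO<sub>2</sub>-power line up to a supramaximal power output) can be sketched with toy numbers; the intensities and the 0.012 L/min/W slope below are illustrative, not measured values:

```python
import numpy as np

def estimate_energy_demand(power, vo2, supra_power):
    """Fit the sub-threshold VO2-power line by OLS and extrapolate it to
    a supramaximal power output (the AOD method's demand estimate)."""
    A = np.column_stack([np.ones_like(power), power])
    (b0, b1), *_ = np.linalg.lstsq(A, vo2, rcond=None)
    return b0 + b1 * supra_power

power = np.array([100.0, 150.0, 200.0, 250.0, 300.0])   # Watts
vo2 = 0.5 + 0.012 * power                               # L/min, toy values
demand = estimate_energy_demand(power, vo2, 400.0)      # → 5.3 L/min
```

The AOD itself would then be this demand multiplied by exercise duration, minus the oxygen actually consumed.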

Relevance: 30.00%

Abstract:

Dengue virus (DENV) transmission in Australia is driven by weather factors and imported dengue fever (DF) cases. However, uncertainty remains regarding the threshold effects of high-order interactions among weather factors and imported DF cases and the impact of these factors on autochthonous DF. A time-series regression tree model was used to assess the threshold effects of natural temporal variations of weekly weather factors and weekly imported DF cases in relation to incidence of weekly autochthonous DF from 1 January 2000 to 31 December 2009 in Townsville and Cairns, Australia. In Cairns, mean weekly autochthonous DF incidence increased 16.3-fold when the 3-week lagged moving average maximum temperature was <32 °C, the 4-week lagged moving average minimum temperature was ≥24 °C and the sum of imported DF cases in the previous 2 weeks was >0. When the 3-week lagged moving average maximum temperature was ≥32 °C and the other two conditions mentioned above remained the same, mean weekly autochthonous DF incidence only increased 4.6-fold. In Townsville, the mean weekly incidence of autochthonous DF increased 10-fold when 3-week lagged moving average rainfall was ≥27 mm, but it only increased 1.8-fold when rainfall was <27 mm during January to June. Thus, we found different responses of autochthonous DF incidence to weather factors and imported DF cases in Townsville and Cairns. Imported DF cases may also trigger and enhance local outbreaks under favorable climate conditions.
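Lagged moving averages like those used as regression-tree inputs here can be built as below; the exact window/lag alignment used in the study is an assumption:

```python
import numpy as np

def lagged_moving_average(series, window, lag):
    """`window`-week moving average, lagged by `lag` weeks: the value at
    week t averages series[t-lag-window+1 .. t-lag] (inclusive)."""
    out = np.full(len(series), np.nan)
    for t in range(lag + window - 1, len(series)):
        out[t] = series[t - lag - window + 1 : t - lag + 1].mean()
    return out

temp = np.arange(10.0)     # toy weekly temperature series, weeks 0..9
ma = lagged_moving_average(temp, window=3, lag=3)
```

A regression tree would then split on predictors such as `ma` (e.g. at 32 °C) together with the summed imported-case counts.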

Relevance: 30.00%

Abstract:

Purpose: The authors sought to quantify neighboring and distant interpoint correlations of threshold values within the visual field in patients with glaucoma. Methods: Visual fields of patients with confirmed or suspected glaucoma were analyzed (n = 255). One eye per patient was included. Patients were examined using the 32 program of the Octopus 1-2-3. Linear regression analysis among each of the locations and the rest of the points of the visual field was performed, and the correlation coefficient was calculated. The degree of correlation was categorized as high (r > 0.66), moderate (0.66 ≥ r > 0.33), or low (r ≤ 0.33). The standard error of threshold estimation was calculated. Results: Most locations of the visual field had high and moderate correlations with neighboring points and with distant locations corresponding to the same nerve fiber bundle. Locations of the visual field had low correlations with those of the opposite hemifield, with the exception of locations temporal to the blind spot. The standard error of threshold estimation increased from 0.6 to 0.9 dB with an r reduction of 0.1. Conclusion: Locations of the visual field have the highest interpoint correlation with neighboring points and with distant points in areas corresponding to the distribution of the retinal nerve fiber layer. The quantification of interpoint correlations may be useful in the design and interpretation of visual field tests in patients with glaucoma.
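The high/moderate/low binning of pairwise correlations can be reproduced directly; the synthetic three-location data below are purely illustrative:

```python
import numpy as np

def correlation_categories(data):
    """Pairwise Pearson correlations between test locations (columns),
    binned with the abstract's cut-offs: high r > 0.66,
    moderate 0.66 >= r > 0.33, low r <= 0.33."""
    r = np.corrcoef(data, rowvar=False)
    cats = np.where(r > 0.66, "high", np.where(r > 0.33, "moderate", "low"))
    return r, cats

rng = np.random.default_rng(2)
base = rng.normal(0.0, 1.0, 500)
data = np.column_stack([base,                               # location A
                        base + rng.normal(0.0, 0.2, 500),   # neighbour of A
                        rng.normal(0.0, 1.0, 500)])         # distant location
r, cats = correlation_categories(data)
```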

Relevance: 30.00%

Abstract:

Background: Children have been shown to have higher lactate (LaTh) and ventilatory (VeTh) thresholds than adults, which might be explained by lower levels of type-II motor-unit (MU) recruitment. However, the electromyographic threshold (EMGTh), regarded as indicating the onset of accelerated type-II MU recruitment, has been investigated only in adults. Purpose: To compare the relative exercise intensity at which the EMGTh occurs in boys versus men. Methods: Participants were 21 men (23.4 ± 4.1 years) and 23 boys (11.1 ± 1.1 years), with similar habitual physical activity and peak oxygen consumption (VO2pk) (49.7 ± 5.5 vs. 50.1 ± 7.4 ml kg−1 min−1, respectively). Ramped cycle ergometry was conducted to volitional exhaustion, with surface EMG recorded from the right and left vastus lateralis muscles throughout the test (~10 min). The composite right–left EMG root mean square (EMGRMS) was calculated per pedal revolution. The EMGTh was then determined as the exercise intensity at the point of least residual sum of squares for any division of the EMGRMS plot into two regression lines. Results: EMGTh was detected in 20/21 of the men (95.2%) but in only 18/23 of the boys (78.3%). The boys' EMGTh was significantly higher than the men's (86.4 ± 9.6 vs. 79.7 ± 10.0% of peak power output at exhaustion; p < 0.05). The pattern was similar when EMGTh was expressed as a percentage of VO2pk. Conclusions: The boys' higher EMGTh suggests delayed and hence lesser utilization of type-II MUs in progressive exercise compared with men. The boys–men EMGTh differences were of similar magnitude to those shown for LaTh and VeTh, further suggesting a common underlying factor.
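The least-residual-sum-of-squares split described in the Methods can be sketched as follows, on a noiseless synthetic ramp whose EMG slope accelerates at 80% of peak power (variable names and data are assumptions, not the study's):

```python
import numpy as np

def emg_threshold(intensity, emg_rms):
    """Try every interior split of the EMG-RMS vs intensity plot, fit a
    straight line to each side, and return the intensity at the split
    with the least combined residual sum of squares."""
    n = len(intensity)
    best_i, best_rss = None, np.inf
    for i in range(3, n - 3):              # keep >= 3 points per segment
        rss = 0.0
        for seg in (slice(0, i), slice(i, n)):
            xs, ys = intensity[seg], emg_rms[seg]
            A = np.column_stack([np.ones(len(xs)), xs])
            coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
            r = ys - A @ coef
            rss += float(r @ r)
        if rss < best_rss:
            best_i, best_rss = i, rss
    return intensity[best_i]

power = np.linspace(0.0, 100.0, 101)       # % of peak power
emg = np.where(power < 80.0, 0.5 * power, 40.0 + 2.0 * (power - 80.0))
th = emg_threshold(power, emg)             # → 80.0
```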

Relevance: 30.00%

Abstract:

We study the relation between support vector machines (SVMs) for regression (SVMR) and SVMs for classification (SVMC). We show that for a given SVMC solution there exists an SVMR solution which is equivalent for a certain choice of the parameters. In particular, for ε sufficiently close to one, the optimal hyperplane and threshold for the SVMC problem with regularization parameter C_c are equal to (1-ε)^(-1) times the optimal hyperplane and threshold for SVMR with regularization parameter C_r = (1-ε)C_c. A direct consequence of this result is that SVMC can be seen as a special case of SVMR.
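Writing (w_c, b_c) for the SVMC hyperplane and threshold and (w_r, b_r) for the SVMR ones, the stated equivalence reads:

```latex
(w_c,\, b_c) \;=\; \frac{1}{1-\epsilon}\,(w_r,\, b_r),
\qquad\text{with}\qquad C_r = (1-\epsilon)\, C_c .
```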

Relevance: 30.00%

Abstract:

Two types of ecological thresholds are now being widely used to develop conservation targets: breakpoint-based thresholds represent tipping points where system properties change dramatically, whereas classification thresholds identify groups of data points with contrasting properties. Both breakpoint-based and classification thresholds are useful tools in evidence-based conservation. However, it is critical that the type of threshold to be estimated corresponds with the question of interest and that appropriate statistical procedures are used to determine its location. On the basis of their statistical properties, we recommend using piecewise regression methods to identify breakpoint-based thresholds and discriminant analysis or classification and regression trees to identify classification thresholds.
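A one-split regression stump is the smallest example of a classification-threshold estimate: it chooses the cut on a predictor that minimises the within-group sum of squares of the response, exactly what a classification/regression tree does at each node (the data below are illustrative):

```python
import numpy as np

def best_split(x, y):
    """One-split regression stump: pick the cut on x that minimises
    the within-group sum of squares of y on the two sides."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best_cut, best_ss = None, np.inf
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue                      # no valid cut between ties
        left, right = ys[:i], ys[i:]
        ss = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if ss < best_ss:
            best_cut, best_ss = 0.5 * (xs[i - 1] + xs[i]), ss
    return best_cut

x = np.array([1.0, 2.0, 3.0, 4.0, 10.0, 11.0, 12.0, 13.0])
y = np.array([5.0, 5.0, 5.0, 5.0, 20.0, 20.0, 20.0, 20.0])
cut = best_split(x, y)                    # → 7.0, midway between groups
```

A breakpoint-based threshold would instead come from piecewise regression, where slopes (not group means) change at the cut.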

Relevance: 30.00%

Abstract:

Sensory thresholds are often collected through ascending forced-choice methods. Group thresholds are important for comparing stimuli or populations; yet the method has two problems: an individual may guess the correct answer at any concentration step, and may detect correctly at low concentrations but become adapted or fatigued at higher concentrations. The survival analysis method deals with both issues. Individual sequences of incorrect and correct answers are adjusted, taking into account the group performance at each concentration; the adjustment reduces the chance probability where there are consecutive correct answers. Adjusted sequences are then submitted to survival analysis to determine group thresholds. The technique was applied to an aroma threshold and a taste threshold study. It yielded group thresholds similar to those from ASTM or logarithmic regression procedures, and it detected significant differences in taste thresholds between younger and older adults. The approach provides a more robust technique than previous estimation methods.
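As a deliberately simplified stand-in for the survival-analysis step, a group threshold can be read off as the first concentration at which at least half the panel has detected (and kept detecting); a real analysis would handle censoring and the chance-guessing correction properly:

```python
def group_threshold(first_detect_idx, concentrations):
    """Return the first concentration at which at least half the panel
    has detected. first_detect_idx[i] is the index of the concentration
    step where subject i started answering correctly and stayed correct."""
    n = len(first_detect_idx)
    for j, conc in enumerate(concentrations):
        detected = sum(1 for s in first_detect_idx if s <= j)
        if detected >= n / 2:
            return conc
    return None  # fewer than half the panel ever detected

steps = [1, 2, 2, 3, 4, 0]                                # six panellists
th = group_threshold(steps, [0.1, 0.2, 0.4, 0.8, 1.6])    # → 0.4
```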

Relevance: 30.00%

Abstract:

We use sunspot group observations from the Royal Greenwich Observatory (RGO) to investigate the effects of intercalibrating data from observers with different visual acuities. The tests are made by counting the number of groups RB above a variable cut-off threshold of observed total whole-spot area (uncorrected for foreshortening) to simulate what a lower-acuity observer would have seen. The synthesised annual means of RB are then re-scaled to the full observed RGO group number RA using a variety of regression techniques. It is found that a very high correlation between RA and RB (rAB > 0.98) does not prevent large errors in the intercalibration (for example, sunspot maximum values can be over 30% too large even for such levels of rAB). In generating the backbone sunspot number (RBB), Svalgaard and Schatten (2015, this issue) force regression fits to pass through the scatter-plot origin, which generates unreliable fits (the residuals do not form a normal distribution) and causes sunspot cycle amplitudes to be exaggerated in the intercalibrated data. It is demonstrated that the use of quantile-quantile ("Q-Q") plots to test for a normal distribution is a useful indicator of erroneous and misleading regression fits. Ordinary least-squares linear fits, not forced to pass through the origin, are sometimes reliable (although the optimum method is shown to be different when matching peak and average sunspot group numbers). However, other fits are only reliable if non-linear regression is used. From these results it is entirely possible that the inflation of solar cycle amplitudes in the backbone group sunspot number as one goes back in time, relative to related solar-terrestrial parameters, is entirely caused by the use of inappropriate and non-robust regression techniques to calibrate the sunspot data.
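The Q-Q test can be reduced to a single number: the correlation between ordered residuals and standard-normal quantiles, where a value well below 1 flags the non-normal residuals that the abstract argues invalidate origin-forced fits. A sketch using a common plotting-position formula (an assumption, not the paper's exact procedure):

```python
import numpy as np
from statistics import NormalDist

def qq_correlation(residuals):
    """Numeric summary of a Q-Q plot: correlation between sorted
    residuals and standard-normal quantiles (near 1 => plausibly normal)."""
    r = np.sort(np.asarray(residuals, dtype=float))
    n = len(r)
    probs = (np.arange(1, n + 1) - 0.5) / n          # plotting positions
    q = np.array([NormalDist().inv_cdf(p) for p in probs])
    return float(np.corrcoef(r, q)[0, 1])

rng = np.random.default_rng(1)
x = rng.uniform(1.0, 10.0, 200)
y = 5.0 + 2.0 * x + rng.normal(0.0, 1.0, 200)

# free-intercept OLS: residuals should pass the Q-Q check
A = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
good = qq_correlation(y - A @ coef)

# fit forced through the origin: the missing intercept leaks into the
# residuals, typically dragging the Q-Q correlation down
b = float(x @ y / (x @ x))
bad = qq_correlation(y - b * x)
```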