796 results for Empirical Algorithm Analysis
Abstract:
The organization and strategic communication of election campaigns have changed over recent decades in most Western European countries, including Switzerland. Communication research has coined the term "professionalization" for this development and has compiled the characteristics of a "professionalized" campaign, such as the commissioning of external experts or the direct targeting of individual voters ("narrowcasting"). However, the background to this professionalization, and how the phenomenon can be grounded theoretically rather than merely described in practical terms, has hardly been discussed so far. This is where the present dissertation comes in. Based on an analysis of 23 election campaigns from the cantons of Aargau, Appenzell Ausserrhoden, Bern, Neuchâtel, and Zürich using Fuzzy Set Qualitative Comparative Analysis (fsQCA), it concludes that, against the theoretical background of sociological neo-institutionalism, the professionalization of election campaigns can be defined as the adaptation of campaigns to changing conditions, expectations, and demands in the campaign's most important stakeholder groups or "environments" (voters, party members, the media, and other parties). It follows that there is no single "professionalization"; rather, each campaign is adapted to those environments where this adaptation appears most urgent to the campaign managers. Professionalization should therefore be measured with four separate instruments, or professionalization indices, one per environment. If professionalization is measured with a single instrument, as has been customary, the resulting score gives only an imprecise picture of the campaign's degree of professionalization and obscures to which environment the professionalization is an adaptation. Once it has been determined how professionalized a campaign is with respect to each of the four most relevant environments, the reasons behind each form of professionalization can also be analyzed more reliably. The empirical analysis of the cantonal campaigns confirmed that different reasons do indeed lie behind professionalization with respect to each of the four environments. Campaigns are adapted ("professionalized") with respect to voter targeting when they take place in urban contexts. Professionalizing the campaign with respect to party members is particularly important when competition between parties is strong or when addressing the electorate as a whole appears unrewarding for a party. Professionalization with respect to the media occurs when the campaign must reach a large, regionally dispersed, or urban electorate. No meaningful conclusion can be drawn about professionalization with respect to other parties, since only a few of the cantonal parties studied professionalized their campaigns in this regard at all, by monitoring their opponents' campaigns and adapting their own where necessary.
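The abstract does not reproduce the fsQCA computations, but the core of the method is the consistency and coverage of a candidate sufficient condition over fuzzy-set membership scores. Below is a minimal sketch in Python; the membership scores and the variable names `urban_context` and `voter_professionalization` are hypothetical stand-ins for illustration, not data from the dissertation.

```python
import numpy as np

def consistency(x, y):
    """Consistency of 'X is sufficient for Y': sum(min(x, y)) / sum(x)."""
    return np.minimum(x, y).sum() / x.sum()

def coverage(x, y):
    """Coverage of Y by X: sum(min(x, y)) / sum(y)."""
    return np.minimum(x, y).sum() / y.sum()

# Hypothetical fuzzy-set membership scores for a handful of campaigns
urban_context = np.array([0.9, 0.7, 0.2, 0.8, 0.1, 0.6])
voter_professionalization = np.array([0.8, 0.9, 0.3, 0.7, 0.2, 0.5])

print("consistency:", round(consistency(urban_context, voter_professionalization), 3))
print("coverage:", round(coverage(urban_context, voter_professionalization), 3))
```

High consistency would indicate that urban context is (close to) a sufficient condition for voter-oriented professionalization, which is the kind of claim the dissertation evaluates across its 23 cases.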
Abstract:
In this paper, we extend the debate concerning Credit Default Swap valuation to include time-varying correlations and covariances. Traditional multivariate techniques treat the correlations between covariates as constant over time; however, this assumption is not supported by the data. Second, since financial data do not follow a normal distribution because of their heavy tails, modeling the data with a generalized linear model (GLM) that incorporates copulas emerges as a more robust technique than traditional approaches. This paper also includes an empirical analysis of the regime-switching dynamics of credit risk in the presence of liquidity, following the general practice of assuming that credit and market risk follow a Markov process. The study was based on Credit Default Swap data obtained from Bloomberg spanning the period January 1, 2004 to August 8, 2006. The empirical examination of the regime-switching tendencies provided quantitative support to the anecdotal view that liquidity decreases as credit quality deteriorates. The analysis also examined the joint probability distribution of the credit risk determinants across credit quality through the use of a copula function, which separates the behavior embedded in the marginal gamma distributions from the level of dependence captured by the copula. The results suggest that the time-varying joint correlation matrix performed far better than the constant correlation matrix, the centerpiece of linear regression models.
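As a rough illustration of the copula idea described above (not the paper's actual estimation), the dependence between two credit-risk determinants can be separated from their gamma marginals by mapping each series through its fitted gamma CDF and estimating a Gaussian-copula correlation on the resulting normal scores. A minimal sketch, with simulated data standing in for the Bloomberg CDS series:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated stand-ins for two positively dependent, gamma-like risk determinants
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=1000)
x = stats.gamma.ppf(stats.norm.cdf(z[:, 0]), a=2.0, scale=1.5)
y = stats.gamma.ppf(stats.norm.cdf(z[:, 1]), a=3.0, scale=0.8)

# 1. Fit gamma marginals (location fixed at 0)
ax, _, sx = stats.gamma.fit(x, floc=0)
ay, _, sy = stats.gamma.fit(y, floc=0)

# 2. Probability-integral transform to uniforms, then to normal scores
u = stats.gamma.cdf(x, ax, scale=sx)
v = stats.gamma.cdf(y, ay, scale=sy)
zu, zv = stats.norm.ppf(u), stats.norm.ppf(v)

# 3. The Gaussian-copula correlation captures the dependence stripped of the marginals
rho = np.corrcoef(zu, zv)[0, 1]
print("estimated copula correlation:", round(rho, 3))
```

Allowing `rho` to vary over time (e.g., by re-estimating it over rolling windows or per Markov regime) is one simple way to operationalize the time-varying dependence the paper argues for.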
Abstract:
Diamonds are known for both their beauty and their durability. Jefferson National Lab in Newport News, VA has found a way to use the diamond's strength to view the beauty of the inside of the atomic nucleus, in the hope of finding exotic forms of matter. By firing very fast electrons at a diamond sheet no thicker than a human hair, high-energy particles of light known as photons are produced with a high degree of polarization that can illuminate the constituents of the nucleus known as quarks. The University of Connecticut Nuclear Physics group is responsible for crafting these extremely thin, high-quality diamond wafers. These wafers must be cut from larger stones about the size of a human finger and then carefully machined down to the final thickness. The thinning of these diamonds is extremely challenging, as the diamond's greatest strength also becomes its greatest weakness. The Connecticut Nuclear Physics group has developed a novel method, based on laser interferometry, to assist industrial partners in assessing the quality of the final machining steps. The images of the diamond surface produced by the interferometer encode the thickness and shape of the diamond surface in a complex way that requires detailed analysis to extract. We have developed a novel software application to analyze these images based on the method of simulated annealing. Being able to image the surface of these diamonds without requiring costly X-ray diffraction measurements allows rapid feedback to the industrial partners as they refine their thinning techniques. Thus, by utilizing a material found to be beautiful by many, the beauty of nature can be brought more clearly into view.
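The abstract only names the method; as a hedged illustration of how simulated annealing can fit a parametric surface model to interferometer fringes, here is a minimal Python sketch. The quadratic phase model, the synthetic "observed" image, and the cooling schedule are all assumptions for illustration, not the group's actual software.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observed" interferometer intensity from a quadratic phase surface
X, Y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
def fringes(p):                      # p = (tilt_x, tilt_y, curvature)
    phase = p[0] * X + p[1] * Y + p[2] * (X**2 + Y**2)
    return 0.5 * (1 + np.cos(2 * np.pi * phase))

true_p = np.array([3.0, -2.0, 5.0])
observed = fringes(true_p) + 0.05 * rng.standard_normal(X.shape)

def cost(p):
    return np.mean((fringes(p) - observed) ** 2)

# Simulated annealing: random perturbations, accept uphill moves with prob exp(-d/T)
p = np.zeros(3)
c = cost(p)
T = 1.0
for step in range(20000):
    cand = p + rng.normal(scale=0.1, size=3)
    cc = cost(cand)
    if cc < c or rng.random() < np.exp(-(cc - c) / T):
        p, c = cand, cc
    T *= 0.9995                      # geometric cooling schedule

# Note: p and -p produce identical fringes, so the fit may return either sign
print("fitted parameters:", np.round(p, 2), "true:", true_p)
```

The appeal of annealing here is that the fringe-fit cost surface is highly multimodal (fringes are periodic), so a stochastic search that occasionally accepts worse solutions is less likely to stall in a local minimum than plain gradient descent.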
Abstract:
Random Forests™ is reported to be one of the most accurate classification algorithms for complex data analysis. It shows excellent performance even when most predictors are noisy and the number of variables is much larger than the number of observations. In this thesis, Random Forests was applied to a large-scale lung cancer case-control study. A novel way of automatically selecting prognostic factors was proposed, and a synthetic positive control was used to validate the Random Forests method. Throughout this study we showed that Random Forests can deal with a large number of weak input variables without overfitting and can account for non-additive interactions between these input variables. Random Forests can also be used for variable selection without being adversely affected by collinearities. Random Forests can handle large-scale data sets without rigorous data preprocessing and has a robust variable importance ranking measure. A novel variable selection method is proposed in the context of Random Forests that uses the data noise level as the cut-off value to determine the subset of important predictors. This new approach enhances the ability of the Random Forests algorithm to automatically identify important predictors in complex data. The cut-off value can also be adjusted based on the results of the synthetic positive control experiments. When the data set had a high variables-to-observations ratio, Random Forests complemented the established logistic regression. This study suggests that Random Forests is recommended for such high-dimensional data: one can use Random Forests to select the important variables and then use logistic regression, or Random Forests itself, to estimate the effect size of the predictors and to classify new observations. We also found that the mean decrease in accuracy is a more reliable variable-ranking measure than the mean decrease in Gini.
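The thesis defines its own noise-level cut-off; as a hedged approximation of that idea (not necessarily the author's exact procedure), one can append permuted "noise" copies of the predictors, fit a Random Forest, and keep only the real predictors whose permutation importance exceeds the largest importance achieved by any noise copy. A minimal sketch with scikit-learn on simulated data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=400, n_features=20, n_informative=5,
                           n_redundant=2, random_state=0)

# Append shuffled copies of each column as pure-noise reference predictors
X_noise = rng.permuted(X, axis=0)          # each column shuffled independently, breaking any link with y
X_aug = np.hstack([X, X_noise])

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_aug, y)
imp = permutation_importance(rf, X_aug, y, n_repeats=10, random_state=0).importances_mean

cutoff = imp[X.shape[1]:].max()            # noise level: best importance among the noise columns
selected = np.where(imp[:X.shape[1]] > cutoff)[0]
print("selected predictors:", selected)
```

The cut-off could then be tightened or relaxed in the spirit of the synthetic positive control experiments, e.g. by requiring selected predictors to beat the noise maximum across several repeated permutations.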
Abstract:
Foreign currency deposits (FCD) are prevalent in many low-income developing countries, but their impact on bank lending has rarely been examined. An examination of cross-country data indicates that a higher proportion of FCD in total deposits is associated with growth in private credit only in inflationary circumstances (annual inflation above 24 percent); below this threshold level of inflation, FCD can lead to a decline in private credit. Given that FCD exhibit persistence, deregulating them in low-income countries may do more harm than good to financial development in the long run, notably after successful containment of inflation.
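The abstract implies a threshold specification in which the effect of the FCD share on private-credit growth switches around 24 percent annual inflation. A hedged sketch of one way to estimate such a specification (an interaction with a high-inflation dummy; the paper's actual model and data may differ) using statsmodels on simulated data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300
fcd_share = rng.uniform(0, 0.6, n)                 # FCD / total deposits
inflation = rng.uniform(0, 0.6, n)                 # annual inflation rate
high_inf = (inflation > 0.24).astype(float)        # threshold reported in the abstract

# Simulated outcome: FCD helps credit growth only above the inflation threshold
credit_growth = (-0.1 * fcd_share + 0.4 * fcd_share * high_inf
                 + 0.05 * rng.standard_normal(n))

X = sm.add_constant(np.column_stack([fcd_share, high_inf, fcd_share * high_inf]))
res = sm.OLS(credit_growth, X).fit()
print(res.params)   # const, fcd_share, high_inf, fcd_share x high_inf
```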
Abstract:
This paper empirically analyzes India’s money demand function over the period 1980 to 2007 using monthly data and over the period 1976 to 2007 using annual data. Cointegration test results indicated that when money supply is represented by M1 or M2, a cointegrating vector is detected among real money balances, interest rates, and output. In contrast, when money supply is represented by M3, no long-run equilibrium relationship is found in the money demand function. Moreover, when the money demand function was estimated using dynamic OLS, the sign conditions of the coefficients of output and interest rates were found to be consistent with theoretical rationale, and statistical significance was confirmed when money supply was represented by either M1 or M2. Consequently, though India’s central bank presently uses M3 as an indicator of future price movements, it appears more appropriate to focus on M1 or M2, rather than M3, in managing monetary policy.
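For readers unfamiliar with the estimator, dynamic OLS (DOLS) augments the long-run regression of real money balances on output and the interest rate with leads and lags of the regressors' first differences to correct for endogeneity. A minimal sketch on simulated data, using one lead and one lag (the paper's exact lag choice is not stated in the abstract):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
y = np.cumsum(rng.standard_normal(n))              # log output (I(1))
r = np.cumsum(rng.standard_normal(n)) * 0.1        # interest rate (I(1))
m = 1.2 * y - 0.5 * r + rng.standard_normal(n)     # log real money balances

df = pd.DataFrame({"m": m, "y": y, "r": r})
for k in (-1, 0, 1):                               # lead, contemporaneous, lag of first differences
    df[f"dy_{k}"] = df["y"].diff().shift(k)
    df[f"dr_{k}"] = df["r"].diff().shift(k)
df = df.dropna()

X = sm.add_constant(df.drop(columns="m"))
res = sm.OLS(df["m"], X).fit()
print(res.params[["y", "r"]])                      # long-run income and interest semi-elasticities
```

The coefficients on the levels of `y` and `r` are the long-run money-demand parameters whose signs and significance the paper checks for M1, M2, and M3.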
Abstract:
This study aims to examine the international value distribution structure among major East Asian economies and the US. Mainstream trade theory explains the gains from trade; the global value chain (GVC) approach, however, emphasizes the uneven benefits of globalization among trading partners. The present study is mainly based on this view, examining which economy gains the most and which the least from the East Asian production networks. Two key industries, electronics and automobiles, are our principal focus. An input-output method is employed to trace the creation and flows of value added within the region. A striking finding is that the value-added shares of some ASEAN economies are increasingly shrinking, with that value captured instead by developed countries, particularly Japan. Policy implications are discussed in the final section.
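The abstract only names the input-output method; the core computation behind such value-added tracing is the Leontief inverse applied to final demand, with value-added coefficients on the diagonal. A minimal three-economy sketch in Python (the coefficient matrix and final-demand vector are made up for illustration, not the study's data):

```python
import numpy as np

# Hypothetical 3-economy intermediate-input coefficient matrix A
A = np.array([[0.15, 0.10, 0.05],
              [0.20, 0.25, 0.10],
              [0.05, 0.15, 0.20]])
f = np.array([100.0, 80.0, 60.0])         # final demand met by each economy

L = np.linalg.inv(np.eye(3) - A)          # Leontief inverse: gross output per unit of final demand
v = 1.0 - A.sum(axis=0)                   # value added per unit of gross output

# VA[i, j] = value added generated in economy i to satisfy economy j's final demand
VA = np.diag(v) @ L @ np.diag(f)
print(np.round(VA, 1))
print("column sums equal final demand:", np.allclose(VA.sum(axis=0), f))
```

Reading the matrix `VA` down a column shows where the value embodied in one economy's final output is actually created, which is exactly the kind of share the study tracks for the ASEAN economies, Japan, and the US.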
Abstract:
This paper empirically analyzes India’s monetary policy reaction function by applying the Taylor (1993) rule and its open-economy version, estimated by dynamic OLS. The analysis uses monthly data from April 1998 to December 2007. When the simple Taylor rule was estimated for India, the output gap coefficient was statistically significant and its sign was consistent with theoretical rationale; the same was not true of the inflation coefficient. When the Taylor rule with the exchange rate was estimated, the coefficients of the output gap and the exchange rate were statistically significant with the expected signs, whereas the results for inflation remained the same as before. These results suggest that the inflation rate has not played a role in the conduct of India’s monetary policy, and that it would be inappropriate for India to adopt an inflation-targeting type of policy framework.
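For concreteness, the open-economy Taylor rule described above amounts to regressing the policy rate on inflation, the output gap, and the change in the exchange rate. A hedged sketch on simulated monthly data (plain OLS rather than the paper's dynamic OLS, and a data-generating process chosen only to mirror the pattern reported for India):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 120                                            # roughly ten years of monthly data
inflation = 5 + rng.standard_normal(n)
output_gap = rng.standard_normal(n)
d_exchange = 0.5 * rng.standard_normal(n)          # percent change in the exchange rate

# Simulated policy rate reacting to the output gap and exchange rate but not to inflation
rate = (6 + 0.0 * inflation + 0.8 * output_gap
        - 0.3 * d_exchange + 0.2 * rng.standard_normal(n))

X = sm.add_constant(np.column_stack([inflation, output_gap, d_exchange]))
res = sm.OLS(rate, X).fit()
print(res.summary())                               # inflation coefficient should be insignificant here
```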