922 results for Key Agreement, Password Authentication, Three-party
Abstract:
BACKGROUND Current evidence on myelopoietic growth factors is difficult to overview for the practicing haematologist/oncologist. International guidelines are sometimes conflicting, exclude certain patient groups, or cannot be applied directly to the German health system. This guideline by the Infectious Diseases Working Party (AGIHO) of the German Society of Haematology and Medical Oncology (DGHO) gives evidence-based recommendations for the use of G-CSF, pegylated G-CSF, and biosimilars to prevent infectious complications in cancer patients undergoing chemotherapy, including those with haematological malignancies. METHODS We systematically searched and evaluated current evidence. An expert panel discussed the results and recommendations. We then compared our recommendations to current international guidelines. RESULTS We summarised the data from eligible studies in evidence tables and developed recommendations for different entities and risk groups. CONCLUSION A comprehensive literature search and expert panel consensus confirmed many key recommendations given by international guidelines. Evidence for growth factors during acute myeloid leukaemia induction chemotherapy and for pegfilgrastim use in haematological malignancies was rated lower than in other guidelines.
Abstract:
Introduction: Schizophrenia patients frequently suffer from complex motor abnormalities, including fine and gross motor disturbances, abnormal involuntary movements, neurological soft signs and parkinsonism. These symptoms occur early in the course of the disease, continue in chronic patients and may deteriorate with antipsychotic medication. Furthermore, gesture performance is impaired in patients, including the pantomime of tool use. Whether schizophrenia patients show difficulties in actual tool use has not yet been investigated. Human tool use is complex and relies on a network of distinct and distant brain areas. We therefore aimed to test whether schizophrenia patients have difficulties in tool use and to assess associations with structural brain imaging using voxel-based morphometry (VBM) and tract-based spatial statistics (TBSS). Methods: In total, 44 patients with schizophrenia (DSM-5 criteria; 59% men, mean age 38) underwent structural MR imaging and performed the Tool-Use test. The test examines the use of a scoop and a hammer in three conditions: pantomime (without the tool), demonstration (with the tool) and actual use (with a recipient object). T1-weighted images were processed using SPM8 and DTI data using FSL TBSS routines. To assess structural alterations underlying impaired tool use, we first compared gray matter (GM) volume in VBM and white matter (WM) integrity in TBSS data of patients with and without difficulties in actual tool use. Next, we explored correlations of Tool-Use scores with VBM and TBSS data. Group comparisons were family-wise error corrected for multiple tests. Correlations were uncorrected (p < 0.001) with a minimum cluster threshold of 17 voxels (equivalent to a map-wise false positive rate of alpha < 0.0001 using a Monte Carlo procedure). Results: Tool use was impaired in schizophrenia (43.2% pantomime, 11.6% demonstration, 11.6% use). Impairment was related to reduced GM volume and WM integrity. Whole-brain analyses detected an effect in the SMA in the group analysis. Correlations of tool-use scores and brain structure revealed alterations in brain areas of the dorso-dorsal pathway (superior occipital gyrus, superior parietal lobule, and dorsal premotor area) and the ventro-dorsal pathway (middle occipital gyrus, inferior parietal lobule) of the action network, as well as the insula and the left hippocampus. Furthermore, significant correlations within connecting fiber tracts, particularly alterations within the bilateral superior and anterior corona radiata as well as the corpus callosum, were associated with tool-use performance. Conclusions: Tool-use performance was impaired in schizophrenia, and this impairment was associated with reduced GM volume in the action network. Our results are in line with reports of impaired tool use in patients with brain lesions, particularly of the dorso-dorsal and ventro-dorsal streams of the action network. In addition, an effect of tool use on WM integrity was shown within fiber tracts connecting regions important for planning and executing tool use. Furthermore, the hippocampus is part of a brain system responsible for spatial memory and navigation. The results suggest that structural brain alterations in the common praxis network contribute to impaired tool use in schizophrenia.
Abstract:
In his contribution, Joppke justifies his selection of foundational scholars by linking each to what he sees as the three key facets of citizenship: status, rights and identity. Maarten Vink explicitly links his research agenda to the first, status, and outlines why it is so important. In identifying three facets of citizenship, Joppke acknowledges that some academics would include political participation, but he ultimately decides against it. But here we can, and should, broaden citizenship studies by bringing in insights from the behavioral politics tradition in domestic politics - when and why people engage in political acts - and from the social movements literature in sociology. I believe that the American debate on immigration reform, admittedly stalled, would not have advanced as far as it has without the social movement activism of DREAMers - unauthorized young people pushing for a path to citizenship - and the belief that Barack Obama won re-election in part because of the Latino vote. Importantly, one type of political activism demands formal citizenship, the other does not. As many contributors note, the “national models” approach has had a significant impact on citizenship studies. Whether one views such models through a cultural, institutional or historical lens, this tends to be a top-down, macro-level framework. What about immigrants’ agency? In Canada, although the ruling Conservative government is shifting citizenship discourse to a more traditional language - as Winter points out - it has not reduced immigration, ended dual citizenship, or eliminated multiculturalism, all goals of the Reform Party that the current prime minister once helped build. “Lock-in” effects (or policy feedback loops) based on high immigrant naturalization and the coming of age of a second generation with citizenship also demand study, in North America and elsewhere. Much of the research thus far suggests that political decisions over citizenship status and rights do not seem linked to immigrants’ political activism. State-centered decision-making may have characterized policy in the early post-World War II period in Europe (and East Asia?), but does it continue to hold today? Majority publics and immigrant-origin residents are increasingly politicized around citizenship and immigration. Does immigrant agency extend citizenship status, rights and identity to those born outside the polity? Is electoral power key, or is protest necessary? How is citizenship practiced, and contested, irrespective of formal status? These are important and understudied empirical questions, ones that demand theoretical creativity - across sub-fields and disciplines - in conceptualizing and understanding citizenship in contemporary times.
Abstract:
Three-dimensional oxalate-based compounds {[Ru(bpy)3][Cu2xNi2(1-x)(ox)3]}n (0 ≤ x ≤ 1, ox = C2O4^2-, bpy = 2,2′-bipyridine) were synthesized. The structure was determined for x = 1 by X-ray diffraction on a single crystal. The compound crystallizes in the cubic space group P4132. It shows a three-dimensional 10-gon 3-connected (10,3) anionic network in which copper(II) has an unusual tris(bischelated) environment. X-ray powder diffraction patterns and their Rietveld refinement show that all the compounds along the series are isostructural and single-phased. According to X-ray absorption spectroscopy, copper(II) and nickel(II) have octahedral environments, elongated and trigonally distorted, respectively. As shown by natural circular dichroism, the optically active forms of {[Ru(bpy)3][Cu2xNi2(1-x)(ox)3]}n are obtained starting from resolved Δ- or Λ-[Ru(bpy)3]^2+. The Curie-Weiss temperatures range between −55 K (x = 1) and −150 K (x = 0). The antiferromagnetic exchange interaction thus decreases as the copper content increases, in agreement with the crystallographic structure of the compounds and the electronic structure of the metal ions. At low temperature, the compounds exhibit complex long-range ordered magnetic behavior.
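The Curie-Weiss temperatures quoted above are the standard outcome of fitting the paramagnetic susceptibility to chi = C/(T - theta). As a minimal sketch (synthetic data only, not the authors' measured series or fitting code), the linearised form 1/chi = T/C - theta/C can be fitted with a straight line:

```python
import numpy as np

# Curie-Weiss law: chi = C / (T - theta), i.e. 1/chi = T/C - theta/C.
# A linear fit of 1/chi versus T yields C from the slope and theta from
# the intercept. All numbers below are synthetic, for illustration only.
T = np.linspace(150.0, 300.0, 31)       # temperature (K), paramagnetic regime
theta_true, C_true = -55.0, 2.1         # assumed Weiss temperature and constant
chi = C_true / (T - theta_true)         # synthetic susceptibility data

slope, intercept = np.polyfit(T, 1.0 / chi, 1)
C_fit = 1.0 / slope
theta_fit = -intercept / slope          # theta < 0 signals antiferromagnetism
print(f"C = {C_fit:.2f}, theta = {theta_fit:.1f} K")   # recovers 2.1 and -55 K
```

A negative theta, as found across the whole series, is the signature of dominant antiferromagnetic exchange.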
Abstract:
Luminescence and energy transfer in [Zn1-xRux(bpy)3][NaAl1-yCry(ox)3] (x ≈ 0.01, y = 0.006-0.22; bpy = 2,2′-bipyridine, ox = C2O4^2-) and [Zn1-x-yRuxOsy(bpy)3][NaAl(ox)3] (x ≈ 0.01, y = 0.012) are presented and discussed. Surprisingly, the luminescence of the isolated luminophores [Ru(bpy)3]^2+ and [Os(bpy)3]^2+ in [Zn(bpy)3][NaAl(ox)3] is hardly quenched at room temperature. Steady-state luminescence spectra and decay curves show that energy transfer occurs between [Ru(bpy)3]^2+ and [Cr(ox)3]^3- and between [Ru(bpy)3]^2+ and [Os(bpy)3]^2+ in [Zn1-xRux(bpy)3][NaAl1-yCry(ox)3] and [Zn1-x-yRuxOsy(bpy)3][NaAl(ox)3], respectively. For a quantitative investigation of the energy transfer, a shell-type model is developed, using a Monte Carlo procedure and the structural parameters of the systems. A good description of the experimental data is obtained assuming electric dipole-dipole interaction between donors and acceptors, with a critical distance Rc of 15 Å for [Ru(bpy)3]^2+ to [Cr(ox)3]^3- energy transfer and of 33 Å for [Ru(bpy)3]^2+ to [Os(bpy)3]^2+ energy transfer. These values are in good agreement with those derived using the Förster-Dexter theory.
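The shell-type Monte Carlo model is only summarised in the abstract; the sketch below illustrates the general idea under explicit assumptions (hypothetical shell radii and site counts, and Förster-type (Rc/R)^6 rates for electric dipole-dipole coupling):

```python
import numpy as np

rng = np.random.default_rng(0)

def transfer_probability(distances, Rc):
    # Electric dipole-dipole (Foerster) coupling: an acceptor at distance R
    # contributes a relative transfer rate (Rc/R)**6, where Rc is the
    # critical distance at which transfer and emission are equally likely.
    rates = np.sum((Rc / np.asarray(distances)) ** 6)
    return rates / (1.0 + rates)

def mc_quenching(shell_radii, shell_sites, doping, Rc, n_trials=10_000):
    # Monte Carlo over random acceptor occupations of concentric shells
    # around one donor; returns the ensemble-averaged transfer probability.
    p = 0.0
    for _ in range(n_trials):
        dists = []
        for R, n_sites in zip(shell_radii, shell_sites):
            dists.extend([R] * int((rng.random(n_sites) < doping).sum()))
        if dists:
            p += transfer_probability(dists, Rc)
    return p / n_trials

# Hypothetical shell geometry (radii in angstroms, sites per shell); the
# real values would be taken from the oxalate network crystal structure.
radii = [6.0, 9.5, 12.0, 15.5, 18.0]
sites = [6, 12, 8, 24, 24]
print(mc_quenching(radii, sites, doping=0.05, Rc=15.0))
```

With Rc = 33 Å instead of 15 Å, the same geometry gives much stronger quenching, which is the qualitative difference between the Cr(III) and Os(II) acceptors reported above.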
Abstract:
In 2014, the Dispute Settlement Body (DSB) of the World Trade Organization (WTO) adopted seven panel reports and six Appellate Body rulings. Two of the cases relate to anti-dumping measures. Three cases, comprising five complaints, are of particular interest and are summarized and discussed below. China – Rare Earths further refines the relationship between protocols of accession and the general provisions of WTO agreements, in particular the exceptions of Article XX GATT. Recourse to that provision is no longer excluded but depends on a careful case-by-case analysis. While China failed to comply with the conditions for export restrictions, the case reiterates the problem of insufficiently developed disciplines on export restrictions on strategic minerals and other commodities in WTO law. EC – Seals Products is a landmark case for two reasons. Firstly, it limits the application of the Agreement on Technical Barriers to Trade (TBT Agreement), resulting henceforth in a narrow reading of technical regulations. Normative rules prescribing conditions for importation are to be dealt with under the rules of the General Agreement on Tariffs and Trade (GATT) instead. Secondly, the ruling permits recourse to public morals in justifying import restrictions essentially on the basis of process and production methods (PPMs). Meanwhile, the more detailed implications for the extraterritorial application of such rules and for the concept of PPMs remain open, as these key issues were not raised by the parties to the case. Peru – Agricultural Products adds to the interpretation of the Agreement on Agriculture (AoA), but most importantly, it confirms the existing segregation of WTO law and the law of free trade agreements. The case is of particular importance for Switzerland in its relations with the European Union (EU). It raises, but does not fully answer, the question of whether, in a bilateral agreement, Switzerland or the EU can, as a matter of WTO law, lawfully waive their right to lodge complaints against each other under WTO law within the scope of their bilateral agreement, for example the Agreement on Agriculture, where such a clause exists.
Abstract:
Sound knowledge of the spatial and temporal patterns of rockfalls is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest and rare trajectories may not necessarily be captured due to the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent to both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m long road section in the Swiss Alps, we illustrate results from 488 rockfalls detected in 1260 trees. We show that tree impact data can be used not only (i) to reconstruct the real frequency of rockfalls for individual cells, but also (ii) to calibrate the rockfall model Rockyfor3D and (iii) to transform simulated trajectories into real frequencies, as sketched below. Calibrated simulation results are in good agreement with real rockfall frequencies and exhibit significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between the zones. The contribution provides an approach for hazard zoning procedures that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through a systematic inclusion of impact records in trees.
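The calibration step, turning simulated passage counts into events per year, can be sketched in a few lines. Everything below is a hypothetical illustration of the idea rather than the authors' procedure; cell labels, counts and the simple linear rescaling are assumptions:

```python
# Tree-ring data give a real frequency for cells with trees; simulated
# passage counts (e.g. from a Rockyfor3D-style model) cover all cells.
# Cells with both kinds of data fix a scale factor that converts
# simulated counts into events per year everywhere.

def real_frequency(n_impacts: int, record_years: float) -> float:
    """Tree-ring derived rockfall frequency of a cell (events per year)."""
    return n_impacts / record_years

def calibrate(sim_passages: dict, obs_freq: dict) -> dict:
    """Rescale simulated passage counts into events per year using the
    cells for which a tree-ring frequency is available."""
    common = obs_freq.keys() & sim_passages.keys()
    scale = sum(obs_freq[c] for c in common) / sum(sim_passages[c] for c in common)
    return {c: scale * n for c, n in sim_passages.items()}

# Example: two cells with dendro records calibrate a third without trees.
sim = {"A1": 1200, "A2": 300, "B1": 2500}                  # simulated passages
obs = {"A1": real_frequency(24, 120.0), "A2": real_frequency(6, 120.0)}
print(calibrate(sim, obs))   # events per year for every simulated cell
```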
Abstract:
Seizure freedom in patients suffering from pharmacoresistant epilepsies is still not achieved in 20–30% of all cases. Hence, current therapies need to be improved, based on a more complete understanding of ictogenesis. In this respect, the analysis of functional networks derived from intracranial electroencephalographic (iEEG) data has recently become a standard tool. Functional networks, however, are purely descriptive models and thus are conceptually unable to predict fundamental features of iEEG time series, e.g., in the context of therapeutic brain stimulation. In this paper we present first steps towards overcoming the limitations of functional network analysis, by showing that its results are implied by a simple predictive model of time-sliced iEEG time series. More specifically, we learn distinct graphical models (so-called Chow–Liu (CL) trees) as models for the spatial dependencies between iEEG signals. Bayesian inference is then applied to the CL trees, allowing for an analytic derivation/prediction of functional networks based on thresholding the absolute values of the Pearson correlation coefficient (CC) matrix. Using various measures, the networks obtained in this way are then compared to those derived in the classical way from the empirical CC matrix. In the high-threshold limit we find (a) an excellent agreement between the two networks and (b) key features of periictal networks as they have previously been reported in the literature. Apart from functional networks, both matrices are also compared element-wise, showing that the CL approach leads to a sparse representation, by setting small correlations to values close to zero while preserving the larger ones. Overall, this paper shows the validity of CL trees as simple, spatially predictive models for periictal iEEG data. Moreover, we suggest straightforward generalizations of the CL approach for modeling also the temporal features of iEEG signals.
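Both network constructions compared in the paper can be sketched compactly. The Python below is an illustrative reconstruction, not the authors' code: under a Gaussian assumption, pairwise mutual information is MI = -0.5 ln(1 - rho^2), the Chow–Liu tree is the maximum spanning tree of the MI graph, and the classical functional network is obtained by thresholding the absolute CC matrix:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def chow_liu_edges(X):
    """Chow-Liu tree of (samples x channels) data: the maximum spanning
    tree of the pairwise mutual-information graph, with Gaussian MI
    computed as -0.5 * log(1 - rho**2)."""
    rho = np.corrcoef(X, rowvar=False)
    mi = -0.5 * np.log(np.clip(1.0 - rho ** 2, 1e-12, None))
    np.fill_diagonal(mi, 0.0)
    # scipy offers a *minimum* spanning tree, so negate the weights.
    mst = minimum_spanning_tree(-mi).tocoo()
    return sorted((int(i), int(j)) for i, j in zip(mst.row, mst.col))

def functional_network(X, threshold):
    """Classical functional network: threshold the absolute CC matrix."""
    cc = np.abs(np.corrcoef(X, rowvar=False))
    np.fill_diagonal(cc, 0.0)
    return (cc >= threshold).astype(int)

# Surrogate 'iEEG' data: 4 channels, 1000 samples, one strong dependency.
rng = np.random.default_rng(1)
X = rng.standard_normal((1000, 4))
X[:, 1] += 0.8 * X[:, 0]
print(chow_liu_edges(X))                     # tree containing the (0, 1) edge
print(functional_network(X, threshold=0.4))  # adjacency with the same edge
```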
Abstract:
To say that regionalism is gaining momentum has become an understatement. To mourn the lack of progress in multilateral trade rule-making is a commonplace in the discourse of politicians regretting the WTO negotiation standstill, and of “know-what-to-do” academics. The real problem is the uneven playing field resulting from increasing differences in rules and obligations. The Transatlantic Trade and Investment Partnership Agreement (TTIP) is a very ambitious project. WTI studies in 2014 have shown that the implications for Switzerland could be enormous. But even the combined market power of the two TTIP participants – the EU and the USA – will not level the playing field shaped by differing regulatory frameworks and market access barriers for trade in agriculture. Such differences will remain in three areas which, incidentally, are also vital for a global response to the food security challenge of feeding 9 billion people by the year 2050: market access, non-tariff barriers, and trade-distorting domestic support programmes. This means that without multilateral progress the TTIP and other so-called mega-regionals, if successfully concluded, will exacerbate rather than lessen trade distortions. While this makes farmers in rich countries safer from competition, competitive production in all countries will be hampered. Consequently, and notwithstanding the many affirmations to the contrary, farm policies worldwide will continue to address only farmer security without increasing global food security. What are the implications of the TTIP for Swiss agriculture? This article, commissioned by Waseda University in Tokyo, finds that the failure to achieve further reforms – including a number of areas where earlier reforms have been reversed – presents Switzerland and Swiss agriculture with a terrible dilemma in the event of a successful conclusion of the TTIP. If Swiss farm production is to survive for more than another generation, continuous reform efforts are required, and over-reliance on the traditional instruments of border protection and product support is to be avoided. Without a substantial TTIP obliging Switzerland to follow suit, autonomous reforms will remain extremely fragile.
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories’ efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one’s bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker’s quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker’s type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When repeated offers are allowed, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model’s qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
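The partition-function environment of Chapter 3 is easy to make concrete. In the sketch below all worths are invented purely for illustration; the point is the data structure, a worth for each coalition in each coalition structure, which is exactly what allows free-riding externalities to be encoded:

```python
def partitions(players: list) -> list:
    """Enumerate all coalition structures (set partitions) of a player set."""
    if not players:
        return [[]]
    head, rest = players[0], players[1:]
    result = []
    for structure in partitions(rest):
        for i, block in enumerate(structure):
            result.append(structure[:i] + [block | {head}] + structure[i + 1:])
        result.append(structure + [frozenset({head})])
    return result

def worth(coalition: frozenset, structure: list) -> float:
    """Toy partition function with a free-riding externality: the singleton
    {3} is worth more when 1 and 2 cooperate, mimicking a non-signatory
    benefiting from others' emission cuts. All numbers are made up."""
    if coalition == frozenset({3}) and frozenset({1, 2}) in structure:
        return 2.0
    return float(len(coalition))

for structure in partitions([1, 2, 3]):
    print([set(b) for b in structure],
          [worth(b, structure) for b in structure])
```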
Abstract:
BACKGROUND The aim of this study was to evaluate the accuracy of linear measurements on three imaging modalities: lateral cephalograms from a cephalometric machine with a 3 m source-to-mid-sagittal-plane distance (SMD), lateral cephalograms from a machine with a 1.5 m SMD, and 3D models from cone-beam computed tomography (CBCT) data. METHODS Twenty-one dry human skulls were used. Lateral cephalograms were taken using two cephalometric devices: one with a 3 m SMD and one with a 1.5 m SMD. CBCT scans were taken with a 3D Accuitomo® 170, and 3D surface models were created in Maxilim® software. Thirteen linear measurements were completed twice by two observers with a 4-week interval. Direct physical measurements with a digital calliper were defined as the gold standard. Statistical analysis was performed. RESULTS Nasion-Point A was significantly different from the gold standard in all methods. More statistically significant differences were found for the measurements on the 3 m SMD cephalograms than for the other methods. Intra- and inter-observer agreement based on 3D measurements was slightly better than for the other methods. LIMITATIONS Dry human skulls without soft tissues were used. Therefore, the results have to be interpreted with caution, as they do not fully represent clinical conditions. CONCLUSIONS 3D measurements resulted in better observer agreement. The accuracy of the measurements based on CBCT and the 1.5 m SMD cephalogram was better than that of the 3 m SMD cephalogram. These findings demonstrate the accuracy and reliability of linear measurements based on 3D CBCT data compared with 2D techniques. Future studies should focus on the implementation of 3D cephalometry in clinical practice.
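The abstract does not name the exact statistical tests; as one common choice for such paired linear measurements, each modality can be compared against the calliper gold standard with a bias estimate and a paired t-test. A minimal sketch with illustrative numbers:

```python
import numpy as np
from scipy import stats

def compare_to_gold(measured, gold):
    """Bias (mean difference), its spread and a paired t-test p-value for
    one modality against the digital-calliper gold standard."""
    measured, gold = np.asarray(measured), np.asarray(gold)
    diff = measured - gold
    t_stat, p_value = stats.ttest_rel(measured, gold)
    return {"bias_mm": diff.mean(), "sd_mm": diff.std(ddof=1), "p": p_value}

# Illustrative values for one distance on five skulls (mm), not study data.
gold    = [51.2, 48.9, 50.4, 52.8, 49.5]
ceph_3m = [52.0, 49.8, 51.5, 53.9, 50.3]
print(compare_to_gold(ceph_3m, gold))
```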
Abstract:
Background: It is as yet unclear whether there are differences between using electronic key feature problems (KFPs) or electronic case-based multiple choice questions (cbMCQ) for the assessment of clinical decision making. Summary of Work: Fifth-year medical students were exposed to clerkships which ended with a summative exam. Assessment of knowledge per exam was done by 6-9 KFPs, 9-20 cbMCQ and 9-28 MC questions. Each KFP consisted of a case vignette and three key features (KF) using “long menu” as the question format. We sought students’ perceptions of the KFPs and cbMCQs in focus groups (n = 39 students). Furthermore, statistical data of 11 exams (n = 377 students) concerning the KFPs and (cb)MCQs were compared. Summary of Results: The analysis of the focus groups resulted in four themes reflecting students’ perceptions of KFPs and their comparison with (cb)MCQ: KFPs were perceived as (i) more realistic, (ii) more difficult, and (iii) more motivating for the intense study of clinical reasoning than (cb)MCQ, and (iv) showed an overall good acceptance when some preconditions are taken into account. The statistical analysis revealed that there was no difference in difficulty; however, KFPs showed a higher discrimination and reliability (G-coefficient), even when corrected for testing times. Correlation of the different exam parts was intermediate. Conclusions: Students perceived the KFPs as more motivating for the study of clinical reasoning. Statistically, KFPs showed a higher discrimination and higher reliability than cbMCQs. Take-home messages: Including KFPs with long menu questions in summative clerkship exams seems to offer positive educational effects.
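The reported discrimination index is not defined in the abstract; the corrected item-total correlation is a standard operationalisation, sketched here with purely synthetic scores:

```python
import numpy as np

def item_discrimination(scores):
    """Corrected item-total correlation per item: each item's score
    correlated with the rest-of-test total (the item itself excluded)."""
    total = scores.sum(axis=1)
    return np.array([
        np.corrcoef(scores[:, i], total - scores[:, i])[0, 1]
        for i in range(scores.shape[1])
    ])

# Synthetic score matrix: 8 students x 4 items, partial credit in [0, 1].
rng = np.random.default_rng(7)
ability = rng.random(8)
scores = np.clip(ability[:, None] + rng.normal(0, 0.2, (8, 4)), 0, 1)
print(item_discrimination(scores).round(2))
```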
Abstract:
Studies suggest that depression affects glucose metabolism and is therefore a risk factor for insulin resistance. The association between depression and insulin resistance has been investigated in a number of studies, but there is no agreement on the results. The objective of this study is to survey the epidemiological studies, identify the ones that measured the association of depression (as exposure) with insulin resistance (as outcome), and perform a systematic review to assess the reliability and strength of the association. For high-quality reporting and assessment, this systematic review used the procedures, guidelines and recommendations for reviews in health care outlined by the Centre for Reviews and Dissemination, along with recommendations from the STROBE group (Strengthening the Reporting of Observational Studies in Epidemiology). Ovid MEDLINE (1996 to April Week 1, 2010) was used to identify the relevant epidemiological studies. To identify the most relevant set of articles for this systematic review, a set of inclusion and exclusion criteria was applied. Six studies that met the specific criteria were selected. Key information from the identified studies was tabulated, and the methodological quality, internal and external validity, and the strength of the evidence of the selected studies were assessed. The tabulated data of the reviewed studies indicate that the studies either did not apply a case definition for insulin resistance in their investigation or did not state a specific value for the index used to define insulin resistance. The quality assessment of the reviewed studies indicates that, to assess the association between insulin resistance and depression, specifying a case definition for insulin resistance is important. The case definition for insulin resistance is given by the World Health Organization and the European Group for the Study of Insulin Resistance as an insulin sensitivity index in the lowest quartile or lowest decile of a general population, respectively. Three studies defined the percentile cut-off point for insulin resistance but did not give the insulin sensitivity index value; in these cases, it is not possible to compare the results. Three other studies did not define the cut-off point for insulin resistance; in these cases, it is hard to confirm the existence of insulin resistance. In conclusion, to convincingly answer our question, future studies need to adopt a clear case definition, define a percentile cut-off point and reference population, and give the value of the insulin resistance measure at the specified percentile.
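The WHO/EGIR case definition discussed above reduces to a percentile cut-off on an insulin sensitivity index. A minimal sketch, with a synthetic reference population standing in for real data:

```python
import numpy as np

def ir_cutoff(sensitivity_index, percentile=25.0):
    """Cut-off following the WHO / EGIR convention cited in the review:
    insulin resistance = insulin sensitivity index in the lowest quartile
    (25th percentile) or lowest decile (10th) of a reference population."""
    return float(np.percentile(sensitivity_index, percentile))

rng = np.random.default_rng(42)
isi = rng.lognormal(mean=1.2, sigma=0.4, size=5000)  # synthetic reference pop.
cut = ir_cutoff(isi, 25.0)
flagged = isi < cut
print(f"cut-off = {cut:.2f}; flagged {flagged.mean():.1%} of the sample")
```

Reporting this cut-off value alongside the percentile, as the review recommends, is what makes results comparable across studies.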
Abstract:
Birth defects are the leading cause of infant mortality in the United States and are a major cause of lifetime disability. However, efforts to understand their causes have been hampered by a lack of population-specific data. During 1990–2004, 22 state legislatures responded to this need by proposing birth defects surveillance legislation (BDSL). The contrast between these states and those that did not pass BDSL provides an opportunity to better understand conditions associated with US public health policy diffusion. This study identifies key state-specific determinants that predict: (1) the introduction of birth defects surveillance legislation (BDSL) onto states’ formal legislative agenda, and (2) the successful adoption of these laws. Secondary aims were to interpret these findings in a theoretically sound framework and to incorporate evidence from three analytical approaches. The study begins with a comparative case study of Texas and Oregon (states with divergent BDSL outcomes), including a review of historical documentation and content analysis of key informant interviews. After selecting and operationalizing explanatory variables suggested by the case study, Qualitative Comparative Analysis (QCA) was applied to publicly available data to describe important patterns of variation among 37 states. Results from logistic regression were compared to determine whether the two methods produced consistent findings. Themes emerging from the comparative case study included differing budgetary conditions and the significance of relationships within policy issue networks. However, the QCA and statistical analysis pointed to the importance of political parties and contrasting societal contexts. Notably, state policies that allow greater access to citizen-driven ballot initiatives were consistently associated with a lower likelihood of introducing BDSL. Methodologically, these results indicate that a case study approach, while important for eliciting valuable context-specific detail, may fail to detect the influence of overarching, systemic variables, such as party competition. However, the QCA and statistical analyses were limited by a lack of existing data to operationalize policy issue networks, and thus may have downplayed the impact of personal interactions. This study contributes to the field of health policy studies in three ways. First, it emphasizes the importance of collegial and consistent relationships among policy issue network members. Second, it calls attention to political party systems in predicting policy outcomes. Finally, it demonstrates a novel approach (QCA) to interpreting state data in a theoretically significant manner.
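The QCA step can be made concrete with a toy truth table. Case labels, condition names and every value below are hypothetical placeholders; the sketch only shows how crisp-set QCA groups cases by configuration and scores each configuration's outcome consistency:

```python
from collections import defaultdict

CONDITIONS = ("ballot_access", "party_competition", "fiscal_stress")

# Hypothetical cases: configuration of binary conditions -> BDSL adopted?
cases = {
    "case01": ((1, 0, 1), 0),
    "case02": ((1, 0, 1), 0),
    "case03": ((0, 1, 0), 1),
    "case04": ((0, 1, 1), 1),
    "case05": ((0, 1, 0), 1),
}

table = defaultdict(list)
for config, outcome in cases.values():
    table[config].append(outcome)

for config, outcomes in sorted(table.items()):
    consistency = sum(outcomes) / len(outcomes)
    row = ", ".join(f"{n}={v}" for n, v in zip(CONDITIONS, config))
    print(f"{row}  ->  consistency {consistency:.2f} (n={len(outcomes)})")
```

The direction built into this toy table (ballot_access = 1 rows never adopt) mirrors the study's finding that citizen-initiative states were less likely to introduce BDSL.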
Abstract:
The purpose of this dissertation was to develop a conceptual framework that can be used to account for policy decisions made by the House Ways and Means Committee (HW&MC) of the Texas House of Representatives. The analysis examines the actions of the committee over a ten-year period with the goal of explaining and predicting the success or failure of certain efforts to raise revenue. The basic framework for modelling the revenue decision-making process includes three major components: the decision alternatives, the external factors and two competing contingency theories. The decision alternatives encompass the particular options available to increase tax revenue. The options were classified as non-innovative or innovative. The non-innovative options included the sales, franchise, property and severance taxes. The innovative options were principally the personal and corporate income taxes. The external factors included political and economic constraints that affected the actions of the HW&MC. Several key political constraints on committee decision-making were addressed, including public attitudes, interest groups, political party strength, and tradition and precedents. The economic constraints that affected revenue decisions included court mandates, federal mandates and the fiscal condition of the nation and the state. The third component of the revenue decision-making framework comprised two alternative contingency theories. The first theory postulated that the committee structure, including the individual member roles and the overall committee style, resulted in distinctive revenue decisions. This theory is favored if evidence points to the committee acting autonomously, with less concern for the policies of the Speaker of the House. The second, the Speaker assignment theory, postulated that the assignment of committee members shaped or changed the course of committee decision-making. This theory is favored if there is evidence that the committee was strictly a vehicle for the Speaker to institute his preferred tax policies. The ultimate goal of this analysis is to develop an explanation for legislative decision-making about tax policy, based on the linkages across various tax options, political and economic constraints, member roles and committee style, and the patterns of committee assignment.