821 results for Strategic usage of IS


Relevance:

100.00%

Publisher:

Abstract:

Doctoral program: Public Health (Epidemiology, Planning and Nutrition)


For their survival, humans and animals can rely on motivational systems which are specialized in assessing the valence and imminence of dangers and appetitive cues. The Orienting Response (OR) is a fundamental response pattern that an organism executes whenever a novel or significant stimulus is detected, and has been shown to be consistently modulated by the affective value of a stimulus. However, detecting threatening stimuli and appetitive affordances while they are far away compared to when they are within reach constitutes an obvious evolutionary advantage. Building on the linear relationship between stimulus distance and retinal size, the present research was aimed at investigating the extent to which emotional modulation of distinct processes (action preparation, attentional capture, and subjective emotional state) is affected when reducing the retinal size of a picture. Studies 1-3 examined the effects of picture size on emotional response. Subjective feeling of engagement, as well as sympathetic activation, were modulated by picture size, suggesting that action preparation and subjective experience reflect the combined effects of detecting an arousing stimulus and assessing its imminence. On the other hand, physiological responses which are thought to reflect the amount of attentional resources invested in stimulus processing did not vary with picture size. Studies 4-6 were conducted to substantiate and extend the results of studies 1-3. In particular, it was noted that a decrease in picture size is associated with a loss in the low spatial frequencies of a picture, which might confound the interpretation of the results of studies 1-3. Therefore, emotional and neutral images which were either low-pass filtered or reduced in size were presented, and affective responses were measured. Most effects which were observed when manipulating image size were replicated by blurring pictures. 
However, pictures depicting highly arousing unpleasant contents were associated with a more pronounced decrease in affective modulation when pictures were reduced in size compared to when they were blurred. The present results provide important information for the study of processes involved in picture perception and in the genesis and expression of an emotional response. In particular, the availability of high spatial frequencies might affect the degree of activation of an internal representation of an affectively charged scene, and might modulate subjective emotional state and preparation for action. Moreover, the manipulation of stimulus imminence revealed important effects of stimulus engagement on specific components of the emotional response, and the implications of the present data for some models of emotions are discussed. In particular, within the framework of a staged model of emotional response, the tactical and strategic role of response preparation and attention allocation to stimuli varying in engaging power is discussed, considering the adaptive advantages that each might represent in an evolutionary view. Finally, the identification of perceptual parameters that allow affective processing to be carried out has important methodological applications in future studies examining emotional response in basic research or clinical contexts.
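Since retinal size shrinks as viewing distance grows, reducing a picture's size at a fixed distance serves as a proxy for increasing its distance. The geometric relationship can be sketched as follows (a generic illustration with made-up stimulus sizes, not code or values from the studies):

```python
import math

def visual_angle_deg(stimulus_width_m: float, viewing_distance_m: float) -> float:
    """Visual angle subtended by a stimulus, in degrees."""
    return math.degrees(2 * math.atan(stimulus_width_m / (2 * viewing_distance_m)))

# Halving picture size at a fixed distance yields the same retinal size
# as doubling the viewing distance.
full = visual_angle_deg(0.40, 1.0)   # 40 cm wide picture at 1 m
half = visual_angle_deg(0.20, 1.0)   # half-size picture, same distance
far  = visual_angle_deg(0.40, 2.0)   # full-size picture, twice the distance
```

This equivalence is what makes picture-size manipulation usable as a stand-in for stimulus imminence.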


The term "Brain Imaging" identifies a set of techniques for analyzing the structure and/or functional behavior of the brain in normal and/or pathological conditions. These techniques are widely used in the study of brain activity. In addition to clinical usage, the analysis of brain activity is gaining popularity in other recent fields, e.g. Brain-Computer Interfaces (BCI) and the study of cognitive processes. In this context, classical solutions (e.g. fMRI, PET-CT) can be unfeasible due to their low temporal resolution, high cost and limited portability. For these reasons, alternative low-cost techniques are the object of research, typically based on simple recording hardware and on an intensive data-elaboration process. Typical examples are ElectroEncephaloGraphy (EEG) and Electrical Impedance Tomography (EIT), where the electric potential at the patient's scalp is recorded by high-impedance electrodes. In EEG, potentials are directly generated by neuronal activity, while in EIT they arise from the injection of small currents at the scalp. To retrieve meaningful insights on brain activity from these measurements, EIT and EEG rely on detailed knowledge of the underlying electrical properties of the body, obtained from numerical models of the electric field distribution therein. The inhomogeneous and anisotropic electric properties of human tissues make accurate modeling and simulation very challenging, leading to a trade-off between physical accuracy and technical feasibility which currently severely limits the capabilities of these techniques. Moreover, the elaboration of the recorded data requires computationally intensive regularization techniques, which weighs on applications with strict temporal constraints (such as BCI). This work focuses on the parallel implementation of a work-flow for EEG and EIT data processing.
The resulting software is GPU-accelerated, in order to provide solutions in reasonable times and address the requirements of real-time BCI systems, without over-simplifying the complexity and accuracy of the head models.
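The regularized inversion at the heart of such a pipeline can be sketched as a Tikhonov-regularized least-squares solve. This is a generic CPU illustration with made-up dimensions and a random stand-in for the lead-field matrix, not the thesis's GPU implementation (where the forward matrix comes from a detailed numerical head model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward (lead-field) matrix: 32 scalp electrodes, 500 sources.
A = rng.standard_normal((32, 500))
x_true = np.zeros(500)
x_true[100] = 1.0                                  # a single active source
b = A @ x_true + 0.01 * rng.standard_normal(32)    # noisy scalp measurements

# Tikhonov (ridge) regularization: x = argmin ||Ax - b||^2 + lam ||x||^2,
# solved via the normal equations; lam trades data fit against solution norm.
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(500), A.T @ b)
```

The normal-equations solve shown here is O(n^3) in the number of sources, which is precisely the kind of cost that motivates the parallel GPU implementation for real-time use.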


The Neolithic is characterized by the transition from a subsistence economy, based on hunting and gathering, to one based on food production. This important change was paralleled by one of the most significant demographic increases in the recent history of European populations. The earliest Neolithic sites in Europe are located in Greece. However, the debate regarding the colonization route followed by the Middle Eastern farmers is still open. Based on archaeological, archaeobotanical, craniometric and genetic data, two main hypotheses have been proposed. The first implies the maritime colonization of north-eastern Peloponnesus from Crete, whereas the second points to an island-hopping route that finally brought migrants to Central Greece. To test these hypotheses using a genetic approach, 206 samples were collected from the two Greek regions proposed as the arrival points of the two routes (the Korinthian district and Euboea). Expectations for each hypothesis were compared with empirical observations based on the analysis of 60 SNPs and 26 microsatellite loci of the Y chromosome and of mitochondrial DNA hypervariable region I. The analysis of Y-chromosome haplogroups revealed a strong genetic affinity of Euboea with Anatolian and Middle Eastern populations. Inferences of the time since population expansion suggest an earlier adoption of agriculture in Euboea. Moreover, the haplogroup J2a-M410, supposed to be associated with the Neolithic transition, was observed at higher frequency and variance in Euboea, showing, for both parameters, a decreasing gradient moving away from this area. The time since expansion estimated for J2a-M410 was found to be compatible with the Neolithic and slightly older in Euboea. The analysis of mtDNA was less informative. However, a higher genetic affinity of Euboea with Anatolian and Middle Eastern populations was confirmed.
These results, taken as a whole, suggest that the most probable route followed by Neolithic farmers during the colonization of Greece was the island-hopping route.
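The frequency-and-variance comparison underlying the J2a-M410 argument can be illustrated with toy data. Everything below is hypothetical (a single Y-STR locus, invented repeat counts); the real analysis used 60 SNPs and 26 microsatellite loci:

```python
from collections import Counter
from statistics import pvariance

# Hypothetical (haplogroup, STR repeat count) pairs per region.
samples = {
    "Euboea":    [("J2a-M410", 16), ("J2a-M410", 18), ("E1b", 14), ("J2a-M410", 17)],
    "Korinthia": [("J2a-M410", 16), ("R1b", 13), ("E1b", 14), ("I2a", 15)],
}

def summarize(data, haplogroup="J2a-M410"):
    """Frequency of a haplogroup and STR variance within it."""
    haplos = Counter(h for h, _ in data)
    freq = haplos[haplogroup] / len(data)
    repeats = [r for h, r in data if h == haplogroup]
    var = pvariance(repeats) if len(repeats) > 1 else 0.0
    return freq, var

euboea = summarize(samples["Euboea"])        # higher frequency and variance
korinthia = summarize(samples["Korinthia"])
```

Higher within-haplogroup STR variance is conventionally read as greater lineage age, which is why a joint frequency/variance gradient points toward the area of earliest expansion.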


This thesis is concerned with the role played by software tools in the analysis and dissemination of linguistic corpora and their contribution to a more widespread adoption of corpora in different fields. Chapter 1 contains an overview of some of the most relevant corpus analysis tools available today, presenting their most interesting features and some of their drawbacks. Chapter 2 begins with an explanation of the reasons why none of the available tools appear to satisfy the requirements of the user community and then continues with a technical overview of the current status of the new system developed as part of this work. This presentation is followed by highlights of features that make the system appealing to users and corpus builders (i.e. scholars willing to make their corpora available to the public). The chapter concludes with an indication of future directions for the project and information on the current availability of the software. Chapter 3 describes the design of an experiment devised to evaluate the usability of the new system in comparison to another corpus tool. Usage of the tool was tested in the context of a documentation task performed on a real assignment during a translation class in a master's degree course. In Chapter 4 the findings of the experiment are presented on two levels of analysis: first, a discussion of how participants interacted with and evaluated the two corpus tools in terms of interface and interaction design, usability and perceived ease of use; then an analysis of how users interacted with corpora to complete the task and what kind of queries they submitted. Finally, some general conclusions are drawn and areas for future work are outlined.
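The basic query type that such corpus tools offer is the keyword-in-context (KWIC) concordance. A minimal generic sketch (not the system developed in the thesis):

```python
def kwic(tokens, keyword, context=3):
    """Return (left context, keyword, right context) for each hit."""
    hits = []
    for i, tok in enumerate(tokens):
        if tok.lower() == keyword.lower():
            left = " ".join(tokens[max(0, i - context):i])
            right = " ".join(tokens[i + 1:i + 1 + context])
            hits.append((left, tok, right))
    return hits

text = "the corpus tool lets users query the corpus for patterns".split()
lines = kwic(text, "corpus")
```

Real tools layer indexing, regular-expression queries and part-of-speech constraints on top of this pattern, but the aligned-context display is the common core.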


Due to the high price of crude oil and the harmful effects of its usage, such as increased greenhouse-gas emissions, industry has focused on the search for sustainable raw materials for the production of chemicals. Ethanol, produced by the fermentation of sugars, is one of the most interesting renewable feedstocks for chemical manufacturing, with numerous applications for its conversion into commodity chemicals. In particular, the production of 1,3-butadiene from ethanol using multifunctional catalysts is attractive: with 25% of world rubber manufacturers utilizing 1,3-butadiene, there is a pressing need for its sustainable production. In this research, the one-step conversion of ethanol to 1,3-butadiene was studied. According to the literature, the mechanisms proposed to explain how ethanol transforms into butadiene require both acidic and basic sites, but there is still much debate on this topic. Thus, the aim of this work is a better understanding of the reaction pathways, with all the possible intermediates and products, that lead to the formation of butadiene from ethanol. Of particular interest are catalysts based on different Mg/Si ratios, compared to bare magnesia and silica oxides, in order to identify a good combination of acidic/basic sites for the adsorption and conversion of ethanol. Spectroscopic techniques are important to extract information helpful for understanding the processes at the molecular level. Diffuse reflectance infrared spectroscopy coupled to mass spectrometry (DRIFT-MS) was used to study the surface composition of the catalysts during the adsorption of ethanol and its transformation during the temperature program, while mass spectrometry was used to monitor the desorbed products.
The set of studied materials includes MgO, Mg/Si=0.1, Mg/Si=2, Mg/Si=3, Mg/Si=9 and SiO2, which were also characterized by means of surface-area measurements.


The proton-nucleus elastic scattering at intermediate energies is a well-established method for the investigation of the nuclear matter distribution in stable nuclei and was recently applied also to radioactive nuclei using the method of inverse kinematics. In the current experiment, the differential cross sections for proton elastic scattering on the isotopes $^{7,9,10,11,12,14}$Be and $^8$B were measured. The experiment was performed using the fragment separator at GSI, Darmstadt, to produce the radioactive beams. The main part of the experimental setup was the time projection ionization chamber IKAR, which served simultaneously as hydrogen target and as a detector for the recoil protons. Auxiliary detectors for projectile tracking and isotope identification were also installed. As results from the experiment, the absolute differential cross sections d$\sigma$/d$t$ as a function of the four-momentum transfer $t$ were obtained. In this work the differential cross sections for elastic p-$^{12}$Be, p-$^{14}$Be and p-$^{8}$B scattering at low $t$ ($t \leq 0.05$~(GeV/c)$^2$) are presented. The measured cross sections were analyzed within the Glauber multiple-scattering theory using different density parameterizations, and the nuclear matter density distributions and radii of the investigated isotopes were determined. The analysis of the differential cross section for the isotope $^{14}$Be shows that a good description of the experimental data is obtained when density distributions consisting of separate core and halo components are used. The determined {\it rms} matter radius is $3.11 \pm 0.04 \pm 0.13$~fm. In the case of the $^{12}$Be nucleus the results showed an extended matter distribution as well. For this nucleus a matter radius of $2.82 \pm 0.03 \pm 0.12$~fm was determined. An interesting result is that the free $^{12}$Be nucleus behaves differently from the core of $^{14}$Be and is much more extended.
The data were also compared with theoretical densities calculated within the FMD and few-body models. In the case of $^{14}$Be, the calculated cross sections describe the experimental data well, while in the case of $^{12}$Be there are discrepancies in the region of high momentum transfer. Preliminary experimental results for the isotope $^8$B are also presented. An extended matter distribution was obtained (though much more compact than the neutron halos). A proton halo structure was observed for the first time with the proton elastic scattering method. The deduced matter radius is $2.60 \pm 0.02 \pm 0.26$~fm. The data were compared with microscopic calculations in the framework of the FMD model and reasonable agreement was observed. The results obtained in the present analysis are in most cases consistent with previous experimental studies of the same isotopes with different experimental methods (total interaction and reaction cross section measurements, momentum distribution measurements). For future investigation of the structure of exotic nuclei, a universal detector system, EXL, is being developed. It will be installed at the NESR at the future FAIR facility, where higher-intensity beams of radioactive ions are expected. The usage of storage ring techniques provides high-luminosity and low-background experimental conditions. Results from the feasibility studies of the EXL detector setup, performed at the present ESR storage ring, are presented.
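Two small numerical relations used when quoting such results can be sketched directly (a generic illustration, not the thesis's analysis code; the component radii below are purely hypothetical):

```python
import math

def matter_radius(n_core, r_core, n_halo, r_halo):
    """rms matter radius of a core+halo nucleus from its component rms radii:
    <r^2> = (N_c <r_c^2> + N_h <r_h^2>) / (N_c + N_h)."""
    return math.sqrt((n_core * r_core**2 + n_halo * r_halo**2)
                     / (n_core + n_halo))

# Hypothetical component radii (fm) for a 12-nucleon core plus a
# 2-neutron halo, chosen only to illustrate the weighting:
r_m = matter_radius(12, 2.5, 2, 5.5)

# One common way to quote a single uncertainty for a value like the 8B
# radius 2.60 +- 0.02 (stat) +- 0.26 (syst) fm is to combine the two
# contributions in quadrature:
sigma = math.sqrt(0.02**2 + 0.26**2)
```

The quadratic weighting in `matter_radius` is why even two halo nucleons at a large radius noticeably inflate the total rms radius of the system.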


This thesis analyses problems related to the applicability, in business environments, of Process Mining tools and techniques. The first contribution is a presentation of the state of the art of Process Mining and a characterization of companies in terms of their "process awareness". The work continues by identifying circumstances where problems can emerge: data preparation, actual mining, and interpretation of results. Other problems are the configuration of parameters by non-expert users and computational complexity. We concentrate on two possible scenarios: "batch" and "on-line" Process Mining. Concerning batch Process Mining, we first investigated the data preparation problem and proposed a solution for the identification of "case-ids" whenever this field is not explicitly indicated. After that, we concentrated on problems at mining time and propose a generalization of a well-known control-flow discovery algorithm in order to exploit non-instantaneous events. The usage of interval-based recording leads to an important improvement in performance. Later on, we report our work on parameter configuration for non-expert users. We present two approaches to select the "best" parameter configuration: one is completely autonomous; the other requires human interaction to navigate a hierarchy of candidate models. Concerning data interpretation and results evaluation, we propose two metrics: a model-to-model metric and a model-to-log metric. Finally, we present an automatic approach for the extension of a control-flow model with social information, in order to simplify the analysis of these perspectives. The second part of this thesis deals with control-flow discovery algorithms in on-line settings. We propose a formal definition of the problem, and two baseline approaches.
Two actual mining algorithms are then proposed: the first is an adaptation of a frequency-counting algorithm to the control-flow discovery problem; the second constitutes a framework of models which can be used for different kinds of streams (stationary versus evolving).
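The basic relation that control-flow discovery algorithms count can be illustrated by extracting directly-follows pairs from a stream of (case-id, activity) events. This is a generic building block shared by algorithms such as the Alpha and Heuristics miners, not the specific streaming algorithm of the thesis:

```python
from collections import defaultdict

def directly_follows(stream):
    """Count directly-follows relations (a, b): activity b immediately
    follows activity a within the same case."""
    last = {}                     # last activity seen per case
    counts = defaultdict(int)
    for case, activity in stream:
        if case in last:
            counts[(last[case], activity)] += 1
        last[case] = activity
    return dict(counts)

# Two interleaved cases, both following the trace <a, b, c>:
events = [(1, "a"), (2, "a"), (1, "b"), (1, "c"), (2, "b"), (2, "c")]
df = directly_follows(events)
```

Because only one "last activity" per open case is kept, this style of counting is naturally incremental, which is what makes it adaptable to on-line settings where the log arrives as an unbounded stream.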


The present PhD dissertation is dedicated to the general topic of knowledge transfer from academia to industry and the role of various measures, at both the institutional and university levels, in support of the commercialization of university research. The overall contribution of the present dissertation is an in-depth and comprehensive analysis of the main critical issues that currently exist with regard to the commercial exploitation of academic research, while providing evidence on the role of previously underexplored areas (e.g. strategic use of academic patents; female academic patenting) in the general debate on the ways to successful knowledge transfer from academia to industry. The first paper included in the present PhD dissertation develops a taxonomy of the literature, based on a comprehensive review of the existing body of research on government measures in support of knowledge transfer from academia to industry. The results of the review reveal a considerable gap in the analysis of the impact and relative effectiveness of public policy measures, especially with regard to measures aimed at building knowledge and expertise among academic faculty and technology transfer agents. The second paper, presented as part of the dissertation, focuses on the role of interorganizational collaborations and their effect on the likelihood of an academic patent remaining unused, and points to the strategic management of patents by universities. In the third paper I turn to the issue of female participation in patenting and commercialization; in particular, I find evidence of the positive role of the university and its internal support structures in closing the gender gap in academic patenting. The results of the research carried out for the present dissertation provide important implications for policy makers in crafting measures to increase the efficient use of the university knowledge stock.


This thesis concerns artificially intelligent natural language processing systems that are capable of learning the properties of lexical items (properties like verbal valency or inflectional class membership) autonomously while fulfilling the tasks for which they have been deployed in the first place. Many of these tasks require a deep analysis of language input, which can be characterized as a mapping of utterances in a given input C to a set S of linguistically motivated structures with the help of linguistic information encoded in a grammar G and a lexicon L: G + L + C → S (1) The idea that underlies intelligent lexical acquisition systems is to modify this schematic formula in such a way that the system is able to exploit the information encoded in S to create a new, improved version of the lexicon: G + L + S → L' (2) Moreover, the thesis claims that a system can only be considered intelligent if it does not just make maximal use of the learning opportunities in C, but is also able to revise falsely acquired lexical knowledge. So, one of the central elements in this work is the formulation of a set of criteria for intelligent lexical acquisition systems subsumed under one paradigm: the Learn-Alpha design rule. The thesis describes the design and quality of a prototype for such a system, whose acquisition components have been developed from scratch and built on top of one of the state-of-the-art Head-driven Phrase Structure Grammar (HPSG) processing systems. The quality of this prototype is investigated in a series of experiments, in which the system is fed with extracts of a large English corpus. While the idea of using machine-readable language input to automatically acquire lexical knowledge is not new, we are not aware of a system that fulfills Learn-Alpha and is able to deal with large corpora.
To illustrate four major challenges of constructing such a system, it should be mentioned that a) the high number of possible structural descriptions caused by highly underspecified lexical entries demands a parser with a very effective ambiguity-management system, b) the automatic construction of concise lexical entries out of a bulk of observed lexical facts requires a special technique of data alignment, c) the reliability of these entries depends on the system's decision on whether it has seen 'enough' input, and d) general properties of language might render some lexical features indeterminable if the system tries to acquire them with too high a precision. The cornerstone of this dissertation is the motivation and development of a general theory of automatic lexical acquisition that is applicable to every language and independent of any particular theory of grammar or lexicon. This work is divided into five chapters. The introductory chapter first contrasts three different and mutually incompatible approaches to (artificial) lexical acquisition: cue-based queries, head-lexicalized probabilistic context-free grammars and learning by unification. Then the postulation of the Learn-Alpha design rule is presented. The second chapter outlines the theory that underlies Learn-Alpha and exposes all the related notions and concepts required for a proper understanding of artificial lexical acquisition. Chapter 3 develops the prototyped acquisition method, called ANALYZE-LEARN-REDUCE, a framework which implements Learn-Alpha. The fourth chapter presents the design and results of a bootstrapping experiment conducted on this prototype: lexeme detection, learning of verbal valency, categorization into nominal count/mass classes, selection of prepositions and sentential complements, among others. The thesis concludes with a review of the conclusions and motivation for further improvements as well as proposals for future research on the automatic induction of lexical features.
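The acquisition cycle of formulas (1) and (2) can be sketched as a loop that parses the corpus with the current lexicon and then revises the lexicon from the resulting structures. The parser and "structures" below are deliberately simplistic stand-ins (valency learned from whether a verb is followed by an object), not the HPSG system or Learn-Alpha itself:

```python
def acquisition_loop(grammar, lexicon, corpus, parse, revise, rounds=3):
    """Iterate the two schematic formulas:
    S = parse(G, L, C)   (1), then   L' = revise(G, L, S)   (2)."""
    for _ in range(rounds):
        structures = [parse(grammar, lexicon, utterance) for utterance in corpus]
        lexicon = revise(grammar, lexicon, structures)
    return lexicon

def parse(grammar, lexicon, utterance):
    # Toy "structure": record the verb and whether an object followed it.
    words = utterance.split()
    return {"verb": words[1], "has_object": len(words) > 2}

def revise(grammar, lexicon, structures):
    # Acquire valency from evidence; revision may overwrite earlier guesses,
    # which is the retraction ability the Learn-Alpha criteria demand.
    new = dict(lexicon)
    for s in structures:
        new[s["verb"]] = "transitive" if s["has_object"] else "intransitive"
    return new

corpus = ["she sleeps", "she reads books"]
lexicon = acquisition_loop({}, {}, corpus, parse, revise, rounds=1)
```

The key design point mirrored here is that the lexicon is an output of each round as well as an input to the next, so later evidence can correct earlier, falsely acquired entries.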


This dissertation seeks to improve the usage of direct democracy in order to minimize agency cost. It first explains why insights from corporate governance can help to improve constitutional law and then identifies relevant insights from corporate governance that can make direct democracy more efficient. To accomplish this, the dissertation examines a number of questions. What are the key similarities in corporate and constitutional law? Do these similarities create agency problems that are similar enough for a comparative analysis to yield valuable insights? Once the utility of corporate governance insights is established, the dissertation answers two questions. Are initiatives necessary to minimize agency cost if referendums are already provided for? And, must the results of direct democracy be binding in order for agency cost to be minimized?


Data sets describing the state of the earth's atmosphere are of great importance in the atmospheric sciences. Over the last decades, the quality and sheer amount of the available data increased significantly, resulting in a rising demand for new tools capable of handling and analysing these large, multidimensional sets of atmospheric data. The interdisciplinary work presented in this thesis covers the development and the application of practical software tools and efficient algorithms from the field of computer science, with the goal of enabling atmospheric scientists to analyse and to gain new insights from these large data sets. For this purpose, our tools combine novel techniques with well-established methods from different areas such as scientific visualization and data segmentation. In this thesis, three practical tools are presented. Two of these tools are software systems (Insight and IWAL) for different types of processing and interactive visualization of data; the third tool is an efficient algorithm for data segmentation implemented as part of Insight. Insight is a toolkit for the interactive, three-dimensional visualization and processing of large sets of atmospheric data, originally developed as a testing environment for the novel segmentation algorithm. It provides a dynamic system for combining at runtime data from different sources, a variety of different data processing algorithms, and several visualization techniques. Its modular architecture and flexible scripting support led to additional applications of the software, from which two examples are presented: the usage of Insight as a WMS (web map service) server, and the automatic production of a sequence of images for the visualization of cyclone simulations.
The core application of Insight is the provision of the novel segmentation algorithm for the efficient detection and tracking of 3D features in large sets of atmospheric data, as well as for the precise localization of the occurring genesis, lysis, merging and splitting events. Data segmentation usually leads to a significant reduction of the size of the considered data. This enables a practical visualization of the data, statistical analyses of the features and their events, and the manual or automatic detection of interesting situations for subsequent detailed investigation. The concepts of the novel algorithm, its technical realization, and several extensions for avoiding under- and over-segmentation are discussed. As example applications, this thesis covers the setup and the results of the segmentation of upper-tropospheric jet streams and cyclones as full 3D objects. Finally, IWAL is presented, a web application providing easy interactive access to meteorological data visualizations, primarily aimed at students. As a web application, it avoids the need to retrieve all input data sets and to install and handle complex visualization tools on a local machine. The main challenge in the provision of customizable visualizations to large numbers of simultaneous users was to find an acceptable trade-off between the available visualization options and the performance of the application. Besides the implementation details, benchmarks and the results of a user survey are presented.
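The feature-detection step can be illustrated, in two dimensions and without any claim to match Insight's algorithm, as thresholded connected-component labeling: cells above a threshold that touch each other form one feature. A stdlib-only sketch with a made-up "wind speed" grid:

```python
from collections import deque

def label_features(grid, threshold):
    """Label connected regions (4-neighbourhood) of cells above a threshold.
    Returns the number of features and a grid of per-cell labels (0 = none)."""
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] > threshold and labels[i][j] == 0:
                current += 1                      # new feature found
                labels[i][j] = current
                queue = deque([(i, j)])
                while queue:                      # breadth-first flood fill
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] > threshold
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return current, labels

wind = [[0, 9, 9, 0],
        [0, 9, 0, 0],
        [0, 0, 0, 8],
        [0, 0, 8, 8]]
n_features, lab = label_features(wind, threshold=5)   # two separate features
```

Tracking and the genesis/lysis/merging/splitting events then follow from matching the labeled features across successive time steps.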


Recent studies support the notion that statins, widely prescribed cholesterol-lowering agents, may target key elements in the immunological cascade leading to inflammation and tissue damage in the pathogenesis of multiple sclerosis (MS). Compelling experimental and observational clinical studies highlighted the possibility that statins may also exert immunomodulatory synergy with approved MS drugs, resulting in several randomized clinical trials testing statins in combination with interferon-beta (IFN-β). Some data, however, suggest that this particular combination may not be clinically beneficial, and might actually have a negative effect on the disease course in some patients with MS. In this regard, a small North American trial indicated that atorvastatin administered in combination with IFN-β may increase disease activity in relapsing-remitting MS. Although other trials did not confirm this finding, the enthusiasm for studies with statins dwindled. This review aims to provide a comprehensive overview of the completed clinical trials and reports of the interim analyses evaluating the combination of IFN-β and statins in MS. Moreover, we try to address the evident question of whether routine usage of this combination requires caution, since the number of IFN-β-treated MS patients receiving statins for cholesterol lowering is expected to grow.


The morbilliviruses measles virus (MeV) and canine distemper virus (CDV) both rely on two surface glycoproteins, the attachment (H) and fusion (F) proteins, to promote fusion activity for viral cell entry. Growing evidence suggests that morbilliviruses infect multiple cell types by binding to distinct host cell surface receptors. Currently, the only known in vivo receptor used by morbilliviruses is CD150/SLAM, a molecule expressed in certain immune cells. Here we investigated the usage of multiple receptors by the highly virulent and demyelinating CDV strain A75/17. We based our study on the assumption that CDV-H may interact with receptors similar to those for MeV, and we conducted systematic alanine-scanning mutagenesis on CDV-H throughout one side of the beta-propeller documented in MeV-H to contain multiple receptor-binding sites. Functional and biochemical assays performed with SLAM-expressing cells and primary canine epithelial keratinocytes identified 11 residues whose mutation selectively abrogated fusion in keratinocytes. Among these, four were identical to amino acids identified in MeV-H as residues contacting a putative receptor expressed in polarized epithelial cells. Strikingly, when mapped on a CDV-H structural model, all residues clustered in or around a recessed groove located on one side of CDV-H. In contrast, reported CDV-H mutants with SLAM-dependent fusion deficiencies were characterized by additional impairments to the promotion of fusion in keratinocytes. Furthermore, upon transfer of residues that selectively impaired fusion induction in keratinocytes into the CDV-H of the vaccine strain, fusion remained largely unaltered. Taken together, our results suggest that a restricted region on one side of CDV-H contains distinct and overlapping sites that control functional interaction with multiple receptors.


In this critical analysis of sociological studies of the political subsystem in Yugoslavia since the fall of communism, Mr. Ilic examined the work of the majority of leading researchers of politics in the country between 1990 and 1996. Where the question of continuity was important, he also looked at previous research by the writers in question. His aim was to demonstrate the overall extent of existing research and at the same time to identify its limits and the social conditions which defined it. Particular areas examined included the problems of defining basic concepts and selecting the theoretically most relevant indicators; the sources of data, including the types of authentic materials exploited; problems of research work (contacts, field control, etc.); problems of analysis; and finally the problems arising from different relations with the people who commission the research. In the first stage of the research, looking at methods of defining key terms, special attention was paid to the analysis of the most frequently used terms such as democracy, totalitarianism, the political left and right, and populism. Numerous weaknesses were noted in the analytic application of these terms. In studies of the possibilities of creating a democratic political system in Serbia and its possible forms (democracy of the majority or consensual democracy), the profound social division of Serbian society was neglected. The left-right distinction tends to be identified with the government-opposition relation, in the manner of practical politics. The idea of populism was used to pass responsibility for the policy of war from the manipulator to the manipulated, while the concept of totalitarianism is used in a rather old-fashioned way, with echoes of the cold war.
In general, the terminology used in the majority of recent research on the political subsystem in Yugoslavia is characterised by a special ideological style and by practical political material, rather than by developed theoretical effort. The second section of the analysis considered the wider theoretical background of the research and focused on studies of the processes of transformation and transition in Yugoslav society, particularly the work of Mladen Lazic and Silvano Bolcic, whom he sees as the most important and influential contemporary Yugoslav sociologists. Here Mr. Ilic showed that the meaning of empirical data is closely connected with the stratification schemes towards which they are oriented, so that the same data can take on different meanings when viewed through different schemes. He went on to place the observed theoretical frames in the context of the wider ideological understanding of the authors' ideas and research. Here the emphasis was on the formalistic character of such notions as command economy and command work, which were used in analysing the functioning and the collapse of communist society, although Mr. Ilic passed favourable judgement on Lazic's critique of political over-determination in its various attempts to explain the disintegration of the communist political (sub)system. The next stage of the analysis was devoted to the problem of empirical identification of the observed phenomena. Here again the notions of the political left and right were of key importance. He sees two specific problems in using these notions in talking about Yugoslavia, the first being that the process of transition in the FR Yugoslavia has hardly begun: the communist government has in effect remained in power continuously since 1945, despite the introduction of a multi-party system in 1990.
The process of privatisation of public property was interrupted at a very early stage, and the results of this are evident on the structural level in the continuous weakening of the social status of the middle class, and on the political level in that the social structure and dominant form of property direct the majority of votes towards the communists in power. This has been combined with strong chauvinist confusion associated with the wars in Croatia and Bosnia, and these ideas were incorporated by all the relevant Yugoslav political parties, making it more difficult to differentiate between them empirically. In this context he cites the case of the stream of political scientists who emerged from the Faculty of Political Science in Belgrade. During the time of the one-party regime, this faculty functioned as ideological support for official communist policy, and its teachers were unable to develop views that differed from the official line, but rather treated all contrasting ideas in the same way, neglecting their differences. Following the introduction of a multi-party system, these authors changed their idea of the public enemy, but still retained an undifferentiated and theoretically undeveloped approach to the identification of political ideas. The fourth section of the work looked at problems of explanation in studying the political subsystem, and the attempt at an adequate causal explanation of the triumph of Slobodan Milosevic's communists in four successive elections was identified as the key methodological problem. The main problem Mr. Ilic isolated here was the neglect of structural factors in explaining the voters' choice. He then went on to look at the way empirical evidence is collected and studied, pointing out many mistakes in planning and determining the samples used in surveys, as well as scientifically incorrect use of results.
He found these weaknesses particularly noticeable in the works of representatives of the so-called nationalistic orientation in the Yugoslav sociology of politics, and he pointed out the practical political abuses that these methodological weaknesses made possible. He also identified similar types of mistakes in research by Serbian political parties conducted on the basis of party documentation and using methods of content analysis. He found various one-sided applications of survey data and looked at attempts to apply other sources of data (statistics, official party documents, various research results). Mr. Ilic concluded that there are two main characteristics of modern Yugoslav sociological studies of political subsystems: there is a considerable number of surveys with ambitious aspirations to explain political phenomena, but at the same time a clear lack of a developed sociological theory of political (sub)systems. He feels that, in the absence of such theory, most researchers are over-ready to accept the theoretical solutions found for the interpretation of political phenomena in other countries. He sees a need for a stronger methodological basis for future research, either 1) in the complementary usage of different sources and ways of collecting data, or 2) in including more of a historical dimension in different attempts to explain the political subsystem in Yugoslavia.