43 results for Extracoronal precision attachments

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Abstract:

We present an analysis of the mass of the X(3872) reconstructed via its decay to J/psi pi+ pi-, using 2.4 fb^-1 of integrated luminosity from ppbar collisions at sqrt(s) = 1.96 TeV collected with the CDF II detector at the Fermilab Tevatron. The possible existence of two nearby mass states is investigated. Within the limits of our experimental resolution the data are consistent with a single state; having no evidence for two states, we set upper limits on the mass difference between two hypothetical states for different assumed ratios of contributions to the observed peak. For equal contributions, the 95% confidence level upper limit on the mass difference is 3.6 MeV/c^2. Under the single-state model the X(3872) mass is measured to be 3871.61 +- 0.16 (stat) +- 0.19 (syst) MeV/c^2, the most precise determination to date.

Relevance: 20.00%

Abstract:

We present a laser-based system to measure the refractive index of air over a long path length. In optical distance measurements it is essential to know the refractive index of air with high accuracy. Commonly, the refractive index of air is calculated from the properties of the ambient air using either the Ciddor or the Edlén equations, where the dominant uncertainty component is in most cases the air temperature. The method developed in this work utilises direct absorption spectroscopy of oxygen to measure the average temperature of air, and of water vapour to measure relative humidity. The method allows temperature and humidity to be measured over the same beam path as the optical distance measurement, providing spatially well-matched data. Indoor and outdoor measurements demonstrate the effectiveness of the method. In particular, we demonstrate effective compensation of the refractive index of air in an interferometric length measurement at a time-variant and spatially non-homogeneous temperature over a long time period. Further, we demonstrated 7 mK RMS noise over a 67 m path length using a 120 s sample time. To our knowledge, this is the best temperature precision reported for a spectroscopic temperature measurement.
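The Edlén-type calculation mentioned above can be sketched briefly. This is a minimal illustration, not the abstract's method: the coefficients follow the commonly cited Birch and Downs revision of the Edlén equation for dry air, and the wavelength and ambient conditions in the example are assumptions.

```python
def edlen_refractivity(wavelength_um, temp_c, pressure_pa):
    """Approximate refractivity (n - 1) of dry air; coefficients follow
    the Birch and Downs revision of the Edlen equation and are quoted
    here for illustration only."""
    sigma2 = (1.0 / wavelength_um) ** 2          # vacuum wavenumber^2, um^-2
    # Dispersion of standard air (15 C, 101325 Pa, dry)
    n_s = (8342.54 + 2406147.0 / (130.0 - sigma2)
           + 15998.0 / (38.9 - sigma2)) * 1e-8
    # Scaling from standard to ambient temperature and pressure
    scale = (pressure_pa
             * (1.0 + 1e-8 * (0.601 - 0.00972 * temp_c) * pressure_pa)
             / (96095.43 * (1.0 + 0.003661 * temp_c)))
    return n_s * scale

# Red HeNe wavelength (0.633 um) at roughly standard conditions
n = 1.0 + edlen_refractivity(0.633, 15.0, 101325.0)
```

The strong temperature dependence of this correction is exactly why a spectroscopic, path-averaged temperature measurement, as described above, is valuable.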

Relevance: 10.00%

Abstract:

The methodology of extracting information from texts has been widely described in the current literature. However, the methodology has been developed mainly for fields other than terminology science. In addition, the research has been English-language oriented. Therefore, there are no satisfactory language-independent methods for extracting terminological information from texts. The aim of the present study is to form the basis for further improvement of methods for the extraction of terminological information. A further aim is to determine differences in term extraction between subject groups with or without knowledge of the special field in question. The study is based on the theory of terminology, and has a mainly qualitative approach. The research material consists of electronically readable specialized texts in the subject domain of maritime safety. Textbooks, conference papers, research reports and articles from professional journals in Finnish and in Russian are included. The thesis first deals with certain term extraction methods. These are manual term identification and semi-automatic term extraction, the latter of which was carried out using three commercial computer programs. The results of term extraction were compared, and the recall and precision of the methods were evaluated. The latter part of the study is dedicated to the identification of concept relations. Certain linguistic expressions, which some researchers call knowledge probes, were applied to identify concept relations. The results of the present thesis suggest that special field knowledge is an advantage in manual term identification. However, in the candidate term lists the variation between subject groups was not as remarkable as that between individual subjects. The term extraction software tested here produces candidate term lists which can be useful, but only after some manual work. Therefore, the work emphasizes the need to further develop term extraction software.
Furthermore, the analyses indicate that there are a certain number of terms which were extracted by all the subjects and the software. We call these core terms. As the result of the experiment on linguistic expressions which signal concept relations, a proposal of Finnish and Russian knowledge probes in the field of maritime safety was made. The main finding was that it would be useful to combine the use of knowledge probes with semi-automatic term extraction, since knowledge probes usually occur in the vicinity of terms.
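The recall and precision evaluation described above can be expressed concisely. This is a generic sketch; the term lists below are invented for illustration and are not the study's data:

```python
def precision_recall(candidates, gold_terms):
    """Precision and recall of a candidate term list against a
    manually validated gold standard."""
    cand, gold = set(candidates), set(gold_terms)
    hits = cand & gold
    precision = len(hits) / len(cand) if cand else 0.0
    recall = len(hits) / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical extractor output versus a hypothetical gold standard
extracted = ["maritime safety", "vessel", "ship collision", "the report"]
gold = ["maritime safety", "ship collision", "oil spill"]
p, r = precision_recall(extracted, gold)
```

Here two of the four candidates are true terms (precision 0.5), and two of the three gold terms are found (recall 2/3), the kind of trade-off the study observes between human subjects and software.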

Relevance: 10.00%

Abstract:

Dozens of Finnish artists, practically all the professional sculptors and painters, travelled to and stayed in Rome during the 19th century. The study at hand concentrates for the first time on the Finnish artists in Rome in corpore, and analyses their way of life based on a broad variety of previously unknown and unexplored sources from a number of archives in both Scandinavia and Rome. The extensive corpus of source material is scrutinized with microhistorical precision from the point of view of cultural history. The new information thus obtained adds to the previous knowledge of Rome's often overlooked importance as a source of inspiration in Scandinavian culture in general, and significantly clarifies our understanding of the development of Finnish artistic life and cultural identity in the 19th century. The study shows that in Finland, as in all of Europe, the stay in Rome was considered a necessary part of becoming a true artist. Already the journey was an integral part of the encounter with Rome, corresponding with the civilized ideal of the period. The stay in Rome provided a northern artist with overwhelming opportunities that were incomparable to the unestablished and modest forms of artistic life Finland could offer. Without domestic artistic institutions or traditions, the professional status of Finnish painters and sculptors took shape abroad, first through the encounter with Rome and then through the different networks the Finnish artists belonged to during and after their stay in the Eternal City. The Finnish artists were an integral part of the international artistic community in the cultural capital of Europe, which gave a totally new impetus to their work and contributed to their cosmopolitan identification. For these early masters of Finnish art, Scandinavian communality and a universal artistic identity seemed to be more significant than their nationality.
In all, the scrutiny of Finnish artists in their wide social, ideological and international framework offers an interesting perspective on the cultural ambiance of the 19th century, in both Rome and Finland. The study highlights many long-forgotten artists who were influential in shaping Finnish art, culture and identity in their time.

Relevance: 10.00%

Abstract:

This thesis utilises an evidence-based approach to critically evaluate and summarize effectiveness research on physiotherapy, physiotherapy-related motor-based interventions and orthotic devices in children and adolescents with cerebral palsy (CP). It aims to assess the methodological challenges of the systematic reviews and trials, to evaluate the effectiveness of interventions in current use, and to make suggestions for future trials.

Methods: Systematic reviews were searched from computerized bibliographic databases up to August 2007 for physiotherapy and physiotherapy-related interventions, and up to May 2003 for orthotic devices. Two reviewers independently identified, selected, and assessed the quality of the reviews using the Overview Quality Assessment Questionnaire complemented with decision rules. From a sample of 14 randomized controlled trials (RCTs) published between January 1990 and June 2003 we analysed the methods of sampling, recruitment, and comparability of groups; defined the components of a complex intervention; identified outcome measures based on the International Classification of Functioning, Disability and Health (ICF); analysed the clinical interpretation of score changes; and analysed trial reporting using a modified 33-item CONSORT (Consolidated Standards of Reporting Trials) checklist. The effectiveness of physiotherapy and physiotherapy-related interventions in children with diagnosed CP was evaluated in a systematic review of randomised controlled trials that were searched from computerized databases from January 1990 up to February 2007. Two reviewers independently assessed the methodological quality, extracted the data, classified the outcomes using the ICF, and considered the level of evidence according to van Tulder et al. (2003).

Results: We identified 21 reviews on physiotherapy and physiotherapy-related interventions and five on orthotic devices. These reviews summarized 23 and 5 randomised controlled trials, and 104 and 27 observational studies, respectively. Only six reviews were of high quality. These found some evidence supporting strength training, constraint-induced movement therapy or hippotherapy, and insufficient evidence on comprehensive interventions. Based on the original studies included in the reviews on orthotic devices, we found some short-term effects of lower limb casting on passive range of movement, and of ankle-foot orthoses on equinus walk. Long-term effects of lower limb orthoses have not been studied. Evidence on upper limb casting or orthoses is conflicting. In the sample of 14 RCTs, most trials used simple randomisation, complemented with matching or stratification, but only three specified the concealed allocation. Numerous studies provided sufficient details on the components of a complex intervention, but the overlap of outcome measures across studies was poor and the clinical interpretation of observed score changes was mostly missing. Almost half (48%) of the applicable CONSORT-based items (range 28 to 32) were reported adequately. Most reporting inadequacies were in outcome measures, sample size determination, details of the sequence generation, allocation concealment and implementation of the randomization, success of assessor blinding, recruitment and follow-up dates, intention-to-treat analysis, precision of the effect size, co-interventions, and adverse events. The systematic review identified 22 trials on eight intervention categories. Four trials were of high quality. Moderate evidence of effectiveness was established for upper extremity treatments on attained goals, active supination and developmental status, and for constraint-induced therapy on the amount and quality of hand use and new emerging behaviours. Moderate evidence of ineffectiveness was found for strength training's effect on walking speed and stride length. Conflicting evidence was found for strength training's effect on gross motor function. For the other intervention categories the evidence was limited due to the low methodological quality and the statistically insignificant results of the studies.

Conclusions: The high-quality reviews provide both supportive and insufficient evidence on some physiotherapy interventions. The poor quality of most reviews calls for caution, although most reviews drew no conclusions on effectiveness due to the poor quality of the primary studies. A considerable number of RCTs of good to fair methodological and reporting quality indicate that informative and well-reported RCTs on complex interventions in children and adolescents with CP are feasible. Nevertheless, methodological improvement is needed in certain areas of trial design and performance, and trial authors are encouraged to follow the CONSORT criteria. Based on RCTs, we established moderate evidence for some effectiveness of upper extremity training. Due to limitations in methodological quality and variations in population, interventions and outcomes, mostly limited evidence on the effectiveness of most physiotherapy interventions is available to guide clinical practice. Well-designed trials are needed, especially for focused physiotherapy interventions.

Relevance: 10.00%

Abstract:

Soils represent a remarkable stock of carbon, and forest soils are estimated to hold half of the global stock of soil carbon. Topical concern about the effects of climate change and forest management on soil carbon, as well as practical reporting requirements set by climate conventions, have created a need to assess soil carbon stock changes reliably and transparently. The large spatial variability of soil carbon, combined with relatively slow changes in stocks, hinders the assessment of soil carbon stocks and their changes by direct measurements. Models therefore widely serve to estimate carbon stocks and stock changes in soils. This dissertation aimed to develop the soil carbon model YASSO for upland forest soils. The model was designed to take into account the most important processes controlling decomposition in soils, yet remain simple enough to ensure its practical applicability in different applications. The model structure and assumptions were presented and the model parameters were defined with empirical measurements. The model was evaluated by studying the sensitivities of the model results to parameter values, by estimating the precision of the results with an uncertainty analysis, and by assessing the accuracy of the model by comparing the predictions against measured data and the results of an alternative model. The model was applied to study the effects of intensified biomass extraction on the forest carbon balance and to estimate the effects of soil carbon deficit on net greenhouse gas emissions of energy use of forest residues. The model was also applied in an inventory-based method to assess the national-scale forest carbon balance for Finland's forests from 1922 to 2004. YASSO described sufficiently well the effects of both variable litter and climatic conditions on decomposition.
When combined with stand models or other systems providing litter information, the dynamic approach of the model proved powerful for estimating changes in soil carbon stocks on different scales. The climate dependency of the model, the effects of nitrogen on decomposition and forest growth, as well as the effects of soil texture on soil carbon stock dynamics, are areas for development when considering the applicability of the model to different research questions, different land use types and wider geographic regions. Intensified biomass extraction affects soil carbon stocks, and these changes in stocks should be taken into account when considering the net effects of forest residue utilisation as energy. On a national scale, soil carbon stocks play an important role in forest carbon balances.
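The dynamic approach can be illustrated with a minimal one-compartment decomposition model. YASSO itself uses several litter-quality compartments with climate-dependent rates, so the single pool, the rate constant, and the litter input below are simplifying assumptions for illustration:

```python
def carbon_stock_series(litter_input, k, c0, years):
    """One-pool, first-order decomposition, dC/dt = litter_input - k*C,
    integrated with annual time steps (illustrative only)."""
    stocks = [c0]
    for _ in range(years):
        c = stocks[-1]
        stocks.append(c + litter_input - k * c)  # input minus decomposition
    return stocks

# With constant litter input the stock approaches the steady state
# C* = litter_input / k; here 2.0 / 0.05 = 40 carbon units.
series = carbon_stock_series(litter_input=2.0, k=0.05, c0=10.0, years=200)
```

The slow approach to steady state in such a model is one reason stock changes are hard to detect by direct measurement, and why a model-based, litter-driven approach is used for national reporting.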

Relevance: 10.00%

Abstract:

Costs of purchasing new piglets and of feeding them until slaughter are the main variable expenditures in pig fattening. They both depend on slaughter intensity, the nature of feeding patterns and the technological constraints of pig fattening, such as genotype. Therefore, it is of interest to examine the effect of production technology and changes in input and output prices on feeding and slaughter decisions. This study examines the problem by using a dynamic programming model that links genetic characteristics of a pig to feeding decisions and the timing of slaughter, and takes into account how these jointly affect the quality-adjusted value of a carcass. The model simulates the growth mechanism of a pig under optional feeding and slaughter patterns and then solves the optimal feeding and slaughter decisions recursively. The state of nature and the genotype of a pig are known in the analysis. The main contribution of this study is the dynamic approach that explicitly takes into account carcass quality while simultaneously optimising feeding and slaughter decisions. The method maximises the internal rate of return to the capacity unit. Hence, the results can have a vital impact on the competitiveness of pig production, which is known to be quite capital-intensive. The results suggest that the producer can significantly benefit from improvements in the pig's genotype, because they improve the efficiency of pig production. The annual benefits from obtaining pigs of improved genotype can be more than €20 per capacity unit. The annual net benefits of animal breeding to pig farms can also be considerable. Animals of improved genotype can reach optimal slaughter maturity quicker and produce leaner meat than animals of poor genotype. In order to fully utilise the benefits of animal breeding, the producer must adjust feeding and slaughter patterns on the basis of genotype. The results suggest that the producer can benefit from flexible feeding technology.
The flexible feeding technology segregates pigs into groups according to their weight, carcass leanness, genotype and sex, and thereafter optimises feeding and slaughter decisions separately for these groups. Typically, such a technology provides incentives to feed piglets with protein-rich feed such that the genetic potential to produce leaner meat is fully utilised. When the pig approaches slaughter maturity, the share of protein-rich feed in the diet gradually decreases and the amount of energy-rich feed increases. Generally, the optimal slaughter weight is within the weight range that pays the highest price per kilogram of pig meat. The optimal feeding pattern and the optimal timing of slaughter depend on price ratios. In particular, an increase in the price of pig meat provides incentives to increase the growth rates up to the pig's biological maximum by increasing the amount of energy in the feed. Price changes and changes in slaughter premium can also have large income effects.
Keywords: barley, carcass composition, dynamic programming, feeding, genotypes, lean, pig fattening, precision agriculture, productivity, slaughter weight, soybeans
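The recursive solution described above can be sketched as a toy backward recursion over the slaughter-timing decision. The growth function, price schedule and feed cost below are hypothetical and far simpler than the thesis's calibrated model; the sketch only shows the structure of the recursion, including a weight window that pays a premium price per kilogram:

```python
from functools import lru_cache

def solve(weeks, price_per_kg, feed_cost, gain):
    """Backward recursion: at each week, slaughter now or feed one
    more week, in a deterministic toy growth model."""
    @lru_cache(maxsize=None)
    def value(t, w):
        slaughter_now = price_per_kg(w) * w      # quality-adjusted revenue
        if t == weeks:                           # forced slaughter at horizon
            return slaughter_now, "slaughter"
        feed_on, _ = value(t + 1, w + gain(w))
        feed_on -= feed_cost                     # one more week of feed
        if slaughter_now >= feed_on:
            return slaughter_now, "slaughter"
        return feed_on, "feed"
    return value

# Toy carcass pricing: a weight window pays a premium per kilogram
price = lambda w: 1.6 if 85 <= w <= 105 else 1.3
gain = lambda w: 6.0 if w < 100 else 2.0         # growth slows near maturity
value = solve(weeks=30, price_per_kg=price, feed_cost=4.0, gain=gain)
best, action = value(0, 30.0)                    # optimal value, first decision
```

In this toy setting the optimal policy feeds the pig until its weight sits inside the premium window and then slaughters, mirroring the finding above that the optimal slaughter weight lies within the best-paying weight range.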

Relevance: 10.00%

Abstract:

Eutrophication of the Baltic Sea is a serious problem. This thesis estimates the benefit to Finns from reduced eutrophication in the Gulf of Finland, the most eutrophied part of the Baltic Sea, by applying the choice experiment method, which belongs to the family of stated preference methods. Because stated preference methods have been subject to criticism, e.g., due to their hypothetical survey context, this thesis contributes to the discussion by studying two anomalies that may lead to biased welfare estimates: respondent uncertainty and preference discontinuity. The former refers to the difficulty of stating one's preferences for an environmental good in a hypothetical context. The latter implies a departure from the continuity assumption of conventional consumer theory, which forms the basis for the method and the analysis. In the three essays of the thesis, discrete choice data are analyzed with the multinomial logit and mixed logit models. On average, Finns are willing to contribute to the water quality improvement. The probability of willingness increases with residential or recreational contact with the gulf, higher than average income, younger than average age, and the absence of dependent children in the household. On average, the relatively most important characteristic of water quality for Finns is water clarity, followed by the desire for fewer occurrences of blue-green algae. For future nutrient reduction scenarios, the annual mean household willingness to pay estimates range from 271 to 448 euros, and the aggregate welfare estimates for Finns range from 28 billion to 54 billion euros, depending on the model and the intensity of the reduction. Of the respondents (N=726), 72.1% state in a follow-up question that they are either "Certain" or "Quite certain" about their answer when choosing the preferred alternative in the experiment.
Based on the analysis of other follow-up questions and another sample (N=307), 10.4% of the respondents are identified as potentially having discontinuous preferences. In relation to both anomalies, respondent- and questionnaire-specific variables are found among the underlying causes, and a departure from standard analysis may improve the model fit and the efficiency of estimates, depending on the chosen modeling approach. The introduction of uncertainty about the future state of the Gulf increases the acceptance of the valuation scenario, which may indicate increased credibility of the proposed scenario. In conclusion, modeling preference heterogeneity is an essential part of the analysis of discrete choice data. The results regarding uncertainty in stating one's preferences and non-standard choice behavior are promising: accounting for these anomalies in the analysis may improve the precision of the estimates of the benefit from reduced eutrophication in the Gulf of Finland.
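The multinomial logit model used above assigns each alternative a choice probability via a softmax over its utility. The sketch below is generic; the attributes, coefficients and alternatives are invented for illustration and are not the thesis's estimates:

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit: P_i = exp(V_i) / sum_j exp(V_j)."""
    m = max(utilities)                        # shift for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical utilities for three water-quality alternatives
# (status quo, moderate improvement, large improvement):
# V = beta_clarity * clarity_gain + beta_cost * annual_cost
beta_clarity, beta_cost = 0.8, -0.01
alternatives = [(0.0, 0.0), (1.0, 50.0), (2.0, 120.0)]
V = [beta_clarity * c + beta_cost * cost for c, cost in alternatives]
probs = mnl_probabilities(V)                  # choice probabilities, sum to 1
```

Mixed logit generalizes this by letting coefficients such as `beta_clarity` vary randomly across respondents, which is how the thesis models preference heterogeneity.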

Relevance: 10.00%

Abstract:

This thesis consists of two parts. In the first part we performed a single-molecule force-extension measurement with 10 kb long DNA molecules from phage λ to validate the calibration and single-molecule capability of our optical tweezers instrument. Fitting the worm-like chain interpolation formula to the data revealed that ca. 71% of the DNA tethers featured a contour length within ±15% of the expected value (3.38 µm). Only 25% of the found DNA had a persistence length between 30 and 60 nm; the correct value should be within 40 to 60 nm. In the second part we designed and built a precise temperature controller to remove the thermal fluctuations that cause drifting of the optical trap. The controller uses feed-forward and PID (proportional-integral-derivative) feedback to achieve 1.58 mK precision and 0.3 K absolute accuracy. During a 5 min test run it reduced drifting of the trap from 1.4 nm/min in open loop to 0.6 nm/min in closed loop.
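The worm-like chain interpolation formula referred to above is commonly written in the Marko-Siggia form. A minimal sketch evaluating it, with the extension, contour length and persistence length chosen as plausible values for a 10 kb dsDNA tether:

```python
def wlc_force(extension, contour_len, persistence_len, kbt=4.11e-21):
    """Marko-Siggia worm-like chain interpolation formula, in newtons:
    F = (kBT/P) * [1 / (4 (1 - x/L)^2) - 1/4 + x/L].
    kbt defaults to the thermal energy at ~298 K, in joules."""
    x = extension / contour_len              # relative extension, 0 <= x < 1
    return (kbt / persistence_len) * (0.25 / (1.0 - x) ** 2 - 0.25 + x)

# A 10 kb dsDNA tether: contour length ~3.38 um, persistence length ~50 nm
f = wlc_force(extension=2.5e-6, contour_len=3.38e-6, persistence_len=50e-9)
```

Fitting this formula to measured force-extension curves, with the contour length and persistence length as free parameters, yields the tether statistics quoted in the abstract.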

Relevance: 10.00%

Abstract:

In technicolor theories the scalar sector of the Standard Model is replaced by a strongly interacting sector. Although the Standard Model has been exceptionally successful, the scalar sector causes theoretical problems that make these theories seem an attractive alternative. I begin my thesis by considering QCD, which is the known example of strong interactions. The theory exhibits two phenomena: confinement and chiral symmetry breaking. I find the low-energy dynamics to be similar to that of the sigma models. Then I analyze the problems of the Standard Model Higgs sector, mainly the unnaturalness and triviality. Motivated by the example of QCD, I introduce the minimal technicolor model to resolve these problems. I demonstrate the minimal model to be free of anomalies and then deduce the main elements of its low-energy particle spectrum. I find the particle spectrum contains massless or very light technipions, and also technibaryons and techni-vector mesons with a high mass of over 1 TeV. Standard Model fermions remain strictly massless at this stage. Thus I introduce the technicolor companion theory of flavor, called extended technicolor. I show that the Standard Model fermions and technihadrons receive masses, but that they remain too light. I also discuss flavor-changing neutral currents and precision electroweak measurements. I then show that walking technicolor models partly solve these problems. In these models, contrary to QCD, the coupling evolves slowly over a large energy scale. This behavior adds to the masses so that even the light technihadrons are too heavy to be detected at current particle accelerators. Also all observed masses of the Standard Model particles can be generated, except for the bottom and top quarks. Thus it is shown in this thesis that, excluding the masses of third generation quarks, theories based on walking technicolor can in principle produce the observed particle spectrum.

Relevance: 10.00%

Abstract:

This thesis contains five experimental spectroscopic studies that probe the vibration-rotation energy level structure of acetylene and some of its isotopologues. The emphasis is on the development of laser spectroscopic methods for high-resolution molecular spectroscopy. Three of the experiments use cavity ringdown spectroscopy. One is a standard setup that employs a non-frequency-stabilised continuous wave laser as a source. In the other two experiments, the same laser is actively frequency stabilised to the ringdown cavity. This development increases the repetition rate of the experimental signal and thus improves the spectroscopic sensitivity of the method. These setups are applied to the recording of several vibration-rotation overtone bands of both H(12)C(12)CH and H(13)C(13)CH. An intra-cavity laser absorption spectroscopy setup that uses a commercial continuous wave ring laser and a Fourier transform interferometer is presented. The configuration of the laser is found to be sub-optimal for high-sensitivity work, but the spectroscopic results are good and show the viability of this type of approach. Several ro-vibrational bands of carbon-13 substituted acetylenes are recorded and analysed. Compared with earlier work, the signal-to-noise ratio of a laser-induced dispersed infrared fluorescence experiment is enhanced by more than one order of magnitude by exploiting the geometric characteristics of the setup. The higher sensitivity of the spectrometer leads to the observation of two new symmetric vibrational states of H(12)C(12)CH. The precision of the spectroscopic parameters of some previously published symmetric states is also improved. An interesting collisional energy transfer process is observed for the excited vibrational states, and this phenomenon is explained by a simple step-down model.
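In cavity ringdown spectroscopy the measured quantity is the decay time of light leaking out of the cavity; intracavity absorption shortens it. A minimal sketch of extracting the decay time from a synthetic, noise-free trace (the trace and fitting approach are illustrative, not the thesis's analysis):

```python
import math

def fit_ringdown_tau(times, signal):
    """Estimate the ringdown time tau of I(t) = I0 * exp(-t / tau)
    by linear least squares on log(I) versus t."""
    logs = [math.log(s) for s in signal]
    n = len(times)
    tbar = sum(times) / n
    lbar = sum(logs) / n
    slope = (sum((t - tbar) * (l - lbar) for t, l in zip(times, logs))
             / sum((t - tbar) ** 2 for t in times))
    return -1.0 / slope                      # tau = -1 / slope of log-decay

# Synthetic, noise-free ringdown trace with tau = 10 microseconds
tau_true = 10e-6
times = [i * 1e-6 for i in range(50)]
signal = [math.exp(-t / tau_true) for t in times]
tau = fit_ringdown_tau(times, signal)
```

The sample absorption coefficient then follows from comparing the fitted decay time against the empty-cavity decay time τ0, via α = (1/c)(1/τ − 1/τ0); a higher repetition rate means more decay traces to average, which is how the active frequency stabilisation improves sensitivity.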

Relevance: 10.00%

Abstract:

Microarrays are high-throughput biological assays that allow the screening of thousands of genes for their expression. The main idea behind microarrays is to compute for each gene a unique signal that is directly proportional to the quantity of mRNA that was hybridized on the chip. The large number of steps, and the errors associated with each step, make the generated expression signal noisy. As a result, microarray data need to be carefully pre-processed before their analysis can be assumed to lead to reliable and biologically relevant conclusions. This thesis focuses on developing methods for improving the gene signal and further utilizing this improved signal for higher-level analysis. To achieve this, first, approaches for designing microarray experiments using various optimality criteria, considering both biological and technical replicates, are described. A carefully designed experiment leads to signal with low noise, as the effect of unwanted variation is minimized and the precision of the estimates of the parameters of interest is maximized. Second, a system for improving the gene signal by using three scans at varying scanner sensitivities is developed. A novel Bayesian latent intensity model is then applied to these three sets of expression values, corresponding to the three scans, to estimate the suitably calibrated true signal of genes. Third, a novel image segmentation approach that segregates the fluorescent signal from the undesired noise is developed using an additional dye, SYBR Green RNA II. This technique helped in identifying signal arising only from the hybridized DNA, while signal corresponding to dust, scratches, spilled dye, and other noise is avoided. Fourth, an integrated statistical model is developed, where signal correction, systematic array effects, dye effects, and differential expression are modelled jointly, as opposed to a sequential application of several methods of analysis.
The methods described here have been tested only for cDNA microarrays, but can also, with some modifications, be applied to other high-throughput technologies.
Keywords: high-throughput technology, microarray, cDNA, multiple scans, Bayesian hierarchical models, image analysis, experimental design, MCMC, WinBUGS.
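The multiple-scan idea can be sketched crudely: each scan has a different effective gain, saturated readings are censored, and the remaining gain-corrected readings are pooled. The thesis's Bayesian latent intensity model does this properly within a full probability model; the code below is only a naive illustration with hypothetical gains and a hypothetical saturation level:

```python
def pool_scans(readings, gains, saturation=65535):
    """Naive multi-scan calibration: drop saturated readings, then
    average the gain-corrected values for a single spot."""
    usable = [r / g for r, g in zip(readings, gains) if r < saturation]
    if not usable:
        return None                  # spot saturated at every sensitivity
    return sum(usable) / len(usable)

# One spot read at three scanner sensitivities (hypothetical gains);
# the most sensitive scan is saturated and is discarded.
readings = [12000.0, 30000.0, 65535.0]
gains = [1.0, 2.5, 6.0]
estimate = pool_scans(readings, gains)
```

A proper latent-intensity model would additionally estimate the gains from the data and propagate the censoring into the posterior, rather than hard-coding them as here.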

Relevance: 10.00%

Abstract:

XML documents are becoming more and more common in various environments. In particular, enterprise-scale document management is commonly centred around XML, and desktop applications as well as online document collections are soon to follow. The growing number of XML documents increases the importance of appropriate indexing methods and search tools in keeping the information accessible. Therefore, we focus on content that is stored in XML format as we develop such indexing methods. Because XML is used for different kinds of content, ranging all the way from records of data fields to narrative full texts, methods for Information Retrieval face a new challenge in identifying which content is subject to data queries and which should be indexed for full-text search. In response to this challenge, we analyse the relation of character content and XML tags in XML documents in order to separate the full text from data. As a result, we are able both to reduce the size of the index by 5-6% and to improve retrieval precision as we select the XML fragments to be indexed. Besides being challenging, XML comes with many unexplored opportunities which have received little attention in the literature. For example, authors often tag the content they want to emphasise by using a typeface that stands out. The tagged content constitutes phrases that are descriptive of the content and useful for full-text search. They are simple to detect in XML documents, but also possible to confuse with other inline-level text. Nonetheless, the search results seem to improve when the detected phrases are given additional weight in the index. Similar improvements are reported when related content is associated with the indexed full text, including titles, captions, and references. Experimental results show that, for certain types of document collections at least, the proposed methods help us find the relevant answers.
Even when we know nothing about the document structure but the XML syntax, we are able to take advantage of the XML structure when the content is indexed for full-text search.
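A crude version of separating full-text from data, in the spirit of analysing the relation of character content and tags described above: measure how much character data an element carries relative to its markup. The heuristic, the threshold and the sample record are all assumptions for illustration, not the thesis's method:

```python
import xml.etree.ElementTree as ET

def looks_like_fulltext(elem, min_chars_per_elem=28):
    """Heuristic: treat an element as full text if it carries many
    characters of text per element of markup (illustrative only)."""
    text = "".join(elem.itertext())
    n_descendants = sum(1 for _ in elem.iter()) - 1
    # Records of data fields -> little text per element; narrative -> a lot
    chars_per_elem = len(text) / (n_descendants + 1)
    return chars_per_elem >= min_chars_per_elem

doc = ET.fromstring(
    "<record><id>42</id><price>9.95</price>"
    "<abstract>A long narrative paragraph that clearly reads as "
    "running full text rather than as a record of data fields, and "
    "should therefore be indexed for full-text search.</abstract></record>"
)
data_like = looks_like_fulltext(doc.find("id"))        # short data field
text_like = looks_like_fulltext(doc.find("abstract"))  # narrative content
```

Elements classified as full text would then go to the full-text index, while the short data fields would be routed to a data index for structured queries.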

Relevance: 10.00%

Abstract:

Although it has been widely assumed in public that the Soviet music policy system had a "top-down" structure of control and command that directly affected musical creativity, my research in fact shows that the relations between the different levels of the music policy system were vague, and that the viewpoints of its representatives differed from each other. Because the representatives of the party and government organs controlling operas could not define which kind of music represented Socialist Realism, the system as it developed during the 1930s and 1940s did not function effectively enough to create such centralised control of Soviet music; still less could Soviet operas fulfil the highly ambiguous aesthetics of Socialist Realism. I show that musical discussions developed into bureaucratic, ritualistic arenas, where it became more important to reveal the heretical composers, to make scapegoats of them, and to require them to perform self-criticism, than to give directions on how to reach the artistic goals of Socialist Realism. When one opera was found to be unacceptable, this led to a strengthening of control by the party leadership, which in turn led to more operas, one after another, being revealed as failures. I have studied the control of the composition, staging and reception of the opera case studies, which remain obscure in the West despite a growing scholarly interest in them, and have created a detailed picture of the foundation and development of the Soviet music control system in 1932-1950. My detailed discussion of such case studies as Ivan Dzerzhinskii's The Quiet Don, Dmitrii Shostakovich's Lady Macbeth of Mtsensk District, Vano Muradeli's The Great Friendship, Sergei Prokofiev's Story of a Real Man, Tikhon Khrennikov's Frol Skobeev and Evgenii Zhukovskii's From All One's Heart backs with documentary precision the historically revisionist model of the development of Soviet music.
In February 1948, composers belonging to the elite of the Union of Soviet Composers, e.g. Dmitri Shostakovich and Sergei Prokofiev, were accused in a Central Committee Resolution of formalism, that is, of being under the influence of Western modernism. Accusations of formalism were connected to criticism of the considerable financial, material and social privileges these composers enjoyed in the leadership of the Union. With my new archival findings I give a more detailed picture of the financial background of the 1948 campaign. The independent position of the music funding organization of the Union of Soviet Composers (Muzfond) in deciding on its finances was an exceptional phenomenon in the Soviet Union and contradicted the strivings to strengthen control of Soviet music. The financial audits of the Union of Soviet Composers did not, however, change the elite status of some of its composers, except perhaps briefly in some cases. At the same time, the significant financial independence of Soviet theatres was restricted. The cuts in the governmental funding allocated to Soviet theatres contradicted the intensified ideological demands for Soviet operas.