895 results for Spaces of Generalized Functions


Relevance:

100.00%

Publisher:

Abstract:

This study is about the stability of random sums and extremes. The difficulty in finding exact sampling distributions has resulted in considerable problems in computing probabilities concerning sums that involve a large number of terms. Functions of the sample observations that are of natural interest, other than the sum, are the extremes, that is, the minimum and the maximum of the observations. Extreme value distributions also arise in problems like the study of the size effect on material strengths, the reliability of parallel and series systems made up of a large number of components, record values, and the assessment of levels of air pollution. It may be noticed that the theories of sums and extremes are mutually connected. For instance, in the search for asymptotic normality of sums, it is assumed that at least the variance of the population is finite. In such cases the contribution of the extremes to the sum of independent and identically distributed (i.i.d.) random variables is negligible.
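
To make the limiting behaviour of extremes concrete, the following is a minimal numerical sketch, not taken from the thesis: it checks that the centred maximum of i.i.d. exponential samples is approximately Gumbel (Type 1 extreme value) distributed. The sample sizes and the use of SciPy are assumptions for illustration only.

```python
# Illustrative sketch (not from the thesis): the centred maximum of i.i.d.
# exponential samples converges to a Gumbel (Type 1 extreme value) law.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps = 1000, 5000
samples = rng.exponential(scale=1.0, size=(reps, n))
maxima = samples.max(axis=1) - np.log(n)   # classical centring for the exponential case

# Compare the empirical distribution of the centred maxima with the Gumbel CDF.
ks = stats.kstest(maxima, stats.gumbel_r.cdf)
print(ks.statistic, ks.pvalue)
```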

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, the concept of the reversed lack of memory property and its generalizations is studied. We generalize this property to operations other than addition; in particular, an associative binary operator "*" is considered. The univariate reversed lack of memory property is generalized using this binary operator, and a class of probability distributions which includes the Type 3 extreme value, power function, reflected Weibull and negative Pareto distributions is characterized (Asha and Rejeesh (2009)). We also define the almost reversed lack of memory property and consider distributions with a reversed periodic hazard rate under the binary operation. Further, we give a bivariate extension of the generalized reversed lack of memory property and characterize a class of bivariate distributions which includes the characterized extension (CE) model of Roy (2002a) apart from the bivariate reflected Weibull and power function distributions. We prove the equivalence of local proportionality of the reversed hazard rate and the generalized reversed lack of memory property. The study of uncertainty is a subject of interest common to reliability, survival analysis, actuarial science, economics, business and many other fields. However, in many realistic situations, uncertainty is not necessarily related to the future but can also refer to the past. Recently, Di Crescenzo and Longobardi (2009) introduced a new measure of information called dynamic cumulative entropy. Dynamic cumulative entropy is suitable for measuring information when uncertainty is related to the past; it is a dual concept of the cumulative residual entropy, which relates to uncertainty about the future lifetime of a system. We redefine this measure on the whole real line and study its properties. We also discuss the implications of the generalized reversed lack of memory property on dynamic cumulative entropy and past entropy. In this study, we also extend the idea of the reversed lack of memory property to the discrete setup. Here we investigate the discrete class of distributions characterized by the discrete reversed lack of memory property. The concept is extended to the bivariate case, and bivariate distributions characterized by this property are also presented. The implications of this property on the discrete reversed hazard rate, mean past life and discrete past entropy are also investigated.
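
For orientation, the following is a sketch in standard notation of the relations the abstract refers to; the exact formulations in the thesis may differ. Here F denotes the distribution function of the lifetime X and "*" the associative binary operation mentioned above.

```latex
% Sketch of the defining relations (notation assumed, not quoted from the thesis).
% Generalized reversed lack of memory property under the binary operation *:
\[
  P(X \le x * t \mid X \le t) = P(X \le x)
  \quad\Longleftrightarrow\quad
  F(x * t) = F(x)\,F(t).
\]
% Dynamic cumulative entropy of Di Crescenzo and Longobardi (2009), measuring
% uncertainty about the past lifetime given that failure occurred before time t:
\[
  \mathcal{CE}(X; t) = -\int_{0}^{t} \frac{F(x)}{F(t)}\,
  \log\!\frac{F(x)}{F(t)} \, dx .
\]
```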

Relevance:

100.00%

Publisher:

Abstract:

Sonar signal processing comprises a large number of signal processing algorithms for implementing functions such as target detection, localisation, classification, tracking and parameter estimation. Current implementations of these functions rely on conventional techniques largely based on Fourier methods, which are primarily meant for stationary signals. Interestingly, the signals received by sonar sensors are often non-stationary, and hence processing methods capable of handling the non-stationarity will fare better than Fourier transform based methods. Time-frequency methods (TFMs) are known as one of the best DSP tools for non-stationary signal processing, with which one can analyze signals in the time and frequency domains simultaneously. But, other than the STFT, TFMs have largely been limited to academic research because of the complexity of the algorithms and the limitations of computing power. With the availability of fast processors, many applications of TFMs have been reported in the fields of speech and image processing and in biomedical applications, but not many in sonar processing. A structured effort to fill this lacuna by exploring the potential of TFMs in sonar applications is the net outcome of this thesis. To this end, four TFMs have been explored in detail, viz. the Wavelet Transform, the Fractional Fourier Transform, the Wigner-Ville Distribution and the Ambiguity Function, and their potential in implementing five major sonar functions has been demonstrated with very promising results. What has been conclusively brought out in this thesis is that there is no "one best TFM" for all applications, but there is "one best TFM" for each application. Accordingly, the TFM has to be adapted and tailored in many ways in order to develop specific algorithms for each of the applications.
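
As a minimal illustration of the simplest time-frequency method mentioned above, the sketch below computes an STFT spectrogram of a linear chirp, a typical non-stationary test signal. The sampling rate, sweep range and SciPy usage are assumptions for illustration, not details from the thesis.

```python
# Illustrative sketch (not from the thesis): a short-time Fourier transform of
# a chirp, the simplest time-frequency view of a non-stationary sonar-like signal.
import numpy as np
from scipy import signal

fs = 8000.0                                       # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
x = signal.chirp(t, f0=100.0, t1=1.0, f1=2000.0)  # linear frequency sweep

f, tau, Zxx = signal.stft(x, fs=fs, nperseg=256)
print(Zxx.shape)   # time-frequency matrix: frequency bins x time frames
```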

Relevance:

100.00%

Publisher:

Abstract:

Information and communication technologies (ICT) are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, and thus factors such as the timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and institutions that manage these resources are indeed invaluable. So, Library and Information Centres have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, the library provides services based on different types of documents, such as manuscripts and printed and digital materials. At the same time, the acquisition, access, processing and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in providing assistance in navigating and analyzing tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It was identified that even though information resources are flooding the world over and several technologies have emerged to manage the situation and provide effective services to their clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum level. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the provided services. There are many good examples the world over of the application of ICTs in libraries for the maximization of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study was targeted to look into how effectively modern ICTs have been adopted in these libraries for maximizing the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data were collected from library users, viz. student as well as faculty users, library professionals and university librarians, using structured questionnaires. This has been supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, review of literature, etc. Personal observations of the organisational set-up, management practices, functions, facilities, resources, and the utilization of information resources and facilities by the users of the university libraries in Kerala have been made. Statistical techniques like percentage, mean, weighted mean, standard deviation, correlation, trend analysis, etc. have been used to analyse the data. All the libraries could exploit only a very few possibilities of modern ICTs, and hence they could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, the users as well as the professionals are dissatisfied.
Functional effectiveness in the acquisition, access and processing of information resources in various formats, development and maintenance of OPAC and WebOPAC, digital document delivery to remote users, Web based clearing of library counter services and resources, development of full-text databases, digital libraries and institutional repositories, consortia based operations for e-journals and databases, user education and information literacy, professional development with stress on ICTs, network administration and website maintenance, marketing of information, etc. are the major areas that need special attention to improve the situation. Finance, the knowledge level of ICTs among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, the prevailing educational set-up and social environment in the state, etc. are some of the major hurdles in reaping the maximum possibilities of ICTs in the university libraries in Kerala. The principles of Business Process Re-engineering are found suitable for effectively re-structuring and re-defining the operations and service systems of the libraries. Most of the conventional departments or divisions prevailing in the university libraries were functioning as watertight compartments, and their existing management system was too rigid to adopt the principles of change management. Hence, a thorough re-structuring of the divisions was indicated. Consortia based activities and the pooling and sharing of information resources are advocated to meet the varied needs of the users in the main campuses and off-campuses of the universities, affiliated colleges and remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. has been proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education and the integrated and planned development of school, college, research and public library systems, etc. were also justified for reaping the maximum benefits of modern ICTs.
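
As a small illustration of the kind of descriptive statistics listed above (weighted mean, correlation), here is a minimal sketch on hypothetical questionnaire data; the numbers and variable names are invented for illustration and are not from the study.

```python
# Illustrative sketch (hypothetical data, not drawn from the study):
# weighted mean of Likert-scale ratings and a simple correlation.
import numpy as np

ratings = np.array([5, 4, 3, 2, 1])            # 5-point Likert scale values
counts  = np.array([42, 108, 75, 30, 12])      # respondents per rating (assumed)

weighted_mean = np.average(ratings, weights=counts)
print(round(weighted_mean, 2))

# Correlation between two hypothetical per-library scores.
ict_adoption      = np.array([0.8, 0.6, 0.7, 0.4, 0.5])
user_satisfaction = np.array([0.75, 0.55, 0.65, 0.45, 0.5])
print(np.corrcoef(ict_adoption, user_satisfaction)[0, 1])
```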

Relevance:

100.00%

Publisher:

Abstract:

This thesis "Entitled performance of district industries centres in kerala :An application of augmented solow model.The first chapter deals with evolution of approaches for promoting small scale production and the growth of small scale industries in india.the developing countries face the problems like sluggish growth capital shortages high levels of unemployment,enoromous rural-urban economic disparities regional inequalities increasing concentration of capital and chronic difficulities in the export sector.Review of literature and methodology of the study are presented in the second chapter. In the third chapter an attempt has been made to make an in-depth study of the emergence and growth of district of district industries centres.In the chapter four an attempt was made to study the organisational structure of DICs functions and responsibilities assigned to the functional managers and performance of the functionaries.

Relevance:

100.00%

Publisher:

Abstract:

Reliability analysis is a well established branch of statistics that deals with the statistical study of different aspects of the lifetimes of a system of components. As pointed out earlier, the major part of the theory and applications in connection with reliability analysis has been discussed based on measures defined in terms of the distribution function. In the beginning chapters of the thesis, we have described some attractive features of quantile functions and the relevance of their use in reliability analysis. Motivated by the works of Parzen (1979), Freimer et al. (1988) and Gilchrist (2000), who indicated the scope of quantile functions in reliability analysis, and as a follow-up of the systematic study in this connection by Nair and Sankaran (2009), in the present work we have tried to extend their ideas to develop the necessary theoretical framework for lifetime data analysis. In Chapter 1, we have given the relevance and scope of the study and a brief outline of the work we have carried out. Chapter 2 of this thesis is devoted to the presentation of various concepts and their brief reviews, which are useful for the discussions in the subsequent chapters. In the introduction of Chapter 4, we have pointed out the role of ageing concepts in reliability analysis and in identifying life distributions. In Chapter 6, we have studied the first two L-moments of residual life and their relevance in various applications of reliability analysis. We have shown that the first L-moment of the residual function is equivalent to the vitality function, which has been widely discussed in the literature. In Chapter 7, we have defined the percentile residual life in reversed time (RPRL) and derived its relationship with the reversed hazard rate (RHR). We have discussed the characterization problem of the RPRL and demonstrated with an example that the RPRL at a given level does not determine the distribution uniquely.
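
For reference, these are the standard definitions of the quantities named above, in common notation; the thesis may use slightly different conventions.

```latex
% Standard quantile-based reliability quantities (for orientation only):
% X is a lifetime with distribution function F and density f.
\[
  Q(u) = F^{-1}(u) = \inf\{x : F(x) \ge u\}, \qquad 0 \le u \le 1,
\]
\[
  \text{reversed hazard rate: } \lambda(x) = \frac{f(x)}{F(x)}, \qquad
  \text{vitality function: } V(t) = E\bigl[X \mid X > t\bigr].
\]
```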

Relevance:

100.00%

Publisher:

Abstract:

An attempt is made by the researcher to establish a theory of discrete functions in the complex plane. Classical analysis, q-basic theory, monodiffric theory, preholomorphic theory and q-analytic theory have been utilised to develop concepts like differentiation, integration and special functions.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a new approach to the design of combinational digital circuits with multiplexers using evolutionary techniques. A Genetic Algorithm (GA) is used as the optimization tool. Several circuits are synthesized with this method and compared with two design techniques: the standard implementation of logic functions using multiplexers, and implementation based on Shannon's decomposition technique using a GA. With the proposed method, the complexity of the circuit and the associated delay can be reduced significantly.
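
The decomposition mentioned above rests on Shannon's expansion, f = x·f|x=1 + x'·f|x=0, which is exactly what a 2:1 multiplexer computes with x on the select line. The sketch below is illustrative only, not the paper's synthesis procedure; it verifies the identity for a sample Boolean function.

```python
# Illustrative sketch (not the paper's algorithm): Shannon's decomposition
# realized with a 2:1 multiplexer.
def mux2(sel, d1, d0):
    """2:1 multiplexer: returns d1 when sel is 1, otherwise d0."""
    return (sel & d1) | ((1 - sel) & d0)

def f(a, b, c):
    """Example Boolean function: majority of three inputs."""
    return (a & b) | (b & c) | (a & c)

# Decompose f about variable a: the cofactors f(1,b,c) and f(0,b,c) feed the mux.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert f(a, b, c) == mux2(a, f(1, b, c), f(0, b, c))
print("Shannon expansion about 'a' verified for all input combinations")
```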

Relevance:

100.00%

Publisher:

Abstract:

The Bieberbach conjecture about the coefficients of univalent functions of the unit disk was formulated by Ludwig Bieberbach in 1916 [Bieberbach1916]. The conjecture states that the coefficients of univalent functions are majorized by those of the Koebe function, which maps the unit disk onto a radially slit plane. The Bieberbach conjecture was quite a difficult problem, and it was surprisingly proved by Louis de Branges in 1984 [deBranges1985], when some experts were rather trying to disprove it. It turned out that an inequality of Askey and Gasper [AskeyGasper1976] about certain hypergeometric functions played a crucial role in de Branges' proof. In this article I describe the historical development of the conjecture and the main ideas that led to the proof. The proof of Lenard Weinstein (1991) [Weinstein1991] follows, and it is shown how the two proofs are interrelated. Both proofs depend on polynomial systems that are directly related to the Koebe function. At this point algorithms of computer algebra come into play, and computer demonstrations are given that show how important parts of the proofs can be automated.
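
In standard notation, stated here for orientation and not quoted from the article, the conjecture and its extremal function read as follows.

```latex
% For f analytic and injective (univalent) on the unit disk, normalized as
\[
  f(z) = z + \sum_{n \ge 2} a_n z^n ,
\]
% the Bieberbach conjecture (now de Branges' theorem) asserts
\[
  |a_n| \le n \qquad \text{for all } n \ge 2,
\]
% with equality exactly for rotations of the Koebe function
\[
  K(z) = \frac{z}{(1-z)^2} = \sum_{n \ge 1} n\, z^n ,
\]
% which maps the unit disk onto the complex plane slit along a ray.
```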

Relevance:

100.00%

Publisher:

Abstract:

In his 1984 proof of the Bieberbach and Milin conjectures, de Branges used a positivity result for special functions which follows from an identity about Jacobi polynomial sums that was published by Askey and Gasper in 1976. The de Branges functions T_n^k(t) are defined as the solutions of a system of differential recurrence equations with suitably given initial values. The essential fact used in the proof of the Bieberbach and Milin conjectures is the statement T_n^k(t) <= 0. In 1991 Weinstein presented another proof of the Bieberbach and Milin conjectures, also using a special function system Λ_n^k(t), which (by Todorov and Wilf) was realized to be directly connected with de Branges' functions, T_n^k(t) = -k Λ_n^k(t), so that the positivity results in both proofs, T_n^k(t) <= 0, are essentially the same. In this paper we study differential recurrence equations equivalent to de Branges' original ones and show that many solutions of these differential recurrence equations do not change sign, so that the above inequality is not as surprising as it might appear. Furthermore, we present a multi-parameterized hypergeometric family of solutions of the de Branges differential recurrence equations, showing that such solutions are not rare at all.

Relevance:

100.00%

Publisher:

Abstract:

The dissertation focuses on the protected asset 'landscape' and on 'prediction methods in environmental assessment'. Both topics are already associated with unresolved methodological problems, which become more complex with the implementation of the Directive on Strategic Environmental Assessment (SEA) and whose solution therefore becomes more demanding. This is connected, on the one hand, with the fact that legally required equal treatment of all protected assets is increasingly demanded, and that precisely the asset 'landscape' requires particular methodological attention in an SEA. On the other hand, the current discussion of planning methodology alone does not lead to suitable answers to the above questions, and various methodological building blocks, including from other fields of knowledge, have to be examined in order to develop, for use in the SEA, multiply linked prediction steps that go beyond a one-dimensional understanding of landscape and beyond the linear impact predictions known so far, and in which the asset 'landscape' is represented in a model-like, traceable form for the evaluation steps. Here, decision-relevant prediction horizons must be considered, as well as the secondary, cumulative, synergetic, positive and negative effects of the plan under assessment that may occur within these horizons. In line with these aims and tasks, the theoretical approach of the work proceeds from two sides: 1. The functions and position of predictions within the SEA are explained (Chapter 2), and the questions are pursued of which requirements prediction methods have to meet (Chapter 2.4) and which prediction methods are or can be used in the SEA (Chapter 3). The emphasis is placed on the application of the scenario technique. 2. It is described how landscape has usually been recorded and analysed so far for tasks of landscape planning and environmental assessment, in order to be handled manageably in prediction steps (Chapter 4). Both approaches are then brought together (Chapter 5) in order to work out landscape predictions using the example of a flood protection concept within the framework of an SEA. Methodologically, the prediction starts with the description of the landscape model to be used and the clarification of the purpose of the model. The reference basis is the description of the character of individual, logically derived landscape units or landscape areas, which are typified. The prediction itself distinguishes between the estimation of expected landscape changes in the sense of a 'status quo prediction' (including the development of three scenarios of possible future landscapes up to 2030) and the impact assessments of various measures or planning alternatives, first independently of space and then in spatially concrete terms. In the impact assessments, particular importance is attached to a clear separation of the factual and evaluative levels, to appropriate visualisation, and to the documentation of information gaps and uncertainties in the prediction. Discussed, among other things (Chapter 6), are:
· the formation and delimitation of landscape units and types in relation to the task of defining landscape character and determining it in a way that is manageable and applicable in planning,
· the importance of appropriate visualisation to support participation procedures, and
· the determination of the so-called 'spatial resistance'.
Two maps of the entire study area are attached: Map 1 "Landscape types" and Map 2 "Types of flood protection measures with possible synergy effects for the landscape".

Relevance:

100.00%

Publisher:

Abstract:

q-analysis is a special discretization of analysis on a lattice that forms a geometric sequence; it finds broad application in quantum physics in particular, but is also of great importance in the theory of q-orthogonal polynomials and special functions. The mathematical objects from the q-world under consideration usually have a rather complicated structure, so it is natural to treat them with computer algebra systems. In this dissertation, algorithms for q-holonomic functions and q-hypergeometric series are presented. All algorithms are implemented in the Maple package qFPS, which is an integral part of this work. After the foundations are laid in the first two chapters, the third chapter presents algorithms with which q-holonomic recurrence equations for a q-holonomic function can be set up from knowledge of its q-shifts. Operations on q-holonomic recurrences are also treated. The fourth chapter describes efficient methods for determining polynomial, rational and q-hypergeometric solutions of q-holonomic recurrences. The fifth chapter deals with q-hypergeometric power series with respect to special polynomial bases. We formulate a new algorithm that, given a q-holonomic recurrence equation of a q-hypergeometric series with a nontrivial point of expansion, determines the corresponding q-holonomic recurrence equation for the coefficients. Furthermore, we give a new algorithm that, conversely, determines from a q-holonomic recurrence equation for the coefficients a q-holonomic recurrence equation of the series, and which is useful for setting up q-holonomic recurrences for certain generalized q-hypergeometric functions. With the formulation of the q-Taylor theorem we finally have all the ingredients to obtain the main result of this work, the q-analogue of the FPS algorithm. Wolfram Koepf's FPS algorithm from 1992 determines, for a given holonomic function, the corresponding hypergeometric series. We extend the algorithm so that even linear combinations of q-hypergeometric power series can be determined.
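
For orientation, here are standard definitions of the basic q-objects referred to above, not quoted from the thesis: the Jackson q-derivative on the geometric lattice x, qx, q²x, ... and the defining property of a q-hypergeometric series.

```latex
% Jackson q-derivative (standard definition, for orientation only):
\[
  (D_q f)(x) = \frac{f(qx) - f(x)}{(q-1)\,x}, \qquad q \neq 1,\ x \neq 0 .
\]
% A series \sum_k a_k is called q-hypergeometric when its term ratio is a
% rational function of q^k:
\[
  \frac{a_{k+1}}{a_k} = r\bigl(q^{k}\bigr)
  \quad \text{for some rational function } r .
\]
```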

Relevance:

100.00%

Publisher:

Abstract:

A series of vectors for the over-expression of tagged proteins in Dictyostelium was designed, constructed and tested. These vectors allow the addition of an N- or C-terminal tag (GFP, RFP, 3xFLAG, 3xHA, 6xMYC and TAP) with an optimized polylinker sequence and no additional amino acid residues at the N or C terminus. Different selectable markers (blasticidin and gentamicin) are available, as well as an extrachromosomal version; these allow the copy number and thus the expression level to be controlled, and provide more options with regard to complementation, co-transformation and super-transformation. Finally, the vectors share standardized cloning sites, allowing a gene of interest to be easily transferred between the different versions of the vectors as experimental requirements evolve. The organisation and dynamics of the Dictyostelium nucleus during the cell cycle were investigated. The centromeric histone H3 (CenH3) variant serves to target the kinetochore to the centromeres and thus ensures correct chromosome segregation during mitosis and meiosis. A number of Dictyostelium histone H3 domain-containing proteins were expressed as GFP-tagged fusions, and it was found that one of them functions as CenH3 in this species. Like CenH3 from some other species, Dictyostelium CenH3 has an extended N-terminal domain with no similarity to any other known proteins. The targeting domain, comprising α-helix 2 and loop 1 of the histone fold, is required for targeting CenH3 to centromeres. Compared to the targeting domains of other known and putative CenH3 proteins, Dictyostelium CenH3 has a shorter loop 1 region. The localisation of a variety of histone modifications and histone modifying enzymes was examined. Using fluorescence in situ hybridisation (FISH) and CenH3 chromatin-immunoprecipitation (ChIP), it was shown that the six telocentric centromeres contain all of the DIRS-1 and most of the DDT-A and skipper transposons. During interphase the centromeres remain attached to the centrosome, resulting in a single CenH3 cluster which also contains the putative histone H3K9 methyltransferase SuvA, H3K9me3 and HP1 (heterochromatin protein 1). Except for the centromere cluster and a number of small foci at the nuclear periphery opposite the centromeres, the rest of the nucleus is largely devoid of transposons and heterochromatin-associated histone modifications. At least some of the small foci correspond to the distal telomeres, suggesting that the chromosomes are organised in a Rabl-like manner. It was found that, in contrast to metazoans, loading of CenH3 onto Dictyostelium centromeres occurs in late G2 phase. Transformation of Dictyostelium with vectors carrying the G418 resistance cassette typically results in the vector integrating into the genome in one or a few tandem arrays of approximately a hundred copies. In contrast, plasmids containing a blasticidin resistance cassette integrate as a single copy or a few copies. The behaviour of transgenes in the nucleus was examined by FISH, and it was found that low-copy transgenes show an apparently random distribution within the nucleus, while transgenes with more than approximately 10 copies cluster at or immediately adjacent to the centromeres in interphase cells, regardless of the actual integration site along the chromosome. During mitosis the transgenes show centromere-like behaviour, and ChIP experiments show that transgenes contain the heterochromatin marker H3K9me2 and the centromeric histone variant H3v1.
This clustering and centromere-like behaviour were not observed for extrachromosomal transgenes, nor in a line where the transgene had integrated into the extrachromosomal rDNA palindrome. This suggests that it is the repetitive nature of the transgenes that causes the centromere-like behaviour. A Dictyostelium homolog of DET1, a protein largely restricted to multicellular eukaryotes where it has a role in developmental regulation, was identified. As in other species, Dictyostelium DET1 is nuclear-localised. In ChIP experiments DET1 was found to bind the promoters of a number of developmentally regulated loci. In contrast to other species, where it is an essential protein, loss of DET1 is not lethal in Dictyostelium, although viability is greatly reduced. Loss of DET1 results in delayed and abnormal development with enlarged aggregation territories. Mutant slugs displayed apparent cell-type patterning with a bias towards pre-stalk cell types.

Relevance:

100.00%

Publisher:

Abstract:

The subject of this thesis is the determination of bases of spaces of special harmonic 2-cochains on Bruhat-Tits buildings of PGL(3) over function fields. To this end, the space of special harmonic 2-cochains on the Bruhat-Tits building of PGL(3) is first identified with certain complex linear combinations of 2-simplices of the quotient complex, so-called closed surfaces, and then described by generalized modular symbols. The presentation of the group of modular symbols by generators and relations makes it possible to determine a finite basis of the space of special harmonic 2-cochains. The insights gained in this way can be used to study Hecke operators on special harmonic 2-cochains. Using the derived isomorphism between the space of special harmonic 2-cochains and the space of closed surfaces, the theory of Hecke operators is transferred to the space of closed surfaces. This makes it possible to compute matrices of the Hecke operators on the space of harmonic 2-cochains by evaluating them on the closed surfaces.