15 results for projection

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Abstract:

Diagnostic radiology represents the largest man-made contribution to population radiation doses in Europe. To keep the ratio of diagnostic benefit to radiation risk as high as possible, it is important to understand the quantitative relationship between the patient radiation dose and the various factors which affect it, such as the scan parameters, scan mode, and patient size. Paediatric patients have a higher probability of late radiation effects, since longer life expectancy is combined with the higher radiation sensitivity of developing organs. Experience with particular paediatric examinations may be very limited, and paediatric acquisition protocols may not be optimised. The purpose of this thesis was to enhance and compare different dosimetric protocols, to promote the establishment of paediatric diagnostic reference levels (DRLs), and to provide new data on patient doses for optimisation purposes in computed tomography (with new applications for dental imaging) and in paediatric radiography. Patient dose surveys revealed large variations in radiation exposure in paediatric skull, sinus, chest, pelvic and abdominal radiography examinations. There were variations between different hospitals and examination rooms, between patients of different sizes, and between imaging techniques, emphasising the need for harmonisation of the examination protocols. For computed tomography, a correction coefficient that takes individual patient size into account in patient dosimetry was created. The presented patient size correction method can be used for both adult and paediatric purposes. Dental cone beam CT scanners provided adequate image quality for dentomaxillofacial examinations while delivering considerably smaller effective doses to the patient than multislice CT. However, large dose differences between cone beam CT scanners were not explained by differences in image quality, indicating a lack of optimisation. For paediatric radiography, a graphical method was created for setting the diagnostic reference levels in chest examinations, and the DRLs were given as a function of patient projection thickness. Paediatric DRLs were also given for sinus radiography. The detailed information about the patient data, exposure parameters and procedures provides tools for reducing patient doses in paediatric radiography. The mean tissue doses presented for paediatric radiography enable future risk assessments. The calculated effective doses can be used for comparing different diagnostic procedures, as well as for comparing the use of similar technologies and procedures in different hospitals and countries.
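
The abstract does not reproduce the thesis's actual correction coefficient, so the sketch below only illustrates the general idea of a patient-size correction in CT dosimetry: scaling a scanner-reported CTDIvol by an exponential function of effective patient diameter, in the style of size-specific dose estimates. The function name and the constants `a` and `b` are hypothetical placeholders, not values from the thesis.

```python
import math

def size_corrected_dose(ctdi_vol_mgy: float, eff_diameter_cm: float,
                        a: float = 3.7, b: float = 0.037) -> float:
    """Scale a scanner-reported CTDIvol by a patient-size factor.

    The exponential form f(D) = a * exp(-b * D) mimics published
    size-specific dose-estimate fits; the constants are illustrative,
    not the thesis's actual coefficients.
    """
    return ctdi_vol_mgy * a * math.exp(-b * eff_diameter_cm)

# The same displayed CTDIvol implies a larger absorbed dose in a small patient:
print(size_corrected_dose(5.0, 15.0))  # paediatric-sized, ~15 cm diameter
print(size_corrected_dose(5.0, 30.0))  # adult-sized, ~30 cm diameter
```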

Relevance: 20.00%

Abstract:

This study presents a population projection for Namibia for the years 2011–2020. In many countries of sub-Saharan Africa, including Namibia, population growth continues even though fertility rates have declined. However, many of these countries suffer from a large HIV epidemic that is slowing down population growth. In Namibia, the epidemic has been severe. It is therefore important to assess the future effect of HIV/AIDS on the population of Namibia. Demographic research on Namibia has not been very extensive, and population data are not widely available. The studies that exist show fertility to be generally declining and mortality to have increased significantly due to AIDS. Previous population projections predict population growth for Namibia in the near future, yet HIV/AIDS shapes future population developments. The projection constructed in this study takes population data from the two most recent censuses, from 1991 and 2001. Data on HIV are available from the HIV Sentinel Surveys 1992–2008, which test pregnant women for HIV in antenatal clinics. Additional data were collected from various sources and recent studies. The projection is made with software (EPP and Spectrum) specially designed for developing countries with scarce data. The projection includes two main scenarios with different assumptions about the development of the HIV epidemic. In addition, two hypothetical scenarios are constructed: the first assumes that the HIV epidemic had never existed, and the second that HIV treatment had never existed. The results indicate population growth for Namibia. The population in the 2001 census was 1.83 million and is projected to reach 2.38–2.39 million in 2020 under the two main scenarios. Without HIV, the population would be 2.61 million in 2020; without treatment, 2.30 million. The urban population is growing faster than the rural one. Even though AIDS is increasing mortality, the past high fertility rates still keep the young adult age groups quite large. The HIV epidemic appears to be slowing down, but it is still increasing mortality among the working-age population. The initiation of HIV treatment in the public sector in 2004 seems to have affected many projected indicators, diminishing the impact of HIV on the population; for example, the rise in mortality is slowing down.
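
EPP and Spectrum are purpose-built packages and the abstract gives no parameters, but the cohort-component arithmetic underlying any such projection is simple. Purely as an illustration, the sketch below advances a female population by one five-year step under assumed survival and fertility schedules; every number in it is a placeholder, not a Namibian estimate.

```python
import numpy as np

# Five-year age groups 0-4 ... 80+; all figures are illustrative placeholders
# (thousands of women).
pop = np.array([130, 120, 110, 100, 90, 75, 60, 45, 35, 25,
                18, 12, 8, 5, 3, 2, 1.5], dtype=float)
surv = np.full(len(pop), 0.97)   # probability of surviving the 5-year step
fert = np.zeros(len(pop))
fert[3:9] = [0.10, 0.22, 0.20, 0.15, 0.08, 0.03]  # births/woman per 5 yrs, ages 15-44

def project_step(pop, surv, fert, female_frac=0.488, child_surv=0.95):
    """One cohort-component step: age every cohort, then add surviving newborns."""
    new = np.zeros_like(pop)
    new[1:] = pop[:-1] * surv[:-1]      # survive each cohort into the next group
    new[-1] += pop[-1] * surv[-1]       # open-ended 80+ group retains its survivors
    births = (pop * fert).sum()         # births over the 5-year interval
    new[0] = births * female_frac * child_surv  # female births surviving to 0-4
    return new

pop_next = project_step(pop, surv, fert)
print(round(pop_next.sum(), 1), "thousand women after one 5-year step")
```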

Relevance: 10.00%

Abstract:

In dentistry, basic imaging techniques such as intraoral and panoramic radiography are in most cases all that is required for the detection of pathology. Conventional intraoral radiographs provide images with sufficient information for most dental radiographic needs. Panoramic radiography produces a single image of both jaws, giving an excellent overview of oral hard tissues. Regardless of the technique, plain radiography has only a limited capability in the evaluation of three-dimensional (3D) relationships. Technological advances in radiological imaging have moved from two-dimensional (2D) projection radiography towards digital, 3D and interactive imaging applications. This was achieved first with conventional computed tomography (CT) and more recently with cone beam CT (CBCT). CBCT is a radiographic imaging method that allows accurate 3D imaging of hard tissues. CBCT has been used for dental and maxillofacial imaging for more than ten years, and its availability and use are increasing continuously. At present, however, only best-practice guidelines are available for its use, and the need for evidence-based guidelines on the use of CBCT in dentistry is widely recognized. We evaluated (i) retrospectively the use of CBCT in a dental practice, (ii) the accuracy and reproducibility of pre-implant linear measurements in CBCT and multislice CT (MSCT) in a cadaver study, (iii) prospectively the clinical reliability of CBCT as a preoperative imaging method for complicated impacted lower third molars, and (iv) the tissue and effective radiation doses and image quality of dental CBCT scanners in comparison with MSCT scanners in a phantom study. Using CBCT, subjective identification of anatomy and pathology relevant in dental practice can be readily achieved, but dental restorations may cause disturbing artefacts. CBCT examination offered additional radiographic information compared with intraoral and panoramic radiographs. In terms of the accuracy and reliability of linear measurements in the posterior mandible, CBCT is comparable to MSCT. CBCT is a reliable means of determining the location of the inferior alveolar canal and its relationship to the roots of the lower third molar. CBCT scanners provided adequate image quality for dental and maxillofacial imaging while delivering considerably smaller effective doses to the patient than MSCT. The observed variations in patient dose and image quality emphasize the importance of optimizing the imaging parameters in both CBCT and MSCT.
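
The effective doses compared in the phantom study are, by definition, tissue-weighted sums of equivalent organ doses; in the standard ICRP notation:

```latex
E = \sum_{T} w_T \, H_T , \qquad \sum_{T} w_T = 1 ,
```

where $H_T$ is the equivalent dose to tissue $T$ and $w_T$ the ICRP tissue weighting factor. This weighted sum is what allows doses from CBCT and MSCT protocols to be compared on a single scale.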

Relevance: 10.00%

Abstract:

The United States is the world's largest single market area, where demand for graphic papers increased by 80% during the last three decades. During the last two decades, however, there have been very large, unpredictable changes in the graphic paper markets. For example, the consumption of newsprint began to decline in the late 1980s, which was surprising in light of historical consumption and projections, and it has declined ever since. The aim of this study was to examine how magazine paper consumption will develop in the United States until 2030. The long-term consumption projection was made using two main methods. The first was trend analysis, to see how, and whether, consumption has changed since 1980. The second was qualitative estimation. These estimates are then compared with the so-called classical model projections that are usually cited and used in the forestry literature. The purpose of the qualitative analysis is to study magazine paper end-use purposes and to analyze how, and with what intensity, changes in society will affect magazine paper consumption in the long term. The framework of this study covers theories such as technology adoption, electronic substitution, electronic publishing and Porter's threat of substitution. Because this study deals with markets that have shown signs of structural change, a substantial part of it covers recent developments and the newest available studies and statistics. The following were among the key findings. Different end-uses face very different futures: electronic substitution is very likely in some end-use purposes, but not in all. Young people, i.e. future consumers, have very different habits and technological opportunities than their parents did, and these will substantially affect magazine paper consumption in the long term. The study concludes that the change in magazine paper consumption is more likely to be gradual (evolutionary) than a sudden collapse (revolutionary). It is also probable that the years of fast-growing consumption of magazine papers are over. Beyond the decelerated growth, the consumption of magazine papers will decline slowly in the long term, and the further into the future the projection extends, the faster the decline.
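
The trend-analysis half of the method is easy to make concrete: fit a simple time trend to a consumption series and extrapolate it to 2030. The series below is an invented placeholder, not the study's data.

```python
import numpy as np

# Hypothetical magazine-paper consumption, million tonnes (placeholder data
# with slowing growth, roughly mimicking a maturing market).
years = np.arange(1980, 2006)
cons = 10 + 0.25 * (years - 1980) - 0.004 * (years - 1980) ** 2

# Least-squares quadratic trend, then extrapolation to 2030.
coeffs = np.polyfit(years, cons, deg=2)
trend = np.poly1d(coeffs)
print(f"projected 2030 consumption: {trend(2030):.1f} million tonnes")
```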

Relevance: 10.00%

Abstract:

The topic of this dissertation is the geometric and isometric theory of Banach spaces. The work is motivated by the well-known Banach-Mazur rotation problem, which asks whether every transitive separable Banach space is isometrically a Hilbert space. A Banach space X is said to be transitive if the isometry group of X acts transitively on the unit sphere of X. In fact, some symmetry conditions weaker than transitivity are studied in the dissertation. One such condition is an almost isometric version of transitivity. Another is convex-transitivity, which requires that the closed convex hull of the orbit of any point of the unit sphere under the rotation group is the whole unit ball. Following the tradition developed around the rotation problem, some contemporary problems are studied. Namely, we attempt to characterize Hilbert spaces by using convex-transitivity together with the existence of a 1-dimensional bicontractive projection on the space, and some mild geometric assumptions. The convex-transitivity of some vector-valued function spaces is studied as well. The thesis also touches on the convex-transitivity of Banach lattices and related geometric settings.
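
In the notation standard for this area, with $\mathcal{G}_X$ the group of surjective linear isometries of $X$, $S_X$ the unit sphere and $B_X$ the closed unit ball, the two symmetry conditions defined above read:

```latex
% X is transitive:
\forall\, x, y \in S_X \;\; \exists\, T \in \mathcal{G}_X : \; Tx = y ;
% X is convex-transitive:
\overline{\operatorname{conv}}\,\{\, Tx : T \in \mathcal{G}_X \,\} = B_X
\quad \text{for every } x \in S_X .
```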

Relevance: 10.00%

Abstract:

The concept of an atomic decomposition was introduced by Coifman and Rochberg (1980) for weighted Bergman spaces on the unit disk. By the Riemann mapping theorem, functions in every simply connected domain in the complex plane have an atomic decomposition. However, a decomposition obtained via a conformal mapping of the unit disk tends to be very implicit and often lacks a clear connection to the geometry of the domain into which it has been mapped. The lattice of points at which the atoms of the decomposition are evaluated usually follows the geometry of the original domain, but after mapping the domain into another this connection is easily lost and the layout of points becomes seemingly random. In the first article we construct an atomic decomposition directly on a weighted Bergman space on a class of regulated, simply connected domains. The construction uses the geometric properties of the regulated domain, but does not explicitly involve any conformal Riemann map from the unit disk. It is known that the Bergman projection is not bounded on the space L-infinity of bounded measurable functions. Taskinen (2004) introduced the locally convex space LV-infinity of measurable functions and its closed subspace HV-infinity of analytic functions on the unit disk. They have the property that the Bergman projection is continuous from LV-infinity onto HV-infinity and, in some sense, the space HV-infinity is the smallest possible substitute for the space H-infinity of bounded analytic functions. In the second article we extend this result to a smoothly bounded strictly pseudoconvex domain. Here the related reproducing kernels are usually not known explicitly, so the proof of the continuity of the Bergman projection is based on generalised Forelli-Rudin estimates instead of integral representations. The minimality of the space LV-infinity is shown by using peaking functions first constructed by Bell (1981). Taskinen (2003) showed that on the unit disk the space HV-infinity admits an atomic decomposition. This result is generalised in the third article by constructing an atomic decomposition for the space HV-infinity on a smoothly bounded strictly pseudoconvex domain. In this case every function can be represented as a linear combination of atoms such that the coefficient sequence belongs to a suitable Köthe co-echelon space.
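
In the unit-disk setting of Coifman and Rochberg, an atomic decomposition has the following general shape; the exponents depend on the weight and integrability parameters of the space, so they are left symbolic here:

```latex
f(z) = \sum_{k} c_k \,
       \frac{\bigl( 1 - |a_k|^2 \bigr)^{s}}{\bigl( 1 - \overline{a}_k z \bigr)^{t}} ,
\qquad (c_k) \in \ell^p ,
```

where $(a_k)$ is a sufficiently dense lattice of points in the disk. The articles' point is that on regulated domains, and for the spaces HV-infinity, decompositions of this type can be built directly from the geometry of the domain.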

Relevance: 10.00%

Abstract:

The paradigm of computational vision hypothesizes that any visual function -- such as the recognition of your grandparent -- can be replicated by computational processing of the visual input. What are the computations that the brain performs? What should or could they be? Working on the latter question, this dissertation takes the statistical approach, in which the suitable computations are learned from natural visual data itself. In particular, we empirically study the computational processing that emerges from the statistical properties of the visual world and from the constraints and objectives specified for the learning process. This thesis consists of an introduction and 7 peer-reviewed publications, where the purpose of the introduction is to illustrate the area of study to a reader who is not familiar with computational vision research. In the introduction we briefly overview the primary challenges to visual processing, and recall some current opinions on visual processing in the early visual systems of animals. Next, we describe the methodology used in our research and discuss the presented results. We have included in this discussion some additional remarks, speculations and conclusions that were not featured in the original publications. We present the following results in the publications of this thesis. First, we empirically demonstrate that luminance and contrast are strongly dependent in natural images, contradicting previous theories suggesting that luminance and contrast are processed separately in natural systems because of their independence in the visual data. Second, we show that simple-cell-like receptive fields of the primary visual cortex can be learned in the nonlinear contrast domain by maximization of independence. Further, we provide the first reports of the emergence of conjunctive (corner-detecting) and subtractive (opponent-orientation) processing from nonlinear projection pursuit with simple objective functions related to sparseness and response energy optimization. Then, we show that attempting to extract independent components of the nonlinear histogram statistics of a biologically plausible representation leads to projection directions that appear to differentiate between visual contexts. Such processing might be applicable for priming, i.e., the selection and tuning of later visual processing. We continue by showing that a different kind of thresholded low-frequency priming can be learned and used to make object detection faster with little loss in accuracy. Finally, we show that in a computational object detection setting, nonlinearly gain-controlled visual features of medium complexity can be acquired sequentially as images are encountered and discarded. We present two online algorithms to perform this feature selection, and propose the idea that for artificial systems, some processing mechanisms could be selected from the environment without optimizing the mechanisms themselves. In summary, this thesis explores the learning of visual processing on several levels. The learning can be understood as an interplay of input data, model structures, learning objectives, and estimation algorithms. The presented work adds to the growing body of evidence that statistical methods can be used to acquire intuitively meaningful visual processing mechanisms. The work also presents some predictions and ideas regarding biological visual processing.
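
The projection pursuit mentioned above can be sketched generically: starting from (here synthetic, heavy-tailed) whitened data, gradient-ascend a unit-norm projection vector on a sparseness objective such as excess kurtosis. This illustrates the skeleton of the technique only, not the publications' actual models or objective functions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for whitened image-patch data: heavy-tailed, unit-variance components.
X = rng.laplace(size=(10_000, 64)) / np.sqrt(2.0)
n = len(X)

w = rng.standard_normal(64)
w /= np.linalg.norm(w)
for _ in range(300):                    # projected gradient ascent on kurtosis
    s = X @ w
    # gradient of E[s^4] - 3 E[s^2]^2 with respect to w
    grad = 4 * X.T @ (s ** 3) / n - 12 * np.mean(s ** 2) * (X.T @ s) / n
    w += 1e-2 * grad
    w /= np.linalg.norm(w)              # keep the projection direction unit-norm

s = X @ w
print("excess kurtosis of learned projection:",
      np.mean(s ** 4) - 3 * np.mean(s ** 2) ** 2)
```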

Relevance: 10.00%

Abstract:

This thesis examines the mythology in, and the social reality behind, a group of texts from the Nag Hammadi and related literature to which certain leaders of the early church attached the label Ophite, i.e., snake people. In the mythology, which essentially draws upon and rewrites the Genesis paradise story, the snake's advice to eat from the tree of knowledge is positive, the creator and his angels are demonic beasts, and the true godhead is depicted as an androgynous heavenly projection of Adam and Eve. It is argued that this unique mythology is attested in certain Coptic texts from the Nag Hammadi and Berlin 8502 Codices (On the Origin of the World, Hypostasis of the Archons, Apocryphon of John, Eugnostos, Sophia of Jesus Christ), as well as in reports by Irenaeus (Adversus Haereses 1.30), Origen (Contra Celsum 6.24-38) and Epiphanius (Panarion 26). It is also argued that this so-called Ophite evidence is essential for a proper understanding of Sethian Gnosticism, often considered today one of the earliest forms of Gnosticism; a Sethianization of Ophite mythology seems to have occurred. I propose that we replace the current Sethian Gnostic category with a new one that not only adds texts drawing upon the Ophite mythology alongside the Sethian texts, but also arranges the material in smaller typological units. I also propose that we rename this remodelled and expanded Sethian corpus "Classic Gnostic." The thesis is divided into four parts: (I) Introduction; (II) Myth and Innovation; (III) Ritual; and (IV) Conclusion. In Part I, the sources and previous research on the Ophites and Sethians are examined, and the new Classic Gnostic category is introduced to provide a framework for the study of the Ophite evidence. The chapters in Part II explore key themes in the mythology of the texts, first by text comparison (to show that certain texts represent the Ophite mythology and that this mythology differs from Sethianism), and then by attempting to unveil the social circumstances that may have given rise to such myths. Part III assesses heresiological claims of Ophite rituals, and Part IV is the conclusion.

Relevance: 10.00%

Abstract:

Technological development of fast multi-sectional, helical computed tomography (CT) scanners has made it possible to use CT perfusion (CTp) and CT angiography (CTA) in evaluating acute ischemic stroke. This study focuses on new multidetector CT techniques, namely whole-brain and first-pass CT perfusion, plus CTA of the carotid arteries. Whole-brain CTp data are acquired during slow infusion of contrast material to achieve a constant contrast concentration in the cerebral vasculature. From these data, quantitative maps of perfused cerebral blood volume (pCBV) are constructed. The probability of cerebral infarction as a function of normalized pCBV was determined in patients with acute ischemic stroke. Normalized pCBV, expressed as a percentage of the pCBV of contralateral normal brain, was determined in the infarction core and in regions just inside and outside the boundary between infarcted and noninfarcted brain. The corresponding probabilities of infarction were 0.99, 0.96, and 0.11, R² was 0.73, and the differences in perfusion between the core and the inner and outer bands were highly significant. Thus a probability-of-infarction curve can help predict the likelihood of infarction as a function of percentage normalized pCBV. First-pass CT perfusion is based on continuous cine imaging over a selected brain area during a bolus injection of contrast. During its first passage, the contrast material compartmentalizes in the intravascular space, resulting in transient tissue enhancement. Functional maps of cerebral blood flow (CBF), cerebral blood volume (CBV), and mean transit time (MTT) are then constructed. We compared the effects of three different iodine concentrations (300, 350, or 400 mg/mL) on peak enhancement of normal brain tissue, artery, and vein, stratified by region-of-interest (ROI) location, in 102 patients imaged within 3 hours of stroke onset. Monotonically increasing peak opacification was evident at all ROI locations, suggesting that CTp evaluation of patients with acute stroke is best performed with the highest available concentration of contrast agent. In another study we investigated whether lesion volumes on CBV, CBF, and MTT maps within 3 hours of stroke onset predict final infarct volume, and whether all of these parameters are needed for triage to intravenous recombinant tissue plasminogen activator (IV-rtPA). The effect of IV-rtPA on the affected brain was also investigated by measuring the volume of salvaged tissue in patients receiving IV-rtPA and in controls. CBV lesion volume did not necessarily represent dead tissue. MTT lesion volume alone can serve to identify the upper size limit of the abnormally perfused brain, and patients receiving IV-rtPA salvaged more brain than did controls. Carotid CTA was compared with carotid DSA in the grading of stenosis in patients with stroke symptoms. In CTA, the grade of stenosis was determined by means of axial source and maximum intensity projection (MIP) images, as well as by semiautomatic vessel analysis. CTA provides an adequate, less invasive alternative to conventional DSA, although it tends to underestimate clinically relevant grades of stenosis.
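
The reported probabilities (0.99, 0.96, 0.11 at the three sampled regions) trace a sigmoid in normalized pCBV. One minimal way to recover such a curve from voxel-level outcomes is a single-variable logistic regression, sketched below on invented data; nothing here reproduces the study's actual fitting procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
# Invented voxel data: normalized pCBV (% of contralateral) and infarct outcome.
pcbv = rng.uniform(20, 120, size=2000)
p_true = 1 / (1 + np.exp(0.15 * (pcbv - 60)))  # infarction likelier at low pCBV
y = rng.random(2000) < p_true

# One-variable logistic regression fitted by Newton's method.
Xd = np.column_stack([np.ones_like(pcbv), pcbv])
beta = np.zeros(2)
for _ in range(25):
    p = 1 / (1 + np.exp(-Xd @ beta))
    grad = Xd.T @ (y - p)                       # score of the log-likelihood
    hess = Xd.T @ (Xd * (p * (1 - p))[:, None])  # Fisher information
    beta += np.linalg.solve(hess, grad)

pred = lambda v: 1 / (1 + np.exp(-(beta[0] + beta[1] * v)))
print("P(infarct) at pCBV 40%:", round(pred(40), 2))
print("P(infarct) at pCBV 90%:", round(pred(90), 2))
```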

Relevance: 10.00%

Abstract:

Numerical weather prediction (NWP) models provide the basis for weather forecasting by simulating the evolution of the atmospheric state. A good forecast requires that the initial state of the atmosphere is known accurately, and that the NWP model is a realistic representation of the atmosphere. Data assimilation methods are used to produce initial conditions for NWP models: the NWP model background field, typically a short-range forecast, is updated with observations in a statistically optimal way. The objective of this thesis has been to develop methods that allow data assimilation of Doppler radar radial wind observations. The work has been carried out in the High Resolution Limited Area Model (HIRLAM) 3-dimensional variational data assimilation framework. Observation modelling is a key element in exploiting indirect observations of the model variables. In radar radial wind observation modelling, the vertical model wind profile is interpolated to the observation location, and the projection of the model wind vector on the radar pulse path is calculated. The vertical broadening of the radar pulse volume and the bending of the radar pulse path due to atmospheric conditions are taken into account. Radar radial wind observations are modelled to within observation errors, which consist of instrumental, modelling, and representativeness errors. Systematic and random modelling errors can be minimized by accurate observation modelling, and the impact of the random part of the instrumental and representativeness errors can be decreased by calculating spatial averages from the raw observations. Model experiments indicate that spatial averaging clearly improves the fit of the radial wind observations to the model in terms of the observation minus model background (OmB) standard deviation. Monitoring the quality of the observations is important, especially when a new observation type is introduced into a data assimilation system. Calculating the bias of radial wind observations in the conventional way can yield zero even when there are systematic differences in wind speed and/or direction; a bias estimation method designed for this observation type is therefore introduced in the thesis. Doppler radar radial wind observation modelling, together with the bias estimation method, enables the radial wind observations to be exploited for NWP model validation as well. One-month model experiments performed with HIRLAM model versions differing only in a detail of the surface stress parameterization indicate that the use of radar wind observations in NWP model validation is very beneficial.
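
The geometric core of the observation operator, projecting a model wind vector onto the radar pulse path, is a one-line formula once the beam azimuth and elevation are known. The sketch below shows just that projection; the broadening of the pulse volume and the bending of the pulse path, which the thesis also models, are omitted.

```python
import math

def radial_wind(u: float, v: float, w: float,
                azimuth_deg: float, elevation_deg: float) -> float:
    """Project a model wind vector (u east, v north, w up; m/s) onto the
    radar beam, given beam azimuth (clockwise from north) and elevation."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (u * math.sin(az) * math.cos(el)
            + v * math.cos(az) * math.cos(el)
            + w * math.sin(el))

# A 10 m/s westerly (u = 10) seen by a beam pointing due east at 0.5 deg elevation:
print(radial_wind(10.0, 0.0, 0.0, azimuth_deg=90.0, elevation_deg=0.5))
```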

Relevance: 10.00%

Abstract:

A novel method for functional lung imaging was introduced by adapting the K-edge subtraction (KES) method to in vivo studies of small animals. In this method, two synchrotron radiation energies that bracket the K-edge of the contrast agent are used for simultaneous recording of absorption-contrast images. Stable xenon gas is used as the contrast agent, and imaging is performed in projection or computed tomography (CT) mode. Subtraction of the two images yields the distribution of xenon while removing practically all features due to other structures, and the xenon density can be calculated quantitatively. Because the images are recorded simultaneously, there are no movement artifacts in the subtraction image. The time resolution for a series of CT images is one image per second, which allows functional studies. The voxel size is 0.1 mm³, an order of magnitude better than in traditional lung imaging methods. The KES imaging technique was used in studies of ventilation distribution and of the effects of histamine-induced airway narrowing in healthy, mechanically ventilated, anaesthetized rabbits. First, the effect of tidal volume on ventilation was studied; the results show that an increase in tidal volume without an increase in minute ventilation results in a proportional increase in regional ventilation. Second, spiral CT was used to quantify the airspace volumes in the lungs under normal conditions and after histamine aerosol inhalation; the results showed large patchy filling defects in the peripheral lungs following histamine provocation. Third, the kinetics of the proximal and distal airway response to histamine aerosol were examined; the findings show that the distal airways react immediately to histamine and start to recover, while the reaction and recovery in the proximal airways are slower. Fourth, the fractal dimension of the lungs was studied; it was found to be higher in the apical part of the lungs than in the basal part, indicating structural differences between the apical and basal lung levels. These results provide new insights into lung function and into the effects of drug challenge studies. At present the technique is available at synchrotron radiation facilities, but compact synchrotron radiation sources are being developed, and in the relatively near future the method may be used in hospitals.
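
The arithmetic behind the subtraction step is logarithmic: with simultaneous images recorded just below and just above the xenon K-edge (34.56 keV), attenuation by everything except xenon varies slowly across the edge and nearly cancels, so the xenon areal density falls out of the difference. The sketch below assumes ideal monochromatic beams, and the attenuation coefficients are illustrative values, not calibration data.

```python
import numpy as np

# Mass attenuation of xenon just below/above its K-edge (cm^2/g);
# illustrative values -- the jump across the edge is what matters.
MU_XE_BELOW, MU_XE_ABOVE = 7.0, 32.0

def xenon_areal_density(img_above, img_below, flat_above, flat_below):
    """Xenon areal density (g/cm^2) per pixel from two simultaneous
    absorption images bracketing the K-edge. Non-xenon attenuation
    cancels (approximately) in the subtraction."""
    t_above = -np.log(img_above / flat_above)  # total attenuation, above edge
    t_below = -np.log(img_below / flat_below)  # total attenuation, below edge
    return (t_above - t_below) / (MU_XE_ABOVE - MU_XE_BELOW)

# Tiny synthetic check: one pixel containing 0.02 g/cm^2 of xenon.
flat = np.ones(1)
rho_t = 0.02
img_b = flat * np.exp(-MU_XE_BELOW * rho_t)
img_a = flat * np.exp(-MU_XE_ABOVE * rho_t)
print(xenon_areal_density(img_a, img_b, flat, flat))  # recovers ~0.02
```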

Relevance: 10.00%

Abstract:

This study examines Institutional Twinning in Morocco as a case of EU cooperation through the pragmatic, ethical and moral logics of reason in Jürgen Habermas's discourse ethics. Originally an accession tool, Twinning was introduced in 2004 for legal approximation in the context of the European Neighborhood Policy (ENP). Twinning is a unique instrument in development cooperation from a legal perspective, and with its long historical and cultural ties to Europe, Morocco presents an interesting case study of this new form of cooperation. We analyse the motives behind the Twinning projects on illegal immigration, environmental legislation and customs reform. As Twinning is a new policy instrument within the ENP context, there is relatively little preceding research, which in itself constitutes a reason to inquire into the subject. While introducing useful categories, the approaches discussing “normative power Europe” do not offer methodological tools precise enough to analyse the motives of Twinning cooperation from a broad ethical standpoint. Helene Sjursen, as well as Esther Barbé and Elisabeth Johansson-Nogués, have elaborated on Jürgen Habermas's discourse ethics in determining the extent of altruism in the ENP in general. Situating the analysis in the process-oriented framework of Critical Theory, discourse ethics provides the methodological framework for our research. The case studies reveal that the context in which the actors operate affects their pragmatic, ethical and moral aspirations. The utilitarian notion of profit maximization is quite pronounced, both in the number of Twinning projects in the economic sphere and in the pragmatic logics of reason instrumental to security- and trade-related issues. The historical background as well as internal processes, however, contribute to defining areas of mutual interest to the actors, as well as the motives of Morocco and the EU, sometimes described as the external projection of internal values. Through its different aspects, Twinning cooperation portrays the functioning of the pragmatic, ethical and moral logics of reason in international relations.

Relevance: 10.00%

Abstract:

The nutritional quality of the product, as well as other quality attributes such as microbiological and sensory quality, are essential factors in the baby food industry, and therefore alternatives to conventional heat sterilization are of great interest in this food sector. This report gives an overview of different sterilization techniques for baby food. The report is part of the work done in work package 3, “QACCP Analysis Processing: Quality-driven distribution and processing chain analysis”, in the Core Organic ERANET project Quality analysis of critical control points within the whole food chain and their impact on food quality, safety and health (QACCP). The overall objective of the project is to optimise organic production and processing in order to improve food safety as well as nutritional quality and to increase health-promoting aspects of consumer products. The approach is a chain analysis that addresses the link from farm to fork and backwards from fork to farm. The objective is to improve product-related quality management in farming (towards testing food authenticity) and processing (towards food authenticity and sustainable processes). The articles in this volume do not necessarily reflect the Core Organic ERANET's views and in no way anticipate the Core Organic ERANET's future policy in this area. The contents of the articles are the sole responsibility of the authors. The information contained herein, including any expression of opinion and any projection or forecast, has been obtained from sources believed by the authors to be reliable, but is not guaranteed as to accuracy or completeness. The information is supplied without obligation and on the understanding that any person who acts upon it or otherwise changes his or her position in reliance thereon does so entirely at his or her own risk. The writers gratefully acknowledge the financial support of the Core Organic Funding Bodies: the Ministry of Agriculture and Forestry, Finland; the Swiss Federal Office for Agriculture, Switzerland; and the Federal Ministry of Consumer Protection, Food and Agriculture, Germany.


Relevance: 10.00%

Abstract:

Toeplitz operators are among the most important classes of concrete operators, with applications in several branches of pure and applied mathematics. This doctoral thesis deals with Toeplitz operators on analytic Bergman, Bloch and Fock spaces. Usually, a Toeplitz operator is a composition of multiplication by a function with a suitable projection. The present work generalizes the notion to the case where the function is replaced by a distributional symbol. Fredholm theory for Toeplitz operators with matrix-valued symbols is also considered. The subject of this thesis belongs to the areas of complex analysis, functional analysis and operator theory. The work contains five research articles. Articles one, three and four deal with finding suitable distributional symbol classes in Bergman, Fock and Bloch spaces, respectively. In each case, the symbol class turns out to be a certain weighted Sobolev-type space of distributions. The Bergman space setting is the most straightforward; when dealing with Fock spaces, difficulties arise from the unboundedness of the complex plane and from the properties of the Gaussian measure in the definition, and in the Bloch-type spaces an additional logarithmic weight must be introduced. Sufficient conditions for boundedness and compactness are derived. The second article contains a part showing that, under additional assumptions, the condition for Bergman spaces is also necessary. The fifth article deals with Fredholm theory for Toeplitz operators having matrix-valued symbols; the essential spectra and index theorems are obtained with the help of, for instance, Hardy space factorization and the Berezin transform. The second article also has a part dealing with matrix-valued symbols in a non-reflexive Bergman space, in which case a condition on the oscillation of the symbol (a logarithmic VMO condition) must be added.
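
In symbols, the classical construction referred to above is:

```latex
T_f \, g = P ( f g ) ,
```

where, for instance, $P$ is the orthogonal Bergman projection of $L^2$ onto the Bergman space $A^2$ and $f$ is the (function) symbol; the thesis's generalization replaces multiplication by $f$ with the action of a distributional symbol on a suitable test class.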