105 results for Köln


Relevance: 10.00%

Abstract:

This thesis presents four essays in energy economics. The first essay investigates one of the workhorse models of resource economics, the Hotelling model of an inter-temporally optimizing resource-extracting firm. The Hotelling model provides a convincing theory of fundamental concepts such as resource scarcity, but very few empirical validations of the model have been conducted. This essay attempts such a validation by first extending the model to include exploration activity and market power, and then using a newly constructed data set for the uranium mining industry to test whether a major resource-extracting mining firm in that industry follows the theory's predictions. The results show that the theory is rejected in all considered settings. The second and third essays investigate the difference in market outcomes under spot-market-based trade as compared to long-term-contract-based trade in oligopolistic markets with investments. The second essay investigates this difference analytically in an electricity market setting, showing that investments and consumer welfare may be higher under spot-market-based trade than under long-term contracts. The third essay proposes techniques to solve large-scale empirical models of this kind, exploring the practicability of the approach in an application to the international metallurgical coal market. The final essay investigates the influence of policy uncertainty on investment decisions. With France debating the role of nuclear technology, this essay analyses how policy uncertainty regarding nuclear power in France may affect the French and European power sector. Applying a stochastic model of the European power system, the analysis shows that the costs of uncertainty in this particular application are rather low compared to the overall costs of a nuclear phase-out.
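
For orientation, the benchmark prediction being tested can be written in its simplest form, i.e. the textbook Hotelling rule for a competitive firm without the exploration and market-power extensions developed in the essay (a standard statement of the rule, not the thesis' exact specification):

```latex
% Hotelling rule: the scarcity rent (price net of marginal extraction cost)
% of the non-renewable resource grows at the rate of interest r.
\frac{\mathrm{d}}{\mathrm{d}t}\bigl(p_t - C'(q_t)\bigr) = r\,\bigl(p_t - C'(q_t)\bigr)
\quad\Longleftrightarrow\quad
p_t - C'(q_t) = \bigl(p_0 - C'(q_0)\bigr)\,e^{rt},
```

where p_t is the resource price, C'(q_t) the marginal extraction cost at extraction rate q_t, and r the discount rate; empirical tests ask whether observed price and cost paths are consistent with this trajectory.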

Relevance: 10.00%

Abstract:

The present study explores EUropean geopolitical agency in a distinct spatio-temporal context: the Arctic region of the early 21st century. It provides an in-depth analysis of the European Union's process of constructing EUropean legitimacy and credibility in its 'Northern Neighbourhood' between 2008 and 2014. Embedded in a conceptual and methodological framework drawing on critical geopolitics, the study assesses the strategic policy reasoning of the EU and the implicit geopolitical discourses that guide and determine a particular line of argumentation used to claim a 'legitimate' role in the Arctic and, accordingly, to construct a distinct 'EUropean Arctic space'. In doing so, it establishes a clearer picture of the (narrated) regional interests of the EU, the policy developed around them, and the concrete steps taken to pursue these interests. Eventually, the analysis gets to the conceptual core of what exactly endowed the EU with geopolitical agency in the circumpolar North. As a complementary explanation, the study provides a thick description of the area under scrutiny, the Arctic region, in order to explicate the systemic context that conditioned the EU's regional demeanour and action. Elucidated along the lines of Arctic history and identity, rights, interests and responsibility, it delineates the emergence of the Arctic as a region of and for geopolitics. The findings indicate that the sui generis character of the Arctic as a EUropean neighbourhood essentially determined the EU's regional performance. They show that the Union's 'traditional' geopolitical models of civilian or normative power became entangled in a fluid state of Arctic affairs: a distinct regional system characterised by a few strong state actors with pronounced national Arctic interests and identities, and an indefinite local context of environmental changes, economic uncertainties and social challenges. The study applies critical geopolitics in a Political Science context and contributes to a broader understanding of EU foreign policy construction and behaviour. Ultimately, it offers an interdisciplinary approach to analysing EU external action by explicitly taking into account the internal and external social processes that ultimately condition a particular EUropean foreign policy performance.

Relevance: 10.00%

Abstract:

The expression of a gene, from transcription of the DNA into pre-messenger RNA (pre-mRNA) to translation of messenger RNA (mRNA) into protein, is constantly monitored for errors. This quality control is necessary to guarantee successful gene expression. One quality control mechanism important to this thesis is nonsense-mediated mRNA decay (NMD). NMD is a cellular process that eliminates mRNA transcripts harboring premature translation termination codons (PTCs). Furthermore, NMD is known to regulate certain transcripts with long 3′ UTRs. However, some mRNA transcripts are known to evade NMD. The mechanism of NMD activation has been the subject of many studies, whereas NMD evasion or suppression remains rather elusive. It has previously been shown that the cytoplasmic poly(A)-binding protein (PABPC1) is able to suppress NMD of certain transcripts. In this study I show that PABPC1 is able to suppress NMD of a long 3′ UTR-carrying reporter when tethered immediately downstream of the termination codon. I am further able to show the importance of the interaction between PABPC1 and eIF4G for NMD suppression, whereas the interaction between PABPC1 and eRF3a seems dispensable. These results indicate an involvement of efficient translation termination, and potentially ribosome recycling, in NMD suppression. I show that if PABPC1 is positioned too far from the terminating ribosome, NMD is activated. After establishing the importance of recruiting PABPC1 directly downstream of a terminating ribosome for NMD suppression, I further demonstrate several different methods by which PABPC1 can be recruited. Fold-back of the poly(A)-tail, mediated by two interacting proteins on opposite ends of a 3′ UTR, brings PABPC1 bound to the poly(A)-tail into close proximity of the terminating ribosome and thereby suppresses NMD. Furthermore, small PAM2 peptides that are known to interact with the MLLE domain of PABPC1 are able to strongly suppress NMD initiated by either a long 3′ UTR or an EJC. I also show the NMD-antagonizing capacity of recruited PABPC1 for the known endogenous NMD target β-globin PTC39, which is responsible for the disease β-thalassemia. This demonstrates the potential medical implications and applications of suppressing NMD by recruiting PABPC1 into close proximity of a terminating ribosome.

Relevance: 10.00%

Abstract:

The m-AAA protease is a hexameric complex involved in the processing of specific substrates and the turnover of misfolded polypeptides in the mitochondrial inner membrane. In humans, the m-AAA protease is composed of AFG3L2 and paraplegin. Mutations in AFG3L2 have been implicated in dominant spinocerebellar ataxia (SCA28) and recessive spastic ataxia-neuropathy syndrome (SPAX5). Mutations of SPG7, encoding paraplegin, are linked to hereditary spastic paraplegia. In the mouse, a third subunit, AFG3L1, is expressed. Various mouse models recapitulate the phenotype of these neurodegenerative disorders; however, the pathogenic mechanism of neurodegeneration is not completely understood. Here, we studied several mouse models and focused on the cell-autonomous role of the m-AAA protease in neurons and myelinating cells. We show that lack of Afg3l2 triggers mitochondrial fragmentation and swelling, tau hyperphosphorylation and pathology in Afg3l2 full-body and forebrain neuron-specific knockout mice. Moreover, deletion of Afg3l2 in adult myelinating cells causes early-onset mitochondrial abnormalities similar to those in neurons, but the survival of these cells is not affected, in contrast to the early neuronal death. Although myelinating cells have previously been shown to survive respiratory deficiency by relying on glycolysis, total ablation of the m-AAA protease, by deleting Afg3l2 in an Afg3l1 null background (DKO), leads to myelinating cell demise and subsequently to progressive axonal demyelination. Interestingly, DKO mice show premature hair greying due to loss of melanoblasts. Together, our data demonstrate cell-autonomous survival thresholds for m-AAA protease deficiency and an essential role of the m-AAA protease in preventing cell death, independent of mitochondrial dynamics and the oxidative capacity of the cell. Thus, our findings provide novel insights into the pathogenesis of diseases linked to m-AAA protease deficiency and establish valuable mouse models of mitochondrial dysfunction for studying other neurodegenerative diseases, such as tauopathies and demyelinating diseases.

Relevance: 10.00%

Abstract:

On most if not all evaluatively relevant dimensions such as the temperature level, taste intensity, and nutritional value of a meal, one range of adequate, positive states is framed by two ranges of inadequate, negative states, namely too much and too little. This distribution of positive and negative states in the information ecology results in a higher similarity of positive objects, people, and events to other positive stimuli as compared to the similarity of negative stimuli to other negative stimuli. In other words, there are fewer ways in which an object, a person, or an event can be positive as compared to negative. Oftentimes, there is only one way in which a stimulus can be positive (e.g., a good meal has to have an adequate temperature level, taste intensity, and nutritional value). In contrast, there are many different ways in which a stimulus can be negative (e.g., a bad meal can be too hot or too cold, too spicy or too bland, or too fat or too lean). This higher similarity of positive as compared to negative stimuli is important, as similarity greatly impacts speed and accuracy on virtually all levels of information processing, including attention, classification, categorization, judgment and decision making, and recognition and recall memory. Thus, if the difference in similarity between positive and negative stimuli is a general phenomenon, it predicts and may explain a variety of valence asymmetries in cognitive processing (e.g., positive as compared to negative stimuli are processed faster but less accurately). In my dissertation, I show that the similarity asymmetry is indeed a general phenomenon that is observed in thousands of words and pictures. Further, I show that the similarity asymmetry applies to social groups. Groups stereotyped as average on the two dimensions agency / socio-economic success (A) and conservative-progressive beliefs (B) are stereotyped as positive or high on communion (C), while groups stereotyped as extreme on A and B (e.g., managers, homeless people, punks, and religious people) are stereotyped as negative or low on C. As average groups are more similar to one another than extreme groups, according to this ABC model of group stereotypes, positive groups are mentally represented as more similar to one another than negative groups. Finally, I discuss implications of the ABC model of group stereotypes, pointing to avenues for future research on how stereotype content shapes social perception, cognition, and behavior.
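
A minimal numerical sketch of the reasoning above (toy stimuli and an arbitrary distance-based similarity measure, not the dissertation's actual materials): if positive stimuli occupy a single adequate band on each dimension while negative stimuli fall into either the "too little" or the "too much" tail, the average pairwise similarity among positive stimuli exceeds that among negative stimuli.

```python
# Toy illustration: positive states lie in one adequate middle band per dimension,
# negative states in either tail, so positive stimuli end up more similar to each other.
import numpy as np

rng = np.random.default_rng(0)
n, dims = 500, 3  # e.g., temperature, taste intensity, nutritional value

# Positive stimuli: all dimensions within the adequate middle band.
positive = rng.uniform(-0.5, 0.5, size=(n, dims))
# Negative stimuli: each dimension in the "too little" or "too much" tail.
sign = rng.choice([-1.0, 1.0], size=(n, dims))
negative = sign * rng.uniform(1.0, 2.0, size=(n, dims))

def mean_pairwise_similarity(x):
    """Mean similarity across all pairs, defined as negative mean Euclidean distance."""
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    iu = np.triu_indices(len(x), k=1)
    return -d[iu].mean()

print("positive stimuli:", mean_pairwise_similarity(positive))  # closer to 0 (more similar)
print("negative stimuli:", mean_pairwise_similarity(negative))  # more negative (less similar)
```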

Relevance: 10.00%

Abstract:

The European Commission has been negotiating Economic Partnership Agreements (EPAs) with Regional Economic Communities of the African, Caribbean and Pacific Group of States since 2002. The outcomes have been mixed. The negotiations with the Caribbean Forum (CARIFORUM) concluded more quickly than initially envisaged, whereas negotiations with the Economic Community of West African States (ECOWAS) and the remaining ACP regions have been dragging on for several years. This research therefore addresses the key question of what accounts for the variation in EPA negotiation outcomes, using a comparative research approach. It evaluates the explanatory power of three research variables deduced from negotiation theory in accounting for this variation: the Best Alternative to a Negotiated Agreement (BATNA), negotiation strategies, and the issue-linkage approach. Principally, the study finds that the outcomes of the EPA negotiations predominantly depended on the presence or absence of a "best alternative" to the proposed EPA, complemented by the negotiation strategies pursued by the parties and the joint application of an issue-linkage mechanism, which facilitated a sense of mutual benefit from the agreements.

Relevance: 10.00%

Abstract:

Early intervention is the key to spoken language for hearing-impaired children. A diagnosis of severe hearing loss in young children raises the urgent question of the optimal type of hearing aid device. As there are no recent data comparing selection criteria for a specific hearing aid device, the goal of the Hearing Evaluation of Auditory Rehabilitation Devices (hEARd) project (Coninx & Vermeulen, 2012) was to collect and analyze interlingually comparable normative data on the speech perception performance of children with hearing aids and children with cochlear implants (CI). METHOD: In various institutions for hearing rehabilitation in Belgium, Germany and the Netherlands, the Adaptive Auditory Speech Test (AAST) was used within the hEARd project to determine the speech perception abilities of kindergarten- and school-aged hearing-impaired children. Results of the speech audiometric procedures were matched to the unaided hearing loss values of children using hearing aids and compared to the results of children using CI. 277 data sets of hearing-impaired children were analyzed. Results of children using hearing aids were grouped according to their unaided hearing loss values. The grouping followed the World Health Organization's (WHO) grading of hearing impairment, from mild (25–40 dB HL) to moderate (41–60 dB HL), severe (61–80 dB HL) and profound hearing impairment (over 80 dB HL), as sketched below. RESULTS: AAST speech recognition results in quiet showed significantly better performance for the CI group than for the group of profoundly impaired hearing aid users as well as the group of severely impaired hearing aid users. However, the CI users' performance in speech perception in noise did not differ from that of the hearing aid users. Analyses of the collected data showed that children with a CI perform equivalently in speech perception in quiet to children using hearing aids with a "moderate" hearing impairment.
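
The grouping rule referenced above, as a small sketch in Python (thresholds taken from the text; the handling of exact boundary values is an assumption):

```python
# Sketch of the WHO-based grouping described in the abstract; thresholds come from
# the quoted ranges, boundary handling at exactly 40/60/80 dB HL is assumed inclusive.
def who_grade(unaided_loss_db_hl: float) -> str:
    """Map an unaided hearing loss value (dB HL) to the WHO grade used for grouping."""
    if unaided_loss_db_hl < 25:
        return "no/slight impairment"
    if unaided_loss_db_hl <= 40:
        return "mild (25-40 dB HL)"
    if unaided_loss_db_hl <= 60:
        return "moderate (41-60 dB HL)"
    if unaided_loss_db_hl <= 80:
        return "severe (61-80 dB HL)"
    return "profound (>80 dB HL)"

print(who_grade(55))  # moderate (41-60 dB HL)
```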

Relevance: 10.00%

Abstract:

This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field has regained popularity over the last few years and is still undergoing, as is statistical analysis in general, a transformation towards high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections onto subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases. We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. In particular, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels, and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series close to the non-stationary case the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
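
For readers unfamiliar with the »change in the mean« setting, a minimal scalar prototype of the CUSUM statistic that the thesis generalises to Hilbert-space-valued series (this sketch assumes independent observations and estimates the noise level by the plain sample standard deviation, whereas the thesis handles dependence and long-run covariance operators):

```python
# Classical univariate CUSUM statistic for a single mean change:
# max_k |S_k - (k/n) S_n| / (sigma_hat * sqrt(n)), large values indicate a break.
import numpy as np

def cusum_statistic(x: np.ndarray) -> float:
    """Scaled maximum of the partial-sum process of a univariate series x."""
    n = len(x)
    s = np.cumsum(x)
    k = np.arange(1, n + 1)
    sigma_hat = x.std(ddof=1)
    return np.max(np.abs(s - k / n * s[-1])) / (sigma_hat * np.sqrt(n))

rng = np.random.default_rng(1)
no_change = rng.normal(0.0, 1.0, 400)
with_change = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.0, 1.0, 200)])
print(cusum_statistic(no_change))    # typically of moderate size under the null
print(cusum_statistic(with_change))  # clearly larger, indicating a mean shift
```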

Relevance: 10.00%

Abstract:

Traditional popular poetry follows its own cultural logic and a literary canon very different from that of written poetry by educated authors. Among the elements that distinguish this poetry is the continual presence of symbolism, manifested in the connotative reading of the texts; prominent within it is a symbolism referring to eroticism and to people's romantic relationships. In these folk songs nature acquires a distinctly amorous meaning, for example through the presence of the olive as a frequent motif in Andalusian, Hispanic and European songs.

Relevance: 10.00%

Abstract:

Adeno-associated viral (AAV) vectors are among the most widely used gene transfer systems in basic and pre-clinical research and have been employed in more than 160 clinical trials. AAV vectors are commonly produced in producer cell lines such as HEK293 by co-transfection with a so-called vector plasmid and one (in this work) or two so-called helper plasmids. The vector plasmid contains the transgene cassette of interest (TEC) flanked by AAV's inverted terminal repeats (ITRs), which serve as packaging signals, whereas the helper plasmid provides the required AAV and helper virus functions in trans. A pivotal aspect of AAV vectorology is the manufacturing of AAV vectors free from impurities arising during the production process. Such impurities include capsids containing prokaryotic sequences, e.g. antibiotic resistance genes originating from the producer plasmids. In the first part of the thesis we aimed at improving the safety of AAV vectors. As we found that encapsidated prokaryotic sequences (using the ampicillin resistance gene as indicator) cannot be removed by standard purification methods, we investigated whether the producer plasmids could be replaced by minicircles (MCs). MCs are circular DNA constructs which contain no functional or coding prokaryotic sequences; they consist only of the TEC and a short sequence required for production and purification. MC counterparts of a vector plasmid encoding enhanced green fluorescent protein (eGFP) and of a helper plasmid encoding AAV serotype 2 (AAV2) and helper adenovirus (Ad) genes were designed and produced by PlasmidFactory (Bielefeld, Germany). Using all four possible combinations of plasmids and MCs, single-stranded AAV2 vectors (ssAAV) and self-complementary AAV vectors (scAAV) were produced and characterized for vector quantity, quality and functionality. The analyses showed that plasmids can be replaced by MCs without decreasing the efficiency of vector production or vector quality. MC-derived scAAV vector preparations even exceeded plasmid-derived preparations, displaying up to 30-fold improved transduction efficiencies. Using MCs as tools, we found that the vector plasmid is the main source of encapsidated prokaryotic sequences. Remarkably, plasmid-derived scAAV vector preparations contained a much higher relative amount of prokaryotic sequences (up to 26.1 %, relative to TEC) than ssAAV vector preparations (up to 2.9 %). By replacing both plasmids with MCs, the amount of functional prokaryotic sequences could be decreased to below the limit of quantification. Additional analyses for DNA impurities other than prokaryotic sequences showed that scAAV vectors generally contained a higher amount of non-vector DNA (e.g. adenoviral sequences) than ssAAV vectors. For both ssAAV and scAAV vector preparations, MC-derived vectors tended to contain lower amounts of foreign DNA. None of the vectors tested was shown to induce immunogenicity. In summary, we demonstrated that the quality of AAV vector preparations can be significantly improved by replacing producer plasmids with MCs. Upon transduction of a target tissue, AAV vector genomes predominantly remain in an episomal state, as duplex DNA circles or concatemers. These episomal forms mediate long-term transgene expression in terminally differentiated cells, but are lost in proliferating cells due to cell division. Therefore, in the second part of the thesis, in cooperation with Claudia Hagedorn and Hans J. Lipps (University Witten/Herdecke), an AAV vector genome was equipped with an autonomous replication element, a scaffold/matrix attachment region (S/MAR). An AAV-S/MAR vector encoding eGFP and a blasticidin resistance gene, and a control vector with the same TEC but lacking the S/MAR element (AAV-ΔS/MAR), were produced and transduced into highly proliferative HeLa cells. Antibiotic pressure was employed to select for cells stably maintaining the vector genome. AAV-S/MAR-transduced cells yielded a higher number of colonies than AAV-ΔS/MAR-transduced cells. Colonies derived from each vector transduction were picked and cultured further. They remained eGFP-positive (for up to 70 days, the maximum cultivation period) even in the absence of antibiotic selection pressure. Interestingly, the mitotic stability of both AAV-S/MAR and the control vector AAV-ΔS/MAR was found to result from episomal maintenance of the vector genome. This finding indicates that, under specific conditions such as the mild selection pressure we employed, "common" AAV vectors persist episomally. Thus, the S/MAR element increases the establishment frequency of stable episomes, but is not a prerequisite.
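
Returning to the impurity figures quoted above: a hedged sketch of the underlying arithmetic (the titres below are made up for illustration, not measurements from the thesis). The relative amount of encapsidated prokaryotic sequences is simply the ratio of resistance-gene-containing to TEC-containing vector genome copies, expressed in percent.

```python
# Hypothetical qPCR titres, used only to illustrate the "x %, relative to TEC" metric.
def relative_impurity_percent(amp_copies_per_ml: float, tec_copies_per_ml: float) -> float:
    """Encapsidated ampicillin-resistance copies relative to TEC-containing copies, in %."""
    return 100.0 * amp_copies_per_ml / tec_copies_per_ml

print(relative_impurity_percent(2.6e10, 1.0e11))  # ~26 % (scAAV-like, plasmid-derived)
print(relative_impurity_percent(2.9e9, 1.0e11))   # ~2.9 % (ssAAV-like)
```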

Relevance: 10.00%

Abstract:

Over the past decades star formation has been a very attractive field, because knowledge of star formation leads to a better understanding not only of the formation of planets, and thus of our solar system, but also of the evolution of galaxies. The conditions leading to the formation of high-mass stars are still under investigation, but an evolutionary scenario has been proposed: as a cold pre-stellar core collapses under gravity, the medium warms up until it reaches a temperature of 100 K and enters the hot molecular core (HMC) phase. The forming central proto-star accretes material, increasing its mass and luminosity, and eventually becomes sufficiently evolved to emit UV photons which irradiate the surrounding environment, forming a hypercompact (HC) and then an ultracompact (UC) HII region. At this stage, a very dense and very thin internal photon-dominated region (PDR) forms between the HII region and the molecular core. Information on the chemistry makes it possible to trace the physical processes occurring in these different phases of star formation. Formation and destruction routes of molecules are influenced by the environment, as reaction rates depend on the temperature and radiation field. Therefore, chemistry also allows the evolutionary stage of astrophysical objects to be determined through the use of chemical models that include the time evolution of the temperature and radiation field. Because HMCs host a very rich chemistry with high abundances of complex organic molecules (COMs), several astrochemical models have been developed to study the gas phase chemistry as well as the grain chemistry in these regions. In addition to HMC models, models of PDRs have also been developed, in particular to study photo-chemistry. So far, few studies have investigated internal PDRs, and only in the presence of outflow cavities. Thus, these unique regions around HC/UC HII regions remain to be examined thoroughly. My PhD thesis focuses on the spatio-temporal chemical evolution in HC/UC HII regions with internal PDRs as well as in HMCs. The purpose of this study is, first, to understand the impact and effects of the radiation field, usually very strong in these regions, on the chemistry. Secondly, the goal is to study the emission of various tracers of HC/UC HII regions and compare it with HMC models, where the UV radiation field does not impact the region as it is immediately attenuated by the medium. Ultimately we want to determine the age of a given region using chemistry in combination with radiative transfer.
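
A deliberately simplified sketch of the kind of time-dependent modelling described above: a single toy species with an Arrhenius-like formation rate and a photodissociation term, while temperature and radiation field rise as the proto-star evolves. All rates, timescales and scalings here are illustrative assumptions, not values from the thesis.

```python
# Toy rate equation: dn/dt = k_form(T(t)) * (1 - n) - k_photo(G(t)) * n,
# with T(t) warming towards the hot-core regime and the UV field G(t) switching on late.
import numpy as np
from scipy.integrate import solve_ivp

def T(t):   # gas temperature rising from ~10 K towards ~100 K (arbitrary time units)
    return 10.0 + 90.0 * min(t / 1e5, 1.0)

def G(t):   # UV radiation field (arbitrary units) switching on as the HII region forms
    return 0.0 if t < 8e4 else 1.0e3 * (t - 8e4) / 2e4

def rhs(t, y):
    n_x = y[0]
    k_form = 1e-9 * np.exp(-30.0 / T(t))   # Arrhenius-like formation rate (toy values)
    k_photo = 1e-10 * G(t)                 # photodissociation scaling with the field
    return [k_form * (1.0 - n_x) - k_photo * n_x]

sol = solve_ivp(rhs, (0.0, 1.2e5), [0.0], max_step=1e3)
print("final fractional abundance:", sol.y[0, -1])
```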

Relevance: 10.00%

Abstract:

This dissertation investigates the trends and determinants of the rural non-farm sector and labor market in rural Vietnam since the global economic crisis of 2007, focusing on household diversification, the involvement of rural individuals in rural non-farm employment, the development of the rural labor market, and the assessment of a specific labor market policy.

Relevance: 10.00%

Abstract:

Recent marine long-offset transient electromagnetic (LOTEM) measurements yielded the offshore delineation of a fresh groundwater body beneath the seafloor in the region of Bat Yam, Israel. The LOTEM application was effective in detecting this freshwater body underneath the Mediterranean Sea and allowed an estimation of its seaward extent. However, the measured data set was insufficient to understand the hydrogeological configuration and the mechanism controlling the occurrence of this fresh groundwater discovery. In particular, the lateral geometry of the freshwater boundary, important for the hydrogeological modelling, could not be resolved. Without such an understanding, a rational management of this unexploited groundwater reservoir is not possible. Two new high-resolution marine time-domain electromagnetic methods are developed theoretically to derive the hydrogeological structure of the western aquifer boundary. The first is called the Circular Electric Dipole (CED). It is the land-based analogue of the Vertical Electric Dipole (VED), which is commonly applied to detect resistive structures in the subsurface. Although the CED shows exceptional detectability characteristics in the step-off signal towards the sub-seafloor freshwater body, an actual application was not carried out within the scope of this study. It was found that the method suffers from insufficient signal strength to adequately delineate the resistive aquifer under realistic noise conditions. Moreover, modelling studies demonstrated that severe signal distortions are caused by the slightest geometrical inaccuracies. As a result, a successful application of the CED in Israel proved to be rather doubtful. A second method, called the Differential Electric Dipole (DED), is developed as an alternative to the intended CED method. Compared to the conventional marine time-domain electromagnetic system, which commonly applies a horizontal electric dipole transmitter, the DED is composed of two horizontal electric dipoles in an in-line configuration that share a common central electrode. Theoretically, the DED has detectability and resolution characteristics similar to those of the conventional LOTEM system. However, its superior lateral resolution towards multi-dimensional resistivity structures makes an application desirable. Furthermore, the method is less susceptible to geometrical errors, making an application in Israel feasible. Within the scope of this thesis, the novel marine DED method is substantiated using several one-dimensional (1D) and multi-dimensional (2D/3D) modelling studies. The main emphasis lies on the application in Israel. Preliminary resistivity models are derived from the previous marine LOTEM measurements and tested for a DED application. The DED method is effective in locating the two-dimensional resistivity structure at the western aquifer boundary. Moreover, a prediction regarding the hydrogeological boundary conditions is feasible, provided a brackish water zone exists at the head of the interface. A seafloor-based DED transmitter/receiver system was designed and built at the Institute of Geophysics and Meteorology at the University of Cologne. The first DED measurements were carried out in Israel in April 2016. The acquired data set is the first of its kind. The measured data are processed and subsequently interpreted using 1D inversion. The intended aim of interpreting both step-on and step-off signals failed due to the insufficient data quality of the latter. Yet the 1D inversion models of the DED step-on signals clearly detect the freshwater body for receivers located close to the Israeli coast. Additionally, a lateral resistivity contrast is observable in the 1D inversion models, which allows the seaward extent of this freshwater body to be constrained. A large-scale 2D modelling study followed the 1D interpretation. In total, 425 600 forward calculations were conducted to find a sub-seafloor resistivity distribution that adequately explains the measured data; the search strategy is sketched below. The results indicate that the western aquifer boundary is located 3600 m to 3700 m off the coast. Moreover, a brackish water zone of 3 Ω·m to 5 Ω·m with a lateral extent of less than 300 m is likely located at the head of the freshwater aquifer. Based on these results, it is predicted that the sub-seafloor freshwater body is indeed open to the sea and may be vulnerable to seawater intrusion.
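
A hedged sketch of such a grid search: scan candidate model parameters (aquifer-boundary position and brackish-zone resistivity), run a forward calculation for each, and keep the model with the smallest RMS misfit. The forward function below is a deliberately crude stand-in, not a time-domain EM solver, and the parameter grids are assumptions chosen only to illustrate the principle.

```python
# Grid search over candidate models, ranked by error-weighted RMS misfit to the data.
import itertools
import numpy as np

times = np.logspace(-3, 0, 20)  # transient times in s (illustrative)

def toy_forward(boundary_m: float, rho_brackish: float) -> np.ndarray:
    """Crude stand-in for a 2D forward solver: a smooth response depending on the model."""
    return (boundary_m / 3600.0) * np.exp(-times / (rho_brackish * 0.1))

def rms_misfit(data, synthetic, error):
    return float(np.sqrt(np.mean(((data - synthetic) / error) ** 2)))

# Synthetic "measured" data generated from a known true model plus noise.
rng = np.random.default_rng(2)
true_model = (3650.0, 4.0)
error = 0.02 * np.abs(toy_forward(*true_model)) + 1e-4
data = toy_forward(*true_model) + rng.normal(0.0, error)

boundaries = np.arange(3000.0, 4500.0, 50.0)   # m off the coast (assumed grid)
resistivities = np.arange(1.0, 10.0, 0.5)      # Ohm*m for the brackish zone (assumed grid)

best = min(
    ((b, rho) for b, rho in itertools.product(boundaries, resistivities)),
    key=lambda m: rms_misfit(data, toy_forward(*m), error),
)
print("best-fitting model (boundary, resistivity):", best)
```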

Relevance: 10.00%

Abstract:

With progressive climate change, the preservation of biodiversity is becoming increasingly important. Only if the gene pool is large enough and the requirements of species are diverse will there be species that can adapt to the changing circumstances. To maintain biodiversity, we must understand the consequences of the various strategies. Mathematical models of population dynamics could provide prognoses. However, a model that reproduces and explains the mechanisms behind the diversity of species we observe experimentally and in nature is still needed. A combination of theoretical models with detailed experiments is needed to test biological processes in models and to compare predictions with outcomes in reality. In this thesis, several food webs are modeled and analyzed. Among others, models are formulated of laboratory experiments performed at the Zoological Institute of the University of Cologne. Numerical data from the simulations are in good agreement with the real experimental results. Numerical simulations demonstrate that few assumptions are necessary to reproduce in a model the sustained oscillations of population size that the experiments show; a minimal example is sketched below. However, the analysis indicates that species "thrown together by chance" are not very likely to survive together over long periods. Even larger food webs do not show significantly different outcomes, which underlines how extraordinary and complicated natural diversity is. In order to produce such a coexistence of randomly selected species, as the experiment does, models require additional information about biological processes or restrictions on the assumptions. Another explanation for the observed coexistence is a slow extinction that takes longer than the observation time; simulated species survive a comparable period of time before they eventually die out. Interestingly, the same models allow the survival of several species in equilibrium and thus do not follow the so-called competitive exclusion principle. This equilibrium is, however, more fragile to changes in nutrient supply than the oscillating coexistence. Overall, the studies show that a diverse system probably has oscillating population numbers, and that, conversely, oscillating population numbers stabilize a food web both against demographic noise and against changes in the habitat. Model predictions can certainly not be converted at face value into policies for real ecosystems, but the stabilizing character of fluctuations should be considered in the regulation of animal populations.
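
A minimal sketch of the kind of sustained oscillations referred to above, using a textbook Rosenzweig-MacArthur predator-prey model with assumed parameters (not one of the thesis' food-web models):

```python
# One prey and one predator with logistic prey growth and a saturating functional
# response; for these parameters the equilibrium is unstable and a limit cycle emerges.
import numpy as np
from scipy.integrate import solve_ivp

r, K = 1.0, 10.0        # prey growth rate and carrying capacity
a, h = 1.0, 0.5         # attack rate and handling time
e, m = 0.5, 0.3         # conversion efficiency and predator mortality

def rhs(t, y):
    prey, pred = y
    feeding = a * prey / (1.0 + a * h * prey)
    return [r * prey * (1.0 - prey / K) - feeding * pred,
            e * feeding * pred - m * pred]

sol = solve_ivp(rhs, (0.0, 200.0), [5.0, 1.0], max_step=0.1)
late = sol.t > 100
print("prey min/max over the last half of the run:",
      sol.y[0, late].min(), sol.y[0, late].max())
# A clear gap between min and max indicates sustained oscillations rather than a fixed point.
```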

Relevance: 10.00%

Abstract:

The dissertation deals with the liability of Internet service providers (ISPs) for third-party copyright infringements, and with the liability privileges granted to them, under German and US law. Internet service providers (i.e. host providers, cache providers, access providers as well as search engine operators and other parties setting hyperlinks) play an important role on the Internet. In Germany, however, ISPs face considerable legal uncertainty despite statutory liability privileges. The dissertation is based on the hypothesis that the courts' expansion of ISP liability has effectively devalued the statutorily stipulated liability privileges. In particular, the non-application of the privileges to injunctive relief, as well as the monitoring duties established within the framework of Störerhaftung (interferer liability), run counter to the legislator's intention (also at the European level). The case law has not achieved a fair balance between the interests of the actors involved. The statutory design of the US liability privilege, by contrast, promises a solution that better accommodates the interests of ISPs, and the application of the law by the US courts is in line with this.