947 results for LAW OF SUCCESSION


Relevance:

90.00%

Publisher:

Abstract:

Araucaria angustifolia, commonly known as Araucaria, is a Brazilian native species that has been intensively exploited for its timber quality and is therefore on the list of species threatened with extinction. Despite the importance of soil for forest production, little is known about the soil properties of the highly fragmented Araucaria forests. This study was designed to investigate the use of chemical and biological properties as indicators of conservation and anthropogenic disturbance of Araucaria forests in different sampling periods. The research was carried out in two state parks of São Paulo: Parque Estadual Turístico do Alto do Ribeira and Parque Estadual de Campos de Jordão. The biochemical properties evaluated were microbial biomass carbon and nitrogen (MB-C and MB-N), basal respiration (BR), the metabolic quotient (qCO2), and the enzyme activities β-glucosidase, urease, and fluorescein diacetate hydrolysis (FDA). The sampling period (dry or rainy season) influenced mainly MB-C, MB-N, BR, and qCO2. The chemical and biochemical properties, except K content, were sensitive indicators of differences in the conservation and anthropogenic disturbance stages of Araucaria forests. Although these forests differ in biochemical and chemical properties, they are efficient in energy use and conservation, which is shown by their low qCO2, suggesting an advanced stage of succession.
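
For reference, the metabolic quotient used above is conventionally computed as basal respiration normalized by microbial biomass carbon; a standard textbook formulation (units are typical values, not taken from the abstract) is

\[
q\mathrm{CO_2} = \frac{\mathrm{BR}}{\mathrm{MB\text{-}C}},
\]

usually expressed as µg CO2-C per unit of microbial biomass carbon per hour, so a low qCO2 means that little CO2 is respired per unit of microbial biomass, i.e. the community uses energy efficiently, which is the basis of the argument for an advanced successional stage.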

Relevance:

90.00%

Publisher:

Abstract:

Many of the discovered exoplanetary systems are involved in mean-motion resonances. In this work we focus on the dynamics of 3:1 mean-motion resonant planetary systems. Our main purpose is to understand the dynamics in the vicinity of the apsidal corotation resonances (ACR), which are stationary solutions of the resonant problem. We apply the semi-analytical method of Michtchenko et al. (2006) to construct the averaged three-body Hamiltonian of a planetary system near a 3:1 resonance. We then obtain the families of ACR, composed of symmetric and asymmetric solutions. Using the stable symmetric solutions we observe the law of structures (Ferraz-Mello, 1988) for different mass ratios of the planets. We also study the evolution of the frequencies of σ1, the resonant angle, and Δω, the secular angle. The resonant domains outside the immediate vicinity of the ACR are studied using dynamical map techniques. We compare the results obtained with planetary systems near the 3:1 MMR, namely 55 Cnc b-c, HD 60532 b-c and Kepler 20 b-c.
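
For orientation, one common convention (an assumption here; the abstract does not spell out its definitions, and normalizations differ between authors) for the critical angles of a pair of planets near the 3:1 commensurability, with λ the mean longitudes and ϖ the longitudes of pericenter (subscript 1 inner, 2 outer), is

\[
\sigma_1 = 3\lambda_2 - \lambda_1 - 2\varpi_1, \qquad \Delta\omega = \varpi_1 - \varpi_2 ,
\]

and apsidal corotation resonances are the solutions of the averaged problem in which both angles remain stationary simultaneously.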

Relevance:

90.00%

Publisher:

Abstract:

Reproducing Fourier's law of heat conduction from a microscopic stochastic model is a long-standing challenge in statistical physics. As was shown by Rieder, Lebowitz and Lieb many years ago, a chain of harmonically coupled oscillators connected to two heat baths at different temperatures does not reproduce the diffusive behaviour of Fourier's law, but instead a ballistic one with an infinite thermal conductivity. Since then, there has been a substantial effort from the scientific community to identify the key mechanism necessary to reproduce such diffusivity, which has usually revolved around anharmonicity and the effect of impurities. Recently, it was shown by Dhar, Venkateshan and Lebowitz that Fourier's law can be recovered by introducing an energy-conserving noise, whose role is to simulate the elastic collisions between the atoms and the other microscopic degrees of freedom one would expect to be present in a real solid. For a one-dimensional chain this is accomplished numerically by randomly flipping the sign of the velocity of an oscillator, in the framework of a Poisson process with a variable "rate of collisions". In this poster we present Langevin simulations of a one-dimensional chain of oscillators coupled to two heat baths at different temperatures. We consider both harmonic and anharmonic (quartic) interactions, which are studied with and without the energy-conserving noise. With these results we are able to map in detail how the heat conductivity k is influenced by both anharmonicity and the energy-conserving noise. We also present a detailed analysis of the behaviour of k as a function of the size of the system and the rate of collisions, which includes a finite-size scaling method that enables us to extract the relevant critical exponents. Finally, we show that for harmonic chains k is independent of temperature, both with and without the noise. Conversely, for anharmonic chains we find that k increases roughly linearly with the temperature of a given reservoir, while the temperature difference is kept fixed.
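
As a concrete illustration of the velocity-flip mechanism described above, the following is a minimal Python sketch (not the authors' code; every parameter value is an illustrative assumption) of a harmonic chain with Langevin baths at its two end particles and an energy-conserving sign-flip noise applied to the bulk:

    import numpy as np

    # Minimal sketch: 1D chain of unit-mass oscillators with nearest-neighbour
    # harmonic springs, Langevin heat baths on the two end particles at TL and TR,
    # and an energy-conserving "velocity flip" noise applied to bulk particles as
    # a Poisson process with rate lam per particle. All values are illustrative.

    rng = np.random.default_rng(0)

    N, k_spring = 32, 1.0          # chain length and spring constant
    TL, TR = 1.5, 0.5              # bath temperatures (k_B = 1)
    gamma, lam = 1.0, 1.0          # bath friction and velocity-flip collision rate
    dt, n_steps = 1e-3, 200_000    # time step and number of steps

    x = np.zeros(N)                # displacements (fixed walls at both ends)
    v = np.zeros(N)

    def forces(x):
        """Harmonic nearest-neighbour forces with fixed boundaries."""
        xp = np.concatenate(([0.0], x, [0.0]))
        return k_spring * (xp[2:] - 2.0 * xp[1:-1] + xp[:-2])

    T_profile = np.zeros(N)        # accumulator for the kinetic temperature profile

    for _ in range(n_steps):
        # Euler-Maruyama step: deterministic forces plus Langevin baths at the ends.
        f = forces(x)
        f[0]  += -gamma * v[0]  + np.sqrt(2.0 * gamma * TL / dt) * rng.standard_normal()
        f[-1] += -gamma * v[-1] + np.sqrt(2.0 * gamma * TR / dt) * rng.standard_normal()
        v += f * dt
        x += v * dt

        # Energy-conserving noise: each bulk particle flips its velocity sign with
        # probability lam*dt in this step (Poisson process, lam*dt << 1).
        flips = rng.random(N - 2) < lam * dt
        v[1:-1][flips] *= -1.0

        T_profile += v**2          # equipartition: <v_i^2> = T_i for unit mass

    T_profile /= n_steps
    print("kinetic temperature profile:", np.round(T_profile, 3))

Because a sign flip leaves v_i^2 unchanged, the noise conserves the energy of the chain while destroying momentum conservation, which is generally regarded as the key ingredient that turns the ballistic transport of the purely harmonic chain into diffusive, Fourier-like transport.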

Relevance:

90.00%

Publisher:

Abstract:

We describe a method for studying the classical experiment of the simple pendulum, consisting of a body of magnetic material oscillating through a thin conducting coil (magnetic pendulum). According to Faraday's law of induction, the moving magnet generates a fluctuating current in the coil that can be converted into a periodic signal on an oscilloscope. The setup described here allows one to study the motion of the pendulum beyond what is normally considered in more basic settings, including a detailed analysis of both small and large oscillations and the determination of the acceleration due to gravity.
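
For reference, the standard textbook relations behind such a measurement (not specific to this particular setup) are Faraday's law for the pick-up coil and the pendulum period used to extract g:

\[
\varepsilon = -N\,\frac{d\Phi_B}{dt}, \qquad T_0 \simeq 2\pi\sqrt{\frac{\ell}{g}} \;\Rightarrow\; g \simeq \frac{4\pi^2 \ell}{T_0^{2}}, \qquad T(\theta_0) \approx T_0\left(1 + \frac{\theta_0^{2}}{16} + \cdots\right),
\]

where N is the number of turns and Φ_B the magnetic flux through the coil, ℓ the pendulum length, T_0 the small-oscillation period read off the oscilloscope trace, and the last expansion gives the leading large-amplitude correction for an initial angle θ_0.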

Relevance:

90.00%

Publisher:

Abstract:

Non-Equilibrium Statistical Mechanics is a broad subject. Roughly speaking, it deals with systems which have not yet relaxed to an equilibrium state, or else with systems which are in a steady non-equilibrium state, or with more general situations. They are characterized by external forcing and internal fluxes, resulting in a net production of entropy which quantifies dissipation and the extent to which, by the Second Law of Thermodynamics, time-reversal invariance is broken. In this thesis we discuss some of the mathematical structures involved with generic discrete-state-space non-equilibrium systems, which we depict with networks entirely analogous to electrical networks. We define suitable observables and derive their linear-regime relationships, we discuss a duality between external and internal observables that reverses the role of the system and of the environment, and we show that network observables serve as constraints for a derivation of the minimum entropy production principle. We dwell on deep combinatorial aspects regarding linear response determinants, which are related to spanning tree polynomials in graph theory, and we give a geometrical interpretation of observables in terms of Wilson loops of a connection and gauge degrees of freedom. We specialize the formalism to continuous-time Markov chains, we give a physical interpretation for observables in terms of locally detailed balanced rates, we prove many variants of the fluctuation theorem, and we show that a well-known expression for the entropy production due to Schnakenberg descends from considerations of gauge invariance, where the gauge symmetry is related to the freedom in the choice of a prior probability distribution. As an additional topic of geometrical flavor related to continuous-time Markov chains, we discuss the Fisher-Rao geometry of nonequilibrium decay modes, showing that the Fisher matrix contains information about many aspects of non-equilibrium behavior, including non-equilibrium phase transitions and superposition of modes. We establish a sort of statistical equivalence principle and discuss the behavior of the Fisher matrix under time-reversal. To conclude, we propose that geometry and combinatorics might greatly increase our understanding of nonequilibrium phenomena.
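
For reference, the Schnakenberg expression mentioned above for the entropy production rate of a continuous-time Markov chain is usually written in the following standard form (quoted from the general literature rather than from the thesis):

\[
\dot{S} = \frac{1}{2} \sum_{i \neq j} \left( w_{ij} p_j - w_{ji} p_i \right) \ln \frac{w_{ij} p_j}{w_{ji} p_i} \;\geq\; 0,
\]

where w_{ij} is the transition rate from state j to state i and p_i the occupation probability of state i; the expression vanishes exactly when detailed balance holds, i.e. when every term w_{ij} p_j - w_{ji} p_i is zero.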

Relevance:

90.00%

Publisher:

Abstract:

In this thesis, we investigated the evaporation of sessile microdroplets on different solid substrates. Three major aspects were studied: the influence of surface hydrophilicity and heterogeneity on the evaporation dynamics for an insoluble solid substrate, the influence of external process parameters and intrinsic material properties on the microstructuring of soluble polymer substrates, and the influence of an increased area-to-volume ratio in a microfluidic capillary, when evaporation is hindered. In the first part, the evaporation dynamics of pure sessile water drops on smooth self-assembled monolayers (SAMs) of thiols or disulfides on gold on mica were studied. With increasing surface hydrophilicity the drop stayed pinned longer. Thus, the total evaporation time of a given initial drop volume was shorter, since the drop surface, through which the evaporation occurs, remained large for longer. Usually, for a single drop undergoing a diffusion-controlled evaporation process, the volume decreased linearly with t^1.5, t being the evaporation time. However, when we measured the total evaporation time, t_tot, for multiple droplets with different initial volumes, V0, we found a scaling of the form V0 = a·t_tot^b. The more hydrophilic the substrate was, the more the scaling exponent b tended toward increased values, up to 1.6. This can be attributed to an increasing evaporation rate through a thin water layer in the vicinity of the drop. Under the assumption of a constant temperature at the substrate surface, cooling of the droplet, and thus a decreased evaporation rate, could be excluded as a reason for the different scaling exponent by simulations performed by F. Schönfeld at the IMM, Mainz. In contrast, for a hairy surface, made of dialkyldisulfide SAMs with different chain lengths and a 1:1 mixture of hydrophilic and hydrophobic end groups (hydroxy versus methyl group), the scaling exponent was found to be ~1.4. It increased to ~1.5 with increasing hydrophilicity. A reason for this observation can only be speculated upon: in the case of longer hydrophobic alkyl chains the formation of an air layer between the substrate and the drop might be favorable. Thus, the heat transport to the substrate might be reduced, leading to stronger cooling and thus a decreased evaporation rate. In the second part, the microstructuring of polystyrene surfaces by drops of toluene, a good solvent, was investigated. For this, a novel deposition technique was developed, with which the drop can be deposited with a syringe. The polymer substrate lies on a motorized table, which picks up the pendant drop by an upward motion until a liquid bridge is formed. A subsequent downward motion of the table after a variable delay, i.e. the contact time between drop and polymer, leads to the deposition of the droplet, which can then evaporate. The resulting microstructure was investigated as a function of the process parameters, i.e. the approach and retraction speeds of the substrate and the delay between them, and of the intrinsic material properties, i.e. the molar mass and the type of the polymer/solvent system. The principal equivalence with microstructuring by the ink-jet technique was demonstrated. For a high approach and retraction speed of 9 mm/s and no delay between them, a concave microtopology was observed. In agreement with the literature, this can be explained by a flow of solvent and the dissolved polymer to the rim of the pinned droplet, where polymer accumulates.
This effect is analogous to the well-known formation of ring-like stains after the evaporation of coffee drops (the coffee-stain effect). With decreasing retraction speed, down to 10 µm/s, the resulting surface topology changes from concave to convex. This can be explained by the increasing dissolution of polymer into the solvent drop prior to evaporation. If the polymer concentration is high enough, gelation occurs instead of a flow to the rim and the shape of the convex droplet is retained. With increasing delay time, from 0 ms to 1 s, the depth of the concave microwells decreases from 4.6 µm to 3.2 µm. However, a convex surface topology could not be obtained, since for longer delay times the polymer sticks to the tip of the syringe. Thus, by changing the delay time a fine-tuning of the concave structure is accomplished, while by changing the retraction speed a principal change of the microtopology can be achieved. We attribute this to an additional flow inside the liquid bridge, which enhances polymer dissolution. Even if the pendant drop evaporates about 30 µm above the polymer surface without any contact (non-contact mode), concave structures were observed. Rim heights as high as 33 µm could be generated for exposure times of 20 min. The concave structure lay exclusively above the flat polymer surface outside the structure, even after drying. This shows that toluene is taken up permanently. The increase of the rim height, r_h, with increasing exposure time to the solvent vapor obeys a diffusion law of the form r_h = r_h0·t^n, with n in the range of 0.46 to 0.65. This hints at a non-Fickian swelling process. A detailed analysis showed that the rim height of the concave structure is modulated, unlike for the drop deposition. This is due to local stress relaxation, which is initiated by the increasing toluene concentration in the extruded polymer surface. By altering the intrinsic material parameters, i.e. the polymer molar mass and the polymer/solvent combination, several types of microstructures could be formed. With increasing molar mass, from 20.9 kDa to 1.44 MDa, the resulting microstructure changed from convex, to a structure with a dimple in the center, to concave, and finally to an irregular structure. This observation can be explained if one assumes that the microstructuring is dominated by two opposing effects: a decreasing solubility with increasing polymer molar mass, but an increasing surface tension gradient leading to instabilities of the Marangoni type. Thus, a polymer with a low molar mass close to or below the entanglement limit is subject to a high dissolution rate, which leads to fast gelation compared to the evaporation rate. This way a coffee-rim-like effect is eliminated early and a convex structure results. For high molar masses the low dissolution rate and the low polymer diffusion might lead to increased surface tension gradients, and a typical local pile-up of polymer is found. For intermediate polymer masses around 200 kDa, the dissolution and evaporation rates are comparable and the typical concave microtopology is found. This interpretation was supported by a quantitative estimation of the diffusion coefficient and the evaporation rate. For a different polymer/solvent system, poly(ethyl methacrylate) (PEMA)/ethyl acetate (EA), exclusively concave structures were found. Following the reasoning above, this can be interpreted as a lower dissolution rate. At low molar masses the concentration of PEMA in EA most likely never reaches the gelation point.
Thus, a concave instead of a convex structure occurs. At the end of this section, the optical properties of such microstructures for a potential application as microlenses are studied with laser scanning confocal microscopy. In the third part, the droplet was confined in a glass microcapillary to avoid evaporation. Since here, due to the increased area-to-volume ratio, the surface properties of the liquid and the solid walls become important, the influence of the surface hydrophilicity of the wall on the interfacial tension between two immiscible liquid slugs was investigated. For this, a novel method for measuring the interfacial tension between the two liquids within the capillary was developed. The technique was demonstrated by measuring the interfacial tensions between slugs of pure water and standard solvents. For toluene, n-hexane and chloroform, 36.2, 50.9 and 34.2 mN/m were measured at 20 °C, in good agreement with data from the literature. For a slug of hexane in contact with a slug of pure water containing ethanol in a concentration range between 0 and 70 v/v%, a difference of up to 6 mN/m was found when compared to commercial ring tensiometry. This discrepancy is still under debate.
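
As a rough consistency check for the first part (an assumption-based sketch, not taken from the thesis): for purely diffusion-limited evaporation at constant contact angle, the loss rate scales with the drop's linear size, so that

\[
\frac{dV}{dt} = -c\,V^{1/3} \;\;\Longrightarrow\;\; V^{2/3}(t) = V_0^{2/3} - \tfrac{2c}{3}\,t \;\;\Longrightarrow\;\; t_{\mathrm{tot}} = \frac{3}{2c}\,V_0^{2/3},
\]

i.e. V_0 ∝ t_tot^{3/2}. This is where the reference value b ≈ 1.5 of the scaling exponent comes from; pinning and the additional evaporation through a thin water film near the contact line modify this exponent, as described above.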

Relevance:

90.00%

Publisher:

Abstract:

A temperature-measurement setup was developed to investigate effects occurring during laser heating of polymers. The measurement principle is based on the evaluation of the thermal emission. The setup consists of a high-resolution camera equipped with an image intensifier, interference filters to provide spectral resolution, and a pulsed NIR heating laser. The laser pulse duration is on the order of 10 µs and, with appropriate focusing, the beam diameter is on the order of 10 µm. By fitting Planck's radiation law to the recorded thermal emission, 2D temperature maps could be obtained. A spatial resolution of 1 µm and a temporal resolution of 1 µs were achieved. In combination with finite-element simulations, this setup was used to investigate the laser ablation of various polymers. It was found that for polymers with a glass transition in the temperature range between room temperature and the decomposition temperature, photomechanical ablation took place. For these polymers, the ablation threshold lay several tens of kelvin above the glass transition, far below the decomposition temperature obtained from thermogravimetric experiments with typical heating rates of 10 K/min. At high laser energies and the associated high temperatures, on the other hand, thermal decomposition was observed. A transition of the mechanism from photomechanical ablation to ablation by thermal decomposition occurred at temperatures well above the polymer's decomposition temperature from thermogravimetry. This was caused by the short reaction times of the laser experiment, on the order of the pulse duration, and is consistent with the Arrhenius law. Polymers without a glass transition in the heated temperature range, in contrast, showed no photomechanical ablation but exclusively thermal decomposition. Here too, the ablation threshold lay at higher temperatures, in accordance with the Arrhenius law. Furthermore, high temperatures, several hundred kelvin above the decomposition temperature, were reached at high laser energies. A drastic overheating of the polymer, as described in the literature, was not observed. Rather, the experimental findings indicate that the hot material consisted of thermal decomposition products, polymer fragments, monomer, and decomposition products of the monomer, or that the temperature profile of the decomposition reaction itself was visualized.
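
For reference, the two standard relations invoked above (quoted from general physics, not from the thesis) are Planck's radiation law, which is fitted to the spectrally filtered thermal emission to obtain a temperature for each pixel, and the Arrhenius law, which explains why shorter heating times shift the effective decomposition temperature upward:

\[
B_\lambda(\lambda, T) = \frac{2hc^{2}}{\lambda^{5}}\,\frac{1}{\exp\!\left(\frac{hc}{\lambda k_{\mathrm{B}} T}\right) - 1}, \qquad k(T) = A\,e^{-E_A/(RT)},
\]

where B_λ is the spectral radiance, k(T) the decomposition rate constant and E_A the activation energy. At pulse durations of roughly 10 µs the reaction has far less time to proceed than in a thermogravimetric run at 10 K/min, so decomposition is only observed well above the conventionally measured decomposition temperature.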

Relevance:

90.00%

Publisher:

Abstract:

Today, crude oil remains a vital resource all around the world. This non-renewable resource powers countries worldwide. Besides serving as an energy source, crude oil is also the most important component for different world economies, especially in developing countries. Ecuador, a small member of the OPEC oil cartel, presents a case where its economy is oil dependent. A great percentage of the country's GDP and government's budget comes from oil revenues. Ecuador has always been a primary exporter of raw materials. In the last centuries, the country experienced three important economic booms: cacao, bananas, and, ultimately, crude oil. In this sense, the country has not been able to fully industrialize and begin to export manufactured goods, i.e., Ecuador suffers from the Dutch disease. The latter has deterred Ecuador from achieving broad-based economic development. Given crude oil's importance for the Ecuadorian economy, the government has always tried to influence the oil industry in search of profits and benefits. Therefore, this thesis explores the question: how and to what extent have political interventions affected the oil industry in Ecuador from 1990 until March 2014? In general, this thesis establishes an economic history context during the last twenty-four years, attempting to research how political interventions have shaped Ecuador's oil industry and economy. In the analysis, it covers a period where political instability prevailed, until Rafael Correa became president. The thesis examines Ecuador's participation in OPEC, trying to find explanations as to why the country voluntarily left the organization in 1992, only to rejoin in 2007 when Correa rose to power. During the "Revolución Ciudadana" period, the thesis researches reforms to the Law of Hydrocarbons, variations in the relations with other nations, the controversy surrounding the Yasuní-ITT oil block, and the "Refinería del Pacífico" construction. The thesis is a detailed Industrial Organization case study that analyzes, updates, and evaluates the intersection of economics and politics in Ecuador's crude oil industry during the last 24 years. In this sense I have consulted past theses, newspaper articles, books, and other published data about the petroleum industry, both from a global and an Ecuadorian perspective. In addition to published sources, I was able to interview sociologists, public figures, history and economics academics, and other experts, accessing unique unpublished data about Ecuador's oil industry. I made an effort to collect information that shows the private and public sides of the industry, i.e., from government-related and independent sources. I attempted to remain as objective as possible to make conclusions about the appropriate Industrial Organization policy for Ecuador's oil industry, addressing the issue from an economic, social, political, and environmental point of view. I found that Ecuador's political instability caused public policy to fail, molding the conduct and market structure of the crude oil industry. Throughout history, developed nations have benefited from low oil prices, but things shifted once oil prices began to rise, which is more beneficial for the developing nations that actually possess and produce the raw material. Nevertheless, Ecuador, a victim of the Dutch disease due to its heavy reliance on crude oil as a primary product, has not achieved broad-based development.

Relevance:

90.00%

Publisher:

Abstract:

Theoretical studies of the problems of the securities markets in the Russian Federation incline to one or the other of two traditional approaches. The first consists of comparing the definition of "valuable paper" set forth in the current legislation of the Russian Federation with the theoretical model of "Wertpapiere" elaborated by German scholars more than 90 years ago. The problem with this approach is, in Mr. Pentsov's opinion, that any new features of the definition of "security" that do not coincide with the theoretical model of "Wertpapiere" (such as valuable papers existing in non-material, electronic form) are claimed to be incorrect and removed from the current legislation of the Russian Federation. The second approach works on the basis of the differentiation between the Common Law concept of "security" and the Civil Law concept of "valuable paper". Mr. Pentsov's research, presented in an article written in English, uses both methodological tools and involves, firstly, a historical study of the origin and development of certain legal phenomena (securities) as they evolved in different countries, and secondly, a comparative, synchronic study of equivalent legal phenomena as they exist in different countries today. Employing the first method, Mr. Pentsov divided the historical development of the conception of "valuable paper" in Russia into five major stages. He found that, despite the existence of a relatively wide circulation of valuable papers, especially in the second half of the 19th century, Russian legislation before 1917 (the first stage) did not have a unified definition of valuable paper. The term was used, in both theoretical studies and legislation, but it covered a broad range of financial instruments such as stocks, bonds, government bonds, promissory notes, bills of exchange, etc. During the second stage, likewise, the legislation of the USSR did not have a unified definition of "valuable paper". After the end of the "new economic policy" (1922-1930) the stock exchanges and the securities markets in the USSR, with very few exceptions, were abolished. Thus, during the third stage (up to 1985), the use of valuable papers in practice was reduced to foreign economic relations (bills of exchange, stocks in enterprises outside the USSR) and to state bonds. Not surprisingly, there was still no unified definition of "valuable paper". After the beginning of Gorbachev's perestroika, a securities market began to re-appear in the USSR. However, the successful development of securities markets in the USSR was hampered by the absence of an appropriate regulatory framework. The first effort to improve the situation was the adoption of the Regulations on Valuable Papers, approved by resolution No. 590 of the Council of Ministers of the USSR, dated June 19, 1990. Section 1 of the Regulation contained the first statutory definition of "valuable paper" in the history of Russia. At the very beginning of the period of transition to a market economy, a number of acts contained different definitions of "valuable paper". This diversity clearly undermined the stability of the Russian securities market and did not achieve the goal of protecting the investor. The lack of unified criteria for the consideration of such non-standard financial instruments as "valuable papers" significantly contributed to the appearance of numerous fraudulent "pyramid" schemes that were outside the regulatory scheme of Russian legislation.
The situation was substantially improved by the adoption of the new Civil Code of the Russian Federation. According to Section 1 of Article 142 of the Civil Code, a valuable paper is a document that confirms, in compliance with an established form and mandatory requisites, certain material rights whose realisation or transfer are possible only in the process of its presentation. Finally, the recent Federal Law No. 39-FZ "On the Valuable Papers Market", dated April 22, 1996, has also introduced the term "emission valuable papers". According to Article 2 of this Law, an "emission valuable paper" is any valuable paper, including non-documentary, that simultaneously has the following features: it fixes the composition of material and non-material rights that are subject to confirmation, cession and unconditional realisation in compliance with the form and procedure established by this federal law; it is placed by issues; and it has equal amount and time of realisation of rights within the same issue regardless of when the valuable paper was purchased. Thus the introduction of the conception of "emission valuable paper" became the starting point in the Russian Federation's legislation for a differentiation between the legal regimes of "commercial papers" and "investment papers" similar to the Common Law approach. Moving now to the synchronic, comparative method of research, Mr. Pentsov notes that there are currently three major conceptions of "security" and, correspondingly, three approaches to its legal definition: the Common Law concept, the Continental Law concept, and the concept employed by Japanese Law. Mr. Pentsov proceeds to analyse the differences and similarities of all three, concluding that though the concept of "security" in the Common Law system substantially differs from that of "valuable paper" in the Continental Law system, the two concepts are nevertheless developing in similar directions. He predicts that in the foreseeable future the existing differences between these two concepts will become less and less significant. On the basis of his research, Mr. Pentsov arrived at the conclusion that the concept of "security" (and its equivalents) is not a static one. On the contrary, it is in a process of permanent evolution that reflects the introduction of new financial instruments onto the capital markets. He believes that the scope of the statutory definition of "security" plays an extremely important role in the protection of investors. In passing the Securities Act of 1933, the United States Congress determined that the best way to achieve the goal of protecting investors was to define the term "security" in sufficiently broad and general terms so as to include within the definition the many types of instruments that in the commercial world fall within the ordinary concept of "security" and to cover the countless and various devices used by those who seek to use the money of others on the promise of profits. On the other hand, the very limited scope of the current definition of "emission valuable paper" in the Federal Law of the Russian Federation entitled "On the Valuable Papers Market" does not allow the anti-fraud provisions of this law to be implemented in an efficient way. Consequently, there is no basis for the protection of investors. Mr. Pentsov proposes amendments which he believes would enable the Russian markets to become more efficient and attractive for both foreign and domestic investors.

Relevance:

90.00%

Publisher:

Abstract:

The political philosophy underpinning the Indian Constitution is that of a socialist economy in a multilingual political landscape. The Constitution grants certain fundamental rights to all citizens regarding language, and to linguistic and other minorities regarding education. It also obligates states to use many languages in school education. The restructuring of the economy with the free market as its pivot, and the growing dominance of English in the information-driven global economy, give rise to policy changes in language use in education which undermine the Constitutional provisions relating to language, though these changes reflect the manufactured consent of the citizens. This is made possible by the way the Constitution is interpreted by courts with regard to the fundamental rights of equality and non-discrimination when they apply to language. The unique property of language, that it can be acquired, unlike other primordial attributes such as ethnicity or caste, comes into play in this interpretation. The result is that the law of the market takes over the law of the land.

Relevance:

90.00%

Publisher:

Abstract:

The following comparison was written for the first meeting of the International Law Association's newly established (2010) Committee on Intellectual Property and Private International Law (Chair: Professor Toshiyuki Kono, Kyushu University; Co-Rapporteurs: Professors Pedro de Miguel Asensio, Madrid Complutense University, and Axel Metzger, Hannover University) (hereinafter: ILA Committee), which was hosted at the Faculty of Law of the University of Lisbon on March 16-17, 2012. The comparison at stake concerns the rules on infringement and exclusive (subject-matter) jurisdiction posed (or rejected, in the case of exclusive jurisdiction) by four sets of academic principles. Notwithstanding the fact that the rules in question present several differences, those differences could in the majority of cases be overcome by further studies and work of the ILA Committee, as the following comparison explains.

Relevance:

90.00%

Publisher:

Abstract:

The paper discusses the phenomenon of injunctions against third parties that are innocent from the tort law perspective. One such type of injunction, website blocking, is currently appearing in the spotlight across various European jurisdictions as a consequence of the implementation of Article 8(3) of the Information Society Directive and Article 11 of the Enforcement Directive. Website-blocking injunctions are used in this paper only as a vivid and perhaps also canonical example of the paradigmatic shift we are facing: the shift from tort-law-centric injunctions to in rem injunctions. The author of this paper maintains that the theoretical framework for the latter injunctions is not in the law of civil wrongs, but in an old Roman law concept of ‘in rem actions’ (actio in rem negatoria). Thus the term ‘in rem injunctions’ is coined to describe this paradigm of injunctions. Besides the theoretical foundations, this paper explains how a system of injunctions against innocent third parties fits into the private law regulation of negative externalities of online technology and explores the expected dangers of derailing injunctions from the tracks of tort law. The author’s PhD project – the important question of the justification of an extension of the intellectual property entitlements by the in rem paradigm, along with its limits or other solutions – is left out of the paper.

Relevance:

90.00%

Publisher:

Abstract:

The first section of this chapter starts with the Buffon problem, which is one of the oldest in stochastic geometry, and then continues with the definition of measures on the space of lines. The second section defines random closed sets and related measurability issues, explains how to characterize distributions of random closed sets by means of capacity functionals and introduces the concept of a selection. Based on this concept, the third section starts with the definition of the expectation and proves its convexifying effect that is related to the Lyapunov theorem for ranges of vector-valued measures. Finally, the strong law of large numbers for Minkowski sums of random sets is proved and the corresponding limit theorem is formulated. The chapter is concluded by a discussion of the union-scheme for random closed sets and a characterization of the corresponding stable laws.
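
For reference, the strong law of large numbers mentioned here is, in its classical form for i.i.d. integrably bounded random compact sets X_1, X_2, ... (the Artstein–Vitale theorem, quoted from the general literature rather than from the chapter itself):

\[
\frac{1}{n}\,(X_1 \oplus X_2 \oplus \cdots \oplus X_n) \;\longrightarrow\; \mathbb{E}[X_1] \qquad \text{almost surely in the Hausdorff metric as } n \to \infty,
\]

where ⊕ denotes Minkowski addition and E[X_1] is the selection (Aumann) expectation, which is a convex set even when X_1 itself is not; this is the convexifying effect referred to above.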

Relevance:

90.00%

Publisher:

Abstract:

Species coexistence has been a fundamental issue in understanding ecosystem functioning since the beginnings of ecology as a science. The search for a reliable and all-encompassing explanation for this issue has become a complex goal with several apparently opposing trends. On the other hand, and seemingly unconnected with species coexistence, an ecological state equation based on the inverse correlation between species diversity and an indicator of dispersal that fits a gamma distribution has recently been developed. This article explores two factors whose effects are inconspicuous in such an equation at first sight, and uses them to develop an alternative general theoretical background in order to provide a better understanding of species coexistence. Our main outcomes are: (i) the fit of dispersal and diversity values to a gamma distribution is an important factor that promotes species coexistence, mainly due to the right-skewed character of the gamma distribution; (ii) the inverse correlation between species diversity and dispersal implies that any increase of diversity is equivalent to a route of "ecological cooling" whose maximum limit should be constrained by the influence of the third law of thermodynamics; this is in agreement with the well-known asymptotic trend of diversity values in space and time; (iii) there are plausible empirical and theoretical ways to apply physical principles to explain important ecological processes; (iv) the gap between theoretical and empirical ecology in those cases where species diversity is paradoxically high could be narrowed by a wave model of species coexistence based on the concurrency of local equilibrium states. In such a model, competitive exclusion has a limited but indispensable role in harmonious coexistence with functional redundancy. We analyze several literature references as well as ecological and evolutionary examples that support our approach, reinforcing the equivalence in meaning between important physical and ecological principles.
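
For the role of right-skewness invoked in point (i), recall the standard gamma density and its skewness (textbook facts, not results of the article):

\[
f(x; k, \theta) = \frac{x^{k-1} e^{-x/\theta}}{\Gamma(k)\,\theta^{k}} \quad (x > 0), \qquad \text{skewness} = \frac{2}{\sqrt{k}} > 0,
\]

so a dispersal indicator that fits a gamma distribution is always right-skewed, with the asymmetry most pronounced for small values of the shape parameter k.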

Relevance:

90.00%

Publisher:

Abstract:

Spike timing dependent plasticity (STDP) is a phenomenon in which the precise timing of spikes affects the sign and magnitude of changes in synaptic strength. STDP is often interpreted as the comprehensive learning rule for a synapse, the "first law" of synaptic plasticity. This interpretation is made explicit in theoretical models in which the total plasticity produced by complex spike patterns results from a superposition of the effects of all spike pairs. Although such models are appealing for their simplicity, they can fail dramatically. For example, the measured single-spike learning rule between hippocampal CA3 and CA1 pyramidal neurons does not predict the existence of long-term potentiation, one of the best-known forms of synaptic plasticity. Layers of complexity have been added to the basic STDP model to repair predictive failures, but they have been outstripped by experimental data. We propose an alternate first law: neural activity triggers changes in key biochemical intermediates, which act as a more direct trigger of plasticity mechanisms. One particularly successful model uses intracellular calcium as the intermediate and can account for many observed properties of bidirectional plasticity. In this formulation, STDP is not itself the basis for explaining other forms of plasticity, but is instead a consequence of changes in the biochemical intermediate, calcium. Eventually, a mechanism-based framework for learning rules should include other messengers, discrete changes at individual synapses, the spread of plasticity among neighboring synapses, and the priming of hidden processes that change a synapse's susceptibility to future change. Mechanism-based models provide a rich framework for the computational representation of synaptic plasticity.
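
To make concrete the pair-superposition model that the abstract argues against, here is a minimal Python sketch of the standard additive pair-based STDP rule with exponential timing windows (a textbook formulation with illustrative parameter values, not the learning rule of any specific synapse discussed above):

    import numpy as np

    # Pair-based additive STDP: every (pre, post) spike pair contributes
    #   dw = +A_plus  * exp(-dt/tau_plus)   if the post spike follows the pre spike (dt > 0)
    #   dw = -A_minus * exp(+dt/tau_minus)  if the post spike precedes the pre spike (dt < 0)
    # Total plasticity is the superposition (sum) over all pairs.
    # Parameter values below are illustrative assumptions.

    A_plus, A_minus = 0.01, 0.012      # amplitudes of potentiation / depression
    tau_plus, tau_minus = 20.0, 20.0   # time constants in ms

    def pair_based_stdp(pre_spikes, post_spikes):
        """Return the total weight change predicted by all-to-all pair superposition."""
        dw = 0.0
        for t_pre in pre_spikes:
            for t_post in post_spikes:
                dt = t_post - t_pre        # ms; positive if post follows pre
                if dt > 0:
                    dw += A_plus * np.exp(-dt / tau_plus)
                elif dt < 0:
                    dw -= A_minus * np.exp(dt / tau_minus)
        return dw

    # Example: a regular 10 Hz presynaptic train paired with post spikes 5 ms later
    pre = np.arange(0.0, 1000.0, 100.0)
    post = pre + 5.0
    print("predicted total weight change:", round(pair_based_stdp(pre, post), 4))

A calcium-based alternative of the kind proposed above would instead compute a calcium-like intermediate variable from the same spike trains and derive potentiation or depression from thresholds on that variable, rather than summing pairwise timing contributions.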