840 results for Economics, Mathematical
Abstract:
Global power supply stability faces several severe and fundamental threats, in particular steadily increasing power demand, diminishing and degrading fossil and nuclear energy resources, very harmful greenhouse gas emissions, significant energy injustice and a structurally imbalanced ecological footprint. Photovoltaic (PV) power systems are analysed in several respects, focusing on economic and technical considerations of supplemental and substitutional power supply to the constrained conventional power system. To identify the most relevant system approach for PV power plants, the solar resources available to PV systems are compared. By combining the different solar resources and the respective economics, two major PV system types are identified as very competitive in almost all regions of the world. The experience curve concept is used as a key technique for developing scenario assumptions on economic projections for the 2010s. The main drivers of cost reduction in PV systems are the learning rate and the production growth rate; therefore several relevant aspects are discussed, such as research and development investments, the technical PV market potential, different PV technologies and the energetic sustainability of PV. Three major market segments for PV systems are identified: off-grid PV solutions, decentralised small-scale on-grid PV systems (several kWp) and large-scale PV power plants (tens of MWp). Mainly by applying the ‘grid-parity’ and ‘fuel-parity’ concepts on a per country, local market and conventional power plant basis, the global economic market potential for all major PV system segments is derived. The PV hybridization potential of all relevant power technologies and the global power plant structure are analysed regarding technical, economic and geographical feasibility. Key success criteria for hybrid PV power plants are discussed and comprehensively analysed for all suitable power plant technologies, i.e. oil-, gas- and coal-fired power plants, wind power, solar thermal power (STEG) and hydro power plants. For the 2010s, detailed global demand curves are derived for hybrid PV-Fossil power plants on a per power plant, per country and per fuel type basis. The fundamental technical and economic potentials for hybrid PV-STEG, hybrid PV-Wind and hybrid PV-Hydro power plants are considered. The global resource availability for PV and wind power plants is excellent; hence, knowing whether hybrid PV-Wind power plants are competitive or complementary on a local basis is of utmost relevance. The complementarity of hybrid PV-Wind power plants is confirmed. Consequently, almost no reduction of the global economic PV market potential needs to be expected, and more complex power system designs based on hybrid PV-Wind power plants are feasible. The final target of implementing renewable power technologies in the global power system is a nearly 100% renewable power supply. Besides balancing facilities, storage options are needed, in particular for seasonal power storage. Renewable power methane (RPM) offers such an option. A comprehensive global and local analysis of a hybrid PV-Wind-RPM combined cycle gas turbine power system is performed. Such a power system design might be competitive and could offer solutions for nearly all current energy system constraints, including the heating and transportation sectors and even the chemical industry.
Summing up, hybrid PV power plants become very attractive, and PV power systems will very likely evolve, together with wind power, into the major and final source of energy for mankind.
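As a rough illustration of the experience curve concept used above (a minimal sketch; the specific learning rates assumed in the thesis scenarios are not stated in this abstract), the unit cost after a cumulative production x follows
\[
C(x) = C_0 \left(\frac{x}{x_0}\right)^{-b}, \qquad \mathrm{LR} = 1 - 2^{-b},
\]
i.e. each doubling of cumulative production lowers cost by the learning rate LR; PV module learning rates of roughly 20 %, as often cited in the literature, correspond to b ≈ 0.32.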
Abstract:
Since the establishment of the first exchanges as marketplaces for fungible goods, market participants and researchers have sought explanations for how market prices come about. Over time, various models have been developed, first and foremost the neoclassical Capital Asset Pricing Model (CAPM). Neoclassical theory views the actor in financial markets as an emotionless and strictly rational decision-maker, the so-called homo oeconomicus; psychological factors in price formation are left out of consideration. With behavioural finance, a new branch has developed for explaining stock prices and their movements. Behavioural finance breaks with the narrow neoclassical view and assumes that psychological effects influence the decisions of financial actors and thereby lead to price changes that are partly irrational and emotionally driven. One of the main problems of behavioural finance, however, lies in the lack of formal measurability and testability of the individual psychological effects. Unlike the CAPM, whose parameters can be determined mathematically in a clear-cut way, behavioural finance essentially consists of psychological definitions of price-influencing effects; in the absence of suitable models, the exact direction and intensity of these effects cannot be determined. The aim of this thesis is to derive a modification of the CAPM that makes it possible to complement neoclassical assumptions with the findings of behavioural finance. By means of the technical analysis of market prices, an attempt is made to render the effects of behavioural finance formally representable and computable. Practitioners use technical analysis to infer the moods and intentions of market participants from price histories; a scientific foundation has so far been lacking. Building on the findings of behavioural finance and technical analysis, the classical CAPM is extended by psychological factors: a multi-beta CAPM (the Behavioral-Finance-CAPM) is defined, into which psychologically founded parameters of technical analysis enter. Following the CAPM test of FAMA and FRENCH (1992), the classical CAPM and the Behavioral-Finance-CAPM are tested and the psychological explanatory power of technical analysis is examined. Over the period under investigation, the Behavioral-Finance-CAPM shows a considerably higher explanatory power than the classical CAPM.
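For reference, the classical CAPM relation and a generic multi-beta extension of the kind described above read as follows (a hedged sketch; the concrete, psychologically motivated factors from technical analysis used in the thesis are not specified in this abstract):
\[
\mathbb{E}[R_i] - R_f = \beta_i \left(\mathbb{E}[R_m] - R_f\right), \qquad \beta_i = \frac{\operatorname{Cov}(R_i, R_m)}{\operatorname{Var}(R_m)},
\]
\[
\mathbb{E}[R_i] - R_f = \sum_{k} \beta_{i,k}\, \lambda_k ,
\]
where in the multi-beta form each λ_k is the risk premium of factor k and, in a Behavioral-Finance-CAPM, some of the factors would be indicator series derived from technical analysis.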
Abstract:
Since no physical system can ever be completely isolated from its environment, the study of open quantum systems is pivotal for the reliable and accurate control of complex quantum systems. In practice, the reliability of the control field needs to be confirmed via certification of the target evolution, while accuracy requires the derivation of high-fidelity control schemes in the presence of decoherence. In the first part of this thesis an algebraic framework is presented that allows the minimal requirements for the unique characterisation of arbitrary unitary gates in open quantum systems to be determined, independently of the particular physical implementation of the employed quantum device. To this end, a set of theorems is devised that can be used to assess whether a given set of input states to a quantum channel is sufficient to judge whether a desired unitary gate is realised. This allows the minimal input for such a task to be determined, which proves to be, quite remarkably, independent of system size. These results elucidate the fundamental limits regarding certification and tomography of open quantum systems. Combining these insights with state-of-the-art Monte Carlo process certification techniques permits a significant improvement in scaling when certifying arbitrary unitary gates. This improvement is not restricted to quantum information devices whose basic information carrier is the qubit; it also extends to systems whose fundamental information carriers can be of arbitrary dimensionality, the so-called qudits. The second part of this thesis concerns the impact of these findings from the point of view of Optimal Control Theory (OCT). OCT for quantum systems utilises concepts from engineering, such as feedback and optimisation, to engineer constructive and destructive interferences in order to steer a physical process in a desired direction. It turns out that the aforementioned mathematical findings allow novel optimisation functionals to be deduced that significantly reduce not only the memory required by numerical control algorithms but also the total CPU time required to obtain a certain fidelity for the optimised process. The thesis concludes by discussing two problems of fundamental interest in quantum information processing from the point of view of optimal control: the preparation of pure states and the implementation of unitary gates in open quantum systems. For both cases specific physical examples are considered: for the former, the vibrational cooling of molecules via optical pumping, and for the latter, a superconducting phase qudit implementation. In particular, it is illustrated how features of the environment can be exploited to reach the desired targets.
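As a hedged illustration of the kind of figure of merit involved (not necessarily the exact functional used in the thesis), the agreement between the channel \(\mathcal{E}\) actually implemented by a device and a target unitary \(U\) is commonly quantified by the average gate fidelity
\[
F_{\mathrm{avg}}(\mathcal{E}, U) = \int \mathrm{d}\psi \, \langle \psi | U^{\dagger}\, \mathcal{E}\!\left(|\psi\rangle\langle\psi|\right) U | \psi \rangle ,
\]
with the integral taken over the Haar measure on pure states; certification then asks how few input states suffice to estimate or bound such a quantity.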
Abstract:
The impact of two crop planting methods and of the application of cyanobacterial inoculants on plant growth, yield, water productivity and the economics of rice cultivation was evaluated in a split-plot experiment during the rainy season of 2011 in New Delhi, India. Conventional transplanting and the system of rice intensification (SRI) were tested as the two planting methods, and seven treatments involving cyanobacterial inoculants and compost were applied with three replications each. Results revealed no significant differences in plant performance and crop yield between the two planting methods. However, the application of biofilm-based BGA biofertiliser + 2/3 N had an overall positive impact on plant performance (plant height, number of tillers), crop yield (number and weight of panicles) and grain and straw yield. A higher net return and a higher benefit-cost ratio were observed in rice fields under the SRI planting method, with the application of BGA + PGPR + 2/3 N yielding the highest values. Total water productivity and irrigation water productivity were significantly higher under SRI practices (5.95 and 3.67 kg ha^(-1) mm^(-1), respectively) than under conventional transplanting (3.36 and 2.44 kg ha^(-1) mm^(-1)), meaning that with the SRI method a water saving of about 34 % could be achieved and significantly less water was required to produce one kg of rice. This study showed that plant growth-promoting rhizobacteria (PGPR) in combination with BGA and a 2/3 dose of mineral N fertiliser can support crop growth performance and crop yields while reducing the overall production cost; this practice should therefore be used in the integrated nutrient management of rice fields in India.
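For orientation, water productivity here is the ratio of yield to water input; a rough consistency check of the reported irrigation figures (a sketch assuming approximately equal yields under both methods, as reported) gives
\[
\mathrm{WP} = \frac{Y\ [\mathrm{kg\,ha^{-1}}]}{W\ [\mathrm{mm}]}, \qquad
\frac{W_{\mathrm{SRI}}}{W_{\mathrm{conv}}} = \frac{Y_{\mathrm{SRI}}}{Y_{\mathrm{conv}}} \cdot \frac{\mathrm{WP}_{\mathrm{conv}}}{\mathrm{WP}_{\mathrm{SRI}}} \approx \frac{2.44}{3.67} \approx 0.66 ,
\]
which is in line with the reported irrigation water saving of about 34 %.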
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified within the algebraic structure of A2(P), as can their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, rather elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, the combination of likelihoods and robust M-estimation functions are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and is shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangent geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
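For orientation, the standard Aitchison-geometry objects referred to above are, for a D-part composition x with geometric mean g(x) (the A2(P) construction replaces the finite sum by an integral against P):
\[
\operatorname{clr}(x)_i = \ln \frac{x_i}{g(x)}, \qquad
\langle x, y \rangle_A = \sum_{i=1}^{D} \operatorname{clr}(x)_i \, \operatorname{clr}(y)_i , \qquad
d_A(x, y) = \left\lVert \operatorname{clr}(x) - \operatorname{clr}(y) \right\rVert_2 .
\]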
Abstract:
What kind of science is appropriate for understanding Facebook? How does Google find what you're looking for, and exactly how does it make money doing so? What structural properties might we expect any social network to have? How does your position in an economic network (dis)advantage you? How are individual and collective behavior related in complex networks? What might we mean by the economics of spam? What do game theory and the Paris subway have to do with Internet routing? Networked Life looks at how our world is connected -- socially, economically, strategically and technologically -- and why it matters. The answers to the questions above are related. They have been the subject of a fascinating intersection of disciplines, including computer science, physics, psychology, mathematics, economics and finance. Researchers from these areas all strive to quantify and explain the growing complexity and connectivity of the world around us, and they have begun to develop a rich new science along the way. Networked Life will explore recent scientific efforts to explain social, economic and technological structures -- and the way these structures interact -- on many different scales, from the behavior of individuals or small groups to that of complex networks such as the Internet and the global economy. This course covers computer science topics and other material that is mathematical, but all material will be presented in a way that is accessible to an educated audience with or without a strong technical background. The course is open to all majors and all levels, and is taught accordingly. There will be ample opportunities for those of a quantitative bent to dig deeper into the topics we examine. The majority of the course is grounded in scientific and mathematical findings of the past two decades or less.
Abstract:
This document summarises the results of the three essays on the economics of education and equality of opportunity that were carried out for the case of Colombia.
Abstract:
Exercises, exams and solutions for a third year maths course.
Abstract:
Exercises, exams and solutions for a first year maths course.
Abstract:
Exam questions and solutions for a third year mathematical programming course.