978 results for Straight cosmic string
Abstract:
Generation scheduling is a major component of power system operation. The objective of this work is to develop efficient control strategies for power scheduling problems through Reinforcement Learning approaches. The three important active power scheduling problems are Unit Commitment, Economic Dispatch and Automatic Generation Control. Numerical solution methods proposed for power scheduling are insufficient for handling large and complex systems. Soft computing methods such as Simulated Annealing and Evolutionary Programming are efficient at handling complex cost functions, but are limited in handling the stochastic data present in a practical system; moreover, the learning steps have to be repeated for each load demand, which increases the computation time. Reinforcement Learning (RL) is a method of learning through interaction with an environment. Its main advantage is that it does not require a precise mathematical formulation: it can learn either by interacting with the environment itself or with a simulation model. Several optimization and control problems have been solved through Reinforcement Learning, but its applications in the field of power systems have been few. The objective here is to introduce and extend Reinforcement Learning approaches to the active power scheduling problems in an implementable manner. The main objectives can be enumerated as: (i) evolve Reinforcement Learning based solutions to the Unit Commitment problem; (ii) find suitable solution strategies through Reinforcement Learning for Economic Dispatch; (iii) extend the Reinforcement Learning solution to Automatic Generation Control with a different perspective; (iv) check the suitability of the scheduling solutions for one of the existing power systems. The first part of the thesis is concerned with the Reinforcement Learning approach to the Unit Commitment problem. The Unit Commitment problem is formulated as a multi-stage decision process, and a Q-learning solution is developed to obtain the optimum commitment schedule. A method of state aggregation is used to formulate an efficient solution considering the minimum up time / down time constraints. The performance of the algorithms is evaluated for different systems and compared with other stochastic methods such as the Genetic Algorithm. The second stage of the work is concerned with solving the Economic Dispatch problem. A simple and straightforward decision-making strategy is first proposed using the Learning Automata algorithm. Then, to solve the scheduling task for systems with a large number of generating units, the problem is formulated as a multi-stage decision-making task, and the solution is extended to incorporate the transmission losses in the system. To make the Reinforcement Learning solution more efficient and to handle continuous state spaces, a function approximation strategy is proposed. The performance of the developed algorithms is tested on several standard test cases, and the proposed method is compared with other recent methods such as the Partition Approach Algorithm and Simulated Annealing. As the final step of implementing the active power control loops in a power system, Automatic Generation Control is also considered. Reinforcement Learning has already been applied to the Automatic Generation Control loop; the RL solution is extended here to adopt a common frequency for all the interconnected areas, closer to practical systems.
The performance of the RL controller is also compared with that of the conventional integral controller. In order to demonstrate the suitability of the proposed methods for practical systems, the second plant of Neyveli Thermal Power Station (NTPS II) is taken as a case study. The performance of the Reinforcement Learning solution is found to be better than that of the other existing methods, which is a promising step towards RL based control schemes for the practical power industry. Reinforcement Learning is applied to solve the scheduling problems in the power industry and is found to give satisfactory performance. The proposed solution provides scope for greater profit, since the economic schedule is obtained instantaneously. Since the Reinforcement Learning method can use the stochastic cost data obtained from time to time from a plant, it yields an implementable method. As a further step, with suitable methods to interface with online data, economic scheduling can be achieved instantaneously in a generation control centre. Power scheduling of systems with different sources such as hydro, thermal etc. can also be examined and Reinforcement Learning solutions developed for them.
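Where the abstract above formulates Unit Commitment as a multi-stage decision process solved by Q-learning, the following minimal sketch illustrates the general tabular Q-learning update on a toy two-unit commitment problem. It is an illustration only: the state encoding, cost terms and all parameter values are invented for the example and are not the thesis's actual formulation.

```python
import random
from collections import defaultdict

# Illustrative Q-learning for a toy 2-unit, T-stage commitment problem.
# State: (stage, units_on); action: on/off decision for each unit in the next stage.
# Demand, costs and learning parameters are hypothetical placeholders.
T = 4
DEMAND = [150, 300, 250, 180]          # assumed load per stage (MW)
CAPACITY = (200, 150)                  # assumed unit capacities (MW)
RUN_COST = (20.0, 28.0)                # assumed running cost per MW of capacity
START_COST = (500.0, 300.0)            # assumed start-up cost
ACTIONS = [(a, b) for a in (0, 1) for b in (0, 1)]

def stage_cost(prev, action, demand):
    """Running + start-up cost, with a large penalty for unmet demand."""
    cap = sum(c for c, on in zip(CAPACITY, action) if on)
    cost = sum(r * c for r, c, on in zip(RUN_COST, CAPACITY, action) if on)
    cost += sum(s for s, p, a in zip(START_COST, prev, action) if a and not p)
    if cap < demand:
        cost += 1e4                     # infeasibility penalty
    return cost

Q = defaultdict(float)
alpha, gamma, eps = 0.1, 1.0, 0.2

for episode in range(5000):
    state = (0, (0, 0))
    for t in range(T):
        if random.random() < eps:
            action = random.choice(ACTIONS)
        else:
            action = min(ACTIONS, key=lambda a: Q[(state, a)])
        cost = stage_cost(state[1], action, DEMAND[t])
        next_state = (t + 1, action)
        best_next = 0.0 if t == T - 1 else min(Q[(next_state, a)] for a in ACTIONS)
        # Q-learning update (costs are minimised, so the target uses the min over actions)
        Q[(state, action)] += alpha * (cost + gamma * best_next - Q[(state, action)])
        state = next_state

# Greedy commitment schedule after learning
state, schedule = (0, (0, 0)), []
for t in range(T):
    action = min(ACTIONS, key=lambda a: Q[(state, a)])
    schedule.append(action)
    state = (t + 1, action)
print("commitment schedule:", schedule)
```

In a realistic setting the per-stage cost would come from plant data rather than a fixed table, which is exactly where the abstract argues RL has an advantage over deterministic optimisation.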
Abstract:
The present work deals with the characterization of polyhydroxyalkanoate (PHA) accumulating vibrios from marine benthic environments and with production studies of polyhydroxyalkanoates by Vibrio sp. BTKB33. Vibrios are a group of Gram-negative, curved or straight motile rods that normally inhabit aquatic environments. The present study therefore aimed at evaluating the occurrence of PHA accumulating vibrios inhabiting marine benthic environments, characterizing the potential PHA accumulators employing phenotypic and genotypic approaches, and molecular characterization of the PHA synthase gene. The study also evaluated PHA production in Vibrio sp. strain BTKB33 through submerged fermentation using statistical optimization, and characterized the purified biopolymer. The work covered screening for PHA producing vibrios from marine benthic environments and characterization of the PHA producers employing phenotypic and genotypic approaches. The incidence of PHA accumulation in Vibrio sp. isolated from marine sediments was observed to be high, indicating that the natural habitat of these bacteria is stressful. Considering their ubiquitous nature, the ecological role played by vibrios in maintaining the delicate balance of the benthic ecosystem, besides yielding potential strains able to elaborate a plethora of extracellular enzymes for industrial application, is significant. The elaboration of several hydrolytic enzymes by individual isolates also emphasizes the crucial role of vibrios in the mineralization process in the marine environment. This study throws light on the extracellular hydrolytic enzyme profile exhibited by vibrios. It was concluded that, apart from PHA accumulation, exoenzyme production and a higher MAR index also aid their survival in the highly challenging benthic environments. The phylogenetic analysis of the strains and studies on intra-species variation within PHA accumulating strains reveal their diversity. The isolate selected for production in this study was Vibrio sp. strain BTKB33, identified as V. azureus by 16S rDNA sequencing and phenotypic characterization. The bioprocess variables for PHA production using submerged fermentation were optimized employing the one-factor-at-a-time method, Plackett-Burman (PB) design and response surface methodology (RSM). The statistical optimization of bioprocess variables revealed that NaCl concentration, temperature and incubation period are the major bioprocess variables influencing PHA production and PHA content. The presence of Class I PHA synthase genes in BTKB33 was also unveiled. The characterization of phaC genes by PCR, and of the extracted polymer by FTIR and NMR analysis, revealed the presence of polyhydroxybutyrate, the smallest known PHA, which has wide domestic, industrial and medical application. The strain BTKB33, bearing a significant exoenzyme profile, can thus be manipulated in future for the utilization of diverse substrates as carbon source for PHA production. In addition to BTKB33, several fast-growing Vibrio sp. with PHA accumulating ability were also isolated, revealing the prospects of this environment as a mine for novel PHA accumulating microbes. The findings of this study will provide a reference for further research on the industrial production of PHAs from marine microorganisms.
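The statistical optimization mentioned above (PB screening followed by RSM) typically culminates in fitting a second-order response surface to the measured responses and locating its stationary point. The sketch below shows only that generic fitting step, using two hypothetical coded factors and made-up yield values rather than the study's data.

```python
import numpy as np

# Hypothetical coded levels (-1, 0, +1) for two factors and made-up PHA yields;
# a real RSM study would use the actual design matrix and measured responses.
X = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],   # factorial points
    [0, 0], [0, 0], [0, 0],               # centre points
    [-1, 0], [1, 0], [0, -1], [0, 1],     # axial-style points
], dtype=float)
y = np.array([2.1, 3.0, 2.4, 3.8, 3.3, 3.2, 3.4, 2.8, 3.6, 2.9, 3.1])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
x1, x2 = X[:, 0], X[:, 1]
design = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print("fitted coefficients:", np.round(coef, 3))

# Stationary point of the fitted surface: candidate optimum settings in coded units.
b = coef[1:3]
B = np.array([[coef[4], coef[3] / 2], [coef[3] / 2, coef[5]]])
stationary = -0.5 * np.linalg.pinv(B) @ b
print("stationary point (coded units):", np.round(stationary, 3))
```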
Abstract:
The present work is an integrated study of the hydrogeology of the Bharathapuzha river basin, south-west coast of India. Its objectives are to study the spatial and temporal behaviour of the groundwater system of the Bharathapuzha river basin; to determine the sub-surface parameters by ground resistivity surveys; and to determine the groundwater quality of the basin for the different seasons (pre-monsoon and post-monsoon) with reference to domestic and irrigation water quality standards. The present study will provide a good database on the hydrogeological aspects of the river basin. The study area covers 17 block Panchayats. Of these, Chitoor block is 'over exploited', Kollengode, Trithala and Palakkad are in the 'critical' category, and Kuttippuram and Sreekrishnapuram blocks are 'semi critical' in terms of groundwater development. Comparison of the geomorphology map with the drainage map shows that the geomorphology has a clear control on the drainage network of the basin: the structural hill area shows the highest drainage network development, whereas the pediment shows the lowest. There are many discontinuous lineaments in the Bharathapuzha river basin which can be connected by a straight line. Groundwater flow directions are generally towards the western portions of the study area: from the northern region water flows towards the centre, and water from the eastern and southern sides converges at the centre and moves towards the western side of the basin. The positive correlation of transmissivity and storativity values shows that good aquifer conditions exist in the study area.
Abstract:
The thesis begins with a review of the basic elements of the general theory of relativity (GTR), which forms the basis for the theoretical interpretation of observations in cosmology. The first chapter also discusses the standard model in cosmology, namely the Friedmann model, its predictions and its problems, and includes a brief discussion of fractals and inflation of the early universe. In the second chapter we discuss the formulation of a new, stochastic approach to cosmology. In this model the dynamics of the early universe is described by a set of non-deterministic, Langevin-type equations, and we derive the solutions using the Fokker-Planck formalism. Here we demonstrate how the problems with the standard model can be eliminated by introducing the idea of stochastic fluctuations in the early universe. Many recent observations indicate that the present universe may be approximated by a many-component fluid, and we assume that only the total energy density is conserved. This, in turn, leads to energy transfer between different components of the cosmic fluid, and fluctuations in such energy transfer can certainly induce fluctuations in the mean value of the factor w in the equation of state p = wρ, resulting in a fluctuating expansion rate for the universe. The third chapter discusses the stochastic evolution of the cosmological parameters in the early universe using the new approach. The penultimate chapter is about refinements to be made in the present model by means of a new deterministic model. The concluding chapter presents a discussion of other problems with conventional cosmology, such as the fractal correlation of the galactic distribution, and the author attempts an explanation for this problem using the stochastic approach.
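For readers unfamiliar with the Langevin / Fokker-Planck machinery referred to above, the generic one-dimensional form of that correspondence is sketched below; the drift and diffusion terms actually used in the thesis are not specified in this abstract, so A(x) and B(x) here are placeholders.

```latex
% Generic Langevin equation for a stochastic variable x(t) (e.g. an expansion-rate
% or equation-of-state parameter), with drift A(x), noise amplitude B(x) and
% Gaussian white noise \eta(t):
\begin{align}
  \frac{dx}{dt} &= A(x) + B(x)\,\eta(t),
  \qquad
  \langle \eta(t) \rangle = 0,
  \quad
  \langle \eta(t)\,\eta(t') \rangle = \delta(t - t'), \\[4pt]
% The corresponding (Ito) Fokker-Planck equation for the probability density P(x,t):
  \frac{\partial P(x,t)}{\partial t}
  &= -\frac{\partial}{\partial x}\bigl[A(x)\,P(x,t)\bigr]
     + \frac{1}{2}\,\frac{\partial^{2}}{\partial x^{2}}\bigl[B^{2}(x)\,P(x,t)\bigr].
\end{align}
```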
Abstract:
Two-sided flux decoration experiments indicate that threading dislocation lines (TDLs), which cross the entire film, are sometimes trapped in metastable states. We calculate the elastic energy associated with the meanderings of a TDL. The TDL behaves as an anisotropic and dispersive string with thermal fluctuations largely along its Burgers vector. These fluctuations also modify the structure factor of the vortex solid. Both effects can, in principle, be used to estimate the elastic moduli of the material.
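As a rough illustration (not the paper's actual result) of what an "anisotropic and dispersive string" means here: for small transverse displacements u_x(z), u_y(z) of the line, a quadratic energy with direction-dependent, wave-vector-dependent stiffnesses K_x(q), K_y(q) can be written in Fourier space, and equipartition then sets the size of the thermal fluctuations.

```latex
% Schematic quadratic energy for a fluctuating line defect; K_x(q), K_y(q) are
% effective stiffnesses that differ between the two transverse directions
% (anisotropy) and depend on the wave vector q (dispersion).
\begin{equation}
  E[u] \simeq \frac{1}{2} \int \frac{dq}{2\pi}
  \Bigl[ K_x(q)\,\lvert u_x(q)\rvert^{2} + K_y(q)\,\lvert u_y(q)\rvert^{2} \Bigr],
  \qquad
  \bigl\langle u_i(q)\,u_i^{*}(q') \bigr\rangle
  = 2\pi\,\delta(q - q')\,\frac{k_B T}{K_i(q)} .
\end{equation}
```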
Abstract:
The study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data needs many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the inter-disciplinary topic of Astrostatistics. For an observer on earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various reasons. In some cases the variation is due to internal thermo-nuclear processes; these are generally known as intrinsic variables. In other cases it is due to external processes, like eclipses or rotation, and these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data is folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star, and one way to identify the type of variable star and to classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages such as observation, data reduction, data analysis, modelling and classification. Modelling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as other derived parameters. Of these, the period is the most important, since wrong periods lead to sparse light curves and misleading information. Time series analysis is the application of mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps; this is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic ray particles.
Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric (assuming some underlying distribution for the data) and non-parametric (assuming no statistical model such as a Gaussian) methods. Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them can fully recover the true periods. Wrong detection of the period can be due to several reasons, such as power leakage to other frequencies caused by the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling; spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state that "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification". It will be beneficial for the variable star astronomical community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and entering them in the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
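As an illustration of the non-parametric family of period search methods discussed above, the sketch below implements a simplified Phase Dispersion Minimisation style statistic on synthetic, unevenly sampled data. It only demonstrates the idea; it is not the thesis's modified cubic spline method, and the light curve and period are invented.

```python
import numpy as np

# Simplified PDM-style period search on a synthetic, unevenly sampled light curve.
rng = np.random.default_rng(0)
true_period = 0.7137                       # days (made-up value)
t = np.sort(rng.uniform(0.0, 30.0, 400))   # uneven time sampling
mag = 10.0 + 0.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, t.size)

def pdm_statistic(t, mag, period, n_bins=10):
    """Ratio of pooled within-phase-bin variance to total variance; smallest near the true period."""
    phase = (t / period) % 1.0
    bins = np.floor(phase * n_bins).astype(int)
    total_var = mag.var(ddof=1)
    sse, dof = 0.0, 0
    for b in range(n_bins):
        m = mag[bins == b]
        if m.size > 1:
            sse += (m.size - 1) * m.var(ddof=1)
            dof += m.size - 1
    return (sse / dof) / total_var

trial_periods = np.linspace(0.2, 2.0, 20000)
theta = np.array([pdm_statistic(t, mag, p) for p in trial_periods])
best = trial_periods[np.argmin(theta)]
print(f"recovered period: {best:.4f} d (true {true_period:.4f} d)")
```

The aliasing and harmonic problems mentioned in the abstract show up here as secondary minima of the statistic at related trial periods, which is why automated pipelines usually inspect more than just the single deepest minimum.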
Abstract:
Image-based authentication and encryption: identity-based cryptography (often called Identity Based Encryption, IBE) is a variation of asymmetric key schemes in which the user's public key may be an arbitrarily chosen string that can be obviously associated with its owner. Adi Shamir first proposed such a signature scheme in 1984. In the literature, the public key is usually an email address or a social security number. The price of this free choice of key is the involvement of a trusted third party, called the Private Key Generator, which uses its private master key to generate the applicant's private key. With the work of Boneh and Franklin in 2001 on the use of the Weil pairing over elliptic curves, IBE was placed on a secure and practical foundation. In this thesis, after a general overview of problems and possible solutions for authentication tasks, the second part proposes, as a new idea, the use of an image of the user as the public key. The procedure for key issuance, the ordering of a service, e.g. the issuing of a personal, non-transferable ticket, and its inspection are described. The inspection can take place offline on the inspector's device, with the ticket and the image held ready on the customer's mobile phone. Overall, this opens up the possibility of authentication without further disclosure of an identity, if one assumes that a person's image is public in any case, given ubiquitous cameras. Practicality is demonstrated with an implementation based on the IBE-JCA provider of the National University of Ireland in Maynooth, which also provides insight into the runtime behaviour to be expected in practice.
Abstract:
The various approximations of the vacuum polarization potential and the higher-order corrections up to order α³ are reviewed and discussed quantitatively. The quadrupole part of the vacuum polarization is established; it leads rather straightforwardly to a small contribution of vacuum polarization to nuclear polarization. These effects are investigated quantitatively.
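For orientation, the lowest-order vacuum polarization potential from which such reviews usually start is the Uehling potential; a standard representation (point nucleus of charge Ze, units ħ = c = 1, electron mass m_e) is given below. The higher-order and quadrupole contributions discussed in the abstract are corrections beyond this term.

```latex
% Lowest-order vacuum polarization (Uehling) potential around a point charge Ze,
% in units \hbar = c = 1, with electron mass m_e:
\begin{equation}
  V_{\mathrm{Ueh}}(r) \;=\;
  -\,\frac{Z\alpha}{r}\,\frac{2\alpha}{3\pi}
  \int_{1}^{\infty} dt\;
  e^{-2 m_e r t}
  \left(1 + \frac{1}{2t^{2}}\right)
  \frac{\sqrt{t^{2}-1}}{t^{2}} .
\end{equation}
```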
Abstract:
This comparative state examination thesis examines the Wandervogel movement and hardcore within the field of tension between adolescence, abstinence and antifeminism. After an introduction to the problem areas, the genesis and development of both youth cultures are described, and finally starting points are identified that allow student teachers and teachers to actively incorporate the issues mentioned into classroom teaching.
Abstract:
The markup language XML is used for the annotation of documents and has established itself as a standard data exchange format. This creates the need not only to store and transfer XML documents as plain text files, but also to store them persistently in a better structured form, for instance in dedicated XML databases or in relational databases. Relational databases have so far relied on two fundamentally different approaches: XML documents are either stored unchanged as binary or character string objects, or they are split up so that they can be stored, normalised, in conventional relational tables (so-called "flattening" or "shredding" of the hierarchical structure). This dissertation pursues a new approach that represents a middle way between the existing solutions and takes up the possibilities of the evolved SQL standard. SQL:2003 defines complex structured and collection types (tuples, arrays, lists, sets, multisets) which make it possible to map XML documents onto relational structures in such a way that the hierarchical structure is preserved. This offers two advantages: on the one hand, proven technologies from the field of relational databases remain fully available; on the other hand, the SQL:2003 types allow the inherent tree structure of the XML documents to be preserved, so that it is not necessary to reassemble it on demand by means of expensive joins over the usually normalised tuples distributed across several tables. This thesis first clarifies fundamental questions about suitable, efficient ways of mapping XML documents onto SQL:2003-compliant data types. Building on this, a suitable, reversible mapping procedure is developed, implemented and analysed within a prototype application. In designing the mapping procedure, particular emphasis is placed on usability in combination with an existing, mature relational database management system (DBMS). Since support for SQL:2003 in commercial DBMS is still incomplete, it has to be examined to what extent the individual systems are suitable for the mapping procedure to be implemented; it turns out that, among the products considered, the DBMS IBM Informix offers the best support for complex structured and collection types. In order to better assess the performance of the procedure, the thesis investigates the time requirements and the required working and database memory of the implementation and evaluates the results.
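As an outside illustration of the core idea (keeping the document hierarchy intact instead of shredding it into flat tables), the following Python sketch maps an XML fragment onto nested tuples and lists, which is roughly the shape that SQL:2003 ROW and LIST types preserve on the database side. It is not the dissertation's mapping procedure, and the example document is invented.

```python
import xml.etree.ElementTree as ET

# A hypothetical XML fragment; element and attribute names are invented.
DOC = """
<order id="42">
  <customer>Alice</customer>
  <items>
    <item sku="A1" qty="2"/>
    <item sku="B7" qty="1"/>
  </items>
</order>
"""

def to_nested(elem):
    """Map an element to (tag, attributes, text, children): a tuple/list nesting
    that mirrors how SQL:2003 ROW and LIST types can keep the tree intact,
    instead of shredding it into separate normalised tables."""
    return (
        elem.tag,
        dict(elem.attrib),
        (elem.text or "").strip(),
        [to_nested(child) for child in elem],   # LIST-like collection of child ROWs
    )

root = ET.fromstring(DOC)
print(to_nested(root))
# ('order', {'id': '42'}, '', [('customer', {}, 'Alice', []),
#   ('items', {}, '', [('item', {'sku': 'A1', 'qty': '2'}, '', []), ...])])
```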
Abstract:
This thesis deals with Controlled Traffic Farming (CTF) cropping systems, in which, guided by satellite positioning, the same wheel tracks are used for all field operations. Can CTF reduce the load on the soil and increase the efficiency of no-till cropping systems? In addition to agronomic and soil-physical parameters, the effects of guidance systems and the options for implementing CTF in practice were investigated. The analysis of a CTF implementation under European conditions using standard machinery showed that, with the machines available today, CTF cropping systems can be mechanised relatively easily for permanent grassland, combinable crops and maize, on both smaller and larger fields; for sugar beet and potatoes, compromises may be necessary. In general, CTF cropping systems require careful planning and implementation in practice. In a three-year field trial (winter wheat, winter barley, temporary ley with a grass-clover mixture) on a loam soil, CTF no-till was compared with conventionally, randomly trafficked no-till and ploughing treatments. Under CTF, a differentiation emerged between the untrafficked, lightly trafficked and intensively trafficked variants. On this compact soil with 1150 mm annual precipitation, the differences between the untrafficked areas and the areas trafficked at low contact pressure were rather small. In the untrafficked areas, penetration resistance and the carbon dioxide content of the soil air developed significantly better values after three years, whereas bulk density and porosity showed no clearly interpretable trend. Owing to partly suboptimal field emergence, no general agronomic tendency could be derived. However, the intensive traffic on the tramlines used for crop care operations clearly had negative effects on soil properties and crop growth. It is therefore advisable to use permanently the same wheel tracks, above all for crop care operations. The investigation of the effects of guidance systems showed significant advantages of guidance systems in reduced driver workload and higher steering accuracy, especially at large working widths without marker arms. Most of the other measured parameters were slightly more favourable with a guidance system than without, but did not differ significantly; the driver and site conditions such as field shape had a much greater influence. Overall, CTF, in combination with further soil protection measures, widens the options for avoiding soil compaction, reducing the need for energy-intensive soil loosening and promoting the development of a more stable soil structure with higher load-bearing capacity. Together with seedbed preparation adapted to the crop and cropping system, and with mechanical crop care measures that are easier to carry out in straight rows, this provides good preconditions for designing agronomically productive and ecologically sustainable cropping systems.
Abstract:
This paper describes a simple method for internal camera calibration for computer vision. The method is based on tracking image features through a sequence of images while the camera undergoes pure rotation. The location of the features relative to the camera or to each other need not be known, and therefore this method can be used both for laboratory calibration and for self-calibration in autonomous robots working in unstructured environments. A second method of calibration is also presented; this method uses simple geometric objects such as spheres and straight lines to determine the camera parameters. Calibration is performed using both methods and the results are compared.
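A simplified sketch of the idea behind calibration from pure rotation (not the paper's actual algorithm): under pure rotation, matched image points in two views are related by the homography H = K R K^-1. Assuming square pixels and a known principal point, one can search for the focal length f that makes K(f)^-1 H K(f) closest to a rotation matrix. The homography here is synthesised from a known camera for demonstration; in practice it would be estimated from tracked feature correspondences.

```python
import numpy as np

def K(f, cx, cy):
    """Simple intrinsic matrix: square pixels, focal length f, principal point (cx, cy)."""
    return np.array([[f, 0, cx], [0, f, cy], [0, 0, 1.0]])

def rotation_error(H, f, cx, cy):
    """How far the scale-normalised K^-1 H K is from an orthonormal (rotation) matrix."""
    Kf = K(f, cx, cy)
    R = np.linalg.inv(Kf) @ H @ Kf
    R /= np.cbrt(np.linalg.det(R))          # remove the arbitrary scale of H
    return np.linalg.norm(R @ R.T - np.eye(3))

# Synthetic example: build a ground-truth homography from a known K and a rotation.
f_true, cx, cy = 800.0, 320.0, 240.0        # assumed "true" camera (pixels)
angle = np.deg2rad(7.0)
R_true = np.array([[np.cos(angle), 0, np.sin(angle)],
                   [0, 1, 0],
                   [-np.sin(angle), 0, np.cos(angle)]])
H = K(f_true, cx, cy) @ R_true @ np.linalg.inv(K(f_true, cx, cy))

# One-dimensional search over candidate focal lengths.
candidates = np.linspace(300, 1500, 2401)
errors = [rotation_error(H, f, cx, cy) for f in candidates]
f_est = candidates[int(np.argmin(errors))]
print(f"estimated focal length: {f_est:.1f} px (true {f_true:.1f} px)")
```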
Abstract:
"compositions" is a new R package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry) and provides means for computation, plotting and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and on the object-oriented programming paradigm of R; in this way, the functions called automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedrons, various compositional plots (boxplots, barplots, piecharts) and extensive graphical tools for principal components. In addition, proportion lines, straight lines and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all necessary tools to add their own analysis routines. A complete example is included in the appendix.
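The "working in coordinates" principle mentioned above means mapping a composition into an ordinary Euclidean coordinate space (for the Aitchison geometry, for instance via the centred log-ratio transform), performing standard multivariate statistics there, and mapping back. The package does this in R behind its class system; the sketch below merely illustrates the underlying transform in Python with a made-up composition and does not use the package's API.

```python
import numpy as np

# Centred log-ratio (clr) transform and its inverse: the coordinate mapping that
# underlies the Aitchison geometry of compositional data. The composition is made up.
def clr(x):
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                      # close the composition to sum 1
    g = np.exp(np.mean(np.log(x)))       # geometric mean of the parts
    return np.log(x / g)

def clr_inv(z):
    y = np.exp(z)
    return y / y.sum()

composition = [60.0, 30.0, 10.0]         # e.g. percentages of three parts
z = clr(composition)
print("clr coordinates:", np.round(z, 4))            # sum to 0 by construction
print("back-transformed:", np.round(clr_inv(z), 4))  # recovers the closed composition
```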
Abstract:
Lecture notes in PDF
Abstract:
Lecture notes in LaTeX