655 results for Cosmic consciousness
Abstract:
Introduction: Fundamental to the philosophy of Buddhism is the insight that there is "unsatisfactoriness" (dukkha) in the world and that it can be eliminated through the practice of the Noble Eightfold Path. Buddhism also maintains that the world as we experience it, and the entities that exist within it, are bereft of any substantiality; instead, existence is manifest through dependent origination. All things are conditional; nothing is permanent. Inherent in this dependent existence, however, is the interconnectedness of all beings and their subjection to the cosmic law of karma. Cultivating the Eightfold Path includes a deep compassion for all other living things 'trapped' within this cycle of dependent origination. This compassion or empathy (karuna) is crucial to the Buddhist path to enlightenment. It is this emphasis on karuna that shows itself in Mahayana Buddhism in the theory of the bodhisattva (or Buddha-to-be), since the bodhisattva willingly postpones his or her own enlightenment to help others on the same path. One ramification of the theory of dependent origination is that no anthropocentric bias privileges humans over the natural world. Paradoxically, the doctrine of non-self becomes an ontology within Buddhism, culminating in the Mahayana realization that a common boundary exists between samsara and nirvana. Essential to this ontology is the life of dharma, a moral life: ethics is not separated from ontology. As my thesis will show, this basic outlook has implications for our understanding of the Buddhist world-view with respect to the current human predicament concerning the environment. Humans are the only ones who can attain "Buddhahood" because of our ability to understand what it means to follow the Eightfold Path and to act accordingly.
Because of the interconnectedness of all entities (dharmas), there is an ontological necessity to eliminate suffering and 'save the earth': if we allow the earth to suffer, we ALL suffer. This can be understood as an ethical outlook applicable to our interaction with and treatment of the natural environment, or the environment in the broadest sense, not just trees, plants, rocks and so on; it is an approach to samsara and all within it. It has been argued that there is no ontology in Buddhism because of its doctrine of "non-self". A goal of this thesis, however, is to argue that an original ontology does exist in Buddhism, according to which the nature of Being is essentially "neither Being nor non-being nor not non-being", as illustrated by Nagarjuna. Ingrained within this ontology is an ethic or 'right path' (samma marga) that is fundamental to our being, and this includes a compassionate relationship to our environment. In this dissertation I endeavour to trace the implications that the Buddhist worldview has for the environmental issues that assail us in our age of technology. I will explore questions such as: can the Buddhist way of thinking help us comprehend and possibly resolve the environmental problems of our day and age? Are there any current environmental theories which are comparable to, or share common ground with, the classical Buddhist doctrines? I will elucidate some fundamental doctrines of early Buddhism from an environmental perspective, and identify some comparable modern environmental theories, such as deep ecology and general systems theory, that seem to share in the wisdom of classical Buddhism and have much to gain from a deeper appreciation of it.
Abstract:
Completed under a joint supervision (cotutelle) agreement with the Université de Paris Ouest Nanterre La Défense
Abstract:
This work is an approach to the practices and knowledge of Amazonian peoples through discourses produced by these nations. We interpret sacred chants, ancestral narratives, and academic texts by Indigenous thinkers. The work shows that Amazonian practices are embedded in a context of meanings in which every living being is held to possess thought and a spirit, and in which spiritual beings exist that defend these living beings against possible abuses. Human beings must transcend their state of consciousness, travel to the invisible worlds, and initiate communication with these spirits in order to maintain existential balance. According to Amazonian thought, human communities cannot conceive of themselves as self-sufficient; rather, they must maintain constant relations with the many beings that populate their visible environment and the invisible worlds. The three key concepts that account for the practices of Amazonian peoples are predation, transformation, and balance. By predation we mean those Amazonian practices that involve destroying other beings in order to sustain the life of the community. According to Amazonian thought, this predation should be measured, so as not to kill more than is necessary for survival; predation is regulated by the spiritual beings. Amazonian practices of transformation are intended to safeguard the bonds of the community by transfiguring everything that enters or leaves it, so that no external agent endangers its affective ties. The practices of predation and transformation are complementary, and both must take place in a balanced manner, respecting ancestral knowledge and the cosmic laws established by the higher spirits.
As for the method of analysis, we approach Amazonian discourses from within their own cultural logic, without imposing pre-established methodologies. The result is an academic work that deepens intercultural intellectual production, since it is the Indigenous voices themselves that express their conceptions and the meaning of their practices. Taken as a whole, the work engages in a critical dialogue with its field of study, discussing or elaborating on certain conceptions developed in the anthropological literature devoted to the region, in light of the ancestral Amazonian knowledge that informs the practices of these nations.
Abstract:
Alchemy, the science of manipulating spiritual influences through a sacred metallurgy, and pataphysics, a pseudo-scientific aesthetic combining esotericism with humour, are the two main ideological foundations linking Marcel Duchamp and Roberto Matta. Duchamp took an interest in esotericism as early as 1910, nearly twenty years before meeting Matta. Matta, for his part, addresses in his work themes drawn from the alchemical literature: occult operations, marvellous states of matter, and laboratory apparatus. Moreover, the Symbolist and pseudo-scientific writers read first by Duchamp and then by Matta inform the esoterically tinged pataphysical humour expressed in the work of both artists. Les Célibataires, vingt ans plus tard ("The Bachelors, Twenty Years Later") is an oil on canvas painted by Roberto Matta in 1943, depicting a cosmic landscape composed of stars and black holes, three alembics, and a large black machine. In this work Matta very freely reinterprets certain elements of the Grand verre (The Large Glass), a painting on glass that Marcel Duchamp left unfinished in 1923. This master's thesis studies the influence of alchemy and of Duchampian iconography on Les Célibataires, vingt ans plus tard. First, it seeks to identify and examine the alchemical and pataphysical influences in Matta's work; second, it aims to show how Matta's work fits into the Surrealist project of creating a new myth, in continuity with Duchamp's own project.
Abstract:
The question of the stability of black holes was first studied by Regge and Wheeler, who investigated linear perturbations of the exterior Schwarzschild spacetime. Further work on this problem led to the study of quasi-normal modes, which are regarded as the characteristic "sound" of black holes. Quasi-normal modes (QNMs) describe the damped oscillations of the geometry surrounding a perturbed black hole, with oscillation frequencies and damping times fixed entirely by the black hole parameters. In the present work we study the influence of a cosmic string on the QNMs of various black hole background spacetimes perturbed by a massless Dirac field.
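The damped-oscillation character of quasi-normal modes can be illustrated with a toy ringdown signal: a sinusoid whose amplitude decays exponentially, from which the damping time can be read off the decay of successive maxima. A minimal Python sketch follows; the frequency and damping time are arbitrary illustrative numbers, not values for any actual black hole.

```python
import numpy as np

# Toy ringdown signal: psi(t) = exp(-t/tau) * cos(omega*t), with omega and
# tau playing the roles of the QNM oscillation frequency and damping time.
omega, tau = 2 * np.pi * 0.5, 4.0        # frequency 0.5 Hz, damping time 4 s
t = np.linspace(0.0, 20.0, 4000)
psi = np.exp(-t / tau) * np.cos(omega * t)

# Successive maxima of psi decay as exp(-t/tau), so a linear fit to
# log(peak amplitude) versus peak time recovers the damping time, and the
# mean spacing between maxima gives the oscillation period.
peaks = [i for i in range(1, len(psi) - 1)
         if psi[i] > psi[i - 1] and psi[i] > psi[i + 1] and psi[i] > 1e-3]
tp, ap = t[peaks], psi[peaks]
tau_est = -1.0 / np.polyfit(tp, np.log(ap), 1)[0]   # slope = -1/tau
freq_est = 1.0 / np.mean(np.diff(tp))               # peak spacing = one period
```

In a real ringdown analysis the signal is a superposition of modes and the fit is done in the frequency domain, but the same exponential-decay structure underlies the extraction of QNM parameters.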
Abstract:
The thesis begins with a review of the basic elements of the general theory of relativity (GTR), which forms the basis for the theoretical interpretation of observations in cosmology. The first chapter also discusses the standard model in cosmology, namely the Friedmann model, its predictions and problems, along with a brief discussion of fractals and inflation in the early universe. In the second chapter we discuss the formulation of a new, stochastic approach to cosmology. In this model, the dynamics of the early universe is described by a set of non-deterministic, Langevin-type equations, and we derive the solutions using the Fokker-Planck formalism. Here we demonstrate how the problems with the standard model can be eliminated by introducing the idea of stochastic fluctuations in the early universe. Many recent observations indicate that the present universe may be approximated by a many-component fluid, and we assume that only the total energy density is conserved. This, in turn, leads to energy transfer between different components of the cosmic fluid, and fluctuations in this energy transfer can induce fluctuations in the mean value of the factor w in the equation of state p = wρ, resulting in a fluctuating expansion rate for the universe. The third chapter discusses the stochastic evolution of the cosmological parameters in the early universe using the new approach. The penultimate chapter concerns refinements to be made to the present model by means of a new deterministic model. The concluding chapter presents a discussion of other problems with conventional cosmology, such as the fractal correlation of the galactic distribution, and the author attempts an explanation of this problem using the stochastic approach.
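The idea of a fluctuating equation-of-state parameter can be illustrated numerically. The sketch below is a hypothetical toy model, not the thesis's actual Langevin equations: it integrates the Friedmann and continuity equations by the Euler-Maruyama method, with white-noise fluctuations added to w in p = wρ, so that the expansion rate itself fluctuates.

```python
import numpy as np

rng = np.random.default_rng(42)

def evolve_scale_factor(steps=5000, dt=1e-3, w_mean=1.0 / 3.0, sigma=0.05):
    """Euler-Maruyama integration of a flat FRW model in which the
    equation-of-state parameter w in p = w*rho fluctuates about w_mean.
    Units are chosen so that 8*pi*G/3 = 1, i.e. H = sqrt(rho)."""
    a, rho = 1.0, 1.0
    history = [a]
    for _ in range(steps):
        H = np.sqrt(rho)                          # Friedmann equation
        dW = np.sqrt(dt) * rng.standard_normal()  # Wiener increment
        # Continuity equation drho = -3*H*(1+w)*rho*dt with w = w_mean + noise:
        rho += -3.0 * H * (1.0 + w_mean) * rho * dt - 3.0 * H * sigma * rho * dW
        a += a * H * dt                           # da = a*H*dt
        history.append(a)
    return np.array(history)

a_hist = evolve_scale_factor()
```

Averaging many such realizations, or writing down the corresponding Fokker-Planck equation for the distribution of rho, is the continuum analogue of what this discrete sketch samples.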
Abstract:
Hindi
Abstract:
Hindi
Abstract:
The study of variable stars is an important topic in modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating at the petabyte scale. Such huge volumes of data demand automated methods as well as human experts. This thesis is devoted to the analysis of variable star astronomical time series data and hence belongs to the interdisciplinary field of Astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars, while extrinsic variables are grouped into eclipsing binary stars and chromospheric stars. Pulsating variables can in turn be classified into Cepheids, RR Lyrae, RV Tauri, Delta Scuti, Mira and other types. The eruptive or cataclysmic variables, such as novae and supernovae, occur rarely and are not periodic phenomena; most other variations are periodic in nature. Variable stars can be observed in many ways, including photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. A plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.
One way to identify the type of a variable star and to classify it is visual inspection of the phased light curve by an expert. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into stages: observation, data reduction, data analysis, modeling and classification. Modeling helps to determine the short-term and long-term behaviour of variable stars, to construct theoretical models (e.g., the Wilson-Devinney model for eclipsing binaries), and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, along with other derived parameters. Of these, the period is the most important, since a wrong period leads to sparse phased light curves and misleading information. Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps: ground-based observations are affected by the daily daylight cycle and weather conditions, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary aim is not variable star observation.
The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data, and most of these surveys release their data to the public for further analysis. Many period search algorithms exist for astronomical time series analysis; they can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) of Zechmeister (2009) and the Significant Spectrum (SigSpec) method of Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them can fully recover the true periods. Wrong period detection can arise for several reasons, such as power leakage to other frequencies caused by the finite total interval, the finite sampling interval and the finite amount of data. Another problem is aliasing, due to the influence of regular sampling; spurious periods also appear because of long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data remains a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, “Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial”. Derekas et al. (2007) and Deb et al. (2010) state that “the processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification”.
It would benefit the variable star community if basic parameters such as period, amplitude and phase could be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories behind four popular period search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For classifying newly discovered variable stars and entering them in the “General Catalogue of Variable Stars” or other databases such as the “Variable Star Index”, the characteristics of the variability have to be quantified in terms of variable star parameters.
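The phase-folding and period-search ideas described above can be made concrete with a minimal Phase Dispersion Minimisation sketch: a simplified version of the Stellingwerf (1978) statistic, applied here to a synthetic unevenly sampled light curve with a known period. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic unevenly sampled light curve: a sinusoid with a known period
# plus photometric noise.
true_period = 0.75                                  # days
t = np.sort(rng.uniform(0.0, 30.0, 300))            # uneven sampling over 30 d
mag = (12.0 + 0.3 * np.sin(2 * np.pi * t / true_period)
       + 0.02 * rng.standard_normal(t.size))

def pdm_theta(t, mag, period, nbins=10):
    """Simplified PDM statistic: pooled within-bin variance of the
    phase-folded light curve divided by the total variance. Near the true
    period the folded curve is coherent, so the statistic is minimised."""
    phase = (t / period) % 1.0
    bins = np.minimum((phase * nbins).astype(int), nbins - 1)
    pooled, n = 0.0, 0
    for b in range(nbins):
        m = mag[bins == b]
        if m.size > 1:
            pooled += m.var() * m.size
            n += m.size
    return (pooled / n) / mag.var()

# Scan a grid of trial periods and pick the one minimising the statistic.
trial_periods = np.linspace(0.5, 1.0, 2001)
theta = np.array([pdm_theta(t, mag, p) for p in trial_periods])
best_period = trial_periods[np.argmin(theta)]
```

The same folding step produces the phased light curve once a period is in hand; production methods add significance tests and finer frequency grids, but the core idea is this variance comparison.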
Abstract:
Consciousness, its various states, and the state-specific properties of those states have been subjects of inquiry in virtually every culture. As a result, a multiplicity of perspectives has emerged on the value of these states of consciousness and on the appropriate ways of producing and using them. The latter are known as transformative practices or technologies of consciousness. In the present work, after presenting the basic contemporary approaches used in the study of consciousness, we review the conceptions of consciousness arising from transpersonal psychology and from Mahayana Buddhism. This is followed by a presentation of the concepts of states and altered states of consciousness in psychotherapy. After discussing the notion of practices for the transformation of consciousness, the work concludes with a more detailed presentation of meditation and prayer as examples of technologies of consciousness used as means of healing and personal growth.
Abstract:
Not published
Abstract:
The atmosphere's fair weather electric field is a permanent feature, arising from the combination of distant thunderstorms, Earth's conducting surface, a charged ionosphere and cosmic ray ionization. Despite its ubiquity, no fair weather electricity effect on clouds has hitherto been demonstrated. Here we report surface measurements of radiation emitted and scattered by extensive thin continental cloud, which, after a delay of about 2 minutes, show changes closely following the fair weather electric field. For typical fluctuations in the fair weather electric field, changes of about 10% are subsequently induced in the diffuse short-wave radiation. These observations are consistent with enhanced production of large cloud droplets from charging at layer cloud edges.
Abstract:
The solar wind modulates the flux of galactic cosmic rays impinging on Earth inversely with solar activity. Cosmic ray ionisation is the major source of air's electrical conductivity over the oceans and well above the continents. Differential solar modulation of the cosmic ray energy spectrum modifies the cosmic ray ionisation at different latitudes, varying the total atmospheric columnar conductance. This redistributes current flow in the global atmospheric electrical circuit, including the local vertical current density and the related surface potential gradient. Surface vertical current density and potential gradient measurements made independently at Lerwick Observatory, Shetland, from 1978 to 1985 are compared with modelled changes in cosmic ray ionisation arising from solar activity changes. Both lower troposphere atmospheric electricity quantities are significantly increased at cosmic ray maximum (solar minimum), with a proportional change greater than that of the cosmic ray change.
Abstract:
During a 4-week run in October–November 2006, a pilot experiment was performed at the CERN Proton Synchrotron in preparation for the Cosmics Leaving OUtdoor Droplets (CLOUD) experiment, whose aim is to study the possible influence of cosmic rays on clouds. The purpose of the pilot experiment was firstly to carry out exploratory measurements of the effect of ionising particle radiation on aerosol formation from trace H2SO4 vapour, and secondly to provide technical input for the CLOUD design. A total of 44 nucleation bursts were produced and recorded, with formation rates of particles above the 3 nm detection threshold of between 0.1 and 100 cm⁻³ s⁻¹, and growth rates between 2 and 37 nm h⁻¹. The corresponding H2SO4 concentrations were typically around 10⁶ cm⁻³ or less. The experimentally measured formation rates and H2SO4 concentrations are comparable to those found in the atmosphere, supporting the idea that sulphuric acid is involved in the nucleation of atmospheric aerosols. However, sulphuric acid alone is not able to explain the observed rapid growth rates, which suggests the presence of additional trace vapours of unknown identity in the aerosol chamber. Analysis of the charged fraction shows that a few of the aerosol bursts appear to have a contribution from ion-induced nucleation and ion-ion recombination to form neutral clusters. Some indications were also found that the accelerator beam timing and intensity influenced the aerosol particle formation rate at the highest experimental SO2 concentration of 6 ppb, although none was found at lower concentrations. Overall, the exploratory measurements provide suggestive evidence for ion-induced nucleation or ion-ion recombination as sources of aerosol particles. However, in order to quantify the conditions under which ion processes become significant, improvements are needed in controlling the experimental variables and in the reproducibility of the experiments.
Finally, concerning technical aspects, the most important lessons for the CLOUD design include the stringent requirement of internal cleanliness of the aerosol chamber, as well as maintenance of extremely stable temperatures (variations below 0.1 °C).