648 results for ASTRO-R8


Relevance:

10.00%

Publisher:

Abstract:

This thesis deals with some aspects of the physics of the early universe, such as phase transitions, bubble nucleation and the primordial density perturbations that lead to the formation of structures in the universe. Quantum aspects of the gravitational interaction play an essential role in theoretical high-energy physics. The questions of quantum gravity are naturally connected with the early universe and Grand Unification Theories. In spite of numerous efforts, the various problems of quantum gravity remain unsolved. Under these conditions, the consideration of different quantum gravity models is an inevitable stage in studying the quantum aspects of the gravitational interaction. The important role of the gravitationally coupled scalar field in the physics of the early universe is discussed in this thesis. The study shows that the scalar-gravitational coupling and the scalar curvature played a crucial role in determining the nature of the phase transitions that took place in the early universe. The key idea in studying the formation of structure in the universe is that of gravitational instability.
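As a hedged illustration of the gravitational-instability idea named above (the standard Jeans criterion, not a result derived in this thesis), a density perturbation in a medium with sound speed c_s and mass density ρ grows under self-gravity only if its wavelength exceeds the Jeans length:

```latex
% Standard Jeans criterion (illustrative only; not specific to this thesis).
% Perturbations longer than \lambda_J collapse; shorter ones oscillate acoustically.
\lambda_J = c_s \sqrt{\frac{\pi}{G\rho}}, \qquad
\lambda > \lambda_J \;\Rightarrow\; \text{gravitational collapse}, \qquad
\lambda < \lambda_J \;\Rightarrow\; \text{acoustic oscillation}.
```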

Relevance:

10.00%

Publisher:

Abstract:

The study of variable stars is an important topic in modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data requires many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary field of Astrostatistics.

For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various reasons. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, like eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature.

Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data, which contain time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star. One way to identify the type of a variable star and to classify it is by visual inspection of the phased light curve by an expert. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers.

Research on variable stars can be divided into different stages, such as observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g., the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as some other derived parameters. Of these, the period is the most important parameter, since a wrong period can lead to sparse light curves and misleading information.

Time series analysis is a method of applying mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. This is due to the daily variation of daylight and to weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic-ray particles.
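A minimal sketch of the phase-folding step just described, assuming a Python/NumPy environment; the function name and the synthetic light curve are illustrative and not taken from the thesis:

```python
import numpy as np

def phase_fold(times, mags, period, t0=0.0):
    """Fold a light curve on a trial period; return (phase, magnitude) sorted by phase."""
    phases = ((times - t0) / period) % 1.0   # phase in [0, 1)
    order = np.argsort(phases)
    return phases[order], mags[order]

# Synthetic, unevenly sampled sinusoidal variable with a true period of 2.5 days
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 300))          # observation times (days)
m = 12.0 + 0.3 * np.sin(2.0 * np.pi * t / 2.5)     # apparent magnitudes
phase, mag = phase_fold(t, m, period=2.5)           # phased light curve
```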
Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis.

There exist many period-search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, like the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them can fully recover the true periods. Wrong period detection can arise for several reasons, such as power leakage to other frequencies, which is due to the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence, obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, “Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial”. Derekas et al. (2007) and Deb et al. (2010) state: “The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification”.

It will be beneficial for the variable star astronomical community if basic parameters such as period, amplitude and phase are obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period-search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the “General Catalogue of Variable Stars” or other databases like the “Variable Star Index”, the characteristics of the variability have to be quantified in terms of variable star parameters.
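As a minimal, hedged sketch of one of the standard automated period-search methods mentioned above (the generalised Lomb-Scargle periodogram, here via its astropy implementation; this is not the modified cubic spline method introduced in the thesis), applied to a synthetic unevenly sampled light curve:

```python
import numpy as np
from astropy.timeseries import LombScargle

# Synthetic unevenly sampled light curve: true period 2.5 d, 0.02 mag noise
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 100.0, 400))                       # times (days)
y = 12.0 + 0.3 * np.sin(2.0 * np.pi * t / 2.5) + rng.normal(0.0, 0.02, t.size)
dy = np.full_like(t, 0.02)                                       # magnitude errors

# Generalised Lomb-Scargle periodogram over a chosen frequency range
frequency, power = LombScargle(t, y, dy).autopower(
    minimum_frequency=0.01, maximum_frequency=5.0)
best_period = 1.0 / frequency[np.argmax(power)]
print(f"Best-fit period: {best_period:.3f} d")                   # expect ~2.5 d
```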

Relevance:

10.00%

Publisher:

Abstract:

The CEIP Manuel Pérez school in Bollullos Par del Condado (Huelva) has received the First Prize for Educational Web Pages from the Junta de Andalucía, and has been awarded the Gold Medal for Educational Merit.

Relevance:

10.00%

Publisher:

Abstract:

Logo for the School of Physics and Astronomy in Inkscape SVG, PDF and high-resolution PNG formats

Relevance:

10.00%

Publisher:

Abstract:

The fascination of the power to give life to what is inanimate may help to reveal the enchantment we feel when fixed objects begin to move and to develop choreographies full of life. The technique of drawing became the ideal process for exploring these themes, because it is easier to draw robots than to build characters, spaceships or futuristic sets for live-action cinema. Anime thus favoured the SF genre precisely because its reproduction was more economical, making it a better choice for a cheaper spectacle (Clements & McCarthy, 2006, p. 567). The financial motive was not the only reason, for there are some important questions to explore that reinforce this SF tendency of anime. After the Second World War, the initial moment of Japan's reconstruction, the Ground Zero, coincides with the development of a pop culture in which manga drawings and anime became fundamental references. We will see how one of the important particularities of Japanese animated films and series is the exploration of SF imaginaries in which the atomic is a monstrous enchantment, which is controlled or by which one is controlled. With special emphasis on the television series Prince Planet (Sato Okura, 1965), Gigantor (Mitsuteru Yokoyama, 1963) and Astro Boy (Osamu Tezuka, 1963), we will show that, through the technical and visual components of animation, anime is a specific model of study for attempting to contextualise human attractions to worlds controlled by machines, to genetic manipulation, or to the possibility of a cyborg landscape.

Relevance:

10.00%

Publisher:

Abstract:

The fascination of the power to give life to what is inanimate may help to reveal the enchantment we feel when fixed objects begin to move and to develop choreographies full of life. In anime, the technique of drawing became the ideal process for exploring these themes, because it is easier to draw monstrous figures than to build characters, spaceships or futuristic sets for live-action cinema. After the Second World War, the initial moment of Japan's reconstruction, the Ground Zero, coincides with the development of a pop culture in which manga drawings and anime became fundamental references. We will see how one of the important particularities of Japanese animated series is the exploration of SF imaginaries in which the atomic is a monstrous enchantment, which is controlled or by which one is controlled. Giving special emphasis to the television series Prince Planet (Sato Okura, 1965), Gigantor (Mitsuteru Yokoyama, 1963) and Astro Boy (Osamu Tezuka, 1963), we will show that, through the technical and visual components of animation, anime is a specific model of study for attempting to contextualise the human attraction to worlds controlled by machines, by genetic manipulation, or by the possibility of a cyborg landscape.

Relevance:

10.00%

Publisher:

Abstract:

The present invention provides a process comprising substitution of an acceptor molecule comprising a group -XC(O)-, wherein X is O, S or NR8, where R8 is C1-6 alkyl, C6-12 aryl or hydrogen, with a nucleophile, wherein the acceptor molecule is cyclised such that said nucleophilic substitution at -XC(O)- occurs without racemisation. This process has particular application for the production of a peptide by extension from the activated carboxy-terminus of an acyl amino acid residue without epimerisation.

Relevance:

10.00%

Publisher:

Abstract:

The present invention provides a process comprising substitution of an acceptor molecule comprising a group -XC(O)-, wherein X is O, S or NR8, where R8 is C1-6 alkyl, C6-12 aryl or hydrogen, with a nucleophile, wherein the acceptor molecule is cyclised such that said nucleophilic substitution at -XC(O)- occurs without racemisation. This process has particular application for the production of a peptide by extension from the activated carboxy-terminus of an acyl amino acid residue without epimerisation.

Relevance:

10.00%

Publisher:

Abstract:

For the first time, vertical column measurements of nitric acid (HNO3) above the Arctic Stratospheric Ozone Observatory (AStrO) at Eureka (80°N, 86°W), Canada, have been made during polar night, using lunar spectra recorded with a Fourier Transform Infrared (FTIR) spectrometer from October 2001 to March 2002. AStrO is part of the primary Arctic station of the Network for the Detection of Stratospheric Change (NDSC). These measurements were compared with FTIR measurements at two other NDSC Arctic sites: Thule, Greenland (76.5°N, 68.8°W) and Kiruna, Sweden (67.8°N, 20.4°E). The measurements were also compared with two atmospheric models: the Canadian Middle Atmosphere Model (CMAM) and SLIMCAT. This is the first time that CMAM HNO3 columns have been compared with observations in the Arctic. Eureka lunar measurements are in good agreement with solar measurements made with the same instrument. Eureka and Thule HNO3 columns are consistent within measurement error. Differences between HNO3 columns measured at Kiruna and those measured at Eureka and Thule can be explained on the basis of the available sunlight hours and the location of the polar vortex. The comparison of CMAM HNO3 columns with Eureka and Kiruna data shows good agreement, considering CMAM's small inter-annual variability. The warm 2001/02 winter, with almost no Polar Stratospheric Clouds (PSCs), makes the comparison of the warm-climate version of CMAM with these observations a good test of CMAM under no-PSC conditions. SLIMCAT captures the magnitude of the HNO3 columns at Eureka, and their day-to-day variability, but generally reports higher HNO3 columns than the CMAM climatological mean columns.

Relevance:

10.00%

Publisher:

Abstract:

We outline our first steps towards marrying two new and emerging technologies: the Virtual Observatory (e.g., AstroGrid) and the computational grid. We discuss the construction of VOTechBroker, a modular software tool designed to abstract the tasks of submitting and managing a large number of computational jobs on a distributed computer system. The broker will also interact with the AstroGrid workflow and MySpace environments. We present our planned usage of the VOTechBroker in computing a huge number of n-point correlation functions from the SDSS, as well as in fitting over a million CMBfast models to the WMAP data.
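As an illustration of the kind of abstraction such a broker provides, here is a minimal, hypothetical Python sketch of submitting many independent parameter-sweep jobs through a single interface. The class, function names and use of a local process pool are placeholders only; this is not the VOTechBroker API, which targets distributed grid resources and AstroGrid workflows rather than a local machine.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed

class SimpleBroker:
    """Hypothetical stand-in for a job broker: hides where and how jobs run."""

    def __init__(self, max_workers=4):
        self._pool = ProcessPoolExecutor(max_workers=max_workers)

    def submit_all(self, task, parameter_sets):
        """Submit one job per parameter set; yield (parameters, result) as jobs finish."""
        futures = {self._pool.submit(task, **p): p for p in parameter_sets}
        for fut in as_completed(futures):
            yield futures[fut], fut.result()

def model_job(run_id, spectral_index):
    # Placeholder for one expensive model evaluation (e.g. one model fit)
    return run_id, spectral_index ** 2

if __name__ == "__main__":
    params = [{"run_id": i, "spectral_index": 0.90 + 0.01 * i} for i in range(20)]
    broker = SimpleBroker(max_workers=4)
    for spec, result in broker.submit_all(model_job, params):
        print(spec, "->", result)
```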

Relevance:

10.00%

Publisher:

Abstract:

Understanding the effects of ionisation in the lower atmosphere is a new interdisciplinary area, crossing the traditionally distinct scientific boundaries between astro-particle and atmospheric physics, and also requiring an understanding of both heliospheric and magnetospheric influences on cosmic rays. Following the paper of Erlykin et al. (2014), we develop further the interpretation of our observed changes in long-wave (LW) radiation (Aplin and Lockwood, 2013) by taking account of both cosmic-ray ionisation yields and atmospheric radiative transfer. To demonstrate this, we show that the thermal structure of the whole atmosphere needs to be considered along with the vertical profile of ionisation. Allowing, in particular, for ionisation by all components of a cosmic-ray shower, and not just by the muons, reveals that the effect we have detected is certainly not inconsistent with laboratory observations of the LW absorption cross section. The analysis presented here, although very different from that of Erlykin et al., comes to the same conclusion that the events detected by AL were not caused by individual cosmic-ray primaries: not because this is impossible on energetic grounds, but because events of the required energy are too infrequent for the 12 h⁻¹ rate at which they were seen in the AL experiment. The present paper numerically models the effect of three different scenario changes to the primary GCR spectrum, all of which reproduce the required magnitude of the effect observed by AL. However, they cannot by themselves explain the observed delay in the peak effect, which, if confirmed, would appear to open up a whole new and interesting area in the study of water oligomers and their effects on LW radiation. We argue that a technical artefact in the AL experiment is highly unlikely, and that our initial observations merit both a wide-ranging follow-up experiment and more rigorous, self-consistent, three-dimensional radiative transfer modelling.

Relevance:

10.00%

Publisher:

Abstract:

We report the use of optical coherence tomography (OCT) to detect and quantify the demineralization process induced by S. mutans biofilm in human third molars. Artificial lesions were induced by a S. mutans microbiological culture, and the samples (N = 50) were divided into groups according to demineralization time: 3, 5, 7, 9, and 11 days. The OCT system was implemented using a light source delivering an average power of 96 µW in the sample arm, with spectral characteristics allowing 23 µm of axial resolution. The images were produced with a lateral scan step of 10 µm and analyzed individually. From the evaluation of these images, the lesion depth was calculated as a function of demineralization time. The depth of the lesion in the root dentine increased from 70 µm to 230 µm (corrected by the enamel refractive index, 1.62 at 856 nm), depending on exposure time. The lesion depth in root dentine was correlated with demineralization time, showing that it follows a geometric progression like a bacterial growth law. [Figure: progression of lesion depth in root dentine as a function of exposure time, showing that it follows a geometric progression like a bacterial growth law.] (C) 2009 by Astro Ltd. Published exclusively by WILEY-VCH Verlag GmbH & Co. KGaA
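Taken literally, a geometric progression of lesion depth over equally spaced exposure times is equivalent to exponential growth, mirroring a simple bacterial growth law; as an illustrative formulation only (d_0, r and k are placeholder parameters, not values fitted in the paper):

```latex
% Illustrative geometric/exponential growth law for lesion depth vs. exposure time.
% d_0, r and k are placeholders, not fitted values from the paper.
d(t_n) = d_0\, r^{\,n}
\quad\Longleftrightarrow\quad
d(t) = d_0\, e^{k t}, \qquad k = \frac{\ln r}{\Delta t},
```

where Δt is the spacing between successive exposure times.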

Relevance:

10.00%

Publisher:

Abstract:

We have revisited photoassociative ionization (PAI) in a cold sample of Na atoms. A two-color experiment was performed in a magneto-optical trap through the addition of a probe laser. The observation of a marked change in the PAI rate for a definite frequency range can be attributed to the influence of repulsive levels and a possible avoided crossing between long-range molecular levels. (c) 2009 by Astro Ltd. Published exclusively by WILEY-VCH Verlag GmbH & Co. KGaA

Relevance:

10.00%

Publisher:

Abstract:

We investigated the effect on photodynamic therapy (PDT) outcome of combining three laser systems that produce light at three different wavelengths (600, 630, and 660 nm). Cooperative as well as independent effects can be observed. We compared the results of the combined wavelengths of light with the effect of a single laser for the excitation of the photosensitizer. In the current experiment, the photosensitizer used was Photogem® (1.5 mg/kg). When combining two wavelengths for PDT, their cumulative dose and different penetrability may change the overall effect of the light fluence, which can be effective for increasing the depth of necrosis. This evaluation was performed by comparing the depth and specific aspect of the necrosis obtained by using single and dual wavelengths for irradiation of the healthy liver of male Wistar rats. We used 15 animals, divided into five groups of three animals. First, Photogem® was administered, followed by measurement of the fluorescence spectrum of the liver before PDT to confirm the level of accumulation of the photosensitizer in the tissue. After that, an area of 1 cm² of the liver was illuminated using different laser combinations. Qualitative analysis of the necrosis was carried out through histological and morphological study. [Figure: (a) microscopic images of rat liver cells; (b) superficial necrosis caused by PDT using dual-wavelength illumination; (c) neutrophilic infiltration around the vessel inside the necrosis; (d) neutrophilic infiltration around the vessel between the necrosis and live tissue.] (C) 2011 by Astro Ltd. Published exclusively by WILEY-VCH Verlag GmbH & Co. KGaA

Relevance:

10.00%

Publisher:

Abstract:

A novel concept of quantum turbulence in finite-size superfluids, such as trapped bosonic atoms, is discussed. We have used an atomic 87Rb Bose-Einstein condensate (BEC) to study the emergence of this phenomenon. In our experiment, the transition to the quantum-turbulent regime is characterized by the formation of tangled vortex lines, controlled by the amplitude and time duration of the excitation produced by an external oscillating field. A simple model is suggested to account for the experimental observations. The transition from the non-turbulent to the turbulent regime is a rather gradual crossover, but it takes place in a sharp enough way to allow the definition of an effective critical line separating the regimes. Quantum turbulence emerging in a finite-size superfluid may be a new idea helpful for revealing important features associated with turbulence, a more general and broad phenomenon. [Figure: amplitude versus elapsed time diagram of a magnetically excited BEC superfluid, presenting the evolution from the non-turbulent regime, with well-separated vortices, to the turbulent regime, with tangled vortices.] (C) 2011 by Astro Ltd. Published exclusively by WILEY-VCH Verlag GmbH & Co. KGaA