33 results for MILKY-WAY SATELLITES

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

100.00%

Publisher:

Abstract:

The exhibition "Isabel Banal. Via Lactea" is part of the exhibition cycle "Blanc sota negre. Treballs des de l'imperceptible / 5", curated by Joana Masó (Centre Dona i Literatura) and Assumpta Bassas (Universitat de Barcelona). In it, the artist shows long-term, open-ended projects displayed on five tables. The table has been one of the central elements of her work since the beginning of her career, in relation to the creative and domestic space, but also as a double metaphor for the ground and for soil.

Relevance:

80.00%

Publisher:

Abstract:

One of the unresolved questions of modern physics is the nature of Dark Matter. Strong experimental evidence suggests that the contribution of this elusive component to the energy budget of the Universe is quite significant, without, however, providing conclusive information about its nature. The most plausible scenario is that of weakly interacting massive particles (WIMPs), which includes a large class of non-baryonic Dark Matter candidates with a mass typically between a few tens of GeV and a few TeV, and a cross section of the order of the weak interaction. The search for Dark Matter particles with very-high-energy gamma-ray Cherenkov telescopes is based on the premise that WIMPs can self-annihilate, producing detectable species such as photons. These photons are very energetic and, since they are undeflected by the Universe's magnetic fields, they can be traced straight back to the source of their creation. The downside of the approach is the large amount of background radiation, coming from conventional astrophysical objects, that usually hides clear signals of Dark Matter particle interactions. That is why a good choice of observational candidates is the crucial factor in the search for Dark Matter. With MAGIC (Major Atmospheric Gamma-ray Imaging Cherenkov Telescopes), a two-telescope ground-based system located on La Palma, Canary Islands, we choose objects such as dwarf spheroidal satellite galaxies of the Milky Way and galaxy clusters for our search. Our idea is to increase the chances of WIMP detection by pointing at objects that are relatively close, contain a great amount of Dark Matter, and suffer as little pollution from stars as possible. At the moment, several observation projects are ongoing and analyses are being performed.
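The abstract does not spell out the detection observable; as a sketch, the gamma-ray flux from WIMP self-annihilation that such analyses typically estimate factorizes into a particle-physics term and an astrophysical term (the standard textbook form, not a formula taken from this work):

```latex
\Phi(\Delta\Omega, E > E_{0}) =
  \underbrace{\frac{\langle \sigma v \rangle}{8\pi\, m_{\chi}^{2}}
    \int_{E_{0}}^{m_{\chi}} \frac{dN_{\gamma}}{dE}\, dE}_{\text{particle physics}}
  \;\times\;
  \underbrace{\int_{\Delta\Omega} \int_{\text{l.o.s.}} \rho^{2}(l,\Omega)\, dl\, d\Omega}_{J\text{-factor (astrophysics)}}
```

The second factor explains the candidate selection above: nearby objects with dense Dark Matter halos (large integral of ρ²) and little astrophysical background maximize the detectable signal.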

Relevance:

80.00%

Publisher:

Abstract:

The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way) by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row- or column-oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated astronomical data analysis workshops.
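Hypercube generation of this kind maps naturally onto MapReduce: each mapper bins a record along the chosen dimensions and emits the cell as a key, and reducers sum the partial counts per cell. A minimal in-memory sketch (the column names, bin ranges, and sample records are illustrative, not the Gaia schema or the paper's framework):

```python
from collections import defaultdict

def to_bin(value, lo, hi, nbins):
    """Clamp a value into one of nbins equal-width bins over [lo, hi)."""
    i = int((value - lo) / (hi - lo) * nbins)
    return max(0, min(nbins - 1, i))

def mapper(star):
    """Emit a (bin-tuple, 1) pair: one cell of a 2-D hypercube over
    magnitude and parallax (axes and ranges are illustrative)."""
    key = (to_bin(star["mag"], 5.0, 20.0, 15),
           to_bin(star["parallax"], 0.0, 100.0, 10))
    yield key, 1

def reducer(key, counts):
    """Sum the partial counts for one hypercube cell."""
    return key, sum(counts)

def run(records):
    """Tiny in-memory stand-in for the distributed MapReduce run."""
    groups = defaultdict(list)
    for rec in records:
        for key, one in mapper(rec):
            groups[key].append(one)
    return dict(reducer(k, v) for k, v in groups.items())

stars = [{"mag": 12.3, "parallax": 4.1},
         {"mag": 12.4, "parallax": 4.5},
         {"mag": 18.9, "parallax": 55.0}]
cube = run(stars)  # sparse hypercube: {cell: count}
```

On a real cluster only `mapper` and `reducer` would be supplied to the framework; the grouping and shuffling in `run` is exactly what the infrastructure does at scale.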

Relevance:

80.00%

Publisher:

Abstract:

Context. The understanding of Galaxy evolution can be facilitated by the use of population synthesis models, which allow one to test hypotheses on the star formation history, stellar evolution, and the chemical and dynamical evolution of the Galaxy. Aims. The new version of the Besançon Galaxy Model (hereafter BGM) aims to provide a more flexible and powerful tool to investigate the Initial Mass Function (IMF) and Star Formation Rate (SFR) of the Galactic disc. Methods. We present a new strategy for the generation of thin disc stars which treats the IMF, SFR, and evolutionary tracks as free parameters. We have updated most of the ingredients for the star count production and, for the first time, binary stars are generated in a consistent way. We keep in this new scheme the local dynamical self-consistency, as in Bienaymé et al. (1987). We then compare simulations from the new model with Tycho-2 data and the local luminosity function, as a first test to verify and constrain the new ingredients. The effects of changing thirteen different ingredients of the model are systematically studied. Results. For the first time, a full-sky comparison is performed between BGM and data. This strategy allows the IMF slope at high masses to be constrained; it is found to be close to 3.0, excluding a shallower slope such as Salpeter's. The SFR is found to be decreasing whatever IMF is assumed. The model is compatible with a local dark matter density of 0.011 M⊙ pc⁻³, implying that there is no compelling evidence for a significant amount of dark matter in the disc. While the model is fitted to Tycho-2 data, a magnitude-limited sample with V < 11, we check that it is still consistent with fainter stars. Conclusions. The new model constitutes a new basis for further comparisons with large-scale surveys and is being prepared to become a powerful tool for the analysis of the Gaia mission data.
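A power-law IMF with a high-mass slope near 3.0, as constrained above, can be drawn from by inverse-transform sampling. A small sketch (the mass range and the default slope are illustrative assumptions, not the BGM's actual star-generation scheme):

```python
import random

def sample_imf(n, alpha=3.0, m_lo=1.0, m_hi=100.0, seed=0):
    """Draw n stellar masses (solar masses) from a power-law IMF
    dN/dm ∝ m**(-alpha) via inverse-transform sampling (alpha != 1).

    The CDF of the truncated power law inverts in closed form:
    m = (m_lo**a + u * (m_hi**a - m_lo**a))**(1/a), with a = 1 - alpha.
    """
    rng = random.Random(seed)
    a = 1.0 - alpha
    lo, hi = m_lo**a, m_hi**a
    return [(lo + rng.random() * (hi - lo)) ** (1.0 / a) for _ in range(n)]

masses = sample_imf(10000)
```

With alpha = 3.0 the draws pile up near the lower mass bound, which is why the high-mass slope is only constrained by large, full-sky star counts such as the Tycho-2 comparison above.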

Relevance:

20.00%

Publisher:

Abstract:

Besley (1988) uses a scaling approach to model merit good arguments in commodity tax policy. In this paper, I question this approach on the grounds that it produces 'wrong' recommendations--taxation (subsidisation) of merit (demerit) goods--whenever the demand for the (de)merit good is inelastic. I propose an alternative approach that does not suffer from this deficiency, and derive the ensuing first and second best tax rules, as well as the marginal cost expressions to perform tax reform analysis.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to analyse the effects of recent regulatory reforms that the Spanish Health Authorities have implemented in the pharmaceutical market: the introduction of a reference price system together with the promotion of generic drugs. The main objectives of these two reforms are to increase price competition and, ultimately, reduce pharmaceutical costs. Before the introduction of reference prices, consumers had to pay a fixed copayment on the price of whatever drug was purchased. With the introduction of such a system, the situation changes as follows: if the consumer buys the more expensive branded drug, (s)he pays the sum of two elements: the copayment associated with the reference price plus the difference between the price of that drug and the reference price. However, if the consumer decides to buy the generic alternative, priced below the reference price, then (s)he pays the same copayment as before. We show that the introduction of a reference price system together with the promotion of generic drugs increases price competition and lowers pharmaceutical costs only if the reference price is set within a certain interval. Profits for the duopolists might also be reduced. These results are due to the opposing effects that reference prices have on branded and generic producers, respectively.
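The copayment rule described above is simple to state in code. A sketch, assuming an illustrative copay rate (the abstract does not give the actual Spanish rate):

```python
def out_of_pocket(price, ref_price, copay_rate=0.4):
    """Consumer's payment under the reference-price system described
    above. copay_rate is an illustrative assumption, not the real rate."""
    if price > ref_price:
        # Branded drug priced above the reference price: copayment on
        # the reference price plus the full excess over it.
        return copay_rate * ref_price + (price - ref_price)
    # Drug priced at or below the reference price: the ordinary
    # copayment on the drug's own price, as before the reform.
    return copay_rate * price

branded = out_of_pocket(10.0, ref_price=6.0)   # pays 0.4*6 + (10-6)
generic = out_of_pocket(5.0, ref_price=6.0)    # pays 0.4*5
```

Because the consumer bears 100% of any price above the reference, the branded producer's effective demand becomes much more price-sensitive there, which is the channel behind the "opposing effects" the abstract mentions.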

Relevance:

20.00%

Publisher:

Abstract:

We study a retail benchmarking approach to determining access prices for interconnected networks. Instead of considering fixed access charges as in the existing literature, we study access pricing rules that determine the access price that network i pays to network j as a linear function of the marginal costs and the retail prices set by both networks. In the case of competition in linear prices, we show that there is a unique linear rule that implements the Ramsey outcome as the unique equilibrium, independently of the underlying demand conditions. In the case of competition in two-part tariffs, we consider a class of access pricing rules, similar to the optimal one under linear prices but based on average retail prices. We show that under this class of rules firms set the variable price equal to marginal cost. Therefore, the regulator (or the competition authority) can choose one among these rules to pursue additional objectives such as consumer surplus or network coverage.

Relevance:

20.00%

Publisher:

Abstract:

Report for the scientific sojourn at the German Aerospace Center (DLR), Germany, during June and July 2006. The main objective of the two-month stay has been to apply the LEO (Low Earth Orbiter) satellite GPS navigation techniques which DLR currently uses in real-time navigation. These techniques comprise the use of a dynamical model which takes into account the precise Earth gravity field, together with models for the effects that perturb the LEO's motion: drag forces due to the Earth's atmosphere, solar radiation pressure due to sunlight impinging on the spacecraft, luni-solar gravity due to the attraction of the Sun and Moon, and tidal forces due to ocean and solid tides. Highly parameterized software was produced in the first part of the work, which has been used to assess what accuracy can be reached by exploring different models and complexities. The objective was to study accuracy versus complexity, taking into account that LEOs at different altitudes behave differently. In this frame, several LEOs have been selected over a wide range of altitudes, and several approaches of different complexity have been chosen. Complexity is a very important issue because processors onboard spacecraft have very limited computing and memory resources, so it is mandatory to keep the algorithms simple enough for the satellite to run them by itself.
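The accuracy-versus-complexity trade-off starts with the force model itself. As an illustrative first rung of that ladder, point-mass gravity plus the J2 oblateness term, the dominant gravity-field perturbation for a LEO (a generic textbook model, not DLR's actual software):

```python
import math

MU = 3.986004418e14   # Earth's gravitational parameter [m^3/s^2]
RE = 6378137.0        # Earth equatorial radius [m]
J2 = 1.08262668e-3    # Earth oblateness coefficient

def acceleration(r, use_j2=True):
    """Acceleration [m/s^2] at ECI position r = (x, y, z) [m]:
    point-mass gravity, optionally with the J2 oblateness term.
    Toggling use_j2 is the simplest accuracy/complexity knob."""
    x, y, z = r
    d = math.sqrt(x*x + y*y + z*z)
    ax, ay, az = (-MU * c / d**3 for c in r)
    if use_j2:
        k = 1.5 * J2 * MU * RE**2 / d**5
        zz = 5.0 * z*z / (d*d)
        ax += k * x * (zz - 1.0)
        ay += k * y * (zz - 1.0)
        az += k * z * (zz - 3.0)
    return ax, ay, az

a2body = acceleration((7.0e6, 0.0, 0.0), use_j2=False)
aj2 = acceleration((7.0e6, 0.0, 0.0))
```

Each further rung (higher-degree gravity field, drag, solar pressure, luni-solar and tidal terms, as listed above) adds accuracy at the cost of onboard CPU and memory, which is exactly the trade the report studies.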

Relevance:

20.00%

Publisher:

Abstract:

We give a survey of some recent results on Grothendieck duality. We begin with a brief reminder of the classical theory, and then launch into an overview of some of the striking developments since 2005.

Relevance:

20.00%

Publisher:

Abstract:

Research project carried out by a secondary school student and awarded a CIRIT Prize for fostering the scientific spirit among young people in 2009. "L'albedo lunar i els satèl·lits" (Lunar albedo and satellites) is a project linking aerospace engineering with astronomy. Its main objective is to investigate whether the lunar albedo, i.e. sunlight reflected off the lunar surface, can significantly modify the temperature of the solar panels of an artificial satellite orbiting the Moon and, consequently, affect their performance. The second objective is to determine whether it would be possible to build a lunar albedo map from the temperature of a satellite orbiting the Moon, which would allow the composition of the lunar surface to be known with greater precision. After acquiring the theoretical foundations needed for the project, finding a way to carry out the calculations, and performing the calculations themselves, the conclusions are that the lunar albedo causes a temperature increase in satellites significant enough to affect their performance, and that the temperatures recorded by a satellite orbiting the Moon could be used to create an albedo map. This research was carried out at the suggestion and under the supervision of CTAE (Centre de Tecnologia Aeroespacial), to analyse whether the results are applicable to the satellite that will be sent to the Moon, Lunar Mission BW1.
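The first conclusion can be illustrated with a simple radiative-balance estimate: a panel absorbs direct sunlight plus reflected lunar-albedo flux and radiates from both faces. A sketch with illustrative panel properties and an illustrative reflected-flux level (none of these values come from the original project):

```python
SIGMA = 5.670374419e-8   # Stefan–Boltzmann constant [W m^-2 K^-4]
SOLAR = 1361.0           # solar constant near the Moon [W/m^2]

def panel_temperature(albedo_flux, absorptivity=0.9, emissivity=0.85):
    """Equilibrium temperature [K] of a flat panel absorbing direct
    sunlight plus a reflected albedo flux on one face and radiating
    from both faces. Material properties are illustrative assumptions."""
    absorbed = absorptivity * (SOLAR + albedo_flux)
    return (absorbed / (2.0 * emissivity * SIGMA)) ** 0.25

t_no_albedo = panel_temperature(0.0)
# The Moon's average albedo is roughly 0.12; a panel facing the sunlit
# surface may receive a reflected flux of order 0.12 * SOLAR
# (an illustrative upper bound, not the project's computed value).
t_with_albedo = panel_temperature(0.12 * SOLAR)
```

Even this crude balance gives a temperature rise of several kelvin from the albedo term alone, consistent with the project's conclusion that the effect is large enough to matter for panel efficiency.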

Relevance:

20.00%

Publisher:

Abstract:

Reaching agreement among agents has always been difficult, especially when they must decide how to distribute the available amount of a scarce resource among themselves. On the one hand, there is a multiplicity of possible ways to assign the available amount; on the other hand, each agent will propose the distribution that provides her the highest possible award. In this paper, with the purpose of making this agreement easier, we first use two different sets of basic properties, called Commonly Accepted Equity Principles, to delimit what agents can propose as reasonable allocations. Second, we extend the results obtained by Chun (1989) and Herrero (2003), obtaining new characterizations of old and well-known bankruptcy rules. Finally, using the fact that bankruptcy problems can be analyzed in terms of awards and losses, we define a mechanism which provides a new justification for convex combinations of bankruptcy rules. Keywords: bankruptcy problems, Unanimous Concessions procedure, Diminishing Claims mechanism, Piniles' rule, Constrained Egalitarian rule. JEL classification: C71, D63, D71.
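As an illustration of the kind of rules being characterized, here is the classic Constrained Equal Awards rule for a bankruptcy problem (one standard rule from this literature; the paper's own procedures and mechanisms are not reproduced here):

```python
def cea(estate, claims, tol=1e-9):
    """Constrained Equal Awards rule: each claimant receives
    min(claim, lam), with lam chosen by bisection so that the awards
    exactly exhaust the estate (requires estate <= sum(claims))."""
    lo, hi = 0.0, max(claims)
    while hi - lo > tol:
        lam = (lo + hi) / 2.0
        if sum(min(c, lam) for c in claims) < estate:
            lo = lam   # awards too small: raise the cap
        else:
            hi = lam   # awards too large: lower the cap
    lam = (lo + hi) / 2.0
    return [min(c, lam) for c in claims]

# Estate of 100 with claims (30, 60, 90): the cap settles at 35,
# so the smallest claimant is paid in full.
awards = cea(100.0, [30.0, 60.0, 90.0])
```

The rule favors small claimants (equal treatment capped by one's own claim), which is exactly the kind of equity property the Commonly Accepted Equity Principles are used to delimit.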

Relevance:

20.00%

Publisher:

Abstract:

A major obstacle to processing images of the ocean floor comes from the absorption and scattering of light in the aquatic environment. Due to the absorption of natural light, underwater vehicles often require artificial light sources attached to them to provide adequate illumination. Unfortunately, these lights tend to illuminate the scene nonuniformly and, as the vehicle moves, induce shadows in the scene. For this reason, the first step towards applying standard computer vision techniques to underwater imaging is dealing with these lighting problems. This paper analyses and compares existing methodologies for dealing with low-contrast, nonuniform illumination in underwater image sequences. The reviewed techniques include: (i) study of the illumination-reflectance model, (ii) local histogram equalization, (iii) homomorphic filtering, and (iv) subtraction of the illumination field. Several experiments on real data have been conducted to compare the different approaches.
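Method (iv), subtraction of the illumination field, can be sketched in a few lines: estimate the slowly varying field with a large mean filter, subtract it, and restore the global brightness. A generic sketch using an integral image for the mean filter (not the paper's exact implementation; the kernel size is an illustrative choice):

```python
import numpy as np

def estimate_illumination(img, k=15):
    """Estimate the slowly varying illumination field with a k-by-k
    box (mean) filter, implemented with an edge-padded integral image."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    c = p.cumsum(0).cumsum(1)
    c = np.pad(c, ((1, 0), (1, 0)))  # zero row/col so window sums index cleanly
    win = c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]
    return win / (k * k)

def correct(img, k=15):
    """Illumination-field subtraction: remove the smooth field,
    then restore the global mean brightness."""
    field = estimate_illumination(img, k)
    return img - field + field.mean()

# A uniformly lit image passes through unchanged.
flat = correct(np.full((20, 20), 5.0))
```

A strong illumination gradient is largely flattened by the same call, while detail much smaller than the kernel (the reflectance component) is preserved; that separation of scales is the core assumption of the illumination-reflectance model reviewed as method (i).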