881 results for Pluto, satellites
Abstract:
Low-temperature magneto-photoluminescence is a very powerful technique for characterizing high-purity GaAs and InP grown by various epitaxial techniques. These III-V compound semiconductor materials are used in a wide variety of electronic, optoelectronic, and microwave devices. The large differences in acceptor binding energies in GaAs and InP make it possible to identify those impurities by low-temperature photoluminescence without any magnetic field. However, the sensitivity and resolution provided by this technique remain inadequate to resolve the minute binding energy differences of donors in GaAs and InP. To achieve the higher sensitivity and resolution needed for the identification of donors, a magneto-photoluminescence system is installed along with a tunable dye laser, which provides resonant excitation. Donors in high-purity GaAs are identified from the magnetic splittings of the "two-electron" satellites of donor-bound-exciton transitions in a high magnetic field at liquid-helium temperature. This technique is successfully used to identify donors in n-type GaAs as well as in p-type GaAs, in which donors cannot be identified by any other technique. The technique is also employed to identify donors in high-purity InP. The amphoteric incorporation of Si and Ge impurities as donors and acceptors in (100), (311)A, and (311)B GaAs grown by molecular beam epitaxy is studied spectroscopically. The hydrogen passivation of C acceptors in high-purity GaAs grown by molecular beam epitaxy (MBE) and metalorganic chemical vapor deposition (MOCVD) is investigated using photoluminescence. Si acceptors in MBE GaAs are also found to be passivated by hydrogenation. Instabilities in the passivation of acceptor impurities are observed when the samples are exposed to light. Very high purity MOCVD InP samples with extremely high mobility are characterized by both electrical and optical techniques. It is determined that C is not typically incorporated as a residual acceptor in high-purity MOCVD InP. Finally, GaAs-on-Si, single-quantum-well, and multiple-quantum-well heterostructures fabricated from III-V semiconductors are also measured by low-temperature photoluminescence.
Abstract:
In this thesis, we present a quantitative approach based on probabilistic verification techniques for the analysis of reliability, availability, maintainability, and safety (RAMS) properties of satellite systems. The subject of our research is satellites used in mission-critical industrial applications. Our verification results make a strong case for using probabilistic model checking to support RAMS analysis of satellite systems. This study is intended to provide a foundation that helps reliability engineers with a basic background in model checking apply probabilistic model checking to small satellite systems. We make two major contributions. The first is the application of RAMS analysis to satellite systems. In the past, RAMS analysis has been applied extensively in electrical and electronics engineering. It allows system designers and reliability engineers to predict the likelihood of failures from historical or current operational data. There is high potential for applying RAMS analysis in space science and engineering; however, there is a lack of standardisation and of suitable procedures for the correct study of the RAMS characteristics of satellite systems. This thesis considers the promising application of RAMS analysis to satellite design, use, and maintenance, focusing on the system segments. Data collection and verification procedures are discussed, and a number of considerations are presented on how to predict the probability of failure. Our second contribution is leveraging the power of probabilistic model checking to analyse satellite systems. We present techniques for analysing satellite systems that differ from the more common quantitative approaches based on traditional simulation and testing; these techniques have not been applied in this context before. We present the use of probabilistic techniques via a suite of detailed examples, together with their analysis. The presentation is incremental, in terms of the complexity of the application domains and system models, with a detailed PRISM model of each scenario. We also provide results from practical work, together with a discussion of future improvements.
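For orientation only (the thesis itself builds PRISM models, which are not reproduced here): a minimal Python sketch of two of the RAMS quantities such an analysis targets, reliability over a mission time and steady-state availability, assuming a hypothetical constant failure rate and repair time.

```python
import math

# Hypothetical figures for a single satellite subsystem (illustrative only).
FAILURE_RATE = 1.0e-5   # failures per hour (lambda), assumed constant
MTTR_HOURS   = 48.0     # mean time to repair/recover, assumed

def reliability(t_hours: float, lam: float = FAILURE_RATE) -> float:
    """R(t) = exp(-lambda * t): probability of surviving t hours without failure."""
    return math.exp(-lam * t_hours)

def steady_state_availability(lam: float = FAILURE_RATE,
                              mttr: float = MTTR_HOURS) -> float:
    """A = MTBF / (MTBF + MTTR) for an exponential failure/repair model."""
    mtbf = 1.0 / lam
    return mtbf / (mtbf + mttr)

if __name__ == "__main__":
    print(f"1-year reliability : {reliability(8760):.4f}")
    print(f"Availability       : {steady_state_availability():.6f}")
```

A probabilistic model checker computes quantities of this kind (and far richer ones) directly from a state-based model rather than from closed-form formulas.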
Abstract:
Cassini states correspond to the equilibria of the spin axis of a body when its orbit is perturbed. They were initially described for planetary satellites, but the spin axes of black hole binaries also exhibit this kind of equilibrium. In previous works, Cassini states were reported as spin-orbit resonances, but in fact the spin of black hole binaries circulates and there is no resonant motion. Here we provide a general description of the spin dynamics of black hole binary systems based on a Hamiltonian formalism. In the absence of dissipation, the problem is integrable and it is easy to identify all possible trajectories of the spin for a given value of the total angular momentum. As the system collapses due to radiation reaction, the Cassini states shift to different positions, which modifies the dynamics around them. This is why the final spin distribution may differ from the initial one. Our method provides a simple way of predicting the distribution of the spin of black hole binaries at the end of the inspiral phase.
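For orientation only, and with sign conventions that differ between authors (this is the classical planetary-satellite background mentioned in the abstract, not the relativistic black-hole-binary Hamiltonian used there): for a spin precession constant α, obliquity ε, orbital inclination I to the invariable (Laplace) plane, and nodal precession rate g, the Cassini states are commonly written as the roots of

\[
\tfrac{1}{2}\,\alpha \sin 2\varepsilon + g \sin(\varepsilon - I) = 0 ,
\]

the obliquities at which the spin axis, the orbit normal, and the Laplace-plane normal remain coplanar while precessing at the same rate.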
Abstract:
Extrasolar planets abound in almost every possible configuration. However, until five years ago, there was a lack of planets orbiting closer than 0.5 au to giant or subgiant stars. Since then, recent detections have started to populate this regime, with 13 planetary systems confirmed. We discuss the properties of these systems in terms of their formation and evolution off the main sequence. Interestingly, we find that 70.0 ± 6.6% of the planets in this regime are inner components of multiplanetary systems. This value is 4.2σ higher than that for main-sequence hosts, which we find to be 42.4 ± 0.1%. The properties of the known planets seem to indicate that the closest-in planets (a < 0.06 au) to main-sequence stars are massive (i.e., hot Jupiters) and isolated, and that they are subsequently engulfed by their host as it evolves onto the red giant branch, leaving only the predominant population of multiplanetary systems in orbits with 0.06 < a < 0.5 au.
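As a quick check of the quoted significance, using only the numbers given in the abstract, the difference between the two multiplicity fractions divided by their combined uncertainty is

\[
\frac{70.0 - 42.4}{\sqrt{6.6^2 + 0.1^2}} \approx \frac{27.6}{6.6} \approx 4.2\,\sigma ,
\]

consistent with the 4.2σ figure stated above.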
Abstract:
Changes in the time and location of the surface temperature of a water body have an important effect on climate, marine biology, sea currents, salinity, and other characteristics of seas and lakes. Traditional measurement of temperature is costly and time-consuming because of its spatial dispersion and instability. In recent years the use of satellite technology and remote sensing for data acquisition and the analysis of climatological and oceanographic parameters has developed considerably. In this research we used NOAA satellite images from the AVHRR instrument to compare field surface temperature data with the information in the satellite images. Ten satellite images were used in this project. These images were calibrated with the field data acquired at the exact time of the satellite pass over the area. The result was a significant relationship between the surface temperatures derived from the satellite data and the field measurements. Since a relative error of less than 40% between these two data sets is considered acceptable, the maximum error of 21.2% found in our observations can be regarded as acceptable. At all stations the satellite-derived temperature is usually lower than the field data, which corresponds with global results as well. As this sea spans a wide range of latitudes, differences in temperature are natural, but we know this factor is not the only cause of surface currents. The information in all satellite images was extracted with the ERDAS software, and the Surfer software was used to plot the isotherm lines.
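A minimal sketch (with made-up numbers, not the study's data) of the relative-error comparison between satellite-derived and field-measured surface temperatures described above:

```python
import numpy as np

# Hypothetical paired measurements in degrees Celsius; the study used ten AVHRR
# scenes calibrated against field data collected at the time of the satellite pass.
field_sst     = np.array([18.2, 19.5, 21.0, 22.4, 20.1])
satellite_sst = np.array([17.0, 18.1, 19.6, 21.0, 18.9])

relative_error = np.abs(satellite_sst - field_sst) / field_sst * 100.0

print("relative error per station (%):", np.round(relative_error, 1))
print("maximum relative error (%):", round(float(relative_error.max()), 1))
# The abstract reports a maximum relative error of 21.2%, below the 40% threshold.
```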
Abstract:
The BL Lac object 1ES 1011+496 was discovered at Very High Energies (VHE, E > 100 GeV) in γ-rays by MAGIC in spring 2007. Before that, the source had been little studied at other wavelengths, so a multi-wavelength (MWL) campaign was organized in spring 2008. Along with MAGIC, the MWL campaign included the Metsähovi radio observatory, the Bell and KVA optical telescopes, and the Swift and AGILE satellites. The MAGIC observations span March to May 2008 for a total of 27.9 hours, of which 19.4 hours remained after quality cuts. The light curve showed no significant variability, yielding an integral flux above 200 GeV of (1.3 ± 0.3) × 10^(−11) photons cm^(−2) s^(−1). The differential VHE spectrum could be described by a power-law function with a spectral index of 3.3 ± 0.4. Both results are similar to those obtained during the discovery. Swift XRT observations revealed an X-ray flare, characterized by a harder-when-brighter trend, as is typical for high-synchrotron-peak BL Lac objects (HBLs). Strong optical variability was found during the campaign, but no conclusion on the connection between the optical and VHE γ-ray bands could be drawn. The contemporaneous SED shows a synchrotron-dominated source, unlike what was concluded in previous work based on non-simultaneous data, and is well described by a standard one-zone synchrotron self-Compton model. We also performed a study of the source classification. While the optical and X-ray data taken during our campaign show the typical characteristics of an HBL, we suggest, based on archival data, that 1ES 1011+496 is actually a borderline case between intermediate- and high-synchrotron-peak BL Lac objects.
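Written out, the power-law fit quoted above has the standard form (the normalization N_0 and pivot energy E_0 are not given in the abstract):

\[
\frac{dN}{dE} = N_0 \left(\frac{E}{E_0}\right)^{-\Gamma}, \qquad \Gamma = 3.3 \pm 0.4 ,
\]

with an integral flux above 200 GeV of (1.3 ± 0.3) × 10^(−11) photons cm^(−2) s^(−1).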
Abstract:
This congress was organized jointly by the two major associations for aquaculture development worldwide: the World Aquaculture Society (WAS, 2,300 members) and the European Aquaculture Society (EAS, 550 members). The previous edition had been held in Nice in May 2000. Over five days it brought together more than 3,000 researchers and institutional officials of 95 nationalities. About 600 oral presentations were distributed over 67 sessions, with 11 rooms running in parallel. The panels of the 460 posters were displayed between the conference rooms. On site, the trade exhibition hosted 135 companies and organizations; it received about 2,000 visitors, half of them Italian (source: EAS). The general theme of the congress was the link between tradition and technology. The objective was to show that technologies, whose image is ambivalent, are a remarkable tool for developing aquaculture, including when sustainability constraints are taken into account. Indeed, the expectations of citizens, as of consumers, remain centred on the notions of quality, food safety, animal welfare and animal health. The work covered the classical disciplines of aquaculture (nutrition, physiology, genetics, etc.) and their relationship with biotechnologies. Worth highlighting is the emergence of themes increasingly linked to demonstrating that aquaculture can fit into the environment without destroying it (ecosystem carrying capacity, escaped animals, etc.) and to the perception of society (consumer perception, aquaculture and society, the position of environmental NGOs, etc.). The EU was very present, with 5 representatives and marked involvement in several sessions. In various forms, its representatives reiterated the EU's willingness to continue supporting aquaculture, with the objective of pursuing the development of the sector (4% average growth per year). Aquaculture is expected to generate 8,000 to 10,000 new jobs over the next 15 years, notably in shellfish farming and offshore marine fish farming, with integration as the key word: integration into the environment, into the coastal socio-economic fabric, and into the imagination of people (tourists, consumers, elected officials, etc.). This congress of worldwide scope attracted representatives of regions that are usually under-represented, such as the Middle East and China, the latter present in many sessions. It was also the venue for numerous satellite meetings, almost always involving French researchers: major ongoing European programmes such as SeaFood+, ASEM (Europe-Asia cooperation) and Consensus, the EAS general assembly, an IUCN working group, etc. This type of meeting confirms the importance of direct personal contacts in order to (1) assess the major worldwide trends of the sector, (2) establish direct contacts with senior researchers from the major research teams and with decision-makers at the European and extra-European level, and (3) test ideas and proposals for collaboration and partnership. It is an exceptional forum for disseminating knowledge, information and messages. It offers a space in which Ifremer is seen and recognized by many actors of the aquaculture research community. Thanks to the thematic variety of the sessions in which Ifremer researchers are active, the institute strengthens its reputation, notably for its multidisciplinary dimension.
This capacity to act as an integrator is the quality most in demand in the workshop conclusions and the rarest among the institutes present. The congress clearly shows the evolution of world aquaculture: ten years ago, mass production was in the South while technology and markets were in the North (USA, Europe, Japan). Today, the rapid economic and scientific progress of southern countries, above all in Asia, is creating solvent local markets for aquaculture (China, India) and giving rise to a "mass" research capacity. This situation requires western research to evolve in order to remain competitive. To preserve an important sector, which contributes increasingly to protein food security (4% per year, a record among all food production sectors), European aquaculture research must maintain its effort so as to keep a head start, above all in the areas that will be vital in the coming decade: relations with the natural environment (sustainability, including control of energy costs), product quality, food safety, socio-economic integration in increasingly coveted areas, and control of the image of the "farmed" species (industrial but controlled) versus the image of the "wild" species (natural but polluted and overexploited). As a consequence, the EU retains its full development potential, because most of the added value will lie less and less in the quantity produced and more in the control of quality. This evolution gives its full value to the research carried out by Ifremer, all the more so if the institute is able to anticipate the needs and expectations of companies, consumers, associations and international organizations. In this vision, responsiveness, the capacity to act as an integrator and forward-looking reflection are the qualities to develop so that Ifremer can give its full measure, notably in aquaculture research. These challenges are on an international scale, and Ifremer is one of the small number of institutes capable of addressing them in broad partnership, fully in line with the policy desired by the EU.
Abstract:
Satellites have great potential for diagnosing surface air quality conditions, though the reduced sensitivity of satellite instrumentation to the lower troposphere currently limits their applicability. One objective of the NASA DISCOVER-AQ project is to provide information relevant to improving our ability to relate satellite-observed columns to surface conditions for key trace gases and aerosols. In support of DISCOVER-AQ, this dissertation investigates the degree of correlation between O3 and NO2 column abundance and surface mixing ratio during the four DISCOVER-AQ deployments; characterizes the variability of the aircraft in situ and model-simulated O3 and NO2 profiles; and uses the WRF-Chem model to further investigate the role of boundary layer mixing in the column-surface connection for the Maryland 2011 deployment and to determine which of the available boundary layer schemes best captures the observations. Simple linear regression analyses suggest that O3 partial-column observations from future satellite instruments with sufficient sensitivity to the lower troposphere may be most meaningful for surface air quality under the conditions associated with the Maryland 2011 campaign, which included generally deep, convective boundary layers, the least wind shear of the four deployments, and few geographical influences on local meteorology, with the exception of bay breezes. Hierarchical clustering analysis of the in situ O3 and NO2 profiles indicates that the degree of vertical mixing (defined by the temperature lapse rate) associated with each cluster exerted an important influence on the shapes of the median cluster profiles for O3, and also affected the column-versus-surface correlations of many clusters for both O3 and NO2. However, comparisons to the CMAQ model suggest that, among other errors, vertical mixing is overestimated, producing too strong a column-surface connection within the model. Finally, the WRF-Chem model, a meteorological model with coupled chemistry, is used to further investigate the impact of vertical mixing on the O3 and NO2 column-surface connection for an ozone pollution event that occurred on July 26-29, 2011. Five PBL schemes were tested, with no single scheme producing a clearly and consistently better comparison with the observed PBL heights and pollutant profiles; despite improvements, the ACM2 scheme continues to overestimate vertical mixing.
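A minimal sketch (with made-up arrays, not the campaign data) of the simple linear regression between column abundance and surface mixing ratio mentioned above:

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired observations for one site: O3 partial column (Dobson units)
# and coincident surface O3 mixing ratio (ppbv).
o3_column  = np.array([28.0, 30.5, 33.1, 35.8, 38.2, 40.6])
o3_surface = np.array([41.0, 48.0, 55.0, 62.0, 70.0, 76.0])

fit = linregress(o3_column, o3_surface)

print(f"slope     : {fit.slope:.2f} ppbv per DU")
print(f"intercept : {fit.intercept:.1f} ppbv")
print(f"r^2       : {fit.rvalue**2:.3f}")  # strength of the column-surface connection
```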
Abstract:
Over the past decade, the diminishing Arctic sea ice has affected the wave field, which depends on the ice-free ocean and the wind. This study characterizes the wave climate in the Arctic over 1992-2014 using a merged altimeter data set and a wave hindcast that takes CFSR winds and satellite-derived ice concentrations as input. The model performs well when verified against the altimeters and is sufficiently consistent for climate studies. Wave seasonality and extremes are linked to the ice coverage, wind strength, and wind direction, creating distinct features in the wind seas and swells. The altimeters and the model show that the increase in wave heights is driven by the reduction of sea ice coverage rather than by the wind. However, the trends are convoluted with interannual climate oscillations such as the North Atlantic Oscillation (NAO) and the Pacific Decadal Oscillation. In the Nordic Greenland Sea, the NAO influences the decreasing wind speeds and wave heights. Swells are becoming more prevalent and wind-sea steepness is declining. The satellite data show that the sea ice minimum occurs later in fall, when wind speeds increase, creating more favorable conditions for wave development. We therefore expect the ice freeze-up in fall to be the most critical season in the Arctic, and small changes in ice cover, wind speed, and wave height can have large impacts on the evolution of the sea ice throughout the year. It remains inconclusive how important wave-ice processes are within the climate system, but selected events suggest the importance of waves within the marginal ice zone.
Abstract:
The principle of GNSS positioning is based, in short, on solving a mathematical problem that involves observing the distances from the user to a set of satellites with known coordinates. The resulting position can be computed in absolute or relative mode. Absolute positioning requires only one receiver to determine the position. Relative positioning, in turn, relies on reference stations and involves additional receivers beyond the user's own. Thus, the methods most commonly used to determine the position of a mobile platform with centimetre-level accuracy are based on this latter type of positioning. However, they have the disadvantage of depending on reference stations, with limited range, and they require simultaneous observations of the same satellites by both the station and the receiver. In this context, a new GNSS positioning methodology in absolute mode was developed, based on modelling or removing the errors associated with each component of the observation equations and on the use of precise ephemerides and satellite clock corrections. This positioning method is called Precise Point Positioning (PPP) and maintains a high accuracy, equivalent to that of relative positioning systems. In this work, after an in-depth study of the subject, an academic PPP application was developed using the C++ class library of the GPS Toolkit, which determines the receiver's position and velocity in kinematic mode and in real time. This application was tested using observation data from a static station (processed in kinematic mode) and from a moving station installed on the NRP Auriga. The results achieved decimetre-level accuracy for position and cm/s-level accuracy for velocity.
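For context (standard textbook form, not taken from the thesis), the undifferenced code and carrier-phase observation equations that PPP models or corrects term by term can be written as

\[
P = \rho + c\,(\delta t_r - \delta t^s) + T + I + \varepsilon_P ,
\qquad
\Phi = \rho + c\,(\delta t_r - \delta t^s) + T - I + \lambda N + \varepsilon_\Phi ,
\]

where ρ is the geometric range, δt_r and δt^s are the receiver and satellite clock errors, T the tropospheric delay, I the ionospheric delay (usually eliminated with an ionosphere-free combination), λN the carrier-phase ambiguity, and ε the measurement noise; precise ephemerides and clock products supply δt^s and the satellite positions that enter ρ.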
Abstract:
Portugal has one of the largest Exclusive Economic Zones in the world, a maritime space containing a wealth that has not yet been properly assessed but is believed to be enormous. Thousands of ships cross it every year, bound for the most diverse destinations and carrying the most varied cargoes. The country's geostrategic position places it at the centre of some of the busiest shipping routes, which makes it extremely important to survey and monitor Portuguese waters in order to ensure that the laws and regulations of international maritime law are complied with and that the national interest is properly safeguarded. Accordingly, this dissertation takes maritime surveillance and monitoring systems as its object of study and aims to contribute to improving the current system for surveying and monitoring the maritime spaces under Portuguese sovereignty or jurisdiction, focusing on airborne and space-based systems for the detection of surface vessels. To this end, the first part studies the maritime environment and the threats that affect it. The second part examines the current systems that contribute to maritime situational awareness in Portugal, culminating in a third part that studies the assets and sensors that can improve coverage of the maritime space, with the ultimate goal of guaranteeing security at sea. The study concludes that unmanned aircraft appear to be the most immediate future for clarifying the maritime picture, with satellites coming in a second line: despite their higher costs, they can also make an enormous contribution to maritime situational awareness, since they are able to cover larger areas more quickly.
Abstract:
The new generation of artificial satellites is providing a huge volume of Earth observation images whose exploitation can yield invaluable benefits, both economic and environmental. However, only a small fraction of this data volume has been analyzed, mainly because of the large human resources that the task requires. For this reason, the development of unsupervised methodologies for the analysis of these images is a priority. In this work, a new unsupervised segmentation algorithm for satellite images is proposed. The algorithm is based on rough-set theory and is inspired by a previous segmentation algorithm defined in the RGB color domain. The main contributions of the new algorithm are: (i) the original algorithm is extended to four spectral bands; (ii) the concept of the superpixel is used to define the neighborhood similarity of a pixel adapted to the local characteristics of each image; and (iii) two new region-merging strategies are proposed and evaluated in order to establish the final number of regions in the segmented image. The experimental results show that the proposed approach improves on the results provided by the original method when both are applied to satellite images with different spectral and spatial resolutions.
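A minimal sketch, with an assumed Euclidean threshold and ignoring region adjacency, of the kind of spectral region merging named in contribution (iii); it is not the paper's rough-set-based algorithm, only an illustration of merging an initial over-segmentation of a 4-band image:

```python
import numpy as np

def region_means(image, labels):
    """Mean 4-band spectral vector of every labelled region.

    image  : (H, W, 4) float array, four spectral bands
    labels : (H, W) int array, initial over-segmentation (e.g. superpixels)
    """
    return {i: image[labels == i].mean(axis=0) for i in np.unique(labels)}

def merge_similar_regions(image, labels, threshold=0.05):
    """Greedily merge regions whose mean spectra are closer than `threshold`.

    Simplified illustration: adjacency is ignored and a single Euclidean
    threshold in the 4-band space is used (the threshold value is arbitrary).
    """
    means = region_means(image, labels)
    ids = sorted(means)
    parent = {i: i for i in ids}              # union-find forest

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path compression
            i = parent[i]
        return i

    for a in ids:
        for b in ids:
            if b > a and np.linalg.norm(means[a] - means[b]) < threshold:
                parent[find(b)] = find(a)     # merge region b into region a

    return np.vectorize(find)(labels)

# Example: a tiny random 4-band image with a 4-region initial labelling.
rng = np.random.default_rng(0)
img = rng.random((8, 8, 4))
lab = np.repeat(np.arange(4), 16).reshape(8, 8)
print(np.unique(merge_similar_regions(img, lab, threshold=0.3)))
```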
Abstract:
Scientific knowledge about climate change (CC) is evolving rapidly. However, uncertainties persist about the extent of its consequences, particularly in urban areas, which will experience impacts different from those felt in rural areas. Public authorities have recently begun to develop public policies for climate change adaptation (CCA) that aim to limit its undesirable consequences. In urban settings, the literature suggests that greening is one of the tools these policies should favour. Several authors point out that CCA actions can be grafted onto existing policies. CCA, as a public issue, can therefore be pursued by taking it into account in public greening policies. Taking it into account should affect both the content (what?) and the steering (how?) of the different stages of those policies. The case of the City of Montreal's public greening policy, in Quebec, allowed us to study this uptake. Using a public policy analysis framework developed by Knoepfel et al. (2015), which focuses among other things on the mobilization of resources by the various actors concerned by these policies, we show that this uptake occurred in several ways. First, there was a change in the rationale for greening, which is now presented as a tool for combating urban heat islands and ensuring better stormwater management. Second, the choice of the agglomeration scale for taking CCA into account led to a change of scale in the management of greening, as demonstrated by the publication of a major urban greening action plan for the agglomeration whose leitmotiv is CCA. A number of regulatory changes and the inclusion of new actors in the policy also show that this uptake took place. Finally, the action plan provides a framework for implementing greening in the areas most vulnerable to CC, together with a cost-sharing structure. However, the implementation of greening for CCA purposes was not evaluated in the present study. We also noted that biodiversity is an important issue that goes hand in hand with CCA in the greening policy. There is therefore a partial uptake of CCA in Montreal's public greening policy, with certain shortcomings. We argue that the CCA issue may serve as an additional argument for greening the city rather than being a genuine driver of transformation of the greening policy.
Abstract:
The Hungarian Revolution is often analysed in a national context or from the angle of Hungarian-Soviet relations. From this perspective, the Eastern European satellites seem mere puppets and the Soviet bloc a monolith. Archival evidence nevertheless shows that the Kremlin actually attempted to build a new kind of international relations after Stalin’s death in 1953, in which the Eastern European leaders would gain more scope for manoeuvre. This attempt at liberalisation even facilitated the uprisings in Hungary in 1956. Avoiding a teleological approach to the Hungarian Revolution, this article argues that the Soviet invasion was neither inevitable, nor wholly unilateral. Khrushchev even sought to legitimise the invasion in bilateral and multilateral consultations. There was a mutual interest in sacrificing Hungary’s sovereignty to safeguard the communist monopoly on power. This multilateralisation of Soviet bloc security is an important explanatory factor in an analysis of the Revolution and its repercussions in Eastern Europe.