Abstract:
Atmospheric parameters, such as pressure (P), temperature (T) and density (ρ ∝ P/T), affect the development of extensive air showers initiated by energetic cosmic rays. We have studied the impact of atmospheric variations on extensive air showers by means of the surface detector of the Pierre Auger Observatory. The rate of events shows a ∼10% seasonal modulation and a ∼2% diurnal one. We find that the observed behaviour is explained by a model including the effects associated with the variations of P and ρ. The former affects the longitudinal development of air showers while the latter influences the Molière radius and hence the lateral distribution of the shower particles. The model is validated with full simulations of extensive air showers using atmospheric profiles measured at the site of the Pierre Auger Observatory. (C) 2009 Elsevier B.V. All rights reserved.
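The pressure-and-density dependence described above can be illustrated with a toy linear model. This is only a sketch: the coefficients a_P and a_rho, the reference pressure and temperature, and the normalization below are hypothetical illustrative values, not the measured ones.

```python
# Toy linear atmospheric-modulation model for the event rate:
#   R = R0 * [1 + a_P*(P - P0) + a_rho*(rho - rho0)],  with rho ∝ P/T.
# a_P, a_rho, P0, T0 are HYPOTHETICAL values chosen only to show the shape.

def event_rate(P_hPa, T_K, R0=1.0, P0=860.0, T0=288.0,
               a_P=-3e-3, a_rho=-2.0):
    """Relative event rate for a given pressure (hPa) and temperature (K)."""
    R_gas = 287.05                           # specific gas constant of dry air, J/(kg K)
    rho = P_hPa * 100.0 / (R_gas * T_K)      # air density, kg/m^3 (rho ∝ P/T)
    rho0 = P0 * 100.0 / (R_gas * T0)         # density at the reference conditions
    return R0 * (1.0 + a_P * (P_hPa - P0) + a_rho * (rho - rho0))

# Warmer air at fixed pressure is less dense, so the density term shifts the rate:
print(event_rate(860.0, 288.0), event_rate(860.0, 298.0))
```

The two terms mirror the abstract's model: the pressure term stands in for the effect on the longitudinal development, the density term for the change of the Molière radius and hence of the lateral particle distribution.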
Abstract:
From direct observations of the longitudinal development of ultra-high energy air showers performed with the Pierre Auger Observatory, upper limits of 3.8%, 2.4%, 3.5% and 11.7% (at 95% C.L.) are obtained on the fraction of cosmic-ray photons above 2, 3, 5 and 10 EeV (1 EeV ≡ 10^18 eV), respectively. These are the first experimental limits on ultra-high energy photons at energies below 10 EeV. The results complement previous constraints on top-down models from array data and they reduce systematic uncertainties in the interpretation of shower data in terms of primary flux, nuclear composition and proton-air cross-section. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
The Pierre Auger Collaboration has reported evidence for anisotropy in the distribution of arrival directions of cosmic rays with energies E > E_th = 5.5 × 10^19 eV. These show a correlation with the distribution of nearby extragalactic objects, including an apparent excess around the direction of Centaurus A. If the particles responsible for these excesses at E > E_th are heavy nuclei with charge Z, the proton component of the sources should lead to excesses in the same regions at energies E/Z. Here we report the lack of anisotropies in these directions at energies above E_th/Z (for illustrative values of Z = 6, 13, 26). If the anisotropies above E_th are due to nuclei with charge Z, and under reasonable assumptions about the acceleration process, these observations imply stringent constraints on the allowed proton fraction at the lower energies.
Abstract:
We present the results of searches for dipolar-type anisotropies in different energy ranges above 2.5 × 10^17 eV with the surface detector array of the Pierre Auger Observatory, reporting on both the phase and the amplitude measurements of the first harmonic modulation in the right-ascension distribution. Upper limits on the amplitudes are obtained, which provide the most stringent bounds at present, being below 2% at 99% C.L. for EeV energies. We also compare our results with those of previous experiments as well as with some theoretical expectations. (C) 2011 Elsevier B.V. All rights reserved.
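The first-harmonic analysis in right ascension mentioned above follows the standard Rayleigh formalism. A sketch of the textbook formulas (this is not the collaboration's exact analysis chain, and the sample below is a toy one, not Auger data):

```python
import math

def first_harmonic(alphas_deg):
    """Rayleigh first-harmonic analysis of a right-ascension distribution.

    Returns (amplitude r, phase in degrees, chance probability exp(-N r^2 / 4))
    for right ascensions given in degrees — the classic textbook estimators.
    """
    n = len(alphas_deg)
    a = 2.0 / n * sum(math.cos(math.radians(x)) for x in alphas_deg)
    b = 2.0 / n * sum(math.sin(math.radians(x)) for x in alphas_deg)
    r = math.hypot(a, b)                              # harmonic amplitude
    phase = math.degrees(math.atan2(b, a)) % 360.0    # harmonic phase
    return r, phase, math.exp(-n * r * r / 4.0)

# A toy sample clustered near 90 degrees yields a large amplitude at that phase:
r, phase, p = first_harmonic([80.0, 85.0, 90.0, 95.0, 100.0])
```

A measured amplitude consistent with the isotropic expectation is then converted into an upper limit on the dipole amplitude, which is how bounds like the 2% at 99% C.L. quoted above arise.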
Abstract:
Since data-taking began in January 2004, the Pierre Auger Observatory has been recording the count rates of low-energy secondary cosmic-ray particles for the self-calibration of the ground detectors of its surface detector array. After correcting for atmospheric effects, modulations of galactic cosmic rays due to solar activity and transient events are observed. Temporal variations related to the activity of the heliosphere can be determined with high accuracy due to the high total count rates. In this study, the available data are presented together with an analysis focused on the observation of Forbush decreases, where a strong correlation with neutron monitor data is found.
Abstract:
The Pierre Auger Observatory is a detector for ultra-high energy cosmic rays. It consists of a surface array to measure secondary particles at ground level and a fluorescence detector to measure the development of air showers in the atmosphere above the array. The "hybrid" detection mode combines the information from the two subsystems. We describe the determination of the hybrid exposure for events observed by the fluorescence telescopes in coincidence with at least one water-Cherenkov detector of the surface array. A detailed knowledge of the time dependence of the detection operations is crucial for an accurate evaluation of the exposure. We discuss the relevance of monitoring data collected during operations, such as the status of the fluorescence detector, background light and atmospheric conditions, that are used in both simulation and reconstruction. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Data collected by the Pierre Auger Observatory through 31 August 2007 showed evidence for anisotropy in the arrival directions of cosmic rays above the Greisen-Zatsepin-Kuzmin energy threshold, 6 × 10^19 eV. The anisotropy was measured by the fraction of arrival directions that are less than 3.1 degrees from the position of an active galactic nucleus within 75 Mpc (using the 12th edition of the Véron-Cetty and Véron catalog). An updated measurement of this fraction is reported here using the arrival directions of cosmic rays recorded above the same energy threshold through 31 December 2009. The number of arrival directions has increased from 27 to 69, allowing a more precise measurement. The correlating fraction is (38 +7/−6)%, compared with 21% expected for isotropic cosmic rays. This is down from the early estimate of (69 +11/−13)%. The enlarged set of arrival directions is also examined in relation to other populations of nearby extragalactic objects: galaxies in the 2 Micron All Sky Survey and active galactic nuclei detected in hard X-rays by the Swift Burst Alert Telescope. A celestial region around the position of the radio galaxy Cen A has the largest excess of arrival directions relative to isotropic expectations. The two-point autocorrelation function is shown for the enlarged set of arrival directions and compared to the isotropic expectation. (C) 2010 Elsevier B.V. All rights reserved.
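The significance of a correlating fraction exceeding the isotropic expectation can be gauged with a simple binomial tail probability. A sketch only: the 21% per-event isotropic expectation comes from the text, but the event counts below are illustrative, and this is not the collaboration's statistical procedure.

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance probability that at
    least k of n isotropic arrival directions correlate by accident."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Illustrative numbers only: 26 of 69 directions correlating (~38%)
# against a 21% per-event isotropic expectation.
p_chance = binomial_tail(69, 26, 0.21)
```

The smaller this tail probability, the harder the observed correlating fraction is to explain as an isotropic fluctuation.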
Abstract:
The Pierre Auger Observatory is a hybrid detector for ultra-high energy cosmic rays. It combines a surface array to measure secondary particles at ground level with a fluorescence detector to measure the development of air showers in the atmosphere above the array. The fluorescence detector comprises 24 large telescopes specialized for measuring the nitrogen fluorescence caused by charged particles of cosmic-ray air showers. In this paper we describe the components of the fluorescence detector including its optical system, the design of the camera, the electronics, and the systems for relative and absolute calibration. We also discuss the operation and the monitoring of the detector. Finally, we evaluate the detector performance and the precision of shower reconstructions. (C) 2010 Elsevier B.V. All rights reserved.
Measurement of the energy spectrum of cosmic rays above 10^18 eV using the Pierre Auger Observatory
Abstract:
We report a measurement of the flux of cosmic rays with unprecedented precision and statistics using the Pierre Auger Observatory. Based on fluorescence observations in coincidence with at least one surface detector we derive a spectrum for energies above 10^18 eV. We also update the previously published energy spectrum obtained with the surface detector array. The two spectra are combined addressing the systematic uncertainties and, in particular, the influence of the energy resolution on the spectral shape. The spectrum can be described by a broken power law E^−γ with index γ = 3.3 below the ankle, which is measured at log10(E_ankle/eV) = 18.6. Above the ankle the spectrum is described by a power law with index 2.6 followed by a flux suppression, above about log10(E/eV) = 19.5, detected with high statistical significance. (C) 2010 Elsevier B.V. All rights reserved.
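The broken power law quoted in this abstract can be written down directly. In this minimal sketch, J0 is an arbitrary normalization (not the measured flux), and the suppression above log10(E/eV) ≈ 19.5 is left out for simplicity:

```python
def flux(E_eV, J0=1.0):
    """Broken power-law shape of the spectrum: spectral index 3.3 below
    the ankle at log10(E/eV) = 18.6 and 2.6 above it. J0 is an arbitrary
    normalization, and the high-energy flux suppression is omitted."""
    E_ankle = 10 ** 18.6
    if E_eV < E_ankle:
        return J0 * (E_eV / E_ankle) ** -3.3
    return J0 * (E_eV / E_ankle) ** -2.6

# The fall-off is steeper below the ankle than above it; over one decade:
low = flux(10 ** 17.6)    # ~10^3.3 times the ankle flux
high = flux(10 ** 19.6)   # ~10^-2.6 times the ankle flux
```

Piecewise forms like this, continuous at the break, are the standard way of parametrizing the ankle before modelling the suppression.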
Abstract:
The air fluorescence detector of the Pierre Auger Observatory is designed to perform calorimetric measurements of extensive air showers created by cosmic rays of above 10^18 eV. To correct these measurements for the effects introduced by atmospheric fluctuations, the Observatory contains a group of monitoring instruments to record atmospheric conditions across the detector site, an area exceeding 3000 km^2. The atmospheric data are used extensively in the reconstruction of air showers, and are particularly important for the correct determination of shower energies and the depths of shower maxima. This paper contains a summary of the molecular and aerosol conditions measured at the Pierre Auger Observatory since the start of regular operations in 2004, and includes a discussion of the impact of these measurements on air shower reconstructions. Between 10^18 and 10^20 eV, the systematic uncertainties due to all atmospheric effects increase from 4% to 8% in measurements of shower energy, and from 4 g cm^−2 to 8 g cm^−2 in measurements of the shower maximum. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The Open Provenance Model is a model of provenance that is designed to meet the following requirements: (1) to allow provenance information to be exchanged between systems, by means of a compatibility layer based on a shared provenance model; (2) to allow developers to build and share tools that operate on such a provenance model; (3) to define provenance in a precise, technology-agnostic manner; (4) to support a digital representation of provenance for any 'thing', whether produced by computer systems or not; (5) to allow multiple levels of description to coexist; (6) to define a core set of rules that identify the valid inferences that can be made on provenance representations. This document contains the specification of the Open Provenance Model (v1.1) resulting from a community effort to achieve interoperability in the Provenance Challenge series.
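The core of such a model can be illustrated with a toy provenance graph. The node and edge names below follow OPM's vocabulary of artifacts, processes and causal dependencies ("used", "wasGeneratedBy"), but the data structure itself is only an illustration, not the specification's serialization:

```python
# Toy provenance graph in the spirit of OPM: artifacts and processes as
# nodes, with "used" and "wasGeneratedBy" edges recording dependencies.
class ProvenanceGraph:
    def __init__(self):
        self.artifacts, self.processes, self.edges = set(), set(), []

    def used(self, process, artifact):
        self.processes.add(process); self.artifacts.add(artifact)
        self.edges.append((process, "used", artifact))

    def was_generated_by(self, artifact, process):
        self.artifacts.add(artifact); self.processes.add(process)
        self.edges.append((artifact, "wasGeneratedBy", process))

    def derivations(self, artifact):
        """Infer which artifacts a given artifact (transitively) derives from,
        a simple example of the kind of valid inference rule OPM defines."""
        gen = {a: p for a, kind, p in self.edges if kind == "wasGeneratedBy"}
        use = {}
        for p, kind, a in self.edges:
            if kind == "used":
                use.setdefault(p, []).append(a)
        out, stack = set(), [artifact]
        while stack:
            node = stack.pop()
            for src in use.get(gen.get(node), []):
                if src not in out:
                    out.add(src); stack.append(src)
        return out

g = ProvenanceGraph()
g.used("bake", "flour"); g.used("bake", "eggs")
g.was_generated_by("cake", "bake")
print(sorted(g.derivations("cake")))   # ['eggs', 'flour']
```

Exchanging such a graph between systems, rather than tool-specific logs, is precisely the interoperability goal described in requirement (1).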
Abstract:
Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open spaces are limited. Another solution is the more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, only suboptimal results are achieved by these rules. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin. The GA generates these positions in a semi-random way. Cost functions, based on water levels, were introduced to evaluate the efficiency of each generation, based on flood damage minimization. In the final phase of this research the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm manages to reduce the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
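The MPC-GA loop described above can be sketched as a genetic algorithm proposing weir settings whose predicted flood cost a model evaluates. Everything in this sketch is hypothetical: the toy cost function stands in for the conceptual river model and its water-level-based cost, and the inflow, thresholds and GA parameters are illustrative values, not those of the Demer study.

```python
import random

def predicted_flood_cost(gate_openings, inflow=120.0):
    """Stand-in for the conceptual river model + MPC prediction: returns a
    flood-damage cost for a candidate schedule of gate openings in [0, 1].
    A real application would run the hydraulic model over the horizon here."""
    stored = sum(inflow * g for g in gate_openings)           # water diverted to basins
    overflow = max(0.0, inflow * len(gate_openings) - stored - 200.0)
    wear = 0.1 * sum(abs(a - b) for a, b in zip(gate_openings, gate_openings[1:]))
    return overflow + wear

def ga_optimize(horizon=6, pop_size=30, generations=40, seed=1):
    """GA over gate schedules: semi-random candidates, selection by predicted
    cost, one-point crossover and mutation, as in the MPC-GA scheme above."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(horizon)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=predicted_flood_cost)          # rank candidates by cost
        parents = pop[: pop_size // 2]              # keep the better half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, horizon)
            child = a[:cut] + b[cut:]               # one-point crossover
            if rng.random() < 0.2:                  # occasional mutation
                child[rng.randrange(horizon)] = rng.random()
            children.append(child)
        pop = parents + children
    return min(pop, key=predicted_flood_cost)

best = ga_optimize()
```

In a receding-horizon setting, only the first gate position of `best` would be applied before the optimization is rerun with updated forecasts.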
Abstract:
The aim of this work is to develop a decision-making model, based on Real Options Theory, for companies operating in consolidating markets. The model is intended to indicate the most appropriate moment to start a merger-and-acquisition process in a given economic segment. The model is developed solely from the perspective of increasing market power and market share. The Brazilian market contains several still-fragmented sectors of the economy that offer opportunities for companies already operating in the market, or for financial investors such as Private Equity funds, to act (directly or indirectly, respectively) as market consolidators through mergers and acquisitions. Acquisition opportunities can be viewed as real options. Acquisitions are always associated with substantial costs. As when exercising an option, an acquisition becomes profitable when its surplus exceeds its investment by enough to compensate for the option premium. To this end, a series of models of acquisition strategies was analyzed, as well as models that explain the motivation behind waves of mergers and acquisitions. Starting from the model developed by van den Berg and Smit (2007), the model presented here was adapted for application to a selected case, that of the diagnostic medicine support services segment, whose consolidation in the Brasília and Goiânia markets began in 2006. This exercise aims to demonstrate whether or not the model is valid, as well as the conditions for its use.
Abstract:
The context of the civil service as an employer has attracted growing attention, since the above-market compensation practices adopted in this sector have been attracting more and more highly qualified individuals (Bender & Fernandes, 2006). However, it is a sector whose compensation practices are also characterized by inequality, since well-paid careers coexist with poorly paid ones, sometimes within the same organizational space and performing similar tasks. Studies show that, a priori, a work environment that favors a situation of inequality negatively affects several behavioral aspects of the employees who work in it (e.g. De Cremer & Van Kleef, 2009; Peters & Van den Bos, 2008; Peters, Van den Bos & Bobocel, 2004). Thus, the present study sought to understand how a situation of pay inequality (in which some members find themselves overpaid while others are underpaid) can influence factors such as the self-esteem and the affective commitment of public employees towards their work. A public agency of the Federal Executive Branch was chosen as the research locus to analyze these impacts. The research methodology was both quantitative and qualitative. In a first stage, 105 questionnaires were administered to two distinct groups of employees of this public agency (one group considered overpaid and the other considered underpaid), and the impacts of perceived pay fairness on employees' self-esteem and commitment were analyzed by means of hierarchical regressions. Subsequently, 20 interviews were conducted with employees from both groups in order to explore and discuss more sensitive aspects related to the results.
From these analyses it was possible to confirm the direct influence of an individual's sense of pay fairness on their self-esteem and affective commitment. The research results show that members of well-paid careers tend to compare themselves with other, better-paid careers, avoiding comparison with co-workers belonging to less favored careers. However, the influence that the sense of fairness has on both behavioral outcomes analyzed is amplified the greater the individual's perception of their peers' satisfaction with their work and pay. A moderating effect of epistemic motivation on this relationship was also observed. This research hopes to have contributed to a better understanding of the impacts that pay policies can have on public employees.
Abstract:
As digital systems move away from traditional desktop setups, new interaction paradigms are emerging that better integrate with users' real-world surroundings and better support users' individual needs. While promising, these modern interaction paradigms also present new challenges, such as a lack of paradigm-specific tools to systematically evaluate and fully understand their use. This dissertation tackles this issue by framing empirical studies of three novel digital systems in embodied cognition, an exciting new perspective in cognitive science where the body and its interactions with the physical world take a central role in human cognition. This is achieved by, first, focusing the design of all these systems on tangible interaction, a contemporary interaction paradigm that emphasizes physical interaction; and second, by comprehensively studying user performance in these systems through a set of novel performance metrics grounded in epistemic actions, a relatively well-established and well-studied construct in the literature on embodied cognition. The first system presented in this dissertation is an augmented Four-in-a-row board game. Three different versions of the game were developed, based on three different interaction paradigms (tangible, touch and mouse), and a repeated-measures study involving 36 participants measured the occurrence of three simple epistemic actions across these three interfaces. The results highlight the relevance of epistemic actions in such a task and suggest that the different interaction paradigms afford instantiation of these actions in different ways. Additionally, the tangible version of the system supports the most rapid execution of these actions, providing novel quantitative insights into the real benefits of tangible systems. The second system presented in this dissertation is a tangible tabletop scheduling application.
Two studies with single and paired users provide several insights into the impact of epistemic actions on the user experience when these are performed outside of a system's sensing boundaries. These insights are clustered by the form, size and location of ideal interface areas for such offline epistemic actions to occur, as well as by how physical tokens can be designed to better support them. Finally, and based on the results obtained to this point, the last study presented in this dissertation directly addresses the lack of empirical tools to formally evaluate tangible interaction. It presents a video-coding framework grounded in a systematic literature review of 78 papers, and evaluates its value as a metric through a 60-participant study performed across three different research laboratories. The results highlight the usefulness and power of epistemic actions as a performance metric for tangible systems. In sum, through the use of such novel metrics in each of the three studies presented, this dissertation provides a better understanding of the real impact and benefits of designing and developing systems that feature tangible interaction.