961 results for physics computing
Abstract:
In the reinsurance market, the risks natural catastrophes pose to portfolios of properties must be quantified, so that they can be priced, and insurance offered. The analysis of such risks at a portfolio level requires a simulation of up to 800 000 trials with an average of 1000 catastrophic events per trial. This is sufficient to capture risk for a global multi-peril reinsurance portfolio covering a range of perils including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge and riverine flooding, and wildfire. Such simulations are both computation and data intensive, making the application of high-performance computing techniques desirable.
In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing unit (GPU) and Intel Xeon Phi many-core accelerators.
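To make the simulation structure concrete, the sketch below shows one way a year-loss table with primary and secondary uncertainty could be sampled and a probable maximum loss derived from it. It is a minimal illustration, not the authors' implementation: the event catalogue, the Poisson occurrence model, the beta-distributed damage factor and all parameter values are assumptions introduced here for demonstration.

```python
# Minimal sketch of catastrophe portfolio simulation (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

NUM_TRIALS = 10_000            # the paper simulates up to 800,000 trials
MEAN_EVENTS_PER_TRIAL = 1000   # average catastrophic events per trial
NUM_EVENT_TYPES = 50           # hypothetical event catalogue size

# Hypothetical event catalogue: expected loss and occurrence weight per event type.
mean_event_loss = rng.lognormal(mean=12.0, sigma=1.5, size=NUM_EVENT_TYPES)
event_weights = rng.dirichlet(np.ones(NUM_EVENT_TYPES))

year_losses = np.empty(NUM_TRIALS)
for t in range(NUM_TRIALS):
    # Primary uncertainty: whether (and how many times) events occur in the simulated year.
    n_events = rng.poisson(MEAN_EVENTS_PER_TRIAL)
    events = rng.choice(NUM_EVENT_TYPES, size=n_events, p=event_weights)
    # Secondary uncertainty: the level of loss given that an event occurs,
    # modelled here with a beta-distributed damage factor.
    damage = rng.beta(2.0, 5.0, size=n_events)
    year_losses[t] = np.sum(mean_event_loss[events] * damage)

# Probable maximum loss (PML) at a given return period, e.g. 1-in-250 years.
pml_250 = np.quantile(year_losses, 1.0 - 1.0 / 250.0)
print(f"Estimated 250-year PML: {pml_250:,.0f}")
```

In a production setting the per-trial loop is the natural unit of parallel work, which is why multi-threading and many-core accelerators map well onto this kind of computation.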
Abstract:
Approximate execution is a viable technique for environments with energy constraints, provided that applications are given the mechanisms to produce outputs of the highest possible quality within the available energy budget. This paper introduces a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows developers to structure the computation in different tasks, and to express the relative importance of these tasks for the quality of the end result. For non-significant tasks, the developer can also supply less costly, approximate versions. The target energy consumption for a given execution is specified when the application is launched. A significance-aware runtime system employs an application-specific analytical energy model to decide how many cores to use for the execution, the operating frequency for these cores, as well as the degree of task approximation, so as to maximize the quality of the output while meeting the user-specified energy constraints. Evaluation on a dual-socket 16-core Intel platform using 9 benchmark kernels shows that the proposed framework picks the optimal configuration with high accuracy. Also, a comparison with loop perforation (a well-known compile-time approximation technique) shows that the proposed framework results in significantly higher quality for the same energy budget.
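The sketch below illustrates only the task-degradation part of such a scheme: tasks carry a significance value and both an accurate and an approximate version, and the least significant tasks are approximated first until the energy budget is met. The task names, energy figures and greedy policy are assumptions made for illustration; the framework described in the paper additionally selects core counts and operating frequency from an analytical energy model.

```python
# Minimal sketch of significance-aware task degradation under an energy budget
# (assumed names and numbers, not the paper's API).
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    significance: float     # relative importance for output quality
    energy_accurate: float  # modelled energy of the accurate version (J)
    energy_approx: float    # modelled energy of the approximate version (J)

def plan_execution(tasks, energy_budget):
    """Return {task name: 'accurate' | 'approximate'} fitting the budget."""
    plan = {t.name: "accurate" for t in tasks}
    total = sum(t.energy_accurate for t in tasks)
    # Degrade the least significant tasks first, keeping quality loss graceful.
    for t in sorted(tasks, key=lambda t: t.significance):
        if total <= energy_budget:
            break
        total -= t.energy_accurate - t.energy_approx
        plan[t.name] = "approximate"
    if total > energy_budget:
        raise RuntimeError("Budget unreachable even with full approximation")
    return plan

tasks = [
    Task("fft",       significance=0.9, energy_accurate=40.0, energy_approx=25.0),
    Task("filter",    significance=0.4, energy_accurate=30.0, energy_approx=10.0),
    Task("normalise", significance=0.2, energy_accurate=20.0, energy_approx=5.0),
]
print(plan_execution(tasks, energy_budget=75.0))
# -> only the least significant task ('normalise') is approximated
```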
Abstract:
This paper outlines a means of improving the employability skills of first-year university students through a closely integrated model of employer engagement within computer science modules. The outlined approach illustrates how employability skills, including communication, teamwork and time management skills, can be contextualised in a manner that directly relates to student learning but can still be linked forward into employment. The paper tests the premise that developing employability skills early within the curriculum will result in improved student engagement and learning within later modules. The paper concludes that embedding employer participation within first-year modules can help translate a distant notion of employability into something of more immediate relevance in terms of how students can best approach learning. Further, by enhancing employability skills early within the curriculum, it becomes possible to improve academic attainment within later modules.
Abstract:
The circumstances in Colombo, Sri Lanka, and in Belfast, Northern Ireland, which led to a) the generalization of luminescent PET (photoinduced electron transfer) sensing/switching as a design tool, b) the construction of a market-leading blood electrolyte analyzer and c) the invention of molecular logic-based computation as an experimental field, are delineated. Efforts to extend the philosophy of these approaches into issues of small object identification, nanometric mapping, animal visual perception and visual art are also outlined.
Abstract:
With the focus of ITER on the transport and emission properties of tungsten, generating atomic data for complex species has received much interest. Focusing on impurity influx diagnostics, we discuss recent work on heavy species. Perturbative approaches do not work well for near-neutral systems, so non-perturbative data are required, presenting a particular challenge for these influx diagnostics. Recent results on Mo+ are given as an illustration of how the diagnostic applications can guide the theoretical calculations for such systems.
Abstract:
Partially ordered preferences generally lead to choices that do not abide by standard expected utility guidelines; often such preferences are revealed by imprecision in probability values. We investigate five criteria for strategy selection in decision trees with imprecision in probabilities: “extensive” Γ-maximin and Γ-maximax, interval dominance, maximality and E-admissibility. We present algorithms that generate strategies for all these criteria; our main contribution is an algorithm for E-admissibility that runs over admissible strategies rather than over sets of probability distributions.
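For intuition, the sketch below applies two of the named criteria, Γ-maximin and interval dominance, to a flat set of strategies whose expected utilities are known only up to an interval. It is an illustrative toy example under assumed numbers, not the paper's algorithms: in a real decision tree the strategies (and their lower/upper expectations) must first be enumerated, and E-admissibility requires additional machinery.

```python
# Minimal sketch of two imprecise-probability decision criteria (illustrative only).
def gamma_maximin(strategies):
    """Pick the strategy with the best worst-case (lower) expected utility."""
    return max(strategies, key=lambda name: strategies[name][0])

def interval_dominance(strategies):
    """Keep strategies whose upper bound is not beaten by another strategy's lower bound."""
    best_lower = max(lo for lo, _ in strategies.values())
    return [name for name, (lo, hi) in strategies.items() if hi >= best_lower]

# Hypothetical expected-utility intervals [lower, upper] for three strategies.
strategies = {"s1": (2.0, 5.0), "s2": (3.0, 4.0), "s3": (1.0, 2.5)}
print(gamma_maximin(strategies))       # 's2': largest lower expectation
print(interval_dominance(strategies))  # ['s1', 's2']: 's3' is interval-dominated
```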
Abstract:
High power lasers have proven capable of producing high energy γ-rays, charged particles and neutrons, and of inducing all kinds of nuclear reactions. At ELI, studies with high power lasers will enter for the first time new domains of power and intensity: 10 PW and 10^23 W/cm^2. While the development of laser-based radiation sources is the main focus of the ELI-Beamlines pillar of ELI, at ELI-NP the studies that will benefit from High Power Laser System pulses will focus on Laser Driven Nuclear Physics (this TDR, acronym LDNP, associated with the E1 experimental area), High Field Physics and QED (associated with the E6 area) and fundamental research opened up by the unique combination of the two 10 PW laser pulses with a gamma beam provided by the Gamma Beam System (associated with the E7 area). The scientific case of the LDNP TDR encompasses studies of laser-induced nuclear reactions, aiming at a better understanding of nuclear properties and of nuclear reaction rates in laser plasmas, as well as at the development of radiation source characterization methods based on nuclear techniques. As an example of the proposed studies: the promise of achieving solid-state density bunches of (very) heavy ions accelerated to about 10 MeV/nucleon through the RPA mechanism will be exploited to produce astrophysically highly relevant neutron-rich nuclei around the N~126 waiting point, using the sequential fission-fusion scheme, complementary to any other existing or planned method of producing radioactive nuclei.
The studies will be implemented predominantly in the E1 area of ELI-NP. However, many of them can, in a first stage, be performed in the E5 and/or E4 areas, where higher-repetition-rate laser pulses are available and the harsh X-ray and electromagnetic pulse (EMP) environments are less damaging than in E1.
A number of options are discussed throughout the document that have an important impact on the budget and the resources needed. Depending on the TDR review and subsequent project decisions, they may be taken into account for space reservation, while their detailed design and implementation will be postponed.
The present TDR is the result of contributions from several institutions engaged in nuclear physics and high power laser research. A significant part of the proposed equipment can be designed, and afterwards built, only in close collaboration with (or by subcontracting to) some of these institutions. A Memorandum of Understanding (MOU) is currently under preparation with each of these key partners, as well as with others interested in participating in the design or in the future experimental program.
Abstract:
The paper is concerned with the role of art and design in the history and philosophy of computing. It offers insights arising from research into a period in the 1960s and 70s, particularly in the UK, when computing became more available to artists and designers, focusing on John Lansdown (1929-1999) and Bruce Archer (1922-2005) in London. Models of computing interacted with conceptualisations of art, design and related creative activities in important ways.
Abstract:
Doctoral thesis, Electronic Engineering and Computing - Signal Processing, Universidade do Algarve, 2008
Abstract:
The domain of thermal therapy applications can be improved with the development of accurate, non-invasive time-spatial temperature models. These models should represent the non-linear thermal behaviour of tissue and be capable of tracking temperature at any time instant and spatial position. If such estimators exist, then efficient controllers for the therapeutic instrumentation could be developed, and the desired safety and effectiveness reached.
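As a rough illustration of what a non-linear time-spatial temperature estimator can look like, the sketch below fits a radial-basis-function model that predicts temperature from time instant and tissue depth. The synthetic heating data, the Gaussian basis and all parameter values are assumptions introduced here; they are not the estimators developed in the thesis.

```python
# Minimal sketch of a non-linear time-spatial temperature estimator (assumed data/model).
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "measurements": temperature rising over time and decaying with depth.
t = rng.uniform(0.0, 60.0, 200)      # time since heating started (s)
x = rng.uniform(0.0, 0.03, 200)      # depth into the tissue (m)
temp = (37.0 + 8.0 * (1 - np.exp(-t / 20.0)) * np.exp(-x / 0.01)
        + rng.normal(0.0, 0.2, 200))  # noisy non-linear thermal behaviour

# Gaussian RBF features over (t, x), fitted by least squares.
ct, cx = np.meshgrid(np.linspace(0.0, 60.0, 6), np.linspace(0.0, 0.03, 5))
phi = np.exp(-((t[:, None] - ct.ravel()) / 15.0) ** 2
             - ((x[:, None] - cx.ravel()) / 0.008) ** 2)
weights, *_ = np.linalg.lstsq(phi, temp, rcond=None)

def estimate(t_query, x_query):
    """Estimate temperature at a given time instant and spatial position."""
    f = np.exp(-((t_query - ct.ravel()) / 15.0) ** 2
               - ((x_query - cx.ravel()) / 0.008) ** 2)
    return float(f @ weights)

print(f"T(30 s, 5 mm) ~ {estimate(30.0, 0.005):.1f} degC")
```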
Abstract:
The safety and efficacy of thermal therapies are tied to the accurate determination of temperature, which is why temperature feedback in computational methods is of vital importance.
Abstract:
The computer games industry is big business and the demand for graduates is high; indeed, there is a continuing shortage of skilled employees. As with most professions, the skill set required is both specific and diverse. There are currently over 30 Higher Education Institutions (HEIs) in the UK offering computer games related courses. We expect that, as the demand from industry is sustained, more HEIs will respond with the introduction of games-related degrees. This is a considerable undertaking, involving many issues from the integration of new modules or complete courses within the existing curriculum to staff development. In this paper we share our experiences of introducing elements of game development into our curriculum. This has occurred over the past two years, starting with the inclusion of elements of game development in existing programming modules, followed by the validation of complete modules, and culminating in a complete degree course. Our experience is that adopting a progressive approach to development, spread over a number of years, was crucial to achieving a successful outcome.