88 results for Software Package Data Exchange (SPDX)
                                
                                
                                
Abstract:
This paper presents a numerical study of urban airflow around a group of five buildings located at the University of Reading in the United Kingdom. The airflow around these buildings was simulated using the ANSYS CFD software package. The study investigates the association between certain architectural forms (a street canyon, a semi-closure, and a courtyard-like space in a low-rise building complex) and the wind environment. The analysis of the CFD results provides detailed information on the wind patterns of these urban built forms. The numerical results were compared with experimental measurements taken within the building complex. The observed characteristics of the urban wind pattern with respect to the built structures are presented as a guideline. This information is needed for the design and performance assessment of passive and low-energy design approaches such as natural or hybrid ventilation and passive cooling. Knowledge of urban wind patterns also allows better design options to be developed for the application of renewable energy technologies within the urban environment.
                                
Abstract:
This report describes the analysis and development of novel tools for the global optimisation of relevant mission design problems. A taxonomy was created for mission design problems, and an empirical analysis of their optimisation complexity was performed: it was demonstrated that global optimisation was necessary for most classes, which informed the selection of appropriate global algorithms. The selected algorithms were then applied to the different problem classes; Differential Evolution was found to be the most efficient. Considering the specific problem of multiple gravity assist trajectory design, a search space pruning algorithm was developed that displays both polynomial time and space complexity. Empirically, this was shown to typically achieve search space reductions of greater than six orders of magnitude, thus significantly reducing the complexity of the subsequent optimisation. The algorithm was fully implemented in a software package that allows simple visualisation of high-dimensional search spaces and effective optimisation over the reduced search bounds.
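The Differential Evolution algorithm highlighted above as the most efficient optimiser can be sketched in a few lines. This is a generic illustration of the classic DE/rand/1/bin scheme, not the thesis's own implementation; the sphere objective and all parameter values are assumptions chosen for the demonstration:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimise f over box bounds with the classic DE/rand/1/bin scheme."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct members other than the target.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)  # at least one gene from the mutant
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    trial.append(min(max(v, lo), hi))  # clip into the box
                else:
                    trial.append(pop[i][j])
            trial_cost = f(trial)
            if trial_cost <= cost[i]:  # greedy one-to-one selection
                pop[i], cost[i] = trial, trial_cost
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Toy objective: the 3-D sphere function, global minimum 0 at the origin.
x_best, f_best = differential_evolution(lambda x: sum(v * v for v in x),
                                        [(-5.0, 5.0)] * 3)
```

The greedy one-to-one replacement is what makes DE robust on multimodal landscapes: a trial vector only displaces the member it was built for, so diversity is lost slowly.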
                                
Abstract:
This paper is a tutorial introduction to pseudospectral optimal control. With pseudospectral methods, a function is approximated as a linear combination of smooth basis functions, which are often chosen to be Legendre or Chebyshev polynomials. Collocation of the differential-algebraic equations is performed at orthogonal collocation points, which are selected to yield interpolation of high accuracy. Pseudospectral methods directly discretize the original optimal control problem to recast it into a nonlinear programming format. A numerical optimizer is then employed to find approximate local optimal solutions. The paper also briefly describes the functionality and implementation of PSOPT, an open source software package written in C++ that employs pseudospectral discretization methods to solve multi-phase optimal control problems. The software implements the Legendre and Chebyshev pseudospectral methods, and it has useful features such as automatic differentiation, sparsity detection, and automatic scaling. The use of pseudospectral methods is illustrated in two problems taken from the literature on computational optimal control.
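The approximation step described above, representing a function as a linear combination of Chebyshev polynomials interpolated at collocation points, can be sketched briefly. This is an illustration using NumPy's Chebyshev utilities, not PSOPT's C++ API; the test function and polynomial degree are assumptions:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Sample a smooth function at Chebyshev-Gauss-Lobatto points, build the
# degree-N Chebyshev interpolant, and differentiate the expansion exactly:
# the approximation at the heart of pseudospectral discretization.
N = 16
k = np.arange(N + 1)
x = np.cos(np.pi * k / N)           # Chebyshev-Gauss-Lobatto collocation points
f = np.exp(x) * np.sin(2.0 * x)     # stand-in for a smooth state trajectory

coeffs = C.chebfit(x, f, N)         # coefficients of the interpolating polynomial
dcoeffs = C.chebder(coeffs)         # derivative of the expansion, still Chebyshev
df_approx = C.chebval(x, dcoeffs)   # derivative values at the collocation points

df_exact = np.exp(x) * (np.sin(2.0 * x) + 2.0 * np.cos(2.0 * x))
max_err = float(np.max(np.abs(df_approx - df_exact)))
```

For smooth functions the error of such an expansion decays faster than any power of N (spectral accuracy), which is why a coarse grid of collocation points suffices when the differential-algebraic constraints are enforced there.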
                                
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a scientific workforce sufficient to develop and maintain the software and data analysis infrastructure.
Such facilities will enable investigation of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions based on our best knowledge of science and the most advanced technology.
                                
                                
Abstract:
The type and thickness of insulation on the horizontal topside of cold pitched roofs play a significant role in controlling air movement, energy conservation and the reduction of moisture transfer through the ceiling to the loft (roof void) space. To investigate their importance, a numerical model using a HAM software package on a Matlab platform with the Simulink simulation tool was developed, using in situ measurements of airflows from the dwelling space through the ceiling to the loft in three houses of different configurations and loft spaces. Considering typical UK roof underlays (i.e. bituminous felt and a vapour-permeable underlay), in situ measurements in the three houses were taken using a calibrated passive sampling technique. Using the measured airflows, the effect of air movement on three types of roof insulation (fibreglass, cellulose and foam) was modelled to investigate the associated energy losses and moisture transport. The thickness of the insulation materials was varied while the ceiling airtightness and eaves gap size were kept constant, in order to visualize the effects of the changing parameters. In addition, two roof underlays of differing resistances were considered and compared to assess the influence of the underlay, if any, on energy conservation. The comparison of these insulation materials in relation to the other parameters showed that the type and thickness of the insulation material contribute significantly to energy conservation and to the reduction of moisture transfer through the roof, and hence through the building as a whole.
                                
Abstract:
We explored the impact of a degraded semantic system on lexical, morphological and syntactic complexity in language production. We analysed transcripts of connected speech samples from eight patients with semantic dementia (SD) and eight age-matched healthy speakers. The frequency distributions of nouns and verbs were compared for hand-scored data and for data extracted using text-analysis software. Lexical measures showed the predicted pattern for nouns and verbs in the hand-scored data, and for nouns in the software-extracted data, with fewer low-frequency items in the speech of the patients relative to controls. The distribution of complex morpho-syntactic forms for the SD group showed a reduced range, with fewer constructions requiring multiple auxiliaries and inflections. Finally, the distribution of syntactic constructions also differed between groups, with a pattern that reflects the patients' characteristic anomia and constraints on morpho-syntactic complexity. The data are in line with previous findings of an absence of gross syntactic errors or violations in SD speech. Alterations in the distributions of morphology and syntax, however, support constraint-satisfaction models of speech production in which there is no hard boundary between lexical retrieval and grammatical encoding.
                                
Abstract:
In this study, the performance, yield and characteristics of a 16-year-old photovoltaic (PV) system installation were investigated. The technology, BP Saturn modules with steel-blue polycrystalline silicon cells, is no longer in production. A bespoke monitoring system was designed to monitor the characteristics of six refurbished strings, each of 18 modules connected in series; the total output of the system is configured to 6.5 kWp (series-to-parallel configuration). In addition to the experimental results, the performance ratio (PR) was simulated from known values using PVSyst, a simulation software package. Calculations using the experimental values showed that the PV system produced power outputs approximately 10% lower than would be expected under standard test conditions. However, efficiency values relative to standard test conditions and the performance ratio (~75% from the PVSyst simulations) have remained practically the same over the past decade. This result, though very relevant to the possible performance and stability of ageing cells, requires additional parametric studies to develop a more robust argument. The result presented in this paper is part of an on-going investigation into PV system ageing effects.
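The performance ratio mentioned above has a standard definition: the final yield (energy delivered per kWp installed) divided by the reference yield (in-plane insolation normalised to the 1 kW/m2 standard-test irradiance). A minimal sketch follows; the 6.5 kWp rating comes from the abstract, while the monthly energy and insolation figures are entirely hypothetical:

```python
def performance_ratio(energy_kwh, p_rated_kwp, insolation_kwh_m2,
                      g_stc_kw_m2=1.0):
    """PR = final yield (kWh per kWp) divided by reference yield (sun-hours)."""
    final_yield = energy_kwh / p_rated_kwp              # kWh/kWp delivered
    reference_yield = insolation_kwh_m2 / g_stc_kw_m2   # equivalent full-sun hours
    return final_yield / reference_yield

# Hypothetical month for a 6.5 kWp array: 620 kWh delivered under
# 127 kWh/m2 of in-plane insolation.
pr = performance_ratio(620.0, 6.5, 127.0)  # ≈ 0.751
```

Because PR normalises out both system size and available sunlight, it is the natural quantity for comparing a 16-year-old array against simulation, as done with PVSyst above.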
                                
Abstract:
The impact of energy policy measures has been assessed with various appraisal and evaluation tools since the 1960s. Decision analysis, environmental impact assessment and strategic environmental assessment are all notable progenitors of Regulatory Impact Assessment (RIA) in the assessment of energy policies, programmes and projects. This chapter provides an overview of the policy tools that have historically been applied to assess the impacts of energy policies, programmes and projects. It focuses on the types of data and models that typically inform RIAs for energy policies; the organisations involved; and issues of data exchange between energy companies and policy-makers. Examples are drawn from the European Commission, the UK, Italy, the Netherlands and France. It is concluded that the technical and economic analysis underpinning RIAs on energy policy and regulation varies significantly depending on the type of organisation carrying them out.
                                
Abstract:
Consider the statement "this project should cost X and has a risk of Y". Such statements are used daily in industry as the basis for making decisions. The work reported here is part of a study aimed at providing a rational and pragmatic basis for such statements. Of particular interest are predictions made in the requirements and early phases of projects. A preliminary model has been constructed using Bayesian Belief Networks and, in support of this, a programme to collect and study data during the execution of various software development projects commenced in May 2002. The data collection programme is undertaken under the constraints of a commercial industrial regime of multiple concurrent small to medium scale software development projects. Guided by pragmatism, the work is predicated on the use of data that can be collected readily by project managers, including expert judgements, effort, elapsed times and metrics collected within each project.
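The kind of probabilistic reasoning a Bayesian Belief Network supports can be illustrated with a toy two-node network. The structure and all probabilities below are illustrative assumptions, not values from the preliminary model described above:

```python
# Toy two-node belief network: requirements volatility -> cost overrun.
# All probabilities are invented for illustration.
p_volatile = 0.3                           # prior P(requirements volatile)
p_overrun_given = {True: 0.7, False: 0.2}  # P(overrun | volatility state)

# Marginalise out the parent node to get the prior P(overrun).
p_overrun = (p_overrun_given[True] * p_volatile
             + p_overrun_given[False] * (1.0 - p_volatile))   # 0.35

# Bayes' rule: probability the requirements were volatile, given an
# observed overrun -- the diagnostic (child-to-parent) direction.
p_volatile_given_overrun = (p_overrun_given[True] * p_volatile
                            / p_overrun)                      # 0.6
```

A full network over effort, elapsed time and expert judgements works the same way, just with more nodes and larger conditional probability tables; inference still amounts to marginalisation plus Bayes' rule.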
                                
                                
                                
Abstract:
This paper summarizes and analyses available data on the surface energy balance of Arctic tundra and boreal forest. The complex interactions between ecosystems and their surface energy balance are also examined, including climatically induced shifts in ecosystem type that might amplify or reduce the effects of potential climatic change. High latitudes are characterized by large annual changes in solar input. Albedo decreases strongly from winter, when the surface is snow-covered, to summer, especially in nonforested regions such as Arctic tundra and boreal wetlands. Evapotranspiration (QE) of high-latitude ecosystems is less than that from a freely evaporating surface and decreases late in the season, when soil moisture declines, indicating stomatal control over QE, particularly in evergreen forests. Evergreen conifer forests have a canopy conductance half that of deciduous forests and consequently lower QE and higher sensible heat flux (QH). There is broad overlap in energy partitioning between Arctic and boreal ecosystems, although Arctic ecosystems and light taiga generally have a higher ground heat flux because there is less leaf and stem area to shade the ground surface, and the thermal gradient from the surface to permafrost is steeper. Permafrost creates a strong heat sink in summer that reduces surface temperature and therefore heat flux to the atmosphere. Loss of permafrost would therefore amplify climatic warming. If warming caused an increase in productivity and leaf area, or fire caused a shift from evergreen to deciduous forest, this would increase QE and reduce QH. Potential future shifts in vegetation would have varying climate feedbacks, with the largest effects caused by shifts from boreal conifer to shrubland or deciduous forest (or vice versa) and from Arctic coastal to wet tundra. An increase in logging activity in the boreal forests appears to reduce QE by roughly 50% with little change in QH, while the ground heat flux is strongly enhanced.
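The flux partitioning described above, where canopy conductance controls the split between QE and QH, is conveniently expressed through the Bowen ratio (QH/QE). A minimal sketch follows; the net radiation, Bowen ratios and ground-flux fraction are hypothetical values chosen only to illustrate the conifer-versus-deciduous contrast:

```python
def energy_partition(q_net, bowen_ratio, ground_frac):
    """Split net radiation into sensible (QH), latent (QE) and ground (QG) flux.

    q_net       : net radiation at the surface, W/m2
    bowen_ratio : QH / QE
    ground_frac : fraction of q_net conducted into the ground
    """
    qg = ground_frac * q_net
    available = q_net - qg                 # energy left for the turbulent fluxes
    qe = available / (1.0 + bowen_ratio)   # from QH = bowen_ratio * QE
    qh = available - qe
    return qh, qe, qg

# Hypothetical midday values: a conifer canopy with a high Bowen ratio versus
# a deciduous canopy with stronger evapotranspiration, same net radiation.
conifer = energy_partition(500.0, bowen_ratio=1.5, ground_frac=0.1)    # QH > QE
deciduous = energy_partition(500.0, bowen_ratio=0.6, ground_frac=0.1)  # QE > QH
```

The closure constraint QH + QE + QG = net radiation is what links the feedbacks in the abstract: any process that raises QE (deciduous leaf area, wetter soil) must lower QH or QG for the same radiative input.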
 
                    