933 results for Software Package Data Exchange (SPDX)
Abstract:
This work describes the use of quantum molecular similarity measures (QMSM) to characterise molecular properties and biological activities, and to define descriptors that can be used to build QSAR and QSPR models. The study presented here continues a recent work in which relationships between the log P parameter and QMSM were described, thus offering an alternative to this empirical hydrophobic parameter. The present contribution introduces a new measure, capable of extending the use of QMSM, which consists of the electron-electron repulsion energy (Vee). This value, normally available from quantum chemistry software, treats the molecule as a single entity, with no need to resort to fragment contributions. The methodology has been applied to five different classes of compounds, in which various molecular properties and biological activities were correlated with Vee as the sole molecular descriptor. Satisfactory correlations were obtained in all the cases studied.
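The correlation step described above reduces to a one-descriptor linear model; a minimal sketch of such a fit is shown below (the Vee and property values are hypothetical placeholders, not data from the study):

```python
import numpy as np

# Hypothetical electron-electron repulsion energies (Vee, a.u.) and an
# observed property (e.g. log P) for a small set of compounds.
vee = np.array([310.2, 405.7, 512.3, 630.9, 744.1])
prop = np.array([1.1, 1.8, 2.4, 3.0, 3.7])

# Fit property = a * Vee + b by ordinary least squares.
a, b = np.polyfit(vee, prop, deg=1)
pred = a * vee + b

# Report the squared correlation coefficient of the fit.
ss_res = np.sum((prop - pred) ** 2)
ss_tot = np.sum((prop - prop.mean()) ** 2)
print(f"slope={a:.4f}, intercept={b:.3f}, r^2={1 - ss_res / ss_tot:.3f}")
```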
Abstract:
An eddy current testing system consists of a multi-sensor probe, a computer, and a special expansion card and software for data collection and analysis. The probe incorporates an excitation coil and sensor coils; at least one sensor coil is a lateral current-normal coil and at least one is a current perturbation coil.
Abstract:
This paper reviews Bayesian procedures for phase 1 dose-escalation studies and compares different dose schedules and cohort sizes. The methodology described is motivated by the situation of phase 1 dose-escalation studies in oncology, that is, a single dose administered to each patient, with a single binary response ("toxicity" or "no toxicity") observed. It is likely that a wider range of applications of the methodology is possible. In this paper, results from 10,000 simulation runs conducted using the software package Bayesian ADEPT are presented. Four designs were compared under six scenarios. The simulation results indicate that there are slight advantages in having more dose levels and smaller cohort sizes.
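For readers unfamiliar with how such operating characteristics are estimated, the sketch below simulates a deliberately simplified rule-based escalation design under one hypothetical toxicity scenario; it is illustrative only and does not reproduce the Bayesian ADEPT procedures compared in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scenario: true toxicity probability at each dose level.
true_tox = [0.05, 0.10, 0.20, 0.35, 0.55]
cohort_size = 3
n_sims = 10_000

def simulate_trial():
    """One trial under a deliberately simple rule: escalate while a cohort
    shows no toxicities; stop at the first cohort with any toxicity and
    recommend the dose below it (or the lowest dose)."""
    for dose, p in enumerate(true_tox):
        toxicities = rng.binomial(cohort_size, p)
        if toxicities > 0:
            return max(dose - 1, 0)
    return len(true_tox) - 1

recommended = np.array([simulate_trial() for _ in range(n_sims)])
for d, p in enumerate(true_tox):
    share = np.mean(recommended == d)
    print(f"dose {d} (true tox {p:.0%}): recommended in {share:.1%} of trials")
```

Repeating such runs for each design and scenario, as the paper does, yields the distribution of recommended doses from which the designs can be compared.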
Abstract:
In this paper, we give an overview of our studies by static and time-resolved X-ray diffraction of inverse cubic phases and phase transitions in lipids. In Section 1, we briefly discuss the lyotropic phase behaviour of lipids, focusing attention on non-lamellar structures and their geometric/topological relationship to fusion processes in lipid membranes. Possible pathways for transitions between different cubic phases are also outlined. In Section 2, we discuss the effects of hydrostatic pressure on lipid membranes and lipid phase transitions, and describe how the parameters required to predict the pressure dependence of lipid phase transition temperatures can be conveniently measured. We review some earlier results on inverse bicontinuous cubic phases from our laboratory, showing effects such as pressure-induced formation and swelling. In Section 3, we describe the technique of pressure-jump synchrotron X-ray diffraction. We present results that have been obtained from the lipid system 1:2 dilauroylphosphatidylcholine/lauric acid for cubic-inverse hexagonal, cubic-cubic and lamellar-cubic transitions. The rate of transition was found to increase with the amplitude of the pressure jump and with increasing temperature. Evidence for intermediate structures occurring transiently during the transitions was also obtained. In Section 4, we describe an IDL-based 'AXCESS' software package being developed in our laboratory to permit batch processing and analysis of the large X-ray datasets produced by pressure-jump synchrotron experiments. In Section 5, we present some recent results on the fluid lamellar-Pn3m cubic phase transition of the single-chain lipid 1-monoelaidin, which we have studied by both pressure-jump and temperature-jump X-ray diffraction. Finally, in Section 6, we give a few indicators of future directions of this research. We anticipate that the most useful technical advance will be the development of pressure-jump apparatus on the microsecond time-scale, which will involve the use of a stack of piezoelectric pressure actuators. The pressure-jump technique is not restricted to lipid phase transitions, but can be used to study a wide range of soft matter transitions, ranging from protein unfolding and DNA unwinding to phase transitions in thermotropic liquid crystals, surfactants and block copolymers.
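The pressure dependence referred to in Section 2 is conventionally estimated from the Clapeyron relation, which ties the slope of the transition line to quantities measurable at ambient pressure (stated here in its generic form; the review should be consulted for the specific treatment used):

\[
\frac{\mathrm{d}T_t}{\mathrm{d}P} = \frac{T_t\,\Delta V_t}{\Delta H_t},
\]

so that measuring the transition temperature \(T_t\), the volume change \(\Delta V_t\) (e.g. by densitometry) and the enthalpy change \(\Delta H_t\) (e.g. by calorimetry) is sufficient to predict how the transition temperature shifts with applied hydrostatic pressure.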
Abstract:
Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of an outcome different from the one expected (modelled), the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally only looked at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can occur. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach, and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or another spreadsheet) and with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for the risk. This model allows the user a better understanding of the possible outcomes for the development. Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and "better" decision.
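A minimal sketch of the kind of Monte Carlo appraisal described here, using numpy in place of Crystal Ball and illustrative figures rather than any real scheme (the sale value, build cost, correlation, fees and land price below are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical appraisal inputs: expected sale value and build cost per m^2,
# with assumed standard deviations and a positive correlation between market
# strength and construction cost inflation.
mean = np.array([3_000.0, 1_800.0])          # sale value, build cost (GBP/m^2)
sd = np.array([300.0, 150.0])
corr = 0.5
cov = np.array([[sd[0]**2,          corr*sd[0]*sd[1]],
                [corr*sd[0]*sd[1],  sd[1]**2       ]])

sale, cost = rng.multivariate_normal(mean, cov, size=n).T

area = 5_000.0            # lettable/saleable area (m^2), assumed
fees = 0.15               # professional fees and finance, as a share of cost
land = 4_000_000.0        # land price already committed (GBP), assumed

profit = area * (sale - cost * (1 + fees)) - land

print(f"expected profit: {profit.mean():,.0f} GBP")
print(f"probability of loss: {np.mean(profit < 0):.1%}")
print(f"5th/95th percentiles: {np.percentile(profit, [5, 95]).round(0)}")
```

The point of such a run is exactly the one made in the abstract: the developer sees a distribution of outcomes, including downside risk, rather than a single expected figure.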
Abstract:
This paper presents a numerical study of urban airflow for a group of five buildings located at the University of Reading in the United Kingdom. The airflow around these buildings has been simulated using the ANSYS CFD software package. In this study, the association between certain architectural forms (a street canyon, a semi-closure, and a courtyard-like space in a low-rise building complex) and the wind environment was investigated. The analysis of the CFD results has provided detailed information on the wind patterns of these urban built forms. The numerical results have been compared with experimental measurements taken within the building complex. The observed characteristics of the urban wind pattern with respect to the built structures are presented as a guideline. This information is needed for the design and/or performance assessment of systems such as passive and low-energy design approaches, natural or hybrid ventilation, and passive cooling. Also, knowledge of urban wind patterns allows us to develop better design options for the application of renewable energy technologies within the urban environment.
Abstract:
This report describes the analysis and development of novel tools for the global optimisation of relevant mission design problems. A taxonomy was created for mission design problems, and an empirical analysis of their optimisation complexity was performed; it was demonstrated that the use of global optimisation was necessary for most classes, and this informed the selection of appropriate global algorithms. The selected algorithms were then applied to the different problem classes: Differential Evolution was found to be the most efficient. Considering the specific problem of multiple gravity assist trajectory design, a search space pruning algorithm was developed that displays both polynomial time and space complexity. Empirically, this was shown to typically achieve search space reductions of greater than six orders of magnitude, thus significantly reducing the complexity of the subsequent optimisation. The algorithm was fully implemented in a software package that allows simple visualisation of high-dimensional search spaces and effective optimisation over the reduced search bounds.
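As an illustration of the optimisation step only, the sketch below runs SciPy's Differential Evolution over a full and a hypothetically pruned search box for a toy two-epoch cost function; neither the real trajectory objective nor the pruning algorithm from the report is reproduced:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy stand-in objective for a trajectory cost (e.g. total delta-v) as a
# function of two launch/encounter epochs; a real objective would come from
# an ephemeris/Lambert solver, which is not reproduced here.
def toy_cost(x):
    t1, t2 = x
    return (np.sin(0.01 * t1) + 1.5) * (np.cos(0.007 * (t2 - t1)) + 1.5)

# Full search box (days), and a hypothetical pruned box of the kind a
# search-space pruning step might return for the same problem.
full_bounds = [(0.0, 3650.0), (100.0, 5000.0)]
pruned_bounds = [(800.0, 1200.0), (1500.0, 2200.0)]

for label, bounds in [("full", full_bounds), ("pruned", pruned_bounds)]:
    result = differential_evolution(toy_cost, bounds, seed=42)
    print(f"{label:>6}: best cost {result.fun:.4f} at epochs {result.x.round(1)}")
```

The pruned run searches a far smaller box, which is the practical payoff of the reduction in search-space volume reported above.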
Abstract:
This paper is a tutorial introduction to pseudospectral optimal control. With pseudospectral methods, a function is approximated as a linear combination of smooth basis functions, which are often chosen to be Legendre or Chebyshev polynomials. Collocation of the differential-algebraic equations is performed at orthogonal collocation points, which are selected to yield interpolation of high accuracy. Pseudospectral methods directly discretize the original optimal control problem to recast it into a nonlinear programming format. A numerical optimizer is then employed to find approximate local optimal solutions. The paper also briefly describes the functionality and implementation of PSOPT, an open-source software package written in C++ that employs pseudospectral discretization methods to solve multi-phase optimal control problems. The software implements the Legendre and Chebyshev pseudospectral methods, and it has useful features such as automatic differentiation, sparsity detection, and automatic scaling. The use of pseudospectral methods is illustrated in two problems taken from the literature on computational optimal control.
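The discretisation step that underlies these methods can be sketched in a few lines. The function below builds the standard Chebyshev-Gauss-Lobatto points and differentiation matrix and checks its accuracy on a smooth test function; this is a generic illustration of pseudospectral differentiation, not PSOPT's internal code:

```python
import numpy as np

def cheb(N):
    """Chebyshev-Gauss-Lobatto points and differentiation matrix on [-1, 1]."""
    if N == 0:
        return np.array([1.0]), np.zeros((1, 1))
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))          # negative-sum trick for the diagonal
    return x, D

# Differentiate f(x) = exp(x) * sin(5x) at the collocation points and compare
# with the exact derivative.
x, D = cheb(20)
f = np.exp(x) * np.sin(5 * x)
df_exact = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
print(f"max abs error at N=20: {np.max(np.abs(D @ f - df_exact)):.2e}")
```

In an optimal control setting, applying such a matrix to the state values at the collocation points turns the differential constraints into algebraic ones, which is what allows the problem to be handed to a nonlinear programming solver.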
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will enable investigation of what resolution, both horizontal and vertical, in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe limitations on such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.
Abstract:
The type and thickness of insulation on the horizontal topside of cold pitched roofs play a significant role in controlling air movement, energy conservation and the reduction of moisture transfer through the ceiling to the loft (roof void) space. To investigate their importance, a numerical model using a HAM software package on a Matlab platform with a Simulink simulation tool has been developed, using in situ measurements of airflows from the dwelling space through the ceiling to the loft of three houses of different configurations and loft space. Considering typical UK roof underlays (i.e. bituminous felt and a vapour-permeable underlay), in situ measurements of the three houses were taken using a calibrated passive sampling technique. Using the measured airflows, the effect of air movement on three types of roof insulation (i.e. fibreglass, cellulose and foam) was modelled to investigate the associated energy losses and moisture transport. The thickness of the insulation materials was varied but the ceiling airtightness and eaves gap size were kept constant. These cases were considered in order to visualise the effects of the changing parameters. In addition, two different roof underlays of varying resistance were considered and compared to assess the influence of the underlay, if any, on energy conservation. The comparison of these insulation materials in relation to the other parameters showed that the type and thickness of the insulation material contribute significantly to energy conservation and the reduction of moisture transfer through the roof, and hence through the building as a whole.
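The conduction part of such an energy-loss comparison can be illustrated with a steady-state U-value calculation; the sketch below uses typical handbook conductivities and assumed geometry rather than the paper's measured inputs, and it omits the air-leakage and moisture terms that the HAM model treats explicitly:

```python
import numpy as np

# Typical handbook thermal conductivities (W/m.K) for the three insulation
# types named in the abstract; these are assumptions, not the paper's values.
conductivity = {"fibreglass": 0.040, "cellulose": 0.039, "foam": 0.025}

area = 50.0          # ceiling area (m^2), assumed
dT = 15.0            # indoor-to-loft temperature difference (K), assumed
r_ceiling = 0.15     # fixed resistance of plasterboard + surface films (m^2.K/W)

thicknesses = np.array([0.05, 0.10, 0.15, 0.20, 0.25])   # insulation depth (m)

for name, k in conductivity.items():
    # Conductive-only, steady-state estimate: U = 1 / (R_ceiling + d / k),
    # giving heat loss Q = U * A * dT for each insulation thickness.
    U = 1.0 / (r_ceiling + thicknesses / k)
    losses = U * area * dT
    print(name, np.round(losses, 1), "W")
```

Even this simplified calculation shows the diminishing returns of added thickness and the advantage of lower-conductivity materials, which the full HAM model then combines with the measured ceiling airflows.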
Abstract:
We explored the impact of a degraded semantic system on lexical, morphological and syntactic complexity in language production. We analysed transcripts from connected speech samples from eight patients with semantic dementia (SD) and eight age-matched healthy speakers. The frequency distributions of nouns and verbs were compared for hand-scored data and data extracted using text-analysis software. Lexical measures showed the predicted pattern for nouns and verbs in hand-scored data, and for nouns in software-extracted data, with fewer low frequency items in the speech of the patients relative to controls. The distribution of complex morpho-syntactic forms for the SD group showed a reduced range, with fewer constructions that required multiple auxiliaries and inflections. Finally, the distribution of syntactic constructions also differed between groups, with a pattern that reflects the patients’ characteristic anomia and constraints on morpho-syntactic complexity. The data are in line with previous findings of an absence of gross syntactic errors or violations in SD speech. Alterations in the distributions of morphology and syntax, however, support constraint satisfaction models of speech production in which there is no hard boundary between lexical retrieval and grammatical encoding.