994 results for calculation tool


Relevance:

30.00%

Publisher:

Abstract:

The energy and hardness profiles for a series of inter- and intramolecular conformational changes were computed at several levels of calculation. The hardness profiles were calculated as the difference between the vertical ionization potential and the electron affinity. The hardness profile shows the correct number of stationary points independently of the basis set and methodology used. It was found that the hardness profiles can be used to check the reliability of the energy profiles for these chemical systems.
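
As a minimal illustration of the hardness definition used above (the difference between the vertical ionization potential and the electron affinity; some conventions include an extra factor of 1/2), a sketch in Python with hypothetical total energies:

```python
def vertical_ip(e_neutral, e_cation):
    """Vertical ionization potential: E(N-1) - E(N) at fixed geometry."""
    return e_cation - e_neutral

def vertical_ea(e_neutral, e_anion):
    """Vertical electron affinity: E(N) - E(N+1) at fixed geometry."""
    return e_neutral - e_anion

def hardness(e_neutral, e_cation, e_anion):
    """One point of the hardness profile, as IP - EA."""
    return vertical_ip(e_neutral, e_cation) - vertical_ea(e_neutral, e_anion)

# Hypothetical total energies (hartree) along one conformational step.
print(hardness(-76.40, -75.95, -76.38))  # -> 0.47
```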

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to test the Canadian questionnaire Alberta Context Tool (ACT), Long-Term Care version, under Swedish conditions. The ACT is designed to measure the context of the care environment and different behaviours related to changes in clinical practice. In total, 159 Licensed Practical Nurses (LPNs) and Registered Nurses (RNs) within municipal care of the elderly were included in the survey. The test covered the instrument's reliability and face validity. The reliability test was carried out by calculating Cronbach's alpha and showed internal consistency for five of the scales of the ACT instrument, with Cronbach's alpha values ranging between 0.728 and 0.873. However, three dimensions obtained lower values (0.558-0.683). The face-validity analysis was carried out with content analysis, for LPNs and RNs in separate groups. The majority of LPNs stated that it was easy to respond to the questions (56%), while nine percent considered it difficult. Eleven comments were given about questions that were perceived as unclear, complicated or containing difficult words. In the RN group, only 30 percent considered the questions easy to respond to. One third of the RNs considered part of the questions unclear, and six RNs also specified which questions they experienced as unclear. In general, the questions in the ACT were perceived as relevant. The instrument's relevance as a tool for measuring contextual factors that influence the implementation of evidence-based nursing can also be considered established. By modifying the questionnaire content in accordance with the findings of this survey and carrying out a further test, the instrument can be considered relevant for use within Swedish municipal care of the elderly. The ACT can be used both as a tool in the work of improving clinical practice and as a tool for further research on the implementation of evidence-based nursing.
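
The internal-consistency measure used above, Cronbach's alpha, can be computed from item scores as in the following sketch (a generic respondents-by-items matrix with made-up answers, not the authors' actual analysis):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = scores.shape[1]                          # number of items in the scale
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical Likert-type answers from five respondents on four items.
answers = np.array([[4, 5, 4, 4],
                    [3, 3, 4, 3],
                    [5, 5, 5, 4],
                    [2, 3, 2, 3],
                    [4, 4, 5, 4]])
print(round(cronbach_alpha(answers), 3))
```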

Relevance:

30.00%

Publisher:

Abstract:

A tool for standardized calculation of solar collector performance has been developed in cooperation between SP Technical Research Institute of Sweden, DTU Denmark and SERC Dalarna University. The tool is designed to calculate the annual performance of solar collectors at representative locations in Europe. The collector parameters used as input in the tool are compiled from tests according to EN12975, without any intermediate conversions. The main target group for this tool is test institutes and certification bodies that are intended to use it for converting collector model parameters (derived from performance tests) into a more user-friendly quantity: the annual energy output. The energy output presented in the tool is expressed as kWh per collector module. A simplified treatment of the performance of PVT collectors is added, based on the assumption that the thermal part of a PVT collector can be tested and modeled as a thermal collector when the PV electric part is active with an MPP tracker in operation. The thermal collector parameters from this operation mode are used for the PVT calculations. © 2012 The Authors.
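
A minimal sketch of the kind of annual-output calculation such a tool performs, using the usual steady-state collector parameters reported from EN12975-type tests (eta0, a1, a2); the weather records and parameter values below are hypothetical, and real tools use a full climate year and incidence-angle corrections:

```python
def collector_power(G, t_mean, t_amb, eta0, a1, a2, area):
    """Instantaneous useful power (W) of one collector module."""
    dT = t_mean - t_amb
    q = eta0 * G - a1 * dT - a2 * dT ** 2   # W per m2 of aperture
    return max(q, 0.0) * area               # no negative output

# Hypothetical parameters and a toy "year" of three hourly records.
eta0, a1, a2, area = 0.78, 3.5, 0.015, 2.3   # -, W/m2K, W/m2K2, m2
hours = [  # (global irradiance W/m2, mean fluid temp C, ambient temp C)
    (800, 55, 20),
    (400, 50, 15),
    (0,   45, 10),
]
annual_kwh = sum(collector_power(G, tm, ta, eta0, a1, a2, area)
                 for G, tm, ta in hours) / 1000.0
print(f"{annual_kwh:.2f} kWh per module (toy example)")
```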

Relevance:

30.00%

Publisher:

Abstract:

Three-phase three-wire power flow algorithms, like any tool for power systems analysis, require reliable impedances and models in order to obtain accurate results. Kron's reduction procedure, which embeds the neutral wire influence into the phase wires, has shown good results when three-phase three-wire power flow algorithms based on the current summation method were used. However, Kron's reduction can harm the reliability of some algorithms whose iterative processes need loss calculation (power summation method). In this work, three three-phase three-wire power flow algorithms based on the power summation method will be compared with a three-phase four-wire approach based on the backward-forward technique and current summation. Two four-wire unbalanced medium-voltage distribution networks will be analyzed and the results will be presented and discussed. © 2004 IEEE.
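
For reference, Kron's reduction of a four-wire series impedance matrix eliminates the neutral row and column under the assumption of a solidly grounded neutral: Z_abc = Z_pp - Z_pn * inv(Z_nn) * Z_np. A small numpy sketch with hypothetical impedance values:

```python
import numpy as np

def kron_reduce(z4):
    """Reduce a 4x4 (a, b, c, n) series impedance matrix to a 3x3 phase matrix."""
    z_pp = z4[:3, :3]   # phase-phase block
    z_pn = z4[:3, 3:]   # phase-neutral block
    z_np = z4[3:, :3]   # neutral-phase block
    z_nn = z4[3:, 3:]   # neutral self impedance
    return z_pp - z_pn @ np.linalg.inv(z_nn) @ z_np

# Hypothetical 4x4 series impedance matrix in ohm/km (phases a, b, c and neutral n).
z4 = np.array([
    [0.46 + 1.08j, 0.16 + 0.50j, 0.16 + 0.42j, 0.16 + 0.46j],
    [0.16 + 0.50j, 0.47 + 1.05j, 0.16 + 0.45j, 0.16 + 0.44j],
    [0.16 + 0.42j, 0.16 + 0.45j, 0.46 + 1.07j, 0.16 + 0.48j],
    [0.16 + 0.46j, 0.16 + 0.44j, 0.16 + 0.48j, 0.55 + 1.10j],
])
print(np.round(kron_reduce(z4), 4))
```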

Relevance:

30.00%

Publisher:

Abstract:

A software tool developed in the Delphi programming language is presented to compute a reservoir's annual regulated active storage, based on the sequent-peak algorithm. Mathematical models used for that purpose generally require extended hydrological series, and the analysis of those series is usually performed with spreadsheets or graphical representations. Based on that, a software tool for the calculation of reservoir active capacity was developed. An example calculation is shown using 30 years (from 1977 to 2009) of monthly mean flow historical data from the Corrente River, located in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed to manage water resources, helping to manipulate data and to highlight information of interest to the user. Moreover, with that interface, irrigation districts where water consumption is higher can be analyzed as a function of specific seasonal water demand situations. From a practical application, it is possible to conclude that the program provides the calculation originally proposed. It was designed to keep information organized and retrievable at any time, and to show simulations of seasonal water demands throughout the year, contributing to the elements of study concerning reservoir projects. This program, with its functionality, is an important tool for decision making in water resources management.
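
The sequent-peak algorithm mentioned above can be summarized by the recursion K_t = max(0, K_{t-1} + D_t - Q_t), where Q_t is the inflow and D_t the demand in period t; the required active storage is the largest K_t. A minimal Python sketch with made-up monthly volumes (the software itself is written in Delphi):

```python
def sequent_peak(inflows, demands):
    """Required active storage for given inflow and demand series."""
    k, k_max = 0.0, 0.0
    for q, d in zip(inflows, demands):
        k = max(0.0, k + d - q)   # accumulated deficit after this period
        k_max = max(k_max, k)     # largest deficit = required storage
    return k_max

# Hypothetical monthly inflow volumes and a constant demand (hm3/month).
inflows = [12, 10, 8, 5, 3, 2, 2, 4, 6, 9, 11, 13]
demands = [7] * 12
print(sequent_peak(inflows, demands))  # -> 20.0 hm3 of active storage
```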

Relevance:

30.00%

Publisher:

Abstract:

This work proposes a computational tool to assist power system engineers in the field tuning of power system stabilizers (PSSs) and Automatic Voltage Regulators (AVRs). The outcome of this tool is a range of gain values for these controllers within which there is a theoretical guarantee of stability for the closed-loop system. This range is given as a set of limit values for the static gains of the controllers of interest, in such a way that the engineer responsible for the field tuning of PSSs and/or AVRs can be confident with respect to system stability when adjusting the corresponding static gains within this range. This feature of the proposed tool is highly desirable from a practical viewpoint, since the PSS and AVR commissioning stage always involves some readjustment of the controller gains to account for the differences between the nominal model and the actual behavior of the system. By capturing these differences as uncertainties in the model, this computational tool is able to guarantee stability for the whole uncertain model using an approach based on linear matrix inequalities. It is also important to remark that the tool proposed in this paper can also be applied to other types of parameters of either PSSs or Power Oscillation Dampers, as well as to other types of controllers (such as speed governors, for example). To show its effectiveness, applications of the proposed tool to two benchmarks for small-signal stability studies are presented at the end of this paper.
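
As an illustration of the general LMI idea (not the authors' formulation), quadratic stability over a gain interval can be certified by finding a single Lyapunov matrix P that works at both vertices of a closed-loop model that is affine in the static gain. The sketch below assumes cvxpy with the SCS solver is available; the two-state model and the gain interval are hypothetical:

```python
import numpy as np
import cvxpy as cp

# Hypothetical closed-loop model affine in the static gain k: A(k) = A0 + k * A1.
A0 = np.array([[0.0, 1.0], [-1.0, -0.5]])
A1 = np.array([[0.0, 0.0], [-0.8, -0.1]])
k_min, k_max = 0.5, 5.0

# A common Lyapunov matrix P > 0 satisfying the LMI at both vertices certifies
# stability for every gain in the interval (quadratic stability).
P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(2)]
for k in (k_min, k_max):
    Ak = A0 + k * A1
    constraints.append(Ak.T @ P + P @ Ak << -eps * np.eye(2))

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("gain range certified" if prob.status == "optimal" else "no certificate found")
```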

Relevance:

30.00%

Publisher:

Abstract:

Geochemical mapping is a valuable tool for territorial control that can be used not only in the identification of mineral resources and in geological, agricultural and forestry studies, but also in the monitoring of natural resources, giving solutions to environmental and economic problems. Stream sediments are widely used in the sampling campaigns carried out by governments and research groups because of their broad representativeness of rocks and soils, the ease of sampling and the possibility of conducting very detailed sampling. In this context, the environmental role of stream sediments provides a good basis for the implementation of environmental management measures: the composition of river sediments is an important factor in understanding the complex dynamics that develop within catchment basins, and they therefore represent a critical environmental compartment, since they can persistently incorporate pollutants after a contamination event and release them into the biosphere if environmental conditions change. It is essential to determine whether the concentrations of certain elements, in particular heavy metals, are the result of natural erosion of rocks containing high concentrations of specific elements or are generated as residues of human activities in a given study area. This PhD thesis aims to extract the widest spectrum of information from an extensive database on stream sediments of the Romagna rivers. The study involved low- and high-order streams in the mountain and hilly areas, as well as the sediments of the floodplain area, where intensive agriculture is active. The geochemical signals recorded by the stream sediments will be interpreted in order to reconstruct the natural variability related to bedrock and soil contribution, the effects of river dynamics and the anomalous sites, and, through the calculation of background values, to evaluate their level of degradation and predict the environmental risk.
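
One common convention for deriving geochemical background values of the kind mentioned above is a median plus 2*MAD threshold on element concentrations (purely illustrative here; not necessarily the approach adopted in the thesis):

```python
import numpy as np

def background_threshold(concentrations):
    """Median + 2*MAD threshold often used to flag anomalous samples."""
    c = np.asarray(concentrations, dtype=float)
    median = np.median(c)
    mad = np.median(np.abs(c - median))   # median absolute deviation
    return median + 2.0 * mad

# Hypothetical Ni concentrations (mg/kg) in stream sediment samples.
ni = [28, 31, 35, 29, 33, 30, 95, 27, 32, 34]
limit = background_threshold(ni)
anomalies = [v for v in ni if v > limit]
print(limit, anomalies)   # samples above the background threshold
```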

Relevance:

30.00%

Publisher:

Abstract:

This work considers the reconstruction of strong gravitational lenses from their observed effects on the light distribution of background sources. After reviewing the formalism of gravitational lensing and the most common and relevant lens models, new analytical results on the elliptical power-law lens are presented, including new expressions for the deflection, potential, shear and magnification, which naturally lead to a fast numerical scheme for practical calculation. The main part of the thesis investigates lens reconstruction with extended sources by means of the forward reconstruction method, in which the lenses and sources are given by parametric models. The numerical realities of the problem make it necessary to find targeted optimisations for the forward method in order to make it feasible for general applications to modern, high-resolution images. The result of these optimisations is presented in the Lensed algorithm. Subsequently, a number of tests for general forward reconstruction methods are created to decouple the influence of source from lens reconstructions, in order to objectively demonstrate the constraining power of the reconstruction. The final chapters on lens reconstruction contain two sample applications of the forward method. One is the analysis of images from a strong lensing survey. Such surveys today contain ~100 strong lenses, and much larger sample sizes are expected in the future, making it necessary to quickly and reliably analyse catalogues of lenses with a fixed model. The second application deals with the opposite situation of a single observation that is to be confronted with different lens models, where the forward method allows for natural model building. This is demonstrated using an example reconstruction of the "Cosmic Horseshoe". An appendix presents an independent work on the use of weak gravitational lensing to investigate theories of modified gravity which exhibit screening in the non-linear regime of structure formation.
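
The forward approach rests on the lens equation beta = theta - alpha(theta): a parametric lens model maps image-plane positions back to the source plane, where a parametric source is evaluated. A toy ray-tracing sketch for a singular isothermal sphere (constant deflection of one Einstein radius), purely illustrative and unrelated to the Lensed code or the elliptical power-law results themselves:

```python
import numpy as np

def sis_deflection(theta_x, theta_y, theta_e=1.0):
    """Deflection angle of a singular isothermal sphere with Einstein radius theta_e."""
    r = np.hypot(theta_x, theta_y)
    return theta_e * theta_x / r, theta_e * theta_y / r

def ray_trace(theta_x, theta_y, theta_e=1.0):
    """Lens equation: source-plane position beta = theta - alpha(theta)."""
    ax, ay = sis_deflection(theta_x, theta_y, theta_e)
    return theta_x - ax, theta_y - ay

# Hypothetical image-plane positions (arcsec) traced back to the source plane.
tx = np.array([1.5, -0.8, 0.3])
ty = np.array([0.2, 1.1, -1.4])
print(ray_trace(tx, ty))
```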

Relevance:

30.00%

Publisher:

Abstract:

The calculation of projection structures (PSs) from Protein Data Bank (PDB)-coordinate files of membrane proteins is not well-established. Reports on such attempts exist but are rare. In addition, the different procedures are barely described and thus difficult if not impossible to reproduce. Here we present a simple, fast and well-documented method for the calculation and visualization of PSs from PDB-coordinate files of membrane proteins: the projection structure visualization (PSV)-method. The PSV-method was successfully validated using the PS of aquaporin-1 (AQP1) from 2D crystals and cryo-transmission electron microscopy, and the PDB-coordinate file of AQP1 determined from 3D crystals and X-ray crystallography. Besides AQP1, which is a relatively rigid protein, we also studied a flexible membrane transport protein, i.e. the L-arginine/agmatine antiporter AdiC. Comparison of PSs calculated from the existing PDB-coordinate files of substrate-free and L-arginine-bound AdiC indicated that conformational changes are detected in projection. Importantly, structural differences were found between the PSV-method calculated PSs of the detergent-solubilized AdiC proteins and the PS from cryo-TEM of membrane-embedded AdiC. These differences are particularly exciting since they may reflect a different conformation of AdiC induced by the lateral pressure in the lipid bilayer.
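
As a rough illustration of what computing a projection structure from a PDB-coordinate file involves (a heavily simplified stand-in, not the published PSV-method), atom coordinates can be projected along the membrane normal onto a 2D grid and blurred to a chosen resolution; the file name, grid spacing and blur width below are hypothetical, and Biopython plus SciPy are assumed to be available:

```python
import numpy as np
from Bio.PDB import PDBParser
from scipy.ndimage import gaussian_filter

def projection_map(pdb_file, pixel=1.0, sigma=2.0):
    """Project all atoms along z onto an (x, y) density grid (very simplified)."""
    structure = PDBParser(QUIET=True).get_structure("protein", pdb_file)
    coords = np.array([atom.coord for atom in structure.get_atoms()])
    xy = coords[:, :2] - coords[:, :2].min(axis=0)     # shift to positive coordinates
    shape = np.ceil(xy.max(axis=0) / pixel).astype(int) + 1
    grid = np.zeros(shape)
    for x, y in xy:
        grid[int(x / pixel), int(y / pixel)] += 1.0    # unit weight per atom
    return gaussian_filter(grid, sigma=sigma)          # mimic limited resolution

# Hypothetical usage with an AQP1 coordinate file:
# density = projection_map("aqp1.pdb", pixel=1.0, sigma=3.0)
```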

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The ankle-brachial pressure index (ABI) is a simple, inexpensive, and useful tool in the detection of peripheral arterial occlusive disease (PAD). The current guidelines published by the American Heart Association define ABI as the quotient of the higher of the systolic blood pressures (SBPs) of the two ankle arteries of that limb (either the anterior tibial artery or the posterior tibial artery) and the higher of the two brachial SBPs of the upper limbs. We hypothesized that taking the lower of the two ankle arterial SBPs of a side as the numerator and the higher of the brachial SBPs as the denominator would increase its diagnostic yield. METHODS: The former method of eliciting ABI was termed high ankle pressure (HAP) and the latter low ankle pressure (LAP). ABI was assessed in 216 subjects and calculated according to both the HAP and the LAP method. ABI findings were confirmed by arterial duplex ultrasonography. A significant arterial stenosis was assumed if ABI was <0.9. RESULTS: LAP had a sensitivity of 0.89 and a specificity of 0.93. The HAP method had a sensitivity of 0.68 and a specificity of 0.99. McNemar's test comparing the results of both methods demonstrated a two-tailed P < .0001, indicating a highly significant difference between the two measurement methods. CONCLUSIONS: LAP is the superior method of calculating ABI to identify PAD. This result is of great interest for epidemiologic studies applying ABI measurements to detect PAD and assessing patients' cardiovascular risk.
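
The two calculation variants compared in the study reduce to simple arithmetic; a sketch with hypothetical pressures (mmHg) showing how the same leg can be classified differently by the two methods:

```python
def abi_hap(ankle_sbps, brachial_sbps):
    """Guideline (HAP) method: higher ankle SBP / higher brachial SBP."""
    return max(ankle_sbps) / max(brachial_sbps)

def abi_lap(ankle_sbps, brachial_sbps):
    """Alternative (LAP) method: lower ankle SBP / higher brachial SBP."""
    return min(ankle_sbps) / max(brachial_sbps)

# Hypothetical pressures for one leg (anterior tibial, posterior tibial) and both arms.
ankle, arms = (122, 96), (132, 128)
print(round(abi_hap(ankle, arms), 2))  # 0.92 -> no stenosis flagged by HAP
print(round(abi_lap(ankle, arms), 2))  # 0.73 -> stenosis flagged by LAP (< 0.9)
```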

Relevance:

30.00%

Publisher:

Abstract:

We present SUSY_FLAVOR version 2, a Fortran 77 program that calculates low-energy flavor observables in the general R-parity conserving MSSM. For a set of MSSM parameters as input, the code gives predictions for: 1. Electric dipole moments of the leptons and the neutron. 2. Anomalous magnetic moments (i.e. g − 2) of the leptons. 3. Radiative lepton decays (μ → eγ and τ → μγ, eγ). 4. Rare kaon decays (K_L → π0 ν̄ν and K+ → π+ ν̄ν). 5. Leptonic B decays (B_{s,d} → l+l−, B → τν and B → Dτν). 6. Radiative B decays (B̄ → X_s γ). 7. ΔF = 2 processes (K̄0–K0, D̄–D, B̄d–Bd and B̄s–Bs mixing). Compared to SUSY_FLAVOR v1, where the matching conditions were calculated strictly at the one-loop level, SUSY_FLAVOR v2 performs the resummation of all chirally enhanced corrections, i.e. it takes into account the enhanced effects from tan β and/or large trilinear soft mixing terms to all orders in perturbation theory. Also, in SUSY_FLAVOR v2 new routines for the calculation of B → (D)τν, g − 2, radiative lepton decays and Br(l → l'γ) were added. All calculations are done using exact diagonalization of the sfermion mass matrices. The program can be obtained from http://www.fuw.edu.pl/susy_flavor.

Relevance:

30.00%

Publisher:

Abstract:

Off-site effects of soil erosion are becoming increasingly important, particularly the pollution of surface waters. In order to develop environmentally efficient and cost-effective mitigation options, it is essential to identify areas that bear both a high erosion risk and a high connectivity to surface waters. This paper introduces a simple risk assessment tool that allows the delineation of potential critical source areas (CSA) of sediment input into surface waters for the agricultural areas of Switzerland. The basis is the erosion risk map with a 2 m resolution (ERM2) and the drainage network, which is extended by drained roads, farm tracks and slope depressions. The probability of hydrological and sedimentological connectivity is assessed by combining soil erosion risk and the extended drainage network with a flow distance calculation. A GIS environment with multiple-flow accumulation algorithms is used for routing runoff generation and flow pathways. The result is a high-resolution connectivity map of the agricultural area of Switzerland (888,050 ha). Fifty-five percent of the computed agricultural area is potentially connected with surface waters; 45% is not connected. Surprisingly, the larger part, 34% (62% of the connected area), is indirectly connected with surface waters through drained roads, and only 21% is directly connected. The reason is the topographic complexity and patchiness of the landscape due to a dense road and drainage network. A total of 24% of the connected area, and 13% of the computed agricultural area respectively, is rated with a high connectivity probability. On these CSA an adapted land use is recommended, supported by vegetated buffer strips preventing sediment load. Even areas that are far away from open water bodies can be indirectly connected and need to be included in the planning of mitigation measures. Thus, the connectivity map presented is an important decision-making tool for policy-makers and extension services. The map is published on the web and is thus available for application.
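
A very condensed sketch of the kind of raster combination described above: an erosion-risk grid is intersected with a flow-distance-to-drainage grid to flag potential critical source areas. The arrays and thresholds are hypothetical, and the flow distance is assumed to have been precomputed by the GIS routing step:

```python
import numpy as np

# Hypothetical rasters (same shape): erosion risk in t/ha/yr and
# flow distance to the nearest (extended) drainage element in metres.
erosion_risk = np.array([[0.5, 2.4, 4.1],
                         [1.2, 3.8, 0.3],
                         [5.0, 0.9, 2.2]])
flow_distance = np.array([[300.0,  40.0,  15.0],
                          [120.0,  25.0, 500.0],
                          [ 10.0, 250.0,  60.0]])

risk_threshold = 2.0        # "high erosion risk" cut-off (assumed)
distance_threshold = 100.0  # "well connected" cut-off (assumed)

critical_source_area = (erosion_risk >= risk_threshold) & \
                       (flow_distance <= distance_threshold)
print(critical_source_area)         # cells flagged as potential CSA
print(critical_source_area.mean())  # share of the area flagged
```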

Relevance:

30.00%

Publisher:

Abstract:

Proton therapy is growing increasingly popular due to its superior dose characteristics compared to conventional photon therapy. Protons travel a finite range in the patient body and stop, thereby delivering no dose beyond their range. However, because the range of a proton beam is heavily dependent on the tissue density along its beam path, uncertainties in patient setup position and inherent range calculation can degrade the dose distribution significantly. Despite these challenges, which are unique to proton therapy, current management of the uncertainties during treatment planning of proton therapy has been similar to that of conventional photon therapy. The goal of this dissertation research was to develop a treatment planning method and a plan evaluation method that address proton-specific issues regarding setup and range uncertainties. Treatment plan design method adapted to proton therapy: currently, for proton therapy using a scanning beam delivery system, setup uncertainties are largely accounted for by geometrically expanding a clinical target volume (CTV) to a planning target volume (PTV). However, a PTV alone cannot adequately account for range uncertainties coupled to misaligned patient anatomy in the beam path, since it does not account for the change in tissue density. In order to remedy this problem, we proposed a beam-specific PTV (bsPTV) that accounts for the change in tissue density along the beam path due to the uncertainties. Our proposed method was successfully implemented, and its superiority over the conventional PTV was shown through a controlled experiment. Furthermore, we have shown that the bsPTV concept can be incorporated into beam angle optimization for better target coverage and normal tissue sparing for a selected lung cancer patient. Treatment plan evaluation method adapted to proton therapy: the dose-volume histogram of the clinical target volume (CTV), or of any other volume of interest, at the time of planning does not represent the most probable dosimetric outcome of a given plan, as it does not include the uncertainties mentioned earlier. Currently, the PTV is used as a surrogate of the CTV's worst-case scenario for target dose estimation. However, because proton dose distributions are subject to change under these uncertainties, the validity of the PTV analysis method is questionable. In order to remedy this problem, we proposed the use of statistical parameters to quantify uncertainties on both the dose-volume histogram and the dose distribution directly. The robust plan analysis tool was successfully implemented to compute both the expectation value and the standard deviation of dosimetric parameters of a treatment plan under the uncertainties. For 15 lung cancer patients, the proposed method was used to quantify the dosimetric difference between the nominal situation and its expected value under the uncertainties.
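
A minimal sketch of the statistical plan-evaluation idea described above: sample setup/range scenarios, recompute a dose metric per scenario, and report its expectation value and standard deviation. The scenario doses below are random stand-ins for full dose recalculations, and D95 is used as an example metric:

```python
import numpy as np

def robust_metric(scenario_doses, volume_fraction=0.95):
    """Expectation and spread of a D95-like metric across uncertainty scenarios."""
    d95_per_scenario = [np.percentile(d, 100 * (1 - volume_fraction))
                        for d in scenario_doses]
    return np.mean(d95_per_scenario), np.std(d95_per_scenario, ddof=1)

# Hypothetical CTV voxel doses (Gy) for a nominal plan and three uncertainty scenarios.
rng = np.random.default_rng(0)
nominal = rng.normal(60.0, 0.5, 1000)
scenarios = [nominal,
             nominal - rng.normal(1.0, 0.3, 1000),   # e.g. range undershoot
             nominal - rng.normal(0.5, 0.2, 1000),   # e.g. setup shift
             nominal + rng.normal(0.2, 0.2, 1000)]
mean_d95, sd_d95 = robust_metric(scenarios)
print(f"expected D95 = {mean_d95:.1f} Gy, SD = {sd_d95:.1f} Gy")
```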

Relevance:

30.00%

Publisher:

Abstract:

NPV is a static measure of project value which does not discriminate between levels of internal and external risk in project valuation. Given the characteristics of current investment projects, a much more complex model is needed: one that includes the value of flexibility and the different risk levels associated with variables subject to uncertainty (price, costs, exchange rates, grade and tonnage of the deposits, and cut-off grade, among many others). Few of these variables present any correlation or can be treated uniformly. In this context, Real Option Valuation (ROV) arose more than a decade ago as a mainly theoretical model with the potential for simultaneous calculation of the risk associated with such variables. This paper reviews the literature regarding the application of Real Options Valuation in mining, noting the prior focus on external risks, and presents a case study where ROV is applied to quantify the risk associated with mine planning.
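
To make the contrast concrete: a static NPV discounts a single committed cash-flow stream, while even a one-step binomial real option captures the value of investing only when the outcome turns out favourable. A toy sketch with hypothetical figures (not from the case study), where the up/down probability is assumed to be already risk-adjusted:

```python
def npv(cash_flows, rate):
    """Static NPV of cash flows c_0, c_1, ... at a constant discount rate."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cash_flows))

def deferral_option_value(invest, up_value, down_value, p_up, rate):
    """One-step binomial value of the option to invest only if worthwhile."""
    payoff_up = max(up_value - invest, 0.0)     # exercise only when positive
    payoff_down = max(down_value - invest, 0.0)
    return (p_up * payoff_up + (1 - p_up) * payoff_down) / (1 + rate)

# Hypothetical mining project: invest 100 now, expected inflows afterwards.
print(round(npv([-100, 40, 45, 50], 0.10), 2))                 # value of committing today
print(round(deferral_option_value(100, 160, 70, 0.5, 0.10), 2))  # value if we can wait one period
```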

Relevance:

30.00%

Publisher:

Abstract:

The accurate computation of radiative opacities is needed in several research fields, such as astrophysics, magnetic fusion or ICF target physics analysis, in which radiation transport is an important feature to determine in detail. Radiation transport plays an important role in the transport of energy in dense plasmas and is strongly influenced by the variation of the plasma opacity with density and temperature, as well as with photon energy. In this work we present some new features of the opacity code ATMED [1]. This code has been designed to compute the spectral radiative opacity as well as the Rosseland and Planck means for single-element and mixture plasmas. The model presented is fast, stable and reasonably accurate within its range of application, and it can be a useful tool to simulate ICF experiments in a plasma laboratory.
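
For reference, the Planck and Rosseland means mentioned above are weighted averages of the spectral opacity: kappa_P = integral(kappa_nu * B_nu) / integral(B_nu), and 1/kappa_R = integral((1/kappa_nu) * dB_nu/dT) / integral(dB_nu/dT). A small numerical sketch on a hypothetical spectral grid (not ATMED itself; the toy opacity shape is invented for illustration):

```python
import numpy as np

def planck_and_rosseland_means(nu, kappa_nu, temperature):
    """Planck and Rosseland mean opacities from a tabulated spectral opacity."""
    h, k_b = 6.626e-34, 1.381e-23
    x = h * nu / (k_b * temperature)
    b_nu = nu ** 3 / np.expm1(x)                     # Planck function (constants drop out)
    db_dt = nu ** 4 * np.exp(x) / np.expm1(x) ** 2   # dB/dT up to constant factors
    kappa_planck = np.trapz(kappa_nu * b_nu, nu) / np.trapz(b_nu, nu)
    kappa_rosseland = np.trapz(db_dt, nu) / np.trapz(db_dt / kappa_nu, nu)
    return kappa_planck, kappa_rosseland

# Hypothetical spectral grid (Hz) and opacity (cm2/g) at T = 100 eV ~ 1.16e6 K.
nu = np.linspace(1e15, 5e17, 2000)
kappa_nu = 1e3 * (1e16 / nu) ** 3 + 50.0   # crude free-free-like falloff plus a floor
print(planck_and_rosseland_means(nu, kappa_nu, 1.16e6))
```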