980 results for Source wavelet estimation
Abstract:
Identification of chemical compounds with specific biological activities is an important step in both chemical biology and drug discovery. When the structure of the intended target is available, one approach is to use molecular docking programs to assess the chemical complementarity of small molecules with the target; such calculations provide a qualitative measure of affinity that can be used in virtual screening (VS) to rank-order a list of compounds according to their potential to be active. rDock is a molecular docking program developed at Vernalis for high-throughput VS (HTVS) applications. Evolved from RiboDock, the program can be used against proteins and nucleic acids, is designed to be computationally very efficient, and allows the user to incorporate additional constraints and information as a bias to guide docking. This article provides an overview of the program's structure and features and compares rDock to two reference programs, AutoDock Vina (open source) and Schrödinger's Glide (commercial). In terms of computational speed for VS, rDock is faster than Vina and comparable to Glide. For binding mode prediction, rDock and Vina are superior to Glide. The VS performance of rDock is significantly better than that of Vina, but inferior to Glide for most systems unless pharmacophore constraints are used, in which case rDock and Glide perform equally well. The program is released under the Lesser General Public License and is freely available for download, together with the manuals, example files and the complete test sets, at http://rdock.sourceforge.net/
Abstract:
In mathematical modeling, the estimation of model parameters is one of the most common problems. The goal is to find parameters that fit the measurements as well as possible. There is always error in the measurements, which introduces uncertainty into the model estimates. In Bayesian statistics, all unknown quantities are represented as probability distributions. If knowledge about the parameters is available beforehand, it can be formulated as a prior distribution. Bayes' rule combines the prior and the measurements into a posterior distribution. Mathematical models are typically nonlinear, so producing statistics for them requires efficient sampling algorithms. In this thesis, the Metropolis-Hastings (MH) and Adaptive Metropolis (AM) algorithms, as well as Gibbs sampling, are introduced. Different ways to specify prior distributions are also presented. The main issue is measurement error estimation and how to obtain prior knowledge of the variance or covariance. Variance and covariance sampling is combined with the algorithms above. The hyperprior models are applied, as examples, to the estimation of model parameters and measurement error in a case with outliers.
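As a minimal sketch of the sampling machinery discussed in this thesis, the following Python example implements a random-walk Metropolis-Hastings sampler for a one-parameter posterior; the Gaussian model, the synthetic data and the proposal width are assumptions chosen for illustration, not taken from the thesis.

```python
import numpy as np

# Synthetic data: noisy measurements of an unknown mean (hypothetical example).
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=50)

def log_posterior(theta):
    # Gaussian likelihood with known noise sigma = 0.5 and a flat prior on theta.
    return -0.5 * np.sum((data - theta) ** 2) / 0.5**2

def metropolis_hastings(n_samples, step=0.2, theta0=0.0):
    samples = np.empty(n_samples)
    theta, logp = theta0, log_posterior(theta0)
    for i in range(n_samples):
        proposal = theta + step * rng.normal()        # random-walk proposal
        logp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < logp_prop - logp:  # Metropolis accept/reject
            theta, logp = proposal, logp_prop
        samples[i] = theta
    return samples

chain = metropolis_hastings(10_000)
print(chain[2000:].mean(), chain[2000:].std())  # posterior summary after burn-in
```

The Adaptive Metropolis variant mentioned above differs mainly in that the proposal covariance is tuned from the history of the chain instead of being fixed.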
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented in which the SCR is calculated according to the Standard Model approach. The requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR, we conduct a sensitivity analysis: we examine changes in the correlation matrix between lines of business and address the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
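To make the internal-model idea concrete, here is a minimal Python sketch of a Monte Carlo SCR calculation in which a Gaussian copula links two lines of business; the lognormal marginals and the correlation value are invented for illustration, while the 99.5% one-year VaR corresponds to the Solvency II calibration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_sims = 100_000

# Hypothetical lognormal loss marginals for two lines of business.
marginals = [stats.lognorm(s=0.4, scale=100.0), stats.lognorm(s=0.6, scale=60.0)]

# Gaussian copula: correlated standard normals -> uniforms -> marginal quantiles.
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n_sims)
u = stats.norm.cdf(z)
losses = np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(marginals)])

# SCR as the 99.5% VaR of the aggregate loss in excess of its expectation.
total = losses.sum(axis=1)
scr = np.quantile(total, 0.995) - total.mean()
print(f"SCR estimate: {scr:.1f}")
```

The sensitivity analysis described above amounts to re-running this simulation with different `corr` entries or with a different copula family (e.g. a Student-t copula, which adds tail dependence) and comparing the resulting SCR figures.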
Abstract:
This work presents, from the perspective of a freelance professional, a case study of a practical, real-world implementation of an Open Source ERP software suite at a very small company, including the development of a custom software module to adapt the suite to the particular needs of the company.
Comparative study on the filtering of instrumental signals using Fourier and wavelet transforms
Abstract:
A comparative study of the Fourier transform (FT) and the wavelet transform (WT) for instrumental signal denoising is presented. The basic principles of wavelet theory are described in a succinct and simplified manner. For illustration, the FT and WT are used to filter UV-VIS and plasma emission spectra, with MATLAB used for the computations. Results show that FT and WT filters are comparable when the signal does not display sharp peaks (UV-VIS spectra), but the WT yields better filtering when the filling factor of the signal is small (plasma spectra), since it introduces little peak distortion.
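The study performed its computations in MATLAB; as a language-neutral illustration of the two approaches, here is a minimal Python sketch that low-pass filters a synthetic peaked signal with the FFT and soft-thresholds its wavelet coefficients with the PyWavelets package. The signal, the cutoff, the wavelet choice (db4) and the threshold rule are assumptions for the example, not the paper's settings.

```python
import numpy as np
import pywt  # PyWavelets

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1024)
# Synthetic "plasma-like" signal: a few sharp peaks plus white noise.
clean = np.exp(-((t - 0.3) / 0.005) ** 2) + 0.7 * np.exp(-((t - 0.7) / 0.004) ** 2)
noisy = clean + 0.05 * rng.normal(size=t.size)

# Fourier filtering: zero out coefficients above an illustrative cutoff.
spec = np.fft.rfft(noisy)
spec[100:] = 0.0
ft_filtered = np.fft.irfft(spec, n=t.size)

# Wavelet filtering: soft-threshold the detail coefficients.
coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(t.size))            # universal threshold
coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
wt_filtered = pywt.waverec(coeffs, "db4")[: t.size]

for name, y in [("FT", ft_filtered), ("WT", wt_filtered)]:
    print(name, "RMSE:", np.sqrt(np.mean((y - clean) ** 2)))
```

On sharp-peaked signals like this one, the hard FFT cutoff tends to smear the peaks, while the wavelet thresholding preserves them, which mirrors the paper's conclusion for plasma spectra.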
Abstract:
This report describes the preparation, execution and results of implementing a route-calculation system. The Open Source Routing Machine project is a high-performance routing engine that uses OpenStreetMap data to compute the shortest path between two points. This final project aims not only to use OpenStreetMap data, but also to incorporate custom data in shapefile format and to display them in a web viewer. This viewer allows the user to easily request routes from the OSRM server that was set up, obtaining the desired route within a few milliseconds.
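As a hedged illustration of how a client can query a running OSRM instance like the one described, the following Python sketch calls OSRM's HTTP route service; the localhost URL, the driving profile and the coordinates (roughly Girona to Barcelona) are assumptions for the example, not details taken from the project.

```python
import requests

# Hypothetical local OSRM instance; coordinates are lon,lat pairs.
OSRM_URL = "http://localhost:5000/route/v1/driving/2.8214,41.9794;2.1686,41.3874"

resp = requests.get(OSRM_URL, params={"overview": "false"}, timeout=10)
resp.raise_for_status()
route = resp.json()["routes"][0]  # OSRM returns routes ordered by preference

print(f"distance: {route['distance'] / 1000:.1f} km")   # metres -> km
print(f"duration: {route['duration'] / 60:.1f} min")    # seconds -> minutes
```

A web viewer like the one in the project would issue the same request from JavaScript and draw the returned geometry on the map.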
Abstract:
Instrumentation: Piano.
Abstract:
Several models for the estimation of thermodynamic properties of layered double hydroxides (LDHs) are presented. The thermodynamic quantities predicted by the proposed models agree with experimental thermodynamic data. A thermodynamic study of the anion exchange process on LDHs is also carried out using the described models. Tables for the prediction of monovalent anion exchange selectivities on LDHs are provided. Reasonable agreement is found between the predicted and the experimental monovalent anion exchange selectivities.
Abstract:
Al2O3 is the most abundantly produced nanomaterial and has been used in diverse fields, including the medical, military and industrial sectors. As there are concerns about the health effects of nanoparticles, it is important to understand how they interact with cells, and specifically with red blood cells. The hemolysis induced by three commercial nano-sized aluminum oxide particles (nanopowder 13 nm, nanopowder <50 nm and nanowire 2-6 nm × 200-400 nm) was compared to that of bulk aluminum oxide and was studied on erythrocytes from humans, rats and rabbits, in order to elucidate the mechanism of action and the influence of size and shape on hemolytic behavior. The concentrations inducing 50% hemolysis (HC50) were calculated for each compound studied. The most hemolytic particles were the 13 nm nanopowder, followed by the nanowire and the <50 nm nanopowder. The addition of albumin to PBS had a protective effect against hemolysis for all the nano-forms of Al2O3, but not for bulk Al2O3. The drop in HC50 correlated with a decrease in nanomaterial size, which was induced by a reduction of aggregation. Aluminum oxide nanoparticles are less hemolytic than other oxide nanoparticles, and behave differently depending on their size and shape. The hemolytic behavior of aluminum oxide nanoparticles differs from that of bulk aluminum oxide.
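HC50 values such as those reported above are commonly obtained by fitting a sigmoidal concentration-response curve to measured hemolysis fractions. The following Python sketch fits a two-parameter Hill curve with SciPy; the data points are invented for illustration and are not from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented example data: concentration (ug/mL) vs. fraction of hemolysis.
conc = np.array([10, 30, 100, 300, 1000, 3000], dtype=float)
hemolysis = np.array([0.02, 0.08, 0.25, 0.55, 0.85, 0.97])

def hill(c, hc50, n):
    # Two-parameter Hill curve running from 0 to 1; hc50 is the midpoint.
    return c**n / (hc50**n + c**n)

(hc50, n), _ = curve_fit(hill, conc, hemolysis, p0=(300.0, 1.0))
print(f"HC50 ~ {hc50:.0f} ug/mL, Hill slope ~ {n:.2f}")
```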
Abstract:
Variations in the water volume of small depressions in the Mediterranean salt marshes of Girona (Spain) are described and the potential causes of these variations analysed. Although the basins appear to be endorheic, groundwater circulation is intense, as estimated from the difference between the water volume observed and that expected from the precipitation/evaporation balance. The rate of variation in volume (VR = ΔV / (V·Δt)) may be used to estimate the groundwater supply ('circulation'), since direct measurements of this parameter are impossible. Volume·conductivity figures can also be used to estimate the amount of circulation, and to investigate the origin of the water supplied to the system. The relationships between variations in the volume of water in the basins and the main causes of flooding are also analysed. Sea storms, rainfall levels and strong, dry northerly winds are suggested as the main causes of the variations in basin volumes. The relative importance assigned to these factors has changed following the recent regulation of freshwater flows entering the system.
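A minimal sketch, with invented numbers, of the bookkeeping behind VR = ΔV / (V·Δt): the observed volume change is compared with the change expected from the precipitation/evaporation balance alone, and the residual is attributed to groundwater circulation.

```python
# Observed basin volumes (m^3) at the start and end of an interval of dt days.
v0, v1, dt = 1200.0, 1050.0, 10.0

# Water balance terms over the interval (m^3): rainfall input and evaporation loss.
precipitation, evaporation = 40.0, 110.0

vr = (v1 - v0) / (v0 * dt)                      # rate of variation VR = dV / (V * dt)
expected_change = precipitation - evaporation   # change expected from the P/E balance
groundwater = (v1 - v0) - expected_change       # residual attributed to circulation

print(f"VR = {vr:.4f} per day, groundwater term = {groundwater:.0f} m^3")
```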
Abstract:
During the late 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions from before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth the data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to the US survival functions, detailed by age and stage, to obtain the Catalan estimates. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods: before the cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
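A minimal Python sketch of the hazard-ratio step described in the Methods: a Poisson regression for event counts with a log person-time offset, a region indicator and time as covariates, fitted with statsmodels. The grouped data and column names are invented for illustration; the study's actual data were age- and stage-specific.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Invented grouped survival data: deaths and person-years per region and year.
df = pd.DataFrame({
    "deaths":       [30, 25, 22, 40, 33, 29],
    "person_years": [1000, 950, 900, 1000, 960, 920],
    "region":       ["USA", "USA", "USA", "Catalonia", "Catalonia", "Catalonia"],
    "time":         [1, 2, 3, 1, 2, 3],
})

# Poisson regression with log person-time offset; exp(coefficient) is the hazard ratio.
model = smf.glm("deaths ~ C(region, Treatment('USA')) + time", data=df,
                family=sm.families.Poisson(),
                offset=np.log(df["person_years"])).fit()
hr = np.exp(model.params["C(region, Treatment('USA'))[T.Catalonia]"])
print(f"Catalonia vs USA hazard ratio: {hr:.2f}")

# The fitted HR then scales the US age- and stage-specific hazards:
# catalan_hazard = hr * us_hazard
```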
Abstract:
It is a well-known phenomenon that the constant-amplitude fatigue limit of a large component is lower than the fatigue limit of a small specimen made of the same material. In notched components the opposite occurs: the fatigue limit, defined as the maximum stress at the notch, is higher than that achieved with smooth specimens. These two effects have been taken into account in most design handbooks with the help of empirical formulas or design curves. The basic idea of this study is that the size effect can mainly be explained by the statistical size effect. A component subjected to an alternating load can be assumed to contain a sample of initiated cracks at the end of the crack initiation phase, and the size of this sample depends on the size of the specimen in question. The main objective of this study is to develop a statistical model for the estimation of this kind of size effect. It is shown that the size of a sample of initiated cracks should be based on the stressed surface area of the specimen. In the case of a varying stress distribution, an effective stress area must be calculated; it is based on the decreasing probability of equally sized initiated cracks at lower stress levels. If the distribution function of the parent population of cracks is known, the distribution of the maximum crack size in a sample can be defined. This makes it possible to calculate an estimate of the largest expected crack for any sample size, and the fatigue limit can then be estimated with the help of linear elastic fracture mechanics. In notched components another source of size effect has to be taken into account. If we consider two specimens of similar shape but different size, the stress gradient in the smaller specimen is steeper, so if there is an initiated crack in both of them, the stress intensity factor at the crack in the larger specimen is higher. The second goal of this thesis is to create a calculation method for this factor, which is called the geometric size effect. The proposed method for calculating the geometric size effect is also based on linear elastic fracture mechanics: an accurate value of the stress intensity factor in a non-linear stress field can be calculated using weight functions, the stress intensity factor calculated at the initiated crack is compared to the corresponding stress intensity factor due to a constant stress, and the notch size effect is obtained as the ratio of these stress intensity factors. The presented methods were tested against experimental results taken from three German doctoral theses. Two candidates for the parent population of initiated cracks were found: the Weibull distribution and the log-normal distribution. Both can be used successfully to predict the statistical size effect for smooth specimens. In the case of notched components, the geometric size effect due to the stress gradient must be combined with the statistical size effect. The proposed method gives good results as long as the notch in question is blunt enough. For very sharp notches, with a stress concentration factor of about 5 or higher, the method does not give satisfactory results: it was shown that the plastic portion of the strain becomes quite high at the root of such notches, so the use of linear elastic fracture mechanics becomes questionable.
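A minimal sketch of the statistical size effect under assumed parameter values: with a Weibull parent distribution of initiated crack depths, the maximum of a sample of n cracks has distribution function F(a)^n, so the expected largest crack grows with the stressed surface area, and the fatigue limit follows from the LEFM threshold condition Δσ = ΔK_th / (Y·√(πa)). The Weibull parameters, the threshold ΔK_th and the geometry factor Y below are illustrative assumptions, not values from the thesis.

```python
import numpy as np
from scipy import stats

# Assumed Weibull parent distribution of initiated crack depths (mm).
parent = stats.weibull_min(c=1.5, scale=0.05)

def median_max_crack(n_cracks):
    # Median of the sample maximum: solve F(a)^n = 0.5  =>  F(a) = 0.5**(1/n).
    return parent.ppf(0.5 ** (1.0 / n_cracks))

def fatigue_limit(a_mm, dK_th=6.0, Y=0.73):
    # LEFM threshold: delta_sigma = dK_th / (Y * sqrt(pi * a)), crack depth in metres.
    a = a_mm * 1e-3
    return dK_th / (Y * np.sqrt(np.pi * a))  # MPa, with dK_th in MPa*sqrt(m)

# Sample size taken proportional to stressed surface area: larger part, more cracks.
for n in (10, 100, 1000):
    a_max = median_max_crack(n)
    print(f"n={n:5d}  largest crack ~ {a_max:.3f} mm  "
          f"fatigue limit ~ {fatigue_limit(a_max):.0f} MPa")
```

The printed trend, a larger expected crack and hence a lower fatigue limit as n grows, is the statistical size effect; the geometric effect would enter through a crack-size-dependent Y computed with weight functions.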
Abstract:
Cost estimation is an important but challenging process when designing a new product or a feature of it, verifying the product prices given by suppliers, or planning cost-saving actions for existing products. It is even more challenging when the product is highly modular rather than a bulk product. In general, cost estimation techniques can be divided into two main groups, qualitative and quantitative, which can be further classified into more detailed methods. Qualitative techniques are generally preferable when comparing alternatives, and quantitative techniques when cost relationships can be found. The main objective of this thesis was to develop a method for estimating the costs of internally manufactured and commercial elevator landing doors. Because of the challenging product structure, the proposed cost estimation framework operates on three different levels, depending on the past cost information available, and combines features of both qualitative and quantitative cost estimation techniques. The starting point for the whole cost estimation process is an unambiguous, hierarchical product structure, so that the product can be broken down into controllable parts and is thus easier to handle. Those controllable parts can then be compared with existing cost knowledge of similar parts to produce cost estimates that are as accurate as possible.
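As a sketch of the framework's stated starting point, an unambiguous hierarchical product structure, the following Python example models a product as a tree of controllable parts, each costed either directly from past cost knowledge of a similar part or by rolling up its children; the door structure and all figures are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Part:
    name: str
    unit_cost: float | None = None          # known cost from past data, if any
    children: list["Part"] = field(default_factory=list)

    def cost(self) -> float:
        # A part with a known cost uses it; otherwise roll up the children.
        if self.unit_cost is not None:
            return self.unit_cost
        return sum(child.cost() for child in self.children)

# Invented landing-door structure, costed by analogy to similar past parts.
door = Part("landing door", children=[
    Part("door panels", children=[Part("panel", 120.0), Part("panel", 120.0)]),
    Part("frame", 180.0),
    Part("sill and guides", 60.0),
    Part("commercial lock mechanism", 95.0),
])

print(f"estimated door cost: {door.cost():.2f}")
```

The tree makes the three-level idea natural: parts with reliable past costs are leaves, parts with only partial information are estimated by analogy at an intermediate node, and genuinely new parts are decomposed further until comparable pieces are found.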