806 results for Information content


Relevance:

60.00%

Publisher:

Abstract:

Herbivorous insects, their host plants and natural enemies form the largest and most species-rich communities on earth. But what forces structure such communities? Do they represent random collections of species, or are they assembled according to certain rules? To address these questions, food webs offer excellent tools. As a result of their versatile information content, such webs have become the focus of intensive research over the last few decades. In this thesis, I study herbivore-parasitoid food webs from a new perspective: I construct multiple, quantitative food webs in a spatially explicit setting, at two different scales. Focusing on food webs consisting of specialist herbivores and their natural enemies on the pedunculate oak, Quercus robur, I examine consistency in food web structure across space and time, and how landscape context affects this structure. As an important methodological development, I use DNA barcoding to resolve potential cryptic species in the food webs, and to examine their effect on food web structure. I find that DNA barcoding changes our perception of species identity for as many as a third of the individuals, by reducing misidentifications and by resolving several cryptic species. In terms of the variation detected in food web structure, I find surprising consistency in both space and time. From a spatial perspective, landscape context leaves no detectable imprint on food web structure, while species richness declines significantly with decreasing connectivity. From a temporal perspective, food web structure remains predictable from year to year, despite considerable species turnover in local communities. The rate of such turnover varies between guilds and between species within guilds. These observations are best explained by abundant and common species, which leave a quantitatively dominant imprint on overall structure and suffer the lowest turnover. By contrast, rare species with little impact on food web structure exhibit the highest turnover rates. These patterns reveal important limitations of modern metrics of quantitative food web structure. While such metrics accurately describe the overall topology of the web and its most significant interactions, they are disproportionately affected by species with given traits, and insensitive to the specific identity of species. As rare species have been shown to be important for food web stability, metrics depicting quantitative food web structure should therefore not be used as the sole descriptors of communities in a changing world. To detect and resolve the versatile imprint of global environmental change, one should rather use these metrics as one tool among several.
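As an illustration of the kind of quantitative metric the thesis critiques, the sketch below (not the thesis' own code; the interaction counts are invented) computes Shannon diversity and evenness of links in a small host-parasitoid matrix, showing how one abundant interaction dominates the value of the metric.

```python
# A minimal sketch (hypothetical counts, numpy only) of one common
# quantitative food-web metric: Shannon diversity of interactions and
# interaction evenness, computed from a host-parasitoid matrix whose
# entries are interaction frequencies.
import numpy as np

web = np.array([          # rows: herbivore hosts, cols: parasitoids
    [120, 0, 3],          # one dominant link swamps the metric...
    [0, 7, 1],
    [2, 0, 9],
], dtype=float)

p = web[web > 0] / web.sum()            # proportions of realized links
H = -(p * np.log(p)).sum()              # Shannon diversity of links
evenness = H / np.log(p.size)           # 1.0 = all links equally strong
print(f"H = {H:.3f}, evenness = {evenness:.3f}")
# ...so turnover among the rare links barely moves H or evenness,
# mirroring the insensitivity to rare species discussed above.
```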

Relevance:

60.00%

Publisher:

Abstract:

The increased availability of high-frequency data sets has led to important new insights into the workings of financial markets. The use of high-frequency data is interesting and persuasive, since it can reveal new information that cannot be seen at lower levels of data aggregation. This dissertation explores some of the many important issues connected with the use, analysis and application of high-frequency data. These include the effects of intraday seasonality, the behaviour of time-varying volatility, the information content of various market data, and the issue of inter-market linkages, utilizing high-frequency 5-minute observations from major European and U.S. stock indices, namely the DAX30 of Germany, CAC40 of France, SMI of Switzerland, FTSE100 of the UK and SP500 of the U.S. The first essay in the dissertation shows that there are remarkable similarities in the intraday behaviour of conditional volatility across European equity markets. Moreover, U.S. macroeconomic news announcements have a significant cross-border effect on both European equity returns and volatilities. The second essay reports substantial intraday return and volatility linkages across the stock indices of the UK and Germany. This relationship appears virtually unchanged by the presence or absence of the U.S. stock market. However, the return correlation between the UK and German markets rises significantly following the U.S. stock market opening, which can largely be described as a contemporaneous effect. The third essay sheds light on market microstructure issues in which traders and market makers learn from watching market data, and it is this learning process that leads to price adjustments. This study concludes that trading volume plays an important role in explaining international return and volatility transmission. The examination of asymmetry reveals that positive volume changes have a larger impact on foreign stock market volatility than negative changes. The fourth and final essay documents a number of regularities in the pattern of intraday return volatility, trading volume and bid-ask spreads. This study also reports a contemporaneous and positive relationship between intraday return volatility, bid-ask spreads and unexpected trading volume. These results verify the role of trading volume and bid-ask quotes as proxies for information arrival in producing contemporaneous and subsequent intraday return volatility. Moreover, an asymmetric effect of trading volume on conditional volatility is also confirmed. Overall, this dissertation explores the role of information in explaining intraday return and volatility dynamics in international stock markets. The process through which information is incorporated into stock prices is central to all information-based models. Intraday data facilitate the investigation of how information is incorporated into security prices as a result of the trading behaviour of informed and uninformed traders. High-frequency data thus appear critical in enhancing our understanding of the intraday behaviour of various stock market variables, with important implications for market participants, regulators and academic researchers.
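To make the objects of study concrete, here is a small sketch (simulated returns, not the dissertation's index data) of two quantities the essays rely on: daily realized volatility built from 5-minute returns, and the average intraday volatility pattern.

```python
# A minimal sketch, on simulated data, of realized volatility from
# 5-minute returns and the average intraday (diurnal) volatility
# pattern. All numbers are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_days, bins_per_day = 250, 102           # e.g. 8.5 trading hours / 5 min
# simulate 5-minute log returns with a U-shaped intraday variance
u_shape = 1.0 + 0.8 * np.cos(np.linspace(0, 2 * np.pi, bins_per_day))
r = rng.normal(0.0, 1e-3, (n_days, bins_per_day)) * np.sqrt(u_shape)

realized_var = (r ** 2).sum(axis=1)       # daily realized variance
realized_vol = np.sqrt(realized_var)      # daily realized volatility
diurnal = np.sqrt((r ** 2).mean(axis=0))  # average vol per intraday bin

print(f"mean daily realized vol: {realized_vol.mean():.5f}")
print(f"open/midday vol ratio: {diurnal[0] / diurnal[bins_per_day // 2]:.2f}")
```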

Relevance:

60.00%

Publisher:

Abstract:

Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging and risk management activities, all of which require accurate volatility estimates. The task has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns: the volatility smirk (skew) and the volatility term structure, which, examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of options markets. The first essay models the dynamics of the IVS by extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, a nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variations in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS. Of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and an additional 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions, monotonically increasing from the median quantile to the uppermost quantile (i.e., 95%); OLS therefore underestimates this relationship at the upper quantiles. Additionally, the asymmetric relationship is more pronounced with the smirk (skew)-adjusted volatility index measure than with the old volatility index measure. The volatility indices rank in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new-VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested from 1992 to 2009 using unconditional coverage, independence, conditional coverage and quadratic-score tests. It is found that the VDAX subsumes almost all information required for daily VaR forecasts for a portfolio of the DAX30 index; implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
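A hedged sketch of the DFW-style "ad hoc" surface mentioned for the first essay: implied volatility regressed on polynomial terms in moneyness and maturity. The thesis uses nonlinear optimization on FTSE100 options; the ordinary least squares fit and synthetic data below are simplifications for illustration.

```python
# A minimal sketch of a DFW (1998)-style implied-volatility surface:
# IV modeled as a quadratic in moneyness m and maturity T, fitted by
# ordinary least squares on made-up data.
import numpy as np

rng = np.random.default_rng(1)
m = rng.uniform(0.8, 1.2, 200)            # moneyness K / F
T = rng.uniform(0.05, 1.0, 200)           # maturity in years
iv = (0.20 - 0.05 * (m - 1) + 0.30 * (m - 1) ** 2 + 0.02 * T
      + rng.normal(0, 0.005, 200))        # synthetic smirk + term structure

X = np.column_stack([np.ones_like(m), m, m ** 2, T, T ** 2, m * T])
beta, *_ = np.linalg.lstsq(X, iv, rcond=None)

def iv_surface(mny, mat):
    """Evaluate the fitted surface at (moneyness, maturity)."""
    return beta @ np.array([1.0, mny, mny ** 2, mat, mat ** 2, mny * mat])

print(f"ATM 3-month IV: {iv_surface(1.0, 0.25):.4f}")
```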

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this thesis is to examine the role of trade durations in price discovery. One motivation for using trade durations in the study of price discovery is that durations are robust to many microstructure effects that bias the measurement of return volatility. Another is that it is difficult to think of economic variables that are genuinely useful in determining the source of volatility at arbitrarily high frequencies. The dissertation contains three essays. In the first essay, the role of trade durations in price discovery is examined with respect to the volatility pattern of stock returns. The theory of volatility is associated with the theory of the information content of a trade, central to market microstructure theory. The first essay documents that volatility per transaction is related to the intensity of trade, and that there is a strong relationship between the stochastic process of trade durations and trading variables. In the second essay, the role of trade durations in price discovery is examined with respect to the quantification of the risk due to a trading volume of a certain size. The theory of volume is intrinsically associated with the stock volatility pattern. The essay documents that volatility generally increases when traders choose to trade with large transactions. In the third essay, the role of trade durations in price discovery is examined with respect to the information content of a trade. The theory of the information content of a trade is associated with the theory of the rate of price revisions in the market. The essay documents that short durations are associated with information: traders are compensated for responding quickly to information.
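A small simulation sketch (with an invented data-generating link, not the essays' estimates) of the first essay's central object: volatility per transaction compared across short and long trade durations.

```python
# A minimal sketch, on simulated trades, of volatility per transaction
# as a function of trading intensity. Short durations are constructed
# here to coincide with larger price moves, mimicking the
# information-arrival story; the link is made up for illustration.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
durations = rng.exponential(10.0, n)              # seconds between trades
# per-trade returns whose variance rises as durations shrink
r = rng.normal(0.0, 1.0, n) * np.sqrt(1.0 / (1.0 + durations / 10.0))

fast = durations < np.median(durations)
print(f"vol per trade, short durations: {r[fast].std():.3f}")
print(f"vol per trade, long durations:  {r[~fast].std():.3f}")
```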

Relevance:

60.00%

Publisher:

Abstract:

The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps: first, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized in a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problem of non-normality in the return distribution of the underlying asset. Here the hyperbolic distribution is used as one alternative for dealing with heavy tails. Results indicate that the information content of implied volatility is useful for predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
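A minimal sketch of the two-step calculation, with a Black-Scholes call and normally distributed scenarios standing in for the paper's setup (the paper's point is precisely that a heavy-tailed hyperbolic distribution fits better than the normal used here):

```python
# Step 1: revalue an option portfolio over simulated market scenarios.
# Step 2: compress the scenario P&L into one VaR-like number.
# Black-Scholes and normal scenarios are simplifying stand-ins.
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

rng = np.random.default_rng(2)
S0, K, T, r, sigma = 100.0, 100.0, 0.25, 0.02, 0.25
v0 = bs_call(S0, K, T, r, sigma)

# step 1: scenarios for the underlying over a 1-day horizon
S1 = S0 * np.exp(rng.normal(0.0, sigma * sqrt(1 / 252), 20_000))
pnl = np.array([bs_call(s, K, T - 1 / 252, r, sigma) for s in S1]) - v0

# step 2: single-sum risk figure = loss quantile of the scenario P&L
var_99 = -np.quantile(pnl, 0.01)
print(f"1-day 99% VaR per option: {var_99:.3f}")
```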

Relevance:

60.00%

Publisher:

Abstract:

In this paper, we present a wavelet-based approach to solve the non-linear perturbation equation encountered in optical tomography. A particularly suitable data-gathering geometry is used to collect a data set consisting of differential changes in intensity owing to the presence of inhomogeneous regions. With this scheme, the unknown image, the data, as well as the weight matrix are all represented by wavelet expansions, thus yielding a representation of the original non-linear perturbation equation in the wavelet domain. The advantage of using the non-linear perturbation equation is that there is no need to recompute the derivatives during the entire reconstruction process: once the derivatives are computed, they are transformed into the wavelet domain. The purpose of going to the wavelet domain is its inherent localization and de-noising properties. Using the approximation coefficients without the detail coefficients is ideally suited to diffuse optical tomographic reconstruction, as the diffusion equation removes most of the high-frequency information and the reconstruction appears low-pass filtered. We demonstrate through numerical simulations that by solving for the approximation coefficients alone one can reconstruct an image with the same information content as the reconstruction from a non-waveletized procedure. In addition, we demonstrate better noise tolerance and a much reduced computation time for reconstructions using this approach.
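The core computational idea can be sketched as follows (assuming the PyWavelets package; the image and wavelet choice are illustrative): discard the detail coefficients, reconstruct from the approximation coefficients alone, and observe how little of a smooth, diffusion-like image is lost.

```python
# A minimal sketch of why approximation-only wavelet reconstruction
# suits diffuse optical tomography: the image is effectively low-pass,
# so dropping detail coefficients loses little while shrinking the
# problem size roughly fourfold per level.
import numpy as np
import pywt   # PyWavelets

# hypothetical smooth "absorption image" with one inhomogeneity
x = np.linspace(-1, 1, 64)
X, Y = np.meshgrid(x, x)
img = np.exp(-((X - 0.2) ** 2 + (Y + 0.1) ** 2) / 0.05)

cA, (cH, cV, cD) = pywt.dwt2(img, "db4")        # one-level 2-D DWT
approx_only = pywt.idwt2((cA, (None, None, None)), "db4")

err = np.linalg.norm(img - approx_only[:64, :64]) / np.linalg.norm(img)
print(f"coefficients kept: {cA.size}/{img.size}, relative error: {err:.3%}")
```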

Relevance:

60.00%

Publisher:

Abstract:

Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories, which must analyze for them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, an accurate-mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to create comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation the compounds were analyzed directly, without the need for hydrolysis, solvent transfer, evaporation or reconstitution. The hydrophilic interaction technique (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing performance comparable to or better than that of other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethyl starch (HES). The specificity of the assay was improved, since interfering matrix compounds were removed by size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. LC-TOFMS can therefore be exploited widely in doping control, reducing the need for several separate analysis techniques.
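As an illustration of the reverse-database-search criterion, the sketch below (reference masses are standard monoisotopic values; the measured reading and tolerance are hypothetical) computes a mass error in mDa for a protonated morphine candidate.

```python
# A minimal sketch of the accurate-mass matching step: compare a
# measured m/z against the theoretical monoisotopic mass of a candidate
# and express the error in mDa. Morphine ([M+H]+, C17H19NO3) is used
# as an illustrative database entry.
MONO = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}
PROTON = 1.00727646688

def mz_protonated(formula: dict) -> float:
    """Theoretical m/z of the [M+H]+ ion for a neutral formula."""
    return sum(MONO[el] * n for el, n in formula.items()) + PROTON

theo = mz_protonated({"C": 17, "H": 19, "N": 1, "O": 3})
measured = 286.1434                        # hypothetical TOF reading
error_mda = (measured - theo) * 1000.0
print(f"theoretical {theo:.4f}, error {error_mda:+.2f} mDa")
# a hit would require |error| below the screening tolerance (~1 mDa here),
# alongside matching retention time and isotope pattern
```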

Relevance:

60.00%

Publisher:

Abstract:

One of the significant advancements in Nuclear Magnetic Resonance (NMR) spectroscopy in combating the problem of spectral complexity when deriving structural and conformational information is the incorporation of an additional dimension, spreading the information content over a two-dimensional space. This approach, together with the manipulation of the dynamics of nuclear spins, permitted the design of appropriate pulse sequences, leading to the evolution of diverse multidimensional NMR experiments. The desired spectral information can now be extracted in a simplified and orchestrated manner. The indirect detection of multiple quantum (MQ) NMR frequencies is a step in this direction. The MQ technique has been extensively used in the study of molecules aligned in liquid crystalline media to reduce spectral complexity and to determine molecular geometries. Unlike in dipolar coupled systems, the network of scalar coupled spins in isotropic solutions is not large, and MQ 1H detection is not routinely employed, although there are specific examples of spin topology filtering. In this brief review, we discuss our recent studies on the development and application of multiple quantum correlation and resolved techniques for the analysis of proton NMR spectra of scalar coupled spins.

Relevance:

60.00%

Publisher:

Abstract:

A herbarium-based database (virtual herbarium) is a referral system for plants that maximizes the usefulness of the collections. The information content of such a database is essentially built on the voucher specimens that the herbarium has in its care. The present article reports on the construction of a 'virtual herbarium' for the state-wide collection of flowering plants in the Herbarium JCB housed at the Centre for Ecological Sciences, Indian Institute of Science, Bangalore, which is expected to be launched soon. The taxonomic data on each species include all information presented on the herbarium specimen label, namely species name, author citation, sub-species if any, variety if any, family, subfamily, collection number, location, date of collection, habitat and the collector's name. The data further comprise the 'flora' in which the species are described. Additional information includes the nomenclature update according to 'The Plant List', a detailed description, phenology, species distribution, threat status and comments on any special features of the taxon. Live images of the species provided in the database complete the information synergy on each species. This initiative is the first of its kind for herbaria in peninsular India.
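A minimal sketch of how the label-data schema described above might be represented as a record; the field names follow the article, while the class, values and defaults are illustrative only.

```python
# Hypothetical record type mirroring the specimen-label fields listed
# in the article; not the database's actual schema.
from dataclasses import dataclass, field

@dataclass
class SpecimenRecord:
    species: str                    # species name as on the label
    author_citation: str
    family: str
    subfamily: str
    collection_number: str
    location: str
    date_of_collection: str
    habitat: str
    collector: str
    flora: str = ""                 # the 'flora' describing the species
    nomenclature_update: str = ""   # per 'The Plant List'
    threat_status: str = ""
    image_files: list = field(default_factory=list)

rec = SpecimenRecord("Genus species", "Auth.", "Familyaceae", "Subfamilyoideae",
                     "JCB-00001", "Bangalore", "2012-06-15",
                     "dry deciduous scrub", "A. Collector")
print(rec.species, "-", rec.family)
```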

Relevance:

60.00%

Publisher:

Abstract:

When stimulated by a point source of cyclic AMP, a starved amoeba of Dictyostelium discoideum responds by putting out a hollow balloon-like membrane extension followed by a pseudopod. The effect of the stimulus is to influence the position on the cell where either of these protrusions is made, rather than to cause them to be made. Because the pseudopod forms perpendicular to the cell surface, its location is a measure of the precision with which the cell can locate the cAMP source. Cells beyond 1 h of starvation respond non-randomly, with a precision that improves steadily thereafter. A cell that has starved for 1-2 h can locate the source accurately 43% of the time; one starved for 6-7 h, 87% of the time. The response always has a high scatter; population-level heterogeneity reflects stochasticity in single-cell behaviour. From the angular distribution of the response, its maximum information content is estimated to be 2-3 bits. In summary, we quantitatively demonstrate the stochastic nature of the directional response and the increase in its accuracy over time.
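The information-content estimate can be sketched as an entropy calculation on the binned angular responses (the counts below are invented stand-ins, chosen only to show the arithmetic):

```python
# A minimal sketch of estimating directional information (in bits):
# the information gained is bounded by the entropy drop from a uniform
# prior over directions to the observed response histogram.
import numpy as np

bins = 12                                  # 30-degree angular bins
counts = np.array([80, 10, 3, 1, 1, 1, 1, 1, 1, 1, 2, 4], dtype=float)
p = counts / counts.sum()

h_uniform = np.log2(bins)                  # prior: direction unknown
h_response = -(p[p > 0] * np.log2(p[p > 0])).sum()
info_bits = h_uniform - h_response
print(f"information content: {info_bits:.2f} bits")
```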

Relevance:

60.00%

Publisher:

Abstract:

For many years, the International Herring Larvae Survey Program (IHLS) has been an important and internationally established survey program in the North Sea. The IHLS serves the calibration of stock abundance estimates based on information from the commercial fishery and the method of Integrated Catch Analysis (ICA), a specific derivative of Virtual Population Analysis (VPA). The IHLS database has meanwhile been transferred from Aberdeen to Kiel, and it has been agreed that the Institut für Meereskunde Kiel should continue to maintain this database and provide the abundance indices utilized by the ICES Herring Assessment Working Group as one of the means of assessing the state of the herring stock in the North Sea. For establishing the calculation procedure at Kiel, it was necessary to optimize both the survey design and the index calculation. This article gives an overview of the survey's history, its geography, the sampling design, the information content of the IHLS database and the various methods of calculating the different indices necessary for the calibration.

Relevance:

60.00%

Publisher:

Abstract:

A novel spectroscopy of trapped ions is proposed which will bring single-ion detection sensitivity to the observation of magnetic resonance spectra. The approaches developed here are aimed at resolving one of the fundamental problems of molecular spectroscopy, the apparent incompatibility in existing techniques between high information content (and therefore good species discrimination) and high sensitivity. Methods for studying both electron spin resonance (ESR) and nuclear magnetic resonance (NMR) are designed. They assume established methods for trapping ions in high magnetic field and observing the trapping frequencies with high resolution (<1 Hz) and sensitivity (single ion) by electrical means. The introduction of a magnetic bottle field gradient couples the spin and spatial motions together and leads to a small spin-dependent force on the ion, which has been exploited by Dehmelt to observe directly the perturbation of the ground-state electron's axial frequency by its spin magnetic moment.
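The size of that spin-dependent effect can be sketched with the standard continuous Stern-Gerlach estimate (textbook form, assumed here rather than quoted from the thesis): a bottle field $B_z \approx B_0 + B_2 z^2$ exerts the force $F_z = 2\mu_z B_2 z$ on an axial magnetic moment $\mu_z$, so the axial equation of motion $m\ddot{z} = -m\omega_z^2 z + 2\mu_z B_2 z$ yields a spin-dependent shift of the axial frequency

\[ \Delta\nu_z \approx \frac{\mu_z B_2}{4\pi^2 m\, \nu_z}, \]

whose sign follows the spin state; its smallness for nuclear moments and heavy ions is what motivates the amplification schemes described next.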

A series of fundamental innovations is described in order to extend magnetic resonance to the higher masses of molecular ions (100 amu ≈ 2×10^5 electron masses) and smaller magnetic moments (nuclear moments ≈ 10^(-3) of the electron moment). First, it is demonstrated how time-domain trapping frequency observations before and after magnetic resonance can be used to make cooling of the particle to its ground state unnecessary. Second, adiabatic cycling of the magnetic bottle off between detection periods is shown to be practical and to allow high-resolution magnetic resonance to be encoded pointwise as the presence or absence of trapping frequency shifts. Third, methods of inducing spin-dependent work on the ion orbits with magnetic field gradients and Larmor frequency irradiation are proposed which greatly amplify the attainable shifts in trapping frequency.

The dissertation explores the basic concepts behind ion trapping, adopting a variety of classical, semiclassical, numerical, and quantum mechanical approaches to derive spin-dependent effects, design experimental sequences, and corroborate results from one approach with those from another. The first proposal presented builds on Dehmelt's experiment by combining a "before and after" detection sequence with novel signal processing to reveal ESR spectra. A more powerful technique for ESR is then designed which uses axially synchronized spin transitions to perform spin-dependent work in the presence of a magnetic bottle, which also converts axial amplitude changes into cyclotron frequency shifts. A third use of the magnetic bottle is to selectively trap ions with small initial kinetic energy. A dechirping algorithm corrects for undesired frequency shifts associated with damping by the measurement process.

The most general approach presented is spin-locked internally resonant ion cyclotron excitation, a true continuous Stern-Gerlach effect. A magnetic field gradient modulated at both the Larmor and cyclotron frequencies is devised which leads to cyclotron acceleration proportional to the transverse magnetic moment of a coherent state of the particle and radiation field. A preferred method of using this to observe NMR as an axial frequency shift is described in detail. In the course of this derivation, a new quantum mechanical description of ion cyclotron resonance is presented which is easily combined with spin degrees of freedom to provide a full description of the proposals.

Practical, technical, and experimental issues surrounding the feasibility of the proposals are addressed throughout the dissertation. Numerical ion trajectory simulations and analytical models are used to predict the effectiveness of the new designs as well as their sensitivity and resolution. These checks on the methods proposed provide convincing evidence of their promise in extending the wealth of magnetic resonance information to the study of collisionless ions via single-ion spectroscopy.

Relevance:

60.00%

Publisher:

Abstract:

A central objective in signal processing is to infer meaningful information from a set of measurements or data. While most signal models have an overdetermined structure (fewer unknowns than equations), traditionally very few statistical estimation problems have considered a data model which is underdetermined (more unknowns than equations). In recent times, however, an explosion of theoretical and computational methods has been developed primarily to study underdetermined systems by imposing sparsity on the unknown variables. This is motivated by the observation that, in spite of the huge volume of data that arises in sensor networks, genomics, imaging, particle physics, web search, etc., the information content is often much smaller than the number of raw measurements. This has given rise to the possibility of reducing the number of measurements by downsampling the data, which automatically gives rise to underdetermined systems.

In this thesis, we provide new directions for estimation in an underdetermined system, both for a class of parameter estimation problems and also for the problem of sparse recovery in compressive sensing. There are two main contributions of the thesis: design of new sampling and statistical estimation algorithms for array processing, and development of improved guarantees for sparse reconstruction by introducing a statistical framework to the recovery problem.

We consider underdetermined observation models in array processing where the number of unknown sources simultaneously received by the array can be considerably larger than the number of physical sensors. We study new sparse spatial sampling schemes (array geometries) as well as propose new recovery algorithms that can exploit priors on the unknown signals and unambiguously identify all the sources. The proposed sampling structure is generic enough to be extended to multiple dimensions as well as to exploit different kinds of priors in the model such as correlation, higher order moments, etc.

Recognizing the role of correlation priors and suitable sampling schemes for underdetermined estimation in array processing, we introduce a correlation aware framework for recovering sparse support in compressive sensing. We show that it is possible to strictly increase the size of the recoverable sparse support using this framework provided the measurement matrix is suitably designed. The proposed nested and coprime arrays are shown to be appropriate candidates in this regard. We also provide new guarantees for convex and greedy formulations of the support recovery problem and demonstrate that it is possible to strictly improve upon existing guarantees.

This new paradigm of underdetermined estimation that explicitly establishes the fundamental interplay between sampling, statistical priors and the underlying sparsity, leads to exciting future research directions in a variety of application areas, and also gives rise to new questions that can lead to stand-alone theoretical results in their own right.
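A small numerical sketch of the coarray idea underlying the nested and coprime geometries (the two-level nested layout below is a standard construction, not necessarily the exact parameters used in the thesis): the difference coarray of N physical sensors can contain O(N^2) distinct virtual lags, which is what lets correlation-aware methods identify more sources than sensors.

```python
# A minimal sketch: a 2-level nested array with 6 physical sensors
# yields a filled difference coarray far longer than 6.
import numpy as np

n1, n2 = 3, 3                                   # inner/outer level sizes
inner = np.arange(1, n1 + 1)                    # positions 1..n1
outer = (n1 + 1) * np.arange(1, n2 + 1)         # positions (n1+1), ..., (n1+1)*n2
sensors = np.concatenate([inner, outer])        # 6 physical sensors

# all pairwise differences = virtual lags of the difference coarray
diffs = np.unique((sensors[:, None] - sensors[None, :]).ravel())
print(f"physical sensors: {sensors.size}, virtual lags: {diffs.size}")
print(f"contiguous lags: {diffs.min()}..{diffs.max()}")
```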

Relevance:

60.00%

Publisher:

Abstract:

In the first section of this thesis, two-dimensional properties of the human eye movement control system were studied. The vertical-horizontal interaction was investigated by using a two-dimensional target motion consisting of a sinusoid in one of the directions, vertical or horizontal, and low-pass filtered Gaussian random motion of variable bandwidth (and hence information content) in the orthogonal direction. It was found that the random motion reduced the efficiency of the sinusoidal tracking. However, the sinusoidal tracking was only slightly dependent on the bandwidth of the random motion. Thus the system should be thought of as consisting of two independent channels with a small amount of mutual cross-talk.

These target motions were then rotated to discover whether or not the system is capable of recognizing the two-component nature of the target motion. That is, the sinusoid was presented along an oblique line (neither vertical nor horizontal) with the random motion orthogonal to it. The system did not simply track the vertical and horizontal components of motion, but rotated its frame of reference so that its two tracking channels coincided with the directions of the two target motion components. This recognition occurred even when the two orthogonal motions were both random, but with different bandwidths.

In the second section, time delays, prediction and power spectra were examined. Time delays were calculated in response to various periodic signals, various bandwidths of narrow-band Gaussian random motions and sinusoids. It was demonstrated that prediction occurred only when the target motion was periodic, and only if the harmonic content was such that the signal was sufficiently narrow-band. It appears as if general periodic motions are split into predictive and non-predictive components.

For unpredictable motions, the relationship between the time delay and the average speed of the retinal image was linear. Based on this, I proposed a model explaining the time delays for both random and periodic motions. My experiments did not prove that the system is a sampled-data system, or that it is continuous. However, the model can be interpreted as representing a sampled-data system whose sampling interval is a function of the target motion.

It was shown that increasing the bandwidth of the low-pass filtered Gaussian random motion resulted in an increase of the eye movement bandwidth. Some properties of the eyeball-muscle dynamics and the extraocular muscle "active state tension" were derived.

Relevance:

60.00%

Publisher:

Abstract:

In a multi-target complex network, the links (L_ij) represent the interactions between drug d_i and target t_j, characterized by different experimental measures (K_i, K_m, IC50, etc.) obtained in pharmacological assays under diverse boundary conditions (c_j). In this work, we use Shannon entropy measures to develop a model encompassing a multi-target network of neuroprotective/neurotoxic compounds reported in the CHEMBL database. The model correctly predicts >8300 experimental outcomes, with Accuracy, Specificity, and Sensitivity above 80%-90% on training and external validation series. Indeed, the model can calculate different outcomes for >30 experimental measures in >400 different experimental protocols, in relation to >150 molecular and cellular targets on 11 different organisms (including human). We also report for the first time the synthesis, characterization, and experimental assays of a new series of chiral 1,2-rasagiline carbamate derivatives not reported in previous works. The experimental tests included: (1) assays in the absence of neurotoxic agents; (2) in the presence of glutamate; and (3) in the presence of H2O2. Lastly, we used the new Assessing Links with Moving Averages (ALMA)-entropy model to predict possible outcomes for the new compounds in a large number of pharmacological tests not carried out experimentally.
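As a sketch of the entropy ingredient in ALMA-type models (the protocols and counts below are invented; the actual model aggregates many more boundary conditions), each experimental condition can be assigned a Shannon entropy computed from its outcome frequencies, and deviations from moving averages over such conditions become model inputs.

```python
# A minimal sketch: Shannon entropy per assay protocol from
# hypothetical active/inactive outcome counts.
import numpy as np

def shannon_entropy(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -(p * np.log2(p)).sum()

# hypothetical outcome counts (active, inactive) per assay protocol
protocols = {"no neurotoxic agent": [120, 30],
             "glutamate toxicity": [45, 55],
             "H2O2 toxicity": [70, 20]}
for name, counts in protocols.items():
    print(f"{name}: H = {shannon_entropy(counts):.3f} bits")
```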