999 results for Amplification Techniques
Abstract:
The solubility of penciclovir (C10H15N5O3) in a novel film formulation designed for the treatment of cold sores was determined using X-ray, thermal, microscopic and release-rate techniques. The procedures yielded solubilities of 0.15–0.23, 0.44, 0.53 and 0.42% (w/w), respectively. Linear calibration lines were achieved for experimentally and theoretically determined differential scanning calorimetry (DSC) and X-ray powder diffractometry (XRPD) data. Intra- and inter-batch precision values were determined; intra-batch values were more precise. Microscopy was additionally useful for examining crystal shape, size distribution and homogeneity of drug distribution within the film. Whereas DSC also determined the melting point, XRPD identified polymorphs and release data provided the relevant kinetics.
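The linear calibration lines mentioned above can be sketched as a simple least-squares fit. The numbers below are purely illustrative placeholders, not data from the study; the response variable stands in for any quantitative DSC or XRPD signal.

```python
import numpy as np

# Hypothetical calibration data: known drug loadings (% w/w) against an
# instrument response (e.g. an integrated DSC endotherm or an XRPD peak
# area). Values are illustrative only.
loading = np.array([0.1, 0.2, 0.4, 0.6, 0.8])        # % w/w
response = np.array([0.52, 1.01, 2.03, 2.98, 4.05])  # arbitrary units

# Least-squares linear calibration: response = slope * loading + intercept
slope, intercept = np.polyfit(loading, response, 1)

def loading_from_response(r):
    """Invert the calibration line to estimate an unknown loading."""
    return (r - intercept) / slope

estimated = loading_from_response(2.5)  # loading of an unknown sample
```

Intra- versus inter-batch precision would then be assessed by comparing the scatter of replicate estimates within one batch against the scatter across batches.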
Abstract:
Optimal state estimation from given observations of a dynamical system by data assimilation is generally an ill-posed inverse problem. In order to solve the problem, a standard Tikhonov, or L2, regularization is used, based on certain statistical assumptions on the errors in the data. The regularization term constrains the estimate of the state to remain close to a prior estimate. In the presence of model error, this approach does not capture the initial state of the system accurately, as the initial state estimate is derived by minimizing the average error between the model predictions and the observations over a time window. Here we examine an alternative L1 regularization technique that has proved valuable in image processing. We show that for examples of flow with sharp fronts and shocks, the L1 regularization technique performs more accurately than standard L2 regularization.
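The contrast described above can be sketched on a toy one-dimensional problem (an assumed illustration, not the paper's assimilation system): denoising a signal with a sharp front using an L2 (Tikhonov) penalty on the gradient versus an L1, total-variation-like penalty, the latter approximated here by iteratively reweighted least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x_true = np.where(np.arange(n) < n // 2, 0.0, 1.0)   # step "front"
y = x_true + 0.1 * rng.standard_normal(n)            # noisy observations

# First-difference operator D, so D @ x approximates the gradient
D = np.diff(np.eye(n), axis=0)
lam = 1.0

# L2 (Tikhonov): closed-form minimiser of ||x - y||^2 + lam * ||D x||^2
x_l2 = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

# L1 penalty on D x, approximated by iteratively reweighted least squares:
# each pass solves a weighted L2 problem whose weights mimic |D x|
x_l1 = y.copy()
for _ in range(30):
    w = 1.0 / (np.abs(D @ x_l1) + 1e-3)              # IRLS weights
    x_l1 = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), y)

err_l2 = np.abs(x_l2 - x_true).max()
err_l1 = np.abs(x_l1 - x_true).max()
```

Because the quadratic penalty smears the jump while the L1 penalty tolerates it, the worst-case error near the front is typically much smaller for the L1 reconstruction, mirroring the behaviour reported for flows with sharp fronts and shocks.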
Abstract:
Although modern control techniques such as eigenstructure assignment have been given extensive coverage in the control literature, there is a reluctance to use them in practice, as they are often believed to be less 'visible' or less simple than classical methods. A simple aircraft example is used to show that eigenstructure assignment can easily produce a more viable controller than simple classical techniques.
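In the single-input case, eigenstructure assignment reduces to pole placement (the eigenvectors are then fixed by the chosen eigenvalues), which can be sketched with SciPy. The two-state model below is a hypothetical illustration, not the aircraft example from the paper.

```python
import numpy as np
from scipy.signal import place_poles

# Toy two-state linear model: x' = A x + B u, with state feedback u = -K x
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])

# Desired closed-loop eigenvalues: stable and well damped
desired = np.array([-2.0, -3.0])
K = place_poles(A, B, desired).gain_matrix

# Verify the closed-loop system A - B K has the assigned eigenvalues
closed_loop = A - B @ K
achieved = np.sort(np.linalg.eigvals(closed_loop).real)
```

With more inputs than one, there is freedom beyond the eigenvalues, and eigenstructure assignment uses it to shape the closed-loop eigenvectors as well, which is where its advantage over classical loop-at-a-time design appears.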
Abstract:
Grassland restoration is the dominant activity funded by agri-environment schemes (AES). However, the re-instatement of biodiversity and ecosystem services is limited by a number of severe abiotic and biotic constraints resulting from previous agricultural management. These appear to be less severe on ex-arable sites than on permanent grassland. We report findings of a large research programme into practical solutions to these constraints. The key abiotic constraint was high residual soil fertility, particularly phosphorus. This can most easily be addressed by targeting sites of low nutrient status. The chief biotic constraints were lack of propagules of desirable species and of suitable sites for their establishment. Addition of seed mixtures or green hay to gaps created by either mechanical disturbance or herbicide was the most effective means of overcoming these factors. Finally, manipulation of biotic interactions, including hemiparasitic plants to reduce competition from grasses and control of mollusc herbivory of sown species, significantly improved the effectiveness of these techniques.
Abstract:
This paper presents findings of our study of peer-reviewed papers published in the International Conference on Persuasive Technology from 2006 to 2010. Of the 44 systems reviewed, 23 were reported to be successful, 2 to be unsuccessful, and 19 did not specify whether or not they were successful. 56 different techniques were mentioned, and it was observed that most designers use ad hoc definitions for the techniques or methods used in design. Hence we propose that research is needed to establish unambiguous definitions of techniques and methods in the field.
Abstract:
A mechanism for amplification of mountain waves, and their associated drag, by parametric resonance is investigated using linear theory and numerical simulations. This mechanism, which is active when the Scorer parameter oscillates with height, was recently classified by previous authors as intrinsically nonlinear. Here it is shown that, if friction is included in the simplest possible form as a Rayleigh damping, and the solution to the Taylor-Goldstein equation is expanded in a power series of the amplitude of the Scorer parameter oscillation, linear theory can replicate the resonant amplification produced by numerical simulations with some accuracy. The drag is significantly altered by resonance in the vicinity of n/l_0 = 2, where l_0 is the unperturbed value of the Scorer parameter and n is the wavenumber of its oscillation. Depending on the phase of this oscillation, the drag may be substantially amplified or attenuated relative to its non-resonant value, displaying either single maxima or minima, or double extrema near n/l_0 = 2. Both non-hydrostatic effects and friction tend to reduce the magnitude of the drag extrema. However, in exactly inviscid conditions, the single drag maximum and minimum are suppressed. Because friction in the atmosphere is often small but non-zero outside the boundary layer, modelling of the drag amplification mechanism addressed here should be quite sensitive to the type of turbulence closure employed in numerical models, or to computational dissipation in nominally inviscid simulations.
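A minimal statement of the setup, using the standard notation of linear mountain-wave theory (assumed here, not reproduced from the paper): the vertical-velocity amplitude of a wave with horizontal wavenumber k satisfies a Taylor-Goldstein equation with an oscillating Scorer parameter,

```latex
\frac{d^{2}\hat{w}}{dz^{2}} + \left[\, l^{2}(z) - k^{2} \,\right]\hat{w} = 0,
\qquad
l^{2}(z) = l_0^{2}\left[\, 1 + a\cos(nz + \phi) \,\right].
```

For k much smaller than l_0 this has the form of Mathieu's equation, whose primary parametric resonance occurs when the modulation wavenumber is roughly twice the carrier wavenumber, i.e. near n/l_0 = 2, consistent with the location of the drag extrema described above; Rayleigh damping enters as a small imaginary correction that regularizes the resonance.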
Abstract:
A chiral bisurea-based superhydrogelator capable of forming supramolecular hydrogels at concentrations as low as 0.2 mM is reported. This soft material has been characterized by thermal studies, rheology, X-ray diffraction analysis, transmission electron microscopy (TEM), and various spectroscopic techniques (electronic and vibrational circular dichroism, FTIR and Raman spectroscopy). The expression of chirality at the molecular and supramolecular levels has been studied, and a clear amplification of chirality into the achiral analogue has been observed. Furthermore, thermal analysis showed that the hydrogelation of compound 1 is highly responsive to temperature, corresponding to an enthalpy-driven self-assembly process. These particular thermal characteristics make these materials easy to handle for soft-application technologies.
Abstract:
Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
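The data-partitioning idea behind most parallel mining approaches can be sketched in a few lines: split the dataset across workers, mine each partition independently, then merge the partial results. The "pattern" below is plain item frequency, a hypothetical stand-in for a real pattern miner.

```python
from collections import Counter
from multiprocessing import Pool

def local_counts(chunk):
    """Mine one data partition independently (runs in a worker process)."""
    return Counter(item for record in chunk for item in record)

def parallel_counts(records, n_workers=4):
    """Partition the data, map the local miner, then reduce by merging."""
    chunks = [records[i::n_workers] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        partials = pool.map(local_counts, chunks)
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    data = [["a", "b"], ["b", "c"], ["a", "b", "c"], ["b"]]
    print(parallel_counts(data)["b"])  # "b" occurs in all 4 records
```

The same map-then-merge structure carries over to distributed settings: only the partial summaries (here, small Counters) cross the network, not the raw data, which is what makes the approach scale in bandwidth as well as in processing time.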
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that, when authors of Reuters news articles provide keyphrases, they provide good ones, but more often than not they provide no keyphrases at all.
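The two best-performing baselines named above are simple enough to sketch directly. The toy corpus, whitespace tokenisation, and combined TF·IDF score below are illustrative assumptions, not the evaluation setup of the paper.

```python
import math

# Toy three-document corpus (illustrative only)
corpus = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "keyphrase extraction from news articles",
]

def tf(term, doc):
    """Term Frequency: how often the term occurs in this document."""
    words = doc.split()
    return words.count(term) / len(words)

def idf(term, docs):
    """Inverse Document Frequency: rarer across the corpus = higher."""
    df = sum(1 for d in docs if term in d.split())
    return math.log(len(docs) / df) if df else 0.0

# Score candidate keyphrases for one document by combined TF * IDF
doc = corpus[0]
scores = {w: tf(w, doc) * idf(w, corpus) for w in set(doc.split())}
```

On this tiny example, corpus-wide common words such as "the" score low despite their high frequency, while terms specific to the document score high, which is exactly the behaviour that makes these two baselines strong keyphrase selectors.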