40 results for simplicity


Relevance:

20.00%

Publisher:

Abstract:

The Marine Strategy Framework Directive (MSFD) requires that European Union Member States achieve "Good Environmental Status" (GES) in respect of 11 Descriptors of the marine environment by 2020. Of those, Descriptor 4, which focuses on marine food webs, is perhaps the most challenging to implement, since it is difficult to identify simple indicators able to assess the health of highly dynamic and complex interactions. Here, we present the proposed food web criteria/indicators and analyse their theoretical background and applicability in order to highlight both the current knowledge gaps and the difficulties associated with the assessment of GES. We conclude that the existing suite of indicators gives variable focus to the three important food web properties of structure, functioning and dynamics; more emphasis should be given to the latter two and to the general principles that relate all three. The development of food web indicators should be directed towards more integrative and process-based indicators, with an emphasis on their responsiveness to multiple anthropogenic pressures.

Relevance:

10.00%

Publisher:

Abstract:

The study of catalytic behavior begins with one seemingly simple process, namely the hydrogenation of O to H2O on platinum. Despite its apparent simplicity, the mechanism has been much debated. We have used density functional theory with gradient corrections to examine microscopic reaction pathways for several elementary steps implicated in this fundamental catalytic process. We find that H2O formation from chemisorbed O and H atoms is a highly activated process. The largest barrier along this route, with a value of ~1 eV, is the addition of the first H to O to produce OH. Once formed, however, OH groups are easily hydrogenated to H2O with a barrier of ~0.2 eV. Disproportionation reactions with 1:1 and 2:1 stoichiometries of H2O and O have been examined as alternative routes for OH formation. Both stoichiometries of reaction produce OH groups with barriers that are much lower than that associated with the O + H reaction. H2O, therefore, acts as an autocatalyst in the overall H2O formation process. Disproportionation with a 2:1 stoichiometry is thermodynamically and kinetically favored over disproportionation with a 1:1 stoichiometry. This highlights an additional (promotional) role of the second H2O molecule in this process. In support of our previous suggestion that the key intermediate in the low-temperature H2O formation reaction is a mixed OH and H2O overlayer, we find that there is a very large barrier for the dissociation of the second H2O molecule in the 2:1 disproportionation process. We suggest that the proposed intermediate is then hydrogenated to H2O through a very facile proton transfer mechanism.
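
As a rough illustration of what these barrier heights imply kinetically, a simple Arrhenius estimate compares the two elementary steps; the common prefactor is a hypothetical value (not from the paper), so only the ratio of the two rates is meaningful:

```python
import math

KB_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def rate(barrier_ev: float, temp_k: float, prefactor: float = 1e13) -> float:
    """Arrhenius rate k = A * exp(-Ea / (kB * T)); A is an assumed value."""
    return prefactor * math.exp(-barrier_ev / (KB_EV * temp_k))

t = 300.0                # room temperature, K
k_oh = rate(1.0, t)      # O + H -> OH, the ~1 eV rate-limiting step
k_h2o = rate(0.2, t)     # OH + H -> H2O, the ~0.2 eV facile step

ratio = k_h2o / k_oh
print(f"OH hydrogenation is ~{ratio:.2e}x faster at {t:.0f} K")
```

The exponential sensitivity to the barrier is the point: a 0.8 eV difference spans many orders of magnitude at room temperature, which is why the O + H step dominates the overall kinetics.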

Relevance:

10.00%

Publisher:

Abstract:

The use of blood spot collection cards is a simple way to obtain specimens for the analysis of drugs for the purposes of therapeutic drug monitoring, assessing adherence to medications and preventing toxicity in routine clinical settings. We describe the development and validation of a microanalytical technique for the determination of metformin from dried blood spots. The method is based on reversed-phase high-performance liquid chromatography with ultraviolet detection. Drug recovery in the developed method was found to be more than 84%. The limits of detection and quantification were calculated to be 90 and 150 ng/ml, respectively. The intraday and interday precision (measured by CV%) was always less than 9%. The accuracy (measured by relative error, %) was always less than 12%. Stability analysis showed that metformin is stable for at least 2 months when stored at -70 °C. The small volume of blood required (10 μL), combined with the simplicity of the analytical technique, makes this a useful procedure for monitoring metformin concentrations in routine clinical settings. The method is currently being applied to the analysis of blood spots taken from diabetic patients to assess adherence to medications and the relationship between metformin levels and metabolic control of diabetes.
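
The precision (CV%) and accuracy (relative error %) figures of merit quoted above can be sketched in a few lines; the nominal concentration and replicate readings below are invented for illustration, not data from the validation:

```python
import statistics

# Hypothetical replicate readings (ng/ml) at one QC level.
nominal = 500.0
replicates = [492.0, 510.0, 487.0, 505.0, 498.0, 515.0]

mean = statistics.mean(replicates)
cv_pct = 100.0 * statistics.stdev(replicates) / mean       # precision (CV%)
rel_err_pct = 100.0 * abs(mean - nominal) / nominal        # accuracy (RE%)

print(f"CV = {cv_pct:.1f}%, relative error = {rel_err_pct:.1f}%")
```

An assay level would pass the criteria quoted in the abstract when CV% stays below 9 and the relative error below 12.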

Relevance:

10.00%

Publisher:

Abstract:

Relative Evidential Supports (RES) was developed and justified several years ago as a non-numeric apparatus that allows us to compare evidential supports for alternative conclusions when making a decision. An extension of the RES concept of pairwise balancing and trading-off of evidence, called Graded Relative Evidence (GRE), is reported here; it keeps the basic features of simplicity and perspicacity but enriches modelling fidelity by permitting very modest and intuitive variations in degrees of outweighing (which the essentially binary RES does not). The formal justification is based simply on linkages to RES and to the Dempster-Shafer theory of evidence. The use of this simple extension is illustrated, and to a small degree further justified empirically, by application to a topical scientific debate about what is here called the Congo Crossover Conjecture. This decision-making instance is chosen because of the wealth of evidence that has been accumulated on both sides of the debate and the range of evidence strengths manifested in it. The conjecture is that AIDS arose in the late 1950s in the Congo, when a vaccine for polio was allegedly cultivated in the kidneys of chimpanzees, allowing the AIDS infection to cross over to humans from primates.
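
Since GRE is formally justified via a link to the Dempster-Shafer theory, a minimal sketch of Dempster's rule of combination may help; the hypothesis names and mass assignments below are purely illustrative and are not taken from the debate:

```python
from itertools import product

def combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule: normalised conjunctive combination of two mass functions,
    each mapping frozenset hypotheses to belief mass."""
    raw = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + x * y
        else:
            conflict += x * y          # mass assigned to disjoint hypotheses
    k = 1.0 - conflict                 # normalisation constant
    return {s: v / k for s, v in raw.items()}

# Two independent (hypothetical) pieces of evidence about a conjecture.
H, NOT_H = frozenset({"h"}), frozenset({"not_h"})
THETA = H | NOT_H                      # the frame of discernment
m1 = {H: 0.6, THETA: 0.4}              # evidence mildly supporting H
m2 = {NOT_H: 0.3, THETA: 0.7}          # weaker evidence against H

m12 = combine(m1, m2)
print({tuple(sorted(s)): round(v, 3) for s, v in m12.items()})
```

After combination the mass on H outweighs that on not-H, mirroring the kind of pairwise balancing of unequal evidence strengths that GRE grades.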

Relevance:

10.00%

Publisher:

Abstract:

Density functional calculations, using the B3LYP/6-31G(d) method, have been used to investigate the conformations and vibrational (Raman) spectra of a series of long-chain, saturated fatty acid methyl esters (FAMEs) with the formula CnH2nO2 (n = 5-21) and two series of unsaturated FAMEs. The calculations showed that the lowest-energy conformer within the saturated FAMEs is the simple (all-trans) structure and, in general, it was possible to reproduce experimental data using calculations on only the all-trans conformer. The only exception was C6H12O2, where a second low-lying conformer had to be included in order to correctly simulate the experimental Raman spectrum. The objective of the work was to provide theoretical justification for the methods that are commonly used to determine the properties of fats and oils, such as chain length and degree of unsaturation, from experimental Raman data. Here it is shown that the calculations reproduce the trends and calibration curves that are found experimentally and also allow the reasons for the failure of what would appear to be rational measurements to be understood. This work shows that although the assumption that each FAME can simply be treated as a collection of functional groups can be justified in some cases, many of the vibrational modes are complex motions of large sections of the molecules and thus would not be expected to show simple linear trends with changes in structure, such as increasing chain length and/or unsaturation. Simple linear trends obtained from experimental data may thus arise from cancellation of opposing effects, rather than reflecting an underlying simplicity.

Relevance:

10.00%

Publisher:

Abstract:

In his provocative article, F. Mechsner (2004) advances the thesis that human voluntary movements are subject to "psychological" or "perceptual-cognitive" control and are thus organized "without regard to efferent patterns" (p. 355). Rather than considering in detail the experiments that he proffered by way of support, the present author discusses the degree to which that supposition has appeal on the grounds of simplicity and is defined in terms that are compatible with a unified science.

Relevance:

10.00%

Publisher:

Abstract:

A novel 3rd-order compact E-plane ridge waveguide filter is presented. Miniaturization is achieved by introducing a configuration of parallel-coupled E-plane ridge waveguide resonators. Furthermore, the proposed filter allows for transmission zeros at finite frequencies. The fabrication simplicity and mass producibility of standard E-plane filters are maintained. Numerical and experimental results are presented to validate the proposed configuration. A miniaturization factor of 2 and a very sharp upper cutoff are achieved.

Relevance:

10.00%

Publisher:

Abstract:

The United States Supreme Court case of 1991, Feist Publications, Inc. v. Rural Tel. Service Co., continues to be highly significant for property in data and databases, but remains poorly understood. The approach taken in this article contrasts with previous studies. It focuses upon the “not original” rather than the original. The delineation of the absence of a modicum of creativity in selection, coordination, and arrangement of data as a component of the not original forms a pivotal point in the Supreme Court decision. The author also aims at elucidation rather than critique, using close textual exegesis of the Supreme Court decision. The results of the exegesis are translated into a more formal logical form to enhance clarity and rigor.


The insufficiently creative is initially characterized as “so mechanical or routine.” Mechanical and routine are understood in their ordinary discourse senses, as a conjunction or as connected by AND, and as the central clause. Subsequent clauses amplify the senses of mechanical and routine without disturbing their conjunction.


The delineation of the absence of a modicum of creativity can be correlated with classic conceptions of computability. The insufficiently creative can then be understood as a routine selection, coordination, or arrangement produced by an automatic mechanical procedure or algorithm. An understanding of a modicum of creativity and of copyright law is also indicated.


The value of the exegesis and interpretation is identified as its final simplicity, clarity, comprehensiveness, and potential practical utility.

Relevance:

10.00%

Publisher:

Abstract:

Value-at-risk (VaR) forecasting generally relies on a parametric density function of portfolio returns that ignores higher moments or assumes them constant. In this paper, we propose a simple approach to forecasting portfolio VaR. We employ the Gram-Charlier expansion (GCE), augmenting the standard normal distribution with the first four moments, which are allowed to vary over time. In an extensive empirical study, we compare the GCE approach to other models of VaR forecasting and conclude that it provides accurate and robust estimates of the realized VaR. In spite of its simplicity, on our dataset the GCE outperforms estimates generated by both constant and time-varying higher-moment models.
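
A minimal sketch of the quantile calculation behind a GCE-based VaR, assuming fixed (not time-varying) skewness s and kurtosis k: the standard normal density is augmented with Hermite-polynomial correction terms and the alpha-quantile is found by bisection on the resulting CDF. The moment values are illustrative, not estimates from the paper's dataset:

```python
import math

def phi(x):   # standard normal pdf
    return math.exp(-0.5 * x * x) / math.sqrt(2 * math.pi)

def Phi(x):   # standard normal cdf
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def gce_cdf(x, s, k):
    """CDF of the GCE density phi(x)*[1 + s/6*He3(x) + (k-3)/24*He4(x)],
    using int(phi*He_n) = -phi*He_{n-1}."""
    he2 = x * x - 1
    he3 = x ** 3 - 3 * x
    return Phi(x) - phi(x) * (s / 6 * he2 + (k - 3) / 24 * he3)

def gce_var(alpha, s, k, lo=-10.0, hi=10.0):
    """alpha-quantile of the GCE distribution by bisection."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if gce_cdf(mid, s, k) < alpha:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Negative skew and fat tails push the 1% quantile below the Gaussian one.
q_gauss = gce_var(0.01, s=0.0, k=3.0)   # reduces to the normal quantile
q_gce = gce_var(0.01, s=-0.5, k=5.0)    # illustrative non-normal moments
print(f"1% quantile: normal {q_gauss:.3f}, GCE {q_gce:.3f}")
```

Note that for extreme skewness/kurtosis values the raw GCE density can go negative; production implementations typically restrict or transform the moment parameters, a refinement omitted from this sketch.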

Relevance:

10.00%

Publisher:

Abstract:

Hypothetical contingent valuation surveys used to elicit values for environmental and other public goods often employ variants of the referendum mechanism because of the cognitive simplicity and familiarity of respondents with this voting format. One variant, the double referendum mechanism, requires respondents to state twice how they would vote on a given policy proposal, given their cost of the good. Data from these surveys often exhibit anomalies inconsistent with standard economic models of consumer preferences. There are a number of published explanations for these anomalies, mostly focusing on problems with the second vote. This article investigates which aspects of the hypothetical task affect the degree of non-demand revelation and takes an individual-based approach to identifying the people most likely to non-demand reveal. A clear profile emerges from our model of a person who faces a negative surplus (i.e., a net loss) in the second vote and invokes non-self-interested, non-financial motivations during the decision process.

Relevance:

10.00%

Publisher:

Abstract:

An enzyme-linked immunosorbent assay (ELISA) and a surface plasmon resonance (SPR) biosensor assay for the detection of paralytic shellfish poisoning (PSP) toxins were developed and a comparative evaluation was performed. A polyclonal antibody (BC67), used in both assay formats, was raised to saxitoxin–jeffamine–BSA in New Zealand white rabbits. Each assay format was designed as an inhibition assay. Shellfish samples (n = 54) were evaluated by each method using two simple rapid extraction procedures and compared to the AOAC high-performance liquid chromatography (HPLC) method and the mouse bioassay (MBA). The results of each assay format were comparable with the HPLC and MBA methods and demonstrate that an antibody with high sensitivity and broad specificity to PSP toxins can be applied to different immunological techniques. The method of choice will depend on the end-user's needs. The reduced manual labor and simplicity of operation of the SPR biosensor compared to the ELISA, the ease of sample extraction, and the superior real-time semi-quantitative analysis are key features that could make this technology applicable in a high-throughput monitoring unit.
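
In an inhibition (competitive) format, unknowns are read off a standard curve on which higher toxin concentration gives a lower normalised signal. The sketch below interpolates log-concentration between bracketing standards; all B/B0 values and concentrations are invented for illustration and are not the paper's calibration:

```python
import math

# Hypothetical inhibition standard curve: (saxitoxin ng/ml, B/B0).
standards = [
    (0.01, 0.98), (0.1, 0.90), (1.0, 0.55), (10.0, 0.20), (100.0, 0.05),
]

def conc_from_response(b_b0: float) -> float:
    """Linearly interpolate log10(concentration) between bracketing standards."""
    pts = sorted(standards, key=lambda p: p[1])     # ascending response
    for (c_hi, r_lo), (c_lo, r_hi) in zip(pts, pts[1:]):
        if r_lo <= b_b0 <= r_hi:
            frac = (b_b0 - r_lo) / (r_hi - r_lo)
            lg = math.log10(c_hi) + frac * (math.log10(c_lo) - math.log10(c_hi))
            return 10 ** lg
    raise ValueError("response outside standard curve")

print(f"sample at B/B0 = 0.55 -> {conc_from_response(0.55):.2f} ng/ml")
```

Routine practice would replace the piecewise interpolation with a four-parameter logistic fit, but the read-off logic (normalised response in, concentration out) is the same in both assay formats.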

Relevance:

10.00%

Publisher:

Abstract:

Background

Biomedical researchers are now often faced with situations where it is necessary to test a large number of hypotheses simultaneously, e.g., in comparative gene expression studies using high-throughput microarray technology. To properly control false-positive errors, the FDR (false discovery rate) approach has become widely used in multiple testing. Accurate estimation of the FDR requires that the proportion of true null hypotheses be accurately estimated. To date, many methods for estimating this quantity have been proposed. Typically, when a new method is introduced, some simulations are carried out to show its improved accuracy. However, these simulations are often very limited, covering only a few points in the parameter space.

Results

Here I have carried out extensive in silico experiments to compare some commonly used methods for estimating the proportion of true null hypotheses. The coverage of these simulations is unprecedentedly thorough over the parameter space compared to typical simulation studies in the literature. This work thus enables us to draw global conclusions about the performance of these different methods. It was found that a very simple method gives the most accurate estimation over a dominantly large area of the parameter space. Given its simplicity and its overall superior accuracy, I recommend its use as the first choice for estimating the proportion of true null hypotheses in multiple testing.
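
The abstract does not name the recommended method, but one well-known member of the "very simple" family is Storey's lambda estimator, sketched here on simulated p-values (80% true nulls, drawn uniform; 20% alternatives concentrated near zero):

```python
import random

random.seed(1)
m = 10000
# Null p-values are Uniform(0,1); alternative p-values are pushed toward 0.
pvals = [random.random() for _ in range(int(0.8 * m))] + \
        [random.random() ** 4 for _ in range(int(0.2 * m))]

def pi0_estimate(pvalues, lam=0.5):
    """Storey-style estimator: pi0 = #{p > lambda} / (m * (1 - lambda)).
    Above lambda the histogram is dominated by nulls, whose density is 1."""
    n = len(pvalues)
    return sum(p > lam for p in pvalues) / (n * (1 - lam))

print(f"estimated pi0 = {pi0_estimate(pvals):.3f} (true value 0.8)")
```

The estimator is slightly biased upward because some alternative p-values still fall above lambda; that conservatism is usually acceptable since it keeps the resulting FDR estimates conservative too.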

Relevance:

10.00%

Publisher:

Abstract:

Nonlinear principal component analysis (PCA) based on neural networks has drawn significant attention as a monitoring tool for complex nonlinear processes, but there remains a difficulty with determining the optimal network topology. This paper exploits the advantages of the Fast Recursive Algorithm, where the number of nodes, the location of centres, and the weights between the hidden layer and the output layer can be identified simultaneously for the radial basis function (RBF) networks. The topology problem for the nonlinear PCA based on neural networks can thus be solved. Another problem with nonlinear PCA is that the derived nonlinear scores may not be statistically independent or follow a simple parametric distribution. This hinders its applications in process monitoring since the simplicity of applying predetermined probability distribution functions is lost. This paper proposes the use of a support vector data description and shows that transforming the nonlinear principal components into a feature space allows a simple statistical inference. Results from both simulated and industrial data confirm the efficacy of the proposed method for solving nonlinear principal component problems, compared with linear PCA and kernel PCA.

Relevance:

10.00%

Publisher:

Abstract:

Inorganic polyphosphate (polyP) is increasingly being recognized as an important phosphorus sink within the environment, playing a central role in phosphorus exchange and phosphogenesis. Yet despite the significant advances made in polyP research, there is a lack of rapid and efficient analytical approaches for the quantification of polyP accumulation in microbial cultures and environmental samples. A major drawback is the need to extract polyP from cells prior to analysis. Owing to extraction inefficiencies, this can lead to an underestimation of both intracellular polyP levels and its environmental pool size: we observed 23-58% loss of polyP using standard solutions and current protocols. Here we report a direct fluorescence-based DAPI assay system which removes the requirement for polyP extraction prior to quantification. This increased the efficiency of polyP detection by 28-55% in microbial cultures, suggesting quantitative measurement of the intracellular polyP pool. It provides a direct polyP assay that combines quantification capability with technical simplicity. This is an important step forward in our ability to explore the role of polyP in cellular biology and biogeochemical nutrient cycling.
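
Quantifying polyP against a fluorescence standard curve can be sketched as an ordinary least-squares fit followed by an inverse read-off; every number below is invented to illustrate the workflow, not data from the study:

```python
# Hypothetical DAPI-fluorescence standard curve.
conc = [0.0, 5.0, 10.0, 20.0, 40.0]          # polyP standards, ug/ml
signal = [12.0, 118.0, 221.0, 430.0, 845.0]  # fluorescence, arbitrary units

# Ordinary least-squares line: signal = slope * conc + intercept.
n = len(conc)
mean_x = sum(conc) / n
mean_y = sum(signal) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, signal)) / \
        sum((x - mean_x) ** 2 for x in conc)
intercept = mean_y - slope * mean_x

def polyp_conc(fluorescence: float) -> float:
    """Read an unknown sample's polyP concentration off the fitted line."""
    return (fluorescence - intercept) / slope

print(f"sample at 500 a.u. -> {polyp_conc(500.0):.1f} ug/ml")
```

Because the assay is direct, the same curve applies to intact-cell samples, which is what removes the extraction-loss bias described above.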