938 results for Quasi-analytical algorithms


Relevance: 20.00%

Abstract:

Millions of blood products are transfused every year; many lives thus depend directly on transfusion. The three main labile blood products used in transfusion are erythrocyte concentrates, platelet concentrates and fresh frozen plasma. Each of these products has to be stored under conditions suited to its particular components. During storage, however, these components may be modified or degraded; such changes are known as storage lesions. Thus, the discovery of biomarkers of in vivo blood aging, as well as of in vitro storage lesions in labile blood products, is of high interest to the transfusion medicine community. Pre-analytical issues are of major importance when analyzing the various blood products under different storage conditions and across the various preparation protocols currently used in blood banks. This paper reviews the key elements that have to be taken into account in the context of proteomic-based biomarker discovery applied to blood banking.

Relevance: 20.00%

Abstract:

This paper aims at reconsidering some analytical measures so as to best encapsulate the interlanguage, in writing, of young beginner learners of English as a foreign language, in the light of previous and work-in-progress research conducted within the BAF project; in particular, it asks whether clause and sentence length is best viewed as a fluency measure, as a syntactic complexity measure, or as part of a different construct. In the light of a factor analysis (Navés, forthcoming) and multivariate and correlation studies (Navés et al. 2003, Navés, 2006, Torres et al. 2006), it becomes clear that the relationship between different analytical measures also depends on the learner's cognitive maturity (age) and proficiency (amount of instruction). Finally, clause and sentence length should not be viewed as either a fluency or a syntactic complexity measure but as part of a different construct. It is concluded that further research using regression analysis and cluster analysis is needed in order to identify and validate the constructs of the writing components and their measurements.
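
As a rough illustration of the correlational and factor-analytic reasoning described above, the sketch below computes a correlation matrix and the leading principal component for a handful of hypothetical writing measures. The variable names and values are invented, and PCA on the correlation matrix stands in here for the factor analysis cited; it is a sketch of the kind of analysis, not the study's actual procedure.

```python
import numpy as np

# Hypothetical per-learner writing measures (rows = learners); the values and
# variable names are invented for illustration only.
rng = np.random.default_rng(0)
n = 50
words_per_clause = rng.normal(6.0, 1.0, n)
clauses_per_sentence = rng.normal(1.8, 0.3, n)
words_per_sentence = words_per_clause * clauses_per_sentence
error_free_ratio = np.clip(rng.normal(0.6, 0.15, n), 0, 1)

data = np.column_stack([words_per_sentence, words_per_clause,
                        clauses_per_sentence, error_free_ratio])

# Pearson correlations between the measures
corr = np.corrcoef(data, rowvar=False)
print(np.round(corr, 2))

# Leading principal component of the correlation matrix: a crude stand-in for
# asking whether the measures load on one construct or on several.
eigvals, eigvecs = np.linalg.eigh(corr)
print("variance explained by first component:", eigvals[-1] / eigvals.sum())
print("loadings:", np.round(eigvecs[:, -1], 2))
```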

Relevance: 20.00%

Abstract:

We revisit the analytical properties of the static quasi-photon polarizability function for an electron gas at finite temperature, in connection with the existence of Friedel oscillations in the potential created by an impurity. In contrast with the zero-temperature case, where the polarizability is an analytical function except for the two branch cuts responsible for the Friedel oscillations, at finite temperature the corresponding function is non-analytical, despite being continuous everywhere on the complex plane. As a result, the oscillatory behavior of the potential survives. We calculate the potential at large distances and relate the calculation to the non-analytical properties of the polarizability.
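
For orientation, the standard textbook result (quoted here for context, not taken from this abstract, and with conventions that may differ from the paper's) is that at zero temperature the static Lindhard polarizability of a three-dimensional electron gas is singular at q = 2k_F, and it is this singularity that produces the algebraically decaying, oscillating screened potential around an impurity:

```latex
% Zero-temperature Lindhard form and the resulting Friedel oscillations
% (standard 3D results, quoted for context only).
\chi_0(q) \;\propto\; \frac{1}{2}
  + \frac{4k_F^2 - q^2}{8 k_F q}\,
    \ln\left|\frac{2k_F + q}{2k_F - q}\right|,
\qquad
V(r) \;\sim\; \frac{\cos(2 k_F r)}{r^3}
\quad (r \to \infty).
```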

Relevance: 20.00%

Abstract:

For the last two decades, supertree reconstruction has been an active field of research and has seen the development of a large number of major algorithms. Because of the growing popularity of supertree methods, it has become necessary to evaluate the performance of these algorithms and to determine which are the best options (especially with regard to the widely used supermatrix approach). In this study, seven of the most commonly used supertree methods are investigated using a large empirical data set (in terms of number of taxa and molecular markers) from the worldwide flowering plant family Sapindaceae. Supertree methods were evaluated using several criteria: similarity of the supertrees to the input trees, similarity between the supertrees and the total evidence tree, level of resolution of the supertree, and computational time required by the algorithm. Additional analyses were also conducted on a reduced data set to test whether performance was affected by the heuristic searches rather than by the algorithms themselves. Based on our results, two main groups of supertree methods were identified: on the one hand, the matrix representation with parsimony (MRP), MinFlip, and MinCut methods performed well according to our criteria; on the other hand, the average consensus, split fit, and most similar supertree methods performed more poorly or at least did not behave in the same way as the total evidence tree. Results for the super distance matrix, the most recent approach tested here, were promising, with at least one derived method performing as well as MRP, MinFlip, and MinCut. The output of each method improved only slightly when applied to the reduced data set, suggesting correct behavior of the heuristic searches and a relatively low sensitivity of the algorithms to data set size and missing data. Results also showed that MRP analyses could reach a high level of quality even when using a simple heuristic search strategy, with the exception of MRP with the Purvis coding scheme and reversible parsimony. The future of supertrees lies in the implementation of a standardized heuristic search for all methods and in increased computing power to handle large data sets. The latter would be particularly useful for promising approaches such as the maximum quartet fit method, which still requires substantial computing power.
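
To make the MRP approach mentioned above concrete, the sketch below implements the standard Baum/Ragan matrix representation coding: each clade of each source tree becomes one binary character, with taxa absent from that tree scored as missing. The toy trees and the helper function are invented for illustration, not taken from the study.

```python
# Sketch of the Baum/Ragan coding behind matrix representation with parsimony
# (MRP). Each source tree is given as (taxa sampled in the tree, list of its
# non-trivial clades); real trees would be parsed from Newick strings.
# Taxa inside a clade are scored "1", other taxa sampled in that tree "0",
# and taxa absent from that tree "?" (missing data).

def mrp_matrix(source_trees):
    """source_trees: list of (taxa_in_tree, clades) pairs."""
    taxa = sorted({t for tree_taxa, _ in source_trees for t in tree_taxa})
    matrix = {t: [] for t in taxa}
    for tree_taxa, clades in source_trees:
        for clade in clades:
            for t in taxa:
                if t not in tree_taxa:
                    matrix[t].append("?")
                else:
                    matrix[t].append("1" if t in clade else "0")
    return matrix

# Two toy source trees: (((A,B),C),D) and (((B,C),D),E)
trees = [
    ({"A", "B", "C", "D"}, [frozenset({"A", "B"}), frozenset({"A", "B", "C"})]),
    ({"B", "C", "D", "E"}, [frozenset({"B", "C"}), frozenset({"B", "C", "D"})]),
]
for taxon, row in sorted(mrp_matrix(trees).items()):
    print(taxon, "".join(row))
```

The resulting matrix would then be analysed with a standard parsimony program to yield the supertree; coding variants such as the Purvis scheme mentioned above change only how the characters are scored, and the parsimony search itself is outside the scope of this sketch.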

Relevance: 20.00%

Abstract:

The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to the risk assessment of ENMs, encompassing the key parameters to characterise ENMs, appropriate methods of analysis, and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Due to the high batch variability of the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for the selection of ENMs could be helpful, such as relevance for mechanistic (scientific) studies or risk-assessment-based studies, widespread availability (and thus high expected volumes of use), or consumer concern (route of consumer exposure depending on application). The OECD priority list focuses on the validity of OECD tests, so source material will be first in scope for testing; for risk assessment, however, it is much more relevant to have toxicity data for the material as present in the products and matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods are only able to determine one single characteristic, and some of them can be rather expensive; in practice, it is currently not feasible to fully characterise ENMs. Many techniques that are available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs), and it was recommended that at least two complementary techniques be employed to determine a metric of ENMs. The first great challenge is to prioritise metrics that are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach or protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that protocols be exchanged; the precise methods used to disperse ENMs should be specifically, yet succinctly, described in the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with the current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.

Relevance: 20.00%

Abstract:

The applicability, repeatability, and ability of different analytical methods to discriminate between oil samples with different degrees of oxidation were evaluated using oils collected from continuous frying processes in several Spanish companies. The aim of this work was to find methods complementary to the determination of the acid value for the routine quality control of the frying oils used in these companies. Optimizing the determination of the dielectric constant clearly improved its variability. Nevertheless, with the exception of the ATB index, the remaining methods tested showed lower variability. The determination of the ATB index was discarded because its sensitivity was insufficient to discriminate between oils with different degrees of oxidation. The different alteration parameters determined in the frying oils showed significant correlations between the acid value and several oxidation parameters, such as the dielectric constant, the p-anisidine value, ultraviolet absorption, and the triacylglycerol polymer content. The acid value only assesses hydrolytic alteration, so these parameters provide complementary information by assessing thermo-oxidative alteration.
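
The correlation analysis described above could be reproduced, in outline, with a few lines of pandas; the column names mirror the parameters discussed, but the values below are invented placeholders, not data from the study.

```python
import pandas as pd

# Hypothetical measurements for a handful of frying-oil samples (values invented).
oils = pd.DataFrame({
    "acid_value":        [0.5, 1.2, 2.1, 3.0, 4.2],
    "dielectric_const":  [3.1, 3.4, 3.9, 4.3, 4.8],
    "p_anisidine":       [12,  25,  41,  60,  78],
    "uv_absorption":     [0.8, 1.5, 2.4, 3.1, 4.0],
    "tag_polymers_pct":  [1.0, 2.5, 4.8, 7.0, 9.5],
})

# Pairwise Pearson correlations between the acid value and the oxidation
# parameters (DataFrame.corr defaults to Pearson correlation).
print(oils.corr().round(2)["acid_value"])
```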

Relevance: 20.00%

Abstract:

Experimental quasi-two-dimensional Zn electrodeposits are grown under forced convection conditions. Large-scale effects, with preferential growth towards the impinging flow, together with small-scale roughness-suppression effects, are evidenced and analyzed separately by using two different radial cell configurations. Interpretations are given in terms of primary concepts concerning current and concentration distributions.

Relevance: 20.00%

Abstract:

A precise and simple computational model to generate well-behaved two-dimensional turbulent flows is presented. The whole approach rests on the use of stochastic differential equations and is general enough to reproduce a variety of energy spectra and spatiotemporal correlation functions. Analytical expressions for both the continuous and the discrete versions, together with simulation algorithms, are derived. Results for two relevant spectra, covering distinct ranges of wave numbers, are given.
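
A minimal sketch of the kind of construction the abstract describes is given below: each Fourier mode of a stream function evolves as an independent Ornstein-Uhlenbeck process whose stationary variance is proportional to a target spectrum, and an incompressible velocity field is obtained by spectral differentiation. The grid size, time step, correlation time, and the Gaussian-shaped spectrum are assumed for illustration and need not match the paper's scheme.

```python
import numpy as np

# Minimal sketch (not the paper's exact scheme): each Fourier mode of a stream
# function psi follows an independent Ornstein-Uhlenbeck process whose
# stationary variance is proportional to an assumed target spectrum; the
# incompressible velocity field is u = d(psi)/dy, v = -d(psi)/dx.

N, L = 64, 2 * np.pi            # grid points per side, domain size (assumed)
dt, tau = 1e-2, 0.5             # time step and correlation time of each mode
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
kx, ky = np.meshgrid(k, k, indexing="ij")
spectrum = np.exp(-(kx**2 + ky**2) / 2.0)      # assumed, Gaussian-shaped spectrum

rng = np.random.default_rng(0)
psi_hat = np.zeros((N, N), dtype=complex)      # Fourier-space stream function

def step(psi_hat):
    """One Euler-Maruyama step of the per-mode Ornstein-Uhlenbeck dynamics."""
    noise = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
    return psi_hat - psi_hat * dt / tau + np.sqrt(2 * spectrum * dt / tau) * noise

for _ in range(1000):                          # relax towards the stationary state
    psi_hat = step(psi_hat)

u = np.real(np.fft.ifft2(1j * ky * psi_hat))   # u =  dpsi/dy
v = np.real(np.fft.ifft2(-1j * kx * psi_hat))  # v = -dpsi/dx
print("rms velocities:", u.std(), v.std())
```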

Relevance: 20.00%

Abstract:

Using the experimental data of Paret and Tabeling [Phys. Rev. Lett. 79, 4162 (1997)], we consider in detail the dispersion of particle pairs by a two-dimensional turbulent flow and its relation to the kinematic properties of the velocity field. We show that the mean square separation of a pair of particles is governed by rather rare, extreme events and that the majority of initially close pairs are not dispersed by the flow. Another manifestation of the same effect is that the dispersion of an initially dense cluster does not result from a homogeneous spreading of the particles throughout the whole system; instead, it proceeds through splitting into smaller but still dense clusters. The statistical nature of this effect is discussed.
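
A hedged sketch of the basic diagnostic implied above: given particle trajectories, compute the mean square separation of initially close pairs and compare it with the median separation, so that dominance of the mean by a few extreme pairs becomes visible. The synthetic random-walk trajectories below are placeholders for the experimental tracer data.

```python
import numpy as np

# traj[t, i, :] = position of particle i at time step t. Here the trajectories
# are synthetic random walks used only as placeholders for tracer data.
rng = np.random.default_rng(1)
T, n = 200, 400
traj = np.cumsum(0.05 * rng.normal(size=(T, n, 2)), axis=0)

# Particles are grouped into initially close pairs (here all start together).
pairs = [(2 * i, 2 * i + 1) for i in range(n // 2)]
sep2 = np.array([[np.sum((traj[t, i] - traj[t, j]) ** 2) for i, j in pairs]
                 for t in range(T)])

mean_sep2 = sep2.mean(axis=1)        # can be dominated by a few extreme pairs
median_sep2 = np.median(sep2, axis=1)
print("mean/median at final time:", mean_sep2[-1] / median_sep2[-1])
```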

Relevance: 20.00%

Abstract:

Dynamic morphological transitions in thin-layer electrodeposits obtained from copper sulphate solutions have been studied. The chemical composition of the electrodeposits indicates that these transitions appear as a consequence of the competition between copper and cuprous oxide formation. In addition, Ohmic control of the process is verified at the initial stages of deposit growth. At later stages of growth, gravity-induced convection currents play a role in the control of the whole process and affect the position of these transitions.

Relevance: 20.00%

Abstract:

The influence of an inert electrolyte (sodium sulfate) on quasi-two-dimensional copper electrodeposition from a non-deaerated aqueous copper sulfate solution has been analyzed. The different morphologies obtained for a fixed concentration of CuSO4 have been classified in a diagram in terms of the applied potential and the inert electrolyte concentration. The main conclusion is that the well-known Ohmic model for the homogeneous growth regime extends to copper sulfate solutions with small amounts of sodium sulfate. Moreover, we have observed the formation of fingerlike deposits at large values of the applied potential and inert electrolyte concentration, before hydrogen evolution becomes the main electrode reaction.

Relevance: 20.00%

Abstract:

We study the dynamics of a water-oil meniscus moving from a smaller to a larger pore. The process is characterised by an abrupt change in configuration, yielding a sudden energy release. A theoretical study under static conditions provides analytical solutions for the surface energy content of the system. Although the configuration after the sudden energy release is energetically more favourable, an energy barrier must be overcome before the process can happen spontaneously. The energy barrier depends on the system geometry and on the flow parameters. The analytical results are compared to numerical simulations that solve the full Navier-Stokes equations in the pore space and employ the Volume of Fluid (VOF) method to track the evolution of the interface. First, the numerical simulations of a quasi-static process are validated by comparison with the analytical solutions for a static meniscus; then numerical simulations with varying injection velocity are used to investigate dynamic effects on the configuration change. During the sudden energy jump the system exhibits oscillatory behaviour. Extension to more complex geometries might elucidate the mechanisms leading to a dynamic capillary pressure and to bifurcations in the final distributions of fluid phases in porous media.
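
For a rough sense of the energy bookkeeping involved, the interfacial energy of a fluid configuration and the capillary pressure across a meniscus in an idealised cylindrical throat can be written as below; these are generic textbook relations, not the specific geometry analysed in the paper.

```latex
% Interfacial energy of a configuration (sum over the water-oil, solid-oil and
% solid-water interfaces) and the Young-Laplace capillary pressure for an
% idealised cylindrical throat of radius r and contact angle theta
% (generic relations, not the paper's exact geometry).
E_s = \gamma_{wo}\,A_{wo} + \gamma_{so}\,A_{so} + \gamma_{sw}\,A_{sw},
\qquad
p_c = \frac{2\,\gamma_{wo}\cos\theta}{r}.
```

In this picture, a meniscus jumping from a throat of radius $r_1$ to a wider pore of radius $r_2 > r_1$ lowers $p_c$, and the difference in $E_s$ between the two configurations sets the energy released in the jump.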

Relevance: 20.00%

Abstract:

The research reported in this series of articles aimed at (1) automating the search for questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way; the latter requirement stems from the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model, and therefore to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach in the search for ink specimens in ink databases and in the interpretation of their evidential value.
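
As a generic illustration of how a probabilistic, score-based evaluation of this kind can work (the actual models and comparison metrics are those described in the Part II paper cited above, not reproduced here), the sketch below scores the similarity of two hypothetical ink profiles and converts the score into a likelihood ratio using score distributions from known same-ink and different-ink comparisons. All profiles and calibration scores are invented.

```python
import numpy as np

# Hedged sketch of a score-based comparison of ink profiles. Each profile is a
# hypothetical vector of normalised intensities (e.g. densitometric bands from
# an HPTLC plate). A similarity score is converted into a likelihood ratio by
# comparing kernel-density estimates of the scores observed for same-ink and
# different-ink pairs. This is a generic illustration, not the cited algorithms.

def similarity(p, q):
    """Pearson correlation between two normalised profiles."""
    return float(np.corrcoef(p, q)[0, 1])

def likelihood_ratio(score, same_scores, diff_scores, bandwidth=0.05):
    """p(score | same ink) / p(score | different inks), via Gaussian kernels."""
    def kde(x, data):
        data = np.asarray(data, dtype=float)
        return np.mean(np.exp(-0.5 * ((x - data) / bandwidth) ** 2)) / (bandwidth * np.sqrt(2 * np.pi))
    return kde(score, same_scores) / kde(score, diff_scores)

# Toy calibration scores from known same-ink / different-ink comparisons.
same_scores = [0.97, 0.99, 0.95, 0.98]
diff_scores = [0.60, 0.72, 0.55, 0.66]

questioned = np.array([0.10, 0.40, 0.30, 0.20])
reference  = np.array([0.12, 0.38, 0.31, 0.19])
s = similarity(questioned, reference)
print("score:", round(s, 3), "likelihood ratio:", likelihood_ratio(s, same_scores, diff_scores))
```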