28 results for SINGLE-STEP
in Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Background
Enzymatic biodiesel is becoming an increasingly popular topic in the bioenergy literature because of its potential to overcome the problems posed by chemical processes. However, the high cost of the enzymatic process remains the main obstacle to its industrial application, mostly because of the high price of refined oils. Unfortunately, low-cost substrates, such as crude soybean oil, often yield a product that hardly meets the final required biodiesel specifications and needs an additional pretreatment for gum removal. In order to reduce costs and make the enzymatic process more efficient, we developed an innovative system for enzymatic biodiesel production that combines a lipase with two phospholipases. This makes it possible to perform the enzymatic degumming and transesterification in a single step, using crude soybean oil as feedstock, and to convert part of the phospholipids into biodiesel. Since the two processes had never been studied together, a careful analysis of the different reaction components and conditions was carried out.
Results
Crude soybean oil, used as a low-cost feedstock, is characterized by a high content of phospholipids (900 ppm of phosphorus). However, after the combined activity of different phospholipases and the liquid lipase Callera Trans L, complete transformation into fatty acid methyl esters (FAMEs >95%) and a good reduction of phosphorus (P <5 ppm) were achieved. The combination of enzymes made it possible to avoid the acid treatment required for gum removal, the consequent caustic neutralization, and the high temperatures commonly used in degumming systems, making the overall process more eco-friendly and higher-yielding. Once the conditions were established, the process was also tested with different vegetable oils with variable phosphorus contents.
Conclusions
Use of the liquid lipase Callera Trans L in biodiesel production can provide numerous and sustainable benefits. Besides reducing the costs derived from enzyme immobilization, the lipase can be used in combination with other enzymes such as phospholipases for gum removal, thus allowing the use of much cheaper, non-refined oils. The possibility of performing degumming and transesterification in a single tank brings a great efficiency increase in the new era of enzymatic biodiesel production at industrial scale.
Spanning tests in return and stochastic discount factor mean-variance frontiers: A unifying approach
Abstract:
We propose new spanning tests that assess if the initial and additional assets share the economically meaningful cost and mean representing portfolios. We prove their asymptotic equivalence to existing tests under local alternatives. We also show that, unlike two-step or iterated procedures, single-step methods such as continuously updated GMM yield numerically identical overidentifying restrictions tests, so there is arguably a single spanning test. To prove these results, we extend optimal GMM inference to deal with singularities in the long-run second moment matrix of the influence functions. Finally, we test for spanning using size and book-to-market sorted US stock portfolios.
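For reference, the continuously updated GMM estimator and its overidentifying restrictions (J) test take the standard form below (generic notation; the paper's specific influence functions are not reproduced here):

\[
\hat{\theta}_{\mathrm{CU}} = \arg\min_{\theta}\; T\,\bar{g}_T(\theta)'\,\hat{S}_T(\theta)^{-1}\,\bar{g}_T(\theta),
\qquad
J = T\,\bar{g}_T(\hat{\theta}_{\mathrm{CU}})'\,\hat{S}_T(\hat{\theta}_{\mathrm{CU}})^{-1}\,\bar{g}_T(\hat{\theta}_{\mathrm{CU}}) \;\overset{d}{\to}\; \chi^{2}_{q-p},
\]

where $\bar{g}_T$ is the sample mean of the $q$ influence functions, $\hat{S}_T(\theta)$ their long-run second moment matrix evaluated at the same trial value $\theta$, and $p$ the number of parameters. Because the weighting matrix moves with $\theta$, the minimised criterion does not depend on any preliminary estimator, which is the source of the numerical uniqueness of the single-step test.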
Abstract:
Two main approaches are commonly used to empirically evaluate linear factor pricing models: regression and SDF methods, with centred and uncentred versions of the latter. We show that, unlike standard two-step or iterated GMM procedures, single-step estimators such as continuously updated GMM yield numerically identical values for prices of risk, pricing errors, Jensen's alphas and overidentifying restrictions tests irrespective of the model's validity. Therefore, there is arguably a single approach regardless of whether the factors are traded or not, or whether excess or gross returns are used. We illustrate our results by revisiting Lustig and Verdelhan's (2007) empirical analysis of currency returns.
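The two SDF formulations mentioned above are, in generic notation (not the paper's exact moment conditions):

\[
E\!\left[(1 - b' f_t)\,R^{e}_t\right] = 0 \quad \text{(uncentred)},
\qquad
E\!\left[(1 - b'(f_t - \mu_f))\,R^{e}_t\right] = 0,\;\; E[f_t - \mu_f] = 0 \quad \text{(centred)},
\]

where $R^{e}_t$ are excess returns, $f_t$ the factors, $\mu_f$ their mean and $b$ the SDF coefficients. The paper's point is that with a single-step estimator such as continuously updated GMM these versions, and the regression approach, deliver numerically identical prices of risk and test statistics.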
Abstract:
The aim of this paper is to analyse the main agreements on the EU's External Action reached within the European Convention and the IGC, taking into account why, how and by whom the consensus on them was reached. In other words, this paper explores the principles followed in order to improve the instruments of the EU's External Action, such as authority, coherence, visibility, efficiency and credibility.
Abstract:
If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data correction step in NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide selection of techniques according to the data characteristics identified by visual inspection is provided.
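To make the nonoverlap measure concrete, here is a minimal sketch of NAP for an AB design, following its standard definition (the function and the data are illustrative, not taken from the study):

```python
# Illustrative sketch (not the authors' code): Nonoverlap of All Pairs (NAP)
# for a two-phase (AB) design, assuming higher scores indicate improvement.

def nap(phase_a, phase_b):
    """Proportion of (A, B) pairs in which the B point exceeds the A point,
    with ties counted as half an overlap."""
    pairs = [(a, b) for a in phase_a for b in phase_b]
    improvements = sum(1.0 for a, b in pairs if b > a)
    ties = sum(0.5 for a, b in pairs if b == a)
    return (improvements + ties) / len(pairs)

# Example: baseline (A) and intervention (B) measurements
print(nap([2, 3, 3, 4], [5, 6, 4, 7, 6]))  # values close to 1 suggest a strong effect
```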
Abstract:
A crucial step for understanding how lexical knowledge is represented is to describe the relative similarity of lexical items, and how it influences language processing. Previous studies of the effects of form similarity on word production have reported conflicting results, notably within and across languages. The aim of the present study was to clarify this empirical issue to provide specific constraints for theoretical models of language production. We investigated the role of phonological neighborhood density in a large-scale picture naming experiment using fine-grained statistical models. The results showed that increasing phonological neighborhood density has a detrimental effect on naming latencies, and re-analyses of independently obtained data sets provide supplementary evidence for this effect. Finally, we reviewed a large body of evidence concerning phonological neighborhood density effects in word production, and discussed the occurrence of facilitatory and inhibitory effects in accuracy measures. The overall pattern shows that phonological neighborhood generates two opposite forces, one facilitatory and one inhibitory. In cases where speech production is disrupted (e.g. certain aphasic symptoms), the facilitatory component may emerge, but inhibitory processes dominate in efficient naming by healthy speakers. These findings are difficult to accommodate in terms of monitoring processes, but can be explained within interactive activation accounts combining phonological facilitation and lexical competition.
Abstract:
Markowitz portfolio theory (1952) has induced research into the efficiency of portfolio management. This paper studies existing nonparametric efficiency measurement approaches for single-period portfolio selection from a theoretical perspective and generalises currently used efficiency measures into the full mean-variance space. To this end, we introduce the efficiency improvement possibility function (a variation on the shortage function), study its axiomatic properties in the context of the Markowitz efficient frontier, and establish a link to the indirect mean-variance utility function. This framework allows distinguishing between portfolio efficiency and allocative efficiency. Furthermore, it permits retrieving information about the revealed risk aversion of investors. The efficiency improvement possibility function thus provides a more general framework for gauging the efficiency of portfolio management using nonparametric frontier envelopment methods based on quadratic optimisation.
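As a minimal numerical illustration of the kind of quadratic optimisation involved (assumed data; this is not the paper's efficiency improvement possibility function itself), one can gauge a portfolio's inefficiency by minimising variance at its own expected return:

```python
# Illustrative sketch (assumed setup, not the paper's exact measure): gauge how far a
# given portfolio lies from the Markowitz frontier by minimising variance subject to
# matching its expected return, with full investment and no short sales.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.10, 0.12])                  # expected returns (assumed data)
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.05, 0.01],
                  [0.00, 0.01, 0.09]])             # covariance matrix (assumed data)
w0 = np.array([0.5, 0.3, 0.2])                     # portfolio being evaluated

target = mu @ w0
res = minimize(lambda w: w @ Sigma @ w, w0,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0},
                            {"type": "eq", "fun": lambda w: mu @ w - target}],
               bounds=[(0.0, 1.0)] * len(mu))

print("variance of evaluated portfolio:", w0 @ Sigma @ w0)
print("minimal variance at same mean:  ", res.fun)   # the gap indicates inefficiency
```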
Abstract:
We identify in this paper two conditions that characterize the domain of single-peaked preferences on the line in the following sense: a preference profile satisfies these two properties if and only if there exists a linear order $L$ over the set of alternatives such that these preferences are single-peaked with respect to $L$. The first property states that, for any subset of alternatives, the set of alternatives considered as the worst by all agents cannot contain more than two elements. The second property states that two agents cannot disagree on the relative ranking of two alternatives with respect to a third alternative but agree on the (relative) ranking of a fourth one.
Classification-JEL: D71, C78
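As an illustration of the underlying notion (not of the paper's two characterizing conditions), the following sketch checks whether a profile is single-peaked with respect to a candidate linear order $L$; the function name and representation are hypothetical:

```python
# Illustrative sketch (hypothetical helper, not from the paper): check whether a
# preference profile is single-peaked with respect to a given linear order L.
# Each preference is a list of alternatives from best to worst.

def is_single_peaked(profile, L):
    pos = {a: i for i, a in enumerate(L)}
    for pref in profile:
        rank = {a: i for i, a in enumerate(pref)}   # lower rank = more preferred
        peak = pref[0]
        for x in L:
            for y in L:
                # if x lies between y and the peak along L ...
                between = (pos[peak] <= pos[x] < pos[y]) or (pos[y] < pos[x] <= pos[peak])
                if between and x != y and rank[x] > rank[y]:
                    return False                     # ... it must be preferred to y
    return True

profile = [["b", "c", "a"], ["c", "b", "a"], ["a", "b", "c"]]
print(is_single_peaked(profile, ["a", "b", "c"]))    # True for this profile
```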
Abstract:
We consider the problem of allocating an infinitely divisible commodity among a group of agents with single-peaked preferences. A rule that has played a central role in the analysis of the problem is the so-called uniform rule. Chun (2001) proves that the uniform rule is the only rule satisfying Pareto optimality, no-envy, separability, and continuity (with respect to the social endowment). We obtain an alternative characterization by using a weak replication-invariance condition, called duplication-invariance, instead of continuity. Furthermore, we prove that Pareto optimality, equal division lower bound, and separability imply no-envy. Using this result, we strengthen one of Chun's (2001) characterizations of the uniform rule by showing that the uniform rule is the only rule satisfying Pareto optimality, equal division lower bound, separability, and either continuity or duplication-invariance.
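To make the uniform rule concrete, here is a minimal computational sketch based on its standard definition (not code from the paper): when total demand exceeds the endowment each agent receives the minimum of their peak and a common bound, and symmetrically when supply exceeds demand.

```python
# Illustrative sketch (not from the paper): the uniform rule for dividing an amount
# omega among agents with single-peaked preferences, represented here by their peaks.

def uniform_rule(peaks, omega, tol=1e-9):
    def total(lam, excess_demand):
        return sum(min(p, lam) if excess_demand else max(p, lam) for p in peaks)

    excess_demand = sum(peaks) >= omega
    lo, hi = 0.0, max(max(peaks), omega)
    while hi - lo > tol:                       # bisect on the common bound lambda
        mid = (lo + hi) / 2
        if total(mid, excess_demand) < omega:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    return [min(p, lam) if excess_demand else max(p, lam) for p in peaks]

print(uniform_rule([0.2, 0.5, 0.9], omega=1.0))   # excess demand: peaks sum to 1.6
```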
Abstract:
A challenge when executing applications on a cluster is to improve performance while using resources efficiently, and this challenge grows in a distributed environment. With this in mind, a set of rules is proposed for carrying out the computation on each node, based on an analysis of the computation and communication of the applications; a cell-mapping scheme and a method for scheduling the execution order are analysed, taking priority-based execution into consideration, in which border cells have a higher priority than internal cells. The experiments show the overlapping of internal computation with the communication of the border cells, with results in which the speedup increases and the efficiency levels stay above 85%; finally, gains in execution time are obtained, leading to the conclusion that it is indeed possible to design an overlapping scheme that allows SPMD applications to run efficiently on a cluster.
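A minimal sketch of the overlap idea, assuming a one-dimensional domain decomposition and non-blocking MPI calls (the halo layout and update rule are illustrative assumptions, not the authors' implementation):

```python
# Illustrative sketch (assumed layout, not the authors' code): overlap the computation of
# internal cells with the exchange of border (halo) cells using non-blocking MPI calls.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size

tile = np.random.rand(1000, 1000)          # local SPMD tile (assumed data)
send_l, send_r = tile[0].copy(), tile[-1].copy()
recv_l, recv_r = np.empty_like(send_l), np.empty_like(send_r)

# 1) Start the border exchange first (higher priority), without blocking.
reqs = [comm.Isend(send_l, dest=left),   comm.Isend(send_r, dest=right),
        comm.Irecv(recv_l, source=left), comm.Irecv(recv_r, source=right)]

# 2) Compute the internal cells while the messages are in flight.
tile[1:-1] = 0.25 * (tile[:-2] + tile[2:]
                     + np.roll(tile, 1, 1)[1:-1] + np.roll(tile, -1, 1)[1:-1])

# 3) Wait for the halos, then update the border cells that depend on them.
MPI.Request.Waitall(reqs)
tile[0], tile[-1] = 0.5 * (recv_l + tile[1]), 0.5 * (recv_r + tile[-2])
```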
Abstract:
We characterize the class of strategy-proof social choice functions on the domain of symmetric single-peaked preferences. This class is strictly larger than the set of generalized median voter schemes (the class of strategy-proof and tops-only social choice functions on the domain of single-peaked preferences characterized by Moulin (1980)) since, under the domain of symmetric single-peaked preferences, generalized median voter schemes can be disturbed by discontinuity points and remain strategy-proof on the smaller domain. Our result identifies the specific nature of these discontinuities, which make it possible to design non-onto social choice functions that deal with feasibility constraints.
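As background for the generalized median voter schemes mentioned above, the following is a minimal sketch of the classical anonymous construction on the single-peaked domain (an illustration of Moulin's scheme, not the characterization derived in the paper):

```python
# Illustrative sketch (simplified, anonymous case; not the paper's construction):
# a median voter scheme returns the median of the reported peaks together with
# fixed "phantom" values, which makes truthful reporting of the peak optimal
# for agents with single-peaked preferences.
import statistics

def median_voter_scheme(peaks, phantoms):
    return statistics.median(list(peaks) + list(phantoms))

# Three agents; phantoms chosen so the outcome is clamped to [0.25, 0.75]:
print(median_voter_scheme([0.1, 0.4, 0.9], phantoms=[0.25, 0.25, 0.75, 0.75]))
```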
Abstract:
The process of merging two or more images of the same scene into a single, larger one is known as Image Mosaicing. Once a mosaic has been built, the boundaries between images are usually visible, owing to inaccuracies in the photometric and geometric registration. Image Blending is the stage of the mosaicing procedure in which these artefacts are minimised or removed. Several methodologies in the literature address these problems, but most are oriented towards the creation of terrestrial panoramas, high-resolution artistic images or other applications in which camera positioning and image acquisition are not critical stages. Working with underwater images presents major challenges, owing to scattering (reflections from suspended particles) and light attenuation, and to the extreme physical conditions found thousands of metres deep, with limited control over the acquisition systems and the use of high-cost technology. Images under similar artificial illumination, without a global light source such as the sun, must be joined without showing a perceptible seam. Images acquired at great depth have a quality that depends strongly on depth, and their degradation with this factor is very significant. The main objective of this work is to present the main problems of underwater imaging, select the most suitable strategies, and address the whole acquisition-processing-visualisation sequence of the process. The results obtained show that the developed solution, based on an Optimal Seam Selection strategy, Gradient-Domain Blending in the overlapping regions, and Adaptive Emphasis of images with a low level of detail, produces results of high quality. A strategy has also been proposed, amenable to parallel implementation, that makes it possible to process mosaics kilometres in extent at a resolution of centimetres per pixel.
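As an illustration of gradient-domain blending in general (not the thesis's actual pipeline), OpenCV's Poisson-based seamlessClone hides the seam by transplanting the gradients of the new image into the mosaic; the file names and placement below are hypothetical:

```python
# Illustrative sketch (generic gradient-domain blending, not the thesis pipeline):
# Poisson blending inserts the gradients of a source patch into the destination
# image and solves for pixel values, hiding the seam between the two images.
import cv2
import numpy as np

dst = cv2.imread("mosaic_region.png")        # hypothetical file names
src = cv2.imread("new_image.png")

mask = 255 * np.ones(src.shape[:2], dtype=np.uint8)       # blend the whole patch
center = (dst.shape[1] // 2, dst.shape[0] // 2)           # where to place it in dst

blended = cv2.seamlessClone(src, dst, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("blended.png", blended)
```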
Abstract:
The control of optical fields on the nanometre scale is becoming an increasingly important tool in many fields, ranging from channelling light delivery in photovoltaics and light-emitting diodes to increasing the sensitivity of chemical sensors to single-molecule levels. The ability to design and manipulate light fields with specific frequency and space characteristics is explored in this project. We present an alternative realisation of Extraordinary Optical Transmission (EOT) that requires only a single aperture and a coupled waveguide. We show how this waveguide-resonant EOT improves the transmissivity of single apertures. An important technique in imaging is Near-Field Scanning Optical Microscopy (NSOM); we show how waveguide-resonant EOT and the novel probe design help improve the efficiency of NSOM probes by two orders of magnitude, and allow the imaging of single molecules with an optical resolution as good as 50 nm. We show how optical antennas are fabricated at the apex of sharp tips and can be used in a near-field configuration.
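For context (a textbook result, not taken from the thesis): Bethe's classical theory for a subwavelength circular aperture of radius $a$ in a thin, perfectly conducting screen gives a normalised transmission of roughly

\[
T \approx \frac{64}{27\pi^{2}}\,(ka)^{4}, \qquad k = \frac{2\pi}{\lambda},
\]

so transmission collapses for apertures much smaller than the wavelength; this is the baseline against which waveguide-resonant EOT and the improved NSOM probe throughput described above are measured.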
Abstract:
The use of orthonormal coordinates in the simplex and, particularly, balance coordinates, has suggested the use of a dendrogram for the exploratory analysis of compositional data. The dendrogram is based on a sequential binary partition of a compositional vector into groups of parts. At each step of a partition, one group of parts is divided into two new groups, and a balancing axis in the simplex between both groups is defined. The set of balancing axes constitutes an orthonormal basis, and the projections of the sample on them are orthogonal coordinates. They can be represented in a dendrogram-like graph showing: (a) the way of grouping parts of the compositional vector; (b) the explanatory role of each subcomposition generated in the partition process; (c) the decomposition of the total variance into balance components associated with each binary partition; (d) a box-plot of each balance. This representation is useful to help the interpretation of balance coordinates, to identify the most explanatory coordinates, and to describe the whole sample in a single diagram independently of the number of parts of the sample.
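A minimal sketch of how balance coordinates are computed from a sequential binary partition, following the standard balance formula (the sign matrix and composition below are illustrative, not from the paper):

```python
# Illustrative sketch (standard balance formula; not the paper's code): compute the
# balance coordinates of a composition for a sequential binary partition given as a
# sign matrix (+1 / -1 / 0 per part and per partition step).
import numpy as np

def balances(x, sign_matrix):
    x = np.asarray(x, dtype=float)
    coords = []
    for signs in sign_matrix:
        plus, minus = x[signs == 1], x[signs == -1]
        r, s = len(plus), len(minus)
        gp = np.exp(np.log(plus).mean())          # geometric means of the two groups
        gm = np.exp(np.log(minus).mean())
        coords.append(np.sqrt(r * s / (r + s)) * np.log(gp / gm))
    return np.array(coords)

x = [0.1, 0.3, 0.4, 0.2]                          # a 4-part composition
sbp = np.array([[ 1,  1, -1, -1],                 # step 1: {1,2} vs {3,4}
                [ 1, -1,  0,  0],                 # step 2: {1} vs {2}
                [ 0,  0,  1, -1]])                # step 3: {3} vs {4}
print(balances(x, sbp))
```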
Abstract:
In this paper a novel methodology is introduced, aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact while offering minimum failure probabilities are also analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
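As a rough illustration of the first routing step only (a generic reliability-weighted shortest path, not the paper's exact two-step algorithm), link failure probabilities can be turned into additive weights:

```python
# Illustrative sketch (generic idea, not the paper's algorithm): route over the links
# with the smallest end-to-end failure probability by using additive weights
# -log(1 - p_fail); a second step would then balance resource consumption.
import math
import networkx as nx

G = nx.Graph()
links = [("A", "B", 0.01, 10), ("B", "D", 0.02, 10),   # (u, v, p_fail, capacity), assumed data
         ("A", "C", 0.05,  8), ("C", "D", 0.01,  8)]
for u, v, p, cap in links:
    G.add_edge(u, v, fail=p, cap=cap, w=-math.log(1.0 - p))

path = nx.shortest_path(G, "A", "D", weight="w")        # step 1: most reliable path
p_ok = math.exp(-nx.shortest_path_length(G, "A", "D", weight="w"))
print(path, "survival probability ~", round(p_ok, 4))

# A second step could then choose, among near-optimal candidates, the path that
# minimises resource consumption or the QoS degradation caused by a failure.
```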