864 results for modularised computing unit
Abstract:
We propose a nonlinear heterogeneous panel unit root test for testing the null hypothesis of unit root processes against the alternative that allows a proportion of units to be generated by globally stationary ESTAR processes and the remaining non-zero proportion to be generated by unit root processes. The proposed test is simple to implement and accommodates cross-sectional dependence. We show that the distribution of the test statistic is free of nuisance parameters as (N, T) → ∞. Monte Carlo simulations show that our test has correct size and, when the data are generated by globally stationary ESTAR processes, has better power than the test recently proposed in Pesaran [2007]. Various applications are provided.
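The abstract does not spell out the test statistic; purely as an illustrative sketch, the Python snippet below computes a KSS-type nonlinear (ESTAR) unit root t-statistic for each unit of an N × T panel after a crude cross-sectional demeaning, and averages it across units. The regression Δy_t = δ·y_{t-1}³ + e_t, the demeaning as a stand-in for cross-section augmentation, and the simple averaging are assumptions for illustration; the paper's exact statistic and its nonstandard critical values differ.

```python
import numpy as np

def kss_t_stat(y):
    """t-statistic on delta in the KSS-type regression  dy_t = delta * y_{t-1}^3 + e_t."""
    dy = np.diff(y)
    x = y[:-1] ** 3
    delta = (x @ dy) / (x @ x)
    resid = dy - delta * x
    sigma2 = resid @ resid / (len(dy) - 1)
    return delta / np.sqrt(sigma2 / (x @ x))

def panel_kss(Y):
    """Average KSS statistic over the N units of an N x T panel, after removing the
    cross-section mean at each date (a crude stand-in for handling a common factor)."""
    Y = np.asarray(Y, dtype=float)
    Y = Y - Y.mean(axis=0, keepdims=True)          # cross-sectional demeaning
    return np.mean([kss_t_stat(y - y.mean()) for y in Y])
```

Whether the averaged statistic rejects must be judged against simulated critical values, since its distribution is nonstandard.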
Abstract:
Vaccines have been a successful tool in medicine for controlling many major diseases. Despite this, vaccines today cover only a handful of all infectious diseases. There is therefore a pressing demand for improvements to existing vaccines, in particular higher efficacy and undisputed safety profiles. To this end, as an alternative to available vaccine technologies, there has been a drive to develop vaccine candidate polypeptides by chemical synthesis. In our laboratory, we have recently developed a technology to manufacture long synthetic peptides of up to 130 residues that are correctly folded and biologically active. This paper discusses the advantages of the molecularly defined, long synthetic peptide approach in the context of vaccine design, development and use in human vaccination.
Abstract:
Time inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models. The methods considered are value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than those obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that a risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
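As a rough illustration of the first method only, the sketch below runs plain value function iteration on a textbook stochastic growth model with log utility and full depreciation. The model, grid, and parameter values are assumptions made for the example; this is not the paper's Markov-perfect fiscal-policy environment.

```python
import numpy as np

# Value function iteration on a textbook stochastic growth model (log utility, full
# depreciation). This is NOT the paper's Markov-perfect fiscal-policy problem; it
# only illustrates the solution method the abstract refers to.
beta, alpha, ngrid = 0.95, 0.36, 200
z_grid = np.array([0.9, 1.1])                  # two productivity states (assumed)
P = np.array([[0.8, 0.2], [0.2, 0.8]])         # Markov transition matrix (assumed)
k_grid = np.linspace(0.05, 0.5, ngrid)         # capital grid

V = np.zeros((len(z_grid), ngrid))             # V[z, k]
for _ in range(1000):
    EV = P @ V                                 # E[V(z', k') | z], one row per current z
    V_new = np.empty_like(V)
    for iz, z in enumerate(z_grid):
        # consumption for every (k, k') pair; infeasible choices get -inf utility
        c = z * k_grid[:, None] ** alpha - k_grid[None, :]
        u = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)
        V_new[iz] = (u + beta * EV[iz][None, :]).max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:       # stop once the contraction has converged
        break
    V = V_new
```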
Abstract:
(Summary of the volume) This volume contains the papers presented at the 47th Colloquium Biblicum Lovaniense (Leuven, 1998). The general theme of the meeting was the unity of the Gospel of Luke and the Acts of the Apostles. Main papers on this topic were read by R.L. Brawley, J. Delobel, A. Denaux, J.A. Fitzmeyer, F.W. Horn, J. Kremer, A. Lindemann, O. Mainville, D. Marguerat, F. Neirynck, W. Radl, M. Rese, J. Taylor, C.M. Tuckett, and J. Verheyden. While a large majority of scholars agree that Luke intended his work to cover both the past and the continuing history of Jesus (Gospel and Acts), the essays also illustrate the complexities of this view of the unity of Luke-Acts when it comes to interpreting the various aspects of Lukan theology, christology, pneumatology, and ecclesiology, the expansion of the Church in light of its Jewish origins, the genre of Luke-Acts, and the literary and stylistic means Luke used to make his work a unity. In total the volume includes some 40 papers, of which 24 are offered papers: L. Alexander, H. Baarlink, M. Bachmann, D. Bechard, T.L. Brodie, G.P. Carras, A. del Agua, C. Focant, G. Geiger, B.J. Koet, V. Koperski, D.P. Moessner, G. Oegema, J. Pichler, E. Plümacher, A. Puig i Tarrèch, U. Schmid, B. Schwank, N. Taylor, P.J. Tomson, S. Van den Eynde, S. Walton, G. Wasserberg, F. Wilk. This collection is an invaluable contribution to current discussions in Lukan study and to a nuanced understanding of the relationship between Luke's two volumes.
Abstract:
A multiple-partners assignment game with heterogeneous sales and multiunit demands consists of a set of sellers that own a given number of indivisible units of (potentially many different) goods and a set of buyers who value those units and want to buy at most an exogenously fixed number of units. We define a competitive equilibrium for this generalized assignment game and prove its existence by using only linear programming. In particular, we show how to compute equilibrium price vectors from the solutions of the dual linear program associated with the primal linear program defined to find optimal assignments. Using only linear programming tools, we also show (i) that the set of competitive equilibria (pairs of price vectors and assignments) has a Cartesian product structure: each equilibrium price vector is part of a competitive equilibrium with all optimal assignments, and vice versa; (ii) that the set of (restricted) equilibrium price vectors has a natural lattice structure; and (iii) how this structure is translated into the set of agents' utilities that are attainable at equilibrium.
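A minimal sketch of reading equilibrium prices off the dual is given below for a toy one-unit-per-agent assignment problem, solved with scipy.optimize.linprog; the valuation matrix is invented, and the multi-unit, multi-good structure treated in the paper is not modelled.

```python
import numpy as np
from scipy.optimize import linprog

# Valuations v[i][j]: value buyer i assigns to seller j's single unit (toy numbers).
v = np.array([[5.0, 8.0, 2.0],
              [7.0, 9.0, 6.0],
              [2.0, 3.0, 0.0]])
n_buyers, n_sellers = v.shape

# Dual of the assignment LP:  min  sum_i u_i + sum_j p_j
#                             s.t. u_i + p_j >= v_ij,   u, p >= 0
# The p_j components of an optimal dual solution form an equilibrium price vector.
c = np.ones(n_buyers + n_sellers)
A_ub, b_ub = [], []
for i in range(n_buyers):
    for j in range(n_sellers):
        row = np.zeros(n_buyers + n_sellers)
        row[i] = -1.0
        row[n_buyers + j] = -1.0
        A_ub.append(row)
        b_ub.append(-v[i, j])            # -(u_i + p_j) <= -v_ij
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub), bounds=(0, None))
u, p = res.x[:n_buyers], res.x[n_buyers:]
print("buyer utilities:", u.round(2), "equilibrium prices:", p.round(2))
```

By LP duality, the optimal dual objective equals the value of an optimal assignment, and the p part of the dual solution is one of the equilibrium price vectors whose lattice structure the paper studies.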
Abstract:
The treatment of back pain patients is based on the biopsychosocial model of care. This model situates the illness within the patient's personal and relational life. In this context, the focus is not only on the patient's physical symptom but also on psychological distress, which is often hidden behind the pain complaint. Clinical interviews conducted with back pain patients have highlighted psychosocial aspects that can influence the relationship between the health care user and the provider. Taking account of psychosocial aspects implies an interdisciplinary approach that identifies and assesses patients' needs through adequate tools. As a result, the different health care providers involved with back pain patients have to collaborate in a structured network.
Abstract:
This project describes the merging of the daily monitoring needs of the ATLAS experiment from the cloud point of view. The main idea is to develop a set of collectors that gather information on data distribution and processing and on the WLCG tests (Service Availability Monitoring), storing it in dedicated databases so that the results can be displayed on a single HLM (High Level Monitoring) page. Once this is achieved, the application must allow further investigation through interaction with the front-end, which will be fed by the statistics stored in the database.
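A hypothetical sketch of one such collector is shown below; the endpoint URL, the JSON layout, and the table schema are placeholders and not the real ATLAS or SAM interfaces.

```python
import json
import sqlite3
import time
import urllib.request

# Sketch of a single collector: fetch a cloud-status summary from a monitoring
# endpoint (URL and JSON layout are placeholders) and store it so a High Level
# Monitoring front-end can query the accumulated statistics later.
DB = "hlm.db"
SOURCE = "https://example.org/atlas/cloud_status.json"   # placeholder endpoint

def init_db():
    con = sqlite3.connect(DB)
    con.execute("""CREATE TABLE IF NOT EXISTS cloud_status (
                       ts INTEGER, cloud TEXT, metric TEXT, value REAL)""")
    con.commit()
    return con

def collect(con):
    with urllib.request.urlopen(SOURCE, timeout=30) as resp:
        data = json.load(resp)          # e.g. {"DE": {"transfers": 0.97, ...}, ...}
    now = int(time.time())
    rows = [(now, cloud, metric, value)
            for cloud, metrics in data.items()
            for metric, value in metrics.items()]
    con.executemany("INSERT INTO cloud_status VALUES (?, ?, ?, ?)", rows)
    con.commit()

if __name__ == "__main__":
    con = init_db()
    collect(con)    # in production this would run periodically (cron or a daemon)
```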
Abstract:
Hypoglycemia, if recurrent, may have severe consequences for the cognitive and psychomotor development of neonates. Therefore, screening for hypoglycemia is a daily routine in every facility taking care of newborn infants. Point-of-care testing (POCT) devices are attractive for neonatal use, as their handling is easy, measurements can be performed at the bedside, the required blood volume is small, and results are readily available. However, such whole-blood measurements are challenged by the wide variation of hematocrit in neonates and by a spectrum of normal glucose concentrations at the lower end of the test range. We conducted a prospective trial to assess the precision and accuracy of the POCT devices best suited for neonatal use from three leading companies in Europe. Of the three devices tested (Precision Xceed, Abbott; Elite XL, Bayer; Aviva Nano, Roche), the Aviva Nano exhibited the best precision. None completely fulfilled the ISO 15197 accuracy criteria (2003 or 2011). The Aviva Nano fulfilled these criteria in 92% of cases, while the others were below 87%. The Precision Xceed reached the 95% limit of the 2003 ISO criteria for values ≤4.2 mmol/L, but not for the higher range (71%). Although validated for adults, new POCT devices need to be specifically evaluated in newborn infants before they are adopted for routine use in neonatology.
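For concreteness, the sketch below computes the fraction of device readings falling inside the ISO 15197:2003 tolerance band, taken here as ±0.83 mmol/L for reference values below 4.2 mmol/L and ±20% at or above, with at least 95% of readings required inside the band; the example numbers are invented and are not data from the study.

```python
import numpy as np

def iso15197_2003_pass_rate(device, reference):
    """Fraction of device readings within the ISO 15197:2003 tolerance of the
    laboratory reference (mmol/L): +/-0.83 mmol/L below 4.2 mmol/L, +/-20% at or
    above. The standard requires >= 95% of readings inside this band."""
    device, reference = np.asarray(device), np.asarray(reference)
    tol = np.where(reference < 4.2, 0.83, 0.20 * reference)
    return np.mean(np.abs(device - reference) <= tol)

# toy numbers, not data from the study
print(iso15197_2003_pass_rate([2.5, 3.9, 5.0, 7.2], [2.6, 3.5, 5.8, 7.0]))
```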
Abstract:
Hypergraph width measures are a class of hypergraph invariants important in studying the complexity of constraint satisfaction problems (CSPs). We present a general exact exponential algorithm for a large variety of these measures. A connection between these measures and tree decompositions is established. This enables us to adapt, almost seamlessly, the combinatorial and algorithmic results known for tree decompositions of graphs to the case of hypergraphs and obtain fast exact algorithms. As a consequence, we provide algorithms which, given a hypergraph H on n vertices and m hyperedges, compute the generalized hypertree-width of H in time O*(2^n) and compute the fractional hypertree-width of H in time O(1.734601^n · m).
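The paper's algorithms for generalized and fractional hypertree-width are not reproduced here, but the flavour of an O*(2^n) dynamic program over vertex subsets can be shown with the classic exact treewidth recursion over elimination orderings (a sketch for small graphs only, in Python):

```python
from itertools import combinations

def treewidth_exact(n, edges):
    """Classic O*(2^n) treewidth DP: TW(S) = min_{v in S} max(TW(S\\{v}), |Q(S\\{v}, v)|),
    where Q(S, v) is the set of vertices outside S and v reachable from v through S."""
    adj = [set() for _ in range(n)]
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    def q(S, v):
        seen, stack, reach = {v}, [v], set()
        while stack:
            x = stack.pop()
            for w in adj[x]:
                if w in seen:
                    continue
                seen.add(w)
                if w in S:
                    stack.append(w)     # keep walking through eliminated vertices
                else:
                    reach.add(w)        # a future bag member
        return len(reach)

    tw = {frozenset(): -1}
    for size in range(1, n + 1):        # process subsets in order of increasing size
        for comb in combinations(range(n), size):
            S = frozenset(comb)
            tw[S] = min(max(tw[S - {v}], q(S - {v}, v)) for v in S)
    return tw[frozenset(range(n))]

print(treewidth_exact(4, [(0, 1), (1, 2), (2, 3), (3, 0)]))   # 4-cycle has treewidth 2
```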