139 results for RM (rate monotonic) algorithm
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The aim of this project is the prediction of packet loss, which requires a model of the channel: with such a model we can determine whether a transmission succeeds or fails. First, rate adaptation algorithms were studied. These algorithms improve the performance of the communication, and the simulation program is therefore based on some of them. In parallel, measurements of the terrestrial channel were captured to build the model. Finally, a much more complete program simulated the behaviour of a transmission using the physical channel model, under some additional assumptions such as collisions. This yields a more realistic result, which was used to analyse theoretically the possibilities of a link between the terrestrial channel and the satellite channel to create a hybrid network.
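The abstract does not name the rate adaptation algorithms the simulator is built on, but ARF (Auto Rate Fallback) is a common choice in this kind of study. A minimal sketch of a success/failure simulation in that spirit follows; the rate set, SNR thresholds and channel parameters are illustrative assumptions, not values from the thesis.

```python
import random

RATES_MBPS = [6, 12, 24, 48, 54]      # illustrative 802.11a-style rate set
SNR_THRESH_DB = [5, 9, 15, 22, 25]    # hypothetical per-rate decoding thresholds

def channel_snr_db(mean_db=18.0, sigma_db=5.0):
    """Toy channel model: Gaussian shadowing around a mean SNR (assumed values)."""
    return random.gauss(mean_db, sigma_db)

def simulate_arf(n_packets=10_000, up_after=10, down_after=2):
    """ARF-style adaptation: step the rate up after `up_after` consecutive
    successes, down after `down_after` consecutive failures; a packet is
    delivered when the drawn SNR clears the current rate's threshold."""
    rate = ok_run = fail_run = delivered = 0
    for _ in range(n_packets):
        if channel_snr_db() >= SNR_THRESH_DB[rate]:
            delivered += 1
            ok_run, fail_run = ok_run + 1, 0
            if ok_run >= up_after and rate < len(RATES_MBPS) - 1:
                rate, ok_run = rate + 1, 0
        else:
            fail_run, ok_run = fail_run + 1, 0
            if fail_run >= down_after and rate > 0:
                rate, fail_run = rate - 1, 0
    return delivered / n_packets

print(f"packet delivery ratio: {simulate_arf():.3f}")
```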
Abstract:
A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detections are operated in the same array, which can be hardware-efficient. The all-swap lattice reduction algorithm (ASLR) is considered for the systolic design. ASLR is a variant of the LLL algorithm which processes all lattice basis vectors within one iteration. Lattice-reduction-aided linear detection based on the ASLR and LLL algorithms has very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
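The paper's systolic hardware design is not reproduced here, but the all-swap idea itself can be sketched in software: in each sweep, size-reduce every basis vector and then swap all disjoint adjacent pairs that violate the Lovász condition, instead of one pair per iteration as in textbook LLL. The Python below is a schematic serial rendering under that reading, not the authors' algorithm verbatim.

```python
import numpy as np

def gram_schmidt(B):
    """Gram-Schmidt orthogonalisation: returns B* and the mu coefficients."""
    n = len(B)
    Bs = B.astype(float).copy()
    mu = np.eye(n)
    for i in range(n):
        for j in range(i):
            mu[i, j] = B[i] @ Bs[j] / (Bs[j] @ Bs[j])
            Bs[i] -= mu[i, j] * Bs[j]
    return Bs, mu

def all_swap_lll(B, delta=0.75, max_sweeps=1000):
    """All-swap LLL sketch: per sweep, size-reduce every vector, then swap
    every disjoint adjacent pair that violates the Lovasz condition."""
    B = np.array(B, dtype=np.int64)
    n = len(B)
    for _ in range(max_sweeps):
        _, mu = gram_schmidt(B)
        for i in range(1, n):                    # size reduction
            for j in range(i - 1, -1, -1):
                q = int(round(mu[i, j]))
                if q:
                    B[i] -= q * B[j]
                    mu[i, :j + 1] -= q * mu[j, :j + 1]
        Bs, mu = gram_schmidt(B)
        norm2 = np.einsum('ij,ij->i', Bs, Bs)
        swapped, k = False, 1
        while k < n:                             # one all-swap pass
            if delta * norm2[k - 1] > norm2[k] + mu[k, k - 1] ** 2 * norm2[k - 1]:
                B[[k - 1, k]] = B[[k, k - 1]]
                swapped, k = True, k + 2         # keep swapped pairs disjoint
            else:
                k += 1
        if not swapped:                          # basis is delta-LLL-reduced
            return B
    return B

print(all_swap_lll([[201, 37], [1648, 297]]))
```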
Abstract:
Background: Attention to patients with acute minor illnesses requesting same-day consultation represents a major burden in primary care. The workload is assumed by general practitioners in many countries. A number of reports suggest that care to these patients may be provided, at least in part, by nurses. However, there is scarce information on the applicability of a program of nurse management for adult patients with acute minor illnesses over large areas. The aim of this study is to assess the effectiveness of a program of nurse algorithm-guided care for adult patients with acute minor illnesses requesting same-day consultation in primary care in a largely populated area. Methods: A cross-sectional study of all adult patients seeking same-day consultation for 16 common acute minor illnesses in a large geographical area with 284 primary care practices. Patients were included in a program of nurse case management using management algorithms. The main outcome measure was case resolution, defined as completion of the algorithm by the nurse without need of referral of the patient to the general practitioner. The secondary outcome measure was return to consultation, defined as requirement of a new consultation for the same reason as the first one, in primary care within a 7-day period. Results: During a two-year period (April 2009-April 2011), a total of 1,209,669 consultations were performed in the program. Case resolution was achieved by nurses in 62.5% of consultations. The remaining cases were referred to a general practitioner. Resolution rates ranged from 94.2% in patients with burns to 42% in patients with upper respiratory symptoms. None of the 16 minor illnesses had a resolution rate below 40%. Return to consultation during the 7-day period was low, at only 4.6%. Conclusions: A program of algorithm-guided care is effective for nurse case management of patients requesting same-day consultation for minor illnesses in primary care.
Abstract:
Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require memory proportional to the squared number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1,000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) 2352 processors at 2.1 GHz. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value below 0.05 on the real-life Crohn’s disease (CD) data. Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and could be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
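The abstract does not spell out the memory trick, but the generic single-step maxT idea can be sketched as follows: per permutation, stream over all tests and keep only the running maximum, so the permutation side of the computation stores one number per permutation rather than a permutations-by-tests matrix. The interface below (one callable per test statistic) is an illustrative assumption, not the MBMDR-3.0.3 code.

```python
import numpy as np

def maxT_adjusted(stat_fns, data, labels, n_perm=999, seed=0):
    """Single-step maxT sketch: the permutation null is the distribution of
    the maximum statistic over all tests, so only a running maximum is kept
    per permutation (memory on that side is independent of the number of
    tests; the observed statistics can also be pruned to top candidates)."""
    rng = np.random.default_rng(seed)
    observed = np.array([f(data, labels) for f in stat_fns])
    perm_max = np.empty(n_perm)
    for p in range(n_perm):
        shuffled = rng.permutation(labels)
        best = -np.inf
        for f in stat_fns:                 # stream over tests, keep only the max
            best = max(best, f(data, shuffled))
        perm_max[p] = best
    # adjusted p-value per test: share of permutation maxima at least as large
    return (1 + (perm_max[None, :] >= observed[:, None]).sum(axis=1)) / (n_perm + 1)
```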
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
The parameterized expectations algorithm (PEA) involves a long simulation and a nonlinear least squares (NLS) fit, both embedded in a loop. Both steps are natural candidates for parallelization. This note shows that parallelization can lead to important speedups for the PEA. I provide example code for a simple model that can serve as a template for parallelization of more interesting models, as well as a download link for an image of a bootable CD that allows creation of a cluster and execution of the example code in minutes, with no need to install any software.
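The note's own template targets an MPI cluster (hence the bootable CD image). As a rough stand-in, the same structure, a parallelized long simulation feeding a least-squares fit inside a fixed-point loop, can be sketched with Python's multiprocessing; the "model" below is a deliberately trivial placeholder, and all names and parameters are hypothetical.

```python
import numpy as np
from multiprocessing import Pool

def simulate_chunk(args):
    """Stand-in for the expensive simulation step: each worker simulates one
    chunk of periods under the current expectation coefficients beta and
    returns the regressor/target pairs that the fitting step needs."""
    beta, n_periods, seed = args
    rng = np.random.default_rng(seed)
    x = np.empty(n_periods)
    x[0] = 1.0
    eps = rng.normal(0.0, 0.1, n_periods)
    for t in range(1, n_periods):
        x[t] = beta[0] + beta[1] * x[t - 1] + eps[t]   # placeholder law of motion
    return x[:-1], x[1:]

def pea_step(beta, n_workers=4, periods_per_worker=25_000):
    """One PEA iteration: simulate in parallel, then fit (linear least squares
    here stands in for the NLS step of a real model)."""
    jobs = [(beta, periods_per_worker, seed) for seed in range(n_workers)]
    with Pool(n_workers) as pool:
        chunks = pool.map(simulate_chunk, jobs)        # the parallelized step
    X = np.concatenate([np.column_stack([np.ones_like(x), x]) for x, _ in chunks])
    y = np.concatenate([y for _, y in chunks])
    new_beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return new_beta

if __name__ == "__main__":
    beta = np.array([0.5, 0.5])
    for _ in range(20):                                # damped fixed-point loop
        beta = 0.5 * beta + 0.5 * pea_step(beta)
    print(beta)
```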
Abstract:
The purpose of this paper is to study the determinants of equilibrium in the market for daily funds. We use the EONIA panel database, which includes daily information on the lending rates applied by contributing commercial banks. The data clearly show an increase in both the time-series volatility and the cross-section dispersion of rates towards the end of the reserve maintenance period, and these increases are highly correlated. With respect to quantities, we find that both the volume of trade and the use of the standing facilities are larger at the end of the maintenance period. Our theoretical model shows how the operational framework of monetary policy causes a reduction in the elasticity of the supply of funds by banks throughout the reserve maintenance period. This reduction in elasticity, together with market segmentation and heterogeneity, is able to generate distributions for the interest rates and quantities traded with the same properties as in the data.
Abstract:
The paper provides a description and analysis of the Hodgskin section of Theories of Surplus Value and the general law section of the first version of Volume III of Capital. It then considers Part III of Volume III, the evolution of Marx's thought and various interpretations of his theory in the light of this analysis. It is suggested that Marx thought that the rate of profit must fall and even in the 1870s hoped to be able to provide a demonstration of this. However, the main conclusions are: 1. Marx's major attempt to show that the rate of profit must fall occurred in the general law section. 2. Part III does not contain a demonstration that the rate of profit must fall. 3. Marx was never able to demonstrate that the rate of profit must fall, and he was aware of this.
Abstract:
This paper analyzes the linkages between the credibility of a target zone regime, the volatility of the exchange rate, and the width of the band within which the exchange rate is allowed to fluctuate. These three concepts should be related, since the band width induces a trade-off between credibility and volatility: narrower bands should give less scope for the exchange rate to fluctuate, but may make agents perceive a larger probability of realignment, which by itself should increase the volatility of the exchange rate. We build a model where this trade-off is made explicit. The model is used to understand the reduction in volatility experienced by most EMS countries after their target zones were widened in August 1993. As a natural extension, the model also rationalizes the existence of non-official, implicit target zones (or fear of floating), suggested by some authors.
Abstract:
This paper analyses the theoretical relevance of the dynamics of growth for the debate on the observed positive correlation between per capita real income and real exchange rates. To this end, we develop a simple exogenous growth model in which the internal, external and intertemporal equilibrium conditions of a typical macroeconomic model are imposed, the last of these through the inclusion of a balanced growth path for the accumulation of foreign assets. The main result is that, under these conditions, the relationship defended by the Balassa-Samuelson hypothesis is no longer so straightforward. In our approach, this bilateral relationship depends on a parameter measuring thriftiness in the economy: the probability of ending up with a positive relationship between growth and real exchange rates, as classical economic theory predicts, is higher when the economy is able to maintain a minimum saving ratio. Moreover, given that our model considers a simple Keynesian consumption function, some explosive paths are also possible.
Abstract:
This paper examines, both descriptively and analytically, Marx's arguments for the falling rate of profit from the Hodgskin section of Theories of Surplus Value, the General Law section of the recently published Volume 33 of the Collected Works, and Chapter 3 of Volume III of Capital. The conclusions are as follows. First, Marx realised that his main attempt to give an intrinsic explanation of the falling rate of profit, which occurred in the General Law section, had failed, but he still hoped that he would be able to demonstrate it in the future. Second, the Hodgskin and General Law sections contain a number of subsidiary explanations, mostly related to resource scarcity, some of which are correct. Third, Part III of Volume III does not contain a demonstration of the falling rate of profit, but a description of the role of the falling rate of profit in capitalist development. Fourth, it also contains suppressed references to resource scarcity. And finally, in Chapter 3 of Volume III, Marx says that it is resource scarcity that causes the fall in the rate of profit described in Part III of the same volume. The key to all these conclusions is the careful analysis of the General Law section.
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
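For readers outside this literature, the "historical tendencies" can be stated as a worked equation in standard Marxian notation (textbook notation, not the paper's own model):

```latex
% c: constant capital, v: variable capital (the wage bill), s: surplus value.
\[
  r \;=\; \frac{s}{c+v} \;=\; \frac{s/v}{c/v + 1}.
\]
% If the rate of surplus value s/v stays bounded while the capital-wage bill
% ratio c/v rises without bound, the rate of profit r must fall: this is the
% tendency the paper's land-based mechanism generates under a constant
% subsistence wage.
```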
Abstract:
From the classical gold standard up to the current ERM2 arrangement of the European Union, target zones have been a widely used exchange rate regime in contemporary history. This paper presents a benchmark model that rationalizes the choice of target zones over the rest of the regimes: the fixed rate, the free float and the managed float. It is shown that the monetary authority may gain efficiency by reducing the volatility of both the exchange rate and the interest rate at the same time. Furthermore, the model is consistent with some known stylized facts in the empirical literature that previous models were not able to produce, namely the positive relation between the exchange rate and the interest rate differential, the degree of non-linearity of the function linking the exchange rate to fundamentals, and the shape of the exchange rate's stochastic distribution.
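These stylized facts are usually read against the canonical Krugman (1991) target-zone benchmark, sketched below as a point of reference; this is the standard model of the literature, not necessarily the paper's own specification.

```latex
% Monetary model: log exchange rate s, fundamentals f, semi-elasticity alpha.
\[
  s_t = f_t + \alpha \, \frac{E_t[\mathrm{d}s_t]}{\mathrm{d}t}, \qquad
  \mathrm{d}f_t = \sigma \, \mathrm{d}W_t .
\]
% Inside a symmetric band [-\bar{f}, \bar{f}] the solution is S-shaped:
\[
  s(f) = f + A \left( e^{\rho f} - e^{-\rho f} \right), \qquad
  \rho = \sqrt{2 / (\alpha \sigma^2)},
\]
% with A < 0 pinned down by smooth pasting, s'(\pm\bar{f}) = 0. The S-shape
% is exactly the non-linearity between the exchange rate and fundamentals
% mentioned in the abstract above.
```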