29 results for Forward error correcting code
Abstract:
This paper presents results of a verification test of a Direct Numerical Simulation code of mixed high-order accuracy using the method of manufactured solutions (MMS). This test is based on the formulation of an analytical solution for the Navier-Stokes equations modified by the addition of a source term. The present numerical code was aimed at simulating the temporal evolution of instability waves in a plane Poiseuille flow. The governing equations were solved in a vorticity-velocity formulation for a two-dimensional incompressible flow. The code employed two different numerical schemes. One used mixed high-order compact and non-compact finite differences from fourth to sixth order of accuracy. The other scheme used spectral methods instead of finite differences for the streamwise direction, which was periodic. In the present test, particular attention was paid to the boundary conditions of the physical problem of interest. Indeed, the verification procedure using MMS can be more demanding than the often-used comparison with Linear Stability Theory, mainly because the latter test pays no attention to the nonlinear terms. For the present verification test, it was possible to manufacture an analytical solution that reproduced some aspects of an instability wave in a nonlinear stage. Although the results of the verification by MMS for this mixed-order numerical scheme had to be interpreted with care, the test was very useful, as it gave confidence that the code was free of programming errors. Copyright (C) 2009 John Wiley & Sons, Ltd.
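The MMS procedure itself is generic: choose an analytical solution, substitute it into the governing equations to obtain a compensating source term, add that term to the solver, and confirm that the observed order of accuracy under grid refinement matches the formal order. A minimal Python sketch of the idea on a one-dimensional heat equation with a second-order scheme (the paper's high-order Navier-Stokes solver is not reproduced; all names and parameters are illustrative):

```python
import numpy as np

def mms_heat_demo(N, nu=0.1, T=0.1):
    """Solve u_t = nu*u_xx + S on [0, 1], where S is manufactured so that
    u_m = sin(pi*x)*exp(-t) is an exact solution of the modified PDE."""
    x = np.linspace(0.0, 1.0, N + 1)
    dx = x[1] - x[0]
    dt = 0.25 * dx**2 / nu              # explicit-Euler stability margin
    steps = int(np.ceil(T / dt))
    dt = T / steps

    u_m = lambda x, t: np.sin(np.pi * x) * np.exp(-t)
    # Source term: S = du_m/dt - nu*d2u_m/dx2 = (nu*pi^2 - 1)*u_m
    S = lambda x, t: (nu * np.pi**2 - 1.0) * u_m(x, t)

    u, t = u_m(x, 0.0), 0.0             # exact initial condition
    for _ in range(steps):
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        u = u + dt * (nu * lap + S(x, t))
        t += dt
        u[0], u[-1] = u_m(0.0, t), u_m(1.0, t)   # boundaries from u_m itself
    return np.max(np.abs(u - u_m(x, t)))         # discretization error

# Grid-refinement check: the observed order should approach the formal
# order of the scheme (2 for this centered stencil).
e1, e2 = mms_heat_demo(32), mms_heat_demo(64)
print("observed order of accuracy:", np.log2(e1 / e2))
```

As in the paper, the manufactured solution also supplies the boundary conditions, and a mismatch between observed and formal order is the signal that flags programming errors.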
Abstract:
Reconciliation can be divided into stages, each stage representing the performance of a mining operation, such as long-term estimation, short-term estimation, planning, mining, and mineral processing. The gold industry includes another stage, the budget, when the company informs the financial market of its annual production forecast. The division of reconciliation into stages increases the reliability of the annual budget informed by the mining companies, while also detecting and correcting the critical stages responsible for the overall estimation error through the optimization of sampling protocols and equipment. This paper develops and validates a new reconciliation model for the gold industry, based on correct sampling practices and the subdivision of reconciliation into stages, aiming for better grade estimates and more efficient control of the mining industry's processes, from resource estimation to final production.
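The arithmetic behind stage-wise reconciliation is a chain of ratios: each stage's factor compares its figure with the preceding stage's, so the overall mine-call factor is the product of the stage factors, which localizes the stage contributing most of the error. A minimal Python sketch with illustrative figures (not data from the paper):

```python
def stage_factors(stage_values):
    """Per-stage reconciliation factors: each factor is the ratio of a
    stage's figure to the preceding stage's; their product equals the
    overall (first-to-last) reconciliation factor."""
    names, values = zip(*stage_values)
    factors = [(f"{names[i]} / {names[i - 1]}", values[i] / values[i - 1])
               for i in range(1, len(values))]
    return factors, values[-1] / values[0]

# Illustrative contained-gold figures (kg) per stage -- not data from the paper
stages = [("long-term model", 1000.0), ("short-term model", 960.0),
          ("planning", 940.0), ("mining", 910.0), ("processing", 890.0)]
factors, overall = stage_factors(stages)
for name, f in factors:
    print(f"{name}: {f:.3f}")
print(f"overall reconciliation factor: {overall:.3f}")
```

The stage with the factor farthest from 1.0 is the one whose sampling protocol or equipment most needs attention.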
Abstract:
In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was originally designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numerical integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity aspects, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
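The resulting recursion is compact. A minimal Python sketch, assuming the update p(k+1) = p(k)[1 + alpha*(1 - gamma(k)/gamma*)] obtained by Euler discretization of the logistic equation dp/dt = alpha*p*(1 - p/p_eq), where p_eq is the power that meets the SINR target; the link gains, targets, and convergence factor below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
K = 5                                    # number of users
G = rng.uniform(0.005, 0.03, (K, K))     # cross-link gains (user j -> receiver i)
np.fill_diagonal(G, rng.uniform(1.0, 2.0, K))   # own-link gains
sigma2 = 1e-3                            # receiver noise power
gamma_t = np.full(K, 5.0)                # target SINRs
alpha = 0.5                              # convergence factor

def sinr(p):
    rx = G * p                           # rx[i, j]: power of user j at receiver i
    signal = np.diag(rx).copy()
    interference = rx.sum(axis=1) - signal + sigma2
    return signal / interference

p = np.full(K, 0.01)                     # initial power vector
for _ in range(200):
    # Verhulst/logistic update: p <- p*(1 + alpha*(1 - gamma/gamma_t));
    # powers are clipped to stay positive.
    p = np.maximum(p * (1.0 + alpha * (1.0 - sinr(p) / gamma_t)), 1e-12)
print("final SINRs:", np.round(sinr(p), 3))   # should sit at the target of 5.0
```

The update is fully distributed: each user needs only its own measured SINR and target, which is the practical appeal of this class of DPCA.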
Abstract:
In this work, a wide-ranging analysis of local search multiuser detection (LS-MUD) for direct sequence/code division multiple access (DS/CDMA) systems under multipath channels is carried out, considering the performance-complexity trade-off. The robustness of the LS-MUD to variations in loading, Eb/N0, near-far effect, number of fingers of the Rake receiver, and errors in the channel coefficient estimates is verified. A comparative analysis of the bit error rate (BER) and complexity trade-off is carried out among LS, genetic algorithm (GA), and particle swarm optimization (PSO). Based on the deterministic behavior of the LS algorithm, simplifications of the cost function calculation are also proposed, yielding more efficient algorithms (simplified and combined LS-MUD versions) and creating new perspectives for MUD implementation. The computational complexity is expressed in terms of the number of operations required for convergence. We conclude that the simplified LS (s-LS) method is always more efficient, independently of the system conditions, achieving better performance with lower complexity than the other heuristic detectors. In addition, the deterministic strategy and the absence of input parameters make the s-LS algorithm the most appropriate for the MUD problem. (C) 2008 Elsevier GmbH. All rights reserved.
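For a synchronous single-path system with unit amplitudes, 1-opt local search reduces to repeatedly flipping the bit whose flip most increases the maximum-likelihood metric, and the metric change per flip can be computed incrementally rather than from scratch, which is the kind of cost-function simplification the s-LS variant exploits. A minimal Python sketch under those simplifying assumptions (no multipath or Rake combining; names and parameters are illustrative):

```python
import numpy as np

def ls_mud(y, R, b0, max_iter=100):
    """1-opt local-search MUD: maximize Omega(b) = 2*b@y - b@R@b over
    b in {-1,+1}^K, one bit flip per move, with incremental cost updates."""
    b, Rb = b0.copy(), R @ b0
    for _ in range(max_iter):
        # Metric gain of flipping bit k: delta_k/4 = b_k*((Rb)_k - y_k) - R_kk
        gains = b * (Rb - y) - np.diag(R)
        k = np.argmax(gains)
        if gains[k] <= 0:
            break                        # local optimum: no flip improves Omega
        Rb = Rb - 2.0 * b[k] * R[:, k]   # update R@b without a matrix product
        b[k] = -b[k]
    return b

# Toy synchronous CDMA example with random spreading sequences
rng = np.random.default_rng(0)
K, N = 8, 16                             # users, spreading gain
S = rng.choice([-1.0, 1.0], (N, K)) / np.sqrt(N)
b_true = rng.choice([-1.0, 1.0], K)
r = S @ b_true + 0.1 * rng.standard_normal(N)   # received chip vector
y, R = S.T @ r, S.T @ S                  # matched-filter bank, correlation matrix
b_hat = ls_mud(y, R, np.sign(y))         # start from the conventional detector
print("bit errors:", int(np.sum(b_hat != b_true)))
```

Because each move costs O(K) instead of O(K^2), the incremental update is what makes the deterministic search cheap relative to GA and PSO.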
Abstract:
This work proposes the use of evolutionary computation to jointly solve the multiuser channel estimation (MuChE) and detection problems under the maximum-likelihood criterion, both in the context of direct sequence code division multiple access (DS/CDMA) systems. The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods from the literature. Simulation results for a genetic algorithm (GA) applied to MuChE and multiuser detection (MuD) over multipath DS/CDMA channels show that the proposed genetic algorithm multiuser channel estimation (GAMuChE) yields a normalized mean square estimation error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies, and medium system load, while exhibiting lower complexity than both maximum likelihood multiuser channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multiuser detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity compared to the optimum multiuser detector (OMuD). In addition, the complexities of the GAMuChE and GAMuD algorithms were jointly analyzed in terms of the number of operations required for convergence and compared to other joint MuChE-MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
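A GA detector of this kind evolves a population of candidate bit vectors under the same maximum-likelihood metric used above. A minimal Python sketch with generic operators (elitist truncation selection, uniform crossover, bit-flip mutation); the paper's actual operators, population sizes, and rates are not reproduced here:

```python
import numpy as np

def ga_mud(y, R, pop=20, gens=40, pm=0.05, rng=None):
    """Genetic-algorithm MUD sketch: maximize Omega(b) = 2*b@y - b@R@b."""
    rng = rng or np.random.default_rng()
    K = len(y)
    fit = lambda B: 2.0 * (B @ y) - np.einsum('ij,jk,ik->i', B, R, B)
    P = rng.choice([-1.0, 1.0], (pop, K))
    P[0] = np.sign(y)                    # seed with the conventional detector
    for _ in range(gens):
        P = P[np.argsort(fit(P))[::-1]][: pop // 2]    # keep the fitter half
        pa = rng.integers(0, pop // 2, pop // 2)       # random parent pairs
        pb = rng.integers(0, pop // 2, pop // 2)
        mask = rng.random((pop // 2, K)) < 0.5         # uniform crossover
        kids = np.where(mask, P[pa], P[pb])
        kids = kids * np.where(rng.random(kids.shape) < pm, -1.0, 1.0)  # mutate
        P = np.vstack([P, kids])
    return P[np.argmax(fit(P))]

# Toy synchronous CDMA setup (same construction as in the local-search sketch)
rng = np.random.default_rng(2)
S = rng.choice([-1.0, 1.0], (16, 8)) / 4.0    # spreading matrix, gain 16
b_true = rng.choice([-1.0, 1.0], 8)
y = S.T @ (S @ b_true + 0.1 * rng.standard_normal(16))
R = S.T @ S
print("bit errors:", int(np.sum(ga_mud(y, R, rng=rng) != b_true)))
```

Joint channel estimation would add a continuous chromosome for the path coefficients on top of this discrete search; that extension is the GAMuChE-GAMuD idea but is beyond a short sketch.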
Abstract:
This paper analyzes the complexity-performance trade-off of several heuristic near-optimum multiuser detection (MuD) approaches applied to the uplink of synchronous single/multiple-input multiple-output multicarrier code division multiple access (S/MIMO MC-CDMA) systems. Genetic algorithm (GA), short-term tabu search (STTS) and reactive tabu search (RTS), simulated annealing (SA), particle swarm optimization (PSO), and 1-opt local search (1-LS) heuristic multiuser detection algorithms (Heur-MuDs) are analyzed in detail, using a single-objective antenna-diversity-aided optimization approach. Monte Carlo simulations show that, after convergence, the performances reached by all near-optimum Heur-MuDs are similar. However, their computational complexities may differ substantially, depending on the system operating conditions. These complexities are carefully analyzed in order to obtain a general complexity-performance comparison framework and to show that unitary Hamming distance search MuD (uH-ds) approaches (1-LS, SA, RTS and STTS) achieve the best convergence rates and that, among them, the 1-LS-MuD provides the best trade-off between implementation complexity and bit error rate (BER) performance.
Abstract:
We develop a forward-looking version of the recursive dynamic MIT Emissions Prediction and Policy Analysis (EPPA) model and apply it to examine the economic implications of proposals in the US Congress to limit greenhouse gas (GHG) emissions. We find that shocks in the consumption path are smoothed out in the forward-looking model and that the lifetime welfare cost of GHG policy is lower than in the recursive model, since the forward-looking model can fully optimize over time. The forward-looking model also allows us to explore issues for which it is uniquely well suited, including revenue recycling and early-action crediting. We find that capital tax recycling reduces welfare costs more than labor tax recycling because of its long-term effect on economic growth. Also, there are substantial incentives for early-action credits; however, when spread over the full horizon of the policy they do not have a substantial effect on lifetime welfare costs.
Abstract:
This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make the inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we impose the inter-temporal efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference is that the macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required for solving the model as an optimization problem, such as dropping the full vintaging of the capital stock and including fewer explicit technological options, likely affect the results. Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy, where agents face high levels of uncertainty that likely leads to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented. (C) 2009 Elsevier B.V. All rights reserved.
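The consumption-shifting mechanism is visible in a deliberately stylized two-period example, far simpler than EPPA: a recursive, hand-to-mouth consumer absorbs a period-2 policy cost entirely in period 2, while a forward-looking consumer smooths it across both periods and incurs a lower lifetime welfare cost. All numbers below are illustrative:

```python
import numpy as np

beta, r = 0.96, 0.05              # discount factor and interest rate
y1, y2, tau = 100.0, 100.0, 20.0  # endowments and a period-2 policy cost

def utility(c1, c2):
    return np.log(c1) + beta * np.log(c2)

def recursive(y1, y2):
    # No foresight: consume income period by period
    return y1, y2

def forward_looking(y1, y2):
    # Perfect foresight: max log(c1) + beta*log(c2)
    # s.t. c1 + c2/(1+r) = y1 + y2/(1+r); closed-form solution.
    W = y1 + y2 / (1.0 + r)
    c1 = W / (1.0 + beta)
    return c1, (W - c1) * (1.0 + r)

for name, solve in [("recursive", recursive), ("forward-looking", forward_looking)]:
    u0 = utility(*solve(y1, y2))           # no policy
    u1 = utility(*solve(y1, y2 - tau))     # policy cost hits period 2
    # Welfare cost as an equivalent proportional consumption loss x:
    # u1 = u0 + (1 + beta)*log(1 - x)  =>  x = 1 - exp((u1 - u0)/(1 + beta))
    x = 1.0 - np.exp((u1 - u0) / (1.0 + beta))
    print(f"{name:16s} lifetime welfare cost: {100 * x:.2f}% of consumption")
```

The forward-looking consumer borrows against period 1 to spread the loss, so the equivalent-variation cost comes out lower, mirroring the comparison between the two model versions.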
Abstract:
This paper outlines approaches to developing the International Society of Physical and Rehabilitation Medicine (ISPRM) and addresses many current challenges. Most importantly, these approaches provide the basis for ISPRM to develop its leadership role within the field of Physical and Rehabilitation Medicine (PRM) and in relation to the World Health Organization (WHO) and the United Nations (UN) system at large. They also address a number of specific critiques of the current situation. A positioning of ISPRM within the world architecture of the UN and WHO systems, as well as the consideration and fostering of respective emerging regional PRM societies, is central to establishing networking connections at different levels of the world society. Yearly congresses, possibly in co-operation with a regional society and based on a defined regional rotation, are suggested. Thus, frustration with the current bidding system for a biennial congress and an intermediate meeting could be overcome. Yearly congresses are also an important step towards increasing the organization's funding base, and hence the possibility to expand the functions of ISPRM's central office. ISPRM's envisioned leadership role in the context of an international web of PRM journals complementing the formally defined official journal of ISPRM, regional societies and so forth, is an inclusive rather than exclusive approach that contributes to the development of PRM journals worldwide. An important prerequisite for the further development of ISPRM is the expansion and bureaucratization of its Central Office, adding professionalism and a systematic allocation of resources to the strengths of the voluntary engagement of individual PRM doctors.
Abstract:
Telemedicine might increase the speed of diagnosis of leprosy and reduce the development of disabilities. We compared the accuracy of diagnosis made by telemedicine with that made by in-person examination. The cases were patients with suspected leprosy at eight public health clinics in outlying areas of the city of Sao Paulo. The case history and clinical examination data, and at least two clinical images for each patient, were stored in a web-based system developed for teledermatology. After the examination in the public clinic, patients then attended a teaching hospital for an in-person examination. The benchmark was the clinical examination by two dermatologists at the university hospital. From August 2005 to April 2006, 142 suspected cases of leprosy were forwarded to the website by the doctors at the clinics. Of these, 36 cases were excluded. There was overall agreement in the diagnosis of leprosy in 74% of the 106 remaining cases. The sensitivity was 78% and the specificity was 31%. Although the specificity was low, the study suggests that telemedicine may be a useful low-cost method for obtaining second opinions in programmes to control leprosy.
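The reported accuracy figures follow from a standard 2x2 comparison against the benchmark diagnosis. A short Python sketch of the computation; the counts below are hypothetical, chosen only to roughly reproduce the reported rates, since the abstract does not give the underlying table:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Agreement, sensitivity, and specificity from a 2x2 confusion matrix."""
    total = tp + fp + fn + tn
    return {
        "agreement": (tp + tn) / total,      # both assessments concur
        "sensitivity": tp / (tp + fn),       # leprosy correctly identified
        "specificity": tn / (tn + fp),       # non-leprosy correctly identified
    }

# Hypothetical counts (106 cases) roughly matching the reported
# 74% agreement, 78% sensitivity, 31% specificity.
print(diagnostic_metrics(tp=75, fp=7, fn=21, tn=3))
```

The sketch only illustrates the definitions; the study's actual cell counts may differ.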
Abstract:
Background: Biochemical analysis of fluid is the primary laboratory approach in pleural effusion diagnosis. Standardization of the steps between collection and laboratory analysis is fundamental to maintaining the quality of the results. We evaluated the influence of temperature and storage time on sample stability. Methods: Pleural fluid from 30 patients was submitted to analyses of proteins, albumin, lactic dehydrogenase (LDH), cholesterol, triglycerides, and glucose. Aliquots were stored at 21 °C, 4 °C, and -20 °C, and concentrations were determined after 1, 2, 3, 4, 7, and 14 days. LDH isoenzymes were quantified in 7 random samples. Results: Due to the instability of isoenzymes 4 and 5, a decrease in LDH was observed within the first 24 h in samples maintained at -20 °C and after 2 days in samples maintained at 4 °C. Apart from glucose, all parameters were stable up to at least day 4 when stored at room temperature or 4 °C. Conclusions: Temperature and storage time are potential sources of preanalytical error in pleural fluid analyses, mainly considering the instability of glucose and LDH. The ideal procedure is to execute all the tests immediately after collection. However, most of the tests can be done on refrigerated samples, except for LDH analysis. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
Parenteral anticoagulation is a cornerstone in the management of venous and arterial thrombosis. Unfractionated heparin has a highly variable dose-response relationship, requiring frequent and troublesome laboratory follow-up. Because of these factors, low-molecular-weight heparin use has been increasing. Inadequate dosage has been pointed out as a potential problem, because the use of subjectively estimated weight instead of actual measured weight is common practice in the emergency department (ED). To evaluate the impact of inadequate weight estimation on enoxaparin dosage, we investigated the adequacy of anticoagulation of patients in a tertiary ED where subjective weight estimation is common practice. We obtained the estimated, informed, and measured weights of 28 patients in need of parenteral anticoagulation. Basal and steady-state (after the second subcutaneous dose of enoxaparin) anti-Xa activity was obtained as a measure of adequate anticoagulation. The patients were divided into 2 groups according to anticoagulation adequacy. Of the 28 patients enrolled, 75% (group 1, n = 21) received at least 0.9 mg/kg per dose BID and 25% (group 2, n = 7) received less than 0.9 mg/kg per dose BID of enoxaparin. Only 4 (14.3%) of all patients had anti-Xa activity below the inferior limit of the therapeutic range (<0.5 IU/mL), all of them from group 2. In conclusion, when weight estimation was used to determine the enoxaparin dosage, 25% of the patients were inadequately anticoagulated (anti-Xa activity <0.5 IU/mL) during the initial, crucial phase of treatment. (C) 2011 Elsevier Inc. All rights reserved.
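The adequacy criterion behind the grouping is a simple per-dose weight-based threshold. A minimal Python sketch of that check (illustrative only, not a clinical tool; the function name and example values are hypothetical):

```python
def enoxaparin_dose_check(dose_mg, measured_weight_kg, threshold_mg_per_kg=0.9):
    """Flag a potentially subtherapeutic BID dose relative to measured weight,
    using the study's grouping threshold of 0.9 mg/kg per dose."""
    mg_per_kg = dose_mg / measured_weight_kg
    return mg_per_kg, mg_per_kg >= threshold_mg_per_kg

# A dose prescribed from an underestimated weight falls below the threshold:
print(enoxaparin_dose_check(dose_mg=60, measured_weight_kg=75))  # (0.8, False)
```

Basing the calculation on measured rather than estimated weight is precisely the practice change the study argues for.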
Abstract:
Clinicians working in the field of congenital and paediatric cardiology have long felt the need for a common diagnostic and therapeutic nomenclature and coding system with which to classify patients of all ages with congenital and acquired cardiac disease. A cohesive and comprehensive system of nomenclature, suitable for setting a global standard for multicentric analysis of outcomes and stratification of risk, has only recently emerged, namely, The International Paediatric and Congenital Cardiac Code. This review will give a historical perspective on the development of systems of nomenclature in general, and specifically with respect to the diagnosis and treatment of patients with paediatric and congenital cardiac disease. Finally, current and future efforts to merge such systems into the paperless environment of the electronic health or patient record on a global scale are briefly explored. On October 6, 2000, The International Nomenclature Committee for Pediatric and Congenital Heart Disease was established. In January, 2005, the International Nomenclature Committee was constituted in Canada as The International Society for Nomenclature of Paediatric and Congenital Heart Disease. This International Society now has three working groups. The Nomenclature Working Group developed The International Paediatric and Congenital Cardiac Code and will continue to maintain, expand, update, and preserve this International Code. It will also provide ready access to the International Code for the global paediatric and congenital cardiology and cardiac surgery communities, related disciplines, the healthcare industry, and governmental agencies, both electronically and in published form. The Definitions Working Group will write definitions for the terms in the International Paediatric and Congenital Cardiac Code, building on the previously published definitions from the Nomenclature Working Group. The Archiving Working Group, also known as The Congenital Heart Archiving Research Team, will link images and videos to the International Paediatric and Congenital Cardiac Code. The images and videos will be acquired from cardiac morphologic specimens and imaging modalities such as echocardiography, angiography, computerized axial tomography and magnetic resonance imaging, as well as intraoperative images and videos. Efforts are ongoing to expand the usage of The International Paediatric and Congenital Cardiac Code to other areas of global healthcare. Collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the representatives of the steering group responsible for the creation of the 11th revision of the International Classification of Diseases, administered by the World Health Organisation. Similar collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the International Health Terminology Standards Development Organisation, who are the owners of the Systematized Nomenclature of Medicine, or "SNOMED". The International Paediatric and Congenital Cardiac Code was created by specialists in the field to name and classify paediatric and congenital cardiac disease and its treatment. It is a comprehensive code that can be freely downloaded from the internet (http://www.IPCCC.net) and is already in use worldwide, particularly for international comparisons of outcomes.
The goal of this effort is to create strategies for stratification of risk and to improve healthcare for the individual patient. The collaboration with the World Health Organization, the International Health Terminology Standards Development Organisation, and the healthcare industry will lead to further enhancement of the International Code, and to its more universal use.
Abstract:
Gene expression profiling by cDNA microarrays during murine thymus ontogeny has contributed to dissecting the large-scale molecular genetics of T cell maturation. Gene profiling, although useful for characterizing the thymus developmental phases and identifying the differentially expressed genes, does not permit the determination of possible interactions between genes. In order to reconstruct genetic interactions at the RNA level within thymocyte differentiation, a pair of microarrays containing a total of 1,576 cDNA sequences derived from the IMAGE MTB library was applied to samples of developing thymuses (14-17 days of gestation). The data were analyzed using the GeneNetwork program. Genes that were previously identified as differentially expressed during thymus ontogeny showed relationships with several other genes. The present method enabled the detection of gene nodes coding for proteins implicated in the calcium signaling pathway, such as Prrg2 and Stxbp3, and in protein transport toward the cell membrane, such as Gosr2. The results demonstrate the feasibility of reconstructing networks based on cDNA microarray gene expression determinations, contributing to a clearer understanding of the complex interactions between genes involved in thymus/thymocyte development.
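The abstract does not detail GeneNetwork's inference algorithm, but the general idea of expression-based network reconstruction can be illustrated minimally: connect gene pairs whose expression profiles across developmental stages are strongly correlated. In the Python sketch below the gene names come from the abstract, while the expression values and threshold are hypothetical:

```python
import numpy as np

def correlation_network(expr, genes, threshold=0.9):
    """Toy relevance-network reconstruction: an edge joins each pair of genes
    whose expression profiles have |Pearson correlation| >= threshold."""
    C = np.corrcoef(expr)                # genes x genes correlation matrix
    return [(genes[i], genes[j], round(C[i, j], 2))
            for i in range(len(genes)) for j in range(i + 1, len(genes))
            if abs(C[i, j]) >= threshold]

# Hypothetical profiles over four gestational days (E14-E17)
genes = ["Prrg2", "Stxbp3", "Gosr2", "control"]
expr = np.array([[1.0, 1.8, 2.6, 3.1],
                 [0.9, 1.7, 2.8, 3.0],
                 [3.2, 2.5, 1.6, 1.1],
                 [1.0, 1.1, 0.9, 1.0]])
print(correlation_network(expr, genes))
```

Real reconstruction methods must additionally control for indirect correlations and multiple testing, which is where dedicated tools such as GeneNetwork come in.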