11 results for Code Division Multiple Access System
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was initially designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numerical integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity aspects, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under both fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
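To make the recursion concrete, the following minimal sketch applies an Euler-discretized Verhulst-type power update of the form p[k+1] = (1 + a)p[k] - a (gamma[k]/gamma_t) p[k], together with the classic Foschini-Miljanic update cited for comparison. All names, the link-gain matrix, and the parameter values are illustrative assumptions, not taken from the paper.

import numpy as np

def verhulst_update(p, gamma, gamma_t, a=0.5):
    # Verhulst (logistic) power-control step: Euler discretization of
    # dp/dt = a p (1 - p/K), with the carrying capacity K taken as the
    # power that would meet the target SINR, giving
    # p[k+1] = (1 + a) p - a (gamma / gamma_t) p.
    return (1.0 + a) * p - a * (gamma / gamma_t) * p

def foschini_miljanic_update(p, gamma, gamma_t):
    # Classic Foschini-Miljanic update, shown only for comparison.
    return (gamma_t / gamma) * p

def sinr(p, G, noise):
    # SINR per user for link-gain matrix G (G[i, j] = gain from user j
    # to the receiver of user i) and receiver noise power.
    signal = np.diag(G) * p
    interference = G @ p - signal + noise
    return signal / interference

# Toy 3-user example with a fixed, diagonally dominant link-gain matrix.
rng = np.random.default_rng(0)
G = np.abs(rng.normal(1.0, 0.3, (3, 3))) * 1e-3 + np.diag([0.1, 0.1, 0.1])
p = np.full(3, 1e-6)           # initial transmit powers (assumed, in watts)
gamma_t, noise = 5.0, 1e-6     # target SINR and noise power (assumed)
for _ in range(100):
    p = verhulst_update(p, sinr(p, G, noise), gamma_t, a=0.5)
print(p, sinr(p, G, noise))    # powers settle where the SINR reaches the target

At the fixed point the measured SINR equals the target, so the update leaves the power vector unchanged; the factor a plays the role of the fixed or adaptive convergence factor discussed in the abstract.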
Abstract:
In this work, a wide-ranging analysis of local search multiuser detection (LS-MUD) for direct sequence/code division multiple access (DS/CDMA) systems under multipath channels is carried out considering the performance-complexity trade-off. The robustness of the LS-MUD to variations in loading, E(b)/N(0), near-far effect, number of Rake receiver fingers, and errors in the channel coefficient estimates is verified. A comparative analysis of the bit error rate (BER) and complexity trade-off is carried out among LS, genetic algorithm (GA) and particle swarm optimization (PSO). Based on the deterministic behavior of the LS algorithm, simplifications of the cost function calculation are also proposed, yielding more efficient algorithms (simplified and combined LS-MUD versions) and creating new perspectives for MUD implementation. The computational complexity is expressed in terms of the number of operations required to converge. Our conclusions indicate that the simplified LS (s-LS) method is always more efficient, independent of the system conditions, achieving better performance with lower complexity than the other heuristic detectors. In addition, its deterministic strategy and absence of input parameters make the s-LS algorithm the most appropriate for the MUD problem. (C) 2008 Elsevier GmbH. All rights reserved.
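The cost-function simplification exploited by local search detectors can be illustrated with a unitary-Hamming-distance (1-opt) search over the standard synchronous-CDMA log-likelihood metric f(b) = 2 b^T z - b^T H b, where z is the amplitude-weighted matched-filter output and H the corresponding correlation matrix. The sketch below, including the incremental O(K) cost update per flip and the toy example, is an assumed illustration rather than the paper's algorithm.

import numpy as np

def ls_mud(z, H, b0):
    # 1-opt local search over +/-1 bit vectors, starting from b0
    # (e.g. matched-filter decisions).  Flipping bit k changes the
    # metric by delta_k = 4 (b_k ((H b)_k - z_k) - H_kk), so all K
    # candidate flips are evaluated cheaply from the maintained vector H b.
    b = b0.astype(float).copy()
    Hb = H @ b
    improved = True
    while improved:
        improved = False
        delta = 4.0 * (b * (Hb - z) - np.diag(H))
        k = int(np.argmax(delta))
        if delta[k] > 0:                 # accept the best improving flip
            Hb -= 2.0 * b[k] * H[:, k]   # O(K) update of H b
            b[k] = -b[k]
            improved = True
    return b.astype(int)

# Toy 4-user example: random length-16 spreading codes, unit amplitudes.
rng = np.random.default_rng(1)
S = rng.choice([-1.0, 1.0], size=(16, 4)) / 4.0
R = S.T @ S
b_true = rng.choice([-1.0, 1.0], size=4)
y = R @ b_true + 0.1 * rng.normal(size=4)     # matched-filter output
print(ls_mud(y, R, np.sign(y)), b_true.astype(int))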
Abstract:
This work proposes the use of evolutionary computation to jointly solve the multiuser channel estimation (MuChE) and detection problems at maximum likelihood, both in the context of direct sequence code division multiple access (DS/CDMA). The effectiveness of the proposed heuristic approach is demonstrated by comparing performance and complexity figures of merit with those obtained by traditional methods found in the literature. Simulation results for a genetic algorithm (GA) applied to MuChE and multiuser detection (MuD) in multipath DS/CDMA channels show that the proposed genetic algorithm multiuser channel estimation (GAMuChE) yields a normalized mean square error (nMSE) below 11% under slowly varying multipath fading channels, a large range of Doppler frequencies and medium system load, while exhibiting lower complexity than both maximum likelihood multiuser channel estimation (MLMuChE) and the gradient descent method (GrdDsc). A near-optimum multiuser detector based on the genetic algorithm (GAMuD), also proposed in this work, provides a significant reduction in computational complexity when compared to the optimum multiuser detector (OMuD). In addition, the complexity of the GAMuChE and GAMuD algorithms was (jointly) analyzed in terms of the number of operations necessary to reach convergence, and compared to other joint MuChE and MuD strategies. The joint GAMuChE-GAMuD scheme can be regarded as a promising alternative for implementing third-generation (3G) and fourth-generation (4G) wireless systems in the near future. Copyright (C) 2010 John Wiley & Sons, Ltd.
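As an illustration of the heuristic search underlying the detection stage, the sketch below runs a small genetic algorithm over +/-1 bit vectors that maximizes the synchronous-CDMA log-likelihood metric f(b) = 2 b^T z - b^T H b. The operators (tournament selection, uniform crossover, bit-flip mutation) and every parameter value are assumptions made for illustration; the paper's GAMuD/GAMuChE configuration, including the joint channel-estimation step, is not reproduced here.

import numpy as np

def ga_mud(z, H, pop_size=40, generations=60, p_mut=0.05, rng=None):
    # Genetic-algorithm multiuser detection over +/-1 bit vectors.
    rng = rng or np.random.default_rng()
    K = len(z)
    fitness = lambda b: 2.0 * (b @ z) - b @ H @ b
    pop = rng.choice([-1.0, 1.0], size=(pop_size, K))
    for _ in range(generations):
        fit = np.array([fitness(b) for b in pop])
        new_pop = [pop[np.argmax(fit)].copy()]            # elitism
        while len(new_pop) < pop_size:
            i = rng.integers(pop_size, size=2)            # binary tournaments
            j = rng.integers(pop_size, size=2)
            pa = pop[i[np.argmax(fit[i])]]
            pb = pop[j[np.argmax(fit[j])]]
            mask = rng.random(K) < 0.5                    # uniform crossover
            child = np.where(mask, pa, pb)
            flip = rng.random(K) < p_mut                  # bit-flip mutation
            new_pop.append(np.where(flip, -child, child))
        pop = np.array(new_pop)
    fit = np.array([fitness(b) for b in pop])
    return pop[np.argmax(fit)].astype(int)

# Toy 6-user example with random length-31 spreading codes.
rng = np.random.default_rng(2)
S = rng.choice([-1.0, 1.0], size=(31, 6)) / np.sqrt(31)
R = S.T @ S
b_true = rng.choice([-1.0, 1.0], size=6)
y = R @ b_true + 0.05 * rng.normal(size=6)
print(ga_mud(y, R, rng=rng), b_true.astype(int))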
Abstract:
This paper analyzes the complexity-performance trade-off of several heuristic near-optimum multiuser detection (MuD) approaches applied to the uplink of synchronous single/multiple-input multiple-output multicarrier code division multiple access (S/MIMO MC-CDMA) systems. Genetic algorithm (GA), short term tabu search (STTS) and reactive tabu search (RTS), simulated annealing (SA), particle swarm optimization (PSO), and 1-opt local search (1-LS) heuristic multiuser detection algorithms (Heur-MuDs) are analyzed in detail, using a single-objective antenna-diversity-aided optimization approach. Monte Carlo simulations show that, after convergence, the performances reached by all near-optimum Heur-MuDs are similar. However, the computational complexities may differ substantially, depending on the system operating conditions. Their complexities are carefully analyzed in order to obtain a general complexity-performance comparison framework and to show that unitary Hamming distance search MuD (uH-ds) approaches (1-LS, SA, RTS and STTS) reach the best convergence rates, and that among them, the 1-LS-MuD provides the best trade-off between implementation complexity and bit error rate (BER) performance.
Abstract:
We have designed, built, and tested an early prototype of a novel subxiphoid access system intended to facilitate epicardial electrophysiology, but with possible applications elsewhere in the body. The present version of the system consists of a commercially available insertion needle, a miniature pressure sensor and interconnect tubing, read-out electronics to monitor the pressures measured during the access procedure, and a host computer with user-interface software. The nominal resolution of the system is <0.1 mmHg, and it has deviations from linearity of <1%. During a pilot series of human clinical studies with this system, as well as in an auxiliary study done with an independent method, we observed that the pericardial space contained pressure-frequency components related to both the heart rate and respiratory rate, while the thorax contained components related only to the respiratory rate, a previously unobserved finding that could facilitate access to the pericardial space. We present and discuss the design principles, details of construction, and performance characteristics of this system.
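The separation of cardiac-rate and respiratory-rate pressure components amounts to a frequency-domain analysis of the recorded pressure trace. The sketch below is a hypothetical illustration on a synthetic signal (assumed 72 beats/min and 15 breaths/min components and a 100 Hz sampling rate); a real recording from the access system would replace the simulated trace.

import numpy as np

fs = 100.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)                 # 60 s record
pressure = (0.4 * np.sin(2 * np.pi * 1.2 * t)      # cardiac component, 72/min
            + 1.5 * np.sin(2 * np.pi * 0.25 * t)   # respiratory component, 15/min
            + 0.05 * np.random.default_rng(0).normal(size=t.size))  # sensor noise

spectrum = np.abs(np.fft.rfft(pressure - pressure.mean()))
freqs = np.fft.rfftfreq(pressure.size, 1 / fs)
band = freqs < 3.0                           # physiological band of interest
peaks = freqs[band][np.argsort(spectrum[band])[-2:]]
print(sorted(60 * peaks))                    # dominant rates, roughly [15, 72] per minute

A pericardial recording would show both peaks, whereas a thoracic recording, per the finding above, would show only the respiratory peak.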
Abstract:
A novel method of preparation of water-in-oil-in-micelle-containing-water (W/O/W(m)) multiple emulsions using the one-step emulsification method is reported. These multiple emulsions were normal (not temporary) and stable over a 60 day test period. Previously reported multiple emulsions prepared by the one-step method were abnormal systems that formed at the inversion point of a simple emulsion (where there is an incompatibility between the Ostwald and Bancroft theories, and typically these are O/W/O systems). Pseudoternary phase diagrams and bidimensional process-composition (phase inversion) maps were constructed to assist in process and composition optimization. The surfactants used were PEG40 hydrogenated castor oil and sorbitan oleate, and mineral and vegetable oils were investigated. Physicochemical characterization studies showed experimentally, for the first time, the significance of the ultralow surface tension point in multiple emulsion formation by the one-step method via phase inversion processes. Although the significance of ultralow surface tension has been speculated upon previously, to the best of our knowledge, this is the first experimental confirmation. The multiple emulsion system reported here was dependent not only upon the emulsification temperature, but also upon the component ratios; therefore both the emulsion phase inversion and the phase inversion temperature were considered to fully explain their formation. Accordingly, it is hypothesized that the formation of these normal multiple emulsions is not a result of a temporary incompatibility (at the inversion point) during simple emulsion preparation, as previously reported. Rather, these normal W/O/W(m) emulsions are a result of the simultaneous occurrence of catastrophic and transitional phase inversion processes. The formation of the primary emulsions (W/O) is in accordance with the Ostwald theory and the formation of the multiple emulsions (W/O/W(m)) is in agreement with the Bancroft theory.
Abstract:
The 2009 pandemic influenza A (H1N1) caused significant morbidity and mortality. Acute lung injury is the hallmark of the disease, but multiple organ system dysfunction can develop and lead to death. Therefore, we sought to investigate whether there was postmortem evidence of H1N1 presence and virus-induced organ injury in autopsy specimens. Five cases in which patients died of influenza A (H1N1) virus infection were studied. The lungs of all patients showed macroscopic and microscopic findings already described for H1N1 (consolidation, edema, hemorrhage, alveolar damage, hyaline membrane, and inflammation), and H1N1 viruses were present in alveolar cells in immunochemical studies. Acute tubular necrosis was present in all cases, but there was no evidence of direct virus-induced kidney injury. Nevertheless, H1N1 viruses were found in the cytoplasm of glomerular macrophages in the kidneys of 4 patients. Therefore, our data provide strong evidence that H1N1 presence is not restricted to the lungs.
Abstract:
The genus Eigenmannia comprises several species groups that display a surprising variety of diploid chromosome numbers and sex-determining systems. In this study, hypotheses regarding phylogenetic relationships and karyotype evolution were investigated using a combination of molecular and cytogenetic methods. Phylogenetic relationships were analyzed for 11 cytotypes based on sequences from five mitochondrial DNA regions. Parsimony-based character mapping of sex chromosomes confirms previous suggestions of multiple origins of sex chromosomes. Molecular cytogenetic analyses involved chromosome painting using probes derived from whole sex chromosomes from two taxa that were hybridized to metaphases of their respective sister cytotypes. These analyses showed that a multiple XY system evolved recently (<7 mya) by fusion. Furthermore, one of the chromosomes that fused to form the neo-Y chromosome is fused independently to another chromosome in the sister cytotype. This may constitute an efficient post-mating barrier and might imply a direct function of sex chromosomes in the speciation processes in Eigenmannia. The other chromosomal sex-determination system investigated is shown to have differentiated by an accumulation of heterochromatin on the X chromosome. This has occurred in the past 0.6 my, and is the most recent chromosomal sex-determining system described to date. These results show that the evolution of sex-determining systems can proceed very rapidly. Heredity (2011) 106, 391-400; doi:10.1038/hdy.2010.82; published online 23 June 2010
Abstract:
Clinicians working in the field of congenital and paediatric cardiology have long felt the need for a common diagnostic and therapeutic nomenclature and coding system with which to classify patients of all ages with congenital and acquired cardiac disease. A cohesive and comprehensive system of nomenclature, suitable for setting a global standard for multicentric analysis of outcomes and stratification of risk, has only recently emerged, namely, The International Paediatric and Congenital Cardiac Code. This review will give an historical perspective on the development of systems of nomenclature in general, and specifically with respect to the diagnosis and treatment of patients with paediatric and congenital cardiac disease. Finally, current and future efforts to merge such systems into the paperless environment of the electronic health or patient record on a global scale are briefly explored. On October 6, 2000, The International Nomenclature Committee for Pediatric and Congenital Heart Disease was established. In January, 2005, the International Nomenclature Committee was constituted in Canada as The International Society for Nomenclature of Paediatric and Congenital Heart Disease. This International Society now has three working groups. The Nomenclature Working Group developed The International Paediatric and Congenital Cardiac Code and will continue to maintain, expand, update, and preserve this International Code. It will also provide ready access to the International Code for the global paediatric and congenital cardiology and cardiac surgery communities, related disciplines, the healthcare industry, and governmental agencies, both electronically and in published form. The Definitions Working Group will write definitions for the terms in the International Paediatric and Congenital Cardiac Code, building on the previously published definitions from the Nomenclature Working Group. The Archiving Working Group, also known as The Congenital Heart Archiving Research Team, will link images and videos to the International Paediatric and Congenital Cardiac Code. The images and videos will be acquired from cardiac morphologic specimens and imaging modalities such as echocardiography, angiography, computerized axial tomography and magnetic resonance imaging, as well as intraoperative images and videos. Efforts are ongoing to expand the usage of The International Paediatric and Congenital Cardiac Code to other areas of global healthcare. Collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the representatives of the steering group responsible for the creation of the 11th revision of the International Classification of Diseases, administered by the World Health Organisation. Similar collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the International Health Terminology Standards Development Organisation, who are the owners of the Systematized Nomenclature of Medicine, or "SNOMED". The International Paediatric and Congenital Cardiac Code was created by specialists in the field to name and classify paediatric and congenital cardiac disease and its treatment. It is a comprehensive code that can be freely downloaded from the internet (http://www.IPCCC.net) and is already in use worldwide, particularly for international comparisons of outcomes.
The goal of this effort is to create strategies for stratification of risk and to improve healthcare for the individual patient. The collaboration with the World Health Organization, the International Health Terminology Standards Development Organisation, and the healthcare industry will lead to further enhancement of the International Code, and to its more universal use.
Abstract:
Neutron multiplicities for several targets and spallation products of proton-induced reactions in thin targets of interest to accelerator-driven systems, obtained with the CRISP code, are reported. This code is a Monte Carlo calculation that simulates the intranuclear cascade and the evaporation/fission competition processes. Results are compared with experimental data, and the agreement can be considered quite satisfactory over a very broad energy range of incident particles and different targets.
Abstract:
Direct analysis, with minimal sample pretreatment, of the antidepressant drugs fluoxetine, imipramine, desipramine, amitriptyline, and nortriptyline in biofluids was developed with a total run time of 8 min. The setup consists of two HPLC pumps, an injection valve, a capillary RAM-ADS-C18 pre-column and a capillary analytical C18 column connected by means of a six-port valve in backflush mode. Detection was performed with ESI-MS/MS and only 1 mu L of sample was injected. Validation was adequately carried out using FLU-d(5) as internal standard. Calibration curves were constructed over a linear range of 1-250 ng mL(-1) in plasma, with the limit of quantification (LOQ) determined as 1 ng mL(-1) for all the analytes. With the described approach it was possible to reach a quantified mass sensitivity of 0.3 pg for each analyte (equivalent to 1.1-1.3 fmol), translating to lower sample consumption (on the order of 10(3) times less sample than conventional methods). (C) 2008 Elsevier B.V. All rights reserved.
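The internal-standard calibration step described (1-250 ng mL(-1) range, FLU-d(5) as internal standard) can be sketched as an ordinary least-squares fit of analyte-to-internal-standard peak-area ratios against nominal concentrations. The numbers below are illustrative placeholders, not measured data.

import numpy as np

conc = np.array([1, 5, 10, 50, 100, 250], dtype=float)       # standards, ng/mL
area_ratio = np.array([0.021, 0.10, 0.21, 1.02, 2.05, 5.1])  # analyte / FLU-d5 (illustrative)

slope, intercept = np.polyfit(conc, area_ratio, 1)            # linear calibration
r2 = np.corrcoef(conc, area_ratio)[0, 1] ** 2

def quantify(ratio):
    # Back-calculate concentration (ng/mL) from a measured area ratio.
    return (ratio - intercept) / slope

print(f"slope={slope:.4f}  intercept={intercept:.4f}  r^2={r2:.4f}")
print(quantify(0.50))   # roughly 24 ng/mL for an area ratio of 0.50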