931 results for RADIO FREQUENCY IDENTIFICATION SYSTEMS (RFI)


Relevance:

30.00%

Publisher:

Abstract:

The development of new drug delivery systems targeting the anterior segment of the eye may offer many advantages: increased drug bioavailability; penetration of drugs that cannot be formulated as solutions; constant and sustained drug release; higher local concentrations without systemic effects; more specific targeting of one tissue or cell type; and a reduced frequency of instillation, which improves patient compliance and comfort while reducing the side effects of frequent instillation. Several approaches are being developed that aim to increase corneal contact time through modified formulations or reservoir systems, or to increase tissue permeability using iontophoresis. To date, no ocular drug delivery system is ideal for all purposes. To maximize treatment efficacy, the specific pathological condition, the targeted intraocular tissue, and the location of the most severe pathology must be carefully evaluated before selecting the delivery method most suitable for each individual patient.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE OF REVIEW: In the present review, we provide the scientific rationale for applying systems biology to the development of vaccines, and particularly HIV vaccines; the predictive power of systems biology for the vaccine immunological profile; the correlation between systems biology and the immunological functional profiles of different candidate vaccines; and the value of systems biology in identifying the best-in-class candidate vaccines and in the decision to move into in-vivo evaluation in clinical trials. RECENT FINDINGS: Systems biology has recently been applied to the characterization of the protective yellow fever vaccine YF17D and of seasonal flu vaccines. This has been instrumental in identifying the components of the immune response that need to be stimulated by a vaccine in order to generate protective immunity. It is worth noting that a systems biology approach is currently being applied to identify correlates of immune protection of the RV144 Thai vaccine, the only known vaccine to have shown modest protection against HIV acquisition. SUMMARY: Systems biology represents a novel and powerful approach to predicting the vaccine immunological profile, identifying the protective components of the immune response, and helping to select the best-in-class vaccines to move into clinical development.

Relevance:

30.00%

Publisher:

Abstract:

Trenchless technologies are methods used for the construction and rehabilitation of underground utility pipes. These methods are growing increasingly popular due to their versatility and their potential to lower project costs. However, the use of trenchless technologies in Iowa and their effects on surrounding soil and nearby structures have not been adequately documented. Surveys of and interviews with professionals working in trenchless-related industries in Iowa were conducted, and the results were analyzed and compared with survey results from the United States as a whole. The surveys focused on method familiarity, observed pavement distress, reliability of trenchless methods, and future improvements. Results indicate that pavement distress and other trenchless-related issues remain an ongoing problem in the industry, with inadequate soil information and quality control/quality assurance (QC/QA) partially to blame. Fieldwork involving the observation of trenchless construction projects was undertaken to document current practices and applications of trenchless technology in the United States and Iowa. Field tests were performed in which push-in pressure cells measured the soil stresses induced by trenchless construction methods, and a program of laboratory soil testing was carried out in conjunction with the field testing. Soil testing showed that the installations were made in sandy clay or in well-graded sand with silt and gravel. Pipes were installed primarily using horizontal directional drilling, with pipe diameters from 3 to 12 inches. Pressure cell monitoring was conducted during the following construction phases: pilot bore, pre-reaming, and combined pipe pulling and reaming. The greatest increase in lateral earth pressure was 5.6 psi, detected 2.1 feet from the centerline of the bore during a pilot hole operation in sandy lean clay; measurements from 1.0 to 2.5 psi were common. Comparisons were made between the field measurements and analytical and finite element calculation methods.

Relevance:

30.00%

Publisher:

Abstract:

In the context of recent attempts to redefine the 'skin notation' concept, a position paper summarizing an international workshop on the topic stated that the skin notation should be a hazard indicator related to the degree of toxicity and to the potential for transdermal exposure of a chemical. Within the framework of developing a web-based tool integrating this concept, we constructed a database of 7101 agents for which a percutaneous permeation constant can be estimated (using molecular weight and the octanol-water partition coefficient), and for which at least one of the following toxicity indices could be retrieved: inhalation occupational exposure limit (n=644), oral lethal dose 50 (LD50, n=6708), cutaneous LD50 (n=1801), oral no-observed-adverse-effect level (NOAEL, n=1600), and cutaneous NOAEL (n=187). Data sources included the Registry of Toxic Effects of Chemical Substances (RTECS, MDL Information Systems, Inc.), PHYSPROP (Syracuse Research Corp.), and safety cards from the International Programme on Chemical Safety (IPCS). A hazard index, corresponding to the product of exposure duration and exposed skin surface that would yield an internal dose equal to a toxic reference dose, was calculated. This presentation provides a descriptive summary of the database, correlations between the toxicity indices, and an example of how the web tool will help industrial hygienists assess the possibility of a dermal risk using the hazard index.
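As an illustration of how such a hazard index can be computed, the sketch below estimates a permeation coefficient from molecular weight and log Kow using a Potts-Guy-type regression; the regression coefficients are approximate literature values, and the function names, units, and formula for the index are our own assumptions, not the authors' implementation:

```python
def permeation_coefficient(mw, log_kow):
    """Estimate of the skin permeation coefficient kp (cm/h) from molecular
    weight and the log octanol-water partition coefficient, using
    Potts-Guy-type regression coefficients (approximate literature values)."""
    return 10 ** (-2.72 + 0.71 * log_kow - 0.0061 * mw)

def hazard_index(mw, log_kow, conc_mg_per_cm3, reference_dose_mg):
    """Exposure-duration x skin-surface product (cm^2 . h) at which the
    absorbed dose kp * C * (A * t) reaches the toxic reference dose."""
    kp = permeation_coefficient(mw, log_kow)
    return reference_dose_mg / (kp * conc_mg_per_cm3)
```

Under this formulation, more lipophilic agents (higher log Kow) permeate faster, so a smaller surface-time product suffices to reach the reference dose.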

Relevance:

30.00%

Publisher:

Abstract:

The subject of this PhD thesis can be summarized by a famous paradox of evolutionary biology, the maintenance of polymorphism in the face of selection, and by a classical equation of theoretical population genetics, the change in gametic frequencies under selection and recombination. The frequency of gamete xi at generation (t + 1) is given by: !!!Truncated equation!!! This equation is used to generate data on selection at two, three, and four diallelic loci throughout this work. The first part focuses on the potential of heterozygote advantage to maintain genetic polymorphism. Since the classical definition applies only to a single diallelic locus, heterozygote advantage is redefined for multilocus systems on the basis of previous studies. Using five different definitions of heterozygote advantage, I show that it cannot be a general mechanism for the maintenance of polymorphism under selection. The second part, the study of the influence of undetected loci on evolutionary processes, is motivated by molecular work aimed at discovering the loci coding for a trait; in most such studies, some coding loci remain undetected. I show that undetected loci increase the probability of observing polymorphism under selection. Moreover, conclusions about the factors that maintain polymorphism can be misleading if not all loci are detected: only when every locus is detected can exact conclusions be drawn about the level of maintained polymorphism or the factors that maintain it. The third part concerns the expected release of additive genetic variance after a bottleneck for selected traits. A previous study showed that this release increases with the number of loci. I show that the additive variance released after a bottleneck does increase for selected traits (compared with neutral ones), but that the increase depends not on the number of loci but on the recombination rate, which acts in part by regenerating gametes lost through the bottleneck. Finally, the last part describes a package for the statistical software R that implements the equation above, iterating it for two, three, and four diallelic loci while varying the selection, recombination, and population-size parameters. This thesis shows that multilocus systems can yield results that differ from those of the single-locus systems that are the reference in population genetics; the package therefore opens interesting perspectives for theoretical population genetics.
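The multilocus recursion itself is truncated in this record, but its one-locus analogue is easy to state and iterate. The sketch below is a hedged illustration (not the thesis's R package) of how heterozygote advantage maintains polymorphism at a single diallelic locus:

```python
def next_allele_freq(p, w11, w12, w22):
    """One-generation recursion for the allele frequency under viability
    selection at a single diallelic locus -- the one-locus analogue of the
    (truncated) multilocus gametic-frequency equation in the text."""
    q = 1.0 - p
    w_bar = p * p * w11 + 2 * p * q * w12 + q * q * w22  # mean fitness
    return p * (p * w11 + q * w12) / w_bar

# With heterozygote advantage (w12 largest), both alleles are maintained:
p = 0.05
for _ in range(500):
    p = next_allele_freq(p, 0.9, 1.0, 0.9)
# p approaches the polymorphic equilibrium at 0.5
```

For symmetric overdominance the equilibrium frequency is 0.5; the thesis's point is precisely that such one-locus intuitions need not carry over to multilocus systems.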

Relevance:

30.00%

Publisher:

Abstract:

Background: Data on the frequency of extraintestinal manifestations (EIM) in Crohn's disease (CD) and ulcerative colitis (UC) are scarce. Goal: To evaluate the prevalence, forms, and risk factors of EIM in a large nationwide IBD cohort. Methods: Data from validated physician enrolment questionnaires of the adult Swiss IBD cohort were analyzed. Logistic regression models were used to identify EIM risk factors. Results: 950 patients were included, 580 (61%) with CD (mean age 43 years) and 370 (39%) with UC (mean age 49 years). Of these, 249 (43%) of the CD and 113 (31%) of the UC patients had one to five EIM. The following EIM were found: arthritis (CD 33%, UC 21%), aphthous stomatitis (CD 10%, UC 4%), uveitis (CD 6%, UC 4%), erythema nodosum (CD 6%, UC 3%), ankylosing spondylitis (CD 6%, UC 2%), psoriasis (CD 2%, UC 1%), pyoderma gangrenosum (CD and UC each 2%), and primary sclerosing cholangitis (CD 1%, UC 4%). Logistic regression in CD identified the following risk factors for ongoing EIM: active disease (OR 1.95, 95% CI 1.17-3.23, P=0.01) and a positive IBD family history (OR 1.77, 95% CI 1.07-2.92, P=0.025). No risk factors were identified in UC patients. Conclusions: EIM are a frequent problem in CD and UC patients. Active disease and a positive IBD family history are associated with ongoing EIM in CD patients. Determining EIM prevalence and the associated risk factors may increase awareness of this problem and thereby facilitate diagnosis and management.
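For readers unfamiliar with how figures such as "OR 1.95, 95% CI 1.17-3.23" arise, the sketch below converts a logistic-regression coefficient and its standard error into an odds ratio with a Wald confidence interval. This is a generic illustration, and the standard error used in the example is back-calculated from the reported interval, not taken from the study:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% Wald confidence interval from a logistic-regression
    coefficient beta and its standard error se: OR = exp(beta),
    CI = exp(beta -/+ z * se)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)
```

For example, `odds_ratio_ci(math.log(1.95), 0.259)` approximately reproduces the active-disease estimate quoted above.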

Relevance:

30.00%

Publisher:

Abstract:

In 2009, Cygnus X-3 (Cyg X-3) became the first microquasar to be detected in the GeV γ-ray regime, via the satellites Fermi and AGILE. The addition of this new band to the observational toolbox holds promise for building a more detailed understanding of the relativistic jets of this and other systems. We present a rich data set of radio, hard and soft X-ray, and γ-ray observations of Cyg X-3 made during a flaring episode in 2010 May. We detect a ~3 day softening and recovery of the X-ray emission, followed almost immediately by a ~1 Jy radio flare at 15 GHz, followed by a 4.3σ γ-ray flare (E > 100 MeV) ~1.5 days later. The radio sampling is sparse, but we use archival data to argue that it is unlikely the γ-ray flare was followed by any significant unobserved radio flares. In this case, the sequencing of the observed events is difficult to explain in a model in which the γ-ray emission is due to inverse Compton scattering of the companion star's radiation field. Our observations suggest that other mechanisms may also be responsible for γ-ray emission from Cyg X-3.

Relevance:

30.00%

Publisher:

Abstract:

A linkage between obesity-related phenotypes and the 2p21-23 locus has been reported previously. The urocortin (UCN) gene resides in this interval, and its protein product decreases appetite, suggesting that UCN may be a candidate gene for susceptibility to obesity. We localized the UCN gene by radiation hybrid mapping, and the surrounding markers were genotyped in a collection of French families. Evidence for linkage was shown between the marker D2S165 and leptin levels (LOD score, 1.34; P = 0.006) and between D2S2247 and the z-score of body mass index (LOD score, 1.829; P = 0.0019). The gene was screened for single nucleotide polymorphisms (SNPs) in 96 obese patients, and four new variants were identified: two in the promoter (-535 A-->G, -286 G-->A), one in intron 1 (+31 C-->G), and one in the 3'-untranslated region (+34 C-->T). Association studies in cohorts of 722 unrelated obese and 381 control subjects, together with transmission disequilibrium tests performed for the two frequent promoter polymorphisms in 120 families (894 individuals), showed no association between these variants and obesity, obesity-related phenotypes, or diabetes. Thus, our analyses of the genetic variations of the UCN gene suggest that, at least in French Caucasians, they do not represent a major cause of obesity.

Relevance:

30.00%

Publisher:

Abstract:

Neuronal networks in vitro are prominent systems for studying the development of connections in living neuronal networks and the interplay between connectivity, activity, and function. These cultured networks show rich spontaneous activity that evolves concurrently with the connectivity of the underlying network. In this work we monitor the development of neuronal cultures and record their activity using calcium fluorescence imaging. We use spectral analysis to characterize global dynamical and structural traits of the neuronal cultures. We first observe that the power spectrum can be used as a signature of the state of the network, for instance when inhibition is active or silent, as well as a measure of the network's connectivity strength. Second, the power spectrum identifies prominent developmental changes in the network, such as the GABAA switch. Third, analysis of the spatial distribution of the spectral density, in experiments with a controlled disintegration of the network through CNQX, an antagonist of the AMPA glutamate receptors of excitatory neurons, reveals the existence of communities of strongly connected, highly active neurons that display synchronous oscillations. Our work illustrates the value of spectral analysis for the study of in vitro networks and its potential use as a network-state indicator, for instance to compare healthy and diseased neuronal networks.
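As a minimal illustration of the spectral approach (not the authors' pipeline), the sketch below builds a synthetic fluorescence trace containing a 0.5 Hz synchronous oscillation plus noise and recovers that frequency as the peak of the power spectrum; the frame rate, recording length, and noise level are assumed values:

```python
import numpy as np

fs = 20.0                     # assumed imaging frame rate (Hz)
t = np.arange(0, 60, 1 / fs)  # 60 s synthetic recording
rng = np.random.default_rng(0)
# synthetic "network activity": a slow synchronous oscillation plus noise
trace = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=t.size)

# power spectrum via the FFT of the real-valued trace
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(trace)) ** 2
peak_freq = freqs[np.argmax(power[1:]) + 1]  # skip the DC bin
```

Here `peak_freq` recovers the 0.5 Hz oscillation; on real traces, shifts of such spectral peaks are the kind of signature the abstract describes as a network-state indicator.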

Relevance:

30.00%

Publisher:

Abstract:

Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of actions, of the consequences of inappropriate decisions, and of global change. Advocacy tools are needed to change this perception of risk, and quantitative methods have been developed to identify the distribution and underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure, and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Since each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies, and development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that such databases are likely to be biased, in particular by improvements in access to information; they are not exhaustive and give no information on exposure, intensity, or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP, and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, both to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases, and statistical analysis. It required a large amount of data (1.7 TB, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure, and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes, and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty, and governance in the pattern and trends of risk. The vulnerability factors turn out to depend on the type of hazard and, contrary to exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard itself. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru, based on satellite imagery and ground data collection, revealed a rapid glacier retreat and provided an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas. The characterization of risk at the global level and the identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remain as limitations. It is clear, however, that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.

Relevance:

30.00%

Publisher:

Abstract:

OBJECT: Reversible cerebral vasoconstriction syndrome (RCVS) is a clinical and radiological entity characterized by thunderclap headaches and a reversible segmental or multifocal vasoconstriction of the cerebral arteries, with or without focal neurological deficits or seizures. The purpose of this study was to determine risk factors for poor outcome in patients presenting with RCVS. METHODS: A retrospective multicenter review of invasive and non-invasive neurovascular imaging between January 2006 and January 2011 identified 10 patients meeting the criteria for reversible segmental vasoconstriction syndrome. Demographic data, vascular risk factors, and the evolution of each patient were analyzed. RESULTS: Seven of the ten patients were female, with a mean age of 46 years. In four patients, no causative factor was found. Two patients presented with RCVS in the post-partum period, between the first and third week after delivery. The remaining four cases were drug-induced RCVS, mainly from vasoactive drugs: cannabis was the causative factor in two patients, sumatriptan in one, and cyclosporine in one. The mean duration of clinical follow-up was 10.2 months (range 0-28 months). Two patients had neurological sequelae: one retained a dysphasia and the other a homonymous lateral hemianopia. We found no significant difference in evolution between secondary and idiopathic RCVS. The only two factors that correlated with clinical outcome were neurological status at admission and the presence of intraparenchymal abnormalities (ischemic stroke, hematoma) on brain imaging. CONCLUSIONS: Fulminant vasoconstriction resulting in progressive symptoms or death has been reported only exceptionally. Physicians should remember that such an evolution can occur and anticipate it by identifying the factors of poor prognosis (neurological status at admission and the presence of intraparenchymal abnormalities).

Relevance:

30.00%

Publisher:

Abstract:

Cognitive radio is a wireless technology aimed at improving the efficiency of radio-electric spectrum use, thus facilitating a reduction in the load on the free frequency bands. Cognitive radio networks can scan the spectrum and adapt their parameters to operate in the unoccupied bands. To avoid interfering with licensed users operating on a given channel, the networks need to be highly sensitive, which is achieved by using cooperative sensing methods. Current cooperative sensing methods are not robust enough against occasional or continuous attacks. This article outlines a Group Fusion method that takes into account the behavior of users over the short and long term. In fusing the data, the method gives more weight to user groups that are more unanimous in their decisions. Simulations have been performed in a dynamic environment with interferences. Results show that when attackers are present (whether reiterative or sporadic), the proposed Group Fusion method has superior sensing capability to other methods.
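The abstract does not give the fusion rule in detail; the sketch below is one plausible reading, in which each group casts a majority vote and groups that are more unanimous receive more weight. The function name and the specific unanimity weighting are our assumptions, not the article's algorithm:

```python
import numpy as np

def group_fusion(decisions, groups):
    """Fuse binary sensing decisions (1 = channel occupied) by weighting
    each group's majority vote by its unanimity, scaled to [0, 1]."""
    votes, weights = [], []
    for g in groups:
        d = decisions[g]
        m = d.mean()
        votes.append(1.0 if m >= 0.5 else 0.0)
        weights.append(abs(m - 0.5) * 2)  # 1 = unanimous, 0 = evenly split
    weights = np.array(weights)
    if weights.sum() == 0:                # every group split: fall back
        return int(round(decisions.mean()))
    return int(np.average(votes, weights=weights) >= 0.5)
```

With this rule, a unanimous group of honest sensors outweighs a divided group containing sporadic attackers, which is the qualitative behavior the article describes.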

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a Bayesian approach to the design of transmit prefiltering matrices in closed-loop schemes robust to channel estimation errors. The algorithms are derived for a multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) system. Two different optimization criteria are analyzed: the minimization of the mean square error and the minimization of the bit error rate. In both cases, the transmitter design is based on the singular value decomposition (SVD) of the conditional mean of the channel response, given the channel estimate. The performance of the proposed algorithms is analyzed, and their relationship with existing algorithms is indicated. As with other previously proposed solutions, the minimum bit error rate algorithm converges to the open-loop transmission scheme for very poor CSI estimates.
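A minimal sketch of the SVD step, using a random matrix as a stand-in for the conditional mean of the channel (the Bayesian estimation step itself is omitted): transmitting along the right singular vectors and receiving along the left ones diagonalizes the effective channel.

```python
import numpy as np

rng = np.random.default_rng(1)
nr, nt = 4, 4
# Stand-in for the conditional mean of the channel given its estimate
# (here a random Rayleigh-fading draw, not the paper's Bayesian estimator).
H_hat = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)

# SVD-based design: transmit prefilter F from the right singular vectors,
# receive filter G from the left ones.
U, s, Vh = np.linalg.svd(H_hat)
F = Vh.conj().T        # transmit prefilter
G = U.conj().T         # receive filter
H_eff = G @ H_hat @ F  # diagonal: parallel, non-interfering subchannels
```

The power and bit allocation across the resulting subchannels is where the paper's two criteria (MSE versus bit error rate) would differ; that step is not shown here.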

Relevance:

30.00%

Publisher:

Abstract:

Many engineering problems that can be formulated as constrained optimization problems result in solutions given by a waterfilling structure; the classical example is the capacity-achieving solution for a frequency-selective channel. For simple waterfilling solutions with a single waterlevel and a single constraint (typically, a power constraint), some algorithms have been proposed in the literature to compute the solutions numerically. However, some other optimization problems result in significantly more complicated waterfilling solutions that include multiple waterlevels and multiple constraints. For such cases, it may still be possible to obtain practical algorithms to evaluate the solutions numerically, but only after a painstaking inspection of the specific waterfilling structure. In addition, a unified view of the different types of waterfilling solutions and the corresponding practical algorithms is missing.

The purpose of this paper is twofold. On the one hand, it overviews the waterfilling results existing in the literature from a unified viewpoint. On the other hand, it bridges the gap between a wide family of waterfilling solutions and their efficient implementation in practice; to be more precise, it provides a practical algorithm to evaluate numerically a general waterfilling solution, which includes the currently existing waterfilling solutions and others that may appear in future problems.
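For the simple single-waterlevel, single-power-constraint case mentioned above, a standard numerical approach is bisection on the waterlevel. The sketch below is a generic implementation of that classical case, not the paper's general multi-constraint algorithm:

```python
import numpy as np

def waterfill(gains, power):
    """Classical waterfilling: maximize sum_i log(1 + g_i * p_i) subject to
    sum_i p_i = power, p_i >= 0. The solution is p_i = max(mu - 1/g_i, 0),
    with the waterlevel mu found by bisection."""
    lo, hi = 0.0, power + max(1.0 / g for g in gains)  # mu is bracketed here
    for _ in range(100):
        mu = (lo + hi) / 2
        p = np.maximum(mu - 1.0 / np.asarray(gains), 0.0)
        if p.sum() > power:
            hi = mu   # waterlevel too high: total power exceeded
        else:
            lo = mu   # waterlevel too low: budget not used up
    return p

# stronger subchannels (higher gain) receive more power:
p = waterfill([1.0, 0.5, 0.1], 1.0)
```

The multi-waterlevel, multi-constraint solutions surveyed in the paper generalize exactly this structure, which is why a unified evaluation algorithm is possible.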

Relevance:

30.00%

Publisher:

Abstract:

A particular property of the matched desired impulse response receiver is introduced in this paper, namely, that full exploitation of the diversity is obtained with multiple beamformers when the channel is spatially and temporally dispersive. This particularity makes the receiver especially suitable for mobile and underwater communications. The new structure provides better performance than conventional and weighted VRAKE receivers, and a diversity gain with no need for additional radio frequency equipment. The baseband hardware needed for this new receiver may be obtained through reconfigurability of the RAKE architectures available at the base station. The proposed receiver is tested through simulations assuming the UTRA frequency-division-duplexing mode.
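As a toy illustration of the diversity gain at stake (not the proposed receiver), the sketch below compares the average SNR of a single branch with that of maximal-ratio combining across four independent Rayleigh-fading branches; the branch count and fading model are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)
n_branches, trials = 4, 2000
# independent Rayleigh-fading branch gains with unit average power per branch
h = (rng.normal(size=(trials, n_branches))
     + 1j * rng.normal(size=(trials, n_branches))) / np.sqrt(2)

snr_single = np.abs(h[:, 0]) ** 2         # one branch only
snr_mrc = np.sum(np.abs(h) ** 2, axis=1)  # MRC adds the branch SNRs
```

Beyond the factor-of-four average gain, combining also sharply reduces the probability of a deep fade, which is the practical meaning of diversity gain.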