Abstract:
Wireless “MIMO” systems, employing multiple transmit and receive antennas, promise a significant increase in channel capacity, while orthogonal frequency-division multiplexing (OFDM) is attracting a good deal of attention due to its robustness to multipath fading. Thus, the combination of both techniques is an attractive proposition for radio transmission. The goal of this paper is the description and analysis of a novel pilot-aided estimator of multipath block-fading channels. Typical models leading to estimation algorithms assume the number of multipath components and delays to be constant (and often known), while their amplitudes are allowed to vary with time. Our estimator is focused instead on the more realistic assumption that the number of channel taps is also unknown and varies with time following a known probabilistic model. The estimation problem arising from these assumptions is solved using Random-Set Theory (RST), whereby one regards the multipath-channel response as a single set-valued random entity. Within this framework, Bayesian recursive equations determine the evolution with time of the channel estimator. Due to the lack of a closed form for the solution of the Bayesian equations, a (Rao–Blackwellized) particle filter (RBPF) implementation of the channel estimator is advocated. Since the resulting estimator exhibits a complexity which grows exponentially with the number of multipath components, a simplified version is also introduced. Simulation results describing the performance of our channel estimator demonstrate its effectiveness.
Abstract:
Background: Association of a mood stabiliser and an antipsychotic medication is indicated in psychotic mania, but specific guidelines for the treatment of a first episode of psychotic mania are needed. Aims: To compare the safety and efficacy profiles of chlorpromazine and olanzapine augmentation of lithium treatment in a first episode of psychotic mania. Methods: A total of 83 patients were randomised to either lithium + chlorpromazine or lithium + olanzapine in an 8-week trial. Data were collected on side effects, vital signs and weight modifications, as well as on clinical variables. Results: There were no differences in the safety profiles of the two medications, but patients in the olanzapine group were significantly more likely to have reached mania remission criteria after 8 weeks. Mixed-effects model repeated-measures analysis of variance showed that patients in the olanzapine group reached mania remission significantly earlier than those in the chlorpromazine group. Conclusions: These results suggest that while olanzapine and chlorpromazine have a similar safety profile in a cohort of patients with a first episode of psychotic mania, the former has greater efficacy on manic symptoms. On this basis, it may be a better choice for such conditions.
Abstract:
Introduction: Responses to external stimuli are typically investigated by averaging peri-stimulus electroencephalography (EEG) epochs in order to derive event-related potentials (ERPs) across the electrode montage, under the assumption that signals related to the external stimulus are fixed in time across trials. We demonstrate the applicability of a single-trial model based on patterns of scalp topographies (De Lucia et al, 2007) that can be used for ERP analysis at the single-subject level. The model is able to classify new trials (or groups of trials) with minimal a priori hypotheses, using information derived from a training dataset. The features used for the classification (the topography of responses and their latency) can be neurophysiologically interpreted, because a difference in scalp topography indicates a different configuration of brain generators. Above-chance classification accuracy on test datasets implicitly demonstrates the suitability of this model for EEG data. Methods: The data analyzed in this study were acquired from two separate visual evoked potential (VEP) experiments. The first entailed passive presentation of checkerboard stimuli to each of the four visual quadrants (hereafter, "Checkerboard Experiment") (Plomp et al, submitted). The second entailed active discrimination of novel versus repeated line drawings of common objects (hereafter, "Priming Experiment") (Murray et al, 2004). Four subjects per experiment were analyzed, using approx. 200 trials per experimental condition. These trials were randomly separated into training (90%) and testing (10%) datasets in 10 independent shuffles. In order to perform the ERP analysis, we estimated the statistical distribution of voltage topographies by a Mixture of Gaussians (MofGs), which reduces our original dataset to a small number of representative voltage topographies.
We then evaluated statistically the degree of presence of these template maps across trials, and whether and when this differed across experimental conditions. Based on these differences, single trials or sets of a few single trials were classified as belonging to one or the other experimental condition. Classification performance was assessed using the Receiver Operating Characteristic (ROC) curve. Results: For the Checkerboard Experiment, contrasts entailed left vs. right visual field presentations for upper and lower quadrants, separately. The average posterior probabilities, indicating the presence of the computed template maps in time and across trials, revealed significant differences starting at ~60-70 ms post-stimulus. The average ROC curve area across all four subjects was 0.80 and 0.85 for upper and lower quadrants, respectively, and was in all cases significantly higher than chance (unpaired t-test, p<0.0001). In the Priming Experiment, we contrasted initial versus repeated presentations of visual object stimuli. Their posterior probabilities revealed significant differences, which started at 250 ms post-stimulus onset. The classification accuracy rates with single-trial test data were at chance level. We therefore considered sub-averages based on five single trials. We found that for three out of four subjects, classification rates were significantly above chance level (unpaired t-test, p<0.0001). Conclusions: The main advantage of the present approach is that it is based on topographic features that are readily interpretable along neurophysiologic lines. As these maps were previously normalized by the overall strength of the field potential on the scalp, a change in their presence across trials and between conditions necessarily reflects a change in the underlying generator configurations. The temporal periods of statistical difference between conditions were estimated for each training dataset for ten shuffles of the data.
Across the ten shuffles and in both experiments, we observed a high level of consistency in the temporal periods over which the two conditions differed. With this method we are able to analyze ERPs at the single-subject level, providing a novel tool for comparing normal electrophysiological responses with single cases that cannot be considered part of any cohort of subjects. This aspect promises to have a strong impact on both basic and clinical research.
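A minimal numerical sketch of the pipeline this abstract describes — fit a Mixture of Gaussians to normalized voltage topographies, summarize each condition by the mean posterior presence of the template maps, and classify held-out trials by the nearer signature — can be written as follows. The montage size, templates, noise level, and trial counts are all invented; this is an illustration of the idea, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for VEP data: each trial is one topography vector over
# a hypothetical 8-channel montage, drawn near one of two template maps.
n_ch = 8
tmpl = rng.normal(size=(2, n_ch))                 # one template per condition
trials = [tmpl[c] + 0.4 * rng.normal(size=(200, n_ch)) for c in (0, 1)]

def unit_norm(x):
    # normalize each map by its overall field strength, as the abstract describes
    return x / np.linalg.norm(x, axis=1, keepdims=True)

train = unit_norm(np.vstack([t[:180] for t in trials]))
train_lab = np.repeat([0, 1], 180)
test = unit_norm(np.vstack([t[180:] for t in trials]))
test_lab = np.repeat([0, 1], 20)

def resp(Xd, mu, var, pi):
    # posterior responsibilities under an isotropic Mixture of Gaussians
    d2 = ((Xd[:, None, :] - mu[None]) ** 2).sum(-1)
    lp = np.log(pi) - 0.5 * d2 / var - 0.5 * Xd.shape[1] * np.log(var)
    r = np.exp(lp - lp.max(1, keepdims=True))
    return r / r.sum(1, keepdims=True)

# EM for K = 2 template maps, initialized from one trial of each condition
mu = np.vstack([train[0], train[-1]])
var = np.ones(2)
pi = np.full(2, 0.5)
for _ in range(50):
    r = resp(train, mu, var, pi)
    nk = r.sum(0)
    mu = (r.T @ train) / nk[:, None]
    var = (r * ((train[:, None, :] - mu[None]) ** 2).sum(-1)).sum(0) / (nk * n_ch)
    pi = nk / len(train)

# Per-condition "signature": mean posterior presence of each template map
sig = np.stack([resp(train, mu, var, pi)[train_lab == c].mean(0) for c in (0, 1)])

# Classify held-out trials by the nearer condition signature
rt = resp(test, mu, var, pi)
pred = np.argmin(((rt[:, None, :] - sig[None]) ** 2).sum(-1), axis=1)
acc = (pred == test_lab).mean()
```

With well-separated synthetic templates this toy classifier performs near ceiling; on real single-trial EEG the abstract reports chance-level single-trial accuracy in the Priming Experiment, which is why sub-averages of five trials were used there.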
Abstract:
Clarithromycin was compared with clindamycin for single-dose prophylaxis of streptococcal endocarditis in rats. With human-like kinetics of the two antibiotics, endocarditis was prevented in animals challenged with both small and large bacterial inocula. Clarithromycin was marginally superior to clindamycin against small inocula. Clarithromycin may be considered for endocarditis chemoprophylaxis in humans.
Varicella Zoster Virus CNS disease in hematopoietic cell transplantation: A single center experience
Abstract:
Background: Varicella Zoster Virus (VZV) can lead to serious complications in Hematopoietic Cell Transplant (HCT) recipients. Central nervous system (CNS) VZV can be one of the most devastating infections in transplant recipients, yet little is known about this rare disease. Objectives: To describe CNS VZV in the post-transplant period and to define potential risk factors in the HCT population. Methods: We reviewed the course of all patients who received a first HCT at the Fred Hutchinson Cancer Research Center (FHCRC) in Seattle, WA from 1/1996 through 12/2007. Data were collected retrospectively using the Long-Term Follow-Up database, which includes on-site examinations, outside records, laboratory tests, and yearly questionnaires. Patients were classified as CNS VZV if they had laboratory confirmation of VZV in the cerebrospinal fluid (CSF), or had zoster with associated clinical and laboratory findings consistent with CNS disease. Results: A total of six patients developed VZV CNS disease during the evaluation period (Table 1). Diagnosis was confirmed in 3/6 by detection of VZV in CSF by PCR. All other patients had a clinical diagnosis based on the presence of CNS symptoms, zoster, lymphocytic pleocytosis, and response to IV acyclovir. Patients who developed CNS disease had a mean age of 42 years (range 34-51) at time of transplant. CNS disease developed at a mean of 9 months post-transplantation (range 0.5-24 months), and severity varied, ranging from meningitis (3/6) to encephalitis/myelitis (3/6). All had active graft-versus-host disease (GVHD) and all were being treated with immunosuppressive therapy at time of diagnosis. Fever and headache were the most common symptoms, but patients who developed focal CNS findings or seizures (3/6) had a more complicated clinical course. While most patients presented with classic VZV/zoster skin lesions, 2/6 patients had no dermatologic findings associated with their presentation.
Four (66%) of the patients who developed VZV CNS disease died, two of them from VZV-related complications despite aggressive antiviral therapy. Conclusions: In this cohort of HCT patients, VZV CNS disease was a rare complication. Mortality due to CNS VZV is high, particularly in patients who develop focal neurologic findings or seizures. Even in the absence of skin lesions, VZV CNS disease should be considered in patients who develop fevers and neurologic symptoms.
Abstract:
Myocardial tagging has been shown to be a useful magnetic resonance modality for the assessment and quantification of local myocardial function. Many myocardial tagging techniques suffer from a rapid fading of the tags, restricting their application mainly to the systolic phases of the cardiac cycle. However, left ventricular diastolic dysfunction has been increasingly appreciated as a major cause of heart failure. Subtraction-based slice-following CSPAMM myocardial tagging has been shown to overcome limitations such as fading of the tags. Remaining impediments to this technique, however, are extensive scanning times (approximately 10 min), the requirement of repeated breath-holds using a coached breathing pattern, and the enhanced sensitivity to artifacts related to poor patient compliance or inconsistent depths of end-expiratory breath-holds. We therefore propose a combination of slice-following CSPAMM myocardial tagging with a segmented EPI imaging sequence. Together with an optimized RF excitation scheme, this makes it possible to acquire as many as 20 systolic and diastolic grid-tagged images per cardiac cycle with high tagging contrast during a short period of sustained respiration.
Abstract:
Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This dissertation defends a version of scientific realism, called causal realism, in the context of particle physics. I start by introducing the central theses and arguments in the recent philosophical debate on scientific realism (chapter 1), with a special focus on an important presupposition of the debate, namely common sense realism. Chapter 2 then discusses entity realism, which introduces a crucial element into the debate by emphasizing the importance of experiments in defending scientific realism. Most of the chapter is concerned with Ian Hacking's position, but I also argue that Nancy Cartwright's version of entity realism is ultimately preferable as a basis for further development. In chapter 3, I take a step back and consider the question whether the realism debate is worth pursuing at all. Arthur Fine has given a negative answer to that question, proposing his natural ontological attitude as an alternative to both realism and antirealism. I argue that the debate (in particular the realist side of it) is in fact less vicious than Fine presents it. The second part of my work (chapters 4-6) develops, illustrates and defends causal realism. The key idea is that inference to the best explanation is reliable in some cases, but not in others. Chapter 4 characterizes the difference between these two kinds of cases in terms of three criteria which distinguish causal from theoretical warrant. In order to flesh out this distinction, chapter 5 then applies it to a concrete case from the history of particle physics, the discovery of the neutrino. This case study shows that the distinction between causal and theoretical warrant is crucial for understanding what it means to "directly detect" a new particle.
But the distinction is also an effective tool against what I take to be the presently most powerful objection to scientific realism: Kyle Stanford's argument from unconceived alternatives. I respond to this argument in chapter 6, and I illustrate my response with a discussion of Jean Perrin's experimental work concerning the atomic hypothesis. In the final part of the dissertation, I turn to the specific challenges posed to realism by quantum theories. One of these challenges comes from the experimental violations of Bell's inequalities, which indicate a failure of locality in the quantum domain. I show in chapter 7 how causal realism can further our understanding of quantum non-locality by taking account of some recent experimental results. Another challenge to realism in quantum mechanics comes from delayed-choice experiments, which seem to imply that certain aspects of what happens in an experiment can be influenced by later choices of the experimenter. Chapter 8 analyzes these experiments and argues that they do not warrant the antirealist conclusions which some commentators draw from them. It pays particular attention to the case of delayed-choice entanglement swapping and the corresponding question whether entanglement is a real physical relation. In chapter 9, I finally address relativistic quantum theories. It is often claimed that these theories are incompatible with a particle ontology, and this calls into question causal realism's commitment to localizable and countable entities. I defend the commitments of causal realism against these objections, and I conclude with some remarks connecting the interpretation of quantum field theory to more general metaphysical issues confronting causal realism.
Abstract:
Continuous respiratory exchange measurements were performed on five women and five men for 1 h before and 6 h after the administration of a milkshake (53% carbohydrate, 30% lipid, and 17% protein energy) given either as a single bolus dose or continuously over 3 h using a nasogastric tube. The energy administered corresponded to 2.3 times the postabsorptive resting energy expenditure. Resting energy expenditure, respiratory quotient, plasma glucose, and insulin concentrations increased earlier and more steeply, and plasma free fatty acid levels decreased earlier, with the meal ingested as a single dose than with continuous administration. The magnitude of nutrient-induced thermogenesis was greater (P < 0.01) with the single dose (means +/- SE, 10.0 +/- 0.6%) than with the continuous administration (8.1 +/- 0.5%). The overall (6 h) substrate balances were not significantly different between the two modes of administration. It is concluded that the mode of enteral nutrient administration influences the immediate thermogenic response as well as changes in respiratory quotient, glycemia, and insulinemia; however, the overall nutrient balance was not affected by the mode of administration.
Abstract:
In Quantitative Microbial Risk Assessment, it is vital to understand how lag times of individual cells are distributed over a bacterial population. Such identified distributions can be used to predict the time by which, in a growth-supporting environment, a few pathogenic cells can multiply to a poisoning concentration level. We model the lag time of a single cell, inoculated into a new environment, by the delay of the growth function characterizing the generated subpopulation. We introduce an easy-to-implement procedure, based on the method of moments, to estimate the parameters of the distribution of single cell lag times. The advantage of the method is especially apparent for cases where the initial number of cells is small and random, and the culture is detectable only in the exponential growth phase.
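The paper's estimator works from population growth curves; as a simplified stand-in, the method-of-moments step itself can be illustrated by fitting a gamma distribution (a hypothetical choice of lag-time model, not necessarily the paper's) to directly observed single-cell lag times, matching the sample mean and variance to the gamma's moments:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical ground truth: single-cell lag times ~ Gamma(shape=3, scale=2)
# hours; these values are invented for illustration.
true_shape, true_scale = 3.0, 2.0
lags = rng.gamma(true_shape, true_scale, size=20000)

# Method of moments: for a Gamma(shape, scale) distribution,
#   mean = shape * scale   and   variance = shape * scale**2,
# so matching the sample moments gives closed-form estimators.
m, v = lags.mean(), lags.var()
shape_hat = m * m / v      # shape estimate = mean^2 / variance
scale_hat = v / m          # scale estimate = variance / mean
```

With a large sample the estimates recover the true parameters closely; the appeal the abstract highlights is that moment-matching remains tractable even when lags are only observed indirectly, through the delay of the subpopulation's growth function, and when the initial cell count is small and random.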
Abstract:
Mathematical methods combined with measurements of single-cell dynamics provide a means to reconstruct intracellular processes that are only partly or indirectly accessible experimentally. To obtain reliable reconstructions, the pooling of measurements from several cells of a clonal population is mandatory. However, cell-to-cell variability originating from diverse sources poses computational challenges for such process reconstruction. We introduce a scalable Bayesian inference framework that properly accounts for population heterogeneity. The method allows inference of inaccessible molecular states and kinetic parameters; computation of Bayes factors for model selection; and dissection of intrinsic, extrinsic and technical noise. We show how additional single-cell readouts such as morphological features can be included in the analysis. We use the method to reconstruct the expression dynamics of a gene under an inducible promoter in yeast from time-lapse microscopy data.
Abstract:
AIM: To prospectively study the intraocular pressure (IOP) lowering effect and safety of the new method of very deep sclerectomy with collagen implant (VDSCI) compared with standard deep sclerectomy with collagen implant (DSCI). METHODS: The trial involved 50 eyes of 48 patients with medically uncontrolled primary and secondary open-angle glaucoma, randomized to undergo either the VDSCI procedure (25 eyes) or the DSCI procedure (25 eyes). Follow-up examinations were performed before surgery and after surgery at day 1, week 1, and months 1, 2, 3, 6, 9, 12, 18, and 24. Ultrasound biomicroscopy was performed at 3 and 12 months. RESULTS: Mean follow-up was 18.6+/-5.9 (VDSCI) and 18.9+/-3.6 (DSCI) months (P=NS). Mean preoperative IOP was 22.4+/-7.4 mm Hg for VDSCI and 20.4+/-4.4 mm Hg for DSCI eyes (P=NS). Mean postoperative IOP was 3.9+/-2.3 mm Hg (VDSCI) and 6.3+/-4.3 mm Hg (DSCI) (P<0.05) at day 1, and 12.2+/-3.9 mm Hg (VDSCI) and 13.3+/-3.4 mm Hg (DSCI) (P=NS) at month 24. At the last visit, the complete success rate (defined as an IOP of < or =18 mm Hg and a drop of at least 20%, achieved without medication) was 57% in VDSCI and 62% in DSCI eyes (P=NS). Ultrasound biomicroscopy at 12 months showed a mean volume of 3.9+/-4.2 mm3 (VDSCI) and 6.8+/-7.5 mm3 (DSCI) (P=0.426) for the subconjunctival filtering bleb, and 5.2+/-3.6 mm3 (VDSCI) and 5.4+/-2.9 mm3 (DSCI) (P=0.902) for the intrascleral space. CONCLUSIONS: Very deep sclerectomy appears to provide good, stable control of IOP at 2 years of follow-up, with few postoperative complications, similar to standard deep sclerectomy with collagen implant.
Abstract:
Meta-analysis of genome-wide association studies (GWASs) has led to the discoveries of many common variants associated with complex human diseases. There is a growing recognition that identifying "causal" rare variants also requires large-scale meta-analysis. The fact that association tests with rare variants are performed at the gene level rather than at the variant level poses unprecedented challenges in the meta-analysis. First, different studies may adopt different gene-level tests, so the results are not compatible. Second, gene-level tests require multivariate statistics (i.e., components of the test statistic and their covariance matrix), which are difficult to obtain. To overcome these challenges, we propose to perform gene-level tests for rare variants by combining the results of single-variant analysis (i.e., p values of association tests and effect estimates) from participating studies. This simple strategy is possible because of an insight that multivariate statistics can be recovered from single-variant statistics, together with the correlation matrix of the single-variant test statistics, which can be estimated from one of the participating studies or from a publicly available database. We show both theoretically and numerically that the proposed meta-analysis approach provides accurate control of the type I error and is as powerful as joint analysis of individual participant data. This approach accommodates any disease phenotype and any study design and produces all commonly used gene-level tests. An application to the GWAS summary results of the Genetic Investigation of ANthropometric Traits (GIANT) consortium reveals rare and low-frequency variants associated with human height. The relevant software is freely available.
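The key insight — that a gene-level statistic over correlated variants can be rebuilt from single-variant summary statistics plus their correlation matrix — can be sketched for a simple burden test. The z-scores and correlation matrix below are toy values, not data from the abstract's GIANT application:

```python
import math
import numpy as np

# Toy single-variant summary statistics for one gene (invented values):
# meta-analysis z-scores for three rare variants, plus the correlation
# matrix R of the test statistics, which the abstract notes can be
# estimated from one participating study or a public reference database.
z = np.array([2.0, 1.5, 0.5])
R = np.array([[1.0, 0.2, 0.2],
              [0.2, 1.0, 0.2],
              [0.2, 0.2, 1.0]])
w = np.ones_like(z)        # flat burden weights (any weighting scheme fits here)

# Gene-level burden statistic: under the null, w'z has variance w'Rw,
# so T is standard normal and yields an analytic two-sided p-value
# without access to individual participant data.
T = (w @ z) / math.sqrt(w @ R @ w)
p = math.erfc(abs(T) / math.sqrt(2.0))
```

Ignoring the correlations (taking R as the identity) would overstate the evidence whenever variants are positively correlated, which is why the correlation matrix is the essential extra ingredient beyond the single-variant p values and effect estimates.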