992 results for Eating rate
Abstract:
STUDY QUESTION. Are significant abnormalities in outward (K+) conductance and resting membrane potential (Vm) present in the spermatozoa of patients undertaking IVF and ICSI and if so, what is their functional effect on fertilization success? SUMMARY ANSWER. Negligible outward conductance (≈5% of patients) or an enhanced inward conductance (≈4% of patients), both of which caused depolarization of Vm, were associated with a low rate of fertilization following IVF. WHAT IS KNOWN ALREADY. Sperm-specific potassium channel knockout mice are infertile with defects in sperm function, suggesting that these channels are essential for fertility. These observations suggest that malfunction of K+ channels in human spermatozoa might contribute significantly to the occurrence of subfertility in men. However, remarkably little is known of the nature of K+ channels in human spermatozoa or the incidence and functional consequences of K+ channel defects. STUDY DESIGN, SIZE AND DURATION. Spermatozoa were obtained from healthy volunteer research donors and subfertile IVF and ICSI patients attending a hospital assisted reproductive techniques clinic between May 2013 and December 2015. In total, 40 IVF patients, 41 ICSI patients and 26 normozoospermic donors took part in the study. PARTICIPANTS/MATERIALS, SETTING, METHODS. Samples were examined using electrophysiology (whole-cell patch clamping). Where abnormal electrophysiological characteristics were identified, spermatozoa were further examined for Ca2+ influx induced by progesterone and penetration into viscous media if sufficient sample was available. Full exome sequencing was performed to specifically evaluate potassium calcium-activated channel subfamily M α 1 (KCNMA1), potassium calcium-activated channel subfamily U member 1 (KCNU1) and leucine-rich repeat containing 52 (LRRC52) genes and others associated with K+ signalling. In IVF patients, the electrophysiological findings were compared with fertilization rates to assess the functional significance of any abnormalities. MAIN RESULTS AND THE ROLE OF CHANCE. Patch clamp electrophysiology was used to assess outward (K+) conductance and resting membrane potential (Vm), and signalling/motility assays were used to assess functional characteristics of sperm from IVF and ICSI patient samples. The mean Vm and outward membrane conductance in sperm from IVF and ICSI patients were not significantly different from those of control (donor) sperm prepared under the same conditions, but variation between individuals was significantly greater (P < 0.02), with a large number of outliers (>25%). In particular, in ≈10% of patients (7/81), we observed either a negligible outward conductance (4 patients) or an enhanced inward current (3 patients), both of which caused depolarization of Vm. Analysis of clinical data from the IVF patients showed a significant association of depolarized Vm (≥0 mV) with a low fertilization rate (P = 0.012). Spermatozoa with electrophysiological abnormalities (conductance and Vm) responded normally to progesterone with elevation of [Ca2+]i and penetration of viscous medium, indicating retention of cation channel of sperm (CatSper) channel function. LIMITATIONS, REASONS FOR CAUTION. For practical, technical, ethical and logistical reasons, we could not obtain sufficient additional semen samples from men with conductance abnormalities to establish the cause of the conductance defects. Full exome sequencing was only available in two men with conductance defects. WIDER IMPLICATIONS OF THE FINDINGS. These data add significantly to the understanding of the role of ion channels in human sperm function and their impact on male fertility. Impairment of potassium channel conductance (Gm) and/or Vm regulation in human spermatozoa is both common and complex and, importantly, is associated with impaired fertilization capacity when the Vm of the cells is completely depolarized.
Abstract:
Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.
Abstract:
One- and two-dimensional cellular automata which are known to be fault-tolerant are very complex. On the other hand, only very simple cellular automata have actually been proven to lack fault-tolerance, i.e., to be mixing. The latter either have large noise probability ε or belong to the small family of two-state nearest-neighbor monotonic rules which includes local majority voting. For a certain simple automaton L called the soldiers rule, this problem has intrigued researchers for the last two decades, since L is clearly more robust than local voting: in the absence of noise, L eliminates any finite island of perturbation from an initial configuration of all 0's or all 1's. The same holds for K, a 4-state monotonic variant of L called two-line voting. We will prove that the probabilistic cellular automata Kε and Lε asymptotically lose all information about their initial state when subject to small, strongly biased noise. The mixing property trivially implies that the systems are ergodic. The finite-time information-retaining quality of a mixing system can be represented by its relaxation time Relax(⋅), which measures the time before the onset of significant information loss. This is known to grow as (1/ε)^c for noisy local voting. The impressive error-correction ability of L has prompted some researchers to conjecture that Relax(Lε) = 2^(c/ε). We prove the tight bounds 2^(c₁ log²(1/ε)) < Relax(Lε) < 2^(c₂ log²(1/ε)) for a biased error model. The same holds for Kε. Moreover, the lower bound is independent of the bias assumption. The strong bias assumption makes it possible to apply sparsity/renormalization techniques, the main tools of our investigation, used earlier in the opposite context of proving fault-tolerance.
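As a hedged illustration of the kind of probabilistic cellular automaton discussed above, the sketch below simulates one-dimensional local majority voting (the simple mixing rule mentioned in the abstract, not the soldiers rule L or two-line voting K) under noise biased toward state 0, and reports how much of an all-1's initial configuration survives; all parameter values are illustrative.

```python
# Illustrative sketch: 1-D nearest-neighbour majority voting under biased noise.
# This is the simple "local voting" rule referred to above, not the soldiers
# rule L or two-line voting K; parameters are arbitrary.
import numpy as np

def step(config, eps, rng):
    """One synchronous update: majority of (left, self, right), then noise
    that flips a cell to 0 with probability eps (strongly biased noise)."""
    left = np.roll(config, 1)
    right = np.roll(config, -1)
    majority = ((left + config + right) >= 2).astype(np.int8)
    flip_to_zero = rng.random(config.size) < eps
    return np.where(flip_to_zero, 0, majority)

def surviving_density(n=10_000, eps=0.05, steps=2_000, seed=0):
    """Start from all 1's and return the final density of 1's, a crude proxy
    for how much information about the initial state is retained."""
    rng = np.random.default_rng(seed)
    config = np.ones(n, dtype=np.int8)
    for _ in range(steps):
        config = step(config, eps, rng)
    return config.mean()

if __name__ == "__main__":
    for eps in (0.01, 0.05, 0.2):
        print(f"eps={eps}: final density of 1's = {surviving_density(eps=eps):.3f}")
```

For noisy local voting the relaxation time is known to grow only as (1/ε)^c, as stated above; the paper's result is that the soldiers rule and two-line voting instead relax on the quasi-polynomial 2^(Θ(log²(1/ε))) timescale.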
Abstract:
In this paper we present Statistical Rate Monotonic Scheduling (SRMS), a generalization of the classical RMS results of Liu and Layland that allows scheduling periodic tasks with highly variable execution times and statistical QoS requirements. Similar to RMS, SRMS has two components: a feasibility test and a scheduling algorithm. The feasibility test for SRMS ensures that, using the SRMS scheduling algorithm, it is possible for a given periodic task set to share a given resource (e.g. a processor, communication medium, switching device, etc.) in such a way that the sharing does not result in the violation of any of the periodic tasks' QoS constraints. The SRMS scheduling algorithm incorporates a number of unique features. First, it allows for fixed-priority scheduling that keeps the tasks' value (or importance) independent of their periods. Second, it allows for job admission control, whereby jobs that are not guaranteed to finish by their deadlines are rejected as soon as they are released, enabling the system to take necessary compensating actions. Admission control also preserves resources, since no time is spent on jobs that will miss their deadlines anyway. Third, SRMS integrates reservation-based and best-effort resource scheduling seamlessly. Reservation-based scheduling ensures the delivery of the minimal requested QoS; best-effort scheduling ensures that unused, reserved bandwidth is not wasted but rather used to improve QoS further. Fourth, SRMS allows a system to deal gracefully with overload conditions by ensuring a fair deterioration in QoS across all tasks, as opposed to penalizing tasks with longer periods, for example. Finally, SRMS has the added advantage that its schedulability test is simple and its scheduling algorithm has a constant overhead, in the sense that the complexity of the scheduler does not depend on the number of tasks in the system. We have evaluated SRMS against a number of alternative scheduling algorithms suggested in the literature (e.g. RMS and slack stealing), as well as refinements thereof, which we describe in this paper. Consistently throughout our experiments, SRMS provided the best performance. In addition, to evaluate the optimality of SRMS, we have compared it to an inefficient, yet optimal scheduler for task sets with harmonic periods.
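The classical RMS baseline that SRMS generalizes can be made concrete with a short sketch: rate-monotonic priority assignment (shorter period gets higher priority) plus the Liu and Layland utilization bound as a sufficient feasibility test. This is the deterministic textbook version only, not the SRMS statistical feasibility test or its admission controller; the task set is invented for illustration.

```python
# Sketch of classical RMS (the baseline that SRMS generalizes), not SRMS itself.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    period: float        # also the relative deadline
    wcet: float          # worst-case execution time

def rm_priorities(tasks):
    """Rate-monotonic priority order: shorter period = higher priority."""
    return sorted(tasks, key=lambda t: t.period)

def liu_layland_feasible(tasks):
    """Sufficient (not necessary) schedulability test for n periodic tasks:
    total utilization <= n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(t.wcet / t.period for t in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

if __name__ == "__main__":
    tasks = [Task("audio", 20, 4), Task("video", 40, 10), Task("telemetry", 100, 10)]
    print([t.name for t in rm_priorities(tasks)])
    print("feasible under the Liu-Layland bound:", liu_layland_feasible(tasks))
```

SRMS replaces the single worst-case execution time in this test with a statistical characterization of demand and layers per-job admission decisions on top of the fixed-priority scheduler.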
Abstract:
Quality of Service (QoS) guarantees are required by an increasing number of applications to ensure a minimal level of fidelity in the delivery of application data units through the network. Application-level QoS does not necessarily follow from any transport-level QoS guarantees regarding the delivery of the individual cells (e.g. ATM cells) which comprise the application's data units. The distinction between application-level and transport-level QoS guarantees is due primarily to the fragmentation that occurs when transmitting large application data units (e.g. IP packets, or video frames) using much smaller network cells, whereby the partial delivery of a data unit is useless and the bandwidth spent to partially transmit it is wasted. The data units transmitted by an application may vary in size while being generated at a constant rate, resulting in a variable bit rate (VBR) data flow that requires QoS guarantees. Statistical multiplexing is inadequate here because no guarantees can be made and no firewall property exists between different data flows. In this paper, we present a novel resource management paradigm for the maintenance of application-level QoS for VBR flows. Our paradigm is based on Statistical Rate Monotonic Scheduling (SRMS), in which (1) each application generates its variable-size data units at a fixed rate, (2) the partial delivery of data units is of no value to the application, and (3) the QoS guarantee extended to the application is the probability that an arbitrary data unit will be successfully transmitted through the network to/from the application.
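The gap between cell-level and application-level guarantees can be made concrete with a back-of-the-envelope calculation (an illustrative independence assumption, not the paper's analysis): if a data unit is fragmented into N cells and each cell is lost independently with probability p, then

```latex
P(\text{data unit delivered intact}) = (1 - p)^{N}
```

so even a small cell loss probability translates into a much larger data-unit loss rate (e.g. p = 10⁻³ over N = 40 cells gives roughly a 4% chance of losing the unit), and every cell of a partially delivered unit is wasted bandwidth.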
Abstract:
Statistical Rate Monotonic Scheduling (SRMS) is a generalization of the classical RMS results of Liu and Layland [LL73] for periodic tasks with highly variable execution times and statistical QoS requirements. The main tenet of SRMS is that the variability in task resource requirements can be smoothed through aggregation to yield guaranteed QoS. This aggregation is done over time for a given task and across multiple tasks for a given period of time. Similar to RMS, SRMS has two components: a feasibility test and a scheduling algorithm. The SRMS feasibility test ensures that it is possible for a given periodic task set to share a given resource without violating any of the statistical QoS constraints imposed on each task in the set. The SRMS scheduling algorithm consists of two parts: a job admission controller and a scheduler. The SRMS scheduler is a simple, preemptive, fixed-priority scheduler. The SRMS job admission controller manages the QoS delivered to the various tasks through admit/reject and priority assignment decisions. In particular, it ensures the important property of task isolation, whereby tasks do not infringe on each other. In this paper we present the design and implementation of SRMS within the KURT Linux Operating System [HSPN98, SPH 98, Sri98]. KURT Linux supports conventional tasks as well as real-time tasks. It provides a mechanism for transitioning from normal Linux scheduling to a mixed scheduling of conventional and real-time tasks, and to a focused mode where only real-time tasks are scheduled. We describe the technical issues that we had to overcome in order to integrate SRMS into KURT Linux and present the API we have developed for scheduling periodic real-time tasks using SRMS.
Abstract:
Recent research has exposed new breeds of attacks that are capable of denying service or inflicting significant damage on TCP flows without the attacker having to sustain the attack traffic. Such attacks are often referred to as "low-rate" attacks, and they stand in sharp contrast to traditional Denial of Service (DoS) attacks, which can completely shut off TCP flows by flooding an Internet link. In this paper, we study the impact of these new breeds of attacks and the extent to which defense mechanisms are capable of mitigating their impact. By adopting a simple discrete-time model with a single TCP flow and a nonoblivious adversary, we were able to expose new variants of these low-rate attacks that could potentially have high attack potency per attack burst. Our analysis is focused on worst-case scenarios; thus, our results should be regarded as upper bounds on the impact of low-rate attacks rather than a real assessment under a specific attack scenario.
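As a hedged illustration of the kind of discrete-time reasoning involved (a deliberately simplified single-flow AIMD model with arbitrary parameters, not the paper's model or its attack variants), the sketch below shows how an adversary that is active in only a small fraction of rounds can still depress cumulative TCP throughput:

```python
# Toy discrete-time model of a low-rate attack on a single AIMD (TCP-like) flow.
# Simplified illustration with made-up parameters, not the paper's model.
def simulate(rounds=1_000, attack_period=0, cwnd_max=64):
    """Return cumulative segments delivered. If attack_period > 0, the
    adversary fires a burst every attack_period rounds, inducing a loss."""
    cwnd = 1.0
    delivered = 0.0
    for t in range(rounds):
        if attack_period and t % attack_period == 0:
            cwnd = max(1.0, cwnd / 2)       # burst fills the queue -> multiplicative decrease
        else:
            cwnd = min(cwnd_max, cwnd + 1)  # additive increase of one segment per round
        delivered += cwnd
    return delivered

baseline = simulate()
attacked = simulate(attack_period=20)       # adversary active in 1 of every 20 rounds
print(f"fraction of baseline throughput retained under attack: {attacked / baseline:.1%}")
```

Even though the attacker transmits in only one round out of twenty in this sketch, the repeated multiplicative decreases keep the congestion window far below its unattacked level, which is the qualitative "high potency per attack burst" effect bounded in the analysis above.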
Abstract:
Speech can be understood at widely varying production rates. A working memory is described for the short-term storage of temporal lists of input items. The working memory is a cooperative-competitive neural network that automatically adjusts its integration rate, or gain, to generate a short-term memory code for a list that is independent of the item presentation rate. Such an invariant working memory model is used to simulate the data of Repp (1980) concerning the changes of phonetic category boundaries as a function of their presentation rate. Thus the variability of categorical boundaries can be traced to the temporal invariance of the working memory code.
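A minimal sketch of the rate-invariance idea (a toy leaky integrator with rate-scaled gain and decay, not the cooperative-competitive network of the model itself; all constants are illustrative): when both the input gain and the decay are expressed per item rather than per second, the stored list pattern comes out essentially the same at fast and slow presentation rates.

```python
# Toy illustration of a rate-invariant short-term memory code. Hypothetical
# leaky-integrator stand-in, not the paper's cooperative-competitive network.
import numpy as np

def store_list(n_items, item_duration, dt=0.001, leak_per_item=0.3):
    """Leaky integration of a list. Both the input gain and the leak are
    expressed per item (i.e., scaled with the presentation rate), so the
    stored gradient over items does not depend on how fast items arrive."""
    gain = 1.0 / item_duration            # per-second gain ~ presentation rate
    leak = leak_per_item / item_duration  # per-second leak ~ presentation rate
    memory = np.zeros(n_items)
    for i in range(n_items):
        for _ in range(int(round(item_duration / dt))):
            memory *= (1.0 - leak * dt)   # earlier items decay while later ones arrive
            memory[i] += gain * dt        # current item accumulates
    return memory / memory.sum()          # crude stand-in for competitive normalization

fast = store_list(5, item_duration=0.10)  # ~10 items per second
slow = store_list(5, item_duration=0.40)  # ~2.5 items per second
print(np.round(fast, 3), np.round(slow, 3))  # nearly identical stored gradients
```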
Abstract:
Background: With cesarean section rates increasing worldwide, clarity regarding negative effects is essential. This study aimed to investigate the rate of subsequent stillbirth, miscarriage, and ectopic pregnancy following primary cesarean section, controlling for confounding by indication. Methods and Findings: We performed a population-based cohort study using Danish national registry data linking various registers. The cohort included primiparous women with a live birth between January 1, 1982, and December 31, 2010 (n = 832,996), with follow-up until the next event (stillbirth, miscarriage, or ectopic pregnancy) or censoring by live birth, death, emigration, or study end. Cox regression models for all types of cesarean sections, sub-group analyses by type of cesarean, and competing risks analyses for the causes of stillbirth were performed. An increased rate of stillbirth (hazard ratio [HR] 1.14, 95% CI 1.01, 1.28) was found in women with primary cesarean section compared to spontaneous vaginal delivery, giving a theoretical absolute risk increase (ARI) of 0.03% for stillbirth, and a number needed to harm (NNH) of 3,333 women. Analyses by type of cesarean section showed similarly increased rates for emergency (HR 1.15, 95% CI 1.01, 1.31) and elective cesarean (HR 1.11, 95% CI 0.91, 1.35), although not statistically significant in the latter case. An increased rate of ectopic pregnancy was found among women with primary cesarean overall (HR 1.09, 95% CI 1.04, 1.15) and by type (emergency cesarean, HR 1.09, 95% CI 1.03, 1.15, and elective cesarean, HR 1.12, 95% CI 1.03, 1.21), yielding an ARI of 0.1% and a NNH of 1,000 women for ectopic pregnancy. No increased rate of miscarriage was found among women with primary cesarean, with maternally requested cesarean section associated with a decreased rate of miscarriage (HR 0.72, 95% CI 0.60, 0.85). Limitations include incomplete data on maternal body mass index, maternal smoking, fertility treatment, causes of stillbirth, and maternally requested cesarean section, as well as lack of data on antepartum/intrapartum stillbirth and gestational age for stillbirth and miscarriage. Conclusions: This study found that cesarean section is associated with a small increased rate of subsequent stillbirth and ectopic pregnancy. Underlying medical conditions, however, and confounding by indication for the primary cesarean delivery account for at least part of this increased rate. These findings will assist women and health-care providers to reach more informed decisions regarding mode of delivery.
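The number-needed-to-harm figures quoted above follow directly from the absolute risk increases; as a worked check using only the numbers reported in the abstract:

```latex
\mathrm{NNH} = \frac{1}{\mathrm{ARI}}, \qquad
\mathrm{NNH}_{\text{stillbirth}} = \frac{1}{0.0003} \approx 3{,}333, \qquad
\mathrm{NNH}_{\text{ectopic}} = \frac{1}{0.001} = 1{,}000.
```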
Abstract:
Assuming that daily spot exchange rates follow a martingale process, we derive the implied time series process for the vector of 30-day forward rate forecast errors when using weekly data. The conditional second moment matrix of this vector is modelled as a multivariate generalized ARCH process. The estimated model is used to test the hypothesis that the risk premium is a linear function of the conditional variances and covariances, as suggested by the standard asset pricing theory literature. Little support is found for this theory; instead, lagged changes in the forward rate appear to be correlated with the 'risk premium'.
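The abstract does not spell out the estimated specification, so the following is only one conventional way to write such a model: a diagonal multivariate GARCH(1,1) for the forecast errors, with the risk premium expressed as a linear function of the conditional second moments (the hypothesis being tested).

```latex
e_{t} = s_{t+k} - f_{t}, \qquad
e_{t} \mid \mathcal{F}_{t} \sim N(0, H_{t}), \qquad
\operatorname{vech}(H_{t}) = C + A\,\operatorname{vech}(e_{t-1} e_{t-1}') + B\,\operatorname{vech}(H_{t-1}),
\qquad
f_{t} - E_{t}[s_{t+k}] = \delta_{0} + \delta_{1}' \operatorname{vech}(H_{t}),
```

where k denotes the 30-day forward horizon. The finding reported above is that the δ₁ terms receive little support, while lagged changes in the forward rate do correlate with the premium.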
Abstract:
While cochlear implants (CIs) usually provide high levels of speech recognition in quiet, speech recognition in noise remains challenging. To overcome these difficulties, it is important to understand how implanted listeners separate a target signal from interferers. Stream segregation has been studied extensively in both normal and electric hearing, as a function of place of stimulation. However, the effects of pulse rate, independent of place, on the perceptual grouping of sequential sounds in electric hearing have not yet been investigated. A rhythm detection task was used to measure stream segregation. The results of this study suggest that while CI listeners can segregate streams based on differences in pulse rate alone, the amount of stream segregation observed decreases as the base pulse rate increases. Further investigation of the perceptual dimensions encoded by the pulse rate and the effect of sequential presentation of different stimulation rates on perception could be beneficial for the future development of speech processing strategies for CIs.
Abstract:
Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This coding format discrepancy therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate or level-of-activity code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.
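One simple form such a read-out could take (a hypothetical activity-weighted centroid decoder, shown only to illustrate how both the site and the level of activity can enter the estimate; it is not necessarily the algorithm used in the study):

```python
# Illustrative population read-out in which both the site and the level of
# activity contribute to the location estimate. Hypothetical decoder only.
import numpy as np

def decode_location(preferred_azimuths_deg, firing_rates_hz):
    """Activity-weighted centroid over the recorded population."""
    r = np.asarray(firing_rates_hz, dtype=float)
    x = np.asarray(preferred_azimuths_deg, dtype=float)
    return float((r * x).sum() / r.sum())

# A map-like (circumscribed) profile and a rate-like (monotonic, open-ended)
# profile can both be evaluated by the same rule:
sites = np.array([-40, -20, 0, 20, 40])      # preferred azimuths (deg)
visual_like = np.array([2, 5, 30, 5, 2])     # peaked receptive field
auditory_like = np.array([1, 3, 8, 15, 25])  # level-of-activity code, peaking peripherally
print(decode_location(sites, visual_like))     # centred near 0 deg
print(decode_location(sites, auditory_like))   # pulled toward the periphery
```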
Abstract:
BACKGROUND: Primary care providers' suboptimal recognition of the severity of chronic kidney disease (CKD) may contribute to untimely referrals of patients with CKD to subspecialty care. It is unknown whether U.S. primary care physicians' use of estimated glomerular filtration rate (eGFR) rather than serum creatinine to estimate CKD severity could improve the timeliness of their subspecialty referral decisions. METHODS: We conducted a cross-sectional study of 154 United States primary care physicians to assess the effect of use of eGFR (versus creatinine) on the timing of their subspecialty referrals. Primary care physicians completed a questionnaire featuring questions regarding a hypothetical White or African American patient with progressing CKD. We asked primary care physicians to identify the serum creatinine and eGFR levels at which they would recommend patients like the hypothetical patient be referred for subspecialty evaluation. We assessed significant improvement in the timing (from eGFR < 30 to ≥ 30 mL/min/1.73 m²) of their recommended referrals based on their use of creatinine versus eGFR. RESULTS: Primary care physicians recommended subspecialty referrals later (CKD more advanced) when using creatinine versus eGFR to assess kidney function (median eGFR 32 versus 55 mL/min/1.73 m², p < 0.001). Forty percent of primary care physicians significantly improved the timing of their referrals when basing their recommendations on eGFR. Improved timing occurred more frequently among primary care physicians practicing in academic (versus non-academic) practices or presented with White (versus African American) hypothetical patients [adjusted percentage (95% CI): 70% (45-87) versus 37% (reference) and 57% (39-73) versus 25% (reference), respectively; both p ≤ 0.01]. CONCLUSIONS: Primary care physicians recommended subspecialty referrals earlier when using eGFR (versus creatinine) to assess kidney function. Enhanced use of eGFR by primary care physicians could lead to more timely subspecialty care and improved clinical outcomes for patients with CKD.
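The practical difference between reading a raw serum creatinine and an eGFR comes down to an estimating equation. The sketch below uses the four-variable (IDMS-traceable) MDRD study equation, one formula commonly reported by laboratories; the abstract does not state which equation was assumed, so this is purely illustrative.

```python
# Sketch of eGFR estimation with the 4-variable (IDMS-traceable) MDRD equation.
# Shown only to illustrate eGFR versus raw creatinine; the study abstract does
# not state which estimating equation was used.
def egfr_mdrd(serum_creatinine_mg_dl, age_years, female, black):
    egfr = 175.0 * (serum_creatinine_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr  # mL/min/1.73 m^2

# The same serum creatinine of 1.4 mg/dL maps to quite different eGFR values:
print(round(egfr_mdrd(1.4, 35, female=False, black=False)))  # younger man
print(round(egfr_mdrd(1.4, 78, female=True, black=False)))   # older woman
```

This is why a single creatinine cut-off can hide advanced CKD in some patients while an eGFR threshold, which folds in age, sex, and (in this older equation) race, flags it directly.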
Abstract:
At a workshop held at Resources for the Future in September 2011, twelve of the authors were asked by the US Environmental Protection Agency (EPA) to provide advice on the principles to be used in discounting the benefits and costs of projects that affect future generations. Maureen L. Cropper chaired the workshop. Much of the discussion in this article is based on the authors' recommendations and advice presented at the workshop.
Abstract:
In virtual reality applications, the aim is to provide real-time graphics that run at high refresh rates. However, there are many situations in which this is not possible due to simulation or rendering issues. When running at low frame rates, several aspects of the user experience are affected. For example, each frame is displayed for an extended period of time, causing a high-persistence image artifact. The effect of this artifact is that movement may lose continuity and the image jumps from one frame to another. In this paper, we discuss our initial exploration of the effects of high-persistence frames caused by low refresh rates and compare this condition to high frame rates and to a technique we developed to mitigate the effects of low frame rates. In this technique, the low frame rate simulation images are displayed with low persistence by blanking out the display during the extra time such an image would otherwise be displayed. In order to isolate the visual effects, we constructed a simulator for low- and high-persistence displays that does not affect input latency. A controlled user study comparing the three conditions for the tasks of 3D selection and navigation was conducted. Results indicate that the low-persistence display technique may not negatively impact user experience or performance as compared to the high-persistence case. Directions for future work on the use of low-persistence displays for low frame rate situations are discussed.
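A small back-of-the-envelope sketch of the display timing involved (illustrative numbers only; the study's simulator details are not reproduced here): at a low simulation frame rate each image normally persists across several display refreshes, while the low-persistence technique shows it for a single refresh and blanks the remainder.

```python
# Illustrative persistence arithmetic for a low frame rate on a fast display.
# Example numbers, not those of the study's simulator.
def persistence_ms(sim_fps, display_hz, low_persistence):
    refresh_ms = 1000.0 / display_hz
    refreshes_per_frame = display_hz / sim_fps       # refreshes showing the same image
    if low_persistence:
        return refresh_ms                            # show once, blank the remaining refreshes
    return refreshes_per_frame * refresh_ms          # image held for the whole frame time

for mode in (False, True):
    print(f"low_persistence={mode}: each image visible for "
          f"{persistence_ms(sim_fps=15, display_hz=60, low_persistence=mode):.1f} ms")
```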