844 results for Optimal time delay


Relevance: 30.00%

Abstract:

The objective of this paper is to propose a protocol for analyzing blood samples from yellow fever 17DD vaccinees who developed serious adverse events. We investigated whether the time between sample collection and sample processing could affect lymphocyte subset percentages, since it is often impossible to analyze blood samples immediately after collection due to the transport delay from collection sites to the flow cytometry facility. CD4+CD38+ T, CD8+CD38+ T, CD3+ T and CD19+ B lymphocyte subsets were analyzed by flow cytometry in nine healthy volunteers immediately after blood collection and after intervals of 24 and 48 h. The whole blood lysis method and gradient sedimentation by Histopaque were applied to isolate peripheral blood mononuclear cells for the flow cytometry analyses. With the lysis method, there was no significant change in lymphocyte subset percentages between the two time intervals (24 and 48 h). In contrast, when blood samples were processed by Histopaque gradient sedimentation, the sample-processing interval influenced the percentages of T lymphocyte subsets but not of B cells. From these results, we conclude that the whole blood lysis method is more appropriate than gradient sedimentation by Histopaque for immunophenotyping blood samples collected after serious adverse events, because lymphocyte subset levels vary less with time.

Relevance: 30.00%

Abstract:

Introduction: Coronary magnetic resonance angiography (MRA) is a medical imaging technique that involves collecting data from consecutive heartbeats, always at the same time in the cardiac cycle, in order to minimize heart motion artifacts. The technique relies on the assumption that the coronary arteries follow the same trajectory from heartbeat to heartbeat. Until now, the acquisition window in the cardiac cycle has been chosen exclusively on the basis of the position of minimal coronary motion. The goal of this study was to test the hypothesis that there are time intervals during the cardiac cycle when coronary beat-to-beat repositioning is optimal. The repositioning uncertainty values in these intervals were then compared with the intervals of low coronary motion in order to propose an optimal acquisition window for coronary MRA. Methods: Cine breath-hold x-ray angiograms with synchronous ECG were collected from 11 patients who underwent elective routine diagnostic coronarography. Twenty-three bifurcations of the left coronary artery were selected as markers to evaluate repositioning uncertainty and velocity during the cardiac cycle. Each bifurcation was tracked by two observers with the help of a user-assisted algorithm implemented in Matlab (The MathWorks, Natick, MA, USA) that compared the marker trajectories from consecutive heartbeats and computed the coronary repositioning uncertainty in steps of 50 ms, up to 650 ms after the R-wave. Repositioning uncertainty was defined as the diameter of the smallest circle encompassing the points to be compared at the same time after the R-wave. Student's t-tests with a false discovery rate (FDR, q=0.1) correction for multiple comparisons were applied to determine whether coronary repositioning and velocity vary significantly during the cardiac cycle. Bland-Altman plots and linear regression were used to assess intra- and inter-observer agreement. Results: The analysis of left coronary artery beat-to-beat repositioning uncertainty shows a tendency toward better repositioning in mid systole (less than 0.84±0.58 mm) and mid diastole (less than 0.89±0.6 mm) than in the rest of the cardiac cycle (highest value at 50 ms: 1.35±0.64 mm). According to Student's t-tests with FDR correction for multiple comparisons (q=0.1), two intervals, in mid systole (150-200 ms) and mid diastole (550-600 ms), provide statistically better repositioning than early systole and early diastole. Coronary velocity analysis reveals that the left coronary artery moves more slowly in end systole (14.35±11.35 mm/s at 225 ms) and mid diastole (11.78±11.62 mm/s at 625 ms) than in the rest of the cardiac cycle (highest value at 25 ms: 55.96±22.34 mm/s). This was confirmed by Student's t-tests with FDR correction for multiple comparisons (q=0.1, FDR-corrected p-value=0.054): the coronary velocity values at 225, 575 and 625 ms do not differ much from one another but are statistically lower than all the others. Bland-Altman plots and linear regression show that intra-observer agreement (y=0.97x+0.02 with R²=0.93 at 150 ms) is better than inter-observer agreement (y=0.8x+0.11 with R²=0.67 at 150 ms). Discussion: The present study demonstrates that there are two time intervals in the cardiac cycle, one in mid systole and one in mid diastole, where left coronary artery repositioning uncertainty reaches local minima. Velocity was also found to be lowest in end systole and mid diastole. Since systole is less influenced by heart rate variability than diastole, we propose testing an acquisition window between 150 and 200 ms after the R-wave.
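
As a hedged illustration of the repositioning-uncertainty metric defined above (the diameter of the smallest circle enclosing the marker positions observed at the same delay after the R-wave across heartbeats), the Python sketch below computes it by brute force over candidate circles. It is not the authors' user-assisted Matlab tool, and the marker coordinates are hypothetical.

```python
import itertools
import math

def circle_from_two(p, q):
    # Circle having the segment pq as diameter.
    center = ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
    return center, math.dist(p, q) / 2.0

def circle_from_three(p, q, r):
    # Circumscribed circle of the triangle pqr (None if the points are collinear).
    ax, ay = p; bx, by = q; cx, cy = r
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    center = (ux, uy)
    return center, math.dist(center, p)

def contains_all(center, radius, points, eps=1e-9):
    return all(math.dist(center, pt) <= radius + eps for pt in points)

def repositioning_uncertainty(points):
    """Diameter of the smallest circle enclosing all marker positions.
    Brute force over pairs and triples; fine for a handful of heartbeats."""
    best = None
    for a, b in itertools.combinations(points, 2):
        c, r = circle_from_two(a, b)
        if contains_all(c, r, points) and (best is None or r < best):
            best = r
    for a, b, c3 in itertools.combinations(points, 3):
        res = circle_from_three(a, b, c3)
        if res is not None:
            c, r = res
            if contains_all(c, r, points) and (best is None or r < best):
                best = r
    return 2.0 * best  # diameter, in the same units as the input (e.g. mm)

# Hypothetical positions (mm) of one bifurcation at 150 ms after the R-wave,
# one point per heartbeat.
positions = [(12.1, 8.4), (12.4, 8.9), (11.9, 8.6), (12.3, 8.2)]
print(repositioning_uncertainty(positions))
```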

Relevance: 30.00%

Abstract:

Most network operators have considered reducing Label Switched Router (LSR) label spaces (i.e., the number of labels that can be used) as a means of simplifying the management of the underlying Virtual Private Networks (VPNs) and, hence, reducing operational expenditure (OPEX). This letter discusses the problem of reducing the label space in Multiprotocol Label Switching (MPLS) networks using label merging, better known as MultiPoint-to-Point (MP2P) connections. Because of their origins in IP, MP2P connections have been considered to have tree shapes with Label Switched Paths (LSPs) as branches. For this reason, previous works by many authors state that the problem of minimizing the label space using MP2P in MPLS - the Merging Problem - cannot be solved optimally with a polynomial algorithm (NP-complete), since it involves a hard decision problem. In this letter, however, the Merging Problem is analyzed from the perspective of MPLS, and it is deduced that tree shapes in MP2P connections are irrelevant. By setting aside this tree-shape consideration, label merging can be performed in polynomial time. Based on how MPLS signaling works, this letter proposes an algorithm to compute the minimum number of labels using label merging: the Full Label Merging algorithm. In conclusion, we reclassify the Merging Problem as polynomial-solvable instead of NP-complete. In addition, simulation experiments confirm that, without the tree-branch selection problem, more labels can be saved.
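
The Full Label Merging algorithm itself is not given in the abstract, so the sketch below only illustrates the underlying MP2P idea in Python: at a given LSR, LSPs that leave toward the same egress over the same next hop can share one outgoing label, independently of any tree shape. The LSP list and field names are hypothetical.

```python
from collections import defaultdict

# Hypothetical LSPs traversing one LSR, described as (ingress, egress, next_hop).
# Without merging, each LSP would consume its own outgoing label at this node.
lsps = [
    ("A", "F", "R3"),
    ("B", "F", "R3"),
    ("C", "F", "R3"),
    ("A", "G", "R4"),
    ("D", "G", "R4"),
]

def merged_label_count(lsps):
    """MP2P merging at a single LSR: all LSPs headed to the same egress via
    the same next hop are swapped onto one shared outgoing label."""
    groups = defaultdict(list)
    for ingress, egress, next_hop in lsps:
        groups[(egress, next_hop)].append(ingress)
    return len(groups), dict(groups)

labels_with_merging, groups = merged_label_count(lsps)
print(f"outgoing labels without merging: {len(lsps)}")
print(f"outgoing labels with MP2P merging: {labels_with_merging}")
print(groups)
```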

Relevance: 30.00%

Abstract:

Background: Coronary microvascular dysfunction (CMD) is associated with cardiovascular events in type 2 diabetes mellitus (T2DM). Optimal glycaemic control does not always preclude future events. We sought to assess the effect of the current HbA1c target on coronary microcirculatory function and to identify predictive factors for CMD in T2DM patients. Methods: We studied 100 patients with T2DM and 214 patients without T2DM, all with a history of chest pain, non-obstructive angiograms and a direct assessment of the coronary blood flow increase in response to adenosine and acetylcholine coronary infusion, for evaluation of endothelium-independent and endothelium-dependent CMD. Patients with T2DM were categorized as having optimal (HbA1c < 7%) vs. suboptimal (HbA1c ≥ 7%) glycaemic control at the time of catheterization. Results: Baseline characteristics and coronary endothelial function parameters differed significantly between T2DM patients and the control group. The prevalence of endothelium-independent CMD (29.8 vs. 39.6%, p = 0.40) and endothelium-dependent CMD (61.7 vs. 62.2%, p = 1.00) was similar in patients with optimal vs. suboptimal glycaemic control. Age (OR 1.10; 95% CI 1.04-1.18; p < 0.001) and female gender (OR 3.87; 95% CI 1.45-11.4; p < 0.01) were significantly associated with endothelium-independent CMD, whereas glomerular filtration rate (OR 0.97; 95% CI 0.95-0.99; p < 0.05) was significantly associated with endothelium-dependent CMD. Optimal glycaemic control was not associated with endothelium-independent (OR 0.60, 95% CI 0.23-1.46; p = 0.26) or endothelium-dependent CMD (OR 0.99, 95% CI 0.43-2.24; p = 0.98). Conclusions: The current HbA1c target does not predict better coronary microcirculatory function in T2DM patients. The appropriate strategy for prevention of CMD in T2DM patients remains to be addressed. Keywords: Endothelial dysfunction; Diabetes mellitus; Coronary microcirculation
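
The odds ratios and confidence intervals above presumably come from a logistic regression model, although the abstract does not say so explicitly. The Python sketch below shows, on synthetic stand-in data, how such estimates are typically obtained with the statsmodels package; the variable names, effect sizes and data-generating model are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in data: the real study used 100 T2DM patients with
# age, sex, eGFR, an HbA1c group and a binary CMD outcome.
rng = np.random.default_rng(0)
n = 100
age = rng.normal(60, 10, n)
female = rng.integers(0, 2, n)
egfr = rng.normal(80, 20, n)
optimal_hba1c = rng.integers(0, 2, n)
# Hypothetical outcome model, used only to generate example data.
logit = -6 + 0.09 * age + 1.2 * female
cmd = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([age, female, egfr, optimal_hba1c]))
fit = sm.Logit(cmd, X).fit(disp=False)

# Odds ratios and 95% confidence intervals, as reported in the abstract.
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
names = ["intercept", "age", "female", "eGFR", "optimal HbA1c"]
for name, or_, (lo, hi) in zip(names, odds_ratios, conf_int):
    print(f"{name}: OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```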

Relevance: 30.00%

Abstract:

The age-dependent choice between expressing individual learning (IL) or social learning (SL) affects cumulative cultural evolution. A learning schedule in which SL precedes IL is supportive of cumulative culture because the amount of nongenetically encoded adaptive information acquired by previous generations can be absorbed by an individual and augmented. Devoting time and energy to learning, however, reduces the resources available for other life-history components. Learning schedules and life history thus coevolve. Here, we analyze a model where individuals may have up to three distinct life stages: "infants" using IL or oblique SL, "juveniles" implementing IL or horizontal SL, and adults obtaining material resources with learned information. We study the dynamic allocation of IL and SL within life stages and how this coevolves with the length of the learning stages. Although no learning may be evolutionarily stable, we find conditions where cumulative cultural evolution can be selected for. In that case, the evolutionarily stable learning schedule causes individuals to use oblique SL during infancy and a mixture of IL and horizontal SL when juvenile. We also find that the selected pattern of oblique SL increases the amount of information in the population, but horizontal SL does not do so.

Relevance: 30.00%

Abstract:

INTRODUCTION. Patients admitted to the Intensive Care Unit (ICU) from general wards are more severely ill and have a higher mortality than those admitted from the emergency department, as reported [1]. The majority of them develop signs of instability (e.g. tachypnea, tachycardia, hypotension, decreased oxygen saturation and change in conscious state) several hours before ICU admission. Considering this fact, and that in-hospital cardiac arrests and unexpected deaths are usually preceded by warning signs, immediate on-site intervention by specialists may be effective. This gave an impulse to medical emergency team (MET) implementation, which has been shown to decrease cardiac arrest, morbidity and mortality in several hospitals. OBJECTIVES AND METHODS. In order to verify whether the same was true in our hospital, and to determine whether there was a need for a MET, we prospectively collected all non-elective ICU admissions of already hospitalized patients (general wards) and of patients remaining more than 3 h in the emergency department (considered hospitalized). The instability criteria leading to a MET call corresponded to those described in the literature. The delay between the development of one criterion and ICU admission was registered. RESULTS. During an observation period of 12 months, 321 patients meeting our MET criteria were admitted to the ICU. 88 patients came from the emergency department, 115 from the surgical and 113 from the medical ward. 65% were male. The median age was 65 years (range 17-89). The delay from MET criteria development to ICU admission was higher than 8 h in 155 patients, with a median delay of 32 h and a range of 8.4 h to 10 days. For the remaining 166 patients, an early MET criterion was present up to 8 h (median delay 3 h) before ICU admission. These results are quite concordant with the data reported in the literature (refs 1-8). 122 patients presented signs of sepsis or septic shock, 70 patients respiratory failure, and 58 patients a cardiac emergency. Cardiac arrests represented 5% of our patient collective. CONCLUSIONS. Similar to other observations, the majority of hospitalized patients admitted on an emergency basis to our ICU had warning signs lasting for several hours. More than half of them were unstable for more than 8 h. This shows there is ample time for early acute management by a dedicated and specialized team such as a MET. However, further studies are required to determine whether MET implementation can reduce in-hospital cardiac arrests and influence morbidity, length of stay and mortality.

Relevance: 30.00%

Abstract:

In a thermally fluctuating long linear polymeric chain in a solution, the ends, from time to time, approach each other. At such an instance, the chain can be regarded as closed and thus will form a knot or rather a virtual knot. Several earlier studies of random knotting demonstrated that simpler knots show a higher occurrence for shorter random walks than do more complex knots. However, up to now there have been no rules that could be used to predict the optimal length of a random walk, i.e. the length for which a given knot reaches its highest occurrence. Using numerical simulations, we show here that a power law accurately describes the relation between the optimal lengths of random walks leading to the formation of different knots and the previously characterized lengths of ideal knots of a corresponding type.
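
The power law referred to above, relating the optimal random-walk length to the length of the corresponding ideal knot, can be fitted by linear regression in log-log space. The short Python sketch below shows the procedure on placeholder numbers; the data points are not the values reported in the paper.

```python
import numpy as np

# Placeholder data: length of the ideal knot (x) vs. optimal random-walk
# length (y) for a few knot types; not the values from the paper.
ideal_length = np.array([16.3, 24.7, 28.3, 31.2, 35.0])
optimal_walk_length = np.array([120.0, 260.0, 340.0, 420.0, 540.0])

# Fit y = a * x**b by least squares on the log-transformed data.
slope, intercept = np.polyfit(np.log(ideal_length),
                              np.log(optimal_walk_length), 1)
a, b = np.exp(intercept), slope
print(f"fitted power law: L_opt ~ {a:.2f} * L_ideal^{b:.2f}")
```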

Relevance: 30.00%

Abstract:

Devices for venous cannulation have seen significant progress over time: the original rigid steel cannulas have evolved toward flexible plastic cannulas with wire support that prevents kinking, very thin-walled wire-wound cannulas allowing percutaneous application, and all sorts of combinations. In contrast to all these rectilinear venous cannula designs, which present the same cross-sectional area over their entire intravascular path, the smartcanula concept of "collapsed insertion and expansion in situ" is the logical next step for venous access. Automatic adjustment of the cross-sectional area, up to a pre-determined diameter or the vessel lumen, provides optimal flow and ease of use for both insertion and removal. Smartcanula performance was assessed in a small series of patients (76 +/- 17 kg) undergoing redo procedures. The calculated target pump flow (2.4 L/min/m2) was 4.42 +/- 0.61 L/min. Mean pump flow achieved during cardiopulmonary bypass was 4.84 +/- 0.87 L/min, or 110% of the target. Reduced atrial chatter, kink resistance in situ, and improved blood drainage despite a smaller access orifice are the most striking advantages of this new device. The benefits of smart cannulation are obvious in remote cannulation for limited-access cardiac surgery, but there are many other cannula applications where space is an issue, and that is where smart cannulation is most effective.

Relevance: 30.00%

Abstract:

We investigate the problem of finding minimum-distortion policies for streaming delay-sensitive but distortion-tolerant data. We consider cross-layer approaches which exploit the coupling between presentation and transport layers. We make the natural assumption that the distortion function is convex and decreasing. We focus on a single source-destination pair and analytically find the optimum transmission policy when the transmission is done over an error-free channel. This optimum policy turns out to be independent of the exact form of the convex and decreasing distortion function. Then, for a packet-erasure channel, we analytically find the optimum open-loop transmission policy, which is also independent of the form of the convex distortion function. We then find computationally efficient closed-loop heuristic policies and show, through numerical evaluation, that they outperform the open-loop policy and have near optimal performance.
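
The optimal and heuristic policies derived in the paper are not specified in the abstract. As a hedged illustration of the problem setting only, the Python sketch below compares a naive open-loop repetition policy with a simple feedback-based retransmission policy over a packet-erasure channel, using an arbitrary convex and decreasing distortion function; all parameters are assumptions, not the paper's policies.

```python
import random

K = 10          # packets in the media unit
T = 30          # transmission slots before the playout deadline
P_ERASE = 0.3   # packet-erasure probability
TRIALS = 20000

def distortion(delivered):
    # Convex and decreasing in the number of delivered packets (illustrative).
    return (K - delivered) ** 2

def open_loop_trial():
    # No feedback: cycle through the packets, sending each T // K times.
    delivered = set()
    for slot in range(T):
        if random.random() > P_ERASE:
            delivered.add(slot % K)
    return distortion(len(delivered))

def closed_loop_trial():
    # Per-slot ACKs: keep retransmitting the lowest-index unacknowledged packet.
    delivered = set()
    for _ in range(T):
        pending = [k for k in range(K) if k not in delivered]
        if not pending:
            break
        if random.random() > P_ERASE:
            delivered.add(pending[0])
    return distortion(len(delivered))

open_avg = sum(open_loop_trial() for _ in range(TRIALS)) / TRIALS
closed_avg = sum(closed_loop_trial() for _ in range(TRIALS)) / TRIALS
print(f"open-loop average distortion:   {open_avg:.2f}")
print(f"closed-loop average distortion: {closed_avg:.2f}")
```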

Relevance: 30.00%

Abstract:

In this paper, we introduce a pilot-aided multipath channel estimator for Multiple-Input Multiple-Output (MIMO) Orthogonal Frequency Division Multiplexing (OFDM) systems. Typical estimation algorithms assume the number of multipath components and delays to be known and constant, while their amplitudes may vary in time. In this work, we focus on the more realistic assumption that the number of channel taps is also unknown and time-varying. The estimation problem arising from this assumption is solved using Random Set Theory (RST), which is a probability theory of finite sets. Due to the lack of a closed form of the optimal filter, a Rao-Blackwellized Particle Filter (RBPF) implementation of the channel estimator is derived. Simulation results demonstrate the estimator's effectiveness.

Relevance: 30.00%

Abstract:

Standards for the construction of full-depth patches in Portland cement concrete pavement usually require replacement of all deteriorated base materials with crushed stone, up to the bottom of the existing pavement layer. In an effort to reduce patch construction time and costs, the Iowa Department of Transportation and the Department of Civil, Construction and Environmental Engineering at Iowa State University studied the use of extra concrete depth as an option for base construction. This report compares the impact of additional concrete patching material depth on the rate of strength gain, the potential for early opening to traffic, patching costs, and long-term patch performance. It also compares those characteristics for early-setting and standard concrete mixes. The results have the potential to change the method of Portland cement concrete pavement patch construction in Iowa.

Relevance: 30.00%

Abstract:

OBJECTIVE: This study was undertaken to determine the delay of extubation attributable to ventilator-associated pneumonia (VAP), in comparison with other complications and the complexity of surgery, after repair of congenital heart lesions in neonates and children. METHODS: Cohort study in a pediatric intensive care unit of a tertiary referral center. All patients who had cardiac operations during a 22-month period and who survived surgery were eligible (n = 272, median age 1.3 years). The primary outcome was time to successful extubation. The primary variable of interest was VAP. Surgical procedures were classified according to complexity. Cox proportional hazards models were calculated to adjust for confounding. Potential confounders comprised other known risk factors for delayed extubation. RESULTS: Median time to extubation was 3 days. VAP occurred in 26 patients (9.6%). The rate of VAP was not associated with complexity of surgery (P = 0.22) or cardiopulmonary bypass (P = 0.23). The adjusted analysis revealed further factors associated with delayed extubation: other respiratory complications (n = 28; chylothorax, airway stenosis, diaphragm paresis), prolonged inotropic support (n = 48, 17.6%), and the need for secondary surgery (n = 51, 18.8%; e.g., re-operation, secondary closure of the thorax). Older age promoted early extubation. The median delay of extubation attributable to VAP was 3.7 days (hazard ratio HR = 0.29, 95% CI 0.18-0.49), exceeding the effect size of secondary surgery (HR = 0.48) and other respiratory complications (HR = 0.50). CONCLUSION: VAP accounts for a major delay of extubation in pediatric cardiac surgery.
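
As a hedged illustration of the Cox proportional hazards analysis described above, the Python sketch below fits such a model with the lifelines package on synthetic stand-in data; the column names, effect sizes and data-generating assumptions are hypothetical and are not the study data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in for the cohort: time to successful extubation (days),
# an event indicator, and covariates named in the abstract.
rng = np.random.default_rng(1)
n = 272
df = pd.DataFrame({
    "vap": rng.integers(0, 2, n),
    "secondary_surgery": rng.integers(0, 2, n),
    "age_years": rng.exponential(2.0, n),
})
# Hypothetical data-generating model (VAP prolongs ventilation).
baseline = rng.exponential(3.0, n)
df["days_to_extubation"] = baseline * (1 + 2.5 * df["vap"])
df["extubated"] = 1  # assume every surviving patient was eventually extubated

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_extubation", event_col="extubated")
# A hazard ratio below 1 for VAP means a lower "rate" of extubation,
# i.e. delayed extubation, as with the reported HR = 0.29.
cph.print_summary()
```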

Relevance: 30.00%

Abstract:

The achievable region approach seeks solutions to stochastic optimisation problems by: (i) characterising the space of all possible performances (the achievable region) of the system of interest, and (ii) optimising the overall system-wide performance objective over this space. This is radically different from conventional formulations based on dynamic programming. The approach is explained with reference to a simple two-class queueing system. Powerful new methodologies due to the authors and co-workers are deployed to analyse a general multiclass queueing system with parallel servers and then to develop an approach to optimal load distribution across a network of interconnected stations. Finally, the approach is used for the first time to analyse a class of intensity control problems.
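
As a hedged illustration of steps (i) and (ii) for the simple two-class case, the Python sketch below optimises a linear holding-cost objective over a toy achievable region described by conservation-law-style constraints; the numeric bounds are placeholders rather than values derived from a specific queueing model.

```python
from scipy.optimize import linprog

# Two-class illustration of the achievable region approach:
#   x1, x2 = (weighted) mean delays of class 1 and class 2.
# Under any work-conserving policy, (x1, x2) is assumed to lie in a polytope:
#   x1 + x2 = C          (a conservation law: the total is policy independent)
#   x1 >= b1, x2 >= b2   (bounds attained when that class gets full priority)
# The numbers below are placeholders, not derived from a specific queue.
C, b1, b2 = 10.0, 2.0, 3.0
c1, c2 = 5.0, 1.0   # per-unit holding costs of the two classes

res = linprog(
    c=[c1, c2],
    A_eq=[[1.0, 1.0]], b_eq=[C],
    bounds=[(b1, None), (b2, None)],
    method="highs",
)
print("optimal performance vector:", res.x)
# With c1 > c2 the optimum sits at the vertex x1 = b1, x2 = C - b1,
# i.e. the policy that gives full priority to the expensive class 1.
```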

Relevance: 30.00%

Abstract:

Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex on the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability Vol. 33 No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL) (which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996)): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
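
The paper's dynamic indices for convex holding costs cannot be reconstructed from the abstract alone, but the classical Smith's rule they extend (the linear-cost case) is easy to state and is sketched below in Python; the job data are illustrative.

```python
# Classical Smith's rule (WSPT) for the linear-cost case that the paper's
# dynamic indices generalize: serve jobs in decreasing order of
# holding-cost rate divided by expected processing time.
jobs = [
    {"name": "A", "cost_rate": 4.0, "mean_proc_time": 2.0},
    {"name": "B", "cost_rate": 3.0, "mean_proc_time": 1.0},
    {"name": "C", "cost_rate": 5.0, "mean_proc_time": 4.0},
]

def smith_index(job):
    return job["cost_rate"] / job["mean_proc_time"]

schedule = sorted(jobs, key=smith_index, reverse=True)
print([f'{j["name"]} (index {smith_index(j):.2f})' for j in schedule])
# -> B (3.00), A (2.00), C (1.25)
```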

Relevance: 30.00%

Abstract:

This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states and next these states were embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models. The optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased. This bias can be avoided by using a power QALY model.
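
The abstract does not spell out the functional form of the power QALY model, so the Python sketch below assumes the common specification in which a chronic state with utility u lasting T years is valued u * T**r, extended additively to multi-state profiles, with r = 0.65 as estimated in the paper; the example utilities and durations are hypothetical.

```python
R = 0.65  # power coefficient estimated in the paper

def chronic_value(utility, years, r=R):
    """Power QALY value of a chronic state: u * T**r (the linear model is r = 1)."""
    return utility * years ** r

def profile_value(profile, r=R):
    """Assumed additive extension to a health profile given as a list of
    (utility, duration-in-years) segments, weighting year t by t**r - (t-1)**r."""
    value, elapsed = 0.0, 0.0
    for utility, years in profile:
        value += utility * ((elapsed + years) ** r - elapsed ** r)
        elapsed += years
    return value

# Hypothetical EQ-5D-style utilities: 10 years in a state valued 0.8,
# followed by 10 years in a state valued 0.5.
profile = [(0.8, 10.0), (0.5, 10.0)]
print(chronic_value(0.8, 10.0))        # single chronic state
print(profile_value(profile))          # two-segment profile, power model
print(profile_value(profile, r=1.0))   # linear QALY model for comparison
```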