968 results for Interval of solidification
Abstract:
Millennial variability is a robust feature of many paleoclimate records, at least throughout the last several glacial cycles. Here we use the mean signal from Antarctic climate events 1 to 4 to probe the EPICA Dome C temperature proxy reconstruction through the last 500 ka for similar millennial-scale events. We find that clusters of millennial events occurred in a regular fashion over half of the time during this period, with a mean recurrence interval of 21 kyr. We find that there is no consistent link between ice-rafted debris deposition and millennial variability. Instead we speculate that changes in the zonality of atmospheric circulation over the North Atlantic form a viable alternative to freshwater release from icebergs as a trigger for millennial variability. We suggest that millennial changes in the zonality of atmospheric circulation over the North Atlantic are linked to precession via sea-ice feedbacks and that this relationship is modified by the presence of the large Northern Hemisphere ice sheets during glacial periods.
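As a rough illustration of the template-probing approach described above, the sketch below slides a stacked mean of Antarctic events 1-4 along an evenly sampled proxy record and flags windows whose correlation with the template exceeds a threshold. The function names, the threshold, and the implementation details are assumptions for illustration, not the study's actual procedure.

```python
import numpy as np

def find_template_matches(record, template, threshold=0.8):
    """Slide a millennial-event template along an evenly sampled proxy
    record (NumPy array) and return the start indices of windows whose
    Pearson correlation with the template exceeds `threshold`."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    matches = []
    for i in range(len(record) - n + 1):
        window = record[i:i + n]
        w = (window - window.mean()) / (window.std() + 1e-12)
        if np.mean(w * t) > threshold:  # mean product of z-scores = Pearson r
            matches.append(i)
    return matches

# Hypothetical usage: `temps` is a temperature-anomaly series on a uniform
# time step and `event_stack` is the mean of events 1-4 on the same step.
# matches = find_template_matches(temps, event_stack, threshold=0.75)
```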
Abstract:
A growing number of drugs have been shown to prolong cardiac repolarization, predisposing individuals to life-threatening ventricular arrhythmias known as Torsades de Pointes. Most of these drugs are known to interfere with the human ether-à-go-go-related gene 1 (hERG1) channel, whose current is one of the main determinants of action potential duration. Prolonged repolarization is reflected by lengthening of the QT interval of the electrocardiogram, as seen in the suitably named drug-induced long QT syndrome. Chirality (the presence of an asymmetric atom) is a common feature of marketed drugs, which can therefore exist in at least two enantiomers with distinct three-dimensional structures and possibly distinct biological fates. Both the pharmacokinetic and pharmacodynamic properties can differ between enantiomers, as well as between individuals who take the drug, owing to metabolic polymorphisms. Despite the large number of reports about drugs reducing the hERG1 current, potential stereoselective contributions have scarcely been investigated. In this review, we present a non-exhaustive list of clinically important molecules which display chiral toxicity that may be related to hERG1-blocking properties. We particularly focus on methadone cardiotoxicity, which illustrates the importance of the stereoselective effect of drug chirality as well as individual variations resulting from pharmacogenetics. Furthermore, it seems likely that, during drug development, consideration of chirality in lead optimization and systematic assessment of the hERG1 current block with all enantiomers could contribute to reducing the risk of drug-induced LQTS.
Abstract:
Anaerobic digestion of food scraps has the potential to accomplish waste minimization, energy production, and compost or humus production. At Bucknell University, removal of food scraps from the waste stream could reduce municipal solid waste transportation costs and landfill tipping fees, and provide methane and humus for use on campus. To determine the suitability of food waste produced at Bucknell for high-solids anaerobic digestion (HSAD), a year-long characterization study was conducted. Physical and chemical properties, waste biodegradability, and annual production of biodegradable waste were assessed. Bucknell University food and landscape waste was digested at pilot-scale for over a year to test performance at low and high loading rates, ease of operation at 20% solids, benefits of codigestion of food and landscape waste, and to provide digestate for studies to assess the curing needs of HSAD digestate. A laboratory-scale curing study was conducted to assess the curing duration required to reduce microbial activity, phytotoxicity, and odors to acceptable levels for subsequent use of humus. The characteristics of Bucknell University food and landscape waste were tested approximately weekly for one year, to determine chemical oxygen demand (COD), total solids (TS), volatile solids (VS), and biodegradability (from batch digestion studies). Fats, oil, and grease and total Kjeldahl nitrogen were also tested for some food waste samples. Based on the characterization and biodegradability studies, Bucknell University dining hall food waste is a good candidate for HSAD. During batch digestion studies Bucknell University food waste produced a mean of 288 mL CH4/g COD with a 95% confidence interval of 0.06 mL CH4/g COD. The addition of landscape waste for digestion increased methane production from both food and landscape waste; however, because the landscape waste biodegradability was extremely low the increase was small. Based on an informal waste audit, Bucknell could collect up to 100 tons of food waste from dining facilities each year. The pilot-scale high-solids anaerobic digestion study confirmed that digestion of Bucknell University food waste combined with landscape waste at a low organic loading rate (OLR) of 2 g COD/L reactor volume-day is feasible. During low OLR operation, stable reactor performance was demonstrated through monitoring of biogas production and composition, reactor total and volatile solids, total and soluble chemical oxygen demand, volatile fatty acid content, pH, and bicarbonate alkalinity. Low OLR HSAD of Bucknell University food waste and landscape waste combined produced 232 L CH4/kg COD and 229 L CH4/kg VS. When OLR was increased to high loading (15 g COD/L reactor volume-day) to assess maximum loading conditions, reactor performance became unstable due to ammonia accumulation and subsequent inhibition. The methane production per unit COD also decreased (to 211 L CH4/kg COD fed), although methane production per unit VS increased (to 272 L CH4/kg VS fed). The degree of ammonia inhibition was investigated through respirometry in which reactor digestate was diluted and exposed to varying concentrations of ammonia. Treatments with low ammonia concentrations recovered quickly from ammonia inhibition within the reactor. The post-digestion curing process was studied at laboratory-scale, to provide a preliminary assessment of curing duration.
Digestate was mixed with woodchips and incubated in an insulated container at 35 °C to simulate full-scale curing self-heating conditions. Degree of digestate stabilization was determined through oxygen uptake rates, percent O2, temperature, volatile solids, and Solvita Maturity Index. Phytotoxicity was determined through observation of volatile fatty acid and ammonia concentrations. Stabilization of organics and elimination of phytotoxic compounds (after 10–15 days of curing) preceded significant reductions of volatile sulfur compounds (hydrogen sulfide, methanethiol, and dimethyl sulfide) after 15–20 days of curing. Bucknell University food waste has high biodegradability and is suitable for high-solids anaerobic digestion; however, it has a low C:N ratio which can result in ammonia accumulation under some operating conditions. The low biodegradability of Bucknell University landscape waste limits the amount of bioavailable carbon that it can contribute, making it unsuitable for use as a cosubstrate to increase the C:N ratio of food waste. Additional research is indicated to determine other cosubstrates with higher biodegradabilities that may allow successful HSAD of Bucknell University food waste at high OLRs. Some cosubstrates to investigate are office paper, field residues, or grease trap waste. A brief curing period of less than 3 weeks was sufficient to produce viable humus from digestate produced by low OLR HSAD of food and landscape waste.
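For readers unfamiliar with the loading and yield quantities quoted above, the arithmetic behind them is simple. The sketch below shows an assumed computation of an organic loading rate (g COD/L reactor volume-day) and a specific methane yield (L CH4/kg COD); the reactor volume and daily feed are illustrative figures, while the yield value reuses the abstract's reported low-OLR number.

```python
def organic_loading_rate(cod_fed_g_per_day, reactor_volume_L):
    """Organic loading rate in g COD per litre of reactor volume per day."""
    return cod_fed_g_per_day / reactor_volume_L

def methane_yield(methane_L, cod_fed_kg):
    """Specific methane yield in L CH4 per kg COD fed."""
    return methane_L / cod_fed_kg

# Assumed example: a 100 L pilot reactor fed 200 g COD/day corresponds to
# the low loading rate reported above.
print(organic_loading_rate(200, 100))   # 2.0 g COD/L-day
# At the reported low-OLR yield, each kg of COD fed produces about 232 L CH4.
print(methane_yield(232, 1.0))          # 232.0 L CH4/kg COD
```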
Abstract:
Acute mental stress induces a significant increase in plasma interleukin (IL)-6 levels as a possible mechanism for how psychological stress might contribute to atherosclerosis. We investigated whether the IL-6 response would habituate in response to a repetitively applied mental stressor and whether cortisol reactivity would show a relationship with IL-6 reactivity. Study participants were 21 reasonably healthy men (mean age 46+/-7 years) who underwent the Trier Social Stress Test (combination of a 3-min preparation, 5-min speech, and 5-min mental arithmetic) three times with an interval of 1 week. Plasma IL-6 and free salivary cortisol were measured immediately before and after stress, and at 45 and 105 min of recovery from stress. Cortisol samples were also obtained 15 and 30 min after stress. Compared to non-stressed controls, IL-6 significantly increased between rest and 45 min post-stress (p=.022) and between rest and 105 min post-stress (p=.001). Peak cortisol (p=.034) and systolic blood pressure (p=.009) responses to stress both habituated between weeks one and three. No adaptation occurred in diastolic blood pressure, heart rate, or IL-6 responses to stress. The areas under the curve integrating the stress-induced changes in cortisol and IL-6 reactivity were negatively correlated at visit three (r=-.54, p=.011), but not at visit one. The IL-6 response to acute mental stress is delayed and shows no adaptation to repeated moderate mental stress. The hypothalamus-pituitary-adrenal axis may attenuate stress reactivity of IL-6. The lack of habituation in IL-6 responses to daily stress could subject at-risk individuals to higher atherosclerotic morbidity and mortality.
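The 'areas under the curve' used above to integrate the cortisol and IL-6 responses are commonly computed with the trapezoidal rule over the sampling times; the sketch below illustrates one such calculation. The sampling times and hormone values are hypothetical, and the study's exact AUC formula is not given in the abstract.

```python
import numpy as np

def auc_increase(times_min, values):
    """Trapezoidal area under a response curve minus the rectangle defined
    by the baseline (first) sample: the 'area under the curve with respect
    to increase' often used for stress-reactivity measures."""
    values = np.asarray(values, dtype=float)
    times = np.asarray(times_min, dtype=float)
    total_area = np.trapz(values, times)
    baseline_area = values[0] * (times[-1] - times[0])
    return total_area - baseline_area

# Hypothetical salivary cortisol values (nmol/L) sampled before the TSST,
# immediately afterwards, and at +15, +30, +45 and +105 min of recovery.
times = [0, 13, 28, 43, 58, 118]
cortisol = [10.0, 18.0, 22.0, 19.0, 15.0, 11.0]
print(auc_increase(times, cortisol))
```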
Abstract:
Worldwide, 700,000 infants are infected annually by HIV-1, most of them in resource-limited settings. Care for these children requires simple, inexpensive tests. We have evaluated HIV-1 p24 antigen for antiretroviral treatment (ART) monitoring in children. p24 by boosted enzyme-linked immunosorbent assay of heated plasma and HIV-1 RNA were measured prospectively in 24 HIV-1-infected children receiving ART. p24 and HIV-1 RNA concentrations and their changes between consecutive visits were related to the respective CD4+ changes. Age at study entry was 7.6 years; follow-up was 47.2 months, yielding 18 visits at an interval of 2.8 months (medians). There were 399 complete visit data sets and 375 interval data sets. Controlling for variation between individuals, there was a positive relationship between concentrations of HIV-1 RNA and p24 (P < 0.0001). While controlling for initial CD4+ count, age, sex, days since start of ART, and days between visits, the relative change in CD4+ count between 2 successive visits was negatively related to the corresponding relative change in HIV-1 RNA (P = 0.009), but not to the initial HIV-1 RNA concentration (P = 0.94). Similarly, we found a negative relationship with the relative change in p24 over the interval (P < 0.0001), whereas the initial p24 concentration showed a trend (P = 0.08). Statistical support for the p24 model and the HIV-1 RNA model was similar. p24 may be an accurate low-cost alternative to monitor ART in pediatric HIV-1 infection.
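A regression of the kind described — the relative CD4+ change between visits modeled on the relative change in a viral marker, adjusting for covariates and for repeated intervals within each child — could be sketched as follows. The column names, the input file, and the use of a statsmodels mixed model are assumptions for illustration rather than the study's actual analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per between-visit interval with columns
# child_id, rel_change_cd4, rel_change_rna, initial_cd4, age, sex,
# days_on_art, days_between_visits (the file name is hypothetical).
df = pd.read_csv("intervals.csv")

# A random intercept per child accounts for variation between individuals,
# mirroring the repeated-measures structure described above.
model = smf.mixedlm(
    "rel_change_cd4 ~ rel_change_rna + initial_cd4 + age + sex"
    " + days_on_art + days_between_visits",
    data=df,
    groups=df["child_id"],
)
print(model.fit().summary())
```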
Abstract:
PURPOSE: To characterize the phenotype and map the locus responsible for autosomal recessive inherited ovine microphthalmia (OMO) in sheep. METHODS: Microphthalmia-affected lambs and their available relatives were collected in the field, and experimental matings were performed to obtain affected and normal lambs for detailed necropsy and histologic examinations. The matings resulted in 18 sheep families with 48 cases of microphthalmia. A comparative candidate gene approach was used to map the disease locus within the sheep genome. Initially, 27 loci responsible for the microphthalmia-anophthalmia phenotypes in humans or mice were selected to test for comparative linkage. Fifty flanking markers that were predicted from comparative genomic analysis to be closely linked to these genes were tested for linkage to the disease locus. After observation of statistical evidence for linkage, a confirmatory fine-mapping strategy was applied by further genotyping of 43 microsatellites. RESULTS: The clinical and pathologic examinations showed slightly variable expressivity of isolated bilateral microphthalmia. The anterior eye chamber was small or absent, and a white mass admixed with cystic spaces extended from the papilla to the anterior eye chamber, while no recognizable vitreous body or lens was found within the affected eyes. Significant linkage to a single candidate region was identified on sheep chromosome 23. Fine mapping and haplotype analysis assigned the candidate region to a critical interval of 12.4 cM. This ovine chromosome segment encompasses an ancestral chromosomal breakpoint and corresponds to two orthologous segments on the short and long arms of human chromosome 18. For the examined animals, we excluded the complete coding region and adjacent intronic regions of ovine TGIF1 as harboring disease-causing mutations. CONCLUSIONS: This is the first genetic localization for hereditary ovine isolated microphthalmia. It seems unlikely that a mutation in the TGIF1 gene is responsible for this disorder. The studied sheep represent a valuable large animal model for similar human ocular phenotypes.
Abstract:
The aim of the present study was to measure transit patterns of nutrients and the absorptive ability in ruminal drinkers (RDs) compared with healthy unweaned calves. The acetaminophen (paracetamol) absorption test was used to characterize the oroduodenal transit rate. Clinical examination and the analysis of various blood parameters provided supplementary information on digestive processes. Three unweaned bucket-fed calves (one RD and two healthy controls) each from seven Swiss dairy farms were included in the study. Measurements (tests 1 and 2) were performed twice at an interval of 10 days. Between tests, the feeding technique of the RDs and one control calf per farm was changed to feeding with a nipple instead of by bucket (without nipple). Acetaminophen appearance in the blood was delayed and reduced in RDs compared with the controls. Acid-base metabolism and several haematological and metabolic parameters differed markedly between RDs and healthy controls. The characteristics of the oroduodenal transit rate, absorptive abilities and clinical status in RDs were nearly normalised within 10 days of reconditioning.
Abstract:
BACKGROUND: Data on female patients with atherosclerotic peripheral arterial disease (PAD) are scarce and limited primarily to the elderly population with multilevel disease. In this longitudinal observational study we compare female patients below 60 years of age with isolated lesions at the aortic bifurcation or focal superficial femoral artery disease. PATIENTS AND METHODS: The analysis is based on a consecutive series of 43 female patients with PAD limited to the aortoiliac bifurcation (n = 28, group I) or an isolated femoral segment at the adductor canal (n = 15, group II) seen in a tertiary referral center between 1998 and 2000. The first assessment provided baseline data, with follow-up data obtained for this study. Traditional risk factors, carotid artery disease and clinical outcome (mortality, cardiovascular events, vascular re-intervention rate, PAD progression) were evaluated over an interval of 5 (2 to 8) years. RESULTS: Female patients with aortic disease (group I) were younger (51.8 +/- 7.7 vs. 56.7 +/- 7.6 years in group II; p = 0.048), presented a more masculine phenotype, and smoked significantly more often (82% vs. 40%; p = 0.007). Arterial hypertension and diabetes mellitus were more common in group II, although these differences did not reach statistical significance (p = 0.068 and p = 0.085). Cardiovascular and limb outcomes were comparable in both groups of female patients, while carotid artery disease was more severe in group I (carotid plaques in 71% vs. 53%). CONCLUSION: Our data support previous findings that cigarette smoking is a stronger risk factor for aortic disease than for femoral disease in younger female patients, with the strongest effect of smoking on a localized region of the aortic bifurcation.
Abstract:
Early impaired cerebral blood flow (CBF) after severe head injury (SHI) leads to poor brain tissue oxygen delivery and lactate accumulation. The purpose of this investigation was to elucidate the relationship between CBF, local dialysate lactate (lact(md)) and dialysate glucose (gluc(md)), and brain tissue oxygen levels (PtiO2) under arterial normoxia. The effect of increased brain tissue oxygenation due to high fractions of inspired oxygen (FiO2) on lact(md) and CBF was explored. A total of 47 patients with SHI (Glasgow Coma Score [GCS] < 8) were enrolled in this study. CBF was first assessed in 40 patients at one time point in the first 96 hours (27 +/- 28 hours) after SHI using stable xenon computed tomography (Xe-CT) (30% inspired xenon [FiXe] and 35% FiO2). In a second study, sequential double CBF measurements were performed in 7 patients with 35% FiO2 and 60% FiO2, respectively, with an interval of 30 minutes. In a subsequent study, 14 patients underwent normobaric hyperoxia by increasing FiO2 from 35 +/- 5% to 60% and then 100% over a period of 6 hours. This was done to test the effect of normobaric hyperoxia on lact(md) and brain gluc(md), as measured by local microdialysis. Changes in PtiO2 in response to changes in FiO2 were analyzed by calculating the oxygen reactivity. Oxygen reactivity was then related to the 3-month outcome data. The levels of lact(md) and gluc(md) under hyperoxia were compared with the baseline levels, measured at 35% FiO2. Under normoxic conditions, there was a significant correlation between CBF and PtiO2 (R = 0.7; P < .001). In the sequential double CBF study, however, FiO2 was inversely correlated with CBF (P < .05). In the 14 patients undergoing the 6-hour 100% FiO2 challenge, the mean PtiO2 levels increased to 353 (87% compared with baseline), while the mean lact(md) levels decreased by 38 +/- 16% (P < .05). The PtiO2 response to 100% FiO2 (oxygen reactivity) was inversely correlated with outcome (P < .01). Monitoring PtiO2 after SHI provides valuable information about cerebral oxygenation and substrate delivery. Increasing arterial oxygen tension (PaO2) effectively increased PtiO2, and brain lact(md) was reduced by the same maneuver.
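The 'oxygen reactivity' relating the PtiO2 response to the FiO2 step can be illustrated with a minimal sketch. Since the abstract does not state the exact formula used in the study, the definition below (relative PtiO2 change per unit change in FiO2) and the example values are assumptions.

```python
def oxygen_reactivity(ptio2_base, ptio2_hyper, fio2_base, fio2_hyper):
    """Assumed definition: relative PtiO2 increase per unit increase in FiO2."""
    relative_ptio2_change = (ptio2_hyper - ptio2_base) / ptio2_base
    return relative_ptio2_change / (fio2_hyper - fio2_base)

# Hypothetical values: PtiO2 rising from 25 to 70 mm Hg as FiO2 is raised
# from 0.35 to 1.0 during the hyperoxia challenge.
print(oxygen_reactivity(25.0, 70.0, 0.35, 1.0))
```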
Abstract:
BACKGROUND: At a mean follow-up of 3.1 years, twenty-seven consecutive repairs of massive rotator cuff tears yielded good and excellent clinical results despite a retear rate of 37%. Patients with a retear had improvement over the preoperative state, but those with a structurally intact repair had a substantially better result. The purpose of this study was to reassess the same patients to determine the long-term functional and structural results. METHODS: At a mean follow-up interval of 9.9 years, twenty-three of the twenty-seven patients returned for a review and were examined clinically, radiographically, and with magnetic resonance imaging with use of a methodology identical to that used at 3.1 years. RESULTS: Twenty-two of the twenty-three patients remained very satisfied or satisfied with the result. The mean subjective shoulder value was 82% (compared with 80% at 3.1 years). The mean relative Constant score was 85% (compared with 83% at 3.1 years). The retear rate was 57% at 9.9 years (compared with 37% at 3.1 years; p = 0.168). Patients with an intact repair had a better result than those with a failed reconstruction with respect to the mean absolute Constant score (81 compared with 64 points, respectively; p = 0.015), mean relative Constant score (95% and 77%; p = 0.002), and mean strength of abduction (5.5 and 2.6 kg; p = 0.007). The mean retear size had increased from 882 to 1164 mm² (p = 0.016). Supraspinatus and infraspinatus muscle fatty infiltration had increased (p = 0.004 and 0.008, respectively). Muscles with torn tendons preoperatively showed more fatty infiltration than muscles with intact tendons preoperatively, regardless of repair integrity. Shoulders with a retear had a significantly higher mean acromion index than those without retear (0.75 and 0.65, respectively; p = 0.004). CONCLUSIONS: Open repair of massive rotator cuff tears yielded clinically durable, excellent results with high patient satisfaction at a mean of almost ten years postoperatively. Conversely, fatty muscle infiltration of the supraspinatus and infraspinatus progressed, and the retear size increased over time. The preoperative integrity of the tendon appeared to be protective against muscle deterioration. A wide lateral extension of the acromion was identified as a previously unknown risk factor for retearing.
Abstract:
OBJECTIVE: To obtain precise information on the optimal time window for surgical antimicrobial prophylaxis. SUMMARY BACKGROUND DATA: Although perioperative antimicrobial prophylaxis is a well-established strategy for reducing the risk of surgical site infections (SSI), the optimal timing for this procedure has yet to be precisely determined. Under today's recommendations, antibiotics may be administered within the final 2 hours before skin incision, ideally as close to incision time as possible. METHODS: In this prospective observational cohort study at Basel University Hospital we analyzed the incidence of SSI by the timing of antimicrobial prophylaxis in a consecutive series of 3836 surgical procedures. Surgical wounds and resulting infections were assessed according to Centers for Disease Control and Prevention standards. Antimicrobial prophylaxis consisted of single-shot administration of 1.5 g of cefuroxime (plus 500 mg of metronidazole in colorectal surgery). RESULTS: The overall SSI rate was 4.7% (180 of 3836). In 49% of all procedures antimicrobial prophylaxis was administered within the final half hour. Multivariable logistic regression analyses showed a significant increase in the odds of SSI when antimicrobial prophylaxis was administered less than 30 minutes before incision (crude odds ratio = 2.01; adjusted odds ratio = 1.95; 95% confidence interval, 1.4-2.8; P < 0.001) or 120 to 60 minutes before incision (crude odds ratio = 1.75; adjusted odds ratio = 1.74; 95% confidence interval, 1.0-2.9; P = 0.035), as compared with the reference interval of 59 to 30 minutes before incision. CONCLUSIONS: When cefuroxime is used as a prophylactic antibiotic, administration 59 to 30 minutes before incision is more effective than administration during the last half hour.
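An analysis of the kind reported — odds of SSI by timing category with the 59-to-30-minute window as the reference — can be sketched with an ordinary logistic regression. The column names, the input file, and the choice of statsmodels below are illustrative assumptions, not the study's actual model, which also adjusted for additional covariates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per procedure, `ssi` coded 0/1 and `timing` a
# category such as "<30", "30-59" or "60-120" minutes before incision
# (the file name and column names are hypothetical).
df = pd.read_csv("procedures.csv")

# Logistic regression with the 59-to-30-minute window as the reference
# interval, as in the study; covariate adjustment is omitted here.
model = smf.logit("ssi ~ C(timing, Treatment(reference='30-59'))", data=df)
result = model.fit()

print(np.exp(result.params))      # odds ratios per timing window
print(np.exp(result.conf_int()))  # 95% confidence intervals
```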
Abstract:
Through the use of rhetoric centered on authority and risk avoidance, scientific method has co-opted knowledge, especially women's everyday and experiential knowledge in the domestic sphere. This, in turn, has had a profound effect on technical communication in the present day. I draw on rhetorical theory to study cookbooks and recipes for their contributions to changes in instructional texts. Using the rhetorical lenses of metis (cunning intelligence), kairos (timing and fitness) and mneme (memory), I examine the way in which recipes and cookbooks are constructed, used and perceived. This helps me uncover lost voices in history, the voices of women who used recipes, produced cookbooks and changed the way instructions read. Beginning with the earliest cookbooks and recipes, but focusing on the pivotal temporal interval of 1870-1935, I investigate the writing and rhetorical forces shaping instruction sets and domestic discourse. By the time of scientific cooking and domestic science, everyday and experiential knowledge were being excluded to make room for scientific method and the industrial values of the public sphere. In this study, I also assess how the public sphere, via Cooperative Extension Services and other government agencies, impacted the domestic sphere, further devaluing everyday knowledge in favor of the public scientific model. I will show how the changes in the production of food, cookbooks and recipes were related to changes in technical communication. These changes had wide rippling effects on the field of technical communication. By returning to some of the tenets and traditions of everyday and experiential knowledge, technical communication scholars, practitioners and instructors today can find new ways to encounter technical communication, specifically regarding the creation of instructional texts. Bringing cookbooks, recipes and everyday knowledge into the classroom and the field engenders a new realm of epistemological possibilities.
Abstract:
BACKGROUND: Transient neurological dysfunction (TND) consists of postoperative confusion, delirium and agitation. It is underestimated after surgery on the thoracic aorta, and its influence on long-term quality of life (QoL) has not yet been studied. This study aimed to assess the influence of TND on short- and long-term outcome following surgery of the ascending aorta and proximal arch. METHODS: Nine hundred and seven patients undergoing surgery of the ascending aorta and the proximal aortic arch at our institution were included. Two hundred and ninety patients (31.9%) underwent surgery because of acute aortic dissection type A (AADA) and 617 patients because of aortic aneurysm. In 547 patients (60.3%) the distal anastomosis was performed using deep hypothermic circulatory arrest (DHCA). TND was defined as a Glasgow coma scale (GCS) value <13. All surviving patients had a clinical follow-up, and QoL was assessed with an SF-36 questionnaire. RESULTS: Overall in-hospital mortality was 8.3%. TND occurred in 89 patients (9.8%). Compared to patients without TND, those who suffered from TND were older (66.4 vs 59.9 years, p<0.01) and more frequently underwent emergency procedures (53% vs 32%, p<0.05) and surgery under DHCA (84.3% vs 57.7%, p<0.05). However, duration of DHCA and extent of surgery did not influence the incidence of TND. In-hospital mortality was similar in patients with and without TND (12.0% vs 11.4%; p=ns). Patients with TND suffered more frequently from coronary artery disease (28% vs 20.8%, p=ns) and were more frequently admitted in a compromised haemodynamic condition (23.6% vs 9.9%, p<0.05). The postoperative course revealed more pulmonary complications, such as prolonged mechanical ventilation. In addition to their transient neurological dysfunction, significantly more patients with TND had strokes with permanent neurological loss of function (14.6% vs 4.8%, p<0.05) compared to patients without TND. ICU and hospital stay were significantly prolonged in TND patients (18+/-13 days vs 12+/-7 days, p<0.05). Over a mean follow-up interval of 27+/-14 months, patients with TND showed significantly impaired QoL. CONCLUSION: The neurological outcome following surgery of the ascending aorta and proximal aortic arch is of paramount importance. The impact of TND is underestimated: it negatively affects both short- and long-term outcome.
Abstract:
The performance of memory-guided saccades with two different delays (3 and 30 s of memorization) was studied in seven healthy subjects. Double-pulse transcranial magnetic stimulation (dTMS) with an interstimulus interval of 100 ms was applied over the right dorsolateral prefrontal cortex (DLPFC) early (1 s after target presentation) and late (28 s after target presentation). Early stimulation significantly increased the percentage of error in amplitude (PEA) of contralateral memory-guided saccades in both delays, compared to the control experiment without stimulation. dTMS applied late in the delay had no significant effect on PEA. Furthermore, we found a significantly smaller effect of early stimulation in the long-delay paradigm. These results suggest a time-dependent hierarchical organization of spatial working memory, with a functional dominance of the DLPFC during early memorization, independent of the memorization delay. For a long memorization delay, however, working memory seems to have an additional, DLPFC-independent component.
Abstract:
Rapid diagnostic tests (RDT) are sometimes recommended to improve the home-based management of malaria. The accuracy of an RDT for the detection of clinical malaria and the presence of malarial parasites has recently been evaluated in a high-transmission area of southern Mali. During the same study, the cost-effectiveness of a 'test-and-treat' strategy for the home-based management of malaria (based on an artemisinin combination therapy, ACT) was compared with that of a 'treat-all' strategy. Overall, 301 patients, of all ages, each of whom had been considered a presumptive case of uncomplicated malaria by a village healthworker, were checked with a commercial RDT (Paracheck-Pf). The sensitivity, specificity, and positive and negative predictive values of this test, compared with the results of microscopy and two different definitions of clinical malaria, were then determined. The RDT was found to be 82.9% sensitive (with a 95% confidence interval of 78.0%-87.1%) and 78.9% (63.9%-89.7%) specific compared with the detection of parasites by microscopy. In the detection of clinical malaria, it was 95.2% (91.3%-97.6%) sensitive and 57.4% (48.2%-66.2%) specific compared with a general practitioner's diagnosis of the disease, and 100.0% (94.5%-100.0%) sensitive but only 30.2% (24.8%-36.2%) specific when compared against the fulfillment of the World Health Organization's (2003) research criteria for uncomplicated malaria. Among children aged 0-5 years, the cost of the 'test-and-treat' strategy, per episode, was about twice that of the 'treat-all' strategy (U.S.$1.0 vs. U.S.$0.5). In older subjects, however, the two strategies were equally costly (approximately U.S.$2/episode). In conclusion, for children aged 0-5 years in a high-transmission area of sub-Saharan Africa, use of the RDT was not cost-effective compared with the presumptive treatment of malaria with an ACT. In older patients, use of the RDT did not reduce costs. The question remains whether either of the strategies investigated can be made affordable for the affected population.
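The sensitivity, specificity, and predictive values reported for the RDT all follow from a two-by-two table of test result against the reference standard; the sketch below shows that arithmetic with purely illustrative counts (the study's raw counts are not given in the abstract).

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table of RDT result
    against a reference standard (e.g. microscopy)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

# Purely illustrative counts, not the study data.
print(diagnostic_metrics(tp=80, fp=10, fn=20, tn=40))
```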