969 results for prospective study


Relevance: 60.00%

Abstract:

BACKGROUND 5-Aminolevulinic acid (5-ALA; Gliolan, Medac, Wedel, Germany) is approved for fluorescence-guided resections of adult malignant gliomas. Case reports indicate that 5-ALA can be used in children, yet no prospective study has been conducted to date. As a basis for such a study, we conducted a survey among certified European Gliolan users to collect data on their experiences with children. METHODS Information on patient characteristics, MRI characteristics of tumors, histology, fluorescence qualities, and outcomes was requested. Surgeons were further asked to indicate whether fluorescence was "useful", i.e., leading to changes in surgical strategy or identification of residual tumor. Recursive partitioning analysis (RPA) was used to define cohorts with high or low likelihoods of useful fluorescence. RESULTS Data on 78 patients <18 years of age were submitted by 20 centers. Fluorescence was found useful in 12 of 14 glioblastomas (85 %), four of five anaplastic astrocytomas (60 %), and eight of ten ependymomas grades II and III (80 %). Fluorescence was found inconsistently useful in PNETs (three of seven; 43 %), gangliogliomas (two of five; 40 %), medulloblastomas (two of eight; 25 %), and pilocytic astrocytomas (two of 13; 15 %). RPA of pre-operative factors showed that tumors with supratentorial location, strong contrast enhancement, and a first operation had a likelihood of useful fluorescence of 64.3 %, as opposed to 23.1 % for infratentorial tumors undergoing first surgery. CONCLUSIONS Our survey demonstrates that 5-ALA is being used in pediatric brain tumors. 5-ALA may be especially useful for contrast-enhancing supratentorial tumors. These data indicate that controlled studies are necessary and also provide a basis for planning such a study.
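The abstract names recursive partitioning analysis (RPA) but gives no implementation details. A minimal sketch of how such an analysis over binary pre-operative factors might be set up with scikit-learn, using entirely hypothetical column names and invented data, could look like this:

```python
# Hypothetical sketch of a recursive partitioning analysis (RPA) over
# pre-operative factors; variable names and data are illustrative only.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy data: one row per patient, binary pre-operative factors and outcome.
df = pd.DataFrame({
    "supratentorial":      [1, 1, 0, 0, 1, 0, 1, 0],
    "strong_enhancement":  [1, 0, 1, 0, 1, 0, 1, 1],
    "first_operation":     [1, 1, 1, 1, 0, 1, 0, 1],
    "useful_fluorescence": [1, 0, 0, 0, 1, 0, 1, 0],
})

X = df[["supratentorial", "strong_enhancement", "first_operation"]]
y = df["useful_fluorescence"]

# A shallow tree mimics RPA: each leaf is a cohort with its own event rate.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))
```

Each leaf of the shallow tree corresponds to a cohort whose event rate plays the role of the "likelihood of useful fluorescence" reported above.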

Relevance: 60.00%

Abstract:

Objective: To investigate objective and subjective effects of an adjunctive contralateral routing of signal (CROS) device at the untreated ear in patients with a unilateral cochlear implant (CI). Design: Prospective study of 10 adult experienced unilateral CI users with bilateral severe-to-profound hearing loss. Speech in noise reception (SNR) and sound localization were measured with and without the additional CROS device. SNR was measured by applying speech signals at the untreated/CROS side while noise signals came from the front (S90N0). For S0N90, the signal sources were switched. Sound localization was measured in a 12-loudspeaker full-circle setup. To evaluate the subjective benefit, patients tried the device for 2 weeks at home and then filled out the abbreviated Speech, Spatial and Qualities of Hearing Scale as well as the Bern Benefit in Single-Sided Deafness questionnaire. Results: In the S90N0 setting, all patients showed a highly significant SNR improvement when wearing the additional CROS device (mean 6.4 dB, p < 0.001). In the unfavorable S0N90 setting, only a minor deterioration of speech understanding was noted (mean -0.66 dB, p = 0.54). Sound localization did not improve substantially with CROS. In the two questionnaires, 12 of 14 items showed an improvement in mean values, but none of them was statistically significant. Conclusion: Patients with a unilateral CI benefit from a contralateral CROS device, particularly in a noisy environment when speech comes from the CROS ear side. © 2014 S. Karger AG, Basel.
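The abstract reports a mean 6.4 dB improvement in the S90N0 condition but does not name the statistical test. A paired comparison of the two listening conditions might be sketched as follows; both the choice of a paired t-test and the dB values are assumptions made purely for illustration:

```python
# Illustrative paired comparison for the S90N0 condition (speech at the CROS
# side, noise from the front); test choice and values are assumptions.
import numpy as np
from scipy import stats

srt_ci_alone     = np.array([2.5, 1.8, 3.2, 0.9, 2.1, 3.5, 1.2, 2.8, 1.5, 2.0])  # dB SNR
srt_ci_plus_cros = srt_ci_alone - np.array([6.1, 6.8, 5.9, 7.0, 6.5, 6.2, 6.7, 5.8, 6.3, 6.4])

t, p = stats.ttest_rel(srt_ci_alone, srt_ci_plus_cros)
print(f"mean improvement = {np.mean(srt_ci_alone - srt_ci_plus_cros):.1f} dB, p = {p:.2g}")
```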

Relevance: 60.00%

Abstract:

BACKGROUND The endothelial glycocalyx participates in the maintenance of vascular integrity, and its perturbations cause capillary leakage, loss of vascular responsiveness, and enhanced adhesion of leukocytes and platelets. We hypothesized that marked shedding of the glycocalyx core protein syndecan-1 occurs in end-stage liver disease (ESLD) and that it increases during orthotopic liver transplantation (OLT). We further evaluated the effects of general anesthesia on glycocalyx shedding and its association with acute kidney injury (AKI) after OLT. PATIENTS AND METHODS Thirty consecutive liver transplant recipients were enrolled in this prospective study. Ten healthy volunteers served as controls. Acute kidney injury was defined by the Acute Kidney Injury Network criteria. RESULTS Plasma syndecan-1 was significantly higher in ESLD patients than in healthy volunteers (74.3 ± 59.9 vs 10.7 ± 9.4 ng/mL), and it increased further and significantly after reperfusion (74.3 ± 59.9 vs 312.6 ± 114.8 ng/mL). The type of general anesthesia had no significant effect on syndecan-1. Syndecan-1 was significantly higher throughout the study in patients with posttransplant AKI stage 2 or 3 than in patients with AKI stage 0 or 1. The area under the receiver operating characteristic curve for syndecan-1 to predict AKI stage 2 or 3 within 48 hours after reperfusion was 0.76 (95% confidence interval, 0.57-0.89; P = 0.005). CONCLUSIONS Patients with ESLD suffer from glycocalyx alterations, and ischemia-reperfusion injury during OLT further exacerbates the damage. Despite a higher incidence of AKI in patients with elevated syndecan-1, the marker is not helpful for predicting de novo AKI. Volatile anesthetics did not attenuate glycocalyx shedding in human OLT.
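The reported AUC of 0.76 comes from a standard ROC analysis. A minimal sketch of such a computation, with invented syndecan-1 values and AKI labels standing in for the study data, could look like this:

```python
# Illustrative sketch of the ROC analysis: predicting AKI stage 2-3 from
# post-reperfusion syndecan-1; all values are made up for demonstration.
import numpy as np
from sklearn.metrics import roc_auc_score

syndecan_ng_ml = np.array([120, 450, 310, 95, 520, 280, 610, 150, 330, 400])
aki_stage_2_3  = np.array([0,   1,   0,   0,  1,   0,   1,   0,   1,   1])

auc = roc_auc_score(aki_stage_2_3, syndecan_ng_ml)
print(f"AUC = {auc:.2f}")
```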

Relevance: 60.00%

Abstract:

PURPOSE The aim of this study was to compare the diagnostic accuracy of 3D time-of-flight (TOF-MRA) and contrast-enhanced (CE-MRA) magnetic resonance angiography at 3 T for the detection and quantification of proximal high-grade stenosis, using multidetector computed tomography angiography (MDCTA) as the reference standard. METHODS The institutional ethics committee approved this prospective study. A total of 41 patients suspected of having internal carotid artery (ICA) stenosis underwent both MDCTA and MRA. CE-MRA and TOF-MRA were performed on a 3.0-T imager with a dedicated eight-element cervical coil. ICA stenoses were measured according to the North American Symptomatic Carotid Endarterectomy Trial (NASCET) criteria and categorized as 0-25 % (minimal), 25-50 % (mild), 50-69 % (moderate), 70-99 % (high grade), and 100 % (occlusion). Sensitivity and specificity for the detection of high-grade ICA stenoses (70-99 %) and ICA occlusions were determined. In addition, intermodality agreement for the detection of high-grade ICA stenoses (70-99 %) and ICA occlusions was assessed with κ-statistics. RESULTS A total of 80 carotid arteries of 41 patients were reviewed. Two previously stented ICAs were excluded from analysis. On MDCTA, 7 ICAs were occluded, 12 ICAs presented with and 63 without a high-grade ICA stenosis (70-99 %). For detecting 70-99 % stenosis, both 3D TOF-MRA and CE-MRA were 91.7 % sensitive and 98.5 % specific. Both MRA techniques were highly sensitive (100 %) and specific (CE-MRA, 100 %; TOF-MRA, 98.7 %) for the detection of ICA occlusion. However, TOF-MRA misclassified one high-grade stenosis as an occlusion. Intermodality agreement for the detection of 70-99 % ICA stenoses was excellent between TOF-MRA and CE-MRA [κ = 0.902, 95 % confidence interval (CI) = 0.769-1.000], TOF-MRA and MDCTA (κ = 0.902, 95 % CI = 0.769-1.000), and CE-MRA and MDCTA (κ = 0.902, 95 % CI = 0.769-1.000). CONCLUSION Both 3D TOF-MRA and CE-MRA at 3 T are reliable tools for detecting high-grade proximal ICA stenoses (70-99 %). 3D TOF-MRA might misclassify pseudo-occlusions as complete occlusions. If there are no contraindications to CE-MRA, CE-MRA is recommended as the primary MR imaging modality.
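For readers unfamiliar with the NASCET grading used above, the degree of stenosis is conventionally computed as (1 − narrowest residual lumen / normal distal ICA diameter) × 100. The hypothetical helper below illustrates that formula and the category cut-offs from the abstract; the function names and example measurements are illustrative only:

```python
# Hypothetical helper illustrating the NASCET degree-of-stenosis calculation
# and the categories used in the abstract; thresholds follow the text.
def nascet_stenosis_percent(residual_lumen_mm: float, distal_ica_mm: float) -> float:
    """NASCET: stenosis (%) = (1 - narrowest residual lumen / normal distal ICA) * 100."""
    return (1.0 - residual_lumen_mm / distal_ica_mm) * 100.0

def categorize(stenosis_percent: float, occluded: bool = False) -> str:
    if occluded:
        return "occlusion (100 %)"
    if stenosis_percent < 25:
        return "minimal (0-25 %)"
    if stenosis_percent < 50:
        return "mild (25-50 %)"
    if stenosis_percent < 70:
        return "moderate (50-69 %)"
    return "high grade (70-99 %)"

print(categorize(nascet_stenosis_percent(1.2, 5.0)))  # -> "high grade (70-99 %)"
```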

Relevance: 60.00%

Abstract:

PURPOSE In traumatic brain injury, diffusion-weighted and diffusion tensor imaging of the brain are essential techniques for determining the pathology sustained and the outcome. Postmortem cross-sectional imaging is an established adjunct to forensic autopsy in death investigation. The purpose of this prospective study was to evaluate postmortem diffusion tensor imaging in forensics with respect to its feasibility, influencing factors, and correlation with the cause of death as compared with autopsy. METHODS Postmortem computed tomography, magnetic resonance imaging, and diffusion tensor imaging with fiber tracking were performed in 10 deceased subjects. Likert scale grading of colored fractional anisotropy maps was correlated with body temperature and intracranial pathology to assess the diagnostic feasibility of postmortem diffusion tensor imaging and fiber tracking. RESULTS Optimal fiber tracking (>15,000 fiber tracts) was achieved at a body temperature of 10°C. Likert scale grading showed no linear correlation with fiber tract counts (P > 0.7). No statistically significant correlation between total fiber count and postmortem interval was observed (P = 0.122). Postmortem diffusion tensor imaging and fiber tracking allowed radiological diagnosis in cases with shearing injuries but were impaired in cases with pneumocephalus and intracerebral mass hemorrhage. CONCLUSIONS Postmortem diffusion tensor imaging with fiber tracking provides an exceptional in situ insight "deep into the fibers" of the brain, with diagnostic benefit for traumatic brain injury and axonal injury in the assessment of the underlying cause of death, provided the influencing factors for an optimal imaging technique are taken into account.
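The colored fractional anisotropy (FA) maps mentioned above are derived from the eigenvalues of the diffusion tensor, using the conventional FA definition. A short illustrative computation with made-up eigenvalues is shown below:

```python
# Illustrative computation of fractional anisotropy (FA), the quantity behind
# the colored FA maps; the eigenvalues are example values, not study data.
import numpy as np

def fractional_anisotropy(l1: float, l2: float, l3: float) -> float:
    lam = np.array([l1, l2, l3], dtype=float)
    md = lam.mean()  # mean diffusivity
    return float(np.sqrt(1.5) * np.linalg.norm(lam - md) / np.linalg.norm(lam))

# Strongly anisotropic tensor (e.g., an intact white-matter fiber bundle):
print(round(fractional_anisotropy(1.7e-3, 0.3e-3, 0.3e-3), 2))  # ~0.80
```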

Relevance: 60.00%

Abstract:

Dimensional alterations of the facial soft and bone tissues following tooth extraction in the esthetic zone play an essential role in achieving successful outcomes in implant therapy. This prospective study is the first to investigate the interplay between soft tissue dimensions and the underlying bone anatomy over an 8-wk healing period. The analysis is based on sequential 3-dimensional digital surface model superimpositions of the soft and bone tissues using digital impressions and cone beam computed tomography. Soft tissue thickness in thin and thick bone phenotypes at extraction was similar, averaging 0.7 mm and 0.8 mm, respectively. Interestingly, thin bone phenotypes revealed a 7-fold increase in soft tissue thickness after the 8-wk healing period, whereas in thick bone phenotypes the soft tissue dimensions remained unchanged. The observed spontaneous soft tissue thickening in thin bone phenotypes resulted in a vertical soft tissue loss of only 1.6 mm, which concealed the underlying vertical bone resorption of 7.5 mm. Because of the spontaneous soft tissue thickening, no significant differences were detected in total tissue loss between thin and thick bone phenotypes at 2, 4, 6, and 8 wk. More than 51% of these dimensional alterations occurred within 2 wk of healing. Even though the spontaneous soft tissue thickening in thin bone phenotypes following tooth extraction conceals the pronounced underlying bone resorption pattern by masking the true bone deficiency, it offers advantages for subsequent bone regeneration and implant therapy in sites with high esthetic demand (Clinicaltrials.gov NCT02403700).

Relevance: 60.00%

Abstract:

OBJECTIVE: To evaluate serum concentrations of biochemical markers and survival time in dogs with protein-losing enteropathy (PLE). DESIGN: Prospective study. ANIMALS: 29 dogs with PLE and 18 dogs with food-responsive diarrhea (FRD). PROCEDURES: Data regarding serum concentrations of various biochemical markers at the initial evaluation were available for 18 of the 29 dogs with PLE and compared with findings for dogs with FRD. Correlations between biochemical marker concentrations and survival time (interval between time of initial evaluation and death or euthanasia) for dogs with PLE were evaluated. RESULTS: Serum C-reactive protein concentration was high in 13 of 18 dogs with PLE and in 2 of 18 dogs with FRD. Serum concentration of canine pancreatic lipase immunoreactivity was high in 3 dogs with PLE but within the reference interval in all dogs with FRD. Serum α1-proteinase inhibitor concentration was less than the lower reference limit in 9 dogs with PLE and 1 dog with FRD. Compared with findings in dogs with FRD, values of those 3 variables in dogs with PLE were significantly different. Serum calprotectin (measured by radioimmunoassay and ELISA) and S100A12 concentrations were high but did not differ significantly between groups. Seventeen of the 29 dogs with PLE were euthanized owing to this disease; median survival time was 67 days (range, 2 to 2,551 days). CONCLUSIONS AND CLINICAL RELEVANCE: Serum C-reactive protein, canine pancreatic lipase immunoreactivity, and α1-proteinase inhibitor concentrations differed significantly between dogs with PLE and FRD. Most initial biomarker concentrations were not predictive of survival time in dogs with PLE.

Relevance: 60.00%

Abstract:

Conclusion Using a second bone-anchored hearing implant (BAHI) mounted on a testband in unilaterally implanted BAHI users, in order to test its potential advantage pre-operatively, underestimates the advantage of two BAHIs placed on two implants. Objectives To investigate how well speech understanding with a second BAHI mounted on a testband approaches the benefit of bilaterally implanted BAHIs. Method Prospective study with 16 BAHI users. Eight were implanted unilaterally (group A) and eight were implanted bilaterally (group B). Aided speech understanding was measured. Speech was presented from the front, and noise came either from the left, the right, or the front, in two conditions for group A (with one BAHI, and with two BAHIs, where the second device was mounted on a testband) and in three conditions for group B (the same two conditions as group A and, in addition, with both BAHIs mounted on implants). Results Speech understanding in noise improved with the additional device for noise from the side of the first BAHI (+0.7 to +2.1 dB) and decreased for noise from the other side (-1.8 dB to -3.9 dB). Improvements were largest (+2.1 dB, p = 0.016) and disadvantages were smallest (-1.8 dB, p = 0.047) with both BAHIs mounted on implants. Testbands yielded smaller advantages and larger disadvantages for the additional BAHI (average difference = -0.9 dB).

Relevance: 60.00%

Abstract:

Video-oculography devices are now used to quantify the vestibulo-ocular reflex (VOR) at the bedside using the head impulse test (HIT). Little is known about the impact of disruptive phenomena (e.g. corrective saccades, nystagmus, fixation losses, eye-blink artifacts) on quantitative VOR assessment in acute vertigo. This study systematically characterized the frequency, nature, and impact of artifacts on HIT VOR measures. From a prospective study of 26 patients with acute vestibular syndrome (16 vestibular neuritis, 10 stroke), we classified findings using a structured coding manual. Of 1,358 individual HIT traces, 72% had abnormal disruptive saccades, 44% had at least one artifact, and 42% were uninterpretable. Physicians using quantitative recording devices to measure head impulse VOR responses for clinical diagnosis should be aware of the potential impact of disruptive eye movements and measurement artifacts.

Relevance: 60.00%

Abstract:

BACKGROUND Port-wine stains (PWS) are capillary malformations found in 0.3% of newborn children. The treatment of choice is the pulsed dye laser (PDL), and several sessions are required. The efficacy of this treatment is currently evaluated on the basis of clinical inspection and of digital photographs taken throughout the treatment. Laser Doppler imaging (LDI) is a noninvasive method of imaging tissue perfusion by the microcirculatory system (capillaries). The aim of this paper is to demonstrate that LDI allows a quantitative, numerical evaluation of the efficacy of PDL treatment of PWS. METHOD The PDL sessions were organized according to the usual scheme, every other month, from September 1, 2012, to September 30, 2013. LDI was performed at the start and at the conclusion of the PDL treatment, and simultaneously on healthy skin in order to obtain reference values. The LDI results were analyzed using the Wilcoxon signed-rank test before and after each session, and over the intervals between the three PDL treatment sessions. RESULTS Our prospective study is based on 20 newly treated children. On average, the vascularization of the PWS was reduced by 56% after three laser sessions. Vascularization of the PWS was 62% higher than that of healthy skin at the start of treatment, and 6% higher after three sessions. During the 2 months between two sessions, vascularization of the capillary network increased by 27%. CONCLUSION This study shows that LDI can demonstrate and measure the efficacy of PDL treatment of PWS in children. The figures obtained when measuring the results by LDI corroborate the clinical assessments and may allow us to refine, and perhaps even modify, our present use of PDL and thus improve the efficacy of the treatment.
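The abstract names the Wilcoxon signed-rank test for the before/after perfusion comparisons. A minimal illustrative version with invented LDI perfusion values is sketched below:

```python
# Illustrative before/after comparison of LDI perfusion values for one PDL
# session using the Wilcoxon signed-rank test named in the abstract;
# the perfusion values are invented for demonstration.
import numpy as np
from scipy import stats

before = np.array([310, 295, 330, 280, 305, 290, 315, 300, 285, 320], float)  # a.u.
after  = before * 0.8 + np.random.default_rng(1).normal(0, 10, before.size)

stat, p = stats.wilcoxon(before, after)
print(f"W = {stat:.0f}, p = {p:.3f}")
```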

Relevance: 60.00%

Abstract:

The ATLS program of the American College of Surgeons is probably the most important globally active training program dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                             Class I            Class II          Class III              Class IV
Blood loss (ml)              Up to 750          750–1500          1500–2000              >2000
Blood loss (% blood volume)  Up to 15%          15–30%            30–40%                 >40%
Pulse rate (bpm)             <100               100–120           120–140                >140
Systolic blood pressure      Normal             Normal            Decreased              Decreased
Pulse pressure               Normal or ↑        Decreased         Decreased              Decreased
Respiratory rate             14–20              20–30             30–40                  >35
Urine output (ml/h)          >30                20–30             5–15                   Negligible
CNS/mental status            Slightly anxious   Mildly anxious    Anxious, confused      Confused, lethargic
Initial fluid replacement    Crystalloid        Crystalloid       Crystalloid and blood  Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated according to the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the 140 beats/min postulated by ATLS. Moreover, deterioration of the different parameters does not necessarily occur in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient.

A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss, and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss in the peritoneum, retroperitoneum, or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined.

As in the retrospective studies, Lawton did not find a statistically significant difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement for fluid substitution. This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

Normal (no haemorrhage): vitals normal; fluid bolus not applicable; no estimated blood loss.
Class I (mild): vitals normal; responds to a 1000-ml fluid bolus, no further fluid required; estimated blood loss up to 750 ml.
Class II (moderate): HR >100 with SBP >90 mmHg; responds to a 1000-ml fluid bolus, no further fluid required; estimated blood loss 750–1500 ml.
Class III (severe): SBP <90 mmHg; requires repeated fluid boluses; estimated blood loss 1500–2000 ml.
Class IV (moribund): SBP <90 mmHg or imminent arrest; SBP declines despite fluid boluses; estimated blood loss >2000 ml.

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiologic concept of the organism's response to fluid loss: a decrease in cardiac output, an increase in heart rate, and a decrease in pulse pressure occur first; hypotension and bradycardia occur only later. Increasing heart rate, increasing diastolic blood pressure, or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit, and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern but will also continue to mention the exceptions and pitfalls at a second stage. The ATLS shock classification is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension). Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications, and pacemakers, and explicitly states that individual subjects may not follow the general pattern.

Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification related to absolute values. In conclusion, the typical physiologic response to haemorrhage, as illustrated by the ATLS shock classes, remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs; it also includes the pattern of injuries, the requirement for fluid substitution, and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context.

Conflict of interest: None declared. The author is a member of the Swiss national ATLS core faculty.
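Returning to Table 1 above: to make its blood-loss thresholds concrete, here is a small, purely illustrative helper that maps an estimated blood-loss fraction to the corresponding ATLS class. As the editorial stresses, such a mapping should not be applied literally to individual patients; the function name and interface are hypothetical.

```python
# Hypothetical helper mapping estimated blood loss (% of blood volume) to the
# ATLS shock class of Table 1; thresholds follow the table, everything else
# (names, interface) is illustrative only.
def atls_shock_class(blood_loss_percent: float) -> str:
    if blood_loss_percent <= 15:
        return "Class I"
    if blood_loss_percent <= 30:
        return "Class II"
    if blood_loss_percent <= 40:
        return "Class III"
    return "Class IV"

for loss in (10, 25, 35, 45):
    print(f"{loss}% blood loss -> {atls_shock_class(loss)}")
```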

Relevance: 60.00%

Abstract:

OBJECTIVE To evaluate whether magnetic resonance imaging (MRI) is as effective as computed tomography (CT) in determining morphologic and functional pulmonary changes in patients with cystic fibrosis (CF), in association with multiple clinical parameters. MATERIALS AND METHODS Institutional review board approval and written informed patient consent were obtained. In this prospective study, 30 patients with CF (17 men and 13 women; mean (SD) age, 30.2 (9.2) years; range, 19-52 years) were included. Chest CT was acquired with an unenhanced low-dose technique for clinical purposes. Lung MRI (1.5 T) comprised T2- and T1-weighted sequences before and after the application of 0.1 mmol/kg gadobutrol, including lung perfusion imaging. All CT and MR images were visually evaluated using 2 different scoring systems: the modified Helbich and the Eichinger scores. Signal intensity of the peribronchial walls and of detected mucus on T2-weighted images, as well as signal enhancement of the peribronchial walls on contrast-enhanced T1-weighted sequences, were additionally assessed on MRI. For the clinical evaluation, the pulmonary exacerbation rate and laboratory and pulmonary function parameters were determined. RESULTS The overall modified Helbich CT score had a mean (SD) of 15.3 (4.8) (range, 3-21) and a median of 16.0 (interquartile range [IQR], 6.3). The overall modified Helbich MR score showed slightly but not significantly lower values (Wilcoxon rank sum test and Student t test; P > 0.05): mean (SD), 14.3 (4.7) (range, 3-20); median, 15.0 (IQR, 7.3). Without the assessment of perfusion, the overall Eichinger score gave the following values for CT vs MR examinations: mean (SD), 20.3 (7.2) (range, 4-31) and median, 21.0 (IQR, 9.5) vs mean (SD), 19.5 (7.1) (range, 4-33) and median, 20.0 (IQR, 9.0). All differences between CT and MR examinations were not significant (Wilcoxon rank sum tests and Student t tests; P > 0.05). In general, the correlations of the CT scores (overall and for the different imaging parameters) with the clinical parameters were slightly higher than those of the MRI scores. However, when all additional MRI parameters were integrated into the scoring systems, the correlations reached the values of the CT scores. The overall image quality was significantly higher for the CT examinations than for the MRI sequences. CONCLUSIONS One major diagnostic benefit of lung MRI in CF is the possible acquisition of several different morphologic and functional imaging features without any radiation exposure. Lung MRI shows reliable associations with CT and clinical parameters, suggesting its implementation for routine diagnosis in CF, which would be particularly important for long-term follow-up imaging.
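The abstract compares CT and MRI scores with Wilcoxon rank sum and Student t tests. A minimal illustration of that comparison with invented per-patient scores might look as follows (the values carry no relation to the study data):

```python
# Illustrative comparison of CT vs MRI modified Helbich scores using the tests
# named in the abstract; the score values are made up for demonstration.
import numpy as np
from scipy import stats

ct_scores  = np.array([16, 15, 18, 12, 20, 14, 17, 19, 13, 16])
mri_scores = np.array([15, 14, 17, 12, 19, 13, 16, 18, 12, 15])

u, p_u = stats.mannwhitneyu(ct_scores, mri_scores)  # rank-based comparison
t, p_t = stats.ttest_ind(ct_scores, mri_scores)     # Student t test
print(f"rank-sum p = {p_u:.2f}, t-test p = {p_t:.2f}")
```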

Relevance: 60.00%

Abstract:

BACKGROUND AND OBJECTIVES Neonatal arterial ischemic stroke (NAIS) is associated with considerable lifetime burdens such as cerebral palsy, epilepsy, and cognitive impairment. Prospective epidemiologic studies that include outcome assessments are scarce. This study aimed to provide information on the epidemiology, clinical manifestations, infarct characteristics, associated clinical variables, treatment strategies, and outcomes of NAIS in a prospective, population-based cohort of Swiss children. METHODS This prospective study evaluated the epidemiology, clinical manifestations, vascular territories, associated clinical variables, and treatment of all full-term neonates diagnosed with NAIS and born in Switzerland between 2000 and 2010. Follow-up was performed 2 years (mean 23.3 months, SD 4.3 months) after birth. RESULTS One hundred neonates (67 boys) had a diagnosis of NAIS. The NAIS incidence in Switzerland during this time was 13 (95% confidence interval [CI], 11-17) per 100,000 live births. Seizures were the most common symptom (95%). Eighty-one percent had unilateral (80% left-sided) and 19% had bilateral lesions. Risk factors included maternal risk conditions (32%), birth complications (68%), and neonatal comorbidities (54%). Antithrombotic and antiplatelet therapy use was low (17%). No serious side effects were reported. Two years after birth, 39% were diagnosed with cerebral palsy and 31% had delayed mental performance. CONCLUSIONS NAIS in Switzerland shows a similar incidence as other population-based studies. About one-third of patients developed cerebral palsy or showed delayed mental performance 2 years after birth, and children with normal mental performance may still develop deficits later in life.
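The incidence of 13 per 100,000 live births with its 95% CI is the kind of figure typically obtained from an exact Poisson interval. A small illustrative computation is sketched below; the denominator of live births is an assumed placeholder, not a figure taken from the study:

```python
# Illustrative exact (Poisson) confidence interval for an incidence rate,
# as reported in the abstract; the number of live births is an assumed
# placeholder for demonstration only.
from scipy.stats import chi2

cases = 100            # NAIS diagnoses in the cohort
live_births = 770_000  # assumed denominator, illustration only

lower = chi2.ppf(0.025, 2 * cases) / 2
upper = chi2.ppf(0.975, 2 * (cases + 1)) / 2

rate = cases / live_births * 100_000
print(f"{rate:.0f} per 100,000 "
      f"(95% CI {lower / live_births * 100_000:.0f}-{upper / live_births * 100_000:.0f})")
```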

Relevance: 60.00%

Abstract:

Cancer of the oral cavity and pharynx remains one of the ten leading causes of cancer death in the United States (US). Besides smoking and alcohol consumption, there are no well-established risk factors. While poor dental care has been implicated, it is unknown whether a lack of dental care, implying poor dental hygiene, predisposes to oral cavity cancer. This study aimed to assess the relationship between dental care utilization during the past twelve months and the prevalence of oral cavity cancer. A cross-sectional design based on the National Health Interview Survey of adult, non-institutionalized US residents (n=30,475) was used to assess the association between dental care utilization and a self-reported diagnosis of oral cavity cancer. The chi-square statistic was used to examine the crude association between the predictor variable, dental care utilization, and other covariates, while unconditional logistic regression was used to assess the relationship between oral cavity cancer and dental care utilization. There were statistically significant differences between those who utilized dental care during the past twelve months and those who did not with respect to education, income, age, marital status, and gender (p < 0.05), but not health insurance coverage (p = 0.53). Also, those who utilized dental care were 65% less likely than those who did not to present with oral cavity cancer (prevalence odds ratio (POR), 0.35; 95% confidence interval (CI), 0.12–0.98). Further, higher income, advanced age, African heritage, and unmarried status were statistically significantly associated with oral cavity cancer (p < 0.05), but health insurance coverage, alcohol use, and smoking were not (p > 0.05). However, after simultaneously controlling for the relevant covariates, the association between dental care and oral cavity cancer did not materially attenuate, but neither did it persist as statistically significant: compared with those who did not use dental care, those who did were 62% less likely to present with oral cavity cancer (adjusted POR, 0.38; 95% CI, 0.13–1.10). Among US adults residing in community settings, use of dental care during the past twelve months did not significantly reduce the predisposition to oral cavity cancer. However, because the nature of the data used in this study restricts temporal sequence, a large-sample prospective study that may identify modifiable factors associated with oral cancer development, namely poor dental care, is needed.
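The crude prevalence odds ratio (POR) reported above comes from a standard 2×2 comparison. A minimal illustration of how a POR and its 95% CI are computed is sketched below; the cell counts are invented and do not reproduce the NHIS data:

```python
# Illustrative calculation of a prevalence odds ratio (POR) with a 95% CI
# from a 2x2 table; the counts are made up for demonstration.
import numpy as np
from scipy.stats import norm

#                 cancer   no cancer
a, b = 8, 18_000   # used dental care
c, d = 12, 12_455  # did not use dental care

por = (a * d) / (b * c)
se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = np.exp(np.log(por) + np.array([-1, 1]) * norm.ppf(0.975) * se_log)
print(f"POR = {por:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```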

Relevance: 60.00%

Abstract:

Aim: The goal of this study was to evaluate the change in hemoglobin A1c (HgbA1c) and glycemic control after nutrition intervention in a population of pediatric patients with type 1 diabetes. Methods: Data were collected on all type 1 diabetic patients who were scheduled for a consultation with the diabetes/endocrine registered dietitian (RD) from January 2006 through December 2006. Two groups were compared: those who kept their RD appointment and those who did not. The main outcome measure was HgbA1c. An independent-samples t-test compared the two groups with respect to the change in HgbA1c before and after the most recent scheduled appointment with the RD. Baseline characteristics were used as covariates and controlled for using analysis of covariance (ANCOVA). Results: There was no difference in HgbA1c between those who attended an RD appointment and those who did not. However, those who attended their RD appointment and those who did not had statistically different HgbA1c values both before and after the scheduled appointment; that is, the two groups were not equal at the beginning of the study period. Discussion: A study design with inclusion criteria specifying a range of HgbA1c values within which subjects needed to fall would potentially have eliminated the difference between the two groups at the beginning of the study period. Either another retrospective study that controls for the initial HgbA1c value or a prospective study that designates a range of HgbA1c values would be worth conducting to evaluate the impact of medical nutrition therapy and the role of the RD in diabetes management. It is an interesting finding that there was a significant difference in initial HgbA1c between those who came to the RD appointment and those who did not. The fact that those who did not arrive for their RD appointment had worse control of their diabetes suggests that this is a high-risk group; targeting diabetes education toward these patients may prove to be beneficial.
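The ANCOVA described above adjusts the post-appointment HgbA1c comparison for the baseline value. A minimal sketch using statsmodels with invented data could look like this:

```python
# Illustrative ANCOVA comparing post-intervention HgbA1c between attenders and
# non-attenders while controlling for the baseline value; data are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "attended":     [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
    "a1c_baseline": [8.2, 9.1, 7.8, 8.5, 9.4, 9.8, 10.2, 9.5, 10.0, 9.9],
    "a1c_followup": [7.9, 8.8, 7.7, 8.3, 9.0, 9.9, 10.4, 9.6, 10.1, 10.0],
})

model = smf.ols("a1c_followup ~ attended + a1c_baseline", data=df).fit()
print(model.summary().tables[1])  # coefficient for 'attended' is the adjusted group effect
```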