801 results for feasibility study
Abstract:
BACKGROUND Delayed enhancement (DE) MRI can assess the fibrotic substrate of scar-related VT. MDCT has the advantage of inframillimetric spatial resolution and better 3D reconstructions. We sought to evaluate the feasibility and usefulness of integrating merged MDCT/MRI data in 3D-mapping systems for structure-function assessment and multimodal guidance of VT mapping and ablation. METHODS Nine patients, including 3 with ischemic cardiomyopathy (ICM), 3 with nonischemic cardiomyopathy (NICM), 2 with myocarditis, and 1 undergoing a redo procedure for idiopathic VT, underwent MRI and MDCT before VT ablation. Merged MRI/MDCT data were integrated in 3D-mapping systems and registered to high-density endocardial and epicardial maps. Low-voltage areas (<1.5 mV) and local abnormal ventricular activities (LAVA) during sinus rhythm were correlated with DE at MRI and wall thinning (WT) at MDCT. RESULTS Endocardium and epicardium were mapped with 391 ± 388 and 1098 ± 734 points per map, respectively. Registration of MDCT allowed visualization of the coronary arteries during epicardial mapping/ablation. In the idiopathic patient, integration of MRI data identified previously ablated regions. In ICM patients, both DE at MRI and WT at MDCT matched areas of low voltage (overlap 94 ± 6% and 79 ± 5%, respectively). In NICM patients, wall-thinning areas matched areas of low voltage (overlap 63 ± 21%). In patients with myocarditis, subepicardial DE matched areas of epicardial low voltage (overlap 92 ± 12%). A total of 266 LAVA sites were found in 7/9 patients. All LAVA sites were associated with structural substrate at imaging (90% inside, 100% within 18 mm). CONCLUSION The integration of merged MDCT and DE-MRI data is feasible and allows substrate assessment to be combined with high spatial resolution, better defining the structure-function relationship in scar-related VT.
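The overlap percentages reported above are, in essence, the fraction of the low-voltage region that coincides with the imaging-defined substrate (DE at MRI or WT at MDCT) after registration. The abstract does not specify how the overlap was computed, so the Python sketch below is only an illustrative assumption: it flags mapping points with bipolar voltage <1.5 mV and counts the proportion lying within a chosen distance of the registered imaging substrate. The function name, the 5 mm tolerance, and the example data are hypothetical.

    import numpy as np
    from scipy.spatial import cKDTree

    def low_voltage_overlap(map_xyz, bipolar_mv, substrate_xyz, tol_mm=5.0, thresh_mv=1.5):
        """Fraction of low-voltage mapping points lying within tol_mm of the
        imaging-defined substrate. Illustrative only; the study's surface-based
        overlap metric may differ."""
        low_v = map_xyz[bipolar_mv < thresh_mv]            # low-voltage sites (<1.5 mV)
        if len(low_v) == 0:
            return float("nan")
        dist, _ = cKDTree(substrate_xyz).query(low_v)      # distance to nearest substrate point
        return float(np.mean(dist <= tol_mm))              # proportion inside the substrate

    # Hypothetical usage with random coordinates (mm) and voltages (mV)
    rng = np.random.default_rng(0)
    pts, volts = rng.uniform(0, 100, (1098, 3)), rng.uniform(0.1, 5.0, 1098)
    scar = rng.uniform(0, 60, (400, 3))
    print(f"overlap = {low_voltage_overlap(pts, volts, scar):.0%}")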
Abstract:
HYPOTHESIS Facial nerve monitoring can be used synchronously with a high-precision robotic tool as a functional warning to prevent a collision of the drill bit with the facial nerve during direct cochlear access (DCA). BACKGROUND Minimally invasive direct cochlear access (DCA) aims to eliminate the need for a mastoidectomy by drilling a small tunnel through the facial recess to the cochlea with the aid of stereotactic tool guidance. Because the procedure is performed in a blind manner, structures such as the facial nerve are at risk. Neuromonitoring is a commonly used tool to help surgeons identify the facial nerve (FN) during routine surgical procedures in the mastoid. Recently, neuromonitoring technology was integrated into a commercially available drill system, enabling real-time monitoring of the FN. The objective of this study was to determine whether this drilling system could be used to warn of an impending collision with the FN during robot-assisted DCA. MATERIALS AND METHODS The sheep was chosen as a suitable model for this study because of its similarity to human ear anatomy. The same surgical workflow applicable to human patients was performed in the animal model. Bone screws, serving as reference fiducials, were placed in the skull near the ear canal. The sheep head was imaged using a computed tomography scanner, and segmentation of the FN, mastoid, and other relevant structures, as well as planning of drilling trajectories, was carried out using a dedicated software tool. During the actual procedure, a surgical drill system was connected to a nerve monitor and guided by a custom-built robot system. As the planned trajectories were drilled, stimulation and EMG response signals were recorded. A postoperative analysis was performed after each surgery to determine the actual drilled positions. RESULTS Using the calibrated pose synchronized with the EMG signals, the precise relationship between distance to the FN and EMG response at 3 different stimulation intensities could be determined for 11 different tunnels drilled in 3 different subjects. CONCLUSION From the results, it was determined that the current implementation of the neuromonitoring system lacks the sensitivity and repeatability necessary to be used as a warning device in robotic DCA. We hypothesize that this is primarily because of the stimulation pattern achieved using a noninsulated drill as a stimulating probe. Further work is necessary to determine whether specific changes to the design can improve the sensitivity and specificity.
Abstract:
What's known on the subject? and What does the study add? The EndoSew® prototype was first tested in a porcine model several years ago. The investigators found it both simple to master and reliable, its greatest advantage being a 2.4-fold time saving compared with straight laparoscopic suturing. In addition to that publication, there is a single case report describing the performance of an open EndoSew® suture to close parts (16 cm) of an ileal neobladder. The time for suturing the 16 cm of ileum was 25 min, which is in line with our experience. The knowledge on this subject is limited to these two publications. We report on the first consecutive series of ileal conduits performed in humans using the novel prototype sewing device EndoSew®. The study shows that the beginning and the end of the suture process represent the critical procedural steps. It also shows that, overall, the prototype sewing machine has the potential to facilitate the intracorporeal suturing required in reconstructive urology for construction of urinary diversions. Objective To evaluate the feasibility and safety of the novel prototype sewing device EndoSew® in placing an extracorporeal resorbable running suture for ileal conduits. Patients and Methods We conducted a prospective single-centre pilot study of 10 consecutive patients undergoing ileal conduit formation, in whom the proximal end of the ileal conduit was closed extracorporeally using an EndoSew® running suture. The primary endpoint was the safety of the device and the feasibility of the sewing procedure, which was defined as a complete watertight running suture line accomplished by EndoSew® only. Watertightness was assessed using methylene blue intraoperatively and by loopography on postoperative days 7 and 14. Secondary endpoints were the time requirements and complications ≤30 days after surgery. Results A complete EndoSew® running suture was feasible in nine patients; the suture had to be abandoned in one patient because of mechanical failure. In three patients, two additional single freehand stitches were needed to anchor the thread and to seal tiny leaks. Consequently, the suture lines in 6/10 patients were watertight with EndoSew® suturing alone and in 10/10 patients after additional freehand stitches. The median (range) sewing time was 5.5 (3–10) min and the median (range) suture length was 4.5 (2–5.5) cm. There were no suture-related complications. Conclusions The EndoSew® procedure is both feasible and safe. After additional freehand stitches in four patients, all sutures were watertight. With further technical refinements, EndoSew® has the potential to facilitate the intracorporeal construction of urinary diversions.
Abstract:
BACKGROUND:
Robotics-assisted tilt table technology was introduced for early rehabilitation of neurological patients. It provides cyclical stepping movement and physiological loading of the legs. The aim of the present study was to assess the feasibility of this type of device for peak cardiopulmonary performance testing using able-bodied subjects.
METHODS:
A robotics-assisted tilt table was augmented with force sensors in the thigh cuffs and a work rate estimation algorithm. A custom visual feedback system was employed to guide the subjects' work rate and to provide real-time feedback of actual work rate. Feasibility assessment focused on (i) implementation (technical feasibility) and (ii) responsiveness (was there a measurable, high-level cardiopulmonary reaction?). For responsiveness testing, each subject carried out an incremental exercise test to the limit of functional capacity, with a work rate increment of 5 W/min in female subjects and 8 W/min in males.
RESULTS:
11 able-bodied subjects were included (9 male, 2 female; age 29.6 ± 7.1 years: mean ± SD). Resting oxygen uptake (O
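The abstract does not describe the work rate estimation algorithm itself. A plausible, purely hypothetical reading is that active mechanical power is estimated from the thigh-cuff force measurements and the known stepping kinematics of the device, i.e. power as force times velocity averaged over the stepping cycle, after subtracting the force recorded during passive stepping. The Python sketch below illustrates that idea under those assumptions; the function name, the passive-force correction, and the example signals are not taken from the study.

    import numpy as np

    def estimate_work_rate(force_n, velocity_ms, passive_force_n=0.0):
        """Hypothetical work-rate estimate (W) for a robotics-assisted tilt table:
        cycle-averaged active force times leg velocity. passive_force_n approximates
        the cuff force during purely passive stepping, so only the subject's
        active contribution is counted."""
        force_n = np.asarray(force_n, dtype=float)
        velocity_ms = np.asarray(velocity_ms, dtype=float)
        active = np.clip(force_n - passive_force_n, 0.0, None)   # active push only
        return float(np.mean(active * np.abs(velocity_ms)))      # P = F * v, cycle-averaged

    # Hypothetical 1-s stepping cycle sampled at 100 Hz
    t = np.linspace(0.0, 1.0, 100)
    force = 80 + 40 * np.sin(2 * np.pi * t)    # N, measured cuff force
    vel = 0.25 * np.sin(2 * np.pi * t)         # m/s, cuff velocity along the leg motion
    print(f"estimated work rate ≈ {estimate_work_rate(force, vel, passive_force_n=60):.1f} W")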
Abstract:
Bovine spongiform encephalopathy (BSE), popularly known as 'mad cow disease', led to an epidemic in Europe that peaked in the mid-1990s. Its impact on developing countries, such as Nigeria, has not been fully established, as information on livestock and surveillance has eluded those in charge of this task. The BSE risk to Nigeria's cattle population currently remains undetermined, which has resulted in international trade restrictions on commodities from the cattle population. This is mainly because of a lack of updated BSE risk assessments and disease surveillance data. To evaluate the feasibility of BSE surveillance in Nigeria, we carried out a pilot study targeting cattle that were presented for emergency or casualty slaughter. In total, 1551 cattle of local breeds, aged 24 months and above, were clinically examined. Ataxia, recumbency and other neurological signs were foremost among our criteria. A total of 96 cattle (6.2%) presented clinical signs that supported a suspicion of BSE. The caudal brainstem tissues of these animals were collected post-mortem and analysed for the disease-specific form of the prion protein using a rapid test approved by the World Organisation for Animal Health (OIE). None of the samples were positive for BSE. Although our findings do not exclude the presence of BSE in Nigeria, they do demonstrate that targeted sampling of clinically suspected cases of BSE is feasible in developing countries. In addition, these findings point to the possibility of implementing clinical monitoring schemes for BSE and potentially other diseases with grave economic and public health consequences.
Abstract:
OBJECTIVE: To investigate the prevalence of discontinuation and nonpublication of surgical versus medical randomized controlled trials (RCTs) and to explore risk factors for discontinuation and nonpublication of surgical RCTs. BACKGROUND: Trial discontinuation has significant scientific, ethical, and economic implications. To date, the prevalence of discontinuation of surgical RCTs is unknown. METHODS: All RCT protocols approved between 2000 and 2003 by 6 ethics committees in Canada, Germany, and Switzerland were screened. Baseline characteristics were collected and, if published, full reports retrieved. Risk factors for early discontinuation for slow recruitment and for nonpublication were explored using multivariable logistic regression analyses. RESULTS: In total, 863 RCT protocols involving adult patients were identified, 127 in surgery (15%) and 736 in medicine (85%). Surgical trials were discontinued for any reason more often than medical trials [43% vs 27%, risk difference 16% (95% confidence interval [CI]: 5%-26%); P = 0.001] and were more often discontinued for slow recruitment [18% vs 11%, risk difference 8% (95% CI: 0.1%-16%); P = 0.020]. The percentage of trials not published as a full journal article was similar in surgical and medical trials [44% vs 40%, risk difference 4% (95% CI: -5% to 14%); P = 0.373]. Discontinuation of surgical trials was a strong risk factor for nonpublication (odds ratio = 4.18, 95% CI: 1.45-12.06; P = 0.008). CONCLUSIONS: Discontinuation and nonpublication rates were substantial in surgical RCTs, and trial discontinuation was strongly associated with nonpublication. These findings need to be taken into account when interpreting the surgical literature. Surgical trialists should consider feasibility studies before embarking on full-scale trials.
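The reported group contrasts (e.g. discontinuation for any reason in 43% of surgical vs 27% of medical trials, risk difference 16%, 95% CI 5%-26%) can be reproduced approximately from the two proportions and their standard errors. As a hedged illustration of that arithmetic, the Python sketch below computes a risk difference with a Wald-type 95% CI from counts back-calculated from the published percentages (roughly 55/127 vs 199/736 trials discontinued); the paper's exact counts and CI method may differ slightly.

    from math import sqrt

    def risk_difference(x1, n1, x2, n2, z=1.96):
        """Risk difference p1 - p2 with a Wald-type 95% confidence interval."""
        p1, p2 = x1 / n1, x2 / n2
        rd = p1 - p2
        se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
        return rd, rd - z * se, rd + z * se

    # Approximate counts reconstructed from the reported percentages (illustrative only)
    rd, lo, hi = risk_difference(55, 127, 199, 736)   # ~43% vs ~27% discontinued
    print(f"risk difference = {rd:.1%} (95% CI {lo:.1%} to {hi:.1%})")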
Abstract:
In recent decades, the application of bioreactors has revolutionized the concept of culturing tissues and organs that require mechanical loading. In intervertebral disc (IVD) research, collaborative efforts in biomedical engineering, biology and mechatronics have led to the innovation of new loading devices that can maintain viable IVD organ explants from large animals and human cadavers in precisely defined nutritional and mechanical environments over extended culture periods. Particularly in spine and IVD research, these organ culture models offer appealing alternatives, as large bipedal animal models with naturally occurring IVD degeneration and a genetic background similar to the human condition do not exist. Recent research has demonstrated important concepts, including the potential homing of mesenchymal stem cells to nutritionally or mechanically stressed IVDs and the regenerative potential of "smart" biomaterials for nucleus pulposus or annulus fibrosus repair. In this review, we summarize the current knowledge about cell therapy and the injection of cytokines and short peptides to rescue the degenerating IVD. We further stress that most bioreactor systems simplify the real in vivo conditions while still providing a useful proof of concept. Limitations are that certain aspects of the host immune response and pain assessment cannot be addressed with ex vivo systems. Coccygeal animal disc models are commonly used because of their availability and similarity to human IVDs. Although in vitro loading environments are not identical to the human in vivo situation, 3D ex vivo organ culture models of large animal coccygeal and human lumbar IVDs should be seen as valid alternatives for screening and feasibility testing to augment existing small animal, large animal, and human clinical trial experiments.
Abstract:
Background: Access to hepatitis B viral load (VL) testing is poor in sub-Saharan Africa (SSA) due to economic and logistical reasons. Objectives: To demonstrate the feasibility of testing dried blood spots (DBS) for hepatitis B virus (HBV) VL in a laboratory in Lusaka, Zambia, and to compare HBV VLs between DBS and plasma samples. Study design: Paired plasma and DBS samples from HIV-HBV co-infected Zambian adults were analyzed for HBV VL using the COBAS AmpliPrep/COBAS TaqMan HBV test (Version 2.0) and for HBV genotype by direct sequencing. We used Bland-Altman analysis to compare VLs between sample types and by genotype. Logistic regression analysis was conducted to assess the probability of an undetectable DBS result by plasma VL. Results: Among 68 participants, median age was 34 years, 61.8% were men, and median plasma HBV VL was 3.98 log IU/ml (interquartile range, 2.04–5.95). Among sequenced viruses, 28 were genotype A1 and 27 were genotype E. Bland-Altman plots suggested strong agreement between DBS and plasma VLs. DBS VLs were on average 1.59 log IU/ml lower than plasma, with 95% limits of agreement of −2.40 to −0.83 log IU/ml. At a plasma VL ≥2,000 IU/ml, the probability of an undetectable DBS result was 1.8% (95% CI: 0.5–6.6). At a plasma VL ≥20,000 IU/ml, this probability reduced to 0.2% (95% CI: 0.03–1.7).
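The Bland-Altman summary above (DBS on average 1.59 log IU/ml lower than plasma, 95% limits of agreement −2.40 to −0.83 log IU/ml) follows the standard construction: bias = mean of the paired differences, and limits of agreement = bias ± 1.96 × SD of the differences. The Python sketch below is a minimal, generic illustration of that calculation on hypothetical paired log10 values; it is not the study's code or data.

    import numpy as np

    def bland_altman(dbs_log, plasma_log, z=1.96):
        """Bland-Altman bias and 95% limits of agreement for paired log10 viral loads.
        Returns (bias, lower_loa, upper_loa) in log IU/ml."""
        diff = np.asarray(dbs_log) - np.asarray(plasma_log)   # DBS minus plasma
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, bias - z * sd, bias + z * sd

    # Hypothetical paired log10 IU/ml values (not the study data)
    plasma = np.array([3.9, 5.2, 2.4, 6.0, 4.1])
    dbs = plasma - np.array([1.4, 1.7, 1.2, 1.9, 1.6])
    bias, lo, hi = bland_altman(dbs, plasma)
    print(f"bias = {bias:.2f} log IU/ml, 95% LoA {lo:.2f} to {hi:.2f}")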
Abstract:
BACKGROUND: We evaluated the feasibility of an augmented robotics-assisted tilt table (RATT) for incremental cardiopulmonary exercise testing (CPET) and exercise training in dependent-ambulatory stroke patients. METHODS: Stroke patients (Functional Ambulation Category ≤ 3) underwent familiarization, an incremental exercise test (IET) and a constant load test (CLT) on separate days. A RATT equipped with force sensors in the thigh cuffs, a work rate estimation algorithm and real-time visual feedback to guide the exercise work rate was used. Feasibility assessment considered technical feasibility, patient tolerability, and cardiopulmonary responsiveness. RESULTS: Eight patients (4 female) aged 58.3 ± 9.2 years (mean ± SD) were recruited and all completed the study. For IETs, peak oxygen uptake (V'O2peak), peak heart rate (HRpeak) and peak work rate (WRpeak) were 11.9 ± 4.0 ml/kg/min (45 % of predicted V'O2max), 117 ± 32 beats/min (72 % of predicted HRmax) and 22.5 ± 13.0 W, respectively. Peak ratings of perceived exertion (RPE) were in the range "hard" to "very hard". All 8 patients reached their limit of functional capacity in terms of either their cardiopulmonary or neuromuscular performance. A ventilatory threshold (VT) was identified in 7 patients and a respiratory compensation point (RCP) in 6 patients: mean V'O2 at VT and RCP was 8.9 and 10.7 ml/kg/min, respectively, which represent 75 % (VT) and 85 % (RCP) of mean V'O2peak. Incremental CPET provided sufficient information to satisfy the responsiveness criteria and identification of key outcomes in all 8 patients. For CLTs, mean steady-state V'O2 was 6.9 ml/kg/min (49 % of V'O2 reserve), mean HR was 90 beats/min (56 % of HRmax), RPEs were > 2, and all patients maintained the active work rate for 10 min: these values meet recommended intensity levels for bouts of training. CONCLUSIONS: The augmented RATT is deemed feasible for incremental cardiopulmonary exercise testing and exercise training in dependent-ambulatory stroke patients: the approach was found to be technically implementable, acceptable to the patients, and it showed substantial cardiopulmonary responsiveness. This work has clinical implications for patients with severe disability who otherwise are not able to be tested.
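Several of the percentages quoted above are standard relative-intensity calculations: peak heart rate as a fraction of the age-predicted maximum, and steady-state oxygen uptake as a fraction of the oxygen-uptake reserve (V'O2 reserve = V'O2peak − V'O2rest). The Python sketch below reproduces those two formulas under common textbook assumptions (HRmax ≈ 220 − age; resting V'O2 ≈ 3.5 ml/kg/min). The study presumably used measured resting values and possibly different prediction equations, so the assumed resting value does not exactly reproduce the reported 49% of V'O2 reserve; treat the output as illustrative only.

    def pct_predicted_hrmax(hr_peak, age, predict=lambda a: 220 - a):
        """Peak heart rate as a percentage of age-predicted maximum (default 220 - age)."""
        return 100.0 * hr_peak / predict(age)

    def pct_vo2_reserve(vo2_steady, vo2_peak, vo2_rest=3.5):
        """Steady-state V'O2 as a percentage of the oxygen-uptake reserve.
        vo2_rest defaults to the conventional 3.5 ml/kg/min; measured resting
        values (as presumably used in the study) would shift the result."""
        return 100.0 * (vo2_steady - vo2_rest) / (vo2_peak - vo2_rest)

    # Group means from the abstract, for illustration only
    print(f"{pct_predicted_hrmax(117, 58.3):.0f}% of predicted HRmax")   # ~72%
    print(f"{pct_vo2_reserve(6.9, 11.9):.0f}% of V'O2 reserve")          # ~40% with the assumed resting value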
Abstract:
BACKGROUND Ulnar nerve decompression at the elbow traditionally requires regional or general anesthesia. We wished to assess the feasibility of performing ulnar nerve decompression and transposition at the elbow under local anesthesia. METHODS We retrospectively examined the charts of 50 consecutive patients who had undergone ulnar nerve entrapment surgery under either general or local anesthesia. Patients were asked to rate pain on postoperative days 1 and 7, and satisfaction was assessed at 1 year. RESULTS On day 1, pain was comparable among all groups. On day 7, pain scores were twice as high when transposition was performed under general anesthesia as compared with local anesthesia. Patient satisfaction was slightly higher in the local anesthesia group, and these patients were significantly more willing to repeat the surgery. CONCLUSION Ulnar nerve decompression and transposition at the elbow can be performed under local anesthesia without added morbidity when compared with general anesthesia.
Abstract:
We explored the feasibility of unrelated donor haematopoietic stem cell transplant (HSCT) upfront, without prior immunosuppressive therapy (IST), in paediatric idiopathic severe aplastic anaemia (SAA). This cohort was then compared to matched historical controls who had undergone first-line therapy with a matched sibling/family donor (MSD) HSCT (n = 87), first-line IST with horse antithymocyte globulin and ciclosporin (n = 58), or second-line therapy with unrelated donor HSCT post-failed IST (n = 24). The 2-year overall survival in the upfront cohort was 96 ± 4%, compared to 91 ± 3% in the MSD controls (P = 0·30), 94 ± 3% in the IST controls (P = 0·68) and 74 ± 9% in the unrelated donor HSCT post-IST failure controls (P = 0·02). The 2-year event-free survival in the upfront cohort was 92 ± 5%, compared to 87 ± 4% in the MSD controls (P = 0·37), 40 ± 7% in the IST controls (P = 0·0001) and 74 ± 9% in the unrelated donor HSCT post-IST failure controls (P = 0·02). Outcomes for upfront unrelated donor HSCT in paediatric idiopathic SAA were similar to those for MSD HSCT and superior to those for IST and for unrelated donor HSCT post-IST failure. Front-line therapy with matched unrelated donor HSCT is a novel treatment approach and could be considered as first-line therapy in selected paediatric patients who lack an MSD.
Abstract:
Children and adults frequently skip breakfast, and rates of breakfast skipping are currently increasing. In addition, the food choices made for breakfast are not always healthy ones. Breakfast skipping, in conjunction with unhealthy breakfast choices, leads to impaired cognitive functioning, poor nutrient intake, and overweight. In response to these public health issues, Skip To Breakfast, a behaviorally based school and family program, was created using Intervention Mapping™ to increase consistent and healthful breakfast consumption among ethnically diverse fifth grade students and their families. Four classroom lessons and four parent newsletters were used to deliver the intervention. For this project, a healthy "3 Star Breakfast" was promoted, which included one serving each of a dairy product, a whole grain, and a fruit, with an emphasis on choices low in fat and sugar. The goal of this project was to evaluate the feasibility and acceptability of the intervention. A pilot test of the intervention was conducted in one classroom in a Houston school during the Fall 2007 semester. A qualitative evaluation of the intervention was conducted, which included focus groups with students, phone interviews with parents, process evaluation data from the classroom teacher, and direct observation. Sixteen students and six parents participated in the study. Data were recorded and themes were identified. Initial results showed there is a need for such programs. Based on the initial feedback, edits were made to the intervention and program. Results showed high acceptability among the teacher, students, and parents. It became apparent that students were not reliably delivering the parent newsletters to their parents to read, so a change to the protocol was made: students will receive incentives for having parents read the newsletters and return signed forms, to increase parent participation. Other changes included small modifications to the curriculum, such as clarifying instructions, changing in-class assignments to homework assignments, and including background reading materials for the teacher. The main trial is planned to be carried out in Spring 2008 in two elementary schools, utilizing four fifth-grade classes from each, with one school acting as the control and one as the intervention school. Results from this study can be used as an adjunct to the Coordinated Approach To Child Health (CATCH) program.
Abstract:
Obesity rates around the nation have risen to epidemic proportions. Rates of childhood obesity are at very high levels, with 24.4% of preschool-aged children in the U.S. currently considered overweight or obese. Childhood obesity rates are much higher in the southern United States than in the rest of the nation. Minority populations, especially African American and Hispanic, are affected more than other ethnic groups. Obesity prevention programs targeting young children <6 years of age from minority populations are needed. Currently, there are few obesity prevention programs that have been implemented and evaluated in children <6 years of age. Gardening programs have been successful in improving the health status of elementary school children by increasing fruit and vegetable intake and increasing preferences for healthier food choices. However, there is no evidence of the feasibility and acceptability of a garden-based obesity prevention program among preschoolers. This pretest study, a classroom-based gardening curriculum with 16 lesson plans and coordinating activities for preschool-age children (3-5 years old) enrolled in Head Start, provides the opportunity to address this need. The study included 103 preschoolers from two centers and 9 teachers or teachers' aides. Qualitative data on feasibility and acceptability were collected from process evaluation forms for individual lesson plans and from focus groups with teachers. Teacher questionnaires assessed individual teacher characteristics and provided feedback regarding the curriculum. Quantitative measures of teachers' self-efficacy, attitudes, and knowledge pertaining to nutrition were analyzed from pre- and post-test surveys. Results revealed this preschool garden-based nutrition curriculum was both feasible and acceptable. The program improved teachers' self-efficacy, knowledge, and attitudes about nutrition, with teachers' confidence in their ability to teach a gardening curriculum increasing from a mean score of 2.14 to 3.00 from pre- to post-test (P = 0.0046). These results indicate that implementing garden-based nutrition lessons within preschools is achievable. Employing garden-based nutrition lessons in the classroom is the first step in teaching children about nutrition and gardening concepts. Constructing gardening beds for more hands-on learning is the next proposed step in the larger parent study of this program.
Abstract:
Background. Consistent adherence to antiretroviral treatment is necessary for treatment success. Improving and maintaining an adherence rate >95% is challenging for health care professionals. This pilot randomized controlled study aimed to evaluate the impact of the interactive intervention on adherence to GPO-VIR, to describe the feasibility of the interactive intervention in Thailand, and to illustrate the adherence self-efficacy concept among HIV treatment-naïve patients in Thailand who were starting antiretroviral treatment. Methods. The study took place at three HIV clinics located in Phayao, Thailand. Twenty-three patients were randomly assigned to the experimental (n=11) and control groups (n=12). Each participant in the experimental group, together with a person significant to the patient, received 5 educational sessions with a nurse at the clinics and at their homes. They also received 3 follow-up evaluations during the 6-month period of the study. The participants in the control group received the standard of care provided by HIV clinical personnel plus three follow-up evaluations at the clinic. Results. Seventeen patients (7 in the experimental and 10 in the control group) completed the study. The 4-day recall on the Thai ACTG Adherence Scale demonstrated an adherence rate >95% for most participants in both groups. After the first measurement, no experimental group patients reported missing ART, whereas one control group participant continuously skipped ART. Participants from both groups had significantly increased CD4 cell counts after the study (F(1, 15) = 29.30, p < .001), but no differences were found between the two groups (F(1, 15) = .001, p = .98). Examination of the intervention revealed limitations and possibilities for implementing it in Thailand. Qualitative data demonstrated self-efficacy expectations, resignation and acceptance as concepts related to improving adherence outcomes. Conclusions. This interactive intervention, after appropriate modifications, is feasible to apply to Thai HIV treatment-naïve patients. Because of its limitations, the study could not demonstrate whether the interactive intervention improved adherence to ART among HIV treatment-naïve patients in Thailand. A longitudinal study in a larger sample would be required to test the impact of the intervention. Keywords: antiretroviral treatment, adherence, treatment-naïve, Thailand, randomized controlled study
Abstract:
The central objective of this dissertation was to determine the feasibility of self-completed advance directives (AD) in older persons suffering from mild and moderate stages of dementia. This was accomplished by identifying differences in the ability to complete AD among elderly subjects with increasing degrees of dementia and cognitive incompetence. Secondary objectives were to describe and compare advance directives completed by elders and identified proxy decision makers. Secondary objectives were accomplished by measuring the agreement between advance directives completed by proxy and elder, and comparing that agreement across groups defined by the elder's cognitive status. This cross-sectional study employed a structured interview to elicit AD, followed by a similar interview with a proxy decision maker identified by the elder. A stratified sampling scheme recruited elders with normal cognition, mild, and moderate forms of dementia using the Mini-Mental State Exam (MMSE). The Hopkins Competency Assessment Test (HCAT) was used for evaluation of competency to make medical decisions. Analysis was conducted on "between group" (non-demented vs mild dementia vs moderate dementia, and competent vs incompetent) and "within group" (elder vs family member) variation. The 118 elderly subjects interviewed were generally male, Caucasian, and of low socioeconomic status. Mean age was 77. Overall, for their AD, elders preferred a "trial of therapy" rather than "always receive the therapy". No intervention was refused outright more often than it was accepted. A test-retest of elders' AD revealed stable responses. Eleven logic checks measured the appropriateness of AD responses independent of preference. No difference was found in logic error rates between elders grouped by MMSE or HCAT. Agreement between proxy and elder responses showed significant dissimilarity, indicating that proxies were not making the same medical decisions as the elders. Conclusions based on these data are: (1) self-reported AD are feasible among elders showing signs of cognitive impairment, and they should be given all opportunities to complete advance directives; (2) variation in preferences for advance directives in cognitively impaired elders should not be assumed to be the effect of their impairment alone; (3) proxies do not appear to forego life-prolonging interventions in the face of increasing impairment in their ward; however, their advance directive choices are frequently not those of the elder they represent.