974 results for Software -- Evaluation
Abstract:
HYPOTHESIS A previously developed image-guided robot system can safely drill a tunnel from the lateral mastoid surface, through the facial recess, to the middle ear, as a viable alternative to conventional mastoidectomy for cochlear electrode insertion. BACKGROUND Direct cochlear access (DCA) provides a minimally invasive tunnel from the lateral surface of the mastoid through the facial recess to the middle ear for cochlear electrode insertion. A safe and effective tunnel drilled through the narrow facial recess requires a highly accurate image-guided surgical system. Previous attempts have relied on patient-specific templates and robotic systems to guide drilling tools. In this study, we report on improvements made to an image-guided surgical robot system developed specifically for this purpose and the resulting accuracy achieved in vitro. MATERIALS AND METHODS The proposed image-guided robotic DCA procedure was carried out bilaterally on 4 whole-head cadaver specimens. Specimens were implanted with titanium fiducial markers and imaged with cone-beam CT. A preoperative plan was created using a custom software package wherein relevant anatomical structures of the facial recess were segmented, and a drill trajectory targeting the round window was defined. Patient-to-image registration was performed with the custom robot system to reference the preoperative plan, and the DCA tunnel was drilled in 3 stages with progressively longer drill bits. The position of the drilled tunnel was defined as a line fitted to a point cloud of the segmented tunnel using principal component analysis (PCA function in MATLAB). The accuracy of the DCA was then assessed by coregistering preoperative and postoperative image data and measuring the deviation of the drilled tunnel from the plan. The final step of electrode insertion was also performed through the DCA tunnel after manual removal of the promontory through the external auditory canal. RESULTS Drilling error was defined as the lateral deviation of the tool in the plane perpendicular to the drill axis (excluding depth error). Errors of 0.08 ± 0.05 mm and 0.15 ± 0.08 mm were measured on the lateral mastoid surface and at the target on the round window, respectively (n = 8). Full electrode insertion was possible in 7 cases. In 1 case, the electrode was partially inserted, with 1 contact pair external to the cochlea. CONCLUSION The purpose-built robot system was able to perform a safe and reliable DCA for cochlear implantation. The workflow implemented in this study mimics the envisioned clinical procedure, demonstrating the feasibility of future clinical implementation.
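The line-fitting step described above (a PCA fit to the point cloud of the segmented tunnel, done with MATLAB in the study) amounts to taking the principal axis of the centered points. Below is a minimal sketch of that idea, written in Python/NumPy rather than MATLAB; the function names, the (N, 3) point layout, and the lateral-deviation helper are illustrative assumptions, not the authors' code.

```python
import numpy as np

def fit_tunnel_axis(points):
    """Fit a 3D line to a point cloud (e.g. voxel centers of a segmented
    drill tunnel) via principal component analysis. Returns a point on
    the line (the centroid) and a unit direction vector."""
    points = np.asarray(points, dtype=float)   # shape (N, 3)
    centroid = points.mean(axis=0)
    # SVD of the centered cloud; the first right-singular vector is the
    # direction of maximum variance, i.e. the tunnel axis.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    direction = vt[0] / np.linalg.norm(vt[0])
    return centroid, direction

def lateral_deviation(point, centroid, direction):
    """Lateral (perpendicular-to-axis) distance from a planned point to the
    fitted tunnel axis, mirroring the drilling-error definition above."""
    d = np.asarray(point, float) - centroid
    return np.linalg.norm(d - np.dot(d, direction) * direction)
```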
Abstract:
Many studies in biostatistics deal with binary data. Some of these studies involve correlated observations, which can complicate the analysis of the resulting data. Studies of this kind typically arise when a high degree of commonality exists between test subjects. If there is a natural hierarchy in the data, multilevel analysis is an appropriate tool. Two examples are measurements on identical twins and studies of symmetrical organs or appendages, as in ophthalmic studies. Although this type of matching appears ideal for the purposes of comparison, analysis of the resulting data that ignores the effect of intra-cluster correlation has been shown to produce biased results. This paper will explore the use of multilevel modeling of simulated binary data with predetermined levels of correlation. Data will be generated using the beta-binomial method with varying degrees of correlation between the lower-level observations. The data will be analyzed using the multilevel software package MLwiN (Woodhouse et al., 1995). Comparisons between the specified intra-cluster correlation of these data and the correlations estimated using multilevel analysis will be used to examine the accuracy of this technique in analyzing this type of data.
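The beta-binomial generation step described above has a standard closed form: if each cluster's success probability is drawn from Beta(a, b) and cluster members are conditionally independent Bernoulli trials, the marginal probability is a/(a+b) and the intra-cluster correlation is 1/(a+b+1). A hedged Python sketch of that construction follows; the function name, sample sizes, and the use of NumPy instead of the study's own simulation setup are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)

def beta_binomial_clusters(n_clusters, cluster_size, p, rho):
    """Simulate clustered binary data with marginal probability p and
    intra-cluster correlation rho via the beta-binomial construction:
    each cluster draws its own success probability from Beta(a, b),
    then its members are independent Bernoulli trials given that
    probability. (Illustrative sketch, not the study's code.)"""
    # Beta parameters chosen so that mean = p and
    # intra-cluster correlation = 1 / (a + b + 1) = rho.
    a = p * (1.0 - rho) / rho
    b = (1.0 - p) * (1.0 - rho) / rho
    cluster_p = rng.beta(a, b, size=n_clusters)
    return rng.binomial(1, np.repeat(cluster_p, cluster_size)).reshape(
        n_clusters, cluster_size)

# Example: 500 clusters of size 2 (e.g. twin pairs), p = 0.3, rho = 0.2.
data = beta_binomial_clusters(n_clusters=500, cluster_size=2, p=0.3, rho=0.2)
```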
Abstract:
Clock synchronization on the order of nanoseconds is one of the critical factors for time-based localization. Currently used time synchronization methods were developed for the more relaxed needs of network operation, so their usability for positioning should be carefully evaluated. In this paper, we are particularly interested in GPS-based time synchronization. To judge its usability for localization, we need a method that can evaluate the achieved time synchronization with nanosecond accuracy. Our method for evaluating synchronization accuracy is inspired by signal processing algorithms and relies on fine-grained time information. The method is able to calculate the clock offset and skew between devices with nanosecond accuracy in real time. It was implemented using software-defined radio technology. We demonstrate that GPS-based synchronization suffers from a residual clock offset in the range of a few hundred nanoseconds, but the clock skew is negligible. Finally, we determine a corresponding lower bound on the expected positioning error.
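The abstract does not spell out the estimator, so the sketch below only illustrates the general notion of relative clock offset and skew: given matched timestamps of common reference events on two devices, model t_B as skew * t_A + offset and fit it by least squares. All names and numbers are hypothetical; this is not the signal-processing method or the software-defined-radio implementation described in the paper.

```python
import numpy as np

def estimate_offset_and_skew(t_a, t_b):
    """Given matched event timestamps (seconds) recorded by two devices for
    the same sequence of reference events, estimate the clock skew and
    offset of device B relative to device A by a least-squares fit of
    t_b ~= skew * t_a + offset. Illustrative model only."""
    t_a = np.asarray(t_a, float)
    t_b = np.asarray(t_b, float)
    skew, offset = np.polyfit(t_a, t_b, 1)   # slope, intercept
    return offset, skew

# Example: device B runs 50 ppm fast and is 300 ns ahead of device A.
t_a = np.linspace(0.0, 1.0, 1000)
t_b = (1.0 + 50e-6) * t_a + 300e-9
offset, skew = estimate_offset_and_skew(t_a, t_b)
```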
Abstract:
The aim of the study was to compare fissure sealant quality after mechanical conditioning with an erbium-doped yttrium aluminium garnet (Er:YAG) laser or air abrasion prior to chemical conditioning with phosphoric acid etching or a self-etch adhesive. Twenty-five permanent molars were initially divided into three groups: control group (n = 5), phosphoric acid etching; test group 1 (n = 10), air abrasion; and test group 2 (n = 10), Er:YAG laser. After mechanical conditioning, the test group teeth were sectioned buccolingually and the occlusal surface of one half tooth (equal to one sample) was acid etched, while a self-etch adhesive was applied to the other half. The fissure system of each sample was sealed, thermocycled and immersed in 5% methylene dye for 24 h. Each sample was sectioned buccolingually, and one slice was analysed microscopically. Using specialized software, the proportions of microleakage, unfilled margin, sealant failure and unfilled area were calculated. A nonparametric ANOVA model was applied to compare the Er:YAG treatment with air abrasion and the self-etch adhesive with phosphoric acid (α = 0.05). Test groups were compared to the control group using Wilcoxon rank sum tests (α = 0.05). The control group displayed significantly lower microleakage but higher unfilled area proportions than the Er:YAG laser + self-etch adhesive group, and displayed significantly higher unfilled margin and unfilled area proportions than the air-abrasion + self-etch adhesive group. There was no statistically significant difference in the quality of sealants applied in fissures treated with either Er:YAG laser or air abrasion prior to phosphoric acid etching, nor in the quality of sealants applied in fissures treated with either self-etch adhesive or phosphoric acid following Er:YAG or air-abrasion treatment.
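For the group comparisons mentioned above, a Wilcoxon rank sum test at α = 0.05 can be run in a few lines. The sketch below uses SciPy with placeholder numbers; it only illustrates the kind of test reported, not the study's data or its nonparametric ANOVA model.

```python
from scipy.stats import ranksums

# Hypothetical microleakage proportions for one test group versus the
# control group; placeholder values, not data from the study.
control = [0.05, 0.08, 0.02, 0.10, 0.04]
test    = [0.12, 0.18, 0.09, 0.15, 0.20, 0.11, 0.17, 0.14, 0.10, 0.16]

stat, p_value = ranksums(test, control)
significant = p_value < 0.05   # two-sided test at alpha = 0.05
```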
Abstract:
Few studies have investigated causal pathways linking psychosocial factors to each other and to screening mammography. Conflicting hypotheses exist in the theoretical literature regarding the role and importance of subjective norms, a person's perceived social pressure to perform the behavior and his/her motivation to comply. The Theory of Reasoned Action (TRA) hypothesizes that subjective norms directly affect intention, while the Transtheoretical Model (TTM) hypothesizes that attitudes mediate the influence of subjective norms on stage of change. No one has examined which hypothesis best predicts the effect of subjective norms on mammography intention and stage of change. Two statistical methods are available for testing mediation, sequential regression analysis (SRA) and latent variable structural equation modeling (LVSEM); however, software to apply LVSEM to dichotomous variables like intention has only recently become available, and no one has compared the methods to determine whether they yield similar results for dichotomous variables. Study objectives were to: (1) determine whether the effects of subjective norms on mammography intention and stage of change are mediated by pros and cons; and (2) compare mediation results from the SRA and LVSEM approaches when the outcome is dichotomous. We conducted a secondary analysis of data from a national sample of women veterans enrolled in Project H.O.M.E. (Healthy Outlook on the Mammography Experience), a behavioral intervention trial. Results showed that the TTM model described the causal pathways better than the TRA one; however, we found support for only one of the TTM causal mechanisms: cons was the sole mediator. The mediated effect of subjective norms on intention and stage of change through cons was very small. These findings suggest that interventionists focus their efforts on reducing negative attitudes toward mammography when resources are limited. Both the SRA and LVSEM methods provided evidence for complete mediation, and the direction, magnitude, and standard errors of the parameter estimates were very similar. Because SRA parameter estimates were not biased toward the null, we can probably assume negligible measurement error in the independent and mediator variables. Simulation studies are needed to further our understanding of how these two methods perform under different data conditions.
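A sequential-regression (SRA) mediation analysis with a dichotomous outcome can be sketched as three nested models, in the spirit of the Baron and Kenny steps: outcome on predictor, mediator on predictor, and outcome on predictor plus mediator. The Python/statsmodels sketch below only illustrates that logic; the variable names and model forms are assumptions, not the study's actual SRA or LVSEM specifications.

```python
import numpy as np
import statsmodels.api as sm

def sra_mediation(subjective_norms, cons, intention):
    """Sequential-regression sketch of the mediation question above: do
    'cons' (negative attitudes) mediate the effect of subjective norms on
    a dichotomous intention? Baron-and-Kenny-style illustration only."""
    x = sm.add_constant(np.asarray(subjective_norms, float))

    # Step 1: total effect of subjective norms on intention (logistic).
    total = sm.Logit(intention, x).fit(disp=False)

    # Step 2: effect of subjective norms on the mediator (linear).
    to_mediator = sm.OLS(cons, x).fit()

    # Step 3: effect of subjective norms on intention controlling for cons;
    # a shrunken, non-significant norms coefficient suggests mediation.
    xm = sm.add_constant(np.column_stack([subjective_norms, cons]))
    direct = sm.Logit(intention, xm).fit(disp=False)

    return total.params, to_mediator.params, direct.params
```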
Abstract:
The use of intensity-modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verification, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2 mm accuracy for single-beam fluence map verifications and to 5%/2 mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and the percentage of pixels failing the gamma index were exponentially distributed and depended on the measurement phantom but not the treatment site. Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
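One of the comparison metrics mentioned above, the percentage of pixels failing the gamma index, follows the standard %/mm construction: a pixel passes if some nearby calculated point satisfies the combined dose-difference and distance-to-agreement criterion. The brute-force sketch below illustrates that construction with global normalization; it is not the NAT index or the clinical software developed in this work, and the parameter defaults are assumptions.

```python
import numpy as np

def gamma_pass_rate(measured, calculated, pixel_mm, dose_tol=0.03, dist_mm=2.0):
    """Brute-force 2D gamma comparison in the spirit of the %/mm criteria
    quoted above (e.g. 3%/2 mm), with global normalization to the maximum
    calculated dose. Simplified sketch for illustration only."""
    meas = np.asarray(measured, float)
    calc = np.asarray(calculated, float)
    dose_ref = dose_tol * calc.max()

    gamma_sq = np.full(meas.shape, np.inf)
    radius_px = int(np.ceil(3 * dist_mm / pixel_mm))
    for dy in range(-radius_px, radius_px + 1):
        for dx in range(-radius_px, radius_px + 1):
            # Shift the calculated distribution and evaluate the gamma
            # argument at this spatial offset (edges wrap; acceptable for
            # a sketch on interior pixels).
            shifted = np.roll(np.roll(calc, dy, axis=0), dx, axis=1)
            dist_term = ((dy * pixel_mm) ** 2 + (dx * pixel_mm) ** 2) / dist_mm ** 2
            dose_term = ((meas - shifted) / dose_ref) ** 2
            gamma_sq = np.minimum(gamma_sq, dist_term + dose_term)

    return float((np.sqrt(gamma_sq) <= 1.0).mean())
```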
Abstract:
Purpose. To determine the usability of two video games designed to prevent type 2 diabetes and obesity among youth, through analysis of data collected during alpha testing. Subjects. Ten children aged 9 to 12 were selected for three 2-hour alpha-testing sessions. Methods. "Escape from Diab" and "Nanoswarm" were designed to change dietary and physical inactivity behaviors, based on a theoretical framework of mediating variables drawn from social cognitive theory, self-determination theory, the elaboration likelihood model, and behavioral inoculation theory. Thirteen mini-games developed by the software company were divided into 3 groups based on completion date. Children tested 4-5 mini-games in each of three sessions. Observed game play was followed by a scripted interview. Results from observation forms and interview transcripts were tabulated and coded to determine usability. Suggestions for game modifications were delivered to the software design firm, and a follow-up table reports the rationale for inclusion or exclusion of each modification. Results. Of the participants, 50% were frequent video game players and 20% were non-players; most (60%) were female. The mean grade (indicating likeability as a subset of usability) across all games given by children was significantly greater than a neutral grade of 80% (89%, p < 0.01), indicating a positive likeability score. The games on average also received positive ratings for fun, helpfulness of instructions and length compared to neutral values (midpoints on Likert scales) (all p < 0.01). Observation notes indicated that participants paid attention to the instructions, did not appear to have much difficulty with the games, and were "not frustrated", "not bored", "very engaged", "not fidgety" and "very calm" (all p < 0.01). The primary issues noted in observations and interviews were unclear instructions and the unclear purpose of some games. Player suggestions primarily involved ways to make on-screen cues more visible or noticeable, instructions clearer, and games more elaborate or difficult. Conclusions. The present study highlights the importance of alpha testing video game components for usability prior to completion to enhance usability and likeability. Results indicate that creating clear instructions, making peripheral screen cues more eye-catching or noticeable, and clearly stating the purpose of each game to improve understandability are important elements. However, future interventions will each present unique materials and user interfaces and should therefore also be thoroughly alpha tested.
Abstract:
This study addresses the responses to a postcard campaign with health messages targeting the parents of children in a sample of low-income elementary schools and assesses the feasibility of and possible improvements to such a project. The campaign was implemented in Spring 2009 with 4th-grade students (n = 1070) in fifteen economically disadvantaged elementary schools in Travis County, Texas. Postcards were sent home with the children, and parents filled out a feedback card that the children returned to school. Response data, in the form of self-administered feedback cards (n = 2665) and one-on-one teacher interviews (n = 8), were qualitatively analyzed using NVivo 8 software. Postcard reception and points of improvement were then identified from the significant themes that emerged, including health, cessation or reduction of unhealthy behaviors, motivation, family, and the comprehension of abstract health concepts. Responses to the postcard campaign were almost entirely positive, with less than 1% of responses reporting some form of dislike, and many parents reported modifying their behavior. However, possible improvements to the campaign include: increased focus of the postcards on the parents as the target population, more information about serving size, greater emphasis on the link between obesity and health, alteration of certain skin tones used in the graphical depiction of people on the cards, and smaller but more frequent incentives for students to return the feedback cards. The program appears to be an effective method of communicating health messages to the parents of 4th-grade children.
Abstract:
Introduction. Lake Houston serves as a reservoir of both recreational and drinking water for residents of Houston, Texas, and the metropolitan area. The Texas Commission on Environmental Quality (TCEQ) has expressed concerns about the water quality and increasing amounts of pathogenic bacteria in Lake Houston (3). The objective of this investigation was to evaluate water quality in Cypress Creek for the presence of bacteria, nitrates, nitrites, carbon, phosphorus, dissolved oxygen, pH, turbidity, suspended solids, dissolved solids, and chlorine. The aims of this project were to analyze samples of water from Cypress Creek and to render a quantitative and graphical representation of the results. The collected information will allow a better understanding of the aqueous environment in Cypress Creek. Methods. Water samples were collected in August 2009 and analyzed in the field and at the UTSPH laboratory by spectrophotometry and other methods. Mapping software was used to develop maps of the sample sites from coordinates obtained with the Global Positioning System (GPS). Sample sites and concentrations were mapped using Geographic Information System (GIS) software and correlated with permitted outfalls and other land use characteristics. Results. All areas sampled were positive for the presence of total coliform and Escherichia coli (E. coli). The presence of other water contaminants varied at each location in Cypress Creek but was under the maximum allowable limits designated by the Texas Commission on Environmental Quality. However, dissolved oxygen concentrations were elevated above the TCEQ limit of 5.0 mg/L at the majority of the sites. One site had a near-limit nitrate concentration of 9.8 mg/L. Land use above this site included farm land, agricultural land, a golf course, parks, residential neighborhoods, and nine permitted TCEQ effluent discharge sites within 0.5 miles upstream. Significance. Lake Houston and its tributary, Cypress Creek, are used as recreational waters where individuals may be exposed to microbial contamination. Lake Houston is also the source of drinking water for much of Houston/Harris and Galveston Counties. This research identified the presence of microbial contaminants in Cypress Creek above TCEQ regulatory requirements. The other water quality variables measured were in line with TCEQ regulations, except for the near-limit nitrate concentration at sample site #10, at Jarvis and Timberlake in Cypress, Texas.
Abstract:
Background: Obesity is a major health problem in the United States that has reached epidemic proportions. With most U.S. adults spending the majority of their waking hours at work, the influence of the workplace environment on obesity is gaining in importance. Recent research implicates worksites as providing an 'obesogenic' environment, as they encourage overeating and reduce the opportunity for physical activity. Objective: The aims of this study are to describe the nutrition and physical activity environment of Texas Medical Center (TMC) hospitals participating in the Shape Up Houston evaluation study, to develop a scoring system to quantify the environmental data collected using the Environmental Assessment Tool (EAT) survey, and to assess the inter-observer reliability of the EAT survey. Methods: A survey instrument adapted from the Environmental Assessment Tool (EAT) developed by DeJoy et al. in 2008 to measure hospital environmental support for nutrition and physical activity was used for this study. The inter-observer reliability of the EAT survey was measured and total percent agreement scores were computed. Most responses on the EAT survey are dichotomous (yes/no), and these responses were coded '0' for a 'no' response and '1' for a 'yes' response. A summative scoring system was developed to quantify these responses: each hospital was given a score for each scale and subscale on the EAT survey, in addition to a total score. All analyses were conducted using Stata 11 software. Results: High inter-observer reliability was observed using EAT; percentage agreement scores ranged from 94.4% to 100%. Only 2 of the 5 hospitals had a fitness facility onsite, and scores for exercise programs and outdoor facilities available to hospital employees ranged from 0–62% and 0–37.5%, respectively. Healthy eating scores for hospital cafeterias ranged from 42% to 92% across the different hospitals, while healthy vending scores ranged from 0% to 40%. The total TMC 'healthy hospital' score was 49%. Conclusion: The EAT survey is a reliable instrument for measuring the physical activity and nutrition support environment of hospital worksites. The study results showed large variability among the TMC hospitals in the existing physical activity and nutrition support environment. This study proposes cost-effective policy changes that can increase environmental support for healthy eating and active living among TMC hospital employees.
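The scoring and reliability computations described above are simple to state: dichotomous items are coded 0/1 and summed into scale percentages, and inter-observer reliability is the total percent agreement between two raters. A small Python sketch with hypothetical ratings follows; the item values are placeholders, not EAT data from the study.

```python
import numpy as np

def percent_agreement(rater_a, rater_b):
    """Total percent agreement between two observers completing the same
    dichotomously coded survey (illustrative of the EAT reliability check)."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    return 100.0 * np.mean(a == b)

def scale_score(responses):
    """Summative score for one EAT scale: 'yes' items coded 1, 'no' items 0,
    expressed as the percentage of possible points obtained."""
    r = np.asarray(responses, dtype=float)
    return 100.0 * r.sum() / r.size

# Hypothetical example: two observers rating ten items at one hospital.
obs1 = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
obs2 = [1, 0, 1, 1, 0, 1, 1, 1, 1, 0]
agreement = percent_agreement(obs1, obs2)   # 90.0
healthy_eating = scale_score(obs1)          # 60.0
```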
Abstract:
With continuous improvements in brachytherapy source designs and techniques, a method of 3D dosimetry for treatment dose verification would better ensure accurate patient radiotherapy treatment. This study first aimed to evaluate the 3D dose distributions of the low-dose-rate (LDR) Amersham 6711 Oncoseed™ using PRESAGE® dosimeters, to establish PRESAGE® as a suitable brachytherapy dosimeter. The new AgX100 125I seed model (Theragenics Corporation) was then characterized using PRESAGE® following the TG-43 protocol. PRESAGE® dosimeters are solid, polyurethane-based 3D dosimeters doped with radiochromic leuco dyes that produce a linear optical density response to radiation dose. For this project, the radiochromic response in PRESAGE® was captured using optical-CT scanning (632 nm) and the final 3D dose matrix was reconstructed using MATLAB software. An Amersham 6711 seed with an air-kerma strength of approximately 9 U was used to irradiate two dosimeters to 2 Gy and 11 Gy at 1 cm to evaluate dose rates in the r = 1 cm to r = 5 cm region. The dosimetry parameters were compared to the values published in the updated AAPM Report No. 51 (TG-43U1). An AgX100 seed with an air-kerma strength of about 6 U was used to irradiate two dosimeters to 3.6 Gy and 12.5 Gy at 1 cm. The dosimetry parameters for the AgX100 were compared to values measured in previous Monte Carlo and experimental studies. In general, the measured dose rate constant, anisotropy function, and radial dose function for the Amersham 6711 agreed with consensus values to better than 5% in the r = 1 to r = 3 cm region. The dose rates and radial dose functions measured for the AgX100 agreed with the MCNPX and TLD-measured values within 3% in the r = 1 to r = 3 cm region. The measured anisotropy function in PRESAGE® showed relative differences of up to 9% from the MCNPX-calculated values. It was determined that the post-irradiation optical density change over several days was non-linear in different dose regions, and therefore the dose values in the r = 4 to r = 5 cm region had higher uncertainty due to this effect. This study demonstrated that, within a radial distance of 3 cm, brachytherapy dosimetry in PRESAGE® can be accurate within 5% as long as irradiation times are within 48 hours.
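Characterization "following the TG-43 protocol" refers to the AAPM TG-43 dose-rate formalism, in which the dose rate around a seed is the product of air-kerma strength, dose rate constant, a geometry-function ratio, the radial dose function, and the anisotropy function. The sketch below writes out the point-source form of that equation in Python; the interpolators and numbers are placeholders (the dose rate constant shown is only indicative), and this is not the MATLAB reconstruction or analysis code used in the study.

```python
import numpy as np

def tg43_dose_rate(r_cm, theta_deg, air_kerma_strength_U, dose_rate_constant,
                   radial_dose_fn, anisotropy_fn):
    """Point-source TG-43 dose-rate sketch (cGy/h):
        D(r, theta) = S_K * Lambda * [G(r)/G(r0)] * g(r) * F(r, theta)
    with G(r) = 1/r^2 and r0 = 1 cm. radial_dose_fn and anisotropy_fn are
    caller-supplied interpolators of tabulated g(r) and F(r, theta); this is
    a generic illustration of the formalism, not the study's analysis code."""
    geometry_ratio = (1.0 / r_cm ** 2) / (1.0 / 1.0 ** 2)
    return (air_kerma_strength_U * dose_rate_constant * geometry_ratio *
            radial_dose_fn(r_cm) * anisotropy_fn(r_cm, theta_deg))

# Example with toy interpolators (placeholder numbers, not measured data):
g = lambda r: np.interp(r, [0.5, 1.0, 2.0, 3.0, 5.0],
                           [1.05, 1.0, 0.82, 0.65, 0.41])
F = lambda r, th: 1.0  # isotropic placeholder anisotropy function
d_rate = tg43_dose_rate(2.0, 90.0, air_kerma_strength_U=9.0,
                        dose_rate_constant=0.965, radial_dose_fn=g,
                        anisotropy_fn=F)
```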
Abstract:
Software architectural evaluation is a key discipline used to identify, at early stages of real-time system (RTS) development, the problems that may arise during its operation. Typical mechanisms supporting concurrency, such as semaphores, mutexes or monitors, often lead to run-time concurrency problems that are difficult to identify, reproduce and solve. For this reason, it is crucial to understand the root causes of these problems and to provide support to identify and mitigate them at early stages of the system lifecycle. This paper presents the results of research oriented toward the development of a tool called 'Deadlock Risk Evaluation of Architectural Models' (DREAM) to assess deadlock risk in architectural models of an RTS. A particular architectural style, Pipelines of Processes in Object-Oriented Architectures–UML (PPOOA), was used to represent platform-independent models of an RTS architecture, supported by the PPOOA-Visio tool. We validated the technique presented here by using several case studies related to RTS development and comparing our results with those from other deadlock detection approaches supported by different tools. Here we present two of these case studies, one related to avionics and the other to planetary exploration robotics. Copyright © 2011 John Wiley & Sons, Ltd.
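The abstract does not detail DREAM's internal analysis, but the structural symptom it targets, deadlock risk, is classically exposed as a cycle in a wait-for (or resource-allocation) graph derived from the architectural model. The sketch below shows that generic cycle-detection idea in Python; the task names and graph encoding are hypothetical, and this is not the DREAM or PPOOA tooling.

```python
from collections import defaultdict

def find_cycles(wait_for):
    """Detect cycles in a wait-for graph (task -> set of tasks it waits on).
    A cycle is the classic structural symptom of deadlock risk; this generic
    depth-first search only illustrates the idea behind architectural
    deadlock analysis."""
    cycles = []
    WHITE, GREY, BLACK = 0, 1, 2
    color = defaultdict(int)

    def dfs(node, path):
        color[node] = GREY
        path.append(node)
        for succ in wait_for.get(node, ()):
            if color[succ] == GREY:               # back edge: cycle found
                cycles.append(path[path.index(succ):] + [succ])
            elif color[succ] == WHITE:
                dfs(succ, path)
        path.pop()
        color[node] = BLACK

    for task in list(wait_for):
        if color[task] == WHITE:
            dfs(task, [])
    return cycles

# Hypothetical avionics-style example: two tasks acquiring two shared
# resources in opposite order, modeled directly as mutual waiting.
print(find_cycles({"nav_task": {"display_task"}, "display_task": {"nav_task"}}))
```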
Abstract:
We have analyzed the performance of a PET demonstrator formed by two sectors of four monolithic detector blocks placed face to face. Both the front-end and read-out electronics have been evaluated by means of coincidence measurements using a rotating 22Na source placed at the center of the sectors in order to emulate the behavior of a complete ring. A continuous training method based on neural network (NN) algorithms has been carried out to determine the entrance points over the surface of the detectors. Reconstructed images of a 1 MBq 22Na point source and a 22Na Derenzo phantom have been obtained using both filtered back projection (FBP) analytic methods and the OSEM 3D iterative algorithm available in the STIR software package [1]. Preliminary data on image reconstruction of a 22Na point source with Ø = 0.25 mm show spatial resolutions from 1.7 to 2.1 mm FWHM in the transverse plane. The results confirm the viability of this design for the development of a full-ring brain PET scanner compatible with magnetic resonance imaging for human studies.
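The entrance-point estimation described above (a neural network trained on the light distribution read out from each monolithic block) can be illustrated with a small regression example. The sketch below uses scikit-learn on synthetic Gaussian light spots; the pixel grid, light model, and network size are assumptions made for the sake of a runnable example and do not reflect the demonstrator's actual training method.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
pix = np.linspace(-20, 20, 8)                      # assumed 8 x 8 pixel grid, mm
px, py = np.meshgrid(pix, pix)

def light_pattern(xy, width=8.0, photons=2000):
    """Gaussian light spot centred on the entrance point, with Poisson noise
    (toy model of the scintillation-light distribution over the block)."""
    spot = np.exp(-((px - xy[0]) ** 2 + (py - xy[1]) ** 2) / (2 * width ** 2))
    return rng.poisson(photons * spot / spot.sum()).ravel()

true_xy = rng.uniform(-18, 18, size=(5000, 2))
signals = np.array([light_pattern(xy) for xy in true_xy], dtype=float)

model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=300, random_state=0)
model.fit(signals, true_xy)                        # train entrance-point estimator
predicted = model.predict(signals[:5])             # (x, y) in mm
```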
Abstract:
This report addresses speculative parallelism (the assignment of spare processing resources to tasks which are not known to be strictly required for the successful completion of a computation) at the user and application level. At this level, the execution of a program is seen as a (dynamic) tree, or a graph in general. A solution to a problem is a traversal of this graph from the initial state to a node known to be the answer. Speculative parallelism then represents the assignment of resources to multiple branches of this graph even if they are not positively known to be on the path to a solution. In highly non-deterministic programs the branching factor can be very high, and a naive assignment will very soon use up all the resources. This report presents work assignment strategies other than the usual depth-first and breadth-first. Instead, best-first strategies are used. Since their definition is application-dependent, the application language contains primitives that allow the user (or application programmer) to a) indicate when intelligent OR-parallelism should be used, b) provide the functions that define "best," and c) indicate when to use them. An abstract architecture enables those primitives to perform the search in a "speculative" way, using several processors, synchronizing them, killing the siblings of the path leading to the answer, and so on. The user is freed from worrying about these interactions. Several search strategies are proposed and their implementation issues are addressed. "Armageddon," a global pruning method, is introduced, together with both a software and a hardware implementation for it. The concepts presented are applicable to areas of Artificial Intelligence such as large expert systems, planning, game playing, and large search problems in general. The proposed strategies, although showing promise, have not yet been evaluated by simulation or experimentation.
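The best-first strategy described above can be sketched sequentially as a priority-queue search ordered by the user-supplied "best" function; in the speculative-parallel setting each promising branch would instead be handed to a spare processor. The Python sketch below illustrates only that core ordering idea; the interface (successors, is_answer, best) is an assumption, and the OR-parallel machinery and "Armageddon" pruning are omitted.

```python
import heapq

def best_first_search(initial, successors, is_answer, best):
    """Sequential sketch of a best-first strategy: open branches of the
    search tree are kept in a priority queue ordered by a user-supplied
    'best' function, and the most promising branch is expanded next. In a
    speculative-parallel setting each dequeued branch would be handed to a
    spare worker rather than expanded in place."""
    counter = 0                      # tie-breaker so the heap never compares states
    frontier = [(best(initial), counter, initial)]
    while frontier:
        _, _, state = heapq.heappop(frontier)
        if is_answer(state):
            return state
        for child in successors(state):
            counter += 1
            heapq.heappush(frontier, (best(child), counter, child))
    return None
```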