932 results for case-based design
Abstract:
Performance and manufacturability are two important issues that must be taken into account during MEMS design. Existing MEMS design models and systems follow a process-driven design paradigm: design starts from the specification of a process sequence or the customization of a foundry-ready process template. There has been essentially no methodology or model that supports generic, high-level design synthesis for MEMS conceptual design, so there is no basis for specifying the initial process sequences. To address this problem, this paper proposes a performance-driven, microfabrication-oriented methodology for MEMS conceptual design. A unified behaviour representation method is proposed which incorporates information on both physical interactions and chemical/biological/other reactions. Based on this method, a behavioural-process-based design synthesis model is proposed which exploits multidisciplinary phenomena for design solutions, including both the structural components and their configuration for the MEMS device, as well as the substances necessary for the chemical/biological/other reactions. The model supports both forward and backward synthetic search for suitable phenomena. To ensure manufacturability, a strategy of using microfabrication-oriented phenomena as design knowledge is proposed, where the phenomena are derived from existing MEMS devices that have associated MEMS-specific microfabrication processes or foundry-ready process templates. To test the applicability of the proposed methodology, the paper also studies microfluidic device design and uses a micro-pump design as the case study.
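As an illustration of the forward and backward synthetic search described above, the following is a minimal sketch assuming a toy phenomena knowledge base in which each phenomenon maps an input behaviour to an output behaviour; the phenomenon names and behaviours are hypothetical, not the paper's actual representation.

```python
from collections import deque

# Illustrative phenomena knowledge base: each entry maps an input behaviour
# to an output behaviour (invented names, for demonstration only).
PHENOMENA = {
    "joule_heating":           ("voltage", "temperature"),
    "thermal_expansion":       ("temperature", "strain"),
    "piezoresistive_effect":   ("strain", "resistance_change"),
    "electrostatic_actuation": ("voltage", "displacement"),
}

def search(start, goal, forward=True):
    """Breadth-first chaining of phenomena; forward=False reverses the arcs,
    giving the backward synthetic search from the desired output behaviour."""
    arcs = [(n, i, o) if forward else (n, o, i)
            for n, (i, o) in PHENOMENA.items()]
    queue, seen = deque([(start, [])]), {start}
    while queue:
        behaviour, chain = queue.popleft()
        if behaviour == goal:
            return chain            # ordered list of phenomena to realize
        for name, src, dst in arcs:
            if src == behaviour and dst not in seen:
                seen.add(dst)
                queue.append((dst, chain + [name]))
    return None

print(search("voltage", "resistance_change"))                  # forward search
print(search("resistance_change", "voltage", forward=False))   # backward search
```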
Abstract:
The activated sludge system is the biological treatment most widely used around the world for wastewater purification. Its performance depends on the correct operation of both the biological reactor and the secondary settler. When the settling phase does not proceed correctly, non-settled biomass escapes with the effluent, causing an impact on the receiving environment. Solids separation problems are currently one of the main causes of inefficiency in the operation of activated sludge systems worldwide. They include: filamentous bulking, viscous bulking, biological foaming, dispersed growth, pin-point floc and uncontrolled denitrification. The origin of separation problems generally lies in an imbalance between the main communities of microorganisms involved in biomass settling: floc-forming bacteria and filamentous bacteria. Because of this microbiological origin, their identification and control is not an easy task for plant managers. Knowledge-Based Decision Support Systems (KBDSS) are a group of software tools characterized by their ability to represent heuristic knowledge and handle large amounts of data. The objective of this thesis is the development and validation of a KBDSS specifically designed to support plant managers in the control of solids separation problems of microbiological origin in activated sludge systems. To achieve this main objective, the KBDSS must present the following characteristics: (1) the implementation of the system must be feasible and realistic to guarantee its correct operation; (2) the reasoning of the system must be dynamic and evolutionary, to adapt to the needs of the domain to which it is applied; and (3) the reasoning of the system must be intelligent. First, in order to guarantee the feasibility of the system, a small-scale study (Catalonia) was carried out, which made it possible to determine the variables most commonly used for the diagnosis and monitoring of the problems and the most feasible control methods, as well as to detect the main limitations that the system should overcome. The results of previous applications have shown that the main limitation in the development of KBDSSs is the structure of the knowledge base (KB), where all the knowledge acquired about the domain is represented together with the reasoning processes to follow. In our case, given the dynamics of the domain, these limitations could be aggravated if this design were not optimal. To this end, the Domino Model has been proposed as a tool for the conceptual design of the system. Finally, in line with the last objective, concerning intelligent reasoning, an Expert System (based on expert knowledge) and a Case-Based Reasoning System (based on experience) have been integrated as the main intelligent systems in charge of carrying out the reasoning of the KBDSS. Chapters 5 and 6, respectively, present the development of the dynamic Expert System (ES) and of the temporal Case-Based Reasoning System, called the Episode-Based Reasoning System (EBRS). Next, Chapter 7 presents details of the implementation of the overall system (KBDSS) in the G2 environment.
Then, Chapter 8 reports the results obtained during the 11 months of validation of the system, in which aspects such as the accuracy, capacity and usefulness of the system were validated both experimentally (prior to implementation) and through its real implementation at the Girona WWTP. Finally, Chapter 9 lists the main conclusions derived from this thesis.
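As an illustration of the case-based reasoning component described above, here is a minimal sketch of the retrieval step, assuming a weighted nearest-neighbour similarity over a few monitored process variables; the variable names, weights and cases are hypothetical, not those of the actual KBDSS.

```python
import math

# Hypothetical historical cases: a few monitored variables and the diagnosed
# separation problem. Values are illustrative only.
CASES = [
    ({"svi": 250, "do": 0.8, "f_m": 0.05}, "filamentous bulking"),
    ({"svi": 110, "do": 2.0, "f_m": 0.30}, "normal operation"),
    ({"svi": 180, "do": 1.5, "f_m": 0.10}, "biological foaming"),
]
WEIGHTS = {"svi": 0.5, "do": 0.3, "f_m": 0.2}
RANGES  = {"svi": 300.0, "do": 5.0, "f_m": 0.5}   # for normalization

def distance(a, b):
    """Weighted, normalized Euclidean distance between two plant situations."""
    return math.sqrt(sum(
        WEIGHTS[k] * ((a[k] - b[k]) / RANGES[k]) ** 2 for k in WEIGHTS))

def retrieve(current):
    """The 'retrieve' step of the CBR cycle: return the most similar past case."""
    return min(CASES, key=lambda case: distance(current, case[0]))

situation = {"svi": 230, "do": 0.9, "f_m": 0.07}
features, diagnosis = retrieve(situation)
print(diagnosis)   # -> filamentous bulking, the nearest past case
```

In the EBRS, the retrieved unit would be a temporal episode rather than a single snapshot, but the retrieve-and-reuse cycle follows the same principle.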
Abstract:
Objective: ‘Music Therapeutic Caregiving’, in which caregivers sing for or together with persons with dementia during morning care situations, has been shown to increase verbal and nonverbal communication between persons with dementia and their caregivers, as well as to enhance positive and decrease negative emotions in persons with dementia. No studies about singing during mealtimes have been conducted, and this pilot project was designed to elucidate this. However, since previous studies have shown that there is a risk that persons with dementia will start to sing along with the caregiver, the caregiver in this study hummed, so that the person with dementia would not sing instead of eating. The aim of this pilot project was threefold: to describe expressed emotions in a woman with severe dementia, to describe communication between her and her caregivers without and with the caregiver humming, and to measure food and liquid intake without and with humming. Method: The study was constructed as a single-case ABA design in which the ordinary mealtime constituted the baseline, comprising a woman with severe dementia being fed by her caregivers in the usual way. The intervention included the same woman being fed by the same caregiver, who hummed while feeding her. Data comprised video observations that were collected once per week over 5 consecutive weeks. The Verbal and Nonverbal Interaction Scale and the Observed Emotion Rating Scale were used to analyze the recorded interactions. Results: A slightly positive influence on communication was shown for the woman with dementia, as well as for the caregiver. Further, the woman with dementia showed a slight increase in expressions of positive emotions, and she ate more during the intervention. Conclusion: Based on this pilot study no general conclusions can be drawn. It can be concluded, however, that humming while feeding persons with dementia might slightly enhance communication and positive expressed emotions in persons with dementia. To confirm this, more studies at the group level are needed. Because previous studies have found that caregiver singing during caring situations influences persons with dementia positively, it might be desirable to test the same during mealtimes.
Abstract:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we proposed to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration. The implemented CAD Framework - named Cave2 - followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches:

- Object-oriented frameworks are extensible by design, so this is also true of the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of the consistency between semantics and visualization - a particularly important issue in a design environment with multiple views of a single design - is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible; if so, it triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics - and thus to other possible views - according to the consistency policy in use. Furthermore, the use of event pools allows a late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the level of abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing resource addition and removal at runtime.
- The implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which grants independence between the CAD Framework and the operating system.

All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment. This can increase group awareness and allow a richer transfer of experiences among designers, significantly improving the collaboration potential when compared to previously proposed file-based or record-based approaches. Three case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; these extensions - design representation primitives and tool blocks - are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the possibility of integrating multimedia metadata into the design data model. This possibility is explored in the frame of an online educational and training platform.
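A minimal sketch of the inversion of control between view and semantics described above: the view never mutates the design data directly; it wraps the user input in an event object and proposes it to the semantic model, which validates the change and, if accepted, refreshes every registered view. All class and method names are illustrative assumptions, not the actual Cave2 API.

```python
class Event:
    """Encapsulates one discrete designer interaction."""
    def __init__(self, target, new_value):
        self.target, self.new_value = target, new_value

class SemanticModel:
    def __init__(self):
        self.state, self.views = {}, []

    def attach(self, view):
        self.views.append(view)

    def propose(self, event):
        """Views never mutate state directly; they propose a change."""
        if self.is_valid(event):                 # evaluate if a state change is possible
            self.state[event.target] = event.new_value
            for view in self.views:              # update semantics *and* all views
                view.refresh(event)

    def is_valid(self, event):
        return event.new_value is not None       # placeholder consistency rule

class View:
    def __init__(self, name, model):
        self.name, self.model = name, model
        model.attach(self)

    def on_user_input(self, target, value):
        self.model.propose(Event(target, value))  # inversion of control

    def refresh(self, event):
        print(f"{self.name}: {event.target} -> {event.new_value}")

model = SemanticModel()
a, b = View("schematic", model), View("netlist", model)
a.on_user_input("gate1.width", 2.0)   # both views are refreshed consistently
```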
Abstract:
In recent years, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by higher requirements for systems with great performance in terms of quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy boxed products such as food or cigarettes. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which is for this reason called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called on to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and the different operational scenarios; obtaining high quality of the final product through verification of the correctness of the processing; directing the operator assigned to the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support for the maintenance operations of the machine. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been receiving this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. In recent years it has been possible to note considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. Concerning logic control design, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault-tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey on the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach is presented, based on Discrete Event Systems, for the problem of software formal verification, together with an active fault-tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
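As a toy illustration of the Discrete Event Systems flavour of the fault diagnosis problem mentioned above, the following sketch models a component as a finite automaton with an unobservable fault event and tracks the set of states consistent with the observed events; the automaton and event names are invented for demonstration, not the thesis's actual models.

```python
# Transition map: (state, event) -> next state. The "fail" event is a fault
# the elaboration unit cannot observe directly.
TRANSITIONS = {
    ("idle", "start"): "working",
    ("working", "done"): "idle",
    ("working", "fail"): "broken",     # unobservable fault event
    ("broken", "timeout"): "broken",
}
UNOBSERVABLE = {"fail"}

def closure(states):
    """All states reachable through unobservable events only."""
    result, changed = set(states), True
    while changed:
        changed = False
        for (s, e), t in TRANSITIONS.items():
            if s in result and e in UNOBSERVABLE and t not in result:
                result.add(t)
                changed = True
    return result

def diagnose(observations):
    """Track every state consistent with the observed event sequence."""
    possible = closure({"idle"})
    for obs in observations:
        possible = closure({TRANSITIONS[(s, obs)]
                            for s in possible if (s, obs) in TRANSITIONS})
    if possible == {"broken"}:
        return "fault certain"
    return "fault possible" if "broken" in possible else "nominal"

print(diagnose(["start", "done"]))     # nominal behaviour explains everything
print(diagnose(["start", "timeout"]))  # only the faulty path explains this
```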
Abstract:
Background: Increasing attention is being paid to improving undergraduate science, technology, engineering, and mathematics (STEM) education through increased adoption of research-based instructional strategies (RBIS), but high-quality measures of faculty instructional practice do not exist to monitor progress. Purpose/Hypothesis: The measure of how well an implemented intervention follows the original is called fidelity of implementation. This framework was used to address the research questions: What is the fidelity of implementation of selected RBIS in engineering science courses? That is, how closely does engineering science classroom practice reflect the intentions of the original developers? Do the critical components that characterize an RBIS discriminate between engineering science faculty members who claimed use of the RBIS and those who did not? Design/Method: A survey of 387 U.S. faculty teaching engineering science courses (e.g., statics, circuits, thermodynamics) included questions about class time spent on 16 critical components and use of 11 corresponding RBIS. Fidelity was quantified as the percentage of RBIS users who also spent time on the corresponding critical components. Discrimination between users and nonusers was tested using chi-square tests. Results: Overall fidelity of the 11 RBIS ranged from 11% to 80% of users spending time on all required components. Fidelity was highest for RBIS with one required component: case-based teaching, just-in-time teaching, and inquiry learning. Thirteen of the 16 critical components discriminated between users and nonusers for all RBIS to which they were mapped. Conclusions: Results were consistent with the initial mapping of critical components to RBIS. Fidelity of implementation is a potentially useful framework for future work in STEM undergraduate education.
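A hedged sketch of the two quantities described above, computed on fabricated survey rows: fidelity as the percentage of RBIS users who spent class time on every required critical component, and a chi-square test of whether a single component discriminates users from nonusers. The data and field names are invented for illustration.

```python
from scipy.stats import chi2_contingency

# One row per respondent: claimed use of one RBIS, and whether class time
# was spent on each of its required critical components (fabricated data).
respondents = [
    {"uses_rbis": True,  "components": [True, True]},
    {"uses_rbis": True,  "components": [True, False]},
    {"uses_rbis": False, "components": [False, False]},
    {"uses_rbis": False, "components": [True, False]},
]

# Fidelity: share of users spending time on *all* required components.
users = [r for r in respondents if r["uses_rbis"]]
fidelity = 100 * sum(all(r["components"]) for r in users) / len(users)
print(f"fidelity: {fidelity:.0f}% of users spent time on all components")

# 2x2 contingency table for one component: RBIS use vs. time on component.
table = [[0, 0], [0, 0]]
for r in respondents:
    table[int(r["uses_rbis"])][int(r["components"][0])] += 1
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```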
Abstract:
The European foundry business is a traditional, less RTD-intensive industry which is dominated by SMEs and which forms a significant part of Europe's manufacturing industry. The efficient design and manufacturing of cast components and the corresponding tooling is a crucial success factor for these companies. To achieve this, information and knowledge about the design, planning and manufacturing of cast components needs to be accessible in a fast and structured way.
Abstract:
The impact of health promotion programs is related to both program effectiveness and the extent to which the program is implemented among the target population. The purpose of this dissertation was to describe the development and evaluation of a school-based program diffusion intervention designed to increase the rate of dissemination and adoption of the Child and Adolescent Trial for Cardiovascular Health, or CATCH program (recently renamed the Coordinated Approach to Child Health). The first study described the process by which schools across the state of Texas spontaneously began to adopt the CATCH program after it was tested and proven effective in a multi-site randomized efficacy trial. A survey of teachers and administrator representatives of all schools on record that purchased the CATCH program, but were not involved in the efficacy trial, was used to find out who brought CATCH into the schools, how they garnered support for its adoption, why they decided to adopt the program, and what was involved in deciding to adopt. The second study described how the Intervention Mapping framework guided the planning, development and implementation of a program for the diffusion of CATCH. An iterative process was used to integrate theory, literature, the experience of project staff and data from the target population into a meaningful set of program determinants and performance objectives. Proximal program objectives were specified and translated into both media and interpersonal communication strategies for program diffusion. The third study assessed the effectiveness of the diffusion program in a case-comparison design. Three of the twenty Education Service Center regions in Texas, selected on the basis of similar demographic criteria, were followed for adoption of the CATCH curriculum. One of these regions received the full media and interpersonal channel intervention; a second received a reduced media-only intervention; and a third received no intervention. Results suggested that the use of interpersonal channels with media follow-up is an effective means to facilitate program dissemination and adoption. The media-alone condition was not effective in facilitating program adoption.
Abstract:
The paper deals with the problem of designing intelligent systems for complex environments. The possibility of integrating several technologies into one basic structure is discussed, and one possible structure is proposed to form the basis for an intelligent system able to operate in complex environments. The basic elements of the proposed structure have been implemented in a software system, which is briefly presented in the paper. The most important experimental results are outlined and discussed at the end of the paper, and some possible directions for further research are sketched.
Abstract:
This thesis, titled Governance and Community Capitals, explores the kinds of practical processes that have made governance work in three faith-based schools in the Western Highlands of Papua New Guinea (PNG). To date, the nation of PNG has been unable to meet its stated educational goals; however, some faith-based primary schools have overcome educational challenges by changing their local governance systems. What constitutes good governance in developing countries, and how it can be achieved in a PNG schooling context, has received very little scholarly attention. In this study, the subject of governance is approached at the nexus between the administrative sciences and asset-based community development. In this space, the researcher provides an understanding of the contribution that community capitals have made to understandings of local forms of governance in the development context. By and large, however, conceptions of governance have a history of being positioned within a Euro-centric frame, and very little, if anything, is known about the naming of capitals by indigenous peoples. In this thesis, six indigenous community capitals are made visible, expanding the repertoire of extant capitals published to date. The capitals identified and named in this thesis are: Story, Wisdom, Action, Blessing, Name and Unity. In-depth insights into these capitals are provided, and through the theoretical idea of performativity the researcher advances an understanding of how the habitual enactment of the practical components of the capitals made governance work in this unique setting. The study draws on a grounded and appreciative methodology and is based on a case study design incorporating a three-stage cycle of investigation. The first stage tested the application of an assets-based method to documentary sources of data, including most significant change stories, community mapping and visual diaries. In the second stage, a group process method relevant to a PNG context was developed and employed. The third stage involved building theory from case study evidence using content analysis, language and metaphorical speech acts as guides for complex analysis. The thesis demonstrates the contribution that indigenous community capitals can make to understanding local forms of governance and how PNG faith-based schools meet their local governance challenges.
Abstract:
Power efficiency is one of the most important constraints in the design of embedded systems, since such systems are generally driven by batteries with a limited energy budget or a restricted power supply. In every embedded system, there are one or more processor cores to run the software and interact with the other hardware components of the system. The power consumption of the processor core(s) has an important impact on the total power dissipated in the system; hence, processor power optimization is crucial for satisfying the power consumption constraints and developing low-power embedded systems. A key aspect of research in processor power optimization and management is power estimation. Having a fast and accurate method for processor power estimation at design time helps the designer to explore a large space of design possibilities and make optimal choices for developing a power-efficient processor. Likewise, understanding the processor power dissipation behaviour of a specific software application is the key to choosing appropriate algorithms in order to write power-efficient software. Simulation-based methods for measuring processor power achieve very high accuracy, but are available only late in the design process and are often quite slow. Therefore, the need has arisen for faster, higher-level power prediction methods that allow the system designer to explore many alternatives for developing power-efficient hardware and software. The aim of this thesis is to present fast, high-level power models for the prediction of processor power consumption. Power predictability in this work is achieved in two ways: first, using a design method to develop power-predictable circuits; second, analysing the power of the functions in the code which repeat during execution, then building the power model based on the average number of repetitions. In the first case, a design method called Asynchronous Charge Sharing Logic (ACSL) is used to implement the Arithmetic Logic Unit (ALU) for the 8051 microcontroller. ACSL circuits are power predictable because their power consumption is independent of the input data. Based on this property, a fast prediction method is presented to estimate the power of the ALU by analysing the software program and extracting the number of ALU-related instructions. This method achieves less than 1% error in power estimation and more than a 100-fold speedup in comparison to conventional simulation-based methods. In the second case, an average-case processor energy model is developed for the insertion sort algorithm based on the number of comparisons that take place in the execution of the algorithm. The average number of comparisons is calculated using a high-level methodology called MOdular Quantitative Analysis (MOQA). The parameters of the energy model are measured for the LEON3 processor core, but the model is general and can be used for any processor. The model has been validated through power measurement experiments, and offers high accuracy and orders-of-magnitude speedup over the simulation-based method.
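A minimal sketch of the second modelling idea described above, assuming that the dominant term in the expected comparison count of insertion sort on n random keys is the expected inversion count n(n-1)/4; the energy coefficients are invented placeholders standing in for values that would be measured on a core such as the LEON3.

```python
# Placeholder coefficients (nJ): a fixed per-call overhead and an average
# energy per comparison. Real values would come from hardware measurement.
E_BASE_NJ = 50.0
E_CMP_NJ = 1.2

def avg_comparisons(n):
    """Dominant term of insertion sort's average comparison count on n
    uniformly random keys: the expected number of inversions, n(n-1)/4."""
    return n * (n - 1) / 4

def predicted_energy_nj(n):
    """Average-case energy prediction: fixed cost plus per-comparison cost."""
    return E_BASE_NJ + E_CMP_NJ * avg_comparisons(n)

for n in (10, 100, 1000):
    print(f"n={n:5d}: ~{predicted_energy_nj(n):,.0f} nJ expected")
```

The point of such a model is speed: evaluating a closed-form expression replaces a full power simulation of every run of the algorithm.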
Abstract:
The authors present a proposal to develop intelligent assisted living environments for home-based healthcare. These environments unite the semantic representation of the chronic patient's clinical history with the ability to monitor living conditions and events, relying on a fully managed Semantic Web of Things (SWoT). Several levels of acquired knowledge, together with the case-based reasoning made possible by the knowledge representation of the health-disease history and the acquisition of scientific evidence, will deliver, through various voice-based natural interfaces, adequate support systems for disease self-management and, prominently, for activating the less differentiated caregiver for any specific need. With these capabilities at hand, home-based healthcare provision becomes a viable possibility, reducing the need for institutionalization. The resulting integrated healthcare framework will provide significant savings while improving general health and satisfaction indicators.
Abstract:
Background: Zenker's diverticulum (ZD) is a rare condition with a reported prevalence of 0.01% to 0.11% in the general population. Endoscopic treatment consists of the division of the septum between the diverticulum and the esophagus, within which the cricopharyngeal muscle is contained. Diathermic monopolar current, argon plasma coagulation, and laser have been used to incise the muscular septum with satisfactory results. The main limitation of endoscopic treatment is the occurrence of complications: perforation and hemorrhage are reported in as many as 23% and 10% of patients, respectively. Objective: The aim of this study was to use the technique of endoscopic diverticulotomy with a harmonic scalpel in patients with ZD and to demonstrate the feasibility of using flexible and rigid devices in ZD treatment. Design: Case series study. A standard protocol was used for patient management, endoscopic procedure, and data collection. Setting: Single endoscopist demonstrating preliminary results. Patients: Five patients (4 men; median [SD] age 69.6 ± 9.06 years, range 59-83 years) with ZD were treated with this technique. All patients reported dysphagia and halitosis. The diagnosis was based on clinical, endoscopic, and radiographic findings. Interventions: All patients received general anesthesia and were placed in the left lateral position. A standard videogastroscope (9.8 mm) and a stiff guidewire were used for insertion and to achieve adequate exposure of the ZD septum. The septum was divided using a harmonic scalpel under thin endoscope (5.2 mm) visualization through a soft diverticuloscope. Main Outcome Measurement: Feasibility of an endoscopic technique using rigid and flexible devices to treat ZD. Results: Four patients (80%) were successfully treated in 1 session. The median [SD] size of the diverticulum was 3.6 ± 0.89 cm (range 3-5 cm). Median [SD] procedure time was 17.33 ± 2.33 minutes (range 15-20 minutes) over 6 procedures. No hemorrhage or perforation occurred. One patient (20%) required a second session to complete dissection of the ZD septum. All patients demonstrated improvement in dysphagia score after treatment. Limitations: Small case series design. Conclusions: Endoscopic treatment of ZD by harmonic scalpel through a soft diverticuloscope was feasible and effective in this small case series. Larger studies are warranted to further evaluate this technique.
Abstract:
Object: The goal of this paper is to analyze the extension and relationships of glomus jugulare tumors with the temporal bone and the results of their surgical treatment aiming at preservation of the facial nerve. Based on the tumor extension and its relationships with the facial nerve, new criteria to be used in the selection of different surgical approaches are proposed. Methods: Between December 1997 and December 2007, 34 patients (22 female and 12 male) with glomus jugulare tumors were treated. Their mean age was 48 years. The mean follow-up was 52.5 months. Clinical findings included hearing loss in 88%, swallowing disturbance in 50%, and facial nerve palsy in 41%. Magnetic resonance imaging demonstrated a mass in the jugular foramen in all cases, a mass in the middle ear in 97%, a cervical mass in 85%, and an intradural mass in 41%. The tumor was supplied by the external carotid artery in all cases, the internal carotid artery in 44%, and the vertebral artery in 32%. Preoperative embolization was performed in 15 cases. The approach was tailored to each patient, and 4 types of approaches were designed. The infralabyrinthine retrofacial approach (Type A) was used in 32.5% of the patients; the infralabyrinthine pre- and retrofacial approach without occlusion of the external acoustic meatus (Type B) in 20.5%; the infralabyrinthine pre- and retrofacial approach with occlusion of the external acoustic meatus (Type C) in 41%; and the infralabyrinthine approach with transposition of the facial nerve and removal of the middle ear structures (Type D) in 6%. Results: Radical removal was achieved in 91% of the cases and partial removal in 9%. Among 20 patients without preoperative facial nerve dysfunction, the nerve was kept in anatomical position in 19 (95%), and facial nerve function was normal during the immediate postoperative period in 17 (85%). Six patients (17.6%) had a new lower cranial nerve deficit, but recovery of swallowing function was adequate in all cases. Voice disturbance remained in all 6 cases. Cerebrospinal fluid leakage occurred in 6 patients (17.6%), with no need for reoperation in any of them. One patient died in the postoperative period due to pulmonary complications. Global recovery, based on the Karnofsky Performance Scale (KPS), was 100% in 15% of the patients, 90% in 45%, 80% in 33%, and 70% in 6%. Conclusions: Radical removal of glomus jugulare tumors can be achieved without anterior transposition of the facial nerve. The extent of dissection, however, should be tailored to each case based on tumor blood supply, preoperative symptoms, and tumor extension. The operative field provided by the retrofacial infralabyrinthine approach, or the pre- and retrofacial approaches, with or without closure of the external acoustic meatus, allows a wide exposure of the jugular foramen area. Global functional recovery based on the KPS was acceptable in 94% of the patients. (DOI: 10.3171/2008.10.JNS08612)
Abstract:
General practitioners wanting to practise evidence-based medicine (EBM) are constrained by time factors and the great diversity of clinical problems they deal with. They need experience in knowing what questions to ask, in locating and evaluating the evidence, and in applying it. Conventional searching for the best evidence can be achieved in daily general practice. Sometimes the search can be performed during the consultation, but more often it can be done later and the patient can return for the result. Case-based journal clubs provide a supportive environment for GPs to work together to find the best evidence at regular meetings. An evidence-based literature search service is being piloted to enhance decision-making for individual patients. A central facility provides the search and interprets the evidence in relation to individual cases. A request form and a results format make the service akin to pathology testing or imaging. Using EBM in general practice appears feasible. Major difficulties still exist before it can be practised by all GPs, but it has the potential to change the way doctors update their knowledge.