975 results for data capture
Abstract:
This article reports on the internet-based second multicenter study (MCS II) of the spine study group (AG WS) of the German trauma association (DGU). It represents a continuation of the first study, conducted between 1994 and 1996 (MCS I). For a common, centralised data capture methodology, a newly developed internet-based data collection system (http://www.memdoc.org) of the Institute for Evaluative Research in Orthopaedic Surgery of the University of Bern was used. The aim of this first publication on MCS II was to describe in detail the new method of data collection via the internet and the structure of the developed database system. The goal of the study was to assess the current state of treatment for fresh traumatic injuries of the thoracolumbar spine in the German-speaking part of Europe. For that reason, we intended to collect a large number of cases and representative, valid information about the radiographic, clinical and subjective treatment outcomes. Thanks to the new study design of MCS II, not only the common surgical treatment concepts but also the new and constantly broadening spectrum of spine surgery, i.e. vertebro-/kyphoplasty, computer-assisted surgery and navigation, and minimally invasive and endoscopic techniques, could be documented and evaluated. We present a first statistical overview and preliminary analysis of the 18 centers from Germany and Austria that participated in MCS II. Real-time data capture at source was made possible by the constant availability of the data collection system via internet access. Following the principle of an application service provider, software, questionnaires and validation routines are located on a central server, which is accessed from the periphery (hospitals) by means of standard internet browsers.
By that, costly and time-consuming software installation and maintenance of local data repositories are avoided and, more importantly, cumbersome migration of data into one integrated database becomes obsolete. Finally, this set-up also replaces traditional systems in which paper questionnaires were mailed to the central study office and entered by hand, where incomplete or incorrect forms always represented a resource-consuming problem and source of error. With the new study concept and the expanded inclusion criteria of MCS II, 1,251 case histories with admission and surgical data were collected. This remarkable number of interventions documented during 24 months represents an increase of 183% compared to the previously conducted MCS I. The concept and technical feasibility of the MEMdoc data collection system were proven, as the participants of MCS II succeeded in collecting the largest series of patients with spinal injuries treated within a 2-year period ever published.
Abstract:
The first appearance of skeletal metazoans in the late Ediacaran (~550 million years ago; Ma) has been linked to the widespread development of oxygenated oceanic conditions, but a precise spatial and temporal reconstruction of their evolution has not been resolved. Here we consider the evolution of ocean chemistry from ~550 to ~541 Ma across shelf-to-basin transects in the Zaris and Witputs Sub-Basins of the Nama Group, Namibia. New carbon isotope data capture the final stages of the Shuram/Wonoka deep negative C-isotope excursion, and these are complemented with a reconstruction of water column redox dynamics utilising Fe-S-C systematics and the distribution of skeletal and soft-bodied metazoans. Combined, these inter-basinal datasets provide insight into the potential role of ocean redox chemistry during this pivotal interval of major biological innovation. The strongly negative δ13C values in the lower parts of the sections reflect both a secular, global change in the C-isotopic composition of Ediacaran seawater, as well as the influence of 'local' basinal effects as shown by the most negative δ13C values occurring in the transition from distal to proximal ramp settings. Critical, though, is that the transition to positive δ13C values postdates the appearance of calcified metazoans, indicating that the onset of biomineralization did not occur under post-excursion conditions. Significantly, we find that anoxic and ferruginous deeper water column conditions were prevalent during and after the transition to positive δ13C that marks the end of the Shuram/Wonoka excursion. Thus, if the C isotope trend reflects the transition to global-scale oxygenation in the aftermath of the oxidation of a large-scale, isotopically light organic carbon pool, it was not sufficient to fully oxygenate the deep ocean. Both sub-basins reveal highly dynamic redox structures, where shallow, inner ramp settings experienced transient oxygenation.
Anoxic conditions were caused either by episodic upwelling of deeper anoxic waters or higher rates of productivity. These settings supported short-lived and monospecific skeletal metazoan communities. By contrast, microbial (thrombolite) reefs, found in deeper inner- and mid-ramp settings, supported more biodiverse communities with complex ecologies and large skeletal metazoans. These long-lived reef communities, as well as Ediacaran soft-bodied biotas, are found particularly within transgressive systems, where oxygenation was persistent. We suggest that a mid-ramp position enabled physical ventilation mechanisms for shallow water column oxygenation to operate during flooding and transgressive sea-level rise. Our data support a prominent role for oxygen, and for stable oxygenated conditions in particular, in controlling both the distribution and ecology of Ediacaran skeletal metazoan communities.
Abstract:
Studies carried out to date on the measurement quality of geodetic instruments have focused mainly on angle and distance measurements. In recent years, however, GNSS (Global Navigation Satellite System) equipment has come into widespread use in geomatic applications without any established methodology for obtaining the calibration correction and its uncertainty for such equipment. The purpose of this Thesis is to establish the requirements a network must satisfy to be considered a Standard Network with metrological traceability, as well as the methodology for the verification and calibration of GNSS instruments on standard networks. To this end, a technical calibration procedure for GNSS equipment was designed and developed, defining the contributions to measurement uncertainty. The procedure, applied on different networks and to different equipment, yielded the expanded uncertainty of that equipment following the recommendations of the Guide to the Expression of Uncertainty in Measurement of the Joint Committee for Guides in Metrology. Likewise, the three-dimensional coordinates of the stations forming the networks considered in the research were determined by satellite observation techniques, and simulations were developed for various values of the experimental standard deviations of the fixed points used in the least-squares adjustment of the vectors or baselines. The results highlighted the importance of knowing the experimental standard deviations when computing the uncertainties of the stations' three-dimensional coordinates.
Building on earlier studies and observations of high technical quality carried out on these networks, an exhaustive analysis was performed to determine the conditions a standard network must satisfy. In addition, technical calibration procedures were designed to compute the expanded measurement uncertainty of geodetic instruments that provide angles and electromagnetically measured distances, since it is these instruments that disseminate metrological traceability to the standard networks used for the verification and calibration of GNSS equipment. In this way it was possible to determine local calibration corrections for high-accuracy GNSS equipment on the standard networks. In this Thesis, the uncertainty of the calibration correction was obtained by two different methodologies: the first applied the propagation of uncertainties, while the second applied the Monte Carlo method for the simulation of random variables. Analysis of the results confirms the validity of both methodologies for determining the calibration uncertainty of GNSS instruments. ABSTRACT The studies carried out so far to determine the measurement quality of geodetic instruments have been aimed primarily at angle and distance measurements. However, in recent years it has become accepted practice to use GNSS (Global Navigation Satellite System) equipment for data capture in geomatic applications without an established methodology for obtaining the calibration correction and its uncertainty. The purpose of this Thesis is to establish the requirements that a network must meet to be considered a Standard Network with metrological traceability, as well as the methodology for the verification and calibration of GNSS instruments on those standard networks.
To do this, a technical calibration procedure has been designed and developed for GNSS equipment, defining the contributions to the uncertainty of measurement. The procedure, which has been applied on different networks for different equipment, has allowed determining the expanded uncertainty of such equipment following the recommendations of the Guide to the Expression of Uncertainty in Measurement of the Joint Committee for Guides in Metrology. In addition, the three-dimensional coordinates of the stations which constitute the networks considered in the investigation have been determined by satellite-based techniques. Several simulations have been developed based on different values of the experimental standard deviations of the fixed points used in the least-squares adjustment of the vectors or baselines. The results have shown the importance that knowledge of the experimental standard deviations has in the calculation of the uncertainties of the stations' three-dimensional coordinates. Based on high technical quality studies and observations carried out previously in these networks, it has been possible to make an exhaustive analysis that has allowed determining the requirements that a standard network must meet. In addition, technical calibration procedures have been developed to allow estimation of the measurement uncertainty of geodetic instruments that provide angles and distances obtained by electromagnetic methods. These instruments provide the metrological traceability to the standard networks used for verification and calibration of GNSS equipment. As a result, it has been possible to estimate local calibration corrections for high-accuracy GNSS equipment on standard networks.
In this Thesis, the uncertainty of the calibration correction has been calculated using two different methodologies: the first applies the law of propagation of uncertainty, while the second applies the propagation of distributions using the Monte Carlo method. The analysis of the obtained results confirms the validity of both methodologies for estimating the calibration uncertainty of GNSS equipment.
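The propagation-of-distributions approach described in this abstract can be illustrated with a minimal sketch: each input quantity is sampled from its assigned distribution, the measurement model is evaluated for every sample, and the output's standard uncertainty and coverage interval are read off the resulting empirical distribution. The baseline-difference model and the numerical uncertainty values below are invented for illustration, not taken from the thesis; normal input distributions are a simplifying assumption.

```python
import random
import statistics

def monte_carlo_uncertainty(model, inputs, trials=100_000, seed=1):
    """Propagate input distributions through `model`.

    `inputs` maps argument names to (mean, standard_uncertainty);
    each input is sampled from a normal distribution (an assumption
    of this sketch). Returns the output mean, its standard
    uncertainty, and an empirical 95 % coverage interval.
    """
    rng = random.Random(seed)
    outputs = []
    for _ in range(trials):
        sample = {name: rng.gauss(mu, u) for name, (mu, u) in inputs.items()}
        outputs.append(model(**sample))
    mean = statistics.fmean(outputs)
    u = statistics.stdev(outputs)       # standard uncertainty of the output
    outputs.sort()
    lo = outputs[int(0.025 * trials)]   # 2.5th percentile
    hi = outputs[int(0.975 * trials)]   # 97.5th percentile
    return mean, u, (lo, hi)

# Hypothetical example: a baseline length taken as the difference of two
# measured coordinates, each with a 2 mm standard uncertainty.
mean, u, interval = monte_carlo_uncertainty(
    lambda x1, x2: x2 - x1,
    {"x1": (100.000, 0.002), "x2": (612.345, 0.002)},
)
```

For this linear model the Monte Carlo result should agree with the law of propagation of uncertainty (u ≈ 0.002·√2), which is one way the two methodologies can be cross-checked.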
Abstract:
RESUMEN Cardiovascular diseases are currently the leading cause of mortality worldwide and are expected to remain so, while also generating high costs for health systems. Implantable cardiac devices are one of the options for the diagnosis and treatment of cardiac rhythm disorders. Clinical research with these devices is highly relevant to combating diseases that so deeply affect our society. The pharmaceutical and medical technology industries, as well as investigators themselves, are involved every day in a growing number of clinical research projects. Not only the increase in volume but also the increase in complexity is generating higher expenditure on the activities associated with medical research. This is leading companies in the healthcare sector to study new solutions that allow them to reduce the costs of clinical studies. Information and Communication Technologies have facilitated clinical research, especially in the last decade. Electronic systems and applications have provided new possibilities for the acquisition, processing and analysis of data. Web technology, in turn, gave rise to the first electronic data capture systems, which have evolved over recent years. Nevertheless, the improvement and refinement of these systems remains crucial for the progress of clinical research. Furthermore, the traditional way of conducting clinical studies with implantable cardiac devices needed better handling of the data stored by these devices, as well as of their merging with the clinical data collected by investigators and patients.
The rationale for this research work is the need to improve the efficiency of clinical research with implantable cardiac devices by reducing project costs and development times, increasing the quality of the collected data, and designing solutions that extract greater value from the data by merging data from different sources or studies. To this end, two new models are proposed as the specific objectives of this research project: - A data retrieval and processing model for clinical studies with implantable cardiac devices, which structures and standardizes these procedures in order to reduce the development time of these tasks and improve the quality of the results, consequently lowering costs. - A metrics model integrated into an Electronic Data Capture (EDC) system, which allows analysis of the results of the research project, and particularly of the performance obtained from the EDC, in order to refine these systems, reduce project development times and costs, and improve the quality of the collected clinical data. As a result of this research, the proposed processing model has reduced average data processing time by more than 90% and the associated costs by more than 85%, all thanks to the automation of data extraction and storage, while also improving data quality. The metrics model, in turn, enables a detailed descriptive analysis of the different indicators that characterize the performance of a clinical research project, and also makes comparison between studies feasible.
The conclusion of this doctoral thesis is that the results obtained have shown that using the two developed models in real clinical studies has improved project efficiency, reducing overall costs, shortening execution times, and increasing the quality of the collected data. The main contributions of this research work to scientific knowledge are the implementation of an intelligent processing system for the data stored by implantable cardiac devices, its integration with a global database optimized for all device models, the automated generation of a unified repository of clinical data and implantable cardiac device data, and the design of a metric that can be applied to and integrated into electronic data capture systems to analyze the performance results of clinical research projects. ABSTRACT Cardiovascular diseases are the main cause of death worldwide and are expected to remain so, generating high costs for healthcare systems. Implantable cardiac devices have become one of the options for diagnosis and treatment of cardiac rhythm disorders. Clinical research with these devices has acquired great importance in the fight against diseases that affect so many people in our society. Both pharmaceutical and medical technology companies, and also investigators, are involved in an increasing number of clinical research projects. The growth in volume and the increase in medical research complexity are raising the expenditure associated with clinical investigation. This situation is driving healthcare sector companies to explore new solutions to reduce clinical trial costs. Information and Communication Technologies have facilitated clinical research, mainly in the last decade.
Electronic systems and software applications have provided new possibilities in the acquisition, processing and analysis of clinical study data. On the other hand, web technology contributed to the appearance of the first electronic data capture systems, which have evolved over recent years. Nevertheless, improvement of these systems is still a key aspect for the progress of clinical research. On a different matter, the traditional way of developing clinical studies with implantable cardiac devices needed an improvement in the processing of the data stored by these devices, and also in the merging of these data with the data collected by investigators and patients. The rationale of this research is based on the need to improve the efficiency of clinical investigation with implantable cardiac devices by means of reductions in project costs and development time, improvement in the quality of the information obtained from the studies, and better exploitation of the data through the merging of data from different sources or trials. The objective of this research project is to develop the following two models: • A model for the retrieval and processing of data for clinical studies with implantable cardiac devices, enabling the structuring and standardization of these procedures, in order to reduce the time needed to develop these tasks, improve the quality of the results and therefore diminish costs. • A metrics model integrated into an Electronic Data Capture (EDC) system that allows analysis of the results of the research project, and particularly of the EDC performance, in order to improve those systems, reduce project time and costs, and obtain better quality in the collected clinical data.
As a result of this work, the proposed processing model has led to a reduction of the average time for data processing by more than 90 per cent, and of the related costs by more than 85 per cent, all through automatic data retrieval and storage, achieving an improvement in data quality. On the other hand, the metrics model makes possible a detailed descriptive analysis of a set of indicators that characterize the performance of each research project, allowing inter-study comparison. The results of this doctoral thesis have demonstrated that the application of the two developed models in real clinical trials has led to an improvement in project efficiency, reducing global costs, diminishing execution time, and increasing the quality of the collected data. The main contributions of this research work to scientific knowledge are the implementation of an intelligent processing system for the data stored by implantable cardiac devices, the integration in this system of a global, optimized database for all device models, the automatic creation of a unified repository of clinical data and data stored by medical devices, and the design of a metric to be applied and integrated in electronic data capture systems to analyze the performance results of clinical research projects.
Abstract:
Whole genome linkage analysis of type 1 diabetes using affected sib pair families and semi-automated genotyping and data capture procedures has shown how type 1 diabetes is inherited. A major proportion of clustering of the disease in families can be accounted for by sharing of alleles at susceptibility loci in the major histocompatibility complex on chromosome 6 (IDDM1) and at a minimum of 11 other loci on nine chromosomes. Primary etiological components of IDDM1, the HLA-DQB1 and -DRB1 class II immune response genes, and of IDDM2, the minisatellite repeat sequence in the 5' regulatory region of the insulin gene on chromosome 11p15, have been identified. Identification of the other loci will involve linkage disequilibrium mapping and sequencing of candidate genes in regions of linkage.
Abstract:
Psoriatic arthritis is a multisystem disorder which, from a measurement standpoint, demands consideration of its cutaneous manifestations and both axial and peripheral musculoskeletal involvement. Measurements of various aspects of impairment, ability/disability, and participation/handicap are feasible using existing measurement techniques, which are for the most part valid, reliable, and responsive. Nevertheless, there remain opportunities for the further development of consensus around core set measures and responder criteria, as well as for instrument development and refinement, standardised assessor training, cross-cultural adaptation of health status questionnaires, electronic data capture, and the introduction of standardised quantitative measurement into routine clinical care.
Abstract:
Initially the study focussed on the factors affecting the ability of the police to solve crimes. An analysis of over twenty thousand police deployments revealed the proportion of time spent investigating crime, contrasted with its perceived importance and the time spent on other activities. The fictional portrayal of skills believed important in successful crime investigation was identified and compared to the professional training and 'taught skills' given to police officers and detectives. Police practitioners and middle management provided views on the skills needed to solve crimes. The relative importance of the forensic science role, fingerprint examination and interrogation skills was contrasted with changes in police methods resulting from the Police and Criminal Evidence Act and its effect on confessions. The study revealed that existing police systems for investigating crime, specifically excluding cases of murder and other serious offences, were unsystematic, uncoordinated, unsupervised and unproductive in their use of police resources. The study examined relevant and contemporary research in the United States and United Kingdom and, with organisational support, introduced an experimental system of data capture and initial investigation with features of case screening and management. Preliminary results indicated increases in the collection of essential information and more effective use of investigative resources. Within the managerial framework of this study, research has been undertaken in the knowledge elicitation area as a basis for an expert system of crime investigation, and into the potential organisational benefits of utilising the laptop computer in the first stages of data gathering and investigation. The conclusions demonstrate the need for a totally integrated system of criminal investigation with emphasis on an organisational rather than individual response.
In some areas the evidence produced is sufficient to warrant replication; in others, additional research is needed to explore further the concepts and proposed systems pioneered by this study.
Abstract:
In recent years there has been a great effort to combine the technologies and techniques of GIS and process models. This project examines the issues of linking a standard current-generation 2.5D GIS with several existing model codes. The focus for the project has been the Shropshire Groundwater Scheme, which is being developed to augment flow in the River Severn during drought periods by pumping water from the Shropshire Aquifer. Previous authors have demonstrated that under certain circumstances pumping could reduce the soil moisture available for crops. This project follows earlier work at Aston in which the effects of drawdown were delineated and quantified through the development of a software package implementing a technique that brought together the significant spatially varying parameters. This technique is repeated here, but using a standard GIS called GRASS. The GIS proved adequate for the task, and the added functionality provided by the general-purpose GIS - its data capture, manipulation and visualisation facilities - was of great benefit. The bulk of the project is concerned with examining the issues of the linkage of GIS and environmental process models. To this end a groundwater model (Modflow) and a soil moisture model (SWMS2D) were linked to the GIS, and a crop model was implemented within the GIS. A loose-linked approach was adopted, and secondary and surrogate data were used wherever possible.
The implications of this relate to: justification of a loose-linked versus a closely integrated approach; how, technically, to achieve the linkage; how to reconcile the different data models used by the GIS and the process models; control of the movement of data between models of environmental subsystems in order to model the total system; the advantages and disadvantages of using a current-generation GIS as a medium for linking environmental process models; generation of input data, including the use of geostatistics, stochastic simulation, remote sensing, regression equations and mapped data; issues of accuracy, uncertainty and simply providing adequate data for the complex models; and how such a modelling system fits into an organisational framework.
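The loose-linked approach described above can be sketched as plain file exchange: the GIS exports a parameter grid in a simple text format, an external model code is run against it, and the model's output grid is read back into the GIS. The ASCII-grid layout, the helper names, and the stand-in "model" below are illustrative assumptions, not the project's actual interface to GRASS, Modflow or SWMS2D.

```python
import tempfile
from pathlib import Path

def export_ascii_grid(path, grid, cellsize=100.0):
    """Write a 2-D list of values as a minimal ESRI-style ASCII grid."""
    rows, cols = len(grid), len(grid[0])
    header = (f"ncols {cols}\nnrows {rows}\n"
              f"xllcorner 0.0\nyllcorner 0.0\n"
              f"cellsize {cellsize}\nNODATA_value -9999\n")
    body = "\n".join(" ".join(str(v) for v in row) for row in grid)
    Path(path).write_text(header + body + "\n")

def import_ascii_grid(path):
    """Read the grid back, skipping the six header lines."""
    lines = Path(path).read_text().splitlines()
    return [[float(v) for v in line.split()] for line in lines[6:]]

def stand_in_model(grid):
    """Placeholder for an external code such as Modflow: it simply
    lowers every head value by 1.0 to mimic drawdown."""
    return [[v - 1.0 for v in row] for row in grid]

# Round trip: GIS -> file -> model -> file -> GIS
with tempfile.TemporaryDirectory() as d:
    export_ascii_grid(f"{d}/heads.asc", [[10.0, 10.5], [11.0, 11.5]])
    heads = import_ascii_grid(f"{d}/heads.asc")
    export_ascii_grid(f"{d}/drawdown.asc", stand_in_model(heads))
    result = import_ascii_grid(f"{d}/drawdown.asc")
```

The appeal of this arrangement is that each code keeps its own data model and only the exchange format has to be agreed; the cost is that every transfer loses metadata, which is one of the reconciliation issues listed above.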
Abstract:
The principal theme of this thesis is the advancement and expansion of ophthalmic research via collaboration between professional Engineers and professional Optometrists. The aim has been to develop new and novel approaches and solutions to contemporary problems in the field. The work is subdivided into three areas of investigation: 1) high-technology systems, 2) modification of current systems to increase functionality, and 3) development of smaller, more portable and cost-effective systems. High Technology Systems: A novel high-speed Optical Coherence Tomography (OCT) system with integrated simultaneous high-speed photography was developed, achieving better operational speed than is currently available commercially. The mechanical design of the system featured a novel 8-axis alignment system. A full set of capture, analysis and post-processing software was developed, providing custom analysis systems for ophthalmic OCT imaging and expanding the current capabilities of the technology. A large clinical trial was undertaken to test the dynamics of contact lens edge interaction with the cornea in vivo. The interaction between lens edge design, lens base curvature, post-insertion times and edge positions was investigated. A novel method for correction of optical distortion when assessing lens indentation was also demonstrated. Modification of Current Systems: A commercial autorefractor, the WAM-5500, was modified with the addition of extra hardware and a custom software and firmware solution to produce a system capable of measuring dynamic accommodative response to various stimuli in real time. A novel software package to control the data capture process was developed, allowing real-time monitoring of data by the practitioner and adding considerable functionality beyond the standard system.
The device was used to assess the accommodative response differences between subjects who had worn UV-blocking contact lenses for 5 years versus a control group that had not worn UV-blocking lenses. While the standard static measurement of accommodation showed no differences between the two groups, the UV-blocking group did show better (faster) accommodative rise and fall times, thus demonstrating the benefits of the modification of this commercially available instrumentation. Portable and Cost-Effective Systems: A new instrument was developed to expand the capability of the now-defunct Keeler Tearscope. The device provided a similar capability, allowing observation of the reflected mires from the tear film surface, but with the added advantage of being able to record the observations. The device was tested comparatively against the Tearscope and other tear film break-up techniques, demonstrating its potential. In Conclusion: This work has successfully demonstrated the advantages of interdisciplinary research between engineering and ophthalmic research, providing new and novel instrumented solutions as well as adding to the sum of scientific understanding in the ophthalmic field.
Abstract:
This chapter looks at how the current global economic crisis has impacted the global automotive industry from an operations and supply chain perspective. It presents an empirical and theoretical background to help long- and short-term planning for organisations experiencing adverse trading conditions. The empirical research study (conducted between 2004 and 2007, primarily in Germany) revealed that organisations are able to make short-term improvements to performance by reducing costs and making process and structural improvements, but in the long term the deeper-rooted problems of the industry can only partly be dealt with by improving interfirm R&D collaborations based upon competencies rather than cost-related issues. A new approach known as Collaborative Enterprise Governance is presented, which supports the design and management of competitive, sustainable enterprises; it consists of a data capture tool, a body of knowledge and a dynamic reference grid showing how many part-to-part company relationships can exist simultaneously to make up product-process focused enterprises. Examples from the German automotive industry are given, the impact upon the overall product development lifecycle is considered, and the implications for organisational strategists are discussed. © 2010 Nova Science Publishers, Inc. All rights reserved.
Abstract:
Increasingly, lab evaluations of mobile applications are incorporating mobility. The inclusion of mobility alone, however, is insufficient to generate a realistic evaluation context, since real-life users will typically be required to monitor their environment while moving through it. While field evaluations represent a more realistic evaluation context, such evaluations pose difficulties, including data capture and environmental control, which mean that a lab-based evaluation is often a more practical choice. This paper describes a novel evaluation technique that mimics a realistic mobile usage context in a lab setting. The technique requires that participants monitor their environment and change the route they are walking to avoid dynamically changing hazards (much as real-life users would be required to do). Two studies that employed this technique are described, and the results (which indicate the technique is useful) are discussed.
Abstract:
This chapter presents Radio Frequency Identification (RFID), one of the Automatic Identification and Data Capture (AIDC) technologies (Wamba and Boeck, 2008), and discusses the application of RFID in E-Commerce. First, RFID is defined and the tag and reader components of the RFID system are explained. Then the historical context of RFID is briefly discussed. Next, RFID is contrasted with other AIDC technologies, especially barcodes, which are commonly applied in E-Commerce. Lastly, RFID applications in E-Commerce are discussed, with a focus on achievable benefits, obstacles to the successful application of RFID in E-Commerce, and ways to alleviate them.
Abstract:
The miniaturization, sophistication, proliferation, and accessibility of technologies are enabling the capture of more and previously inaccessible phenomena in Parkinson's disease (PD). However, more information has not translated into a greater understanding of disease complexity to satisfy diagnostic and therapeutic needs. Challenges include noncompatible technology platforms, the need for wide-scale and long-term deployment of sensor technology (among vulnerable elderly patients in particular), and the gap between the "big data" acquired with sensitive measurement technologies and their limited clinical application. Major opportunities could be realized if new technologies are developed as part of open-source and/or open-hardware platforms that enable multichannel data capture sensitive to the broad range of motor and nonmotor problems that characterize PD and are adaptable into self-adjusting, individualized treatment delivery systems. The International Parkinson and Movement Disorders Society Task Force on Technology is entrusted to convene engineers, clinicians, researchers, and patients to promote the development of integrated measurement and closed-loop therapeutic systems with high patient adherence that also serve to (1) encourage the adoption of clinico-pathophysiologic phenotyping and early detection of critical disease milestones, (2) enhance the tailoring of symptomatic therapy, (3) improve subgroup targeting of patients for future testing of disease-modifying treatments, and (4) identify objective biomarkers to improve the longitudinal tracking of impairments in clinical care and research. This article summarizes the work carried out by the task force toward identifying challenges and opportunities in the development of technologies with potential for improving the clinical management and the quality of life of individuals with PD. © 2016 International Parkinson and Movement Disorder Society.
Abstract:
This research focuses on the teaching of music in social programs, discussing the teaching concepts that permeate the educational-musical practice present in the Serviço de Convivência e Fortalecimento de Vínculos (SCFV) of the Complexo Dom Bosco, in Natal-RN. The objective is to reflect upon concepts of music teaching and the relations between musical knowledge and culture. To this end, the concepts of music teaching and learning in governmental social projects were problematized through theoretical and empirical research. Next, the cultural aspects of institutional routine that influenced music learning at the Serviço de Convivência e Fortalecimento de Vínculos were studied, along with how these aspects shape the paradigms surrounding music teaching. A qualitative approach was adopted, with a case study as the research design. As data capture tools, ethnographic writing, photography, interviews conducted in the role of music facilitator with the students of the program, and video recordings of musical learning situations were used. The theoretical framework drew on authors who study complexity, culture and music teaching in social projects. Finally, the conclusion is that musical learning, in the SCFV context, is steeped in cultural conceptions tied both to the Oratório Dom Bosco space and to Brazilian social assistance policy. At times these conceptions are contradictory: discipline, leadership and combating social exclusion refer to the dialogical cultural hologram of the institutions involved in the music education process.
Abstract:
Background: Sickle Cell Disease (SCD) is a genetic hematological disorder that affects more than 7 million people globally (NHLBI, 2009). It is estimated that 50% of adults with SCD experience pain on most days, with one-third experiencing chronic pain daily (Smith et al., 2008). Persons with SCD also experience higher levels of pain catastrophizing (feelings of helplessness, pain rumination and magnification) than persons with other chronic pain conditions, which is associated with increases in pain intensity, pain behavior, analgesic consumption, and frequency and duration of hospital visits, and with reduced daily activities (Sullivan, Bishop, & Pivik, 1995; Keefe et al., 2000; Gil et al., 1992 & 1993). Effective interventions are therefore needed that can successfully be used to manage pain and pain-related outcomes (e.g., pain catastrophizing) in persons with SCD. First, a review of the literature demonstrated limited information regarding the feasibility and efficacy of non-pharmacological approaches for pain in persons with SCD, finding an average effect size of .33 on pain reduction across measurable non-pharmacological studies. Second, a prospective study of persons with SCD who received care for a vaso-occlusive crisis (VOC; N = 95) found: (1) high levels of patient-reported depression (29%) and anxiety (34%), and (2) that unemployment was significantly associated with increased frequency of acute care encounters and hospital admissions per person. Research suggests that one promising category of non-pharmacological interventions for managing both the physical and affective components of pain is Mindfulness-Based Interventions (MBIs; Thompson et al., 2010; Cox et al., 2013). The primary goal of this dissertation was thus to develop and test the feasibility, acceptability, and efficacy of a telephonic MBI for pain catastrophizing in persons with SCD and chronic pain.
Methods: First, a telephonic MBI was developed through an informal process that involved iterative feedback from patients, clinical experts in SCD and pain management, social workers, psychologists, and mindfulness clinicians. Through this process, relevant topics and skills were selected and adapted for each MBI session. Second, a pilot randomized controlled trial was conducted to test the feasibility, acceptability, and efficacy of the telephonic MBI for pain catastrophizing in persons with SCD and chronic pain. Acceptability and feasibility were determined by assessment of recruitment, attrition, dropout, and refusal rates (including refusal reasons), along with semi-structured interviews with nine randomly selected patients at the end of the study. Participants completed assessments at baseline and Weeks 1, 3, and 6 to assess the efficacy of the intervention in decreasing pain catastrophizing and other pain-related outcomes.
Results: A telephonic MBI is feasible and acceptable for persons with SCD and chronic pain. Seventy-eight patients with SCD and chronic pain were approached, and 76% (N = 60) were enrolled and randomized. The MBI attendance rate, with approximately 57% of participants completing at least four mindfulness sessions, was deemed acceptable, and participants who received the telephonic MBI described it in post-intervention interviews as acceptable and easy to access and consume. The amount of missing data was undesirable (MBI condition, 40%; control condition, 25%), but fell within the range of expected missing outcome data for an RCT with multiple follow-up assessments. Efficacy of the MBI on pain catastrophizing could not be determined due to the small sample size and degree of missing data, but trajectory analyses, conducted for the MBI condition only, trended in the expected direction, and the change in pain catastrophizing approached statistical significance.
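The feasibility figures above are simple proportions of the reported counts. A minimal sketch of that arithmetic (only the approached and enrolled counts come from the abstract; the helper function and variable names are illustrative assumptions):

```python
# Minimal sketch: feasibility proportions from the counts reported above.
# Only the approached (78) and enrolled (60) counts come from the abstract;
# the helper function and variable names are illustrative assumptions.

def rate(numerator: int, denominator: int) -> float:
    """Return numerator/denominator as a proportion, guarding against zero."""
    if denominator <= 0:
        raise ValueError("denominator must be positive")
    return numerator / denominator

approached = 78  # patients with SCD and chronic pain who were approached
enrolled = 60    # of those, enrolled and randomized

enrollment_rate = rate(enrolled, approached)  # ~0.77; reported as 76%
```

The same helper would apply to the attendance and missing-data rates, though the abstract does not report their exact denominators.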
Conclusion: Overall, results showed that a telephonic group-based MBI is acceptable and feasible for persons with SCD and chronic pain. Though the study was neither able to determine treatment efficacy nor powered to detect a statistically significant difference between conditions, (1) participants described the intervention as acceptable, and (2) the observed effect sizes for the MBI condition suggested large effects of the MBI on pain catastrophizing, mental health, and physical health. Replication of this MBI study with a larger sample size, an active control group, and additional assessments at the end of each week (e.g., Weeks 1 through 6) is needed to determine treatment efficacy. Many lessons were learned that will guide the development of future studies, including which MBI strategies were most helpful, methods to encourage continued participation, and how to improve data capture.