687 results for Practice Development
Abstract:
During recent years, an increasingly comprehensive set of rules and guidelines has been developed around clinical trials, to ensure their proper ethical, methodological, administrative and financial conduct. While initially limited to new drug development, this regulation is progressively invading all areas of clinical research, with limited respect for the heterogeneity in aims, resources, sponsors and epistemological grounds. No clinical study should be planned without consideration of a series of legal requirements, which are reviewed. Concerns about their practical implications are critically assessed.
Abstract:
Since 2004, four antiangiogenic drugs have been approved for clinical use in patients with advanced solid cancers, on the basis of their capacity to improve survival in phase III clinical studies. These achievements validated the concept introduced by Judah Folkman that the inhibition of tumor angiogenesis could control tumor growth. It has been suggested that biomarkers of angiogenesis would greatly facilitate the clinical development of antiangiogenic therapies. For these four drugs, the pharmacodynamic effects observed in early clinical studies were important to corroborate activity, but were not essential for the continuation of clinical development and approval. Furthermore, no validated biomarkers of angiogenesis or antiangiogenesis are available for routine clinical use. Thus, the quest for biomarkers of angiogenesis and their successful use in the development of antiangiogenic therapies remain challenges in clinical oncology and translational cancer research. Here we review critical points arising from the successful clinical trials, survey current biomarkers, and discuss their potential impact on improving the clinical use of available antiangiogenic drugs and the development of new ones.
Abstract:
BACKGROUND: Over the years, somatic care has become increasingly specialized. Furthermore, a rising number of patients requiring somatic care also present with a psychiatric comorbidity. As a consequence, the time and resources needed to care for these patients can interfere with the course of somatic treatment and influence the patient-caregiver relationship. In the light of these observations, the Liaison Psychiatry Unit at the University Hospital in Lausanne (CHUV) has educated its nursing staff in order to strengthen its action within the general hospital. What has been developed is a reflexive approach based on the supervision of somatic staff, intended to improve the efficiency of liaison psychiatry interventions with the caregivers in charge of patients. The kind of supervision we have developed is the result of a real partnership with somatic staff. In addition, in order to better understand the complexity of the interactions between the two systems involved, the patient's and the caregivers', we use several theoretical references in an integrative manner. PSYCHOANALYTICAL REFERENCE: The psychoanalytical model allows us to better understand the dynamics between the supervisor and the supervised group in order to contain and give meaning to the affects arising in the supervision space. "Containing function" and "transitional phenomena" refer to the experience in which emotions can find a space where they can be taken in and processed in a secure and supportive manner. These concepts, along with that of the "psychic envelope", were initially developed to explain the psychological development of the baby in its early interactions with its mother or a surrogate. In the field of supervision, they make us aware of these complex phenomena and of the diverse qualities to which a supervisor needs to resort, such as attention, support and encouragement, in order to offer a secure environment. SYSTEMIC REFERENCE: A new perspective on the patient's complexity is revealed by the group's dynamics. The supervisor's attention is mainly focused on the work of affects. However, these are often buried under a defensive shell, serving as a temporary protection, which prevents the caregiver from recognizing his or her own emotions, thereby compounding the difficulties in the relationship with the patient. Whenever the work of putting emotions into words fails, we use "sculpting", a technique derived from the systemic model. Through the use of this type of analogical language, affects can emerge without constraint or feelings of danger. Through "playing" in that "transitional space", new exchanges appear between group members and allow new behaviors to be conceived. In practice, we ask the supervisee who is presenting a complex situation to design a spatial representation of his or her understanding of the situation, through the display of characters significant to the situation: the patient, somatic staff members, relatives of the patient, etc. In silence, the supervisee shapes the characters into postures and arranges them in the room. Each sculpted character is identified, named, and positioned, with his or her gaze set in a specific direction. Finally, the sculptor places him- or herself in his or her own role. When the sculpture is complete, and after a few moments of stillness, we ask the participants to describe their experience. By means of this physical representation, participants in the sculpture discover perceptions and feelings that had previously gone unrecognized. From this analogical representation, reflections and interpretive hypotheses can thus arise and be developed within the group. CONCLUSION: Through the use of the concepts of "containing function" and "transitional space" we position ourselves within the scope of encounter and dialogue. Through the use of the systemic technique of "sculpting" we promote the process of understanding, rather than that of explaining, which would place us in the position of experts. The experience of these encounters has shown us that what we need to focus on is indeed what happens in this transitional space in terms of dynamics and process. The encounter and the sharing of competencies both allow a new understanding of the situation at hand, which has, of course, to be verified in the reality of the patient-caregiver relationship. It often enables the caregiving team to adjust its interpersonal approach and recover its containing function, so that caregivers can better respond to the patient's needs.
Abstract:
This literature review serves as a foundation for a transportation and land use public policy education program for Iowa. The objective of the review is to summarize relevant research findings, to review the state of practice and policies of other state and local governments, and to explore land use trends both within the state of Iowa and the nation as a whole. Much of what we learned has been incorporated into the course materials. Because we expect to identify more useful sources throughout the project, this literature review should be considered a work in progress.
Abstract:
To enable a mathematically and physically sound execution of the fatigue test and a correct interpretation of its results, statistical evaluation methods are used to assist in the analysis of fatigue testing data. The main objective of this work is to develop step-by-step instructions for the statistical analysis of laboratory fatigue data. The scope of this project is to provide practical cases that answer the questions typically raised in the treatment of test data, applying the methods and formulae of document IIW-XIII-2138-06 (Best Practice Guide on the Statistical Analysis of Fatigue Data). In general, the questions in the data sheets involve the following aspects: estimation of the necessary sample size, verification of the statistical equivalence of collated data sets, and determination of characteristic curves in different cases. The series of comprehensive examples given in this thesis demonstrates the various statistical methods and supports the development of a sound procedure for deriving reliable calculation rules for fatigue analysis.
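As a rough illustration of the kind of procedure such a guide prescribes, the sketch below fits a linear S-N relation in log-log space and derives a characteristic (lower-bound) curve by shifting the mean curve down by k standard deviations of the residuals. The data values and the factor k are illustrative placeholders, not figures taken from IIW-XIII-2138-06, which tabulates the appropriate one-sided tolerance factor as a function of sample size.

```python
# Minimal sketch: characteristic S-N curve from fatigue test data.
# Assumes a linear relation log10(N) = intercept + slope * log10(S) and a
# characteristic curve defined as the mean regression line shifted down by
# k residual standard deviations (k is an illustrative placeholder, not the
# tolerance factor tabulated in IIW-XIII-2138-06).
import numpy as np

# Hypothetical test data: stress ranges S [MPa] and cycles to failure N.
S = np.array([200.0, 180.0, 160.0, 140.0, 120.0, 100.0])
N = np.array([1.2e5, 2.3e5, 4.8e5, 9.5e5, 2.1e6, 5.0e6])

x, y = np.log10(S), np.log10(N)
slope, intercept = np.polyfit(x, y, 1)      # mean S-N curve in log-log space
residuals = y - (intercept + slope * x)
stdev = residuals.std(ddof=2)               # residual standard deviation (2 fitted parameters)

k = 2.0                                     # placeholder tolerance factor
char_intercept = intercept - k * stdev      # characteristic (lower-bound) curve

def characteristic_life(stress_range: float) -> float:
    """Characteristic number of cycles predicted at a given stress range."""
    return 10 ** (char_intercept + slope * np.log10(stress_range))

print(f"mean curve: log10(N) = {intercept:.3f} + {slope:.3f} * log10(S)")
print(f"characteristic life at 150 MPa: {characteristic_life(150.0):.3e} cycles")
```

The sample-size and data-set-equivalence questions mentioned in the abstract would be handled analogously with the guide's own prediction-interval and hypothesis-test formulae, which are not reproduced here.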
Abstract:
BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users, guideline developers and implementers, were invited to register and to complete a questionnaire evaluating their perceptions of the proposed process. RESULTS: The ADAPTE process consists of three phases (set-up, adaptation, finalisation), 9 modules and 24 steps. The adaptation phase involves identifying specific clinical questions; searching for, retrieving and assessing available guidelines; and preparing the draft adapted guideline. Among 330 registered individuals (46 countries), 144 completed the questionnaire. A majority found the ADAPTE process clear (78%), comprehensive (69%) and feasible (60%), and the manual useful (79%). However, 21% found the ADAPTE process complex, and 44% feared that they would not find appropriate, high-quality source guidelines. DISCUSSION: A comprehensive framework for guideline adaptation has been developed to meet the challenges of timely guideline development and implementation. The ADAPTE process generated important interest among guideline developers and implementers. The majority perceived the ADAPTE process to be feasible, useful and likely to improve methodological rigour and guideline quality. However, some de novo development might be needed if no high-quality guideline exists for a given topic.
Abstract:
Tools to predict fracture risk are useful for selecting patients for pharmacological therapy in order to reduce fracture risk and redirect limited healthcare resources to those who are most likely to benefit. FRAX® is a World Health Organization fracture risk assessment algorithm for estimating the 10-year probability of hip fracture and major osteoporotic fracture. Effective application of FRAX® in clinical practice requires a thorough understanding of its limitations as well as its utility. For some patients, FRAX® may underestimate or overestimate fracture risk. In order to address some of the common issues encountered with the use of FRAX® for individual patients, the International Society for Clinical Densitometry (ISCD) and International Osteoporosis Foundation (IOF) assigned task forces to review the medical evidence and make recommendations for optimal use of FRAX® in clinical practice. Among the issues addressed were the use of bone mineral density (BMD) measurements at skeletal sites other than the femoral neck, the use of technologies other than dual-energy X-ray absorptiometry, the use of FRAX® without BMD input, the use of FRAX® to monitor treatment, and the addition of the rate of bone loss as a clinical risk factor for FRAX®. The evidence and recommendations were presented to a panel of experts at the Joint ISCD-IOF FRAX® Position Development Conference, resulting in the development of Joint ISCD-IOF Official Positions addressing FRAX®-related issues.
Abstract:
In the era of fast product development and customized product requirements, the concept of a product platform has proven its power in practice. The product platform approach has enabled companies to increase the speed of product introductions while simultaneously benefiting from efficiency and effectiveness in development and production activities. Product platforms are technological bases that can be used to develop several derivative products; hence, differentiation can be pushed closer to the product introduction. Product platform development has some specific features that differ somewhat from the product development of single products. The time horizon is longer, since a product platform's life cycle is longer than an individual product's. The long time horizon also implies higher market risks, and the use of new technologies increases the technological risks involved. The end-customer interface might be far away, but there is no lack of needs aimed at product platforms; in fact, product platform development is very much a balancing act between the varying needs set by the derivative products. This dissertation concentrated on product platform development from the internal product lines' perspective of a single case. Altogether six product platform development factors were identified: 'Strategic and business fit of product platform', 'Project communication and deliverables', 'Cooperation with product platform development', 'Innovativeness of product platform architecture and features', 'Reliability and quality of product platform', and 'Promised schedules and final product platform meeting the needs'. Of the six factors, three were found to influence overall satisfaction quite strongly, namely 'Strategic and business fit of product platform', 'Reliability and quality of product platform', and 'Promised schedules and final product platform meeting the needs'. Hence, these three factors might be the ones a new product platform development unit should concentrate on first in order to satisfy its closest customers, the product lines. 'Project communication and deliverables' and 'Innovativeness of product platform architecture and features' were weaker contributors to overall satisfaction. Overall, the factors explained the product lines' satisfaction with product platform development quite well. During the research, several interesting aspects of the very basic nature of product platform development were found. The long time horizon of product platform development caused challenges in the area of strategic fit: a conflict between short-term requirements and long-term needs. The fact that a product platform was used as the basis of several derivative products resulted in varying needs, and hence complicated the match between those needs and the strategies. The opinion that releases of the larger product lines were given higher priority offers an interesting contribution to the strategy theory of power and politics. The varying needs of the product lines, their relative strengths, and the large number of concurrent releases all set requirements for prioritization. Hence, the research showed the complicated nature of product platform development in the case unit: the very basic nature of product platform development might be its strength (gaining efficiency and effectiveness in product development and product launches) but also its biggest challenge (developing products to meet several needs). As a single case study, the results of this research are not directly generalizable to all product platform development activities. Instead, the research serves best as a starting point for additional research and gives some insights into the factors and challenges of one product development unit.
Abstract:
Quality management has become a strategic issue for organisations and is very valuable for producing quality software. However, quality management systems (QMS) are not easy to implement and maintain. The authors' experience shows the benefits of developing a QMS by first formalising it using semantic web ontologies and then putting them into practice through a semantic wiki. The QMS ontology that has been developed captures the core concepts of a traditional QMS and combines them with concepts coming from the MPIu+a development process model, which is geared towards obtaining usable and accessible software products. The ontology semantics are then directly put into play by a semantics-aware tool, Semantic MediaWiki. The resulting QMS tool has been in use for two years by the GRIHO research group, where it has managed almost 50 software development projects while taking quality management issues into account. It has also been externally audited by a quality certification organisation. Its users are very satisfied with their daily work with the tool, which manages all the documents created during project development and also allows them to collaborate thanks to the wiki features.
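The abstract does not reproduce the ontology itself, so the sketch below only illustrates the general approach of formalising a few quality-management concepts as RDF/OWL before surfacing them in a semantics-aware wiki. All class and property names are hypothetical, and rdflib merely stands in for whatever ontology tooling the authors actually used.

```python
# Minimal sketch of formalising quality-management concepts as an RDF/OWL
# ontology. Class and property names are hypothetical illustrations; the
# actual QMS ontology described in the abstract is not reproduced here.
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL

QMS = Namespace("http://example.org/qms#")   # placeholder namespace
g = Graph()
g.bind("qms", QMS)

# Core concepts of a traditional QMS, modelled as OWL classes.
for cls in ("QualityProcess", "Procedure", "QualityRecord", "Audit", "Project"):
    g.add((QMS[cls], RDF.type, OWL.Class))

# A development-process concept linked to the quality concepts.
g.add((QMS.UsabilityEvaluation, RDF.type, OWL.Class))
g.add((QMS.UsabilityEvaluation, RDFS.subClassOf, QMS.QualityProcess))

# A property relating projects to their quality documentation.
g.add((QMS.hasQualityRecord, RDF.type, OWL.ObjectProperty))
g.add((QMS.hasQualityRecord, RDFS.domain, QMS.Project))
g.add((QMS.hasQualityRecord, RDFS.range, QMS.QualityRecord))

# An example individual, as it might be surfaced on a semantic wiki page.
g.add((QMS.ProjectX, RDF.type, QMS.Project))
g.add((QMS.ProjectX, RDFS.label, Literal("Project X")))
g.add((QMS.ProjectX, QMS.hasQualityRecord, QMS.UsabilityReport42))

print(g.serialize(format="turtle"))
```

In a semantic wiki, the same triples would typically be maintained as typed page properties, so that project pages and their quality records stay queryable rather than buried in free text.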
Abstract:
Postmortem MRI (PMMR) examinations are seldom performed in legal medicine due to long examination times, unfamiliarity with the technique, and high costs. Furthermore, it is difficult to obtain access to an MRI device used for patients in clinical settings in order to image an entire human body. An alternative is available: ex situ organ examination. To our knowledge, there is no standardized protocol that includes ex situ organ preparation and scanning parameters for postmortem MRI. Thus, our objective was to develop a standard procedure for ex situ heart PMMR examinations. We also tested the oily contrast agent Angiofil®, commonly used for PMCT angiography, for its applicability in MRI. We worked with a 3 Tesla MRI device and 32-channel head coils. Twelve porcine hearts were used to test different materials, to find the best way to prepare and place organs in the device, and to test scanning parameters. For coronary MR angiography, we tested different mixtures of Angiofil® and different injection materials. In a second step, 17 human hearts were examined to test the procedure and its applicability to human organs. We established two standardized protocols: one for preparation of the heart and another for scanning parameters, based on experience in clinical practice. The established protocols enabled a standardized technical procedure with comparable radiological images, allowing for easy radiological reading. Performing coronary MR angiography enabled detailed coronary assessment and demonstrated the utility of Angiofil® as a contrast agent for PMMR. Our simple, reproducible method for performing heart examinations ex situ yields high-quality images and visualization of the coronary arteries.
Abstract:
Atherosclerosis is a chronic cardiovascular disease that involves the thickening of the artery walls as well as the formation of plaques (lesions) causing narrowing of the lumens, in vessels such as the aorta, the coronary and the carotid arteries. Magnetic resonance imaging (MRI) is a promising modality for the assessment of atherosclerosis, as it is a non-invasive and patient-friendly procedure that does not use ionizing radiation. MRI offers high soft tissue contrast without the need for intravenous contrast media, while modification of the MR pulse sequences allows further adjustment of the contrast for specific diagnostic needs. As such, MRI can create angiographic images of the vessel lumens to assess stenoses at the late stage of the disease, as well as blood flow-suppressed images for the early investigation of the vessel wall and the characterization of atherosclerotic plaques. However, despite the great technical progress of the past two decades, MRI is intrinsically a low-sensitivity technique and some limitations still exist in terms of accuracy and performance. A major challenge for coronary artery imaging is respiratory motion. State-of-the-art diaphragmatic navigators rely on an indirect measure of motion, perform a 1D correction, and have long and unpredictable scan times. In response, self-navigation (SN) strategies have recently been introduced that offer 100% scan efficiency and increased ease of use. SN detects respiratory motion directly from the image data obtained at the level of the heart, and retrospectively corrects the same data before final image reconstruction. Thus, SN holds potential for multi-dimensional motion compensation. In this regard, this thesis presents novel SN methods that estimate 2D and 3D motion parameters from aliased sub-images obtained from the same raw data composing the final image. Combination of all corrected sub-images produces a final image with reduced motion artifacts for the visualization of the coronaries. The first study (section 2.2, 2D Self-Navigation with Compressed Sensing) consists of a method for 2D translational motion compensation. Here, the use of compressed sensing (CS) reconstruction is proposed and investigated to support motion detection by reducing aliasing artifacts. In healthy human subjects, CS demonstrated an improvement in motion detection accuracy in simulations on in vivo data, while improved coronary artery visualization was demonstrated on in vivo free-breathing acquisitions. However, the motion of the heart induced by respiration has been shown to occur in three dimensions and to be more complex than a simple translation. Therefore, the second study (section 2.3, 3D Self-Navigation) consists of a method for 3D affine motion correction rather than 2D only. Here, different techniques were adopted to reduce the background signal contribution in respiratory motion tracking, as this can be adversely affected by the static tissue that surrounds the heart. The proposed method was shown to improve conspicuity and visualization of the coronary arteries in healthy and cardiovascular disease patient cohorts in comparison to a conventional 1D SN method. In the third study (section 2.4, 3D Self-Navigation with Compressed Sensing), the same tracking methods were used to obtain sub-images sorted according to the respiratory position. Then, instead of motion correction, a compressed sensing reconstruction was performed on all sorted sub-image data. This process exploits the consistency of the sorted data to reduce aliasing artifacts, such that the sub-image corresponding to the end-expiratory phase can directly be used to visualize the coronaries. In a healthy volunteer cohort, this strategy improved conspicuity and visualization of the coronary arteries when compared to a conventional 1D SN method. For the visualization of the vessel wall and atherosclerotic plaques, the state-of-the-art dual inversion recovery (DIR) technique is able to suppress the signal coming from flowing blood and provide positive wall-lumen contrast. However, optimal contrast may be difficult to obtain and is subject to RR variability. Furthermore, DIR imaging is time-inefficient and multislice acquisitions may lead to prolonged scanning times. In response, and as the fourth study of this thesis (chapter 3, Vessel Wall MRI of the Carotid Arteries), a phase-sensitive DIR method has been implemented and tested in the carotid arteries of a healthy volunteer cohort. By exploiting the phase information of images acquired after DIR, the proposed phase-sensitive method enhances wall-lumen contrast while widening the window of opportunity for image acquisition. As a result, a 3-fold increase in volumetric coverage is obtained at no extra cost in scanning time, while image quality is improved. In conclusion, this thesis presented novel methods to address some of the main challenges for MRI of atherosclerosis: the suppression of motion and flow artifacts for improved visualization of vessel lumens, walls and plaques. These methods were shown to significantly improve image quality in healthy human subjects, as well as the scan efficiency and ease of use of MRI. Extensive validation is now warranted in patient populations to ascertain their diagnostic performance. Eventually, these methods may bring the use of atherosclerosis MRI closer to clinical practice.
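The thesis describes its 2D and 3D self-navigation methods only at a high level here; purely as an illustration of the underlying 1D principle, the sketch below estimates a superior-inferior respiratory shift by cross-correlating each readout's projection with a reference projection and removes it with a linear phase ramp in k-space (Fourier shift theorem). The array sizes, the rigid one-dimensional shift model and the synthetic data are simplifying assumptions and do not correspond to the algorithms implemented in the thesis.

```python
# Rough 1D illustration of self-navigation: estimate the superior-inferior
# respiratory shift of a readout from its projection and correct it with a
# linear phase ramp in k-space. The rigid 1D-shift model and synthetic data
# are simplifying assumptions for illustration only.
import numpy as np

def estimate_shift(reference: np.ndarray, projection: np.ndarray) -> int:
    """Integer shift (in pixels) that best aligns `projection` to `reference`."""
    corr = np.correlate(np.abs(projection), np.abs(reference), mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

def correct_readout(kspace_line: np.ndarray, shift_px: int) -> np.ndarray:
    """Undo a spatial shift by applying the opposite linear phase in k-space."""
    k = np.fft.fftfreq(len(kspace_line))        # normalised k-space coordinates
    return kspace_line * np.exp(2j * np.pi * k * shift_px)

# Synthetic example: one reference projection and one 'displaced' projection.
rng = np.random.default_rng(0)
profile = np.convolve(rng.random(256), np.ones(9) / 9, mode="same")
reference_proj = profile
shifted_proj = np.roll(profile, 5)              # simulated 5-pixel respiratory shift

shift = estimate_shift(reference_proj, shifted_proj)
kline = np.fft.fft(shifted_proj)                # k-space of the displaced readout
corrected = np.fft.ifft(correct_readout(kline, shift)).real

print(f"estimated shift: {shift} px")
print(f"max residual after correction: {np.abs(corrected - reference_proj).max():.2e}")
```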
Abstract:
Agile software development methods are currently in vogue, and many software development organizations have already implemented them or are planning to do so. The objective of this thesis is to define how agile software development methods can be implemented in a small organization. The agile methods covered in this thesis are Scrum and XP. The key practices of both methods are analysed and compared with the waterfall method. This thesis also defines an implementation strategy and the actions through which agile methods are implemented in a small organization. In practice, the organization must prepare well, and all required metrics must be defined before the implementation starts. In this work, three sample projects in which agile methods were implemented are introduced. Experiences from these projects were encouraging, although the sample of projects was too small to yield trustworthy results.