11 results for Software Process Improvement
at Université de Lausanne, Switzerland
Abstract:
We propose a new approach and related indicators for globally distributed software support and development, based on a 3-year process improvement project in a globally distributed engineering company. The company develops, delivers and supports a complex software system with tailored hardware components and unique end-customer installations. By applying domain knowledge from operations management on lead-time reduction and its multiple benefits to process performance, the workflows of globally distributed software development and multitier support processes were measured and monitored throughout the company. The results show that global end-to-end process visibility and centrally managed reporting at all levels of the organization catalyzed a change process toward significantly better performance. With the new performance indicators, based on lead times and their variation and backed by fixed control procedures, the case company was able to report faster bug-fixing cycle times, improved response times and generally better customer satisfaction in its global operations. In all, lead times to implement new features and to respond to customer issues and requests were reduced by 50%.
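The abstract does not specify how the lead-time indicators and control procedures were computed, so the following is only a minimal sketch of the general idea: derive lead times from ticket open/close timestamps, then apply a simple mean-plus-two-sigma control rule. All ticket data and the specific rule are illustrative assumptions, not the company's actual method.

```python
from datetime import datetime
from statistics import mean, stdev

# Hypothetical support tickets: (opened, closed) timestamps.
tickets = [
    (datetime(2024, 1, 2), datetime(2024, 1, 9)),
    (datetime(2024, 1, 3), datetime(2024, 1, 5)),
    (datetime(2024, 1, 4), datetime(2024, 1, 20)),
    (datetime(2024, 1, 8), datetime(2024, 1, 12)),
]

# Lead time in days for each ticket.
lead_times = [(closed - opened).days for opened, closed in tickets]

avg = mean(lead_times)
sd = stdev(lead_times)

# A simple control rule (an assumption, not the paper's): flag tickets
# whose lead time exceeds the mean by more than two standard deviations.
outliers = [lt for lt in lead_times if lt > avg + 2 * sd]
print(f"mean={avg:.1f} days, sd={sd:.1f}, outliers={outliers}")
```

Tracking both the average and the spread, as here, matches the abstract's emphasis on lead times *and their variation* rather than averages alone.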
Abstract:
Current explanatory models for binge eating in binge eating disorder (BED) mostly rely on models for bulimia nervosa (BN), although research indicates different antecedents for binge eating in BED. This study investigates antecedents and maintaining factors in terms of positive mood, negative mood and tension in a sample of 22 women with BED using ecological momentary assessment over a 1-week period. Values for negative mood were higher and those for positive mood lower during binge days compared with non-binge days. During binge days, negative mood and tension both strongly and significantly increased and positive mood strongly and significantly decreased at the first binge episode, followed by a slight though significant, and longer lasting, decrease (negative mood, tension) or increase (positive mood) during a 4-h observation period following binge eating. Binge eating in BED seems to be triggered by an immediate breakdown of emotion regulation. There are no indications of an accumulation of negative mood triggering binge eating followed by immediate reinforcing mechanisms in terms of substantial and stable improvement of mood, as observed in BN. These differences call for a further specification of etiological models and could serve as a basis for developing new treatment approaches for BED.
Abstract:
For several years, all five medical faculties of Switzerland have been engaged in a reform of their training curricula for two reasons: first, according to a new federal act issued in 2006 by the administration of the confederation, faculties needed to meet international standards in terms of content and pedagogic approaches; second, all Swiss universities, and thus all medical faculties, had to adapt the structure of their curriculum to the framework and principles that govern the Bologna process. This process is the result of the Bologna Declaration of June 1999, which proposes and requires a series of reforms to make European Higher Education more compatible and comparable, more competitive and more attractive for European students. The present paper reviews some of the results achieved in the field, focusing on several issues such as the shortage of physicians and primary care practitioners; the importance of public health, community medicine and medical humanities; and the implementation of new training approaches including e-learning and simulation. In the future, faculties should work on several specific challenges, such as students' mobility, the improvement of students' autonomy and critical thinking as well as their generic and specific skills, and finally a reflection on how to improve the attractiveness of the academic career for physicians of both sexes.
Abstract:
Unlike fragmental rockfall runout assessments, there are only a few robust methods to quantify rock-mass-failure susceptibilities at the regional scale. A detailed slope angle analysis of recent Digital Elevation Models (DEM) can be used to detect potential rockfall source areas, thanks to the Slope Angle Distribution procedure. However, this method does not provide any information on block-release frequencies inside the identified areas. The present paper adds to the Slope Angle Distribution of the cliff unit its normalized cumulative distribution function. This improvement amounts to a quantitative weighting of slope angles, introducing rock-mass-failure susceptibilities inside the rockfall source areas previously detected. Rockfall runout assessment is then performed using the GIS- and process-based software Flow-R, providing relative frequencies for runout. Thus, taking both susceptibility results into consideration, this approach can be used to establish, after calibration, hazard and risk maps at the regional scale. As an example, a risk analysis of vehicle traffic exposed to rockfalls is performed along the main roads of the Swiss alpine valley of Bagnes.
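The normalized cumulative weighting step described above can be sketched as follows. The slope-angle samples and bin edges are illustrative assumptions, not the paper's DEM data; in practice the inputs would be slope values of raster cells inside a detected source area.

```python
import numpy as np

# Hypothetical slope-angle samples (degrees) inside a detected
# rockfall source area; real inputs would come from a DEM raster.
slopes = np.array([48, 52, 55, 57, 60, 62, 65, 70, 72, 75], dtype=float)

# Histogram of slope angles (the Slope Angle Distribution).
bins = np.arange(45, 81, 5)
hist, edges = np.histogram(slopes, bins=bins)

# Normalized cumulative distribution function: the fraction of
# cells whose slope falls below each bin edge. Reading the CDF as
# a weight assigns steeper cells a higher relative susceptibility.
cdf = np.cumsum(hist) / hist.sum()
for edge, w in zip(edges[1:], cdf):
    print(f"slope < {edge:.0f} deg -> cumulative fraction {w:.2f}")
```

Because the CDF is monotonically increasing and normalized to 1, it yields relative weights that can be compared across source areas before calibration against observed events.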
Abstract:
The DNA microarray technology has arguably caught the attention of the worldwide life science community and is now systematically supporting major discoveries in many fields of study. The majority of the initial technical challenges of conducting experiments are being resolved, only to be replaced with new informatics hurdles, including statistical analysis, data visualization, interpretation, and storage. Two systems of databases, one containing expression data and one containing annotation data, are quickly becoming essential knowledge repositories of the research community. The present paper surveys several databases which are considered "pillars" of research and important nodes in the network. It focuses on a generalized workflow scheme typical for microarray experiments, using two examples related to cancer research. The workflow is used to reference appropriate databases and tools for each step in the process of array experimentation. Additionally, benefits and drawbacks of current array databases are addressed, and suggestions are made for their improvement.
Abstract:
Percutaneous transluminal renal angioplasty (PTRA) is an invasive technique that is costly and involves the risk of complications and renal failure. The ability of PTRA to reduce the administration of antihypertensive drugs has been demonstrated. A potentially greater benefit, which nevertheless remains to be proven, is the deferral of the need for chronic dialysis. The aim of the ANPARIA study was to assess the appropriateness of PTRA with respect to its impact on the evolution of renal function. A standardized expert panel method was used to assess the appropriateness of medical treatment alone or medical treatment with revascularization in various clinical situations. The choice of revascularization by either PTRA or surgery was examined for each clinical situation. Analysis was based on a detailed literature review and on systematically elicited expert opinion, obtained during a two-round modified Delphi process. The study provides detailed responses on the appropriateness of PTRA for 1848 distinct clinical scenarios. Depending on the major clinical presentation, the appropriateness of revascularization varied from 32% to 75% for individual scenarios (overall 48%). Uncertainty as to revascularization was 41% overall. When revascularization was appropriate, PTRA was favored over surgery in 94% of the scenarios, except in certain cases of aortic atheroma where surgery was the preferred choice. Kidney size ≥7 cm, absence of coexisting disease, acute renal failure, a high degree of stenosis (≥70%), and absence of multiple arteries were identified as predictive variables of favorable appropriateness ratings. Situations such as cardiac failure with pulmonary edema or acute thrombosis of the renal artery were defined as indications for PTRA. This study identified clinical situations in which PTRA or surgery are appropriate for renal artery disease. We built a decision tree which can be used via the Internet: the ANPARIA software (http://www.chu-clermontferrand.fr/anparia/).
In numerous clinical situations, uncertainty remains as to whether PTRA prevents deterioration of renal function.
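The abstract does not detail how panel ratings were mapped to "appropriate", "uncertain" and "inappropriate". The sketch below follows the common RAND/UCLA-style convention often used with two-round modified Delphi panels (median of 1-9 ratings, with disagreement forcing "uncertain"); it is an assumed illustration, not the ANPARIA study's exact rule.

```python
from statistics import median

def classify(ratings):
    """Classify a panel's 1-9 appropriateness ratings for one scenario,
    following the common RAND/UCLA-style convention (an assumption here):
    median 7-9 -> appropriate, 1-3 -> inappropriate,
    4-6 or panel disagreement -> uncertain."""
    # Disagreement: at least a third of panelists in each extreme tertile.
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= len(ratings) / 3 and high >= len(ratings) / 3:
        return "uncertain"
    med = median(ratings)
    if med >= 7:
        return "appropriate"
    if med <= 3:
        return "inappropriate"
    return "uncertain"

print(classify([8, 7, 9, 8, 7, 6, 8, 9, 7]))  # appropriate
```

Run over all 1848 scenarios, a rule of this kind yields the per-presentation appropriateness and uncertainty percentages the abstract reports.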
Abstract:
Early revascularization of pancreatic islet cells after transplantation is crucial for engraftment, and it has been suggested that vascular endothelial growth factor-A (VEGF-A) plays a significant role in this process. Although VEGF gene therapy can improve angiogenesis, uncontrolled VEGF secretion can lead to vascular tumor formation. Here we have explored the role of temporal VEGF expression, controlled by a tetracycline (TC)-regulated promoter, on revascularization and engraftment of genetically modified beta cells following transplantation. To this end, we modified the CDM3D beta cell line using a lentiviral vector to promote secretion of VEGF-A either in a TC-regulated (TET cells) or a constitutive (PGK cells) manner. VEGF secretion, angiogenesis, cell proliferation, and stimulated insulin secretion were assessed in vitro. VEGF secretion was increased in TET and PGK cells, and VEGF delivery resulted in angiogenesis, whereas addition of TC inhibited these processes. Insulin secretion by the three cell types was similar. We used a syngeneic mouse model of transplantation to assess the effects of this controlled VEGF expression in vivo. Time to normoglycemia, intraperitoneal glucose tolerance test, graft vascular density, and cellular mass were evaluated. Increased expression of VEGF resulted in significantly better revascularization and engraftment after transplantation when compared to control cells. In vivo, there was a significant increase in vascular density in grafted TET and PGK cells versus control cells. Moreover, the time for diabetic mice to return to normoglycemia and the stimulated plasma glucose clearance were also significantly accelerated in mice transplanted with TET and PGK cells when compared to control cells. VEGF was only needed during the first 2-3 weeks after transplantation; when removed, normoglycemia and graft vascularization were maintained. TC-treated mice grafted with TC-treated cells failed to restore normoglycemia. 
This approach allowed us to switch off VEGF secretion when the desired effects had been achieved. TC-regulated temporal expression of VEGF using a gene therapy approach presents a novel way to improve early revascularization and engraftment after islet cell transplantation.
Abstract:
In Switzerland there is a strong movement at the national policy level towards strengthening patient rights and patient involvement in health care decisions. Yet there is no national programme promoting shared decision making (SDM). The first decision support tools for the counselling process (prenatal diagnosis and screening) have been developed and implemented. Although Swiss doctors acknowledge that shared decision making is important, hierarchical structures and asymmetric physician-patient relationships still prevail. The last years have seen some promising activities regarding the training of medical students and the development of patient support programmes. Swiss direct democracy and the habit of consensual decision making and citizen involvement in general may provide fertile ground for SDM development in the primary care setting.
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on the measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance, and in the last decades computer programs have been developed to assist clinicians in this assignment. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer the user the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top two software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly. Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools would possibly not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them over the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capacity of data storage and report generation.
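The a posteriori (Bayesian) adjustment mentioned above can be sketched as a maximum a posteriori (MAP) estimate: a population prior on a pharmacokinetic parameter is combined with one measured concentration to obtain an individual estimate. The one-compartment IV model, all numeric values, and the crude grid optimizer below are illustrative assumptions, not the method of any benchmarked program.

```python
import math

# Illustrative one-compartment IV bolus setting (assumed values).
dose = 500.0    # mg
volume = 40.0   # L, held fixed here for simplicity
t = 8.0         # h after dose
c_obs = 6.0     # mg/L measured concentration

# Population prior on clearance (log-normal): typical 4 L/h, 30% CV.
cl_pop, omega = 4.0, 0.3
sigma = 0.15    # residual error on the log scale

def neg_log_posterior(cl):
    """Negative log-posterior: fit to the observation + prior penalty."""
    c_pred = dose / volume * math.exp(-cl / volume * t)
    resid = (math.log(c_obs) - math.log(c_pred)) ** 2 / (2 * sigma**2)
    prior = (math.log(cl) - math.log(cl_pop)) ** 2 / (2 * omega**2)
    return resid + prior

# Crude grid search for the MAP estimate (real tools optimize properly).
grid = [1.0 + 0.01 * i for i in range(900)]
cl_map = min(grid, key=neg_log_posterior)

# The individualized parameter then drives dosage predictions.
c_next = dose / volume * math.exp(-cl_map / volume * 12.0)
print(f"MAP clearance = {cl_map:.2f} L/h, predicted 12 h conc = {c_next:.2f} mg/L")
```

With no measurement, only the prior term remains and the estimate falls back to the population value, which is essentially the a priori regimen suggestion the abstract describes.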
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing dosage regimens based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs. In addition, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated based on population PK models). All of them are able to compute a posteriori Bayesian dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools would possibly not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and efforts have been put into them in the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, capability of data storage and automated report generation.
Abstract:
Allocentric spatial memory, the memory for locations coded in relation to objects comprising our environment, is a fundamental component of episodic memory and is dependent on the integrity of the hippocampal formation in adulthood. Previous research from different laboratories reported that basic allocentric spatial memory abilities are reliably observed in children after 2 years of age. Based on work performed in monkeys and rats, we had proposed that the functional maturation of direct entorhinal cortex projections to the CA1 field of the hippocampus might underlie the emergence of basic allocentric spatial memory. We also proposed that the protracted development of the dentate gyrus and its projections to the CA3 field of the hippocampus might underlie the development of high-resolution allocentric spatial memory capacities, based on the essential contribution of these structures to the process known as pattern separation. Here, we present an experiment designed to assess the development of spatial pattern separation capacities and its impact on allocentric spatial memory performance in children from 18 to 48 months of age. We found that: (1) allocentric spatial memory performance improved with age, (2) as compared to younger children, a greater number of children older than 36 months advanced to the final stage requiring the highest degree of spatial resolution, and (3) children who failed at different stages exhibited difficulties in discriminating locations that required higher spatial resolution abilities. These results are consistent with the hypothesis that improvements in human spatial memory performance might be linked to improvements in pattern separation capacities.