30 results for Capability Maturity Model for Software
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Software must be constantly adapted to changing requirements. The time scale, abstraction level and granularity of adaptations may vary from short-term, fine-grained adaptation to long-term, coarse-grained evolution. Fine-grained, dynamic and context-dependent adaptations can be particularly difficult to realize in long-lived, large-scale software systems. We argue that, in order to effectively and efficiently deploy such changes, adaptive applications must be built on an infrastructure that is not just model-driven, but is both model-centric and context-aware. Specifically, this means that high-level, causally-connected models of the application and the software infrastructure itself should be available at run-time, and that changes may need to be scoped to the run-time execution context. We first review the dimensions of software adaptation and evolution, and then we show how model-centric design can address the adaptation needs of a variety of applications that span these dimensions. We demonstrate through concrete examples how model-centric and context-aware designs work at the level of application interface, programming language and runtime. We then propose a research agenda for a model-centric development environment that supports dynamic software adaptation and evolution.
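As a minimal sketch of the kind of fine-grained, context-scoped adaptation this abstract describes, the following Python fragment scopes a behavioral change to a dynamic execution context; the class and helper names are illustrative and do not come from the paper:

```python
# A minimal sketch of context-scoped behavioral adaptation, in the spirit of
# context-oriented programming. All names are illustrative, not the paper's.
from contextlib import contextmanager

class Logger:
    def log(self, message: str) -> None:
        print(f"[plain] {message}")

def verbose_log(self, message: str) -> None:
    print(f"[verbose] {message}")

@contextmanager
def adaptation(cls, method_name, replacement):
    """Scope a fine-grained change to a dynamic execution context:
    the adaptation is active only inside the `with` block."""
    original = getattr(cls, method_name)
    setattr(cls, method_name, replacement)
    try:
        yield
    finally:
        setattr(cls, method_name, original)  # the change is undone on exit

logger = Logger()
logger.log("outside any adaptation")       # -> [plain]
with adaptation(Logger, "log", verbose_log):
    logger.log("inside the adaptation")    # -> [verbose]
logger.log("after the adaptation")         # -> [plain]
```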
Abstract:
Peritoneal transport characteristics and residual renal function require regular control and subsequent adjustment of the peritoneal dialysis (PD) prescription. Prescription models should facilitate the prediction of the outcome of such adjustments for a given patient. In the present study, the prescription model implemented in the PatientOnLine software was validated in patients requiring a prescription change. This multicenter, international prospective cohort study, conducted to validate a PD prescription model, included patients treated with continuous ambulatory peritoneal dialysis. Patients were examined with the peritoneal function test (PFT) to determine the outcome of their current prescription and the necessity for a prescription change. For these patients, a new prescription was modeled using the PatientOnLine software (Fresenius Medical Care, Bad Homburg, Germany). Two to four weeks after implementation of the new PD regimen, a second PFT was performed. The validation of the prescription model included 54 patients. Predicted and measured peritoneal Kt/V were 1.52 ± 0.31 and 1.66 ± 0.35, and total (peritoneal + renal) Kt/V values were 1.96 ± 0.48 and 2.06 ± 0.44, respectively. Predicted and measured peritoneal creatinine clearances were 42.9 ± 8.6 and 43.0 ± 8.8 L/1.73 m2/week, and total creatinine clearances were 65.3 ± 26.0 and 63.3 ± 21.8 L/1.73 m2/week, respectively. The analysis revealed a Pearson's correlation coefficient for peritoneal Kt/V of 0.911 and a Lin's concordance coefficient of 0.829. The value of both coefficients was 0.853 for peritoneal creatinine clearance. Predicted and measured daily net ultrafiltration was 0.77 ± 0.49 and 1.16 ± 0.63 L/24 h, respectively; Pearson's correlation and Lin's concordance coefficients were 0.518 and 0.402, respectively. Predicted and measured peritoneal glucose absorption was 125.8 ± 38.8 and 79.9 ± 30.7 g/24 h, respectively, and Pearson's correlation and Lin's concordance coefficients were 0.914 and 0.477, respectively. With good predictability of peritoneal Kt/V and creatinine clearance, the present model provides support for individual dialysis prescription in clinical practice. Peritoneal glucose absorption and ultrafiltration are less predictable and are likely influenced by additional clinical factors that need to be taken into consideration.
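The two agreement statistics used in the study can be computed as sketched below; the formulas for Pearson's r and Lin's concordance correlation coefficient (CCC) are standard, but the example values are invented for illustration and are not the study's patient data:

```python
# Minimal sketch of the two agreement statistics reported in the study:
# Pearson's correlation and Lin's concordance correlation coefficient (CCC).
# The example values below are made up; they are not the study's data.
import numpy as np

def pearson_r(x: np.ndarray, y: np.ndarray) -> float:
    return float(np.corrcoef(x, y)[0, 1])

def lin_ccc(x: np.ndarray, y: np.ndarray) -> float:
    # Unlike Pearson's r, the CCC penalizes both poor correlation and
    # systematic bias between predicted and measured values.
    sxy = np.cov(x, y, bias=True)[0, 1]
    return float(2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2))

predicted = np.array([1.4, 1.5, 1.6, 1.3, 1.7])  # e.g., modeled peritoneal Kt/V
measured  = np.array([1.5, 1.7, 1.7, 1.4, 1.9])
print(pearson_r(predicted, measured), lin_ccc(predicted, measured))
```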
Abstract:
The biggest challenge facing software developers today is how to gracefully evolve complex software systems in the face of changing requirements. We clearly need software systems to be more dynamic, compositional and model-centric, but instead we continue to build systems that are static, baroque and inflexible. How can we better build change-enabled systems in the future? To answer this question, we propose to look back to one of the most successful systems to support change, namely Smalltalk. We briefly introduce Smalltalk with a few simple examples, and draw some lessons for software evolution. Smalltalk's simplicity, its reflective design, and its highly dynamic nature all go a long way towards enabling change in Smalltalk applications. We then illustrate how these lessons work in practice by reviewing a number of research projects that support software evolution by exploiting Smalltalk's design. We conclude by summarizing open issues and challenges for change-enabled systems of the future.
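The paper's examples are written in Smalltalk; purely as an analogy, the following Python sketch illustrates the reflective, live-change style the abstract credits to Smalltalk, where classes are ordinary objects that can be inspected and modified while the system runs:

```python
# Python analogy for Smalltalk-style reflective, live change: classes are
# ordinary objects that can be modified without restarting the system.
# This is only an illustration, not the paper's actual Smalltalk examples.
class Account:
    def __init__(self, balance=0):
        self.balance = balance
    def deposit(self, amount):
        self.balance += amount

acct = Account()
acct.deposit(10)

# Evolve the running system: add validation to deposit on the fly.
original_deposit = Account.deposit
def checked_deposit(self, amount):
    if amount <= 0:
        raise ValueError("deposit must be positive")
    original_deposit(self, amount)
Account.deposit = checked_deposit   # existing instances pick up the change

acct.deposit(5)        # now routed through the new method
print(acct.balance)    # 15
# acct.deposit(-1)     # would now raise ValueError
```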
Abstract:
This paper examines the accuracy of software-based on-line energy estimation techniques. It evaluates today’s most widespread energy estimation model in order to investigate whether the current methodology of purely software-based energy estimation, running on a sensor node itself, can indeed reliably and accurately determine the node's energy consumption - independent of the particular node instance, the traffic load the node is exposed to, or the MAC protocol the node is running. The paper enhances today’s widely used energy estimation model by integrating radio transceiver switches into the model, and proposes a methodology for finding the optimal estimation model parameters. Statistical validation with experimental data shows that the proposed model enhancement and parameter calibration methodology significantly increase the estimation accuracy.
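A sketch of the kind of state-based energy model and calibration methodology the abstract describes, assuming a simple linear model with per-state power draws plus a per-switch energy cost; the state names, durations, and energy figures are invented for illustration:

```python
# Sketch of a state-based software energy model extended with a per-switch
# cost for the radio transceiver, calibrated by least squares. All numbers
# are illustrative assumptions, not the paper's measurements.
import numpy as np

# Each row: time in CPU-active, RX, and TX states (seconds), plus the number
# of radio transceiver switches observed in that run.
runs = np.array([
    # t_cpu, t_rx, t_tx, n_switch
    [ 10.0,  2.0,  0.5,  40],
    [  8.0,  4.0,  1.0,  90],
    [ 12.0,  1.0,  0.2,  20],
    [  9.0,  3.5,  0.8,  70],
    [ 11.0,  2.5,  0.4,  50],
])
# Ground-truth energy per run (joules), e.g. from external measurement.
measured_energy = np.array([0.56, 0.75, 0.49, 0.71, 0.61])

# Model: E = P_cpu*t_cpu + P_rx*t_rx + P_tx*t_tx + E_switch*n_switch.
# Least squares finds the per-state powers and the per-switch energy cost.
params, *_ = np.linalg.lstsq(runs, measured_energy, rcond=None)
p_cpu, p_rx, p_tx, e_switch = params

# The calibrated model can then estimate energy online from software counters.
estimate = runs @ params
print("max relative error:",
      np.max(np.abs(estimate - measured_energy) / measured_energy))
```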
Abstract:
Salmonella enterica serovar Typhimurium has long been recognised as a zoonotic pathogen of economic significance in animals and humans. Attempts to protect humans and livestock may be based on immunization with vaccines aimed at inducing a protective response. We recently demonstrated that the oral administration of a Salmonella enterica serovar Typhimurium strain unable to synthesize the zinc transporter ZnuABC protects mice against systemic salmonellosis induced by a virulent homologous challenge. This finding suggested that this mutant strain could represent an interesting candidate vaccine for mucosal delivery. In this study, the protective effect of this Salmonella strain was tested in a streptomycin-pretreated mouse model of salmonellosis that is distinguished by its capability of evoking typhlitis and colitis. The results reported here demonstrate that mice immunized with Salmonella enterica serovar Typhimurium (S. Typhimurium) SA186 survive the intestinal challenge and, compared to control mice, show a reduced number of virulent bacteria in the gut, with milder signs of inflammation. This study demonstrates that the oral administration of an S. Typhimurium strain lacking ZnuABC is able to elicit an effective immune response which protects mice against intestinal S. Typhimurium infection. Collectively, these results suggest that the streptomycin-pretreated mouse model of S. Typhimurium infection can represent a valuable tool to screen attenuated S. Typhimurium mutant strains and help assess their protective efficacy as potential live vaccines.
Abstract:
Drug-induced respiratory depression is a common side effect of the agents used in anesthesia practice to provide analgesia and sedation. Depression of the ventilatory drive in the spontaneously breathing patient can lead to severe cardiorespiratory events and it is considered a primary cause of morbidity. Reliable predictions of respiratory inhibition in the clinical setting would therefore provide a valuable means to improve the safety of drug delivery. Although multiple studies investigated the regulation of breathing in man both in the presence and absence of ventilatory depressant drugs, a unified description of respiratory pharmacodynamics is not available. This study proposes a mathematical model of human metabolism and cardiorespiratory regulation integrating several isolated physiological and pharmacological aspects of acute drug-induced ventilatory depression into a single theoretical framework. The description of respiratory regulation has a parsimonious yet comprehensive structure with substantial predictive capability. Simulations relative to the synergistic interaction of the hypercarbic and hypoxic respiratory drive and the global effect of drugs on the control of breathing are in good agreement with published experimental data. Besides providing clinically relevant predictions of respiratory depression, the model can also serve as a test bed to investigate issues of drug tolerability and dose finding/control under non-steady-state conditions.
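As a rough illustration of the modeling style only (not the authors' model), the following toy couples a lumped CO2 balance to a chemoreflex ventilatory drive that is depressed by a sigmoid drug effect; all parameter values are assumptions chosen solely to make the closed loop behave plausibly:

```python
# A deliberately minimal toy of drug-induced ventilatory depression, meant
# only to illustrate the modeling style (closed-loop CO2 balance, chemoreflex
# drive, sigmoid drug effect). All parameter values are illustrative
# assumptions; this is not the authors' model.

G, B = 2.0, 37.5        # chemoreflex gain (L/min/mmHg), apnoeic threshold (mmHg)
k_prod = 2.0            # CO2 production term (mmHg/min)
k_elim = 0.01           # elimination scaling (1/L)
C50, gamma = 1.0, 2.0   # drug potency (arbitrary units) and Hill steepness

def ventilation(paco2, conc):
    drive = max(0.0, G * (paco2 - B))                      # hypercapnic drive
    depression = conc**gamma / (conc**gamma + C50**gamma)  # sigmoid drug effect
    return drive * (1.0 - depression)

# Euler integration of the CO2 mass balance:
# dPaCO2/dt = production - elimination(ventilation, PaCO2)
dt, t_end = 0.01, 60.0                         # minutes
paco2 = 40.0                                   # normocapnic steady state
for step in range(int(t_end / dt)):
    conc = 0.8 if step * dt >= 10.0 else 0.0   # step drug input at t = 10 min
    paco2 += dt * (k_prod - k_elim * ventilation(paco2, conc) * paco2)

print(f"PaCO2 after drug: {paco2:.1f} mmHg; "
      f"ventilation: {ventilation(paco2, 0.8):.2f} L/min")
```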
Abstract:
OBJECTIVES: Implementation of an experimental model to compare cartilage MR imaging with histological analysis. MATERIAL AND METHODS: MRI was obtained at 1.5 and/or 3T from 4 patients awaiting total knee replacement, prior to surgery. The timeframe between pre-operative MRI and knee replacement was within two days. Resected cartilage-bone samples were tagged with Ethi®-pins to reproduce the histological cutting course. Pre-operative scanning at 1.5T included the following parameters for fast low angle shot (FLASH: TR/TE/FA = 33 ms/6 ms/30°, BW = 110 kHz, 120 mm x 120 mm FOV, 256 x 256 matrix, 0.65 mm slice thickness) and double echo steady state (DESS: TR/TE/FA = 23.7 ms/6.9 ms/40°, BW = 130 kHz, 120 mm x 120 mm FOV, 256 x 256 matrix, 0.65 mm slice thickness). At 3T, scan parameters were: FLASH (TR/TE/FA = 12.2 ms/5.1 ms/10°, BW = 130 kHz, 170 mm x 170 mm FOV, 320 x 320 matrix, 0.5 mm slice thickness) and DESS (TR/TE/FA = 15.6 ms/4.5 ms/25°, BW = 200 kHz, 135 mm x 150 mm FOV, 288 x 320 matrix, 0.5 mm slice thickness). Imaging of the specimens was performed the same day at 1.5T. MRI (Noyes) and histological (Mankin) score scales were correlated using the paired t-test. Sensitivity and specificity for the detection of different grades of cartilage degeneration were assessed. Inter-reader and intra-reader reliability were determined using kappa analysis. RESULTS: Low correlation (sensitivity, specificity) was found for both sequences in normal to mild Mankin grades. Only moderate to severe changes were diagnosed with higher significance and specificity. The use of higher field strengths was advantageous for both protocols, with sensitivity values ranging from 13.6% to 93.3% (FLASH) and 20.5% to 96.2% (DESS). Kappa values ranged from 0.488 to 0.944. CONCLUSIONS: Correlating MR images with continuous histological slices was feasible using three-dimensional imaging, multi-planar reformatting and marker pins. The capability of diagnosing early cartilage changes with high accuracy could not be demonstrated for either FLASH or DESS.
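The reported evaluation statistics (sensitivity, specificity, and kappa for reader reliability) can be computed as sketched below; the grade lists are invented for illustration and are not the study's data:

```python
# Sketch of the evaluation statistics used in the study: sensitivity and
# specificity for detecting cartilage degeneration, plus Cohen's kappa for
# reader agreement. The grade lists are made up for illustration.
from collections import Counter

mri_grades   = [0, 1, 2, 3, 2, 0, 1, 3, 2, 1]  # e.g., Noyes scores per sample
histo_grades = [0, 0, 2, 3, 3, 0, 0, 3, 2, 2]  # e.g., Mankin-based reference

# Dichotomize: grade >= 2 counts as "degenerated".
mri_pos   = [g >= 2 for g in mri_grades]
histo_pos = [g >= 2 for g in histo_grades]

tp = sum(m and h for m, h in zip(mri_pos, histo_pos))
tn = sum(not m and not h for m, h in zip(mri_pos, histo_pos))
fp = sum(m and not h for m, h in zip(mri_pos, histo_pos))
fn = sum(not m and h for m, h in zip(mri_pos, histo_pos))
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

def cohens_kappa(a, b):
    # Agreement beyond chance between two graders over the same samples.
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb[k] for k in ca) / n**2
    return (observed - expected) / (1 - expected)

print(sensitivity, specificity, cohens_kappa(mri_grades, histo_grades))
```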
Abstract:
BACKGROUND: Gene therapy has recently been introduced as a novel approach to treat ischemic tissues by using the angiogenic potential of certain growth factors. We investigated the effect of adenovirus-mediated gene therapy with transforming growth factor-beta (TGF-beta) delivered into the subdermal space to treat ischemically challenged epigastric skin flaps in a rat model. MATERIAL AND METHODS: A pilot study was conducted in a group of 5 animals pretreated with Ad-GFP. Expression of green fluorescent protein in the skin flap sections was demonstrated under fluorescence microscopy at 2, 4, and 7 days after the treatment, indicating successful transfection of the skin flaps following subdermal gene therapy. Next, 30 male Sprague Dawley rats were divided into 3 groups of 10 rats each. An epigastric skin flap model, based solely on the right inferior epigastric vessels, was used in this study. Rats received subdermal injections of adenovirus encoding TGF-beta (Ad-TGF-beta) or green fluorescent protein (Ad-GFP) as treatment control. The third group (n = 10) received saline and served as a control group. A flap measuring 8 x 8 cm was outlined on the abdominal skin, extending from the xiphoid process proximally to the pubic region distally, and to the anterior axillary lines bilaterally. Just prior to flap elevation, the injections were given subdermally in the left upper corner of the flap. The flap was then sutured back to its bed. Flap viability was evaluated seven days after the initial operation. Digital images of the epigastric flaps were taken, and areas of necrotic zones relative to total flap surface area were measured and expressed as percentages using a software program. RESULTS: There was a significant increase in mean percent surviving area in the Ad-TGF-beta group compared with the two control groups (Ad-TGF-beta: 90.3 ± 4.0% versus Ad-GFP: 82.2 ± 8.7% and saline: 82.6 ± 4.3%; P < 0.05). CONCLUSIONS: In this study, the authors were able to demonstrate that adenovirus-mediated gene therapy using TGF-beta ameliorated ischemic necrosis in an epigastric skin flap model, as confirmed by a significant reduction in the necrotic zones of the flap. The results of this study raise the possibility of using adenovirus-mediated TGF-beta gene therapy to promote perfusion in the random portion of skin flaps, especially in high-risk patients.
Abstract:
OBJECTIVES: To analyze computer-assisted diagnostics and virtual implant planning, and to evaluate the indication for template-guided flapless surgery and immediate loading in the rehabilitation of the edentulous maxilla. MATERIALS AND METHODS: Forty patients with an edentulous maxilla were selected for this study. Three-dimensional analysis and virtual implant planning were performed with the NobelGuide software program (Nobel Biocare, Göteborg, Sweden). Prior to computed tomography, aesthetic and functional aspects were checked clinically. Either a well-fitting denture or an optimized prosthetic setup was used and then converted into a radiographic template. This allowed for a computer-guided analysis of the jaw together with the prosthesis. Accordingly, the best implant position was determined in relation to the bone structure and the prospective tooth position. For all jaws, the hypothetical indication for (a) four implants with a bar overdenture and (b) six implants with a simple fixed prosthesis was planned. The planning of the optimized implant positions was then analyzed as follows: the number of implants that could be placed in a sufficient quantity of bone was calculated; additional surgical procedures (guided bone regeneration, sinus floor elevation) that would be necessary due to reduced bone quality and quantity were identified; and the indication for template-guided, flapless surgery or an immediate loading protocol was evaluated. RESULTS: Model (a) - bar overdentures: for 28 patients (70%), all four implants could be placed in sufficient bone (112 implants in total), so a fully flapless procedure could be suggested. For six patients (15%), sufficient bone was not available for any of their planned implants. The remaining six patients exhibited a combination of sufficient and insufficient bone. Model (b) - simple fixed prosthesis: for 12 patients (30%), all six implants could be placed in sufficient bone (72 implants in total), so a fully flapless procedure could be suggested. For seven patients (17%), sufficient bone was not available for any of their planned implants. The remaining 21 patients exhibited a combination of sufficient and insufficient bone. DISCUSSION: In the maxilla, advanced atrophy is often observed, and implant placement becomes difficult or impossible. Thus, flapless surgery or an immediate loading protocol can be performed only in a selected number of patients. Nevertheless, the use of a computer program for prosthetically driven implant planning is highly efficient and safe. The three-dimensional view of the maxilla allows the determination of the best implant position, the optimization of the implant axis, and the definition of the best surgical and prosthetic solution for the patient. Thus, a protocol that combines a computer-guided technique with conventional surgical procedures becomes a promising option, which needs to be further evaluated and improved.
Abstract:
Few real software systems are built completely from scratch nowadays. Instead, systems are built iteratively and incrementally, while integrating and interacting with components from many other systems. Adaptation, reconfiguration and evolution are normal, ongoing processes throughout the lifecycle of a software system. Nevertheless, the platforms, tools and environments we use to develop software are still largely based on an outmoded model that presupposes that software systems are closed and will not significantly evolve after deployment. We claim that, in order to enable effective and graceful evolution of modern software systems, we must make these systems more amenable to change by (i) providing explicit, first-class models of software artifacts, change, and history at the level of the platform, (ii) continuously analysing static and dynamic evolution to track emergent properties, and (iii) closing the gap between the domain model and the developers' view of the evolving system. We outline our vision of dynamic, evolving software systems and identify the research challenges involved in realizing this vision.
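As a toy illustration of point (i), the sketch below represents each change as an explicit, queryable object; the class names and API are hypothetical, not the authors' design:

```python
# Toy sketch of "change and history as first-class models": each edit to a
# software artifact is an explicit object the platform can record, query,
# and analyze. The names and API are hypothetical, not the authors' design.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass(frozen=True)
class Change:
    artifact: str              # e.g., a class or method name
    description: str
    apply: callable            # the action that performs the change
    timestamp: datetime = field(default_factory=datetime.now)

class History:
    def __init__(self):
        self.changes: list[Change] = []

    def record(self, change: Change) -> None:
        change.apply()
        self.changes.append(change)

    def changes_to(self, artifact: str) -> list[Change]:
        # History is queryable, so evolution analyses (e.g., tracking
        # emergent properties) can run over the change objects themselves.
        return [c for c in self.changes if c.artifact == artifact]

history = History()
history.record(Change("Account.deposit", "add validation", apply=lambda: None))
print([c.description for c in history.changes_to("Account.deposit")])
```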
Abstract:
As more and more open-source software components become available on the internet, we need automatic ways to label and compare them. For example, a developer who searches for reusable software must be able to quickly gain an understanding of the retrieved components. This understanding cannot be gained at the level of source code due to the semantic gap between source code and the domain model. In this paper we present a lexical approach that uses the log-likelihood ratios of word frequencies to automatically provide labels for software components. We present a prototype implementation of our labeling/comparison algorithm and provide examples of its application. In particular, we apply the approach to detect trends in the evolution of a software system.
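A minimal sketch of Dunning-style log-likelihood-ratio term scoring of the kind the abstract describes; the word counts are invented, and ranking only overrepresented terms is an assumption about how labels would be selected:

```python
# Sketch of labeling a component with its most characteristic terms via the
# log-likelihood ratio (G^2) of word frequencies against a reference corpus
# (Dunning-style). The word counts below are invented for illustration.
import math
from collections import Counter

def llr(k1, n1, k2, n2):
    """G^2 for a term seen k1 times among n1 component tokens versus
    k2 times among n2 reference-corpus tokens."""
    def ll(k, n, p):  # binomial log-likelihood of k successes in n trials
        p = min(max(p, 1e-12), 1 - 1e-12)
        return k * math.log(p) + (n - k) * math.log(1 - p)
    p1, p2, p = k1 / n1, k2 / n2, (k1 + k2) / (n1 + n2)
    return 2 * (ll(k1, n1, p1) + ll(k2, n2, p2) - ll(k1, n1, p) - ll(k2, n2, p))

component = Counter({"parser": 40, "token": 35, "file": 20, "the": 60})
corpus    = Counter({"parser": 5, "token": 10, "file": 400, "the": 5000})
n1, n2 = sum(component.values()), sum(corpus.values())

def score(word):
    # G^2 is two-sided, so keep only words overrepresented in the component;
    # underrepresented words (like "the") make poor labels.
    g2 = llr(component[word], n1, corpus[word], n2)
    return g2 if component[word] / n1 > corpus[word] / n2 else -g2

labels = sorted(component, key=score, reverse=True)
print(labels[:3])  # e.g., ['parser', 'token', 'file']
```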
Abstract:
For popular software systems, the number of bug reports submitted daily is high. Triaging these incoming reports is a time-consuming task. Part of bug triage is the assignment of a report to a developer with the appropriate expertise. In this paper, we present an approach to automatically suggest developers who have the appropriate expertise for handling a bug report. We model developer expertise using the vocabulary found in their source code contributions and compare this vocabulary to the vocabulary of bug reports. We evaluate our approach by comparing the suggested experts to the persons who eventually worked on the bug. Using eight years of Eclipse development as a case study, we achieve 33.6% top-1 precision and 71.0% top-10 recall.
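A minimal sketch of the vocabulary-matching idea, assuming term-frequency vectors and cosine similarity (the paper's exact weighting may differ); the developer vocabularies are invented:

```python
# Sketch of vocabulary-based expertise matching: model each developer by the
# words in their code contributions, then rank developers for a bug report
# by similarity. Vocabularies and the TF-cosine setup are illustrative
# assumptions, not the paper's exact configuration.
import math
from collections import Counter

developer_vocab = {
    "alice": Counter({"widget": 30, "layout": 25, "render": 12}),
    "bob":   Counter({"socket": 40, "buffer": 22, "timeout": 15}),
    "carol": Counter({"parser": 28, "token": 30, "grammar": 9}),
}

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def suggest(bug_report: str, k: int = 2):
    words = Counter(bug_report.lower().split())
    ranked = sorted(developer_vocab,
                    key=lambda d: cosine(words, developer_vocab[d]),
                    reverse=True)
    return ranked[:k]  # top-k candidates, evaluated via top-k precision/recall

print(suggest("widget layout breaks when render size changes"))
```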
Abstract:
In conventional software applications, synchronization code is typically interspersed with functional code, thereby impacting the understandability and maintainability of the code base. At the same time, synchronization defined statically in the code cannot adapt to different runtime situations. We propose a new approach to concurrency control that strictly separates the functional code from the synchronization requirements and that dynamically adapts the objects to be synchronized to their environment. First-class synchronization specifications express safety requirements, and a dynamic synchronization system adapts objects to different runtime situations. We present an overview of a prototype of our approach together with several classical concurrency problems, and we discuss open issues for further research.
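A minimal sketch of the separation the abstract proposes, assuming a wrapper that applies a first-class synchronization specification to an otherwise lock-free domain object; the names and mechanism are illustrative, not the paper's actual constructs:

```python
# Sketch of strictly separating synchronization from functional code: the
# domain class contains no locking; a first-class synchronization
# specification is applied from the outside, at run time. Names are
# illustrative, not the paper's actual language constructs.
import threading

class Counter:                      # purely functional code, no locks
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value += 1

class Synchronized:
    """A first-class synchronization specification: wraps an object and
    guards a chosen set of methods with a lock, decided at run time."""
    def __init__(self, target, guarded_methods):
        self._target = target
        self._guarded = set(guarded_methods)
        self._lock = threading.Lock()
    def __getattr__(self, name):
        attr = getattr(self._target, name)
        if name in self._guarded and callable(attr):
            def locked(*args, **kwargs):
                with self._lock:
                    return attr(*args, **kwargs)
            return locked
        return attr

# The same object can run unsynchronized or, in a concurrent environment,
# be wrapped without touching its functional code:
counter = Synchronized(Counter(), guarded_methods={"increment"})
threads = [threading.Thread(target=counter.increment) for _ in range(8)]
for t in threads: t.start()
for t in threads: t.join()
print(counter.value)   # 8
```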
Abstract:
The increasing practice of offshore outsourcing of software maintenance has posed the challenge of effectively transferring knowledge to the individual software engineers of the vendor. In this theoretical paper, we discuss the implications of two learning theories, the model of work-based learning (MWBL) and cognitive load theory (CLT), for knowledge transfer during the transition phase. Taken together, the theories suggest that learning mechanisms need to be aligned with the type of knowledge (tacit versus explicit), task characteristics (complexity and recurrence), and the recipients’ expertise. The MWBL proposes that learning mechanisms need to include conceptual and practical activities based on the relative importance of explicit and tacit knowledge. CLT explains how effective portfolios of learning mechanisms change over time. While job shadowing, completion tasks, and supportive information may prevail at the outset of transition, they may be replaced by work on conventional tasks towards the end of transition.