887 results for Software Process Improvement
Abstract:
In western industrialized countries, breast carcinoma is the most common malignant tumor in women. Worldwide, it accounts for roughly 21% of all cancers in women. By now, one in nine women is at risk of developing breast cancer during her lifetime. The age-standardized mortality rate currently stands at just under 27%.

Breast carcinoma has a relatively low growth rate. A diagnostic procedure with which all breast carcinomas below 10 mm in diameter could be detected and removed would practically eliminate death from breast cancer, since the 20-year survival rate for initial carcinomas of 5 to 10 mm in size is very high, at over 95%.

With contrast-enhanced MRI, a relatively young examination method is available that is sensitive enough to detect carcinomas from a diameter of 3 mm. The diagnostic methodology, however, is complex and error-prone, and it demands a long training period and thus considerable experience on the part of the radiologist.

Computer-aided diagnosis software can raise the quality of such a complex diagnosis, or at least speed up the process. The goal of this work is the development of fully automatic diagnosis software that can be used as a second-opinion system. To my knowledge, no such complete software exists to date.

The software runs a chain of image-processing steps modeled on the radiologist's own procedure and produces an independent diagnosis for every detected lesion. First, as a preprocessing step, a 3D image registration eliminates motion artifacts in order to improve the image quality for the subsequent steps. Every contrast-enhancing object is then detected by a rule-based segmentation with adaptive thresholds. Kinetic and morphological features are computed to describe the contrast-agent uptake as well as the shape, margin, and texture properties of each object. Finally, based on the resulting feature vector, two trained neural networks classify each object either as an additional finding or as a benign or malignant lesion.

The performance of the software was tested on image data from 101 female patients containing 141 histologically confirmed lesions. Predicting whether these lesions were benign or malignant yielded a sensitivity of 88% at a specificity of 72%. These values are similar to the predictions of expert radiologists reported in the literature. The predictions contained on average 2.5 additional malignant findings per patient, which turned out to be misclassified artifacts.
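To make the processing chain above concrete, the following is a minimal sketch of two of its stages: adaptive-threshold detection of enhancing voxels and a simple kinetic feature. The array shapes, threshold rule, and feature names are illustrative assumptions, not the actual implementation described in the thesis.

```python
import numpy as np

def segment_enhancing_objects(pre, post, base_thresh=0.5):
    """Rule-based segmentation sketch: mark voxels whose relative signal
    increase after contrast exceeds an adaptive (image-dependent) threshold."""
    uptake = (post - pre) / np.maximum(pre, 1e-6)         # relative enhancement
    thresh = max(base_thresh, uptake.mean() + 2 * uptake.std())
    return uptake > thresh                                 # boolean lesion mask

def kinetic_features(pre, post_series, mask):
    """Summarize the contrast-agent uptake curve of one segmented object."""
    base = pre[mask].mean()
    curve = [(p[mask].mean() - base) / base for p in post_series]
    return {"initial_enhancement": curve[0],
            "late_change": curve[-1] - curve[0]}           # negative = washout
```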
Abstract:
This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate hundreds of calibratable parameters in a methodical and optimal manner by using model-based optimization in conjunction with the manual process so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the data required for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multicylinder diesel engine have been examined from a model training perspective. A single-cylinder engine with external air handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and differences in the non-parametric space, primarily driven by a large difference between exhaust and intake manifold pressures (engine ΔP) during transients, it has been recommended that transient emission models should be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations are made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed. The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described. The first mode is driven by high engine ΔP and low fresh-air flow rates, while the second mode is driven by high engine ΔP and high EGR flow rates. The EGR fraction is inaccurately estimated in both modes, while uneven EGR distribution across cylinders has been shown to be present but unaccounted for by the ECM. The two modes and associated phenomena are essential to understanding why transient emission models are calibration dependent and, furthermore, how to choose training data that will result in good model generalization.
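As an illustration of the data-processing step mentioned above (correcting for transport delays and sensor lags), the sketch below time-shifts a measured trace and inverts a first-order sensor lag. The delay value, time constant, and first-order lag model are assumptions for illustration; the paper's actual methods may differ.

```python
import numpy as np

def align_and_compensate(t, y_meas, transport_delay, sensor_tau):
    """Advance a measured transient trace by a transport delay and undo a
    first-order sensor lag via y ≈ y_meas + tau * dy_meas/dt."""
    dt = float(np.mean(np.diff(t)))
    shift = int(round(transport_delay / dt))
    if shift > 0:
        # shift the trace earlier and hold the last value to keep the length
        y = np.concatenate([y_meas[shift:], np.full(shift, y_meas[-1])])
    else:
        y = np.asarray(y_meas, dtype=float).copy()
    dydt = np.gradient(y, dt)
    return y + sensor_tau * dydt

# Example: 10 Hz opacity trace, 0.3 s transport delay, 0.2 s sensor time constant
t = np.arange(0, 5, 0.1)
opacity = np.clip(np.sin(t), 0, None)
corrected = align_and_compensate(t, opacity, transport_delay=0.3, sensor_tau=0.2)
```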
Abstract:
Software repositories have been getting a lot of attention from researchers in recent years. In order to analyze software repositories, it is necessary to first extract raw data from the version control and problem tracking systems. This poses two challenges: (1) extraction requires a non-trivial effort, and (2) the results depend on the heuristics used during extraction. These challenges burden researchers who are new to the community and make it difficult to benchmark software repository mining, since it is almost impossible to reproduce experiments done by another team. In this paper we present the TA-RE corpus. TA-RE collects extracted data from software repositories in order to build a collection of projects that will simplify the extraction process. Additionally, the collection can be used for benchmarking. As a first step, we propose an exchange language capable of making sharing and reusing data as simple as possible.
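The abstract does not spell out the proposed exchange language, so the snippet below only illustrates the general idea of a shared, tool-neutral record for extracted repository data. All field names and values are hypothetical and do not represent the actual TA-RE schema.

```python
import json

# Hypothetical record for one extracted commit; the field names are illustrative.
record = {
    "project": "example-project",
    "commit": {
        "id": "a1b2c3",
        "author": "alice",
        "date": "2006-03-14T09:26:53Z",
        "message": "Fix null check in parser",
        "files": ["src/parser/Parser.java"],
    },
    "linked_issues": ["BUG-42"],                  # link to the problem-tracking system
    "extraction_heuristic": "message-id-regex",   # make the extraction heuristic explicit
}
print(json.dumps(record, indent=2))
```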
Abstract:
OBJECTIVE: To describe outcome after an alternative unilateral approach to the thoracolumbar spine for dorsal laminectomy. STUDY DESIGN: Retrospective clinical study. ANIMALS: Dogs (n=14) with thoracolumbar spinal cord compression. METHODS: Thoracolumbar spinal cord compression was lateral (6 dogs), dorsal (4), or dorsolateral (4), caused by subarachnoid cysts (7), synovial cysts (2), and intradural-extramedullary neoplasia (5). All dogs were treated by dorsal laminectomy with osteotomy of the spinous process using a unilateral paramedian approach. The contralateral paraspinal muscles were not stripped from the spinous process, and the osteoligamentous complexes were preserved. Retraction of the spinous process and muscles to the contralateral side resulted in complete visualization of the dorsal vertebral arch, thereby allowing dorsal laminectomy to be performed. RESULTS: No technique-related complications occurred. Approximately 75% exposure of the spinal cord (dorsal and lateral compartments) was achieved, providing adequate visualization and treatment of the lesions. Transient deterioration of neurologic status occurred in 5 dogs because of extensive spinal cord manipulation. At long-term follow-up, 6 dogs were normal, 6 had clinical improvement, and 2 were unchanged. CONCLUSION: Dorsal laminectomy after osteotomy and retraction of the spinous process may be considered in canine patients with dorsal, dorsolateral, or lateral compression to facilitate adequate decompression of the spinal cord. CLINICAL SIGNIFICANCE: This surgical technique offers an alternative approach to the thoracolumbar spine and spinal cord by a modified dorsal laminectomy that preserves paraspinal muscle integrity on the contralateral side.
Abstract:
The Modeling method of teaching has demonstrated well-documented success in the improvement of student learning. The teacher/researcher in this study was introduced to Modeling through the use of a technique called White Boarding. Without formal training, the researcher began using the White Boarding technique for a limited number of laboratory experiences with his high school physics classes. The question that arose and was investigated in this study is “What specific aspects of the White Boarding process support student understanding?” For the purposes of this study, the White Boarding process was broken down into three aspects – the Analysis of data through the use of Logger Pro software, the Preparation of White Boards, and the Presentations each group gave about their specific lab data. The lab used in this study, an Acceleration of Gravity Lab, was chosen because of the documented difficulties students experience in the graphing of motion. In the lab, students filmed a given motion, utilized Logger Pro software to analyze the motion, prepared a White Board that described the motion with position-time and velocity-time graphs, and then presented their findings to the rest of the class. The Presentation included a class discussion with minimal contribution from the teacher. The three different aspects of the White Boarding experience – Analysis, Preparation, and Presentation – were compared through the use of student learning logs, video analysis of the Presentations, and follow-up interviews with participants. The information and observations gathered were used to determine the level of understanding of each participant during each phase of the lab. The researcher then looked for improvement in the level of student understanding, the number of “aha” moments students had, and the students’ perceptions about which phase was most important to their learning. The results suggest that while all three phases of the White Boarding experience play a part in the learning process for students, the Presentations provided the most significant changes. The implications for instruction are discussed.
Abstract:
Waste effluents from the forest products industry are sources of lignocellulosic biomass that can be converted to ethanol by yeast after pretreatment. However, the challenge of improving ethanol yields from a mixed pentose and hexose fermentation of a potentially inhibitory hydrolysate still remains. Hardboard manufacturing process wastewater (HPW) was evaluated as a potential feedstream for lignocellulosic ethanol production by native xylose-fermenting yeast. After screening of xylose-fermenting yeasts, Scheffersomyces stipitis CBS 6054 was selected as the ideal organism for conversion of the HPW hydrolysate material. The individual and synergistic effects of inhibitory compounds present in the hydrolysate were evaluated using response surface methodology. It was concluded that organic acids have an additive negative effect on fermentations. Fermentation conditions were also optimized in terms of aeration and pH. Methods for improving productivity and achieving higher ethanol yields were investigated, using adaptation to the conditions present in the hydrolysate through repeated cell sub-culturing. The objectives of the present study were to adapt S. stipitis CBS 6054 to a dilute-acid pretreated, lignocellulose-containing waste stream; compare the physiological, metabolic, and proteomic profiles of the adapted strain to those of its parent; quantify changes in protein expression/regulation, metabolite abundance, and enzyme activity; and determine the biochemical and molecular mechanism of adaptation. The adapted culture showed improvement in both substrate utilization and ethanol yields compared to the unadapted parent strain. The adapted strain also exhibited a distinct growth phenotype compared to its unadapted parent, based on its physiological and proteomic profiles. Several potential targets that could be responsible for strain improvement were identified. These targets could have implications for metabolic engineering of strains for improved ethanol production from lignocellulosic feedstocks. Although this work focuses specifically on the conversion of HPW to ethanol, the methods developed can be used for any feedstock/product system that employs a microbial conversion step. The benefit of this research is that the organism will be optimized for a company's specific system.
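As a generic illustration of the response surface methodology mentioned above, the sketch below fits a second-order model of ethanol yield to two inhibitor concentrations; the interaction term then indicates whether effects are additive or synergistic. The factors, levels, and responses are made-up numbers, not data from this study.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Illustrative design: acetic acid [g/L], furfural [g/L] -> ethanol yield [g/g]
X = np.array([[0, 0], [5, 0], [0, 1], [5, 1], [2.5, 0.5],
              [2.5, 0], [2.5, 1], [0, 0.5], [5, 0.5]])
y = np.array([0.42, 0.35, 0.38, 0.27, 0.34, 0.39, 0.31, 0.40, 0.30])

quad = PolynomialFeatures(degree=2, include_bias=False)   # linear, quadratic, interaction terms
model = LinearRegression().fit(quad.fit_transform(X), y)

# A clearly negative coefficient on the interaction term (acetic*furfural) would
# suggest a synergistic inhibitory effect; purely additive effects show up in
# the linear terms only.
print(dict(zip(quad.get_feature_names_out(["acetic", "furfural"]), model.coef_)))
```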
Abstract:
Many schools do not begin to introduce college students to software engineering until they have had at least one semester of programming. Since software engineering is a large, complex, and abstract subject, it is difficult to construct active learning exercises that build on the students’ elementary knowledge of programming and still teach basic software engineering principles. It is also the case that beginning students typically know how to construct small programs, but they have little experience with the techniques necessary to produce reliable and long-term maintainable modules. I have addressed these two concerns by defining a local standard (Montana Tech Method (MTM) Software Development Standard for Small Modules Template) that step-by-step directs students toward the construction of highly reliable small modules using well-known, best-practice software engineering techniques. “Small module” is here defined as a coherent development task that can be unit tested and carried out by a single software engineer (or a pair) in at most a few weeks. The standard describes the process to be used and also provides a template for the top-level documentation. The sequence of mini-lectures and exercises associated with the use of this (and other) local standards is used throughout the course, which perforce covers more abstract software engineering material using traditional reading and writing assignments. The sequence of mini-lectures and hands-on assignments (many of which are done in small groups) constitutes an instructional module that can be used in any similar software engineering course.
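The MTM standard itself is not reproduced in the abstract; the fragment below is only a generic example of the kind of small, unit-tested module the text describes, with a top-level docstring standing in for the top-level documentation template.

```python
"""Small module example: integer range parsing (illustrative, not the MTM template).

Responsibility: convert strings such as "3-7" into inclusive integer ranges.
Error handling: raises ValueError on malformed input.
"""
import unittest

def parse_range(text: str) -> range:
    """Parse "a-b" into range(a, b + 1); raise ValueError if malformed."""
    try:
        lo, hi = (int(part) for part in text.split("-"))
    except Exception as exc:
        raise ValueError(f"not a range: {text!r}") from exc
    if lo > hi:
        raise ValueError(f"empty range: {text!r}")
    return range(lo, hi + 1)

class ParseRangeTest(unittest.TestCase):
    def test_simple_range(self):
        self.assertEqual(list(parse_range("3-7")), [3, 4, 5, 6, 7])

    def test_malformed_input_rejected(self):
        with self.assertRaises(ValueError):
            parse_range("seven")

if __name__ == "__main__":
    unittest.main()
```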
Abstract:
Software metrics offer us the promise of distilling useful information from vast amounts of software in order to track development progress, to gain insights into the nature of the software, and to identify potential problems. Unfortunately, however, many software metrics exhibit highly skewed, non-Gaussian distributions. As a consequence, usual ways of interpreting these metrics --- for example, in terms of "average" values --- can be highly misleading. Many metrics, it turns out, are distributed like wealth --- with high concentrations of values in selected locations. We propose to analyze software metrics using the Gini coefficient, a higher-order statistic widely used in economics to study the distribution of wealth. Our approach allows us not only to observe changes in software systems efficiently, but also to assess project risks and monitor the development process itself. We apply the Gini coefficient to numerous metrics over a range of software projects, and we show that many metrics not only display remarkably high Gini values, but that these values are remarkably consistent as a project evolves over time.
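For concreteness, here is a minimal computation of the Gini coefficient over one software metric; the metric chosen (lines of code per class) and the sample values are illustrative assumptions, not data from the paper.

```python
import numpy as np

def gini(values):
    """Gini coefficient of a non-negative metric distribution.
    0 = perfectly even; values near 1 = highly concentrated."""
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    if n == 0 or x.sum() == 0:
        return 0.0
    ranks = np.arange(1, n + 1)
    return float((2 * ranks - n - 1).dot(x) / (n * x.sum()))

# e.g. lines of code per class in a hypothetical system snapshot
print(gini([12, 30, 45, 60, 900]))   # high value: size concentrated in one class
```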
Abstract:
The central question for this paper is how to improve the production process by closing the gap between industrial designers and software engineers of television (TV)-based User Interfaces (UI) in an industrial environment. Software engineers are highly interested in whether one UI design can be converted into several fully functional UIs for TV products with different screen properties. The aim of the software engineers is to apply automatic layout and scaling in order to speed up and improve the production process. However, the question is whether a UI design lends itself to such automatic layout and scaling. This is investigated by analysing a prototype UI design done by industrial designers. In a first requirements study, industrial designers had created meta-annotations on top of their UI design in order to disclose their design rationale for discussions with software engineers. In a second study, five (out of ten) industrial designers assessed the potential of four different meta-annotation approaches. The question was which annotation method industrial designers would prefer and whether it could satisfy the technical requirements of the software engineering process. One main result is that the industrial designers preferred the method they were already familiar with, which therefore seems to be the most effective one, although the main objective of automatic layout and scaling could still not be achieved.
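The automatic layout and scaling the engineers aim for can be illustrated with a minimal sketch that maps a layout specified in design-resolution coordinates onto screens with different resolutions. The widget list, design resolution, and uniform scaling rule are illustrative assumptions, not the production pipeline studied in the paper.

```python
# Minimal scaling sketch: map a UI layout from its design resolution onto
# TV screens with different resolutions.
DESIGN_W, DESIGN_H = 1280, 720

widgets = [
    {"name": "menu_bar",  "x": 0,  "y": 620, "w": 1280, "h": 100},
    {"name": "thumbnail", "x": 80, "y": 80,  "w": 320,  "h": 180},
]

def scale_layout(widgets, screen_w, screen_h):
    sx, sy = screen_w / DESIGN_W, screen_h / DESIGN_H
    return [{"name": w["name"],
             "x": round(w["x"] * sx), "y": round(w["y"] * sy),
             "w": round(w["w"] * sx), "h": round(w["h"] * sy)}
            for w in widgets]

print(scale_layout(widgets, 1920, 1080))   # full-HD panel
print(scale_layout(widgets, 960, 540))     # smaller panel, same aspect ratio
```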
Abstract:
The article introduces the E-learning Circle, a tool developed to assure the quality of the software design process of e-learning systems, considering pedagogical principles as well as technology. The E-learning Circle consists of a number of concentric circles which are divided into three sectors. The content of the inner circles is based on pedagogical principles, while the outer circle specifies how the pedagogical principles may be implemented with technology. The circle’s centre is dedicated to the subject taught, ensuring focus on the specific subject’s properties. The three sectors represent the student, the teacher and the learning objectives. The strengths of the E-learning Circle are the compact presentation combined with the overview it provides, as well as the usefulness of a design tool dealing with complexity, providing a common language and embedding best practice. The E-learning Circle is not a prescriptive method, but is useful in several design models and processes. The article presents two projects where the E-learning Circle was used as a design tool.
Abstract:
This paper provides insight into the development of a process model for the necessary expansion of an automated miniload warehouse. The model is based on literature research and covers four phases of a warehouse expansion: the preparatory phase, the current-state analysis, the design phase, and the decision-making phase. In addition to the literature research, the presented model is based on a reliable data set and can be applied with reasonable effort to ensure an informed decision on the warehouse layout. The model is addressed to users who are typically employees of a logistics department, and is oriented toward improving day-to-day operations in combination with warehouse expansion planning.
Abstract:
In the context of expensive numerical experiments, a promising solution for alleviating the computational costs consists of using partially converged simulations instead of exact solutions. The gain in computational time is at the price of precision in the response. This work addresses the issue of fitting a Gaussian process model to partially converged simulation data for further use in prediction. The main challenge consists of the adequate approximation of the error due to partial convergence, which is correlated in both design variables and time directions. Here, we propose fitting a Gaussian process in the joint space of design parameters and computational time. The model is constructed by building a nonstationary covariance kernel that reflects accurately the actual structure of the error. Practical solutions are proposed for solving parameter estimation issues associated with the proposed model. The method is applied to a computational fluid dynamics test case and shows significant improvement in prediction compared to a classical kriging model.
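The idea of fitting a Gaussian process over the joint space of design variables and computational time can be sketched with a standard library as below. Note that this toy example uses a stationary anisotropic kernel, whereas the paper constructs a nonstationary covariance tailored to the partial-convergence error; all data here are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Toy data: design variable x, convergence level t (fraction of full convergence),
# partially converged response y whose noise shrinks as t -> 1.
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, size=40)
t = rng.uniform(0.1, 1.0, size=40)
y = np.sin(2 * np.pi * x) + 0.3 * (1 - t) * rng.standard_normal(40)

X = np.column_stack([x, t])                       # joint (design, time) input space
kernel = RBF(length_scale=[0.2, 0.5]) + WhiteKernel(noise_level=1e-3)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Predict the fully converged response by querying the model at t = 1
x_new = np.linspace(0, 1, 5)
pred, std = gp.predict(np.column_stack([x_new, np.ones_like(x_new)]),
                       return_std=True)
```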
Abstract:
An Advanced Planning System (APS) offers support at all planning levels along the supply chain while observing limited resources. We consider an APS for process industries (e.g. chemical and pharmaceutical industries) consisting of the modules network design (for long-term decisions), supply network planning (for medium-term decisions), and detailed production scheduling (for short-term decisions). For each module, we outline the decision problem, discuss the specifics of process industries, and review state-of-the-art solution approaches. For the module detailed production scheduling, a new solution approach is proposed in the case of batch production, which can solve much larger practical problems than the methods known thus far. The new approach decomposes detailed production scheduling for batch production into batching and batch scheduling. The batching problem converts the primary requirements for products into individual batches, where the workload is to be minimized. We formulate the batching problem as a nonlinear mixed-integer program and transform it into a linear mixed-binary program of moderate size, which can be solved by standard software. The batch scheduling problem allocates the batches to scarce resources such as processing units, workers, and intermediate storage facilities, where some regular objective function like the makespan is to be minimized. The batch scheduling problem is modelled as a resource-constrained project scheduling problem, which can be solved by an efficient truncated branch-and-bound algorithm developed recently. The performance of the new solution procedures for batching and batch scheduling is demonstrated by solving several instances of a case study from process industries.
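As a deliberately simplified illustration of the batching step (converting primary requirements into batches while minimizing workload), the sketch below solves a toy mixed-integer program with the PuLP library. The requirements, batch-size bounds, and workload measure are assumptions and do not reproduce the paper's formulation.

```python
import pulp

requirements = {"A": 530.0, "B": 210.0}     # primary requirements [kg], illustrative
smax = 100.0                                # maximum feasible batch size [kg]

prob = pulp.LpProblem("batching", pulp.LpMinimize)
n = {p: pulp.LpVariable(f"n_{p}", lowBound=0, cat="Integer")
     for p in requirements}

# Workload proxy: total number of batches across all products
prob += pulp.lpSum(n.values())

# Enough batch capacity to cover the primary requirement of each product
for p, req in requirements.items():
    prob += n[p] * smax >= req

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for p in requirements:
    print(p, int(n[p].value()), "batches")
```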
Abstract:
Global environmental change includes changes in a wide range of global-scale phenomena, which are expected to affect a number of physical processes as well as the vulnerability of the communities that will experience their impact. Decision-makers need tools that enable them to assess the losses caused by such processes under different future scenarios and to design risk reduction strategies. In this paper, a tool is presented that can be used by a range of end-users (e.g. local authorities and other decision-makers) for the assessment of the monetary loss from future landslide events, with a particular focus on torrential processes. The toolbox includes three functions: a) enhancement of the post-event damage data collection process, b) assessment of the monetary loss of future events, and c) continuous updating and improvement of an existing vulnerability curve by adding data from recent events. All functions of the tool are demonstrated through examples of its application.
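A minimal sketch of function b), the monetary-loss assessment, is given below: the expected loss per exposed building is its value times a damage ratio interpolated from a vulnerability curve at the local process intensity. The curve shape, intensity measure, and building values are illustrative assumptions, not the tool's actual data.

```python
import numpy as np

def vulnerability(intensity, curve_intensities, curve_damage_ratios):
    """Interpolate a vulnerability curve: process intensity -> damage ratio in [0, 1]."""
    return np.interp(intensity, curve_intensities, curve_damage_ratios)

def monetary_loss(buildings, curve_intensities, curve_damage_ratios):
    """Sum expected loss over exposed buildings for one event scenario."""
    total = 0.0
    for value, intensity in buildings:      # (reconstruction value, local intensity)
        total += value * vulnerability(intensity, curve_intensities, curve_damage_ratios)
    return total

# Illustrative curve: damage ratio vs. debris-flow deposit depth [m]
depths = [0.0, 0.5, 1.0, 2.0, 3.0]
ratios = [0.0, 0.1, 0.35, 0.7, 1.0]
print(monetary_loss([(250_000, 0.8), (400_000, 1.6)], depths, ratios))
```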