Abstract:
Quantitative Risk Analysis (QRA) is a valuable tool for determining the risk associated with an industrial installation and for the subsequent implementation of emergency plans. However, its application to layout design requires a criterion capable of assessing which arrangement is optimal in order to minimize risk. The many existing procedures, although effective, are rather laborious and time-consuming. This work therefore proposes a simple and objective criterion for comparing the results of QRAs applied to different designs. By evaluating the area enclosed by the iso-risk curves, the existing methodologies for studying the domino effect are first compared; an integrated Quantitative Risk Domino Assessment procedure is then applied to the case of pressurized storage tanks. The results clearly show that the risk of an industrial activity can be reduced considerably by acting on the arrangement of the equipment, with the aim of limiting the effects of possible accident scenarios.
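As an illustration of the comparison criterion described above, the following minimal Python sketch estimates the area enclosed by an iso-risk curve from a gridded individual-risk field and compares two hypothetical layouts; the grid-based approximation, the function names, and the risk values are assumptions made for illustration, not the procedure used in the thesis.

```python
import numpy as np

def iso_risk_area(risk_grid, cell_size, threshold):
    """Approximate the area enclosed by an iso-risk curve.

    risk_grid : 2D array of individual risk values (1/year) on a regular grid
    cell_size : edge length of one grid cell (m)
    threshold : iso-risk level defining the curve, e.g. 1e-6 per year

    The enclosed area is estimated as the number of cells whose risk meets or
    exceeds the threshold, multiplied by the area of one cell.
    """
    return np.count_nonzero(risk_grid >= threshold) * cell_size ** 2

# Example: compare two hypothetical layouts on a 200 m x 200 m domain (2 m cells)
layout_a = np.random.default_rng(0).random((100, 100)) * 1e-5
layout_b = layout_a * 0.6   # e.g. larger equipment spacing lowers local risk
for name, grid in [("A", layout_a), ("B", layout_b)]:
    print(name, iso_risk_area(grid, cell_size=2.0, threshold=1e-6), "m^2")
```

Under this criterion, the layout with the smaller enclosed area at a given iso-risk level would be preferred.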
Abstract:
In many branches of industry, for example the automotive industry, digital mock-ups are used to check the design and function of a product on a virtual prototype. One use case is checking the safety clearances of individual components, the so-called clearance analysis. Engineers determine whether specific components, both at rest and during a motion, maintain a prescribed safety clearance to the surrounding components. If components fall below the safety clearance, their shape or position must be changed. For this it is important to know exactly which regions of the components violate the safety clearance.

In this thesis we present a solution for computing, in real time, all regions between two geometric objects that fall below the safety clearance. Each object is given as a set of primitives (e.g. triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives that fall below the safety clearance and call this the set of all tolerance-violating primitives. We present a complete solution that can be divided into the following three major topics.

In the first part of this thesis we study algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that dedicated tolerance tests are considerably faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach consistently proves to be the fastest.

The second part of this thesis deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to account for the required safety clearance in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used before. We call this data structure Shrubs. Previous approaches to memory optimization of uniform grids rely mainly on hashing methods, which do not reduce the memory consumption of the cell contents. In our use case, neighbouring cells often have similar contents. Our approach exploits this redundancy to compress the cell contents of a uniform grid losslessly to one fifth of their previous size and to decompress them at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we show applications to various path-planning problems.
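To make the broad-phase idea concrete, here is a minimal Python sketch of a uniform-grid candidate filter that collects triangle pairs whose clearance-inflated bounding boxes share a grid cell. It is only a conservative pre-filter under assumed inputs, not the thesis's combined data structure, its dual-space tolerance test, or the Shrubs compression; each surviving pair would still be passed to an exact triangle-triangle tolerance test.

```python
from collections import defaultdict
from itertools import product

def aabb(tri):
    """Axis-aligned bounding box of a triangle given as three (x, y, z) vertices."""
    xs, ys, zs = zip(*tri)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def grid_cells(lo, hi, cell, clearance):
    """All grid cell indices overlapped by a box inflated by the safety clearance."""
    ranges = [range(int((lo[k] - clearance) // cell),
                    int((hi[k] + clearance) // cell) + 1) for k in range(3)]
    return product(*ranges)

def candidate_pairs(tris_a, tris_b, cell, clearance):
    """Conservative broad phase: pairs whose clearance-inflated boxes share a grid cell."""
    grid = defaultdict(list)
    for i, tri in enumerate(tris_a):
        lo, hi = aabb(tri)
        for c in grid_cells(lo, hi, cell, clearance):
            grid[c].append(i)
    pairs = set()
    for j, tri in enumerate(tris_b):
        lo, hi = aabb(tri)
        for c in grid_cells(lo, hi, cell, 0.0):
            for i in grid[c]:
                pairs.add((i, j))
    return pairs

# Two parallel triangles 0.3 apart are reported as a candidate for clearance 0.5.
A = [((0, 0, 0), (1, 0, 0), (0, 1, 0))]
B = [((0, 0, 0.3), (1, 0, 0.3), (0, 1, 0.3))]
print(candidate_pairs(A, B, cell=1.0, clearance=0.5))   # {(0, 0)}
```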
Abstract:
This thesis presents a bioreactor able to maintain over time the biological conditions that maximize the molecular-evolution cycles of phage cloning vectors, either lytic (T7) or lysogenic (M13). Concepts related to quasispecies theory and to the relationship between self-replication errors and natural or artificial selective pressures on virus populations are introduced: the natural model of the evolutionary system. Maintaining virus populations, however, means providing them with a substrate on which to replicate. To this end, other research groups have already developed complex and expensive prototype machines for the continuous growth of bacterial populations: the compartments of the evolutionary systems. The bioreactor developed in this work is part of the European project Evoprog: general purpose programmable machine evolution on a chip (Jaramillo's Lab, University of Warwick), which, using existing phage technologies and synthetic regulation, will be able to deliver biocomputational functionality two orders of magnitude faster than conventional techniques while reducing overall costs. The first prototype consists of one or more fermenters, in which the bacterial culture is grown under optimized continuous-cultivation conditions, and a cellstat, a separate volume in which only virus replication takes place. Both volumes are of a few millilitres and are suitably interconnected to allow a form of continuous screening of the biomolecules produced at the outlet. The final part presents the results of the preliminary experiments, demonstrating the reliability of the prototype built and of the protocols followed for the sterilization and assembly of the bioreactor. The experiments carried out demonstrate the success of two continuous viral cultivations and of one in vivo recombination of engineered lytic or lysogenic bacteriophages. The thesis concludes by assessing the future developments and limits of the system, considering in particular some applications aimed at studies of phage therapy.
Abstract:
Randomization is a key step in reducing selection bias during the treatment allocation phase of randomized clinical trials. The process of randomization follows specific steps, which include generation of the randomization list, allocation concealment, and implementation of randomization. In the dental and orthodontic literature treatment allocation is frequently characterized as random; however, the randomization procedures followed are often not appropriate. Randomization methods assign treatment to the trial arms at random, without foreknowledge of allocation by either the participants or the investigators, thus reducing selection bias. Randomization entails generation of the random allocation sequence, allocation concealment, and the actual methodology of implementing treatment allocation randomly and unpredictably. The most popular randomization methods include some form of restricted and/or stratified randomization. This article introduces the reasons that make randomization an integral part of solid clinical trial methodology and presents the main randomization schemes applicable to clinical trials in orthodontics.
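As an illustration of restricted randomization, the sketch below generates a permuted-block allocation list in Python; the block size, arm labels, and function names are illustrative assumptions, not a scheme prescribed by the article.

```python
import random

def block_randomization(n_participants, block_size=4, arms=("A", "B"), seed=2024):
    """Generate a restricted (permuted-block) randomization list.

    Within every block each arm appears equally often, so group sizes stay
    balanced throughout recruitment while the allocation order remains
    unpredictable to investigators and participants.
    """
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_participants]

print(block_randomization(10))   # 10 allocations, balanced within every block of 4
```

In practice the generated list would be held centrally (allocation concealment), so the next assignment is revealed only after a participant is enrolled.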
Abstract:
The Simulation Automation Framework for Experiments (SAFE) is a project created to raise the level of abstraction in network simulation tools and thereby address issues that undermine credibility. SAFE incorporates best practices in network simulationto automate the experimental process and to guide users in the development of sound scientific studies using the popular ns-3 network simulator. My contributions to the SAFE project: the design of two XML-based languages called NEDL (ns-3 Experiment Description Language) and NSTL (ns-3 Script Templating Language), which facilitate the description of experiments and network simulationmodels, respectively. The languages provide a foundation for the construction of better interfaces between the user and the ns-3 simulator. They also provide input to a mechanism which automates the execution of network simulation experiments. Additionally,this thesis demonstrates that one can develop tools to generate ns-3 scripts in Python or C++ automatically from NSTL model descriptions.
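To illustrate the kind of template-driven generation the thesis describes, the following Python sketch turns a tiny, hypothetical XML model description into an ns-3-style Python script. The XML schema shown is a simplification invented for this example and is not the actual NSTL grammar; the emitted code targets the classic ns-3 Python bindings.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified model description; the real NSTL schema is richer.
MODEL = """
<model name="two_node_p2p">
  <node id="n0"/>
  <node id="n1"/>
  <link a="n0" b="n1" dataRate="5Mbps" delay="2ms"/>
</model>
"""

SCRIPT_TEMPLATE = """import ns.core
import ns.network
import ns.point_to_point

nodes = ns.network.NodeContainer()
nodes.Create({n_nodes})
p2p = ns.point_to_point.PointToPointHelper()
p2p.SetDeviceAttribute("DataRate", ns.core.StringValue("{rate}"))
p2p.SetChannelAttribute("Delay", ns.core.StringValue("{delay}"))
devices = p2p.Install(nodes)
ns.core.Simulator.Run()
ns.core.Simulator.Destroy()
"""

def generate_script(xml_text):
    """Fill the script template from the model description."""
    root = ET.fromstring(xml_text)
    link = root.find("link")
    return SCRIPT_TEMPLATE.format(
        n_nodes=len(root.findall("node")),
        rate=link.get("dataRate"),
        delay=link.get("delay"),
    )

print(generate_script(MODEL))
```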
Abstract:
Development of novel implants in orthopaedic trauma surgery is based on limited datasets of cadaver trials or artificial bone models. A method has been developed whereby implants can be constructed in an evidence-based manner founded on a large anatomical database consisting of more than 2,000 datasets of bones extracted from CT scans. The aim of this study was the development and clinical application of an anatomically pre-contoured plate for the treatment of distal fibular fractures based on the anatomical database. 48 Caucasian and Asian bone models (left and right) from the database were used for the preliminary optimization process and validation of the fibula plate. The implant was constructed to fit bilaterally in a lateral position on the fibula. A biomechanical comparison of the designed implant to the current gold standard in the treatment of distal fibular fractures (locking 1/3 tubular plate) was then conducted. Finally, a clinical surveillance study was performed to evaluate the grade of implant fit achieved. The results showed that with a virtual anatomical database it was possible to design a fibula plate with an optimized fit for a large proportion of the population. Biomechanical testing showed the novel fibula plate to be superior to 1/3 tubular plates in 4-point bending tests. The clinical application showed a very high degree of primary implant fit; only in a small minority of cases was further intra-operative implant bending necessary. Therefore, the goal of developing an implant for the treatment of distal fibular fractures based on the evidence of a large anatomical database was attained. Biomechanical testing showed good results regarding stability, and the clinical application confirmed the high grade of anatomical fit.
Abstract:
A genomic biomarker identifying patients likely to benefit from drotrecogin alfa (activated) (DAA) may be clinically useful as a companion diagnostic. This trial was designed to validate such biomarkers, termed improved response polymorphisms (IRPs). Each IRP (A and B) contains two single nucleotide polymorphisms that were associated with a differential DAA treatment effect.
Abstract:
The new knowledge environments of the digital age are often described as places where we are all closely read, with our buying habits, location, and identities available to advertisers, online merchants, the government, and others through our use of the Internet. This is represented as a loss of privacy in which these entities learn about our activities and desires, using means that were unavailable in the pre-digital era. This article argues that the reciprocal nature of digital networks means 1) that the privacy issues we face online are not radically different from those of the pre-Internet era, and 2) that we need to reconceive close reading as an activity of which both humans and computer algorithms are capable.
Abstract:
OBJECTIVE: To describe outcome after an alternative unilateral approach to the thoracolumbar spine for dorsal laminectomy. STUDY DESIGN: Retrospective clinical study. ANIMALS: Dogs (n=14) with thoracolumbar spinal cord compression. METHODS: Thoracolumbar spinal cord compression was lateral (6 dogs), dorsal (4), or dorsolateral (4), caused by subarachnoid cysts (7), synovial cysts (2), and intradural-extramedullary neoplasia (5). All dogs were treated by dorsal laminectomy with osteotomy of the spinous process using a unilateral paramedian approach. The contralateral paraspinal muscles were not stripped from the spinous process and the osteoligamentous complexes were preserved. Retraction of the spinous process and muscles to the contralateral side resulted in complete visualization of the dorsal vertebral arch, thereby allowing dorsal laminectomy to be performed. RESULTS: No technique-related complications occurred. Approximately 75% exposure of the spinal cord (dorsal and lateral compartments) was achieved, providing adequate visualization and treatment of the lesions. Transient deterioration of neurologic status occurred in 5 dogs because of extensive spinal cord manipulation. At long-term follow-up, 6 dogs were normal, 6 had clinical improvement, and 2 were unchanged. CONCLUSION: Dorsal laminectomy after osteotomy and retraction of the spinous process may be considered in canine patients with dorsal, dorsolateral, or lateral compression to facilitate adequate decompression of the spinal cord. CLINICAL SIGNIFICANCE: This surgical technique offers an alternative approach to the thoracolumbar spine and spinal cord by a modified dorsal laminectomy that preserves paraspinal muscle integrity on the contralateral side.
Abstract:
BACKGROUND AND OBJECTIVES: There are no widely accepted criteria for the definition of hematopoietic stem cell transplant-associated microangiopathy (TAM). An International Working Group was formed to develop a consensus formulation of criteria for diagnosing clinically significant TAM. DESIGN AND METHODS: The participants proposed a list of candidate criteria, selected those considered necessary, and ranked those considered optional to identify a core set of criteria. Three obligatory criteria and the four optional criteria that ranked highest formed a core set. In an appropriateness panel process, the participants scored the diagnosis of 16 patient profiles as appropriate or not appropriate for TAM. Using the experts' ratings on the patient profiles as a gold standard, the sensitivity and specificity of 24 candidate definitions of the disorder developed from the core set of criteria were evaluated. A nominal group technique was used to facilitate consensus formation. The definition of TAM with the highest score formed the final proposal. RESULTS: The Working Group proposes that the diagnosis of TAM requires fulfilment of all of the following criteria: (i) >4% schistocytes in blood; (ii) de novo, prolonged or progressive thrombocytopenia (platelet count <50 × 10^9/L or a 50% or greater reduction from previous counts); (iii) sudden and persistent increase in lactate dehydrogenase concentration; (iv) decrease in hemoglobin concentration or increased transfusion requirement; and (v) decrease in serum haptoglobin. The sensitivity and specificity of this definition exceed 80%. INTERPRETATION AND CONCLUSIONS: The Working Group recommends that the presented criteria for TAM be adopted in clinical use, especially in scientific trials.
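For illustration only, the consensus criteria listed above can be transcribed into a simple checking function. The sketch below is a literal Python rendering of the five criteria with thresholds taken from the abstract; all function and parameter names are hypothetical.

```python
def meets_tam_criteria(schistocyte_pct, platelets_now, platelets_baseline,
                       ldh_rising, hemoglobin_falling_or_more_transfusions,
                       haptoglobin_decreased):
    """Check the proposed consensus definition of transplant-associated
    microangiopathy (TAM): all five criteria must be fulfilled."""
    thrombocytopenia = (platelets_now < 50e9                          # < 50 x 10^9/L
                        or platelets_now <= 0.5 * platelets_baseline)  # >= 50% reduction
    return all([
        schistocyte_pct > 4,                         # (i)   > 4% schistocytes in blood
        thrombocytopenia,                            # (ii)  de novo/progressive thrombocytopenia
        ldh_rising,                                  # (iii) sudden, persistent LDH increase
        hemoglobin_falling_or_more_transfusions,     # (iv)  falling Hb or more transfusions
        haptoglobin_decreased,                       # (v)   decreased serum haptoglobin
    ])

# Example patient profile (counts expressed per litre)
print(meets_tam_criteria(5.2, 40e9, 180e9, True, True, True))  # True
```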
Abstract:
BACKGROUND: Based on a subgroup analysis of 18-month BAsel Stent Kosten Effektivitäts Trial (BASKET) outcome data, we hypothesized that very late (>12 months) stent thrombosis occurs predominantly after drug-eluting stent implantation in large native coronary vessel stenting. METHODS: To prove or refute this hypothesis, we set up an 11-center, 4-country prospective trial of 2260 consecutive patients treated with ≥3.0-mm stents only, randomized to receive Cypher (Johnson & Johnson, Miami Lakes, FL), Vision (Abbott Vascular, Abbott Laboratories, IL), or Xience stents (Abbott Vascular). Only patients with left main or bypass graft disease, in-stent restenosis or stent thrombosis, in need of non-cardiac surgery, at increased bleeding risk, or without compliance/consent are excluded. All patients are treated with dual antiplatelet therapy for 12 months. The primary end point will be cardiac death/nonfatal myocardial infarction after 24 months, with further follow-up up to 5 years. RESULTS: By June 12, 229 patients (10% of the planned total) were included, with a baseline risk similar to that of the same subgroup of BASKET (n = 588). CONCLUSIONS: This study will answer several important questions of contemporary stent use in patients with large native vessel stenting. The 2-year death/myocardial infarction, target vessel revascularization, and bleeding rates in these patients with a first- versus second-generation drug-eluting stent should demonstrate the benefit or harm of these stents compared with cobalt-chromium bare-metal stents in this relevant, low-risk group of everyday patients. In addition, a comparison with similar BASKET patients will allow estimation of the impact of 12- versus 6-month dual antiplatelet therapy on these outcomes.
Abstract:
The Environmental Process and Simulation Center (EPSC) at Michigan Technological University has hosted laboratories for the senior-level Environmental Engineering class CEE 4509, Environmental Process and Simulation Laboratory, since 2004. Even though the five units in EPSC give students the opportunity to gain hands-on experience with a wide range of water/wastewater treatment technologies, a key module was still missing for students to experience a full treatment cycle. This project fabricated a direct-filtration pilot system in EPSC and generated a laboratory manual for educational purposes. Engineering applications such as clean bed head loss calculation, backwash flowrate determination, multimedia density calculation, and run length prediction are included in the laboratory manual. The system was tested for one semester and modifications have been made to both the direct filtration unit and the laboratory manual. Future work is also proposed to further refine the module.
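As an example of the clean bed head loss calculation mentioned above, the following Python sketch applies the laminar Kozeny-Carman relation to a single-medium bed; the equation form, the constant of 180, and the example bed parameters are illustrative assumptions and may differ from the correlations used in the laboratory manual.

```python
# Clean-bed head loss for a granular filter via the laminar Kozeny-Carman relation:
#   h_L = 180 * (mu / (rho * g)) * ((1 - eps)**2 / eps**3) * L * v / d**2

G = 9.81  # gravitational acceleration, m/s^2

def clean_bed_head_loss(v, L, d, eps, mu=1.0e-3, rho=998.0):
    """Head loss (m of water) through a clean granular bed.

    v   : filtration (superficial) velocity, m/s
    L   : bed depth, m
    d   : effective grain diameter, m
    eps : bed porosity (-)
    mu  : water dynamic viscosity, Pa*s (default ~20 degC)
    rho : water density, kg/m^3
    """
    return 180.0 * (mu / (rho * G)) * ((1 - eps) ** 2 / eps ** 3) * L * v / d ** 2

# Example: 0.6 m of 0.5 mm sand (porosity 0.42) filtered at 5 m/h
v = 5.0 / 3600.0  # convert m/h to m/s
print(round(clean_bed_head_loss(v, L=0.6, d=0.5e-3, eps=0.42), 3), "m")
```

For a multimedia bed, the same calculation would be repeated per layer and the layer head losses summed; run length prediction then compares accumulated head loss against the available driving head.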
Abstract:
Self-stabilization is a property of a distributed system whereby, regardless of the legitimacy of its current state, the system eventually reaches a legitimate state and remains legitimate thereafter. The elegance of self-stabilization stems from the fact that it distinguishes distributed systems by a strong fault-tolerance property against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been witnessed by many researchers; most of the existing techniques for the verification and design of self-stabilization are either brute-force or adopt manual approaches that are not amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics exploit both the computational power of a single workstation and the available parallelism on computer clusters. We obtain existing and new stabilizing solutions for classical protocols such as maximal matching, ring coloring, mutual exclusion, leader election, and agreement. Second, we consider a foundation for local reasoning about self-stabilization; i.e., we study the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions, verifiable in the local state space of every process, for global deadlock-freedom and livelock-freedom of protocols on ring topologies. Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, thereby enabling our assertions about deadlocks and livelocks to apply to rings of arbitrary size without worrying about state explosion.
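As a concrete example of the kind of ring protocol discussed above, the sketch below simulates Dijkstra's classic K-state self-stabilizing token ring in Python, a standard mutual-exclusion-on-a-ring protocol; it illustrates convergence from an arbitrary state and is not one of the dissertation's synthesized solutions or its local-reasoning conditions.

```python
import random

def privileged(x):
    """Enabled (token-holding) processes in Dijkstra's K-state ring."""
    n = len(x)
    p = [0] if x[0] == x[-1] else []          # process 0 is enabled when it equals its predecessor
    return p + [i for i in range(1, n) if x[i] != x[i - 1]]

def step(x, K):
    """One move under a central daemon: one enabled process fires."""
    i = random.choice(privileged(x))
    x[i] = (x[0] + 1) % K if i == 0 else x[i - 1]

# Start from an arbitrary (possibly corrupted) state; the number of tokens
# never increases and eventually drops to exactly one (mutual exclusion).
random.seed(1)
x = [3, 0, 2, 2, 1]                 # n = 5 processes, K = 6 >= n + 1 states
for t in range(31):
    if t % 5 == 0:
        print(f"step {t:2d}: tokens = {len(privileged(x))}, state = {x}")
    step(x, K=6)
```

The closure and convergence visible in this run (once a single token remains, it circulates forever) are exactly the two properties a proof of self-stabilization must establish.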