951 results for Simulation Modeling


Relevance: 60.00%

Abstract:

The aim of this work was to determine the suitability of commercial dynamics simulation software for analyzing the dynamics of a rock drill. A parameterized virtual prototype of a rock drill based on a new operating principle was modeled. The virtual prototype is intended to be used for dimensioning the physical prototype, and for simulating the drill's operation and evaluating its performance before the first physical prototype is manufactured. The modeling was done using the ADAMS software together with its ADAMS/Hydraulics module, and particular attention was paid to accounting for the leakage flows that occur in the drill. The ADAMS software is well suited to simulating the dynamic phenomena of a hydraulic percussive rock drill. Since no physical prototype exists yet, the model's behavior could not be verified by measurements within the scope of this study. Based on the simulated results, the new operating principle can be deemed viable for rock drilling. The parameterized virtual prototype can be used effectively during product development, and it can be integrated into a larger and more detailed simulation model of a complete drilling rig.
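
As a rough illustration of the kind of coupled mechanical-hydraulic lumped model such a virtual prototype builds on, a minimal sketch with simplified physics and assumed parameter values (not the ADAMS/Hydraulics model from the thesis) couples a pressure build-up equation with a laminar leakage term and a percussion piston:

```python
# Minimal sketch: percussion piston driven by chamber pressure, with a
# laminar leakage term. All parameter values are illustrative assumptions.
from scipy.integrate import solve_ivp

m = 5.0          # piston mass [kg] (assumed)
A = 1e-3         # piston area [m^2] (assumed)
V = 1e-3         # chamber volume [m^3] (assumed)
beta = 1.4e9     # oil bulk modulus [Pa]
q_in = 2e-3      # supply flow [m^3/s] (assumed)
k_leak = 1e-10   # laminar leakage coefficient [m^3/(s*Pa)] (assumed)

def rhs(t, y):
    x, v, p = y                               # position, velocity, pressure
    q_leak = k_leak * p                       # leakage flow grows with pressure
    dp = beta / V * (q_in - q_leak - A * v)   # pressure build-up equation
    dv = (p * A - 300.0 * v) / m              # force balance, viscous friction
    return [v, dv, dp]

def impact(t, y):                             # piston meets the bit at 0.05 m
    return y[0] - 0.05
impact.terminal = True

sol = solve_ivp(rhs, (0.0, 0.2), [0.0, 0.0, 1e5], method="LSODA",
                events=impact, max_step=1e-3)
print(f"impact at t = {sol.t[-1]*1e3:.1f} ms, "
      f"impact velocity = {sol.y[1, -1]:.2f} m/s")
```

The leakage term is the part the abstract singles out: increasing k_leak drains the chamber faster and visibly lowers the simulated impact velocity.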

Relevance: 60.00%

Abstract:

This master's thesis was commissioned by Wello Oy, a company focused on hydro and wind power solutions that is developing a wave energy converter concept. The thesis examined the phenomena involved in wave power and the mechanical concept of the wave energy device, and assessed their efficiency. Commercial simulation tools were used, namely the multibody dynamics simulation program MSC.ADAMS R3 and the general-purpose mathematics tool Matlab Simulink. The simulation model was used to evaluate the overall behavior of the device, while analytical models were used to investigate its operating principle, and simulation was used to study the mechanism of the floating device. Based on the results, theoretical maximum power limits were defined for the device, along with the constraints that affect its efficiency.
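
As an illustration of the kind of theoretical power limit referred to, the sketch below combines the energy flux of regular deep-water waves with the classical heave-mode capture-width limit; the sea state is an assumed value, and this is not Wello's model:

```python
# Back-of-the-envelope power limit for a heaving point absorber.
import math

rho, g = 1025.0, 9.81   # sea water density [kg/m^3], gravity [m/s^2]
H, T = 2.0, 8.0         # wave height [m] and period [s] (assumed sea state)

# Energy flux of a regular deep-water wave per metre of crest length.
P_per_m = rho * g**2 * H**2 * T / (32 * math.pi)      # [W/m]

# Deep-water wavelength, and the theoretical capture width lambda/(2*pi)
# for an axisymmetric absorber oscillating in heave (Budal/Falnes limit).
wavelength = g * T**2 / (2 * math.pi)
capture_width = wavelength / (2 * math.pi)

P_max = P_per_m * capture_width
print(f"wave energy flux : {P_per_m/1e3:.1f} kW/m")
print(f"theoretical max  : {P_max/1e3:.1f} kW for one heaving absorber")
```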

Relevance: 60.00%

Abstract:

Background: Lung cancer (LC) is the leading cause of cancer death in the developed world. Most lung cancers are associated with tobacco smoking, so a primary hope for reducing lung cancer has been prevention of smoking and successful smoking cessation programs. To date, these programs have not been as successful as anticipated.

Objective: The aim of the current study was to evaluate whether lung cancer screening combining low-dose computed tomography with autofluorescence bronchoscopy (combined CT & AFB) is superior to CT or AFB screening alone in improving lung cancer specific survival. In addition, the extent of improvement and the ideal conditions for combined CT & AFB screening were evaluated.

Methods: We applied decision analysis and Monte Carlo simulation modeling using TreeAge software to evaluate our study aims. Histology- and stage-specific probabilities of lung cancer 5-year survival were taken from Surveillance, Epidemiology, and End Results (SEER) Registry data. Screening-associated data were taken from the US NCI Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO), the National Lung Screening Trial (NLST), the US NCI Lung Screening Study (LSS), other relevant published data, and expert opinion.

Results: Decision analysis: combined CT and AFB was the best approach for improving 5-year survival, both in the entire screened population (Overall Expected Survival (OES) of 0.9863) and in lung cancer patients only (Lung Cancer Specific Expected Survival (LCSES) of 0.3256). Combined screening was slightly better than CT screening alone (OES = 0.9859; LCSES = 0.2966) and substantially better than AFB screening alone (OES = 0.9842; LCSES = 0.2124), which was in turn considerably better than no screening (OES = 0.9829; LCSES = 0.1445). Monte Carlo simulation modeling revealed that expected survival in the screened population and in lung cancer patients is highest when screening uses CT or combined CT and AFB; CT alone and combined screening were substantially better than AFB screening alone or no screening. For LCSES, combined CT and AFB screening is significantly better than CT alone (0.3126 vs. 0.2938, p < 0.0001).

Conclusions: Overall, these analyses suggest that combined CT and AFB is slightly better than CT alone at improving lung cancer survival, and both approaches are substantially better than AFB screening alone or no screening.
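
A minimal sketch of the decision-analytic comparison: each screening strategy shifts the stage distribution at diagnosis, and expected survival is estimated by Monte Carlo sampling. All probabilities below are illustrative placeholders, not the SEER/PLCO/NLST inputs used in the study:

```python
# Monte Carlo comparison of screening strategies (placeholder numbers).
import random

P_LC = 0.02                          # probability of lung cancer (assumed)
# P(early stage | strategy) and 5-year survival by stage (assumed values).
P_EARLY = {"none": 0.25, "AFB": 0.40, "CT": 0.60, "CT+AFB": 0.65}
SURV = {"early": 0.55, "late": 0.08}

def simulate(strategy, n=200_000, rng=random.Random(42)):
    survived, lc_survived, lc_cases = 0, 0, 0
    for _ in range(n):
        if rng.random() >= P_LC:          # no lung cancer: count as survivor
            survived += 1
            continue
        lc_cases += 1
        stage = "early" if rng.random() < P_EARLY[strategy] else "late"
        if rng.random() < SURV[stage]:
            survived += 1
            lc_survived += 1
    return survived / n, lc_survived / lc_cases   # OES, LCSES

for s in ("none", "AFB", "CT", "CT+AFB"):
    oes, lcses = simulate(s)
    print(f"{s:7s}  OES = {oes:.4f}  LCSES = {lcses:.4f}")
```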

Relevance: 60.00%

Abstract:

This study has two main objectives. First, the phlebotomy process at the St. Catharines Site of the Niagara Health System is investigated; the process starts when an order for a blood test is placed and ends when the specimen arrives at the lab. The performance measure is the flow time of the process, which reflects the concerns and interests of both the hospital and the patients. Three popular operational methodologies are applied to reduce the flow time and improve the process: DMAIC from Six Sigma, lean principles, and simulation modeling. Potential improvements are suggested for the St. Catharines Site that could reduce the average flow time by seven minutes. The second objective addresses the fact that these three methodologies had not previously been combined in a process improvement effort: a structured framework combining them is developed to benefit future studies of phlebotomy and other hospital processes.
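
As a sketch of the simulation-modeling leg of that toolkit, flow time through a single-server step can be estimated with the Lindley recursion; the arrival, draw, and transport times below are assumed placeholders, not the hospital's measured data:

```python
# Flow time = waiting time + work content, via the Lindley recursion
# wait_{k+1} = max(0, wait_k + service_k - interarrival_{k+1}).
import random

def flow_times(n=10_000, rng=random.Random(1)):
    wait, out = 0.0, []
    for _ in range(n):
        service = rng.expovariate(1 / 6.0)       # mean 6 min draw+label (assumed)
        transport = rng.expovariate(1 / 10.0)    # mean 10 min to lab (assumed)
        out.append(wait + service + transport)   # this order's flow time
        interarrival = rng.expovariate(1 / 8.0)  # one order every 8 min (assumed)
        wait = max(0.0, wait + service - interarrival)
    return out

ft = flow_times()
print(f"mean flow time = {sum(ft)/len(ft):.1f} min")
```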

Relevance: 60.00%

Abstract:

This work was carried out in collaboration with the mechanical acoustics laboratory (laboratoire de mécanique acoustique) in Marseille, France. The simulations were implemented in the Matlab and C languages. The project belongs to the research field known as ultrasonic tissue characterization.

Relevance: 60.00%

Abstract:

The increasing interconnection of information and communication systems leads to ever greater complexity and thus to a further increase in security vulnerabilities. Classical protection mechanisms such as firewalls and anti-malware solutions have long ceased to provide adequate protection against intrusions into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behavior and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in optimally processing the enormous volume of network data and in developing an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, assembles network connections on the fly, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of normal network behavior (NNB), and an update model. Within OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events; the aggregated packets and events are then analyzed and transformed into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is investigated in depth and substantially extended. Several approaches are proposed and discussed: a classification-confidence margin threshold is defined to uncover unknown malicious connections; the stability of the growth topology is increased through novel approaches for initializing the weight vectors and reinforcing the winner neurons; and a self-adaptive procedure is introduced to keep the model continuously up to date. In addition, the main task of the NNB model is to further examine the unknown connections detected by the EGHSOM and to verify whether they are normal. However, network traffic changes constantly because of the concept-drift phenomenon, which produces non-stationary network data in real time; this phenomenon is handled by the update model. The EGHSOM model detects new anomalies effectively, and the NNB model adapts well to changes in the network data. In the experimental evaluation, the framework showed promising results. In the first experiment, the framework was evaluated in offline mode: OptiFilter was assessed with offline, synthetic, and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy.
In the second experiment, the framework was installed on a 1-10 Gb network link and evaluated online in real time. OptiFilter successfully converted the enormous amount of network data into structured connection vectors, and the adaptive classifier classified them precisely. A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all of them. This can be attributed to the following key points: the processing of the collected network data, the best overall performance (e.g., overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
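
The classifier family at the core of the framework is the self-organizing map. The sketch below shows the underlying anomaly detection idea on a plain fixed-size SOM with synthetic connection vectors and a simple quantization-error threshold; the dissertation's EGHSOM additionally grows the map hierarchically, uses a classification-confidence margin, and adapts online:

```python
# Anomaly detection with a plain SOM: train on "normal" vectors, then flag
# inputs whose quantization error exceeds a threshold (synthetic data).
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(6, 6), epochs=20):
    n_units = grid[0] * grid[1]
    w = data[rng.choice(len(data), n_units)]           # init from samples
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])])
    for e in range(epochs):
        lr = 0.5 * (1 - e / epochs)                    # decaying learning rate
        sigma = max(grid) / 2 * (1 - e / epochs) + 0.5 # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))     # winner neuron
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)  # grid distances
            h = np.exp(-d2 / (2 * sigma ** 2))              # neighborhood kernel
            w += lr * h[:, None] * (x - w)
    return w

normal = rng.normal(0, 1, size=(500, 4))               # "normal" traffic vectors
som = train_som(normal)
q_err = lambda X: np.sqrt(((X[:, None, :] - som) ** 2).sum(-1)).min(1)
threshold = np.percentile(q_err(normal), 99)           # margin from training data
test = np.vstack([rng.normal(0, 1, (5, 4)), rng.normal(6, 1, (5, 4))])
print("anomalous:", q_err(test) > threshold)
```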

Relevance: 60.00%

Abstract:

In a hyperconnected, dynamic world laden with uncertainty such as today's, conventional analytical methods and models are showing their limitations. Organizations therefore require useful tools that employ information technology and computational simulation models as mechanisms for decision-making and problem solving. One of the most recent, powerful, and promising of these is agent-based modeling and simulation (ABMS). Many organizations, including consulting firms, use this technique to understand phenomena, evaluate strategies, and solve problems of various kinds. Nevertheless, to the best of our knowledge there is no state-of-the-art survey of ABMS and its application to organizational research. It should also be noted that, owing to its novelty, the topic has not yet been widely disseminated and developed in Latin America. This project therefore aims to produce a state-of-the-art review of ABMS and its impact on organizational research.

Relevance: 60.00%

Abstract:

The problem of projecting multidimensional data into lower dimensions has been pursued by many researchers due to its potential application to data analyses of various kinds. This paper presents a novel multidimensional projection technique based on least squares approximations, which compute the coordinates of a set of projected points from the coordinates of a reduced number of control points with defined geometry. We name the technique Least Square Projections (LSP). From an initial projection of the control points, LSP positions their neighboring points through a numerical solution that aims at preserving the similarity relationship between the points given by a metric in mD. To perform the projection, only a small number of distance calculations are necessary, and no repositioning of the points is required to obtain a final solution with satisfactory precision. The results show the capability of the technique to form groups of points by degree of similarity in 2D. We illustrate that capability by applying it to mapping collections of textual documents from varied sources, a strategic yet difficult application. LSP is faster and more accurate than other existing high-quality methods, particularly in the setting where it was most extensively tested, that is, mapping text sets.
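
A simplified sketch of the least squares idea: fix a few control points in 2D and place every other point near the average of its mD neighbors' 2D positions by solving one linear system. The control-point selection and their initial projection below are crude stand-ins for steps the paper does properly:

```python
# Simplified LSP-style projection: neighborhood (Laplacian) rows plus
# rows pinning the control points, solved by linear least squares.
import numpy as np

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(c, 0.3, (40, 10)) for c in (0, 2, 4)])  # 3 clusters, 10D
n, k, n_ctrl = len(X), 8, 12

# k nearest neighbors in the original (mD) space define the similarity graph.
D = np.linalg.norm(X[:, None] - X[None], axis=-1)
nbrs = np.argsort(D, axis=1)[:, 1:k + 1]

ctrl = rng.choice(n, n_ctrl, replace=False)
ctrl_pos = X[ctrl][:, :2]        # crude initial 2D placement (stand-in)

# Rows "p_i - mean of neighbors = 0", plus rows that pin the controls.
A = np.zeros((n + n_ctrl, n))
for i in range(n):
    A[i, i] = 1.0
    A[i, nbrs[i]] = -1.0 / k
b = np.zeros((n + n_ctrl, 2))
for r, c in enumerate(ctrl):
    A[n + r, c] = 1.0
    b[n + r] = ctrl_pos[r]

P, *_ = np.linalg.lstsq(A, b, rcond=None)   # 2D coordinates for all points
print(P.shape)                               # (120, 2)
```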

Relevance: 60.00%

Abstract:

During the last decade, Internet usage has grown at an enormous rate, accompanied by the development of network applications (e.g., video conferencing, audio/video streaming, e-learning, e-commerce, and real-time applications) carrying several types of information, including data, voice, pictures, and streaming media. While end users demand very high quality of service (QoS) from their service providers, the network carries complex traffic that leads to transmission bottlenecks. Considerable effort has been made to study the characteristics and behavior of the Internet. Simulation modeling of computer network congestion is a profitable and effective technique that fulfills the requirements for evaluating the performance and QoS of networks. To simulate a single congested link, the simulation is run with a single load generator, whereas a larger simulation with complex traffic, where the nodes are spread across different geographical locations, makes generating distributed artificial loads indispensable. One solution is to build a load generation system based on a master/slave architecture.
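
As a minimal sketch of the single-congested-link case mentioned above (not the master/slave system itself): Poisson packet arrivals into a finite buffer drained at a fixed link rate, with rates and buffer size assumed so as to create congestion:

```python
# Single bottleneck link: tail-drop buffer, fixed service rate.
import random

def simulate_link(rate_in=950.0, rate_out=1000.0, buffer_pkts=100,
                  n_pkts=100_000, rng=random.Random(7)):
    t = 0.0                 # current time [s]
    free_at = 0.0           # when the link finishes its current backlog
    queue_delay_sum, dropped = 0.0, 0
    for _ in range(n_pkts):
        t += rng.expovariate(rate_in)              # next packet arrival
        backlog = max(0.0, free_at - t)            # queued work, in seconds
        if backlog * rate_out >= buffer_pkts:      # buffer full -> tail drop
            dropped += 1
            continue
        queue_delay_sum += backlog
        free_at = max(free_at, t) + 1.0 / rate_out # serve this packet
    accepted = n_pkts - dropped
    print(f"loss = {dropped / n_pkts:.2%}, "
          f"mean queueing delay = {queue_delay_sum / accepted * 1e3:.2f} ms")

simulate_link()
```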

Relevance: 60.00%

Abstract:

The main idea of this research is to solve an inventory management problem for the paper industry SPM PVT Limited. The aim was to find a methodology by which the inventory of raw material could be kept at a minimum level by means of a buffer stock level. The main objective then lies in finding the minimum buffer stock level according to the daily consumption of raw material, finding the Economic Order Quantity (EOQ) and reorder point, and determining how many orders should be placed per year to control raw material shortages. In this project we discuss a continuous-review model (a deterministic EOQ model) that includes probabilistic demand directly in the formulation; from the formulas we obtain the reorder point and the order-up-to level. The problem was tackled mathematically, and simulation modeling was used where a mathematically tractable solution was not possible. The simulation network was developed with the Awesim software; it is able to predict the buffer stock level based on variable raw material consumption and lead time. The data for the simulation were obtained from industrial engineering personnel and departmental studies at the factory concerned. In the end, we find the optimal order quantity, the reorder point, and the ordering schedule.
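
A worked example of the continuous-review quantities involved, using the classical formulas EOQ = sqrt(2DS/H) and reorder point = demand over lead time plus safety stock. The input values are illustrative, not SPM's data:

```python
# Classical continuous-review (EOQ) calculations with illustrative inputs.
import math

D = 12_000       # annual demand [tonnes/year] (assumed)
S = 500.0        # fixed cost per order (assumed)
H = 20.0         # holding cost per tonne per year (assumed)
d, sigma_d = D / 365, 8.0   # daily demand: mean and std dev (assumed)
L, z = 5, 1.645  # lead time [days]; z-score for ~95% service level

EOQ = math.sqrt(2 * D * S / H)             # economic order quantity
safety_stock = z * sigma_d * math.sqrt(L)  # buffer against demand variability
ROP = d * L + safety_stock                 # reorder point
orders_per_year = D / EOQ

print(f"EOQ = {EOQ:.0f} t, reorder point = {ROP:.0f} t, "
      f"safety stock = {safety_stock:.0f} t, "
      f"orders/year = {orders_per_year:.1f}")
```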

Relevance: 60.00%

Abstract:

Drinking water distribution networks risk exposure to malicious or accidental contamination. Several levels of response are conceivable; one of them consists of installing a sensor network to monitor the system in real time. Once contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modeling to predict both hydraulics and water quality. Using an online model makes it possible to identify the contaminant source and to simulate the contaminated area. The objective of this paper is to present SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least squares problem with bound constraints is formulated to adjust demand class coefficients to best fit the observed values at a given time. The criterion is a Huber function, to limit the influence of outliers, and a Tikhonov regularization is introduced to take prior information on the parameter vector into account. The Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is then applied. Confidence intervals for the state prediction are also given. The results are presented and discussed for real networks in France and Germany.
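
A sketch of the estimation scheme under stated assumptions: a generic linear stand-in for the hydraulic model, a Huber loss for outliers, and the Tikhonov prior appended as extra residuals. SciPy's trust-region solver is used here because its Levenberg-Marquardt mode does not combine with robust losses:

```python
# Robust, regularized, bound-constrained least squares (illustrative model).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
theta_true = np.array([1.2, 0.8, 1.5])     # "demand class coefficients"
H = rng.uniform(0.5, 2.0, (30, 3))         # stand-in sensitivity matrix
y = H @ theta_true + rng.normal(0, 0.02, 30)
y[::10] += 1.0                             # inject a few outliers

theta_prior, lam = np.ones(3), 0.1         # prior and its weight

def residuals(theta):
    # Data misfit plus Tikhonov residuals sqrt(lam) * (theta - prior);
    # stacking them lets one solver handle both terms.
    return np.concatenate([H @ theta - y,
                           np.sqrt(lam) * (theta - theta_prior)])

fit = least_squares(residuals, theta_prior, loss="huber", f_scale=0.05,
                    bounds=(0.0, np.inf))  # Huber bounds outlier influence
print("estimate:", np.round(fit.x, 3), " true:", theta_true)
```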

Relevance: 60.00%

Abstract:

This work presents the development of a model and computer simulation of a sucker rod pumping system. The model takes into account the well geometry, the flow through the tubing, the dynamic behavior of the rod string, and an induction motor model. The rod string was modeled using concentrated (lumped) parameters, allowing systems of ordinary differential equations to be used to simulate its behavior.
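
A minimal sketch of what a concentrated-parameter rod string looks like in code: n lumped masses joined by springs, driven at the top by an assumed sinusoidal polished-rod motion. The values are illustrative, and the work's model additionally couples the motor and fluid dynamics:

```python
# Lumped-parameter rod string as a driven mass-spring chain.
import numpy as np
from scipy.integrate import solve_ivp

n = 10                      # number of lumped rod segments
m, k, c = 50.0, 2e5, 200.0  # segment mass [kg], stiffness [N/m], damping (assumed)

def surface(t):             # prescribed polished-rod motion (assumed sinusoid)
    return 0.5 * np.sin(2 * np.pi * 0.2 * t)

def rhs(t, y):
    x, v = y[:n], y[n:]
    xs = np.concatenate(([surface(t)], x))   # node 0 is the driven top node
    spring = k * np.diff(xs)                 # tension in each connector
    force = -spring.copy()                   # pull from the segment above
    force[:-1] += spring[1:]                 # pull from the segment below
    a = (force - c * v) / m                  # viscous damping on each mass
    return np.concatenate((v, a))

sol = solve_ivp(rhs, (0, 30), np.zeros(2 * n), max_step=0.01)
print("bottom-node displacement at t = 30 s:", round(sol.y[n - 1, -1], 4))
```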

Relevance: 60.00%

Abstract:

A dynamic systems water resources simulation model was developed as a tool to help analyze water resources management alternatives for the Piracicaba, Capivari and Jundiaí River Water Basins (BH-PCJ). Different policies were simulated over a 40-year horizon. The model estimates water supply and demand, as well as the contamination load from several consumers. Six runs were performed using the average precipitation value, changing water supply and demand and the volume diverted from BH-PCJ to BH-Alto Tietê. For the business-as-usual run, the Sustainability Index went from 0.41 in 2010 to 0.22 by 2050, the Water Use Index changed from 80.7% in 2010 to 125.5% by 2050, and the Falkenmark Index changed from 1,302 m³ person⁻¹ year⁻¹ in 2010 to 774 m³ person⁻¹ year⁻¹ by 2050. Sanitation was found to be one of the biggest concerns for the near future in the PCJ River Basins.
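
For reference, the indicators reported above are straightforward to compute. A worked example with illustrative figures chosen to roughly reproduce the 2010 values (these are not the actual BH-PCJ data):

```python
# Falkenmark index and Water Use Index with illustrative inputs.
supply_hm3 = 4_000            # renewable supply [hm^3/year] (assumed)
population = 3_072_000        # inhabitants (assumed)
demand_hm3 = 3_200            # total withdrawals [hm^3/year] (assumed)

falkenmark = supply_hm3 * 1e6 / population       # m^3 per person per year
water_use_index = demand_hm3 / supply_hm3 * 100  # withdrawals as % of supply

print(f"Falkenmark index = {falkenmark:.0f} m3/person/year")
print(f"Water Use Index  = {water_use_index:.1f} %")
# Below 1,700 m3/person/year is commonly read as water stress,
# below 1,000 as water scarcity.
```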

Relevance: 60.00%

Abstract:

BACKGROUND: Assessment of lung volume (functional residual capacity, FRC) and ventilation inhomogeneities with an ultrasonic flowmeter and multiple breath washout (MBW) has been used to provide important information about lung disease in infants. Sub-optimal adjustment of the mainstream molar mass (MM) signal for temperature and external deadspace may lead to analysis errors in infants, whose tidal volume changes during breathing are critically small.

METHODS: We measured expiratory temperature in human infants at 5 weeks of age and examined the influence of temperature and deadspace changes on FRC results with computer simulation modeling. A new analysis method with optimized temperature and deadspace settings was then derived, tested for robustness to analysis errors, and compared with the previously used analysis methods.

RESULTS: Temperature in the facemask was higher, and variations in deadspace volumes larger, than previously assumed. Both had a considerable impact on FRC and LCI results, with high variability when obtained with the previously used analysis model. Using the measured temperature, we optimized the model parameters and tested a newly derived analysis method, which was found to be more robust to variations in deadspace. Comparison between the two analysis methods showed systematic differences and a wide scatter.

CONCLUSION: Corrected deadspace and more realistic temperature assumptions improved the stability of the analysis of MM measurements obtained by ultrasonic flowmeter in infants. This new analysis method, using the only currently available commercial ultrasonic flowmeter for infants, may help to improve the stability of the analysis and further facilitate assessment of lung volume and ventilation inhomogeneities in infants.
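
For orientation, the washout bookkeeping such an analysis ultimately rests on can be sketched as below: FRC from a tracer mass balance, and LCI as the cumulative expired volume (in FRC turnovers) needed to reach 1/40 of the starting tracer concentration. The breath data are synthetic and per-breath mean concentrations are a simplification; this is a generic illustration, not the commercial device's analysis pipeline:

```python
# Generic multiple-breath washout indices from synthetic breath data.
def mbw_indices(breaths, c_start, c_target_frac=1 / 40):
    """breaths: list of (expired_volume_L, mean_tracer_conc) per breath."""
    cev = tracer_out = 0.0
    for vol, conc in breaths:
        cev += vol                     # cumulative expired volume
        tracer_out += vol * conc       # tracer volume leaving the lung
        if conc <= c_start * c_target_frac:
            frc = tracer_out / (c_start - conc)   # mass balance for FRC
            return frc, cev / frc                 # FRC [L], LCI [turnovers]
    raise ValueError("washout did not reach the target concentration")

# Synthetic washout: 20 mL breaths, tracer decaying ~12% per breath.
breaths = [(0.020, 0.04 * 0.88 ** i) for i in range(60)]
frc, lci = mbw_indices(breaths, c_start=0.04)
print(f"FRC = {frc*1000:.0f} mL, LCI = {lci:.1f}")
```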