854 results for software performance evaluation


Relevance:

100.00%

Publisher:

Abstract:

The concept of "sustainability" refers to the development of human systems with the smallest possible impact on the environmental system. Works that fit well into the surrounding environmental context, and practices that respect resources in a way that allows long-term growth and development without impacting the environment, are indispensable in a modern society. The past, present and future advances that have made asphalt an environmentally sustainable material are particularly important given the large quantity of asphalt mix used annually in Europe and the United States. Bitumen and asphalt mix producers are developing innovative techniques to reduce the environmental impact without compromising the final mechanical performance. A warm mix asphalt (WMA), while developing the same mechanical characteristics, requires a lower production temperature than a traditional hot mix asphalt (HMA). Lowering the production temperature reduces harmful emissions; this improves conditions for workers and is oriented towards sustainable development. The main objective of this thesis is to demonstrate the dual value of these warm mix asphalts, both in terms of eco-compatibility and in terms of mechanical performance. In particular, this thesis studies a warm mix SMA (PGGWMA). The use of low-environmental-impact materials is the first phase towards an eco-compatible design, but it can only be the starting point. The eco-compatible approach must also be extended to the design methods and to the laboratory characterization of the materials, because only in this way is it possible to obtain the maximum potential from the materials used.
An appropriate characterization of the asphalt mix is fundamental and necessary for a realistic performance prediction of a road pavement. The volumetric (mix design) and mechanical (permanent deformation and fatigue behaviour) characterization of an asphalt mix is an important phase. Moreover, in order to use the materials correctly, an advanced and efficient design method, such as a Mechanistic-Empirical (ME) approach, must be used. A Mechanistic-Empirical design procedure consists of a structural model capable of predicting the states of stress and strain within the pavement under traffic loading and as a function of atmospheric conditions, and of empirical models, calibrated on the behaviour of the materials, that link the structural response to the performance of the pavement. In 1996 in California, in order to actually exploit the benefits of the continuous advances in the field of road pavements, an extensive research project was started, aimed at developing Mechanistic-Empirical design methods for road pavements. The final result was the first version of the CalME software, which provides the user with three different analysis and design approaches: an Empirical approach, a classical Mechanistic-Empirical approach and an Incremental-Recursive Mechanistic-Empirical approach. This thesis focuses on the Incremental-Recursive procedure of the CalME software, based on damage models for fatigue and for the accumulation of shear strain, on which surface cracking and permanent deformation in the pavement respectively depend.
This procedure works in successive time increments and, using the results of each time increment recursively as input to the next one, predicts the condition of a road pavement in terms of the complex modulus of the different layers, fatigue-related surface cracking, permanent deformation and surface roughness. The Incremental-Recursive procedure of the CalME software was used to verify the mechanical properties of the PGGWMA and the mutual relationships, in terms of fatigue damage and permanent deformation, between the surface layer and the pavement structure under fixed environmental and traffic conditions. The asphalt mix studied (PGGWMA) was used in a road pavement as a surface layer 60 mm thick. The performance of the pavement was compared to that of the same pavement in which other types of asphalt mix were used as the surface layer. The three asphalt mixes used for comparison were: an unmodified dense-graded warm mix asphalt (DGWMA), a gap-graded crumb-rubber-modified asphalt mix (GGRAC) and an unmodified dense-graded asphalt mix (DGAC). Chapter I introduces the problem of the eco-compatible design of road pavements. Low-environmental-impact materials such as warm mix asphalts and crumb-rubber-modified asphalt mixes are described in detail. The importance of the laboratory characterization of the materials and the value of a rational pavement design method are also discussed.
Chapter II describes the different design approaches available in CalME and, in particular, explains the Incremental-Recursive procedure. Chapter III studies the volumetric and mechanical properties of the PGGWMA. Fatigue and permanent deformation tests, carried out with the four-point bending beam fatigue machine and the Simple Shear Test device respectively, were performed on asphalt mix specimens and the test results are summarized. From these laboratory data, the parameters of the Master Curve, fatigue damage and shear strain accumulation models used in the Incremental-Recursive procedure of CalME were evaluated. Finally, Chapter IV presents the results of simulations of road pavements with different surface layers. For each pavement, the total surface cracking, the total permanent deformation, the fatigue damage and the rut depth in each of the bound layers were analyzed.

Abstract:

As computational models in fields such as medicine and engineering become more refined, their resource requirements increase. Initially, these needs have been met using parallel computing on HPC clusters. However, such systems are often costly and lack flexibility, so HPC users are tempted to move to elastic HPC using cloud services. One difficulty in making this transition is that HPC and cloud systems differ, and performance may vary. The purpose of this study is to evaluate cloud services as a means of minimising both cost and computation time for large-scale simulations, and to identify which system properties have the most significant impact on performance. Our simulation results show that, while virtual CPU (VCPU) performance is satisfactory, network throughput may lead to difficulties.
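The cost/time trade-off such a study evaluates can be illustrated with a toy fixed-work model. The function names, the Amdahl-style split into serial and parallel parts, and the linear communication term are illustrative assumptions, not the study's actual model.

```python
def run_time(serial_s, parallel_s, n_nodes, bytes_exchanged, net_bytes_per_s):
    """Estimated wall time: serial part + perfectly parallel part + communication cost
    that grows with the number of nodes (a deliberately crude assumption)."""
    compute = serial_s + parallel_s / n_nodes
    comm = bytes_exchanged * (n_nodes - 1) / net_bytes_per_s
    return compute + comm

def run_cost(serial_s, parallel_s, n_nodes, bytes_exchanged, net_bytes_per_s, usd_per_node_hour):
    """Cloud cost: wall time in hours times node count times hourly node price."""
    t = run_time(serial_s, parallel_s, n_nodes, bytes_exchanged, net_bytes_per_s)
    return t / 3600.0 * n_nodes * usd_per_node_hour
```

With a slow network, sixteen nodes finish sooner than one but cost more, which is exactly the trade-off between cost and computation time the abstract describes.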

Abstract:

Numerical solution of realistic 2-D and 3-D inverse problems may require a very large amount of computation. A two-level parallelism concept is often used to solve such problems. The primary level uses problem partitioning, a decomposition based on the mathematical/physical problem; the secondary level uses the widely adopted data partitioning concept. A theoretical performance model is built on this two-level parallelism. Observed performance results obtained on a network of general-purpose Sun SPARCstations are compared with the theoretical values, and the restrictions of the theoretical model are discussed.
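A minimal sketch of a two-level performance model, under the simplifying assumption that the outer level distributes subproblems and the inner level distributes data blocks, with a fixed synchronization cost per outer round. The parameters are hypothetical, not those of the paper's model.

```python
import math

def two_level_time(n_sub, n_blocks, t_block, p_outer, p_inner, t_sync):
    """Wall time when p_outer workers share n_sub subproblems (problem partitioning)
    and p_inner workers share each subproblem's n_blocks data blocks (data partitioning)."""
    outer_rounds = math.ceil(n_sub / p_outer)
    inner_rounds = math.ceil(n_blocks / p_inner)
    return outer_rounds * (inner_rounds * t_block + t_sync)

def speedup(n_sub, n_blocks, t_block, p_outer, p_inner, t_sync):
    """Serial time over two-level parallel time."""
    serial = n_sub * n_blocks * t_block
    return serial / two_level_time(n_sub, n_blocks, t_block, p_outer, p_inner, t_sync)
```

With zero synchronization cost the model predicts ideal speedup p_outer * p_inner; a positive t_sync reproduces the kind of deviation between theory and observation the paper discusses.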

Abstract:

In many businesses, including the hydrocarbon industry, reducing cost is a high priority. Although hydrocarbon companies can afford the expensive computing infrastructure and software packages used to process seismic data in the search for hydrocarbon traps, it remains imperative to find ways to minimise cost. Seismic processing costs can be significantly reduced by using inexpensive, open source seismic data processing packages. However, the industry questions the processing capability of open source packages, arguing that their seismic functions are less well integrated and come with almost no technical guarantees. The objective of this paper is to demonstrate, through a comparative analysis, that open source seismic data processing packages are capable of executing the required seismic functions on an actual industrial workload. To achieve this objective we investigate whether open source seismic data processing packages can process the same set of seismic data through data format conversions, and whether they can achieve reasonable performance and speedup when executing parallel seismic functions on an HPC cluster. Among the few open source packages available on the Internet, the subjects of our study are two popular ones: Seismic UNIX (SU) and Madagascar.

Abstract:

The "sustainability" concept relates to the prolonging of human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship, and practices that conserve resources in a manner that allows growth and development to be sustained for the long term without degrading the environment, are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe and in the U.S. are large. The asphalt industry is still developing technological improvements that reduce environmental impact without affecting final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures than hot mix asphalt (HMA), while aiming to maintain the desired post-construction properties of traditional HMA. Lowering the production temperature reduces fuel usage and emissions, which improves conditions for workers and supports sustainable development. The crumb-rubber modifier (CRM), made from shredded automobile tires and used in the United States since the mid-1980s, has likewise proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is relevant not only from an environmental standpoint but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project aims to demonstrate the dual value of these asphalt mixes with regard to environmental and mechanical performance, and to suggest a low-environmental-impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design, but it cannot be the only step.
The eco-compatible approach should also be extended to the design method and to material characterization, because only through these phases is it possible to exploit the full potential of the materials used. Appropriate asphalt concrete characterization is essential for realistic performance prediction of asphalt concrete pavements. Volumetric (mix design) and mechanical (permanent deformation and fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to use the material correctly. A Mechanistic-Empirical approach, consisting of a structural model capable of predicting the states of stress and strain within the pavement structure under different traffic and environmental conditions, was the design method of choice. In particular, this study focuses on CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain, related to surface cracking and rutting respectively. The procedure works in increments of time and, using the output from one increment recursively as input to the next, predicts the pavement condition in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between the surface layer and the pavement structure, in terms of fatigue and permanent deformation, under defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as a surface layer of 60 mm thickness. The performance of this pavement was compared to that of the same pavement structure with different kinds of asphalt concrete as the surface layer. Three eco-friendly materials, two warm mix asphalts and a rubberized asphalt concrete, were analyzed in comparison to a conventional asphalt concrete.
The first two chapters summarize the steps needed for a sustainable pavement design procedure. Chapter I introduces the problem of eco-compatible asphalt pavement design; low-environmental-impact materials such as warm mix asphalt and rubberized asphalt concrete are described in detail, and the value of a rational asphalt pavement design method is discussed. Chapter II underlines the importance of a thorough laboratory characterization based on appropriate materials selection and performance evaluation. In Chapter III, CalME is introduced through an explanation of its different design approaches, with specific attention to the I-R procedure. In Chapter IV, the experimental program is presented together with an explanation of the laboratory test devices adopted. The fatigue and rutting performance of the study mixes are shown in Chapters V and VI respectively. From these laboratory test data, the CalME I-R model parameters for the Master Curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of simulations of asphalt pavement structures with different surface layers are reported. For each pavement structure, the total surface cracking, total rutting, fatigue damage and rut depth in each bound layer were analyzed.
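The incremental-recursive idea, feeding the output of one time increment recursively into the next, can be sketched as a toy loop over a single softening layer. The damage law below is purely illustrative and not CalME's actual model; all parameter names are hypothetical.

```python
def incremental_recursive(n_increments, e_initial, e_min, damage_per_pass, passes_per_increment):
    """Toy incremental-recursive loop: each increment's output (modulus, damage)
    becomes the next increment's input, so deterioration compounds over time."""
    modulus, damage, history = e_initial, 0.0, []
    for _ in range(n_increments):
        # illustrative law: damage grows faster as the layer softens
        step = damage_per_pass * passes_per_increment * (e_initial / modulus)
        damage = min(1.0, damage + step)
        # damaged modulus interpolates between the intact and fully damaged values
        modulus = e_min + (e_initial - e_min) * (1.0 - damage)
        history.append((round(modulus, 1), round(damage, 4)))
    return history
```

The returned history shows the modulus decreasing and damage accumulating monotonically, mirroring how CalME tracks layer moduli, cracking and rutting increment by increment.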

Abstract:

This contribution presents discrete-time analytical methods for analyzing and evaluating information and material flows in logistic systems. Existing discrete-time approaches, however, are limited to processing and forwarding items in always identical quantities ("one piece flow"). Especially in material flow systems, batches are formed as a result of order consolidation, transports and sorting operations. Analytical methods were therefore developed that make it possible to model various collection processes, batch arrivals at resources, batch processing and the sorting of batches analytically, and to determine performance measures for their evaluation. The software solution "Logistic Analyzer", developed in the course of this work, enables simple modelling and analysis of practical problems. The contribution closes with a numerical example.
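The batch-forming processes discussed here can be illustrated with a small sketch that computes, for each item, the time it waits until its batch is complete. This is a generic illustration of batch building, not the Logistic Analyzer's analytical method.

```python
def batch_forming_waits(arrival_times, batch_size):
    """For each item, the time it waits from its own arrival until its batch
    departs; a batch departs when its batch_size-th item has arrived.
    Items that do not fill a complete final batch are ignored."""
    usable = len(arrival_times) - len(arrival_times) % batch_size
    waits = []
    for start in range(0, usable, batch_size):
        batch = arrival_times[start:start + batch_size]
        departure = batch[-1]  # completion instant of the batch
        waits.extend(departure - t for t in batch)
    return waits
```

For evenly spaced arrivals, the first item of each batch waits the longest and the last waits not at all, a simple performance measure of the kind such discrete-time models deliver.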

Abstract:

This paper describes a study and analysis of surface normal-based descriptors for 3D object recognition. Specifically, we evaluate the behaviour of the descriptors in the recognition process using virtual models of objects created with CAD software. We then test them in real scenes using synthetic objects created with a 3D printer from the virtual models. In both cases, the same virtual models are used in the matching process to find similarity; the difference between the two experiments lies in the type of views used in the tests. Our analysis evaluates three aspects: the effectiveness of the 3D descriptors depending on the camera viewpoint and the geometric complexity of the model, the runtime of the recognition process, and the success rate in recognizing a view of an object among the models stored in the database.
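The matching step, finding the most similar stored model for a given view, can be sketched as a cosine-similarity nearest-neighbour search over descriptor vectors. The vectors and model names below are hypothetical; real surface-normal descriptors and their distance measures are more involved.

```python
import math

def cosine(a, b):
    """Cosine similarity between two descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recognize(view_descriptor, model_db):
    """Return the name of the stored model whose descriptor is most
    similar to the descriptor computed from the observed view."""
    return max(model_db, key=lambda name: cosine(view_descriptor, model_db[name]))
```

Success rate in the paper's sense would then be the fraction of test views for which `recognize` returns the correct model.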

Abstract:

In this paper, a novel approach is developed to evaluate the overall performance of a local area network and to monitor possible intrusions. The data are obtained via the system utility 'ping', and the large volume of data is analyzed using statistical methods. Finally, an overall performance index is defined, and simulation experiments over three months demonstrated the effectiveness of the proposed index. A software package was developed based on these ideas.
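A minimal sketch of deriving a performance index from 'ping' data. The regular expression matches the common Unix ping output format, but the 0.7/0.3 weighting of delivery ratio versus round-trip-time headroom is an invented placeholder, not the paper's index.

```python
import re
import statistics

def parse_ping_rtts(ping_output):
    """Extract round-trip times in milliseconds from ping output text."""
    return [float(m) for m in re.findall(r"time=([\d.]+)", ping_output)]

def performance_index(rtts, packets_sent, rtt_budget_ms=50.0):
    """Score in [0, 1] combining packet delivery ratio and mean-RTT headroom
    against a budget; the weights are illustrative assumptions."""
    if not rtts:
        return 0.0
    delivery = len(rtts) / packets_sent
    rtt_score = max(0.0, 1.0 - statistics.mean(rtts) / rtt_budget_ms)
    return 0.7 * delivery + 0.3 * rtt_score
```

Sudden drops in such an index (lost replies, inflated RTTs) are also the kind of anomaly the paper monitors as a possible intrusion signal.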

Abstract:

Approximately half of the houses in Northern Ireland were built before any form of minimum thermal specification or energy efficiency standard was enforced. Furthermore, 44% of households are categorised as being in fuel poverty, spending more than 10% of the household income to heat the house to an acceptable level of thermal comfort. To bring the existing housing stock up to an acceptable standard, retrofitting to improve energy efficiency is essential, and it is also necessary to study the effectiveness of such improvements under future climate scenarios. This paper presents the results of a year-long performance monitoring of two houses that underwent energy efficiency retrofits. Using wireless sensor technology, internal temperature and humidity, external weather, and household gas and electricity usage were monitored for a year. Simulations using the IES-VE dynamic building modelling software were calibrated against the monitoring data to ASHRAE Guideline 14 standards. The energy performance and the internal environment of the houses were then assessed for current and future climate scenarios, and the results show the need for a holistic, balanced retrofitting strategy.
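ASHRAE Guideline 14 calibration is typically checked with NMBE and CV(RMSE) statistics. The sketch below uses the commonly cited hourly thresholds (|NMBE| ≤ 10%, CV(RMSE) ≤ 30%) and an n−1 denominator, a simplification of the guideline's n−p form; it is a generic illustration, not the paper's calibration workflow.

```python
import math

def nmbe(measured, simulated):
    """Normalized Mean Bias Error in percent (simplified n-1 denominator)."""
    n = len(measured)
    mean_m = sum(measured) / n
    return sum(s - m for m, s in zip(measured, simulated)) / ((n - 1) * mean_m) * 100

def cv_rmse(measured, simulated):
    """Coefficient of Variation of the RMSE in percent (simplified n-1 denominator)."""
    n = len(measured)
    mean_m = sum(measured) / n
    rmse = math.sqrt(sum((s - m) ** 2 for m, s in zip(measured, simulated)) / (n - 1))
    return rmse / mean_m * 100

def calibrated_hourly(measured, simulated):
    """Hourly calibration check: |NMBE| <= 10% and CV(RMSE) <= 30%."""
    return abs(nmbe(measured, simulated)) <= 10 and cv_rmse(measured, simulated) <= 30
```

Here `measured` would be the monitored hourly gas or electricity series and `simulated` the corresponding IES-VE model output.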

Abstract:

The mixing performance of three passive milli-scale reactors with different geometries was investigated at different Reynolds numbers. The effects of design and operating characteristics, such as mixing channel shape and volume flow rate, were investigated. The main objective of this work was to demonstrate a process design method that uses Computational Fluid Dynamics (CFD) for modeling and Additive Manufacturing (AM) technology for fabrication. The reactors were designed and simulated using SolidWorks and Fluent 15.0 software, respectively. The devices were manufactured with an EOS M-series AM system. Step response experiments with distilled Millipore water and sodium hydroxide solution provided time-dependent concentration profiles. Villermaux-Dushman reaction experiments were also conducted for additional verification of the CFD results and for evaluating the mixing efficiency of the different geometries. Time-dependent concentration data and reaction evaluation showed that the performance of the AM-manufactured reactors matched the CFD results reasonably well. The proposed design method allows the implementation of new and innovative solutions, especially in the process design phase, for industrial-scale reactor technologies. Rapid implementation is a further advantage, thanks to the virtual flow design and the fast manufacturing, which uses the same geometric file formats.
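Mixing efficiency from point-concentration data is often summarized by a coefficient-of-variation index over samples taken across the channel outlet. The sketch below is a generic illustration of that idea, not the Villermaux-Dushman evaluation used in the paper.

```python
import statistics

def mixing_index(concentrations):
    """Coefficient of variation of point concentrations: 0 for a perfectly
    mixed cross-section, approaching 1 for fully segregated streams."""
    mean_c = statistics.mean(concentrations)
    return statistics.pstdev(concentrations) / mean_c
```

Comparing this index across geometries and Reynolds numbers, from either CFD fields or measured profiles, ranks the reactors by mixing performance.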

Abstract:

Malicious users try to compromise systems using ever newer techniques. One recent technique is to perform complex distributed attacks, such as denial of service, and to obtain sensitive data such as password information. Compromised machines of this kind are said to be infected with malicious software termed a "bot". In this paper, we investigate the correlation of behavioural attributes, such as keylogging and packet-flooding behaviour, to detect the existence of a single bot on a compromised machine by applying (1) Spearman's rank correlation (SRC) algorithm and (2) the Dendritic Cell Algorithm (DCA). We also compare the results of these two methods for the detection of a single bot. The results show that the DCA performs better at detecting malicious activities.
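Spearman's rank correlation itself is straightforward to compute; a self-contained sketch, with average ranks for ties, follows. How the paper maps keylogging and packet-flooding attributes onto the two input series is not shown here.

```python
def ranks(values):
    """1-based average ranks; tied values share the mean of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of rank positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank series."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A rho near 1 between, say, keystroke-event counts and outgoing-packet counts per time window would flag the kind of correlated bot behaviour the paper looks for.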

Abstract:

In this paper, we discuss our participation in the INEX 2008 Link-the-Wiki track. We used a sliding-window-based algorithm to extract frequent terms and phrases. Using the extracted phrases and terms as descriptive vectors, the anchors and relevant links (both incoming and outgoing) are recognized efficiently.
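A sliding-window frequent-phrase extractor can be sketched as an n-gram counter over the token stream. The window length and frequency threshold below are illustrative defaults, not the parameters of the track submission.

```python
from collections import Counter

def frequent_phrases(tokens, max_len=3, min_count=2):
    """Slide a window over the token stream, count every phrase (n-gram)
    of length 1..max_len, and keep those occurring at least min_count times."""
    counts = Counter()
    for i in range(len(tokens)):
        for n in range(1, max_len + 1):
            if i + n <= len(tokens):
                counts[tuple(tokens[i:i + n])] += 1
    return {phrase: c for phrase, c in counts.items() if c >= min_count}
```

The surviving phrases serve as the descriptive vector entries from which candidate anchors and link targets are then scored.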