275 results for Criticality


Relevance: 10.00%

Abstract:

This study is part of a research effort whose objective is the formulation of design criteria aimed at optimizing the energy performance of the wineries of small-to-medium-sized wine producers. Specifically, the research aims to identify indicators that can assess the influence of the main design variables on the building's energy demand and on the temperature profiles inside the rooms used for wine storage and ageing. These indicators provide information on the energy performance of the building and on the suitability of non-air-conditioned rooms for wine storage. Since design is a complex multidisciplinary activity, the research involved the development of a calculation program able to manage and process data from different fields (engineering, architecture, agro-industrial production, etc.) and to return synthetic results through purpose-built indicators. The program was applied to a company case study representative of the production sector. The effects of two harvesting modes and four different architectural solutions were examined; the building solutions derive from the combination of different thermal insulation levels and the presence or absence of underground rooms. The case-study analyses relied on dynamic energy simulations, supported and validated by thermal and meteorological monitoring campaigns within the company under study. The results showed that the calculation program conceived in this study identifies the building's critical points in terms of energy and of the "thermal comfort" of the wine, and allows an iterative revision of the design variables investigated.
It is therefore a computerized assessment tool supporting design, aimed at optimizing the design process and able to combine, in an integrated manner, the objectives of product quality, production efficiency, and economic and environmental sustainability.
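The suitability indicators themselves are not specified in this abstract; one plausible minimal form is a degree-hours measure of how far storage-room temperatures stray from a wine-friendly band. The 12-18 °C band and the hourly series below are illustrative assumptions, not values from the study.

```python
# Sketch of a degree-hours indicator for wine-storage suitability.
# The 12-18 °C band and the hourly series are illustrative assumptions,
# not values from the study.

def degree_hours_outside(temps, t_min=12.0, t_max=18.0):
    """Sum of hourly deviations (degC*h) outside the [t_min, t_max] band."""
    total = 0.0
    for t in temps:
        if t < t_min:
            total += t_min - t
        elif t > t_max:
            total += t - t_max
    return total

# Hypothetical hourly temperatures in a non-air-conditioned cellar.
hourly = [11.0, 12.5, 14.0, 17.5, 19.0, 20.0, 18.0, 13.0]
print(degree_hours_outside(hourly))  # 1 + 1 + 2 = 4.0
```

A lower value would indicate a room better suited to unconditioned wine storage; comparing this figure across building variants is the kind of iterative revision the abstract describes.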

Relevance: 10.00%

Abstract:

The thesis takes up a topic that has been the subject of very thorough studies in the past; today it seems to be back in the spotlight thanks to several contributions that have again prompted legal scholarship to engage with these delicate issues, also in light of the economic crisis. It has long been noted that well-drafted legislation is a fundamental factor for relaunching the country's economy, for simplification, and for guaranteeing order, coherence, and clarity in the legal system. The first part focuses on a historical and legal reconstruction of the sources that have governed the "rules on the quality of rules", together with an overview of the scholarship that has dealt with the topic in the past. This is followed by the specific identification of the formal and substantive drafting rules. In particular, one part is devoted to constitutional case law, in order to understand whether or not the Constitutional Court has a foothold allowing it to review "obscure rules" and declare them unlawful. The second part analyses practice; in particular, it examines the relationship between Government and Parliament in the main issues concerning the legislative procedure and the framework within which it unfolds, in relation to emergency decrees, maxi-amendments, confidence votes, committee scrutiny, and pressure groups. What emerged is poor adherence to the principles and criteria of better regulation, which are moreover hard for the Constitutional Court to enforce and removed from the control of those who, on the contrary, have competence in this field, namely the Committee on Legislation and the DAGL. The conclusions therefore start from the series of critical issues identified and attempt to trace a path forward that respects the canons of "better regulation", also in light of the constitutional reforms and the reforms of parliamentary rules currently in the process of being approved.

Relevance: 10.00%

Abstract:

Over the last few decades, the safety and reliability of process-industry plants has gained increasing importance. Reliability analysis makes it possible to identify the critical components of a plant that are most at risk. In this thesis, a reliability analysis was carried out for three plants at the SOL facility in Mantova: the low-pressure nitrogen vaporization plant, the medium-pressure nitrogen vaporization plant, and the synthetic-air production plant. Starting from the P&ID diagrams of the plants, the possible failure modes of the plants were analysed using the FMECA technique (Failure Modes, Effects and Criticality Analysis). Once the failure modes had been defined, plant reliability was quantified using FTA (Fault Tree Analysis). The results obtained from the fault-tree analysis made it possible to identify the primary events that contribute most to the failure of the systems studied, allowing hypotheses to be formulated for increasing plant reliability.
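The FTA quantification step described above can be sketched in a few lines: with independent basic events, OR and AND gates combine probabilities multiplicatively. The tree structure and the event probabilities below are invented for illustration and are not data from the SOL plants.

```python
# Minimal fault-tree evaluation sketch (AND/OR gates over independent
# basic events). The tree and probabilities are illustrative, not SOL data.

def ft_or(probs):
    """P(at least one of independent events) = 1 - prod(1 - p)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def ft_and(probs):
    """P(all independent events occur) = prod(p)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical tree: top = OR(valve_fails, AND(pump_A_fails, pump_B_fails))
valve_fails = 1e-3
pump_a = pump_b = 1e-2
top = ft_or([valve_fails, ft_and([pump_a, pump_b])])
print(top)  # ~ 1.1e-3: the single valve dominates the top event
```

Ranking basic events by their contribution to the top-event probability is what singles out the components whose improvement pays off most, as in the thesis's conclusions.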

Relevance: 10.00%

Abstract:

This thesis studied the onset of critical events in a simple Integrate-and-Fire neural model based on Markovian stochastic dynamical processes defined on a network. The electrical neural signal was modelled as a flow of particles. Attention was focused on the transient phase of the system, seeking to identify phenomena similar to neural synchronization, which can be considered a critical event. Particularly simple networks were studied, and the proposed model was found to be able to produce "cascade" effects in neural activity due to Self-Organized Criticality (the self-organization of the system into unstable states); these effects are not observed in Random Walks on the same networks. A small random stimulus was seen to generate remarkable fluctuations in network activity, particularly when the system is in a phase at the edge of equilibrium. The activity peaks detected in this way were interpreted as avalanches of neural signal, a phenomenon attributable to synchronization.
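The cascade mechanism described above can be illustrated with a toy integrate-and-fire (sandpile-like) model on a small network: a node that reaches its firing threshold discharges onto its neighbours, which may fire in turn. The ring network, threshold, and loading below are illustrative choices, not the thesis's actual model.

```python
# Toy integrate-and-fire cascade on a network: a firing node sheds its
# load onto its neighbours, possibly triggering an avalanche. The ring
# topology, threshold 4, and initial loads are illustrative assumptions.

def avalanche(load, graph, start, threshold=4):
    """Add one unit at `start`; topple until stable; return avalanche size."""
    load[start] += 1
    size = 0
    unstable = [n for n in graph if load[n] >= threshold]
    while unstable:
        n = unstable.pop()
        if load[n] < threshold:
            continue                 # already relaxed by an earlier topple
        load[n] -= threshold
        size += 1
        for m in graph[n]:
            load[m] += 1
            if load[m] >= threshold:
                unstable.append(m)
    return size

# Ring of 6 nodes, each with two neighbours, all at the edge of firing.
graph = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
load = {i: 3 for i in range(6)}
print(avalanche(load, graph, 0))  # 6: one grain makes every node fire
```

A single stimulus propagating through the whole network, as here, is the system-spanning avalanche the abstract interprets as synchronization; when initial loads are far below threshold, the same stimulus triggers nothing.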

Relevance: 10.00%

Abstract:

For the first time in metallic glasses, we extract both the exponents and the scaling functions that describe the nature, statistics, and dynamics of slip events during slow deformation, according to a simple mean-field model. We model the slips as avalanches of rearrangements of atoms in coupled shear transformation zones (STZs). Using high-temporal-resolution measurements, we find the predicted, different statistics and dynamics for small and large slips, thereby excluding self-organized criticality. The agreement between model and data across numerous independent measures provides evidence for slip avalanches of STZs as the elementary mechanism of inhomogeneous deformation in metallic glasses.
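Extracting a power-law exponent from slip-size statistics is typically done by maximum likelihood; a minimal sketch on synthetic data follows. The generated sizes and the true exponent (tau = 1.5) are assumptions for illustration, not the paper's measurements or its mean-field prediction.

```python
import math
import random

# Sketch: draw avalanche sizes from p(s) ~ s^(-tau) by inverse-transform
# sampling, then recover tau with the continuous power-law MLE
# tau_hat = 1 + n / sum(ln(s / s_min)). Data and tau = 1.5 are assumed.

def sample_power_law(tau, s_min, n, rng):
    """n sizes from p(s) ~ s^(-tau), s >= s_min."""
    return [s_min * (1.0 - rng.random()) ** (-1.0 / (tau - 1.0))
            for _ in range(n)]

def mle_exponent(sizes, s_min):
    """Maximum-likelihood estimate of the power-law exponent."""
    n = len(sizes)
    return 1.0 + n / sum(math.log(s / s_min) for s in sizes)

rng = random.Random(42)
sizes = sample_power_law(1.5, 1.0, 50_000, rng)
print(round(mle_exponent(sizes, 1.0), 2))  # close to the true value 1.5
```

Distinguishing a true power law (as in self-organized criticality) from the cut-off size distributions the mean-field model predicts is exactly the kind of statistical test the paper uses to discriminate between the two pictures.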

Relevance: 10.00%

Abstract:

This article addresses the issue of kriging-based optimization of stochastic simulators. Many of these simulators depend on factors that tune the level of precision of the response, the gain in accuracy coming at the price of computational time. The contribution of this work is two-fold: first, we propose a quantile-based criterion for the sequential design of experiments, in the fashion of the classical expected-improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for allocating the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. This article has supplementary material available online. The proposed criterion is available in the R package DiceOptim.
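For context, here is a sketch of the classical expected-improvement criterion that the quantile-based criterion generalizes (in Python rather than the article's R package, using the standard kriging EI formula for minimization; the numbers are illustrative).

```python
import math

# Classical expected improvement (EI) for minimization, given the kriging
# predictive mean mu and standard deviation sigma at a candidate point:
# EI = (f_best - mu) * Phi(z) + sigma * phi(z), with z = (f_best - mu)/sigma.
# This is the noiseless criterion the article's quantile-based one extends.

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, f_best):
    if sigma <= 0.0:
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

# A point predicted slightly worse than the incumbent but very uncertain
# can have higher EI than a point predicted equal to it but certain:
print(expected_improvement(mu=1.1, sigma=1.0, f_best=1.0) >
      expected_improvement(mu=1.0, sigma=0.01, f_best=1.0))  # True
```

The quantile-based variant replaces the deterministic prediction with a quantile of the noisy response, which is what allows observations of heterogeneous precision to be compared on a common footing.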

Relevance: 10.00%

Abstract:

We consider percolation properties of the Boolean model generated by a Gibbs point process and balls with deterministic radius. We show that for a large class of Gibbs point processes there exists a critical activity, such that percolation occurs a.s. above criticality. For locally stable Gibbs point processes we show a converse result, i.e. they do not percolate a.s. at low activity.
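A Monte Carlo sketch of the Boolean model follows, with a binomial (fixed-count, uniform) point process standing in for the Gibbs process (an assumption made purely to keep the example self-contained); percolation is tested as a left-right crossing of the unit square by the union of discs.

```python
import random

# Boolean model sketch: discs of fixed radius centred on random points.
# A uniform binomial point process replaces the Gibbs process (assumed
# simplification). Union-find detects a left-right crossing cluster.

def percolates(n_points, radius, rng):
    """True if the union of discs links the left and right sides of [0,1]^2."""
    pts = [(rng.random(), rng.random()) for _ in range(n_points)]
    parent = list(range(n_points + 2))     # two virtual nodes: the walls
    LEFT, RIGHT = n_points, n_points + 1

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for i, (x, y) in enumerate(pts):
        if x <= radius:
            union(i, LEFT)
        if x >= 1.0 - radius:
            union(i, RIGHT)
        for j in range(i):                 # discs overlap iff dist < 2r
            dx, dy = x - pts[j][0], y - pts[j][1]
            if dx * dx + dy * dy <= (2.0 * radius) ** 2:
                union(i, j)
    return find(LEFT) == find(RIGHT)

rng = random.Random(0)
print(percolates(400, 0.05, rng))  # high activity: crossing very likely
print(percolates(10, 0.05, rng))   # low activity: crossing very unlikely
```

The qualitative picture matches the result stated above: above a critical activity a spanning cluster appears, while at low activity the occupied region breaks into finite islands.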

Relevance: 10.00%

Abstract:

Currently more than half of Electronic Health Record (EHR) projects fail. Most of these failures are not due to flawed technology, but rather to the lack of systematic consideration of human issues. Among the barriers to EHR adoption, function mismatching among users, activities, and systems is a major area that has not been systematically addressed from a human-centered perspective. A theoretical framework called the Functional Framework was developed for identifying and reducing functional discrepancies among users, activities, and systems. The Functional Framework is composed of three models – the User Model, the Designer Model, and the Activity Model. The User Model was developed by conducting a survey (N = 32) that identified the functions needed and desired from the user's perspective. The Designer Model was developed by conducting a systematic review of an Electronic Dental Record (EDR) and its functions. The Activity Model was developed using an ethnographic method called shadowing, in which EDR users (5 dentists, 5 dental assistants, 5 administrative personnel) were followed quietly and observed in their activities. These three models were combined to form a unified model. From the unified model, the work domain ontology was developed by asking users to rate the functions in the unified model (190 in total) along the dimensions of frequency and criticality in a survey. The functional discrepancies, as indicated by the regions of the Venn diagrams formed by the three models, were consistent with the survey results, especially with user satisfaction. The survey for the Functional Framework indicated the preference of one system over the other (R = 0.895). The results of this project showed that the Functional Framework provides a systematic method for identifying, evaluating, and reducing functional discrepancies among users, systems, and activities. Limitations and the generalizability of the Functional Framework are discussed.
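The Venn-region analysis described above reduces to set operations on the three function inventories; a minimal sketch with invented placeholder function names (not the study's 190 items) follows.

```python
# The three models of the Functional Framework as sets of functions.
# Names are invented placeholders; the study rated 190 real functions.

user = {"enter_note", "view_chart", "e_prescribe", "recall_reminder"}
designer = {"enter_note", "view_chart", "billing_report"}
activity = {"enter_note", "view_chart", "recall_reminder"}

needed_but_missing = (user & activity) - designer  # wanted and used, not built
designed_unused = designer - (user | activity)     # built, never wanted or used
core = user & designer & activity                  # full three-way match

print(sorted(needed_but_missing))  # ['recall_reminder']
print(sorted(designed_unused))     # ['billing_report']
print(sorted(core))                # ['enter_note', 'view_chart']
```

Each Venn region names a distinct kind of discrepancy, which is why the regions can be compared directly against survey measures such as user satisfaction.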

Relevance: 10.00%

Abstract:

This work is devoted to the study of the macroscopic structures known in the literature as filaments or blobs, which have been observed universally at the edge of all kinds of magnetic confinement fusion devices. These filaments, convective cells stretched along the magnetic field lines, arise from the highly turbulent plasma present in such machines and seem to dominate the radial transport of particles and energy in the region known as the Scrape-off Layer, in which field lines become open and the plasma is directed towards the solid wall of the vacuum vessel.

Although the behavior and scaling laws of these structures are relatively well known, there is not yet a generally accepted theory of the physical mechanism responsible for their formation, which remains one of the main unsolved questions in the edge transport theory of fusion plasmas and a matter of great practical importance for the development of the next generation of fusion reactors (including devices such as ITER and DEMO), since the efficiency of confinement and the amount of energy deposited on the wall depend directly on the characteristics of edge transport. The work was carried out from an eminently experimental perspective, including the observation and analysis of this kind of structure in the heliotron-type stellarator LHD (a large device capable of generating plasmas with characteristics close to those required in a fusion reactor) and in the heliac-type stellarator TJ-II (a medium-sized device producing relatively colder plasmas, but with greater accessibility and diagnostics availability).

In particular, in LHD the generation of filaments during high-β discharges (with a high ratio of kinetic to magnetic pressure) was observed by means of an ultrafast visible camera; their behavior was characterized, and the possible role of Self-Organized Criticality in the formation of these structures was investigated through statistical analysis and comparison with theoretical models. In TJ-II, a probe head capable of simultaneously measuring the electrostatic and electromagnetic fluctuations of the plasma was designed and built. Thanks to this new diagnostic, experiments could be carried out to determine the presence of parallel current through the filaments (a parameter of great importance in their modelling) and to relate the two types of fluctuations for the first time in a stellarator. Likewise, also for the first time in this type of device, simultaneous measurements of the viscous and magnetic momentum transport tensors (Reynolds and Maxwell) were performed.

Relevance: 10.00%

Abstract:

Burn-up credit analyses are based on depletion calculations that provide an accurate prediction of spent fuel isotopic contents, followed by criticality calculations to assess keff

Relevance: 10.00%

Abstract:

Determining the spent nuclear fuel isotopic content as accurately as possible is gaining importance due to its safety and economic implications. Since higher burn-ups are nowadays achievable through increased initial enrichments, more efficient burn-up strategies within the reactor cores, and the extension of irradiation periods, establishing and improving computational methodologies is mandatory in order to carry out reliable criticality and isotopic prediction calculations. Several codes (WIMSD5, SERPENT 1.1.7, SCALE 6.0, MONTEBURNS 2.0 and MCNP-ACAB) and methodologies are tested here and compared against consolidated benchmarks (the OECD/NEA light-water-moderated pin cell) with the purpose of validating them and reviewing the state of isotopic prediction capabilities. These preliminary comparisons suggest what can generally be expected of these codes when applied to real problems. In the present paper, SCALE 6.0 and MONTEBURNS 2.0 are used to model the same reported geometries, material compositions, and burn-up history of the Spanish Vandellós II reactor cycles 7-11 and to reproduce the measured isotopic compositions after irradiation and decay times. We compare the measurements with each code's results for several degrees of geometrical modelling detail, using different libraries and cross-section treatment methodologies. The power and flux normalization method implemented in MONTEBURNS 2.0 is discussed and a new normalization strategy is developed for the selected problems; further options are included to reproduce the temperature distributions of the materials within the fuel assemblies, and a new code is introduced to automate series of simulations and manage material information between them. In order to have a realistic confidence level in the prediction of spent fuel isotopic content, we have estimated uncertainties using our MCNP-ACAB system.
This depletion code, which couples the neutron transport code MCNP with the inventory code ACAB, propagates uncertainties into the nuclide inventory, assessing the potential impact of uncertainties in the basic nuclear data: cross-sections, decay data, and fission yields.
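The uncertainty-propagation idea behind such a coupling can be sketched with a deliberately minimal one-nuclide depletion model, N(t) = N0·exp(-σφt), perturbing a one-group cross-section with an assumed 5% relative uncertainty; all numerical values are illustrative, not the paper's data.

```python
import math
import random

# Monte Carlo propagation sketch: sample a one-group absorption
# cross-section (assumed 5% relative uncertainty), deplete a single
# nuclide, and observe the induced spread in the final density.
# sigma0 = 1e-22 cm^2 (100 b), phi = 3e14 n/cm^2/s, t ~ 1 cycle are
# illustrative values, not from the paper.

def depleted_density(n0, sigma, phi, t):
    """One-nuclide depletion: N(t) = N0 * exp(-sigma * phi * t)."""
    return n0 * math.exp(-sigma * phi * t)

rng = random.Random(1)
n0, sigma0, phi, t = 1.0, 1e-22, 3e14, 3.0e7
samples = [depleted_density(n0, rng.gauss(sigma0, 0.05 * sigma0), phi, t)
           for _ in range(10_000)]

mean = sum(samples) / len(samples)
std = (sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)) ** 0.5
print(f"N(t) = {mean:.4f} +/- {std:.4f}")  # spread induced by 5% on sigma
```

The full MCNP-ACAB system does the same thing at the scale of a whole inventory, sampling cross-sections, decay data, and fission yields and re-running the coupled transport-depletion calculation for each sample.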

Relevance: 10.00%

Abstract:

The accurate prediction of the spent nuclear fuel content is essential for its safe and optimized transportation, storage, and management. This isotopic evolution can be predicted using powerful codes and methodologies throughout both irradiation and cooling periods. However, in order to have a realistic confidence level in the prediction of spent fuel isotopic content, it is desirable to determine how input uncertainties affect isotopic prediction calculations by quantifying the uncertainties associated with the predictions.

Relevance: 10.00%

Abstract:

Isotopic content assessment is of paramount importance for safety and storage reasons. In recent years, a great variety of codes have been developed to perform transport and decay calculations, but only those that couple both in an iterative manner achieve an accurate prediction of the final isotopic content of irradiated fuels. Needless to say, all of them are expected to pass the test of comparing their predictions against the corresponding experimental measurements.

Relevance: 10.00%

Abstract:

As defined in the ATM 2000+ Strategy (Eurocontrol 2001), the mission of the Air Traffic Management (ATM) System is: "For all the phases of a flight, the ATM system should facilitate a safe, efficient, and expeditious traffic flow, through the provision of adaptable ATM services that can be dimensioned in relation to the requirements of all the users and areas of the European air space. The ATM services should comply with the demand, be compatible, operate under uniform principles, respect the environment and satisfy the national security requirements." The objective of this paper is to present a methodology designed to evaluate the status of the ATM system in terms of the relationship between offered capacity and traffic demand, identifying areas of weakness and proposing solutions. The first part of the methodology concerns the characterization and evaluation of the current system, while the second part proposes an approach to analyze its possible development limit. As part of the work, general criteria are established to define the framework in which the analysis and diagnostic methodology is placed. They are: the use of Air Traffic Control (ATC) sectors as the unit of analysis, the presence of network effects, the tactical focus, the relative character of the analysis, objectivity, and a high-level assessment that allows assumptions on the human and Communications, Navigation and Surveillance (CNS) elements, considered as the typical high-density air traffic resources. The steps followed by the methodology start with the definition of indicators and metrics, such as the nominal criticality or the nominal efficiency of a sector; scenario characterization, where the necessary data are collected; network-effects analysis, to study the relations among the constitutive elements of the ATC system; diagnosis by means of the "System Status Diagram"; an analytical study of the ATC system development limit; and, finally, the formulation of conclusions and proposals for improvement.
This methodology was employed by Aena (Spanish Airports Manager and Air Navigation Service Provider) and INECO (Spanish Transport Engineering Company) in the analysis of the Spanish ATM System in the frame of the Spanish airspace capacity sustainability program, although it could be applied elsewhere.
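The abstract names indicators such as the nominal criticality of a sector without defining them; a plausible minimal stand-in is a peak-hour demand-to-capacity ratio per ATC sector. All sector names, capacities, and demand figures below are hypothetical.

```python
# Hypothetical stand-in for a sector "criticality" indicator: peak hourly
# demand divided by declared capacity. Sector names, capacities, and
# demand figures are invented, not from the Aena/INECO analysis.

def nominal_criticality(hourly_demand, capacity):
    """Peak hourly demand divided by declared sector capacity."""
    return max(hourly_demand) / capacity

sectors = {
    "LECM-N": ([38, 45, 52, 47], 50),  # (flights per hour, capacity)
    "LECM-S": ([20, 24, 22, 19], 40),
}

for name, (demand, cap) in sectors.items():
    c = nominal_criticality(demand, cap)
    flag = "overloaded" if c > 1.0 else "ok"
    print(f"{name}: criticality {c:.2f} ({flag})")
```

Plotting such a ratio for every sector is one simple way to build the kind of "System Status Diagram" the methodology uses to locate areas of weakness.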

Relevance: 10.00%

Abstract:

There are a number of research and development activities exploring Time and Space Partitioning (TSP) to implement safe and secure flight software. This approach makes it possible to execute different real-time applications, with different levels of criticality, on the same computer board. In order to do that, flight applications must be isolated from each other in the temporal and spatial domains. This paper presents the first results of a partitioning platform based on the Open Ravenscar Kernel (ORK+) and the XtratuM hypervisor. ORK+ is a small, reliable real-time kernel supporting the Ada Ravenscar computational model that is central to the ASSERT development process. XtratuM supports multiple virtual machines, i.e. partitions, on a single computer and is being used in the Integrated Modular Avionics for Space study. ORK+ executes in an XtratuM partition, enabling Ada applications to share the computer board with other applications.
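The temporal-isolation guarantee described above is usually realized with a fixed cyclic schedule: a major frame divided into partition windows, each owned by one application regardless of what the others do. The frame layout below is an invented example, not the ASSERT or XtratuM configuration.

```python
# Sketch of cyclic time partitioning: a 100 ms major frame is split into
# fixed windows, each owned by one partition. Partition names, offsets,
# and durations are invented for illustration.

MAJOR_FRAME_MS = 100
WINDOWS = [  # (partition, offset_ms, duration_ms)
    ("flight_control", 0, 40),
    ("telemetry", 40, 30),
    ("payload", 70, 30),
]

def partition_at(t_ms):
    """Return the partition owning the CPU at absolute time t_ms."""
    t = t_ms % MAJOR_FRAME_MS
    for name, offset, duration in WINDOWS:
        if offset <= t < offset + duration:
            return name
    raise ValueError("gap in schedule")

print(partition_at(10))   # flight_control
print(partition_at(65))   # telemetry
print(partition_at(170))  # payload (second major frame)
```

Because the windows are fixed at configuration time, an overrunning low-criticality partition cannot steal time from a high-criticality one, which is the core safety argument of TSP architectures.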