897 results for end-to-end testing, javascript, application web, single-page application


Abstract:

Dynamic system test methods for heating systems had already been developed and applied by the institutes SERC and SP (Sweden), INES (France) and SPF (Switzerland) before the MacSheep project started. These test methods followed the same principle: a complete heating system, including heat generators, storage, control, etc., is installed on the test rig; the test rig software and hardware simulates and emulates the heat load for space heating and domestic hot water of a single-family house, while the unit under test has to act autonomously to cover the heat demand during a representative test cycle. Within work package 2 of the MacSheep project these similar, but different, test methods were harmonized and improved. The work undertaken includes:
• Harmonization of the physical boundaries of the unit under test.
• Harmonization of the boundary conditions of climate and load.
• Definition of an approach to reach an identical space heat load in combination with autonomous control of the space heat distribution by the unit under test.
• Derivation and validation of new six-day and twelve-day test profiles for direct extrapolation of test results.
The new harmonized test method combines the advantages of the different methods that existed before the MacSheep project. The new method is a benchmark test: the load for space heating and domestic hot water preparation is identical for all tested systems, and the result is representative of the performance of the system over a whole year. Thus, no modelling and simulation of the tested system is needed in order to obtain the benchmark results for a yearly cycle, and the method is also applicable to products for which simulation models are not yet available. Some of the advantages of the new whole-system test method and performance rating compared with the testing and energy rating of single components are:
• Interactions between the different components of a heating system, e.g. storage, solar collector circuit, heat pump and control, are included and evaluated in the test.
• Dynamic effects are included and influence the result just as they influence the annual performance in the field.
• Heat losses influence the results in a more realistic way, since they are evaluated under "real installed" and representative part-load conditions rather than under single-component steady-state conditions.
The described method is also suited to the development process of new systems, where it replaces time-consuming and costly field testing, with the advantages of higher accuracy of the measured data (compared with the measurement equipment typically used in field tests) and identical, and thus comparable, boundary conditions. The method can therefore be used for system optimization on the test bench under realistic operating conditions, i.e. in a relevant operating environment in the lab.
This report describes the physical boundaries of the tested systems, as well as the test procedures and the requirements for both the unit under test and the test facility. The new six-day and twelve-day test profiles are also described, as are the validation results.
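Because the profiles are meant for direct extrapolation, the last step can be illustrated with a minimal sketch: scale the energy balance measured over the representative cycle to a year. The numbers and the seasonal-performance-factor metric below are invented for illustration, not taken from the report.

# Minimal sketch of "direct extrapolation": scaling the energy balance
# measured over a representative 12-day benchmark cycle to an annual figure.
# The profile is assumed representative of the whole year, so no system
# simulation is required. All numbers are illustrative placeholders.

DAYS_PER_YEAR = 365

def extrapolate_annual(heat_delivered_kwh: float,
                       electricity_used_kwh: float,
                       cycle_days: int = 12) -> dict:
    """Scale a representative test-cycle result to one year."""
    factor = DAYS_PER_YEAR / cycle_days
    annual_heat = heat_delivered_kwh * factor
    annual_elec = electricity_used_kwh * factor
    return {
        "annual_heat_kWh": annual_heat,
        "annual_electricity_kWh": annual_elec,
        # Seasonal performance factor: useful heat per unit of electricity.
        "SPF": annual_heat / annual_elec,
    }

print(extrapolate_annual(heat_delivered_kwh=620.0, electricity_used_kwh=210.0))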

Abstract:

Designing a successful web project requires not only an understanding of the owner's business and technological needs and substantial management and development experience, but also a thorough knowledge of the system's application domain and of other existing systems in that domain. In order to gather such domain knowledge, it is necessary to identify the nature of the proposed web services venture with regard to other similar services offered in the domain, the business setting of the enterprises that initiate such ventures, the various types of customers involved, and how these factors translate into requirements. In this paper, we present an approach to studying the domain of web-enabled Human Resource and payroll services with the aim of attaining design knowledge that would ensure customer satisfaction and could eventually pave the way to the successful implementation of web-enabled services.

Abstract:

Systematic usability testing of the library website was unheard of at Deakin University Library three years ago. Over the last two years, however, a large-scale usability testing program has evolved, and various methodologies have been trialled and tested by the team. This paper discusses the methodologies the team used and the changes that were made to the Library's search interfaces as a result of the studies. The paper provides useful insights into what we did right, and into what we need to do differently in future usability studies.

Abstract:

To reduce weight and improve passenger safety there is an increased need in the automotive industry to use Ultra High Strength Steel (UHSS) for structural and crash components. However, the application of UHSS is restricted by their limited formability and the difficulty of forming them in conventional stamping. An alternative method of manufacturing structural auto body parts from UHSS is the flexible roll forming process, which allows the manufacture of metal sheet with high strength and limited ductility into complex and weight-optimized components. One major problem in the flexible roll forming of UHSS is the web-warping defect, which is the deviation in height of the web area over the length of the profile. It has been shown that web-warping is strongly dependant to the permanent longitudinal strain formed in the flange of the part. Flexible roll forming is a continuous process with many roll stands, which makes numerical analysis extremely time intensive and computationally expensive. An analytical model of web-warping is therefore critical to improve design efficiency during the early process design stage before FEA is applied. This paper establishes for the first time an analytical model for the prediction of web-warping for the flexible roll forming of a section with variable width. The model is based on evaluating longitudinal edge strain in the flange of the part. This information is then used in combination with a simple geometrical model to investigate the relationship between web-warping and longitudinal strain with respect to process parameters.
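For intuition only, a toy geometric relation (not the analytical model derived in the paper) can link a permanent edge strain to a height deviation: treating the strain difference across the flange width as a bending curvature and taking the circular-arc sagitta over the profile length. All dimensions are invented.

# Illustrative-only geometric sketch (not the paper's analytical model):
# if the flange edge carries a permanent longitudinal strain eps while the
# web centreline stays unstrained, the strain difference across the flange
# width b acts like a bending curvature kappa = eps / b, and the resulting
# height deviation of the web over a profile length L follows the circular
# arc approximation h = kappa * L**2 / 8.

def web_warping_height(eps_edge: float, flange_width: float,
                       profile_length: float) -> float:
    """Height deviation (same units as profile_length) from edge strain."""
    curvature = eps_edge / flange_width          # 1/mm
    return curvature * profile_length**2 / 8.0   # sagitta of a circular arc

# Example: 0.5 % permanent edge strain, 40 mm flange, 800 mm profile.
print(f"{web_warping_height(0.005, 40.0, 800.0):.1f} mm")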

Abstract:

The objective of this study was to estimate (co)variance components using random regression on B-spline functions for weight records obtained from birth to adulthood. A total of 82 064 weight records of 8145 females, obtained from the data bank of the Nellore Breeding Program (PMGRN/Nellore Brazil), which started in 1987, were used. The models included direct additive and maternal genetic effects and animal and maternal permanent environmental effects as random effects. Contemporary group and dam age at calving (linear and quadratic effects) were included as fixed effects, and orthogonal Legendre polynomials of age (cubic regression) were considered as a random covariate. The random effects were modeled using B-spline functions, considering linear, quadratic and cubic polynomials for each individual segment. Residual variances were grouped into five age classes. Direct additive genetic and animal permanent environmental effects were modeled using up to seven knots (six segments). A single segment with two knots at the end points of the curve was used for the estimation of maternal genetic and maternal permanent environmental effects. A total of 15 models were studied, with the number of parameters ranging from 17 to 81. The models that used B-splines were compared with multi-trait analyses of nine weight traits and with a random regression model that used orthogonal Legendre polynomials. A model fitting quadratic B-splines, with four knots (three segments) for the direct additive genetic and animal permanent environmental effects and two knots for the maternal additive genetic and maternal permanent environmental effects, was the most appropriate and parsimonious model to describe the covariance structure of the data. Selection for higher weight at young ages should be performed taking into account the accompanying increase in mature cow weight. This is particularly important in most Nellore beef cattle production systems, where the cow herd is maintained under range conditions. There is limited scope for modifying the growth curve of Nellore cattle so as to select for rapid growth at young ages while maintaining constant adult weight.
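The structure of such a spline basis can be sketched as follows: a quadratic B-spline basis with four knots (three segments), which yields the design matrix linking each weight record to the animal's random-regression coefficients. The age range and equally spaced knot positions are illustrative assumptions, not the study's values.

# Sketch: quadratic B-spline basis over an age range, as used for the
# direct additive genetic / animal permanent environmental effects
# (4 knots = 3 segments). Knot positions here are illustrative.
import numpy as np
from scipy.interpolate import BSpline

degree = 2                               # quadratic segments
interior = np.linspace(1, 735, 4)        # 4 knots over an assumed age range (days)
# Repeat the boundary knots 'degree' extra times, as B-splines require.
t = np.r_[[interior[0]] * degree, interior, [interior[-1]] * degree]
n_basis = len(t) - degree - 1            # number of random-regression coefficients

ages = np.linspace(1, 735, 5)            # example record ages
Z = np.column_stack([
    BSpline(t, np.eye(n_basis)[i], degree)(ages)   # i-th basis function
    for i in range(n_basis)
])
print(Z.round(3))   # design matrix: rows = records, columns = coefficients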

Abstract:

Grinding is a finishing process for advanced products and surfaces. However, continuous friction between the workpiece and the grinding wheel causes the latter to lose its sharpness, thus impairing the grinding results. This is when the dressing process is required, which consists of sharpening the worn grains of the grinding wheel. The dressing conditions strongly affect the performance of the grinding operation; hence, monitoring them throughout the process can increase its efficiency. The objective of this study was to estimate the wear of a single-point dresser using intelligent systems whose inputs were obtained by digital processing of acoustic emission signals. Two intelligent systems, the multilayer perceptron and the Kohonen neural network, were compared in terms of their classification ability. The harmonic content of the acoustic emission signal was found to be influenced by the condition of the dresser, and when it is used as input to the neural networks it is possible to classify the condition of the tool under study.
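The classification pipeline can be sketched in a few lines: extract band-averaged spectral features from an acoustic emission signal and feed them to a multilayer perceptron. The synthetic signals, frequency bands and labels below are placeholders, not the study's data.

# Sketch: classifying dresser wear from the harmonic content of an acoustic
# emission (AE) signal with a multilayer perceptron. Signals and labels are
# synthetic placeholders standing in for measured AE data.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def harmonic_features(signal: np.ndarray, n_bands: int = 16) -> np.ndarray:
    """Band-averaged FFT magnitudes as a compact spectral descriptor."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    return np.array([b.mean() for b in bands])

# Placeholder dataset: label 0 = sharp dresser, 1 = worn dresser, with the
# worn condition assumed to shift energy toward higher harmonics.
X, y = [], []
for label, hf_gain in [(0, 0.2), (1, 1.0)]:
    for _ in range(100):
        t = np.linspace(0, 1, 2048)
        s = (np.sin(2 * np.pi * 50 * t)
             + hf_gain * np.sin(2 * np.pi * 400 * t)
             + 0.3 * rng.standard_normal(t.size))
        X.append(harmonic_features(s))
        y.append(label)

X_tr, X_te, y_tr, y_te = train_test_split(np.array(X), np.array(y), random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))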

Abstract:

Purpose: The aim of this study was to investigate the influence of an Nd:YAG laser on the shear bond strength to enamel and dentin of total-etch and self-etch adhesives when the laser was applied over the adhesives before they were photopolymerized, in an attempt to create a new bonding layer by dentin-adhesive melting.
Materials and Methods: One hundred and twenty bovine incisors were ground to obtain flat surfaces. Specimens were divided into two substrate groups (n=60): substrate E (enamel) and substrate D (dentin). Each substrate group was subdivided into four groups (n=15) according to the surface treatment: X (Xeno III self-etching adhesive, control), XL (Xeno III + Nd:YAG laser irradiation at 140 mJ/10 Hz for 60 seconds + photopolymerization, experimental), S (acid etching + Single Bond conventional adhesive, control), and SL (acid etching + Single Bond + Nd:YAG laser at 140 mJ/10 Hz for 60 seconds + photopolymerization, experimental). The bonding area was delimited with 3-mm-diameter adhesive tape, and cylinders of composite were fabricated on the bonding area using a Teflon matrix. The teeth were stored in water at 37°C for 48 h and submitted to shear testing at a crosshead speed of 0.5 mm/min in a universal testing machine. Results were analyzed with three-way analysis of variance (ANOVA; substrate, adhesive and treatment) and Tukey tests (alpha=0.05).
Results: ANOVA revealed significant differences for substrate, adhesive system and type of treatment, lased or unlased (p<0.05). The mean shear bond strength values (MPa) for the enamel groups were X=20.2 +/- 5.61, XL=23.6 +/- 4.92, S=20.8 +/- 4.55 and SL=22.1 +/- 5.14, and for the dentin groups X=14.1 +/- 7.51, XL=22.2 +/- 6.45, S=11.2 +/- 5.77 and SL=15.9 +/- 3.61. For dentin, the Xeno III self-etch adhesive showed significantly higher shear bond strength than the Single Bond total-etch adhesive, and Nd:YAG laser irradiation yielded significantly higher shear bond strength than the unlased control.
Conclusion: Nd:YAG laser application prior to photopolymerization of the adhesive systems significantly increased the bond strength to dentin.

Abstract:

Web content hosting, in which a Web server stores and provides Web access to documents for different customers, is becoming increasingly common. For example, a web server can host webpages for several different companies and individuals. Traditionally, Web Service Providers (WSPs) provide all customers with the same level of performance (best-effort service); most service differentiation has been in the pricing structure (individual vs. business rates) or the connectivity type (dial-up access vs. leased line, etc.). This report presents DiffServer, a program that implements two simple, server-side, application-level mechanisms (server-centric and client-centric) to provide different levels of web service. The experiments show that this additional layer of abstraction between the client and the Apache web server adds little overhead under light load conditions, and that the average waiting time for high-priority requests decreases significantly once priorities are assigned, compared with a FIFO approach.
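The scheduling idea behind such priority-based differentiation can be sketched as follows (a minimal sketch, not DiffServer's actual code; request names and priority values are illustrative): high-priority requests are dequeued before low-priority ones, with FIFO order preserved within a priority class.

# Sketch of priority-based service differentiation vs. plain FIFO.
import heapq
import itertools

class PriorityRequestQueue:
    """Smaller priority number = served earlier; FIFO within a class."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()   # tie-breaker preserves FIFO order

    def put(self, request: str, priority: int) -> None:
        heapq.heappush(self._heap, (priority, next(self._counter), request))

    def get(self) -> str:
        priority, _, request = heapq.heappop(self._heap)
        return request

q = PriorityRequestQueue()
q.put("GET /individual/page.html", priority=2)   # best-effort customer
q.put("GET /business/index.html", priority=1)    # premium customer
q.put("GET /individual/img.png", priority=2)
print([q.get() for _ in range(3)])   # the business request is served first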

Abstract:

The quaternary structure of the respiratory protein hemocyanin (isoform HtH1) from the marine snail Haliotis tuberculata was investigated by cryo-electron microscopy and 3D reconstruction. The molecule is cylindrical, has a diameter of about 35 nm and consists of a cylinder wall and an internal collar complex, which in turn consists of a collar and an arc. Cryo-electron micrographs of HtH1 molecules fixed in vitreous ice yielded an enormous improvement in the number of available viewing angles compared with negatively stained molecules prepared on carbon film. The 3D reconstruction of HtH1 from images taken at three different defocus values again markedly improved the resolution, to 12 Å, compared with reconstructions made from images at a single fixed defocus value. The molecule possesses D5 symmetry. From this reconstruction, the most accurate of a mollusc hemocyanin from EM images to date, the following new structural details could be derived:
• A subunit dimer could be described as the repeating unit in the HtH1 decamer.
• The subunit dimer could be isolated from the 3D density map. It clearly consists of 16 masses corresponding to functional domains; two of these masses form the collar, two the arc and 12 the wall segment.
• The antiparallel arrangement of the two subunits within this subunit dimer could be confirmed and narrowed down to two possibilities.
• The number of alternative arrangements of the 16 functional domains (HtH1-a to HtH1-h) in the subunit dimer could be reduced from 80 to 2.
• Using molecular modelling based on a published crystal structure, a 3D structure of the functional domain HtH1-g at near-atomic resolution could be calculated.
• The functional domain HtH1-g could be plausibly fitted, as a domain pair, into the 3D density map of the subunit dimer, namely into the two masses of the arc.
From the density map obtained by electron microscopy, with the help of the

Abstract:

"Cartographic heritage" is different from "cartographic history". The latter refers to the study of the development, over time, of surveying and drawing techniques related to maps, i.e. through the different types of cultural environment that formed the background for the creation of maps. The former denotes the whole body of ancient maps, together with those cultural environments, that history has handed down to us and that we perceive as cultural values to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer from preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography make it possible to preserve this map heritage; moreover, modern geomatic techniques offer new ways of using historical information that would be unachievable on analog supports. This PhD thesis reports the whole digital workflow for the recovery and elaboration of ancient cartography, with special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage. The workflow can be divided into three main steps, which reflect the chapter structure of the thesis itself:
• Map acquisition: conversion of the ancient map support from analog to digital, by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning techniques); this process must be performed carefully, with special instruments, in order to reduce deformation as much as possible.
• Map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points; in this way it is possible to understand the projection features of the historical map, and to evaluate and represent the degree of deformation induced by the old type of cartographic transformation (which may be unknown to us), by surveying errors or by support deformation, errors that are usually large with respect to modern standards.
• Data elaboration and management in a digital environment, by means of modern software tools: vectorization, giving the map a new and more attractive graphic form (for instance, by creating a 3D model), superimposing it on current base maps, comparing it with other maps, and finally inserting it in a GIS or WebGIS environment as a specific layer.
The study is supported by several case histories, each of interest from the point of view of at least one step of digital cartographic elaboration. The ancient maps taken into account are the following:
• Three maps of the Po river delta, made at the end of the XVI century by a famous land surveyor, Ottavio Fabri (sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook explaining a new topographical instrument, the squadra mobile (mobile square), which he invented and used himself; today all three maps are preserved in the State Archive of Venice.
• The Ichnoscenografia of Bologna by Filippo de' Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic, bird's-eye view of the city, but one with icnographic value as well, as the author himself declares.
• The map of Bologna by the periti Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years (1711–1712) after the map by de' Gnudi; here the scenographic view was abandoned in favor of a more correct representation by means of orthogonal projection; today the map is preserved in the State Archive of Bologna.
• The Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it is composed of 140 maps and 12 brogliardi (register volumes).
In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure; the delta maps were also analyzed from the georeferencing point of view, and the Cadastre with respect to a possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling. Last but not least, we must not forget that the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques opens up new research opportunities in a rich and modern multidisciplinary context.
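The georeferencing step can be sketched with a minimal example: estimating a 2D affine transformation from ground control points by least squares and inspecting the residuals as a first measure of map deformation. The pixel and world coordinates below are invented for illustration; real work would use many more points.

# Sketch: affine georeferencing from ground control points (GCPs).
import numpy as np

# (pixel_x, pixel_y) on the scanned map  ->  (E, N) in a modern datum
pixels = np.array([[120, 340], [1510, 310], [1480, 1760], [150, 1800]])
world  = np.array([[683200.0, 4929400.0], [685900.0, 4929450.0],
                   [685850.0, 4926700.0], [683250.0, 4926650.0]])

# Affine model: [E, N] = A @ [x, y] + t  ->  six unknowns, solved per axis.
G = np.column_stack([pixels, np.ones(len(pixels))])
params_E, *_ = np.linalg.lstsq(G, world[:, 0], rcond=None)
params_N, *_ = np.linalg.lstsq(G, world[:, 1], rcond=None)

est = G @ np.column_stack([params_E, params_N])
residuals = np.linalg.norm(est - world, axis=1)
print("RMS residual (map units):", np.sqrt((residuals ** 2).mean()))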

Abstract:

A sample-scanning confocal optical microscope (SCOM) was designed and constructed in order to perform local measurements of fluorescence, light scattering and Raman scattering. The instrument makes it possible to measure time-resolved fluorescence, Raman scattering and light scattering from the same diffraction-limited spot, so that fluorescence from single molecules and light scattering from metallic nanoparticles can be studied. First, the electric field distribution in the focus of the SCOM was modelled. This enables the design of illumination modes for different purposes, such as the determination of the three-dimensional orientation of single chromophores. Second, a method for the calculation of the de-excitation rates of a chromophore was presented, which permits the comparison of different detection schemes and experimental geometries in order to optimize the collection of fluorescence photons. Both methods were combined to calculate the SCOM fluorescence signal of a chromophore in a general layered system. The fluorescence excitation and emission of single molecules through a thin gold film was investigated experimentally and modelled. It was demonstrated that, owing to the mediation of surface plasmons, single-molecule fluorescence near a thin gold film can be excited and detected with an epi-illumination scheme through the film; single-molecule fluorescence as close as 15 nm to the gold film was studied in this manner. The fluorescence dynamics (fluorescence blinking and excited-state lifetime) of single molecules was studied in the presence and in the absence of a nearby gold film in order to investigate the influence of the metal on the electronic transition rates. The trace-histogram and autocorrelation methods for the analysis of single-molecule fluorescence blinking were presented and compared via the analysis of Monte Carlo simulated data. The nearby gold influences the total decay rate in agreement with theory. The presence of the gold had no influence on the intersystem crossing (ISC) rate from the excited state to the triplet, but increased the transition rate from the triplet to the singlet ground state by a factor of 2. The photoluminescence blinking of Zn0.42Cd0.58Se quantum dots (QDs) on glass and ITO substrates was investigated experimentally as a function of the excitation power (P) and modelled via Monte Carlo simulations. At low P, the probability of a given on- or off-time was observed to follow a negative power law with an exponent close to 1.6. As P increased, the on-time fraction was reduced on both substrates, whereas the off-times did not change. A weak residual memory effect was observed between consecutive on-times and between consecutive off-times, but not between an on-time and the adjacent off-time. All of this suggests the presence of two independent mechanisms governing the lifetimes of the on- and off-states. The simulated data showed Poisson-distributed off- and on-intensities, demonstrating that the observed non-Poissonian on-intensity distribution of the QDs is not a product of the underlying power-law probability, and that the blinking of QDs occurs between a non-emitting off-state and a distribution of emitting on-states with different intensities. All the experimentally observed photo-induced effects could be accounted for by introducing a characteristic lifetime tPI of the on-state in the simulations. For the QDs on glass, tPI was proportional to P^-1, suggesting a one-photon process. Light scattering images and spectra of colloidal and C-shaped gold nanoparticles were acquired; the minimum size of a metallic scatterer detectable with the SCOM lies around 20 nm.
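The power-law analysis of blinking times can be sketched as follows: synthetic on/off-times drawn from a known power law with exponent 1.6 stand in for measured data, and the exponent is recovered from a log-log histogram fit. Sample sizes and the shortest resolvable time are illustrative assumptions.

# Sketch: estimating the exponent of a blinking-time distribution,
# P(t) ~ t**(-alpha), from a log-log histogram fit.
import numpy as np

rng = np.random.default_rng(1)
alpha, t_min = 1.6, 0.01            # true exponent, shortest resolvable time (s)
# Inverse-transform sampling of a Pareto-type power law.
times = t_min * (1 - rng.random(5000)) ** (-1 / (alpha - 1))

bins = np.logspace(np.log10(t_min), np.log10(times.max()), 30)
hist, edges = np.histogram(times, bins=bins, density=True)
centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centers
mask = hist > 0
slope, _ = np.polyfit(np.log10(centers[mask]), np.log10(hist[mask]), 1)
print(f"fitted exponent: {-slope:.2f}")     # should be close to 1.6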

Abstract:

Synthetic Biology is a relatively new discipline, born at the beginning of the new millennium, that brings the typical engineering approach (abstraction, modularity and standardization) to biotechnology. These principles aim to tame the extreme complexity of the various components and to aid the construction of artificial biological systems with specific functions, usually by means of synthetic genetic circuits implemented in bacteria or simple eukaryotes like yeast. The cell becomes a programmable machine whose low-level programming language is made of strings of DNA. This work was performed in collaboration with researchers of the Department of Electrical Engineering of the University of Washington in Seattle and with a student of the Corso di Laurea Magistrale in Ingegneria Biomedica at the University of Bologna, Marilisa Cortesi. During the collaboration I contributed to a Synthetic Biology project already under way in the Klavins Laboratory: I modeled and subsequently simulated a synthetic genetic circuit designed to implement a multicelled behavior in a growing bacterial microcolony.
The first chapter introduces the foundations of molecular biology (the structure of nucleic acids, transcription, translation and methods of regulating gene expression) and closes with an introduction to Synthetic Biology. The second chapter describes the synthetic genetic circuit conceived to make two different groups of cells, termed leaders and followers, emerge spontaneously from an isogenic microcolony of bacteria. The circuit exploits the intrinsic stochasticity of gene expression, together with intercellular communication via small molecules, to break the symmetry in the phenotype of the microcolony. The four modules of the circuit (coin flipper, sender, receiver and follower) and their interactions are then illustrated. The third chapter derives the mathematical representation of the various components of the circuit and makes the several simplifying assumptions explicit: transcription and translation are modeled as a single step, and gene expression is a function of the intracellular concentrations of the transcription factors that act on the different promoters of the circuit. A list of the parameters and a justification of their values closes the chapter. The fourth chapter describes the main characteristics of the gro simulation environment, developed by the Self Organizing Systems Laboratory of the University of Washington, and then details a sensitivity analysis performed to pinpoint the desirable characteristics of the various genetic components. The sensitivity analysis uses a cost function based on the fraction of cells in each of the possible states at the end of the simulation and on the desired outcome; the parameters are ranked by means of a particular kind of scatter plot, and, starting from an initial condition in which all parameters assume their nominal values, the ranking suggests which parameter to tune in order to reach the goal. Obtaining a microcolony in which almost all the cells are in the follower state and only a few in the leader state appears to be the most difficult task, since a small number of leader cells struggle to produce enough signal to switch the rest of the microcolony to the follower state; such a microcolony can be obtained by increasing the production of signal as much as possible. Reaching the goal of a microcolony split in half between leaders and followers is comparatively easy, the best strategy apparently being a slight increase in the production of the enzyme. To end up with a majority of leaders, instead, it is advisable to increase the basal expression of the coin flipper module. At the end of the chapter, a possible future application of the leader election circuit, the spontaneous formation of spatial patterns in a microcolony, is modeled with the finite state machine formalism. The gro simulations provide insights into the genetic components needed to implement this behavior: since both examples of pattern formation rely on a local version of leader election, a short-range communication system is essential, and new synthetic components that reliably downregulate the growth rate in specific cells without side effects need to be developed. The appendix lists the gro code used to simulate the model of the circuit, a Python script used to split the simulations across a Linux cluster, and the Matlab code developed to analyze the data.
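A drastically simplified, well-mixed sketch of the symmetry-breaking logic (invented probabilities, yields and thresholds, not the thesis's gro model) illustrates the interplay of the four modules: rare stochastic commitment to the leader state, signal production by leaders, and threshold-triggered commitment of the rest to the follower state.

# Toy sketch of coin-flipper-based leader election in a microcolony.
import random
random.seed(42)

def simulate(n_cells=200, p_leader=0.05, signal_per_leader=50.0,
             follower_threshold=1.0):
    # Coin flipper module: rare stochastic commitment to the leader state.
    states = ["leader" if random.random() < p_leader else "undecided"
              for _ in range(n_cells)]
    # Sender/receiver modules, well-mixed approximation: total signal
    # produced by the leaders, diluted over the whole colony.
    signal = states.count("leader") * signal_per_leader / n_cells
    # Follower module: undecided cells commit if the signal is strong enough.
    if signal >= follower_threshold:
        states = [s if s == "leader" else "follower" for s in states]
    return states

states = simulate()
print({s: states.count(s) for s in set(states)})   # e.g. a few leaders, rest followers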

Abstract:

Over the course of its existence, the Web has undergone a transformation driven partly by market demands, but above all by the constant evolution and emergence of the many technologies involved in it. It moved from an initial, simple distribution of static content to a collection of websites, at first with limited dynamicity and interactivity (owing to technological constraints), and later to today's modern web applications, which have closed the gap with desktop applications both technologically and in terms of actual market penetration. These modern web applications can exhibit a degree of complexity fully comparable to that of traditional desktop software systems. Web technologies have evolved along with the Web itself, and among the most widespread is JavaScript, a scripting language born to add dynamicity to websites that now finds itself used as a programming language for highly structured applications. Over the years the development community around JavaScript has produced numerous libraries in support of the language, giving developers a complete language capable of building advanced web applications. Recent evolutions of the JavaScript engines in browsers have also increased the language's performance, consolidating its leadership over competing languages. In recent years, as the complexity of web applications has grown, JavaScript has been called strongly into question, because as a language it does not offer the classic, time-tested abstractions for highly structured programming. For this reason, object-oriented languages for the web have emerged that aim to solve this problem: among them are languages with the ambition of supplanting JavaScript, such as Dart, created by Google, and others that instead use JavaScript as a base language, add the missing features, and, through compilation, produce pure JavaScript code compatible with the JavaScript engines in browsers. Historically, JavaScript was introduced as a language both for client-side programming and for its server-side counterpart, but for various reasons (strong competition, low performance, etc.) it succeeded only as a client-side language. Recent evolutions of the language have, however, brought it back into favour for server-side programming as well, above all because of its performance improvements, but also because of its natural disposition toward event-driven programming, an alternative paradigm to multi-threading for concurrent programming. A highly complex web application can therefore now be developed entirely in JavaScript, inheriting both its advantages and its drawbacks; the new technologies thus aspire to become the solution to JavaScript's problems and consequently present themselves as potential new complete languages for the web programming of the future, also anticipating the coming evolutions of existing technologies announced by the standards bodies of web programming, the W3C and ECMAScript.
This thesis addresses the topics just introduced by comparing the technologies in play, with the goal of providing a broad overview of the options a web developer must consider when building a system of significant size. Particular attention is given to the TypeScript language proposed by Microsoft, which appeared after Dart with apparently the same goal but which, thanks to its compatibility with JavaScript and above all with the vast ecosystem of JavaScript libraries that has grown up in recent years, positions itself in the market as a technology that is easy to learn for all developers who have already built up skills in JavaScript programming.

Abstract:

The objective of this thesis is the analysis of power transients in experimental devices placed within the reflector of the Jules Horowitz Reactor (JHR). Since the JHR materials testing facility is designed to reach a core thermal power of 100 MW, its large reflector hosts fissile material samples that are irradiated up to a total power of about 3 MW. The MADISON devices are expected to attain 130 kW, while the nominal power of ADELINE is around 60 kW; in addition, the MOLFI test samples are envisaged to reach 360 kW in the LEU configuration and up to 650 kW in the HEU frame. Safety issues concern shutdown transients and require specific verification of how the thermal power of these fissile samples decreases relative to the core kinetics, including the determination of the reactivity of each single device. A calculation model is conceived and applied in order to properly account for the different nuclear heating processes and the time-dependent features of the device transients. An innovative methodology is developed in which the modification of the flux shape during control rod insertion is investigated for its impact on device power through core-reflector coupling coefficients, thereby improving on previous methods that considered only nominal core-reflector parameters. Moreover, the effect of delayed emissions is evaluated with respect to the spatial impact on the devices of a distributed in-core delayed neutron source, and the transport of delayed gammas, related to fission product concentrations, is taken into account through evolution calculations of different fuel compositions over an equilibrium cycle. Once accurate control of device reactivity is ensured, power transients are computed for every sample according to the envisaged shutdown procedures. The results obtained in this study are intended as design feedback and for reactor management optimization by the JHR project team; moreover, the Safety Report is intended to use the present analysis for improved device characterization.
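As a purely illustrative sketch of the coupling-coefficient idea (all values, functional forms and time constants are invented placeholders, not the JHR calculation model): device power after shutdown can be written as a time-dependent core-reflector coupling coefficient times the core power, plus a decay-heat-like delayed-gamma contribution.

# Toy post-scram transient for one reflector device.
import numpy as np

t = np.linspace(0.0, 300.0, 601)                       # s after scram

core_power = 100e6 * np.exp(-t / 80.0)                 # crude core shutdown transient (W)
coupling = 1.3e-3 * (1 - 0.2 * np.exp(-t / 30.0))      # flux-shape change during rod insertion
prompt = coupling * core_power                         # fission power driven by core flux
delayed = 0.07 * 130e3 * np.exp(-t / 200.0)            # delayed-gamma term for a 130 kW device

device_power = prompt + delayed
print(f"device power at t=0 s: {device_power[0] / 1e3:.0f} kW, "
      f"at t=300 s: {device_power[-1] / 1e3:.1f} kW")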