897 results for end-to-end testing, javascript, application web, single-page application


Relevance:

100.00%

Publisher:

Abstract:

This thesis analyses and explores several concepts, including e-commerce and its impact on the market over recent decades, the design phases of a website, the main web development languages, and the Magento platform and its features. After studying and analysing these concepts, the website was built by applying the basic knowledge and tools of web programming, namely the PHP, HTML5, CSS3 and JavaScript languages.


This thesis investigates photopion production (PPP) and electropion production (EPP) in the framework of manifestly Lorentz-invariant baryonic chiral perturbation theory. Two different approaches are pursued. First, a one-loop calculation up to chiral order O(q^4) with pions and nucleons as degrees of freedom is performed in order to describe the energy dependence of the reactions over as large a range as possible. Second, to improve the description of the dependence on the photon virtuality in EPP, vector mesons are included in the theory; this second calculation is carried out at one-loop level up to chiral order O(q^3).

Of the four physical processes in PPP and EPP, only three are experimentally accessible. These reactions are studied at several different facilities, e.g. in Mainz, Bonn, and Saskatoon. The data obtained there are used here to probe the limits of chiral perturbation theory.

This work presents the first complete manifestly Lorentz-invariant calculation at O(q^4) for PPP and EPP, and the first calculation ever performed with vector mesons as degrees of freedom for these processes. Besides computing the physical observables, a partial-wave decomposition is carried out and the most important multipoles are examined. These can be extracted from the resulting amplitudes and offer a good way to study the nucleon and its resonances.

Because the number of diagrams to be evaluated is very large, several routines for the computer algebra system Mathematica were developed in order to compute the matrix elements of the processes. For the multipole decomposition, two different programs are used: the existing program XMAID, suitably modified for this work, and comparable routines newly developed in Mathematica.
At the end of the analysis, the various calculations are compared with respect to their applicability to PPP and EPP.


A study of web technologies and the development of a multi-platform application using HTML and JavaScript.


The shallow water equations (SWE) are a hyperbolic system of balance laws that provide adequate approximations to large-scale flows in oceans, rivers, and the atmosphere; mass and momentum are conserved. We distinguish two characteristic velocities: the advection velocity, i.e. the speed of mass transport, and the gravity-wave velocity, i.e. the speed of the surface waves that carry energy and momentum. The Froude number is a dimensionless quantity given by the ratio of the reference advection velocity to the reference gravity-wave velocity. For the applications mentioned above it is typically very small, e.g. 0.01. Time-explicit finite volume methods are the most commonly used schemes for the numerical solution of hyperbolic balance laws. They must satisfy the CFL stability condition, and the time step is roughly proportional to the Froude number. For small Froude numbers, say below 0.2, this leads to high computational cost; moreover, the numerical solutions are dissipative. It is well known that, under suitable conditions, the solutions of the SWE converge to the solutions of the lake equations (the zero-Froude-number SWE) as the Froude number tends to zero. In this limit the equations change type from hyperbolic to hyperbolic-elliptic. Furthermore, at small Froude numbers the order of convergence may degrade or the numerical method may break down. In particular, time-explicit schemes have been observed to exhibit incorrect asymptotic behaviour (with respect to the Froude number), which may cause these effects. Oceanographic and atmospheric flows are typically small perturbations of an underlying equilibrium state. We require numerical schemes for balance laws to preserve certain equilibrium states exactly; otherwise the scheme may generate spurious flows.
The approximation of the source term is therefore essential. Numerical schemes that preserve equilibrium states are called well-balanced.

In this thesis we split the SWE into a stiff, linear part and a non-stiff part in order to circumvent the severe time-step restriction imposed by the CFL condition. The stiff part is approximated implicitly and the non-stiff part explicitly, using IMEX (implicit-explicit) Runge-Kutta and IMEX multistep time discretisations. The spatial discretisation uses the finite volume method. The stiff part is approximated either with finite differences or in a genuinely multidimensional manner; for the multidimensional approximation we use approximate evolution operators that take all infinitely many directions of information propagation into account. The explicit terms are approximated with standard numerical fluxes. We thus obtain a stability condition analogous to that of a purely advective flow, i.e. the time step increases by a factor equal to the reciprocal of the Froude number. The schemes derived in this thesis are asymptotic preserving and well-balanced. The asymptotic-preserving property ensures that the numerical solution exhibits the "correct" asymptotic behaviour for small Froude numbers. We present first- and second-order schemes. Numerical results confirm the order of convergence as well as stability, well-balancedness, and the asymptotic-preserving property. In particular, for some schemes we observe that the order of convergence is almost independent of the Froude number.
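The stiff/non-stiff splitting idea can be illustrated on a toy problem. The sketch below shows a first-order IMEX Euler step on a scalar stiff ODE, not the thesis's actual SWE discretisation; the matrix A and the forcing term are invented for illustration:

```python
import numpy as np

def imex_euler(y0, A, f_explicit, dt, n_steps):
    """First-order IMEX Euler for y' = f_explicit(y) + A @ y.
    The stiff linear part A is treated implicitly, the non-stiff part
    explicitly: solve (I - dt*A) y_new = y_old + dt * f_explicit(y_old)."""
    I = np.eye(len(y0))
    y = np.array(y0, dtype=float)
    for _ in range(n_steps):
        y = np.linalg.solve(I - dt * A, y + dt * f_explicit(y))
    return y

# Toy stiff problem: y' = -1000*y + 1.  A fully explicit scheme would need
# dt < 2e-3 for stability; the IMEX step stays stable at dt = 0.1 and
# converges to the steady state y = 1/1000.
A = np.array([[-1000.0]])
y = imex_euler([1.0], A, lambda v: np.array([1.0]), dt=0.1, n_steps=100)
print(y)  # ≈ [0.001]
```

The same structure carries over to the SWE setting: only the stiff (gravity-wave) part requires a linear solve, so the admissible time step is governed by the slow advective dynamics.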


This dissertation describes the development of a project, carried out over more than two years within the scope of the Arrowhead Framework, to which I contributed in several areas. The final part of the project took place during a visiting period at the University of Luleå. The Arrowhead Project is a European project, belonging to the ARTEMIS association, which aims to foster new technologies and unify access to them in a single framework. Such technologies include the Internet of Things phenomenon, smart houses, electrical mobility, and renewable energy production. An application is considered compliant with the framework when it respects the Service Oriented Architecture paradigm and is able to interact with a set of defined components called the Arrowhead Core Services. My personal contribution to this project consists of several user-friendly APIs, published in the project's main repository, and the integration of a legacy system into the Arrowhead Framework. I started the implementation of this legacy system in 2012 and, after many improvements by several developers at UniBO, it was significantly modified again this year in order to achieve compatibility. The system simulates an urban scenario in which a number of electric vehicles travel along specified routes. The vehicles consume their batteries and thus need to recharge at charging stations. Because the recharge process is long, the vehicles use a reservation mechanism in order to recharge and avoid waiting lines. The integration with the framework consists in publishing the services that the system provides to end users through several Arrowhead Service Producers, together with a demo Arrowhead-compliant client application able to consume those services.


The goal of this thesis is to show how web applications can now be built quickly and easily using the Meteor framework.


Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards on how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures, and materials. Applied to computations, the idea of "replication by other scientists" is more commonly known as "reproducible research". In this context, the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of allowing scientists to submit, together with an article, the computational materials (software, data, etc.) used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform, independently of the chosen OS, to confirm or invalidate it, and especially to allow its reuse to produce new results. This procedure is of little help, however, without minimal methodological support: raw data sets and software are difficult to exploit without the logic that guided their use or production. This led us to conclude that, in addition to the data sets and the software, one more element must be provided: the workflow that ties them all together.


Nowadays we live in densely populated regions, and this leads to many environmental issues. Among the pollutants originating from human activities, metals are relevant because they can be potentially toxic to most living beings. We studied the fate of Cd, Cr, Cu, Fe, Mn, Ni, Pb and Zn in a vineyard environment by analysing samples of plants, wine and soil. Sites were chosen considering the type of wine produced, the type of cultivation (both organic and conventional agriculture) and the geographic location. We selected vineyards cultivating the same grape variety, Trebbiano. We investigated 5 vineyards located in the Ravenna district (Italy): two on the Lamone Valley slopes, one in the area of river-bank deposits near Ravenna city, one near Lugo and one near Bagnacavallo in interfluve regions. We carried out a very detailed characterization of the soils at each site, including the analysis of: pH, electric conductivity, texture, total carbonate and estimated dolomite content, active carbonate, iron from ammonium oxalate, Iron Deficiency Chlorosis Index (IDCI), total nitrogen and organic carbon, available phosphorus, available potassium and Cation Exchange Capacity (CEC). We then analysed the bulk chemical composition and performed a DTPA extraction to determine the available fraction of elements in the soils. All sites have soil suitable for cultivation, with a good supply of nutrients, so that heavy fertilisation is not needed; however, one hillside vineyard suffers from iron deficiency chlorosis due to its high level of active carbonate. Some soils are rich in silica and poor in calcium oxide, confirming the marly sandstone substratum, while others contain more calcium oxide and aluminium oxide, confirming the argillaceous marlstone substratum.
We found some critical situations, such as high concentrations of chromium, especially at the farm near Lugo, and we noticed differences between organic and conventional vineyards: the conventional ones show a higher enrichment of some metals (copper and zinc) in their soils. Each metal accumulates differently in each part of the grapevine, and we found differences between hill and lowland plants: metal accumulation in plants appears to follow patterns. Metals are most abundant in bark, then in leaves or sometimes in roots; plants seem to cope with metal excesses by storing them in bark. Two wines have an excess of acetic acid, and one conventional farm produces wine whose zinc content exceeds the Italian legal limit. We found some values that are high compared with uncontaminated environments, but further investigation is needed to link those values to anthropogenic inputs.


Altered pressure in the developing left ventricle (LV) results in altered morphology and tissue material properties. Mechanical stress and strain may play a role in the regulating process. This study showed that confocal microscopy, three-dimensional reconstruction, and finite element analysis can provide a detailed model of stress and strain in the trabeculated embryonic heart. The method was used to test the hypothesis that end-diastolic strains are normalized after altered loading of the LV during the stages of trabecular compaction and chamber formation. Stage-29 chick LVs subjected to pressure overload and underload at stage 21 were reconstructed with full trabecular morphology from confocal images and analyzed with finite element techniques. Measured material properties and intraventricular pressures were specified in the models. The results show volume-weighted end-diastolic von Mises stress and strain averaging 50–82% higher in the trabecular tissue than in the compact wall. The volume-weighted-average stresses for the entire LV were 115, 64, and 147 Pa in control, underloaded, and overloaded models, while strains were 11, 7, and 4%; thus, neither was normalized in a volume-weighted sense. Localized epicardial strains at mid-longitudinal level were similar among the three groups and to strains measured from high-resolution ultrasound images. Sensitivity analysis showed changes in material properties are more significant than changes in geometry in the overloaded strain adaptation, although resulting stress was similar in both types of adaptation. These results emphasize the importance of appropriate metrics and the role of trabecular tissue in evaluating the evolution of stress and strain in relation to pressure-induced adaptation.
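The volume-weighted averaging used above to summarise stress and strain over the mesh is simply a weighted mean over elements, with element volumes as weights. A minimal sketch with made-up element values (not data from the study):

```python
import numpy as np

def volume_weighted_average(values, volumes):
    """Volume-weighted average of an element-wise field (e.g. von Mises
    stress or strain) over a finite element mesh: sum(v_i * V_i) / sum(V_i)."""
    values = np.asarray(values, dtype=float)
    volumes = np.asarray(volumes, dtype=float)
    return float(np.sum(values * volumes) / np.sum(volumes))

# Hypothetical element stresses (Pa) and volumes: larger elements count more.
stresses = [100.0, 200.0, 50.0]
vols     = [2.0,   1.0,   1.0]
print(volume_weighted_average(stresses, vols))  # (200 + 200 + 50) / 4 = 112.5
```

Weighting by volume prevents many small, highly stressed trabecular elements from dominating the average out of proportion to the tissue they represent.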


Open web steel joists are designed in the United States following the governing specification published by the Steel Joist Institute. For compression members in joists, this specification employs an effective length factor, or K-factor, in confirming their adequacy. In most cases, these K-factors have been conservatively assumed equal to 1.0 for compression web members, regardless of the fact that intuition and limited experimental work indicate that smaller values could be justified. Given that smaller K-factors could result in more economical designs without a loss in safety, the research presented in this thesis aims to suggest procedures for obtaining more rational values. Three different methods for computing in-plane and out-of-plane K-factors are investigated, including (1) a hand calculation method based on the use of alignment charts, (2) computational critical load (eigenvalue) analyses using uniformly distributed loads, and (3) computational analyses using a compressive strain approach. The latter method is novel and allows for computing the individual buckling load of a specific member within a system, such as a joist. Four different joist configurations are investigated, including an 18K3, 28K10, and two variations of a 32LH06. Based on these methods and the very limited number of joists studied, it appears promising that in-plane and out-of-plane K-factors of 0.75 and 0.85, respectively, could be used in computing the flexural buckling strength of web members in routine steel joist design. Recommendations for future work, which include systematically investigating a wider range of joist configurations and connection restraint, are provided.
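As a rough illustration of why a smaller K-factor matters, the elastic (Euler) flexural buckling load scales with 1/(KL)^2. The member properties below are hypothetical, chosen only to show the effect of moving from K = 1.0 to the suggested in-plane value of 0.75; they are not taken from the thesis:

```python
import math

def euler_buckling_load(E, I, L, K):
    """Elastic flexural buckling load of a column:
    P_cr = pi^2 * E * I / (K * L)^2, where K is the effective length factor."""
    return math.pi**2 * E * I / (K * L)**2

# Illustrative web member: E = 200 GPa steel, I = 1.0e-7 m^4, length 1.5 m.
P_k100 = euler_buckling_load(200e9, 1.0e-7, 1.5, 1.00)  # K = 1.0 (conservative practice)
P_k075 = euler_buckling_load(200e9, 1.0e-7, 1.5, 0.75)  # proposed in-plane K
print(P_k075 / P_k100)  # ≈ 1.78: roughly 78% higher predicted elastic capacity
```

The actual design strength also depends on inelastic buckling and the governing specification's column curve, so the gain in usable capacity is smaller than this elastic ratio suggests.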


Context: Daytime sleepiness in kidney transplant recipients has emerged as a potential predictor of impaired adherence to the immunosuppressive medication regimen. There is therefore a need to assess daytime sleepiness in clinical practice and transplant registries.
Objective: To evaluate the validity of a single-item measure of daytime sleepiness integrated in the Swiss Transplant Cohort Study (STCS), using the American Educational Research Association framework.
Methods: Using a cross-sectional design, we enrolled a convenience sample of 926 home-dwelling kidney transplant recipients (median age, 59.69 years; 25%-75% quartile [Q25-Q75], 50.27-59.69; 63% men; median time since transplant, 9.42 years [Q25-Q75, 4.93-15.85]). Daytime sleepiness was assessed by using a single item from the STCS and the 8 items of the validated Epworth Sleepiness Scale. Receiver operating characteristic (ROC) curve analysis was used to determine the cutoff for the STCS daytime sleepiness item against the Epworth Sleepiness Scale score.
Results: Based on the ROC curve analysis, a score greater than 4 on the STCS daytime sleepiness item is recommended to detect daytime sleepiness. Content validity was high, as all expert reviews were unanimous. Concurrent validity was moderate (Spearman ϱ, 0.531; P < .001), and convergent validity with depression and poor sleep quality, although low, was significant (ϱ, 0.235; P < .001 and ϱ, 0.318; P = .002, respectively). For group difference validity, kidney transplant recipients with moderate, severe, and extremely severe depressive symptom scores had 3.4, 4.3, and 5.9 times higher odds of daytime sleepiness, respectively, compared with recipients without depressive symptoms.
Conclusion: The accumulated evidence supports the validity of the STCS daytime sleepiness item as a simple screening scale for daytime sleepiness.
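Choosing a cutoff from a ROC-type analysis is often done by maximising Youden's J (sensitivity + specificity - 1). The sketch below is a generic, numpy-only illustration on toy data; the study's actual cutoff of >4 was derived from its own ROC analysis against the Epworth score:

```python
import numpy as np

def best_cutoff_youden(scores, labels):
    """Pick the score cutoff maximising Youden's J = sensitivity + specificity - 1,
    a common way to choose a screening threshold from a ROC analysis.
    `labels` are 1 for cases (sleepy per the reference scale), 0 otherwise."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    best_j, best_c = -1.0, None
    for c in np.unique(scores):
        pred = scores > c                  # "greater than cutoff" decision rule
        sens = np.mean(pred[labels == 1])  # true positive rate
        spec = np.mean(~pred[labels == 0]) # true negative rate
        j = sens + spec - 1
        if j > best_j:
            best_j, best_c = j, float(c)
    return best_c, best_j
```

On real data the classes overlap, so the chosen cutoff trades sensitivity against specificity rather than separating the groups perfectly as in a toy example.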


Quantifying the health effects associated with simultaneous exposure to many air pollutants is now a research priority of the US EPA. Bayesian hierarchical models (BHM) have been extensively used in multisite time series studies of air pollution and health to estimate health effects of a single pollutant adjusted for potential confounding of other pollutants and other time-varying factors. However, when the scientific goal is to estimate the impacts of many pollutants jointly, a straightforward application of BHM is challenged by the need to specify a random-effect distribution on a high-dimensional vector of nuisance parameters, which often do not have an easy interpretation. In this paper we introduce a new BHM formulation, which we call "reduced BHM", aimed at analyzing clustered data sets in the presence of a large number of random effects that are not of primary scientific interest. At the first stage of the reduced BHM, we calculate the integrated likelihood of the parameter of interest (e.g. excess number of deaths attributed to simultaneous exposure to high levels of many pollutants). At the second stage, we specify a flexible random-effect distribution directly on the parameter of interest. The reduced BHM overcomes many of the challenges in the specification and implementation of full BHM in the context of a large number of nuisance parameters. In simulation studies we show that the reduced BHM performs comparably to the full BHM in many scenarios, and even performs better in some cases. Methods are applied to estimate location-specific and overall relative risks of cardiovascular hospital admissions associated with simultaneous exposure to elevated levels of particulate matter and ozone in 51 US counties during the period 1999-2005.
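The two-stage idea (per-cluster estimates first, then a random-effect distribution placed directly on the parameter of interest) can be caricatured with a normal-normal shrinkage estimator. This is a generic sketch, not the paper's reduced BHM: the per-county estimates stand in for stage-one integrated likelihoods, and the between-site variance tau2 is treated as known:

```python
import numpy as np

def two_stage_pooling(beta_hat, var_hat, tau2):
    """Toy two-stage hierarchical analysis: stage 1 supplies per-site
    estimates beta_hat with sampling variances var_hat; stage 2 places a
    normal random-effect distribution with variance tau2 directly on the
    parameter of interest. Returns the precision-weighted overall estimate
    and the shrunken site-specific estimates."""
    beta_hat = np.asarray(beta_hat, dtype=float)
    var_hat = np.asarray(var_hat, dtype=float)
    w = 1.0 / (var_hat + tau2)             # marginal precision per site
    mu = np.sum(w * beta_hat) / np.sum(w)  # overall (pooled) effect
    shrink = tau2 / (var_hat + tau2)       # weight kept on the site estimate
    site = shrink * beta_hat + (1 - shrink) * mu
    return mu, site
```

Noisy sites (large var_hat) are pulled strongly toward the overall effect, while precisely estimated sites keep most of their own estimate; this is the behaviour the reduced formulation preserves without modelling the nuisance parameters.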


The Michigan Basin is located in the upper Midwest region of the United States and is centered geographically over the Lower Peninsula of Michigan. It is filled primarily with Paleozoic carbonates and clastics, overlying Precambrian basement rocks and covered by Pleistocene glacial drift. In Michigan, more than 46,000 wells have been drilled in the basin, many producing significant quantities of oil and gas since the 1920s in addition to providing a wealth of data for subsurface visualization. Well log tomography, formerly log-curve amplitude slicing, is a visualization method recently developed at Michigan Technological University to correlate subsurface data by utilizing the high vertical resolution of well log curves. The well log tomography method was first successfully applied to the Middle Devonian Traverse Group within the Michigan Basin using gamma ray log curves. The purpose of this study is to prepare a digital data set for the Middle Devonian Dundee and Rogers City Limestones, apply the well log tomography method to this data and from this application, interpret paleogeographic trends in the natural radioactivity. Both the Dundee and Rogers City intervals directly underlie the Traverse Group and combined are the most prolific reservoir within the Michigan Basin. Differences between this study and the Traverse Group include increased well control and “slicing” of a more uniform lithology. Gamma ray log curves for the Dundee and Rogers City Limestones were obtained from 295 vertical wells distributed over the Lower Peninsula of Michigan, converted to Log ASCII Standard files, and input into the well log tomography program. The “slicing” contour results indicate that during the formation of the Dundee and Rogers City intervals, carbonates and evaporites with low natural radioactive signatures on gamma ray logs were deposited. 
This contrasts with the higher gamma ray amplitudes from the siliciclastic deltas that cyclically entered the basin during Traverse Group deposition. Additionally, a subtle north-south, low natural radioactive trend in the center of the basin may correlate with previously published Dundee facies tracts. Prominent trends associated with the distribution of limestone and dolomite are not observed because the regional range of gamma ray values is equivalent for both carbonates in the Michigan Basin, and additional log curves are needed to separate these lithologies.


With the increasing importance of conserving natural resources and moving toward sustainable practices, the aging transportation infrastructure can benefit from improved recycling practices. When an asphalt pavement needs to be replaced, the existing pavement is removed and ground up. This ground material, known as reclaimed asphalt pavement (RAP), is then added into new asphalt roads. However, since RAP has been exposed to years of ultraviolet degradation and environmental weathering, the material has aged and cannot be used as a direct substitute for aggregate and binder in new asphalt pavements. One material that holds potential for restoring the aged asphalt binder to a usable state is waste engine oil. This research studies the feasibility of using waste engine oil as a recycling agent to improve the recyclability of pavements containing RAP. Testing was conducted in three phases: asphalt binder testing, advanced asphalt binder testing, and laboratory mixture testing. Asphalt binder testing consisted of dynamic shear rheometer and rotational viscometer testing on both unaged and aged binders containing waste engine oil and reclaimed asphalt binder (RAB). Fourier Transform Infrared Spectroscopy (FTIR) testing was carried out on the asphalt binders blended with RAB and waste engine oil to compare the structural indices indicative of aging. Lastly, asphalt samples containing waste engine oil and RAP were subjected to rutting testing and tensile strength ratio testing. These tests support the claim that waste engine oil can be used as a rejuvenating agent to chemically restore asphalt pavements containing RAP: waste engine oil can reduce the stiffness and improve the low-temperature properties of asphalt binders blended with RAB, and it can soften asphalt pavements without a detrimental effect on moisture susceptibility.
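FTIR structural indices for binder aging are typically ratios of band areas, e.g. a carbonyl band near 1700 cm^-1 over a reference band. The sketch below is generic: the band limits, spectra, and index definition are illustrative, not the thesis's exact choices:

```python
import numpy as np

def aging_index(wavenumbers, absorbance, band, reference_band):
    """Toy FTIR structural index: integrated area of an aging-related
    absorbance band (e.g. the carbonyl region) divided by the area of a
    reference band, using trapezoidal integration."""
    wn = np.asarray(wavenumbers, dtype=float)
    ab = np.asarray(absorbance, dtype=float)

    def band_area(lo, hi):
        m = (wn >= lo) & (wn <= hi)
        x, y = wn[m], ab[m]
        # Trapezoidal rule over the selected band.
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    return band_area(*band) / band_area(*reference_band)
```

A higher index after aging indicates accumulation of oxidation products; comparing indices for binders with and without waste engine oil is one way to quantify the rejuvenating effect.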


The seasonal appearance of a deep chlorophyll maximum (DCM) in Lake Superior is a striking phenomenon that is widely observed; however, its mechanisms of formation and maintenance are not well understood. As this phenomenon may be the reflection of an ecological driver, or a driver itself, a lack of understanding of its driving forces limits the ability to accurately predict and manage changes in this ecosystem. Key mechanisms generally associated with DCM dynamics (i.e. ecological, physiological and physical phenomena) are examined individually and in concert to establish their role. First, the prevailing paradigm, "the DCM is a great place to live", is analyzed through an integration of the results of laboratory experiments and field measurements. The analysis indicates that growth at this depth is severely restricted and thus not able to explain the full magnitude of this phenomenon. Additional contributing mechanisms like photoadaptation, settling and grazing are reviewed with a one-dimensional mathematical model of chlorophyll and particulate organic carbon. Settling has the strongest impact on the formation and maintenance of the DCM, transporting biomass to the metalimnion and resulting in the accumulation of algae, i.e. a peak in the particulate organic carbon profile. Subsequently, shade adaptation becomes manifest as a chlorophyll maximum deeper in the water column, where light conditions particularly favor the process. Shade adaptation mediates the magnitude, shape and vertical position of the chlorophyll peak. Growth at DCM depth shows only a marginal contribution, while grazing has an adverse effect on the extent of the DCM. The observed separation of the carbon biomass and chlorophyll maximum should caution scientists against equating the DCM with a large nutrient pool that is available to higher trophic levels. The ecological significance of the DCM should not be separated from the underlying carbon dynamics.
When evaluated in its entirety, the DCM becomes the projected image of a structure that remains elusive to measure but represents the foundation of all higher trophic levels. These results also offer guidance in examining ecosystem perturbations such as climate change. For example, warming would be expected to prolong the period of thermal stratification, extending the late summer period of suboptimal (phosphorus-limited) growth and attendant transport of phytoplankton to the metalimnion. This reduction in epilimnetic algal production would decrease the supply of algae to the metalimnion, possibly reducing the supply of prey to the grazer community. This work demonstrates the value of modeling to challenge and advance our understanding of ecosystem dynamics, steps vital to reliable testing of management alternatives.
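The settling term, identified above as the dominant mechanism, can be sketched as one explicit upwind step of downward transport in a discretised 1-D water column. This is toy code illustrating only the settling flux; growth, grazing, mixing, and photoadaptation terms of the full model are omitted:

```python
import numpy as np

def settle_step(c, w_s, dz, dt):
    """One explicit upwind step of downward particle settling in a 1-D
    water column: dc/dt = -w_s * dc/dz, with settling velocity w_s > 0
    and index 0 at the surface. Stable for w_s * dt / dz <= 1."""
    c = np.asarray(c, dtype=float)
    flux_in = np.concatenate(([0.0], w_s * c[:-1]))  # no flux through surface
    flux_out = w_s * c                               # material leaving each cell
    return c + dt / dz * (flux_in - flux_out)
```

Repeated application of such a step moves a surface biomass peak downward toward the metalimnion, which is the transport behaviour that produces the particulate organic carbon maximum in the full model.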