914 results for DYNAMIC-ANALYSIS
Abstract:
[EN] Sediment materials play an important role in the dynamic response of large structures where fluid-soil-structure interaction is relevant and materials of that kind are present. Dam-reservoir systems and harbor structures are examples of civil engineering constructions where those effects are significant.
Abstract:
The general aim of this work is to contribute to the energy performance assessment of ventilated façades through the simultaneous use of experimental data and numerical simulations. A significant amount of experimental work was carried out on different types of naturally ventilated façades. The measurements were taken on a test building whose external walls are rainscreen ventilated façades, with ventilation grills located at the top and at the bottom of the tower. This work presents the modelling of the test building in a dynamic thermal simulation program (ESP-r) and discusses the main results. To identify the best summer thermal performance of a rainscreen ventilated skin façade, different setups of rainscreen walls were studied; in particular, the influence of ventilation grills, air cavity thickness, skin colour, skin material, and façade orientation was investigated. It is shown that some rainscreen ventilated façade typologies are capable of lowering the cooling energy demand by a few percentage points.
Abstract:
In many scientific fields, the analysis of complex networks has led to numerous recent discoveries. In this thesis we applied this approach to human language, in particular written language, where words do not interact at random. We first presented measures capable of extracting important topological structures from linguistic networks (degree, strength, entropy, ...) and examined the software used to represent and visualize the graphs (Gephi). We then analyzed the statistical properties of a single text in several of its forms (shuffled, with stopwords removed, and with low-frequency words removed): our database contains five books by five authors who lived in the nineteenth century. Finally, we showed how certain measures are important for distinguishing a real text from its modified versions, and why the degree distributions of a normal text and of a shuffled one follow the same trend. These results may prove useful in the increasingly active analysis of linguistic phenomena such as authorship attribution and the recognition of shuffled texts.
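As a minimal illustration of the network measures named above (not the thesis's actual Gephi pipeline), the sketch below builds a weighted co-occurrence network from consecutive words of a toy sentence and computes each word's degree, strength, and edge-weight entropy; the sample text and all function names are hypothetical.

```python
from collections import defaultdict
import math

def build_network(words):
    """Build a weighted undirected network linking consecutive words."""
    weight = defaultdict(int)
    for a, b in zip(words, words[1:]):
        if a != b:
            weight[tuple(sorted((a, b)))] += 1
    return weight

def node_measures(weight):
    """Degree (distinct neighbours), strength (sum of edge weights),
    and the entropy of each node's edge-weight distribution."""
    neigh = defaultdict(dict)
    for (a, b), w in weight.items():
        neigh[a][b] = w
        neigh[b][a] = w
    stats = {}
    for node, edges in neigh.items():
        s = sum(edges.values())
        probs = [w / s for w in edges.values()]
        entropy = -sum(p * math.log(p) for p in probs)
        stats[node] = {"degree": len(edges), "strength": s, "entropy": entropy}
    return stats

# Toy corpus: real analyses would use whole books, as in the thesis.
text = "the cat sat on the mat and the cat slept".split()
stats = node_measures(build_network(text))
```

On a shuffled text the same measures can be recomputed after `random.shuffle(text)`, which is exactly the kind of comparison the thesis performs at book scale.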
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is checking the safety clearances of individual components, the so-called clearance analysis. Engineers determine for specific components whether, in their rest position as well as during a motion, they maintain a prescribed safety distance to the surrounding components. If components fall below the safety distance, their shape or position must be changed. For this, it is important to know precisely the regions of the components that violate the safety distance.

In this work we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g., triangles). For every instant at which a transformation is applied to one of the objects, we compute the set of all primitives falling below the safety distance and call it the set of all tolerance-violating primitives. We present a complete solution, which can be divided into the following three major topics.

In the first part of this work we investigate algorithms that check, for two triangles, whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that specialized tolerance tests are considerably faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach consistently proves to be the fastest.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to take the required safety distance into account in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the uniform grids used above. We call this data structure Shrubs. Previous approaches to the memory optimization of uniform grids mainly rely on hashing methods, but these do not reduce the memory consumption of the cell contents. In our application, neighboring cells often have similar contents. Our approach is able to losslessly compress the cell contents of a uniform grid, exploiting the redundant cell contents, to one fifth of their original size and to decompress them at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure clearance analysis, we show applications to various path-planning problems.
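The Shrubs data structure itself is not detailed here, but its underlying observation, that neighbouring grid cells often carry identical content, can be sketched with a much simpler dictionary-based deduplication; the toy grid and all names below are illustrative assumptions, not the thesis's implementation.

```python
def compress_grid(cells):
    """Deduplicate the content lists of a uniform grid.

    cells: dict mapping cell index -> tuple of primitive ids.
    Returns (unique_contents, cell_to_slot): identical content tuples
    are stored once, and each cell keeps only a small slot index.
    """
    unique_contents = []
    slot_of = {}        # content tuple -> slot index
    cell_to_slot = {}
    for idx, content in cells.items():
        if content not in slot_of:
            slot_of[content] = len(unique_contents)
            unique_contents.append(content)
        cell_to_slot[idx] = slot_of[content]
    return unique_contents, cell_to_slot

# Neighbouring cells of a fine grid often index the same primitives:
cells = {i: (7, 8) for i in range(100)}
cells.update({i: (9,) for i in range(100, 120)})
unique, slots = compress_grid(cells)
```

Here 120 cells collapse to 2 stored content tuples; the actual Shrubs structure additionally handles *similar* (not just identical) neighbouring contents and decompresses at query time.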
Abstract:
With the outlook of improving seismic vulnerability assessment for the city of Bishkek (Kyrgyzstan), the global dynamic behaviour of four nine-storey r.c. large-panel buildings in the elastic regime is studied. The four buildings were built during the Soviet era within a serial production system; since they all belong to the same series, they have very similar geometries both in plan and in height. Firstly, ambient vibration measurements are performed in the four buildings. The data analysis, composed of discrete Fourier transform, modal analysis (frequency domain decomposition) and deconvolution interferometry, yields the modal characteristics and an estimate of the linear impulse response function for the structures of the four buildings. Then, finite element models are set up for all four buildings and the results of the numerical modal analysis are compared with the experimental ones. The numerical models are finally calibrated against the first three global modes, and their results match the experimental ones with an error of less than 20%.
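The study's frequency domain decomposition and deconvolution interferometry are beyond a short sketch, but the first step of any such analysis, picking the dominant peak of the Fourier amplitude spectrum of an ambient-vibration record, can be illustrated as follows; the synthetic record and its 2.2 Hz "mode" are invented for the example.

```python
import math

def dominant_frequency(signal, fs, f_candidates):
    """Return the candidate frequency with the largest Fourier amplitude
    (the simplest 'peak picking' step of operational modal analysis)."""
    n = len(signal)
    best_f, best_amp = None, -1.0
    for f in f_candidates:
        # Direct evaluation of the DFT at frequency f
        re = sum(x * math.cos(2 * math.pi * f * k / fs) for k, x in enumerate(signal))
        im = sum(x * math.sin(2 * math.pi * f * k / fs) for k, x in enumerate(signal))
        amp = math.hypot(re, im) / n
        if amp > best_amp:
            best_f, best_amp = f, amp
    return best_f

# Synthetic 'ambient vibration' record: a 2.2 Hz mode plus a weaker 7 Hz mode.
fs = 100.0                              # sampling rate, Hz
t = [k / fs for k in range(1000)]       # 10 s record
sig = [math.sin(2 * math.pi * 2.2 * ti) + 0.3 * math.sin(2 * math.pi * 7.0 * ti)
       for ti in t]
cands = [f / 10 for f in range(10, 100)]  # scan 1.0 .. 9.9 Hz in 0.1 Hz steps
f0 = dominant_frequency(sig, fs, cands)
```

Real ambient-vibration processing adds windowing, averaging over many records, and the singular-value decomposition of cross-spectral matrices that defines FDD.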
Abstract:
We recently reported on the Multi Wave Animator (MWA), a novel open-source tool capable of recreating continuous physiologic signals from archived numerical data and presenting them as they appeared on the patient monitor. In this report, we demonstrate for the first time the power of this technology in a real clinical case: an intraoperative cardiopulmonary arrest following reperfusion of a liver transplant graft. Using the MWA, we animated hemodynamic and ventilator data acquired before, during, and after cardiac arrest and resuscitation. This report is accompanied by an online video that shows the most critical phases of the cardiac arrest and resuscitation and provides a basis for analysis and discussion; it is extracted from a 33-min, uninterrupted video of the event, which is also available online. The unique strength of the MWA, its ability to accurately present discrete and continuous data in a format familiar to clinicians, allowed us this rare glimpse into the events leading to an intraoperative cardiac arrest. Because of its ability to recreate and replay clinical events, this tool should be of great interest to medical educators, researchers, and clinicians involved in quality assurance and patient safety.
Abstract:
One of the most intriguing phenomena in glass-forming systems is the dynamic crossover (T_B), occurring well above the glass temperature (T_g). So far, it has been estimated mainly from the linearized derivative analysis of the primary relaxation time τ(T) or viscosity η(T) experimental data, originally proposed by Stickel et al. [J. Chem. Phys. 104, 2043 (1996); J. Chem. Phys. 107, 1086 (1997)]. However, this formal procedure is based on the general validity of the Vogel-Fulcher-Tammann equation, which has been strongly questioned recently [T. Hecksher et al., Nature Phys. 4, 737 (2008); P. Lunkenheimer et al., Phys. Rev. E 81, 051504 (2010); J. C. Martinez-Garcia et al., J. Chem. Phys. 134, 024512 (2011)]. We present a qualitatively new way to identify the dynamic crossover, based on analysis in the apparent enthalpy space (H_a' = d ln τ / d(1/T)) via a new plot of ln H_a' vs. 1/T, supported by the Savitzky-Golay filtering procedure to gain insight into the noise-distorted higher-order derivatives. It is shown that, depending on the ratio f = m_high/m_low between the "virtual" fragility in the high-temperature dynamic domain (m_high) and the "real" fragility at T_g (the low-temperature dynamic domain, m = m_low), glass formers can be split into two groups related to f < 1 and f > 1. The link of this phenomenon to the ratio between the apparent enthalpy and the activation energy, as well as to the behavior of the configurational entropy, is indicated.
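A minimal numerical sketch of the apparent enthalpy H_a' = d ln τ / d(1/T), without the Savitzky-Golay smoothing the paper applies to noisy data: synthetic Vogel-Fulcher-Tammann data are differentiated by central differences and compared against the analytic derivative H_a' = D·T_0·T² / (T − T_0)². All parameter values are illustrative.

```python
import math

def apparent_enthalpy(inv_T, ln_tau):
    """Central finite-difference estimate of H_a' = d ln(tau) / d(1/T)
    at the interior points of the data set."""
    return [(ln_tau[i + 1] - ln_tau[i - 1]) / (inv_T[i + 1] - inv_T[i - 1])
            for i in range(1, len(inv_T) - 1)]

# Synthetic VFT data: ln tau = ln tau0 + D*T0/(T - T0)   (illustrative values)
D, T0, ln_tau0 = 10.0, 150.0, -25.0
T_vals = [200.0 + 2 * i for i in range(51)]      # 200 .. 300 K
inv_T = [1.0 / T for T in T_vals]
ln_tau = [ln_tau0 + D * T0 / (T - T0) for T in T_vals]
Ha = apparent_enthalpy(inv_T, ln_tau)

# Analytic value at the midpoint for comparison: H_a' = D*T0*T^2/(T - T0)^2
T_mid = T_vals[25]
Ha_exact = D * T0 * T_mid ** 2 / (T_mid - T0) ** 2
```

On experimental data the raw derivative is noisy, which is why the paper smooths with a Savitzky-Golay filter before taking ln H_a' vs. 1/T.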
Abstract:
Fuel cells are a promising alternative energy technology. One of the biggest problems in fuel cells is water management, and a better understanding of wettability characteristics in fuel cells is needed to alleviate it. Contact angle data on the gas diffusion layers (GDL) of fuel cells can be used to characterize GDL wettability. A contact angle measurement program has been developed to measure the contact angle of sessile drops from drop images. Digitization of drop images induces pixel errors in the contact angle measurement process, and the resulting uncertainty in the contact angle measurement has been analyzed. An experimental apparatus has been developed for contact angle measurements at different temperatures, with the feature of measuring advancing and receding contact angles on the gas diffusion layers of fuel cells.
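The thesis's measurement program is not reproduced here; as a hedged sketch, the spherical-cap approximation (valid for small drops where gravity flattening is negligible) gives the contact angle from the drop's apex height and contact radius as θ = 2·atan(h/r), and a first-order propagation of the ±pixel digitization error follows directly. All numbers and names are illustrative.

```python
import math

def contact_angle(height_px, base_radius_px, pixel_err=0.5):
    """Contact angle (degrees) of a sessile drop from its apex height and
    contact radius in pixels, assuming a spherical-cap profile:
    theta = 2*atan(h/r). Also propagates a +/- pixel_err digitisation
    error to the angle (first-order, uncorrelated errors)."""
    h, r = height_px, base_radius_px
    theta = 2.0 * math.atan2(h, r)
    dth_dh = 2.0 * r / (h * h + r * r)   # d theta / d h
    dth_dr = -2.0 * h / (h * h + r * r)  # d theta / d r
    err = math.hypot(dth_dh * pixel_err, dth_dr * pixel_err)
    return math.degrees(theta), math.degrees(err)

theta, err = contact_angle(120, 120)   # h == r gives exactly 90 degrees
```

Note how the angular uncertainty shrinks as the drop image gets larger in pixels, which is one reason digitization resolution matters for the measurement.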
Abstract:
Prompt gamma activation analysis (PGAA) is especially sensitive for elements with high neutron-capture cross sections, like boron, which can be detected down to a level of ng/g. However, if boron is a major component, the high count rate from its signal distorts the spectra, making the evaluation difficult. A lead attenuator, whose thickness was found to be optimal at 10 mm, was introduced in front of the HPGe detector to reduce the low-energy gamma radiation, and specifically the boron gamma rays, reaching the detector. Detection efficiencies with and without the lead attenuator were compared, and it was shown that the dynamic range of the PGAA technique was significantly increased. The method was verified with the analyses of stoichiometric compounds: TiB2, NiB, PVC, Alborex, and Alborite.
Abstract:
OBJECTIVE Texture analysis is an alternative method to quantitatively assess MR images. In this study, we introduce dynamic texture parameter analysis (DTPA), a novel technique to investigate the temporal evolution of texture parameters using dynamic susceptibility contrast-enhanced (DSCE) imaging. Here, we aim to introduce the method and its application to enhancing lesions (EL), non-enhancing lesions (NEL) and normal-appearing white matter (NAWM) in multiple sclerosis (MS). METHODS We investigated 18 patients with MS or clinically isolated syndrome (CIS), according to the 2010 McDonald criteria, using DSCE imaging at different field strengths (1.5 and 3 Tesla). Tissues of interest (TOIs) were defined within 27 EL, 29 NEL and 37 NAWM areas after normalization, and eight histogram-based texture parameter maps (TPMs) were computed. TPMs quantify the heterogeneity of the TOI. For every TOI, the average, variance, skewness, kurtosis and variance-of-the-variance statistical parameters were calculated. These TOI parameters were further analyzed using one-way ANOVA followed by multiple Wilcoxon rank sum tests corrected for multiple comparisons. RESULTS Tissue- and time-dependent differences were observed in the dynamics of the computed texture parameters. Sixteen parameters discriminated between EL, NEL and NAWM (pAVG = 0.0005). Significant differences in the DTPA texture maps were found during inflow (52 parameters), outflow (40 parameters) and reperfusion (62 parameters). The strongest discriminators among the TPMs were the variance-related parameters, while the skewness and kurtosis TPMs were in general less sensitive in detecting differences between the tissues. CONCLUSION DTPA of DSCE image time series revealed characteristic time responses for ELs, NELs and NAWM. This may be further used for a refined quantitative grading of MS lesions during their evolution from the acute to the chronic state. DTPA discriminates lesions beyond features of enhancement or T2 hypersignal, on a numeric scale, allowing for a more subtle grading of MS lesions.
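The first four of the histogram-based TOI statistics mentioned above (average, variance, skewness, kurtosis) can be sketched in a few lines; this is a generic implementation of the standardized moments, not the study's actual DTPA software, and the sample values are invented.

```python
import math

def toi_statistics(values):
    """Histogram-based descriptors of a texture-parameter map over a
    tissue of interest: mean, (population) variance, skewness and
    excess kurtosis."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    sd = math.sqrt(var)
    skew = sum(((v - mean) / sd) ** 3 for v in values) / n
    kurt = sum(((v - mean) / sd) ** 4 for v in values) / n - 3.0
    return {"mean": mean, "variance": var, "skewness": skew, "kurtosis": kurt}

# Toy texture-parameter values inside a hypothetical TOI
stats = toi_statistics([1.0, 2.0, 2.0, 3.0])
```

In DTPA these four numbers are tracked frame by frame through the DSCE time series, turning each TOI into a small set of time curves.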
Abstract:
Most statistical analysis, in theory and practice, is concerned with static models: models with a proposed set of parameters whose values are fixed across observational units. Static models implicitly assume that the quantified relationships remain the same across the design space of the data. While this is reasonable under many circumstances, it can be a dangerous assumption when dealing with sequentially ordered data. The mere passage of time always brings fresh considerations, and the interrelationships among parameters, or subsets of parameters, may need to be continually revised. When data are gathered sequentially, dynamic interim monitoring may be useful as new subject-specific parameters are introduced with each new observational unit. Sequential imputation via dynamic hierarchical models is an efficient strategy for handling missing data and analyzing longitudinal studies. Dynamic conditional independence models offer a flexible framework that exploits the Bayesian updating scheme for capturing the evolution of both population and individual effects over time. While static models often describe aggregate information well, they often do not reflect conflicts in the information at the individual level. Dynamic models prove advantageous over static models in capturing both individual and aggregate trends. Computations for such models can be carried out via the Gibbs sampler. An application using small-sample repeated-measures normally distributed growth-curve data is presented.
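The dissertation's dynamic hierarchical models rely on Gibbs sampling, which is too involved for a short sketch; the conjugate normal-normal update below illustrates only the basic Bayesian updating step on which such sequential schemes build. All numbers and names are invented for the example.

```python
def sequential_update(m0, v0, obs_var, observations):
    """Sequential Bayesian updating of a normal mean with known
    observation variance: the posterior after each data point becomes
    the prior for the next, mimicking dynamic interim monitoring.
    Returns the (posterior mean, posterior variance) after each datum."""
    m, v = m0, v0
    history = []
    for y in observations:
        k = v / (v + obs_var)      # gain: weight given to the new datum
        m = m + k * (y - m)        # posterior mean
        v = (1.0 - k) * v          # posterior variance shrinks
        history.append((m, v))
    return history

# Vague prior N(0, 100), unit observation noise, three observations near 5
history = sequential_update(m0=0.0, v0=100.0, obs_var=1.0,
                            observations=[5.0, 5.0, 5.0])
```

Each step is one conjugate update; a dynamic hierarchical model additionally lets the underlying mean itself evolve between observations, which is where the Gibbs sampler comes in.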
Abstract:
The behavior of sample components whose pI values lie outside the pH gradient established by 101 hypothetical biprotic carrier ampholytes covering a pH 6-8 range was investigated by computer simulation under constant-current conditions with concomitant constant electroosmosis toward the cathode. Data obtained with the sample applied between zones of carrier ampholytes, and on the anodic side of the carrier ampholytes, were studied and found to evolve into zone structures comprising three regions between anolyte and catholyte. The focusing region with the pH gradient is bracketed by two isotachophoretic zone structures comprising selected sample and carrier components as isotachophoretic zones. The two isotachophoretic structures migrate electrophoretically in opposite directions, and their lengths increase with time due to the gradual isotachophoretic decay at the pH gradient edges. Due to electroosmosis, however, the overall pattern is transported toward the cathode. Sample components whose pI values are outside the established pH gradient are demonstrated to form isotachophoretic zones behind the leading cation of the catholyte (components with pI values larger than 8) and the leading anion of the anolyte (components with pI values smaller than 6). Amphoteric compounds with appropriate pI values, or nonamphoteric components, can act as isotachophoretic spacers between sample compounds or between the leader and the sample component with the highest mobility. The simulation data provide, for the first time, insight into the dynamics of amphoteric sample components that do not focus within the established pH gradient.