44 results for Flame Acceleration
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
We derive analytical expressions for the propagation speed of downward combustion fronts of thin solid fuels with a background flow initially at rest. The classical combustion model for thin solid fuels, which consists of five coupled reaction-convection-diffusion equations, is here reduced to a single equation with the gas temperature as the only variable. To do so, we apply a two-zone combustion model that divides the system into a preheating region and a pyrolyzing region. The speed of the combustion front is obtained after matching the temperature and its derivative at the location that separates both regions. We also derive a simplified version of this analytical expression expected to be valid for a wide range of cases. Flame front velocities predicted by our analytical expressions agree well with experimental data found in the literature for a large variety of cases and substantially improve on the results obtained from a previous well-known analytical expression.
Abstract:
The effects of flow induced by a random acceleration field (g-jitter) are considered in two related situations that are of interest for microgravity fluid experiments: the random motion of isolated buoyant particles, and diffusion driven coarsening of a solid-liquid mixture. We start by analyzing in detail actual accelerometer data gathered during a recent microgravity mission, and obtain the values of the parameters defining a previously introduced stochastic model of this acceleration field. The diffusive motion of a single solid particle suspended in an incompressible fluid that is subjected to such random accelerations is considered, and mean squared velocities and effective diffusion coefficients are explicitly given. We next study the flow induced by an ensemble of such particles, and show the existence of a hydrodynamically induced attraction between pairs of particles at distances large compared with their radii, and repulsion at short distances. Finally, a mean field analysis is used to estimate the effect of g-jitter on diffusion controlled coarsening of a solid-liquid mixture. Corrections to classical coarsening rates due to the induced fluid motion are calculated, and estimates are given for coarsening of Sn-rich particles in a Sn-Pb eutectic fluid, an experiment to be conducted in microgravity in the near future.
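The random motion of a single suspended particle driven by g-jitter can be illustrated numerically. The sketch below (all parameters illustrative, not taken from the mission data) models the acceleration as an Ornstein-Uhlenbeck process with a finite correlation time and integrates the velocity under linear drag; for these parameters the stationary theory gives a mean squared velocity of g2*tau/(gamma*(1+gamma*tau)), i.e. about 0.33.

```python
import numpy as np

def msv_random_acceleration(gamma=1.0, g2=1.0, tau=0.5, dt=1e-3,
                            n_steps=200000, seed=1):
    """Velocity of a buoyant particle driven by a stochastic acceleration
    g(t) modelled as an Ornstein-Uhlenbeck process (variance g2,
    correlation time tau), subject to linear drag gamma. Returns the
    time-averaged mean squared velocity. Illustrative parameters only."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n_steps)
    sig = np.sqrt(2.0 * g2 * dt / tau)  # keeps Var[g] ~ g2
    v, g, acc2 = 0.0, 0.0, 0.0
    for xi in noise:
        g += -g * dt / tau + sig * xi        # OU acceleration process
        v += (-gamma * v + g) * dt           # drag + random forcing
        acc2 += v * v
    return acc2 / n_steps

msv = msv_random_acceleration()
```

The effective diffusion coefficient of the paper would follow from the velocity autocorrelation; the sketch only shows the stationary velocity statistics.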
Abstract:
The effective diffusion coefficient for the overdamped Brownian motion in a tilted periodic potential is calculated in closed analytical form. Universality classes and scaling properties for weak thermal noise are identified near the threshold tilt where deterministic running solutions set in. In this regime the diffusion may be greatly enhanced, as compared to free thermal diffusion with, for a realistic experimental setup, an enhancement of up to 14 orders of magnitude.
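The effect can be probed numerically without the closed-form result: simulate the overdamped Langevin dynamics in a tilted cosine potential and read off the effective diffusion coefficient from the growth of the positional variance. This is a minimal sketch with illustrative parameters (mobility set to 1), not the paper's analytical expression.

```python
import numpy as np

def simulate_tilted_diffusion(F=0.9, V0=1.0, kT=0.1,
                              n_particles=2000, dt=1e-3, n_steps=20000,
                              seed=0):
    """Overdamped Langevin dynamics dx = (-U'(x) + F) dt + noise, with
    U(x) = V0*cos(x) a periodic potential and F a constant tilt.
    Estimates the effective diffusion coefficient from the growth of
    the positional variance, Var[x(t)] ~ 2*D_eff*t."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)
    sigma = np.sqrt(2.0 * kT * dt)           # Einstein relation, mobility = 1
    for _ in range(n_steps):
        force = V0 * np.sin(x) + F           # -dU/dx for U = V0*cos(x)
        x += force * dt + sigma * rng.standard_normal(n_particles)
    t = n_steps * dt
    return np.var(x) / (2.0 * t)

D_eff = simulate_tilted_diffusion()
D_free = 0.1  # free diffusion coefficient equals kT here (mobility = 1)
# Near the critical tilt F ~ V0 and for weak noise, D_eff can greatly
# exceed D_free, which is the enhancement discussed in the abstract.
```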
Abstract:
Observers are often required to adjust actions with objects that change their speed. However, no evidence for a direct sense of acceleration has been found so far. Instead, observers seem to detect changes in velocity within a temporal window when confronted with motion in the frontal plane (2D motion). Furthermore, recent studies suggest that motion-in-depth is detected by tracking changes of position in depth. Therefore, in order to sense acceleration in depth a kind of second-order computation would have to be carried out by the visual system. In two experiments, we show that observers misperceive acceleration of head-on approaches at least within the ranges we used [600-800 ms] resulting in an overestimation of arrival time. Regardless of the viewing condition (only monocular or monocular and binocular), the response pattern conformed to a constant velocity strategy. However, when binocular information was available, overestimation was highly reduced.
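The overestimation produced by a constant-velocity strategy can be seen in a toy computation (the numbers below are illustrative, not the experiment's stimuli): an object approaching with constant acceleration arrives earlier than a first-order extrapolation of its current speed predicts.

```python
import math

# Hypothetical head-on approach: initial distance d0, speed v0,
# constant acceleration a.
d0, v0, a = 10.0, 5.0, 4.0   # m, m/s, m/s^2

# True arrival time: solve d0 = v0*t + 0.5*a*t^2 for t > 0.
t_true = (-v0 + math.sqrt(v0**2 + 2 * a * d0)) / a

# A constant-velocity (first-order) strategy ignores a entirely:
t_cv = d0 / v0

# Ignoring the acceleration overestimates the arrival time.
assert t_cv > t_true
```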
Abstract:
Study carried out during a stay at the Associação para o Desenvolvimento da Aerodinâmica Industrial (ADAI) of the University of Coimbra, Portugal, between March and July 2006. ADAI has a testing laboratory and sufficient means to burn, in a controlled manner, previously delimited plots of forest terrain. This makes it possible to observe the phenomenon of forest fires at two different working scales. The objective was to obtain experimental data on the propagation of flame fronts advancing over fuel treated with retardants under the effect of slope or wind. Experimental field trials were attended, and two were carried out in laboratory facilities in which the effect of slope or wind speed could be varied. Owing to the large number of variables involved, a careful analysis of the data is still in progress.
Abstract:
Over recent decades, mainly owing to a change in eating habits, there has been a worldwide increase in chronic diseases (obesity, cardiovascular diseases, etc.). Mediterranean countries show a lower incidence of these diseases, which seems to be due to the so-called Mediterranean diet. The Mediterranean diet is characterized by a combination of olive oil as the main fat, abundant vegetables and fruit, legumes, nuts, cheese and yogurt, fish, bread, pasta, cereals and their derivatives, and a moderate consumption of wine and meat. This dietary model, rich in tocopherols, phytosterols and phytostanols that help reduce blood cholesterol, explains the lower incidence of cardiovascular disease in Mediterranean populations. These compounds inhibit the oxidative deterioration of oils and act as antipolymerization agents in frying oils, and they can lower cholesterol levels, preventing cardiovascular disease. Phytosterols and phytostanols can be found in free form or esterified with fatty acids, phenolic acids and glucose. The objectives of this work were the development of fast, reliable and robust analytical methods for tocopherols, phytosterols and phytostanols, and their application to nuts, rice bran oil, grape seed oil and products containing them. The first method was based on liquid chromatography (HPLC-DAD) with solid-phase extraction (SPE) as an alternative to saponification for the determination of free phytosterols; it was applied to samples of chocolates containing phytosterols. The second method was based on gas chromatography (GC-FID) with saponification and SPE to quantify free, esterified and total phytosterols and phytostanols. The annexed documents describe the developed methods in depth.
Abstract:
The Theory of General Relativity predicts that when a massive object undergoes a certain acceleration under certain conditions it must emit gravitational waves. These waves are highly energetic but interact very weakly with matter, and their emission points are very distant, so their detection is an extraordinarily complicated task. Consequently, the detection of these waves is believed to be far more feasible using instruments located in space. With this objective, the LISA (Laser Interferometer Space Antenna) mission was born, a joint NASA-ESA mission with launch planned for 2020-2025. To reduce the risks entailed by a first use of untested technology, together with the high economic cost of the LISA mission, a precursor mission was conceived. This mission will carry very advanced instruments: the LTP (LISA Technology Package), developed by the European Union, which will test LISA's technology, and the Drag Free flying system, which will test a series of thrusters used for attitude and position control of the satellite with nanometre precision. In particular, the LTP consists of two test masses separated by 35 centimetres and a laser interferometer that measures the variation of the relative distance between them. In this way, the LTP will measure the performance of the equipment and the possible interferences affecting the measurement. Among the noise sources are, among others, the solar wind and radiation pressure, electrostatic charges, the thermal gradient, voltage fluctuations and internal forces. One of these possible noise sources is the object of study of this doctoral thesis project: the presence inside the LTP of magnetic fields, which exert a force on the test masses, together with their estimation and control, taking into account the magnetic characteristics of the experiment and the dynamics of the satellite.
Abstract:
Automated systems requiring stability or motion control can be found in more and more fields. UAV or global positioning applications are the most common for this type of system, since they need very precise motion control. For this, inertial measurement units are used, which, by means of suitably positioned accelerometers and gyroscopes, together with a correction of the error the latter may introduce, provide an acceleration and an angular velocity from which the path followed by the unit can be reconstructed. The IMU, combined with a GPS through a Kalman filter, provides greater accuracy, as well as a starting point (supplied by the GPS), a track that can be plotted on a map and, should the GPS signal be lost, the ability to keep acquiring data from the IMU. These data can be collected and processed by an FPGA, which in turn can be synchronized with a PDA so that the user can see the motion of the system represented. This work focuses on the operation of the IMU and on data acquisition with the FPGA. It also introduces the Kalman filter for correcting sensor error.
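The IMU+GPS fusion described above can be sketched in one dimension: IMU acceleration drives the Kalman prediction step, and GPS position (when available) drives the update step, so the filter keeps dead-reckoning through GPS dropouts. All noise parameters below are illustrative assumptions, not values from the thesis.

```python
import numpy as np

def kalman_imu_gps(accels, gps_pos, dt=0.1, accel_var=0.5, gps_var=4.0):
    """1D position/velocity Kalman filter. accels: IMU accelerations,
    one per time step. gps_pos: GPS position per step, or None when the
    GPS signal is lost. Returns the estimated position track."""
    F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity model
    B = np.array([0.5 * dt**2, dt])          # acceleration input vector
    H = np.array([[1.0, 0.0]])               # GPS measures position only
    Q = accel_var * np.outer(B, B)           # process noise from the IMU
    R = np.array([[gps_var]])                # GPS measurement noise
    x = np.zeros(2)                          # state: [position, velocity]
    P = np.eye(2)
    track = []
    for a, z in zip(accels, gps_pos):
        x = F @ x + B * a                    # predict with the IMU
        P = F @ P @ F.T + Q
        if z is not None:                    # update only if GPS is present
            y = z - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        track.append(x[0])
    return track
```

With a GPS fix at every step the filter locks onto the measured track; when `gps_pos` turns to `None`, the estimate continues from integrated IMU data alone.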
Abstract:
Realistic rendering animation is known to be an expensive processing task when physically-based global illumination methods are used in order to improve illumination details. This paper presents an acceleration technique to compute animations in radiosity environments. The technique is based on an interpolated approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is introduced to the whole computation of the animated sequence to select important frames. These are fully computed and used as a base for the interpolation of all the sequence. The approach is completely view-independent. Once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique could be an interesting alternative to deterministic methods for computing non-interactive radiosity animations for moderately complex scenarios.
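The interpolation idea can be sketched as follows, assuming a flat per-patch radiosity array per key frame (the paper's Monte Carlo frame-selection criterion is not reproduced here):

```python
import numpy as np

def interpolate_radiosity(key_frames, key_times, t):
    """Linearly interpolate per-patch radiosity between the two fully
    computed key frames that bracket time t, exploiting temporal
    coherence. key_frames: list of arrays (one radiosity value per
    patch); key_times: frame times of those keys, ascending."""
    key_times = np.asarray(key_times, dtype=float)
    i = np.searchsorted(key_times, t)        # right bracketing key
    i = int(np.clip(i, 1, len(key_times) - 1))
    t0, t1 = key_times[i - 1], key_times[i]
    w = (t - t0) / (t1 - t0)                 # blend weight in [0, 1]
    return (1 - w) * key_frames[i - 1] + w * key_frames[i]
```

Being view-independent, the interpolated radiosity solution for any in-between frame can then be rendered from an arbitrary camera.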
Abstract:
The speed of front propagation in fractals is studied by using (i) the reduction of the reaction-transport equation into a Hamilton-Jacobi equation and (ii) the local-equilibrium approach. Different equations proposed for describing transport in fractal media, together with logistic reaction kinetics, are considered. Finally, we analyze the main features of wave fronts resulting from this dynamic process, i.e., why they are accelerated and what the exact form of this acceleration is.
Abstract:
Purpose: The objective of this study is to investigate the feasibility of detecting and quantifying 3D cerebrovascular wall motion from a single 3D rotational x-ray angiography (3DRA) acquisition within a clinically acceptable time, and of deriving from the estimated motion field inputs for further biomechanical modeling of the cerebrovascular wall. Methods: The whole motion cycle of the cerebral vasculature is modeled using a 4D B-spline transformation, which is estimated from a 4D to 2D + t image registration framework. The registration is performed by optimizing a single similarity metric between the entire 2D + t measured projection sequence and the corresponding forward projections of the deformed volume at their exact time instants. The joint use of two acceleration strategies, together with their implementation on graphics processing units, is also proposed so as to reach computation times close to clinical requirements. For further characterizing vessel wall properties, an approximation of the wall thickness changes is obtained through a strain calculation. Results: Evaluation on in silico and in vitro pulsating phantom aneurysms demonstrated an accurate estimation of wall motion curves. In general, the error was below 10% of the maximum pulsation, even when a substantially inhomogeneous intensity pattern was present. Experiments on in vivo data provided realistic aneurysm and vessel wall motion estimates, whereas in regions where motion was neither visible nor anatomically possible, no motion was detected. The use of the acceleration strategies enabled completing the estimation process for one entire cycle in 5-10 min without degrading the overall performance. The strain map extracted from our motion estimation provided a realistic deformation measure of the vessel wall.
Conclusions: The authors' technique has demonstrated that it can provide accurate and robust 4D estimates of cerebrovascular wall motion within a clinically acceptable time, although it has to be applied to a larger patient population prior to possible wide application to routine endovascular procedures. In particular, for the first time, this feasibility study has shown that in vivo cerebrovascular motion can be obtained intraprocedurally from a 3DRA acquisition. Results have also shown the potential of performing strain analysis using this imaging modality, thus making possible for the future modeling of biomechanical properties of the vascular wall.
Abstract:
This paper presents new estimates of total factor productivity growth in Britain for the period 1770-1860. We use a dual technique recently popularized by Hsieh (1999), and argue that the estimates we derive from factor prices are of similar quality to quantity-based calculations. Our results provide further evidence, derived from this independent set of sources, that productivity growth during the British Industrial Revolution was relatively slow. During the years 1770-1800, TFP growth was close to zero, according to our estimates. The period 1800-1830 experienced an acceleration of productivity growth. The Crafts-Harley view of the Industrial Revolution is thus reinforced. We also consider alternative explanations of slow productivity growth, and reject the interpretation that focuses on the introduction of steam as a general purpose technology.
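The dual technique infers TFP growth from factor prices rather than quantities: with factor shares as weights, TFP growth equals the share-weighted growth of real factor prices. A toy calculation with made-up numbers (not the paper's estimates):

```python
def dual_tfp_growth(wage_growth, rental_growth, land_rent_growth,
                    labor_share, capital_share, land_share):
    """Dual TFP growth: share-weighted sum of real factor price growth
    rates (each deflated by the output price). Shares must sum to 1."""
    assert abs(labor_share + capital_share + land_share - 1.0) < 1e-9
    return (labor_share * wage_growth
            + capital_share * rental_growth
            + land_share * land_rent_growth)

# Hypothetical numbers: real wages growing 0.3%/yr, real rental rate
# flat, real land rents growing 0.5%/yr.
g = dual_tfp_growth(0.003, 0.0, 0.005, 0.5, 0.35, 0.15)
# g = 0.00225, i.e. roughly 0.2% per year: slow productivity growth.
```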
Abstract:
It is generally accepted that the extent of phenotypic change between human and great apes is dissonant with the rate of molecular change. Between these two groups, proteins are virtually identical, cytogenetically there are few rearrangements that distinguish ape-human chromosomes, and rates of single-base-pair change and retrotransposon activity have slowed particularly within hominid lineages when compared to rodents or monkeys. Studies of gene family evolution indicate that gene loss and gain are enriched within the primate lineage. Here, we perform a systematic analysis of duplication content of four primate genomes (macaque, orang-utan, chimpanzee and human) in an effort to understand the pattern and rates of genomic duplication during hominid evolution. We find that the ancestral branch leading to human and African great apes shows the most significant increase in duplication activity both in terms of base pairs and in terms of events. This duplication acceleration within the ancestral species is significant when compared to lineage-specific rate estimates even after accounting for copy-number polymorphism and homoplasy. We discover striking examples of recurrent and independent gene-containing duplications within the gorilla and chimpanzee that are absent in the human lineage. Our results suggest that the evolutionary properties of copy-number mutation differ significantly from other forms of genetic mutation and, in contrast to the hominid slowdown of single-base-pair mutations, there has been a genomic burst of duplication activity at this period during human evolution.
Abstract:
When preparing an article on image restoration in astronomy, it is obvious that some topics have to be dropped to keep the work at reasonable length. We have decided to concentrate on image and noise models and on the algorithms to find the restoration. Topics like parameter estimation and stopping rules are also commented on. We start by describing the Bayesian paradigm and then proceed to study the noise and blur models used by the astronomical community. Then the prior models used to restore astronomical images are examined. We describe the algorithms used to find the restoration for the most common combinations of degradation and image models. Then we comment on important issues such as acceleration of algorithms, stopping rules, and parameter estimation. We also comment on the huge amount of information available to, and made available by, the astronomical community.