990 results for propagation dynamics


Relevance:

20.00%

Publisher:

Abstract:

In peripheral tissues, circadian gene expression can be driven either by local oscillators or by cyclic systemic cues controlled by the master clock in the brain's suprachiasmatic nucleus. In the latter case, systemic signals can activate immediate early transcription factors (IETFs) and thereby control rhythmic transcription. In order to identify IETFs induced by diurnal blood-borne signals, we developed an unbiased experimental strategy, dubbed Synthetic TAndem Repeat PROMoter (STAR-PROM) screening. This technique relies on the observation that most transcription factor binding sites exist at a relatively high frequency in random DNA sequences. Using STAR-PROM we identified serum response factor (SRF) as an IETF responding to oscillating signaling proteins present in human and rodent sera. Our data suggest that in mouse liver, SRF is regulated via dramatic diurnal changes in actin dynamics, leading to the rhythmic translocation of the SRF coactivator Myocardin-related transcription factor-B (MRTF-B) into the nucleus.

Relevance:

20.00%

Publisher:

Abstract:

The global economic and financial crisis is a challenge for all governments, but particularly for federal states, because divided and/or shared territorial powers make federations susceptible to coordination problems in fiscal policy making. This article explores the effects of the ongoing crisis on federal relations. Three kinds of problems that may cause federal tensions and conflicts are identified: opportunism of subgovernments, centralisation, and erosion of solidarity among members of the federation. Our analysis of the fiscal policies and federal conflicts of 11 federations between 2007 and the present reveals three kinds of coordination problems: shirking in the use of federal government grants, rent-seeking in equalisation payments, and over-borrowing and over-spending. Our results show that shirking remained limited to a few cases and occurred only in the first part of the crisis. However, rent-seeking and over-borrowing and over-spending led to a reduction of solidarity among subgovernments and to increased regulation of the fiscal discretion of the members of the federation. Subsequently, tensions in federal relations increased, although only in one case did this challenge the federal order.

Relevance:

20.00%

Publisher:

Abstract:

The main goal of this paper is to propose a convergent finite volume method for a reaction-diffusion system with cross-diffusion. First, we sketch an existence proof for a class of cross-diffusion systems. Then the standard two-point finite volume fluxes are used in combination with a nonlinear positivity-preserving approximation of the cross-diffusion coefficients. Existence and uniqueness of the approximate solution are addressed, and it is also shown that the scheme converges to the corresponding weak solution of the studied model. Furthermore, we provide a stability analysis to study pattern-formation phenomena, and we present two-dimensional numerical examples which exhibit the formation of nonuniform spatial patterns. From the simulations it is also found that the experimental rates of convergence are slightly below second order. The convergence proof uses two ingredients of interest for various applications, namely discrete Sobolev embedding inequalities with general boundary conditions and a space-time $L^1$ compactness argument that mimics the compactness lemma due to Kruzhkov. The proofs of these results are given in the Appendix.
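For readers who want to experiment, the following is a deliberately simplified one-dimensional sketch, not the scheme analysed in the paper: an explicit finite volume discretisation with standard two-point fluxes for a two-species reaction-diffusion system with an SKT-type cross-diffusion term, where the cross-diffusion part is evaluated on the nonnegative parts of the unknowns as a crude stand-in for a positivity-preserving approximation. All coefficients, reaction terms and initial data are illustrative assumptions.

```python
# A minimal 1D illustration (not the paper's scheme): explicit finite volumes with
# standard two-point fluxes for a two-species reaction-diffusion system with an
# SKT-type cross-diffusion term. Coefficients and initial data are illustrative.
import numpy as np

N, L, T = 100, 1.0, 0.05
h = L / N
dt = 0.2 * h**2                          # explicit scheme: diffusive time-step restriction
d1, d2, a12, a21 = 1.0, 1.0, 0.5, 0.5    # self- and cross-diffusion coefficients

x = (np.arange(N) + 0.5) * h
u = 1.0 + 0.1 * np.cos(2 * np.pi * x)    # cell averages of the two species
v = 1.0 - 0.1 * np.cos(2 * np.pi * x)

def fv_laplacian(w):
    """Sum of two-point fluxes over the faces of each cell, with no-flux boundaries."""
    wp = np.pad(w, 1, mode="edge")
    return (wp[2:] - 2.0 * wp[1:-1] + wp[:-2]) / h**2

def reaction(u, v):
    """Illustrative competitive Lotka-Volterra reaction terms."""
    return u * (1.0 - u - 0.5 * v), v * (1.0 - v - 0.5 * u)

t = 0.0
while t < T:
    up, vp = np.maximum(u, 0.0), np.maximum(v, 0.0)   # keep the cross terms nonnegative
    fu, fv = reaction(u, v)
    u = u + dt * (fv_laplacian(d1 * u + a12 * up * vp) + fu)
    v = v + dt * (fv_laplacian(d2 * v + a21 * up * vp) + fv)
    t += dt

print("u range:", u.min(), u.max())
```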

Relevance:

20.00%

Publisher:

Abstract:

The aim of this Master's thesis is to examine how companies balance knowledge sharing and knowledge protection in collaborative innovation projects, and how contracts, intellectual property rights and trust can affect this balance. In a collaboration, companies must share the necessary knowledge with their partner, but at the same time they must take care not to lose knowledge belonging to their core competence, and with it their competitive advantage. Companies have several means of preventing knowledge leakage. The thesis focuses on the use of patents, contracts and trade secrets as knowledge-protection mechanisms. These protection mechanisms affect the trust between partners, and thus also their willingness to share knowledge with each other. If the partners do not share enough knowledge with each other, the collaboration may fail. The roles and interplay of contracts, intellectual property rights and trust are studied in bilateral collaboration projects. The thesis presents four case examples compiled from interviews with a Finnish forest-industry company.

Relevance:

20.00%

Publisher:

Abstract:

The Munc13 gene family encodes molecules located at the synaptic active zone that regulate the reliability of synapses to encode information over a wide range of frequencies in response to action potentials. In the CNS, proteins of the Munc13 family are critical in regulating neurotransmitter release and synaptic plasticity. Although Munc13-1 is essential for synaptic transmission, Munc13-2 and Munc13-3 are, paradoxically, functionally dispensable at some synapses, whereas their loss at other synapses leads to increased frequency-dependent facilitation. We addressed this issue at the calyx of Held synapse, a giant glutamatergic synapse that we found to express all of these Munc13 isoforms. We studied their roles in the regulation of synaptic transmission and their impact on the reliability of information transfer. Through detailed electrophysiological analyses of Munc13-2, Munc13-3, and Munc13-2-3 knock-out and wild-type mice, we report that the combined loss of Munc13-2 and Munc13-3 led to an increase in the rate of calcium-dependent recovery and a change in the kinetics of release of the readily releasable pool. Furthermore, viral-mediated overexpression of a dominant-negative form of Munc13-1 at the calyx demonstrated that these effects are Munc13-1 dependent. Quantitative immunohistochemistry using Munc13-fluorescent protein knock-in mice revealed that Munc13-1 is the most highly expressed Munc13 isoform at the calyx and the only one highly colocalized with Bassoon at the active zone. Based on these data, we conclude that the Munc13-2 and Munc13-3 isoforms limit the ability of Munc13-1 to regulate calcium-dependent replenishment of the readily releasable pool and slow-pool to fast-pool conversion in central synapses.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we consider a discrete-time risk process allowing for delay in claim settlement, which introduces a certain type of dependence in the process. From martingale theory, an expression for the ultimate ruin probability is obtained, and Lundberg-type inequalities are derived. The impact of delay in claim settlement is then investigated. To this end, a convex order comparison of the aggregate claim amounts is performed with the corresponding non-delayed risk model, and numerical simulations are carried out with Belgian market data.
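As a companion to the simulation results mentioned above, here is a minimal Monte Carlo sketch built on illustrative assumptions rather than the paper's model or data: a discrete-time surplus process in which each claim may be settled one period late, used to estimate a finite-horizon ruin probability. The premium, claim distribution, delay probability and horizon are all invented for illustration.

```python
# A minimal Monte Carlo sketch (illustrative assumptions, not the paper's model or data):
# a discrete-time surplus process with possible one-period delay in claim settlement,
# used to estimate a finite-horizon ruin probability.
import numpy as np

rng = np.random.default_rng(0)

def ruin_probability(u0=10.0, premium=1.1, horizon=200, n_paths=5_000,
                     claim_mean=1.0, delay_prob=0.3):
    """Fraction of simulated paths whose surplus becomes negative before `horizon`."""
    ruined = 0
    for _ in range(n_paths):
        surplus, pending = u0, 0.0            # pending = claim incurred but not yet settled
        for _ in range(horizon):
            claim = rng.exponential(claim_mean)
            if rng.random() < delay_prob:     # settlement of this claim is deferred one period
                paid, pending = pending, claim
            else:                             # pay the new claim and any previously deferred one
                paid, pending = pending + claim, 0.0
            surplus += premium - paid
            if surplus < 0.0:
                ruined += 1
                break
    return ruined / n_paths

print("estimated ruin probability:", ruin_probability())
```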

Relevance:

20.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and obstacle modelling therefore had to be addressed. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and validation and further processing of the results; a module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density-dependence and stochasticity. A software package named HexaSpace was developed, which performs two functions: 1° calibrating the automaton on the basis of local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper); 2° running simulations. It allows the study of the spread of an invading species across a complex landscape made up of areas of varying suitability and of dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Similarly, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As this software was designed to proceed from raw data to a complex, realistic model, and as it benefits from an intuitive user interface, it may have many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
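To make the demographic building block concrete, here is a minimal sketch under stated assumptions (it is not SIM-Ibex): an age-structured Leslie matrix projection with a simple logistic-type density-dependence on fecundity and lognormal environmental noise. The vital rates, age-class structure and carrying capacity below are invented purely for illustration.

```python
# A minimal sketch (illustrative assumptions, not SIM-Ibex): age-structured Leslie
# matrix projection with density-dependent fecundity and environmental stochasticity.
import numpy as np

rng = np.random.default_rng(1)

fecundity = np.array([0.0, 0.8, 1.2, 1.2])   # offspring per individual, by age class (invented)
survival = np.array([0.6, 0.8, 0.85])        # survival from class i to class i+1 (invented)
K = 500.0                                    # carrying capacity (illustrative)

def leslie(fec):
    """Assemble a Leslie matrix from fecundities and the fixed survival rates."""
    A = np.zeros((4, 4))
    A[0, :] = fec                              # first row: reproduction
    A[1:, :-1][np.diag_indices(3)] = survival  # sub-diagonal: ageing with survival
    return A

def step(n):
    N = n.sum()
    noise = rng.lognormal(mean=0.0, sigma=0.2)            # environmental stochasticity
    dd_fec = fecundity * noise * max(0.0, 1.0 - N / K)    # density-dependent fecundity
    return leslie(dd_fec) @ n

n = np.array([100.0, 50.0, 30.0, 20.0])      # initial numbers per age class
for year in range(50):
    n = step(n)
print("population after 50 years:", round(n.sum(), 1))
```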

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this work is to show how the choice of the temporal dimension and of the spatial structure of the population influences an artificial evolutionary process. In the field of Artificial Evolution we can observe a common trend of synchronously evolving panmictic populations, i.e., populations in which any individual can be recombined with any other individual. Already in the 1990s, the works of Spiessens and Manderick, Sarma and De Jong, and Gorges-Schleuter pointed out that, if a population is structured according to a mono- or bi-dimensional regular lattice, the evolutionary process shows a different dynamic with respect to the panmictic case. In particular, Sarma and De Jong studied the selection pressure (i.e., the diffusion of a best individual when the only active operator is selection) induced by a regular bi-dimensional structure of the population, proposing a logistic model of the selection pressure curves. This model assumes that the diffusion of a best individual in a population follows an exponential law. We show that such a model is inadequate to describe the process, since the growth speed must be quadratic or sub-quadratic in the case of a bi-dimensional regular lattice. New linear and sub-quadratic models are proposed for modelling the selection pressure curves in mono- and bi-dimensional regular structures, respectively. These models are extended to describe the process when asynchronous evolution is employed. Different population dynamics imply different search strategies for the resulting algorithm when the evolutionary process is used to solve optimisation problems. A benchmark of both discrete and continuous test problems is used to study the search characteristics of the different topologies and population update schemes. In the last decade, the pioneering studies of Watts and Strogatz have shown that most real networks, in the biological and sociological worlds as well as in man-made structures, have mathematical properties that set them apart from regular and random structures. In particular, they introduced the concept of small-world graphs and showed that this new family of structures has interesting computing capabilities. Populations structured according to these new topologies are proposed, and their evolutionary dynamics are studied and modelled. We also propose asynchronous evolution for these structures, and the resulting evolutionary behaviours are investigated. Many man-made networks have grown, and are still growing, incrementally, and explanations have been proposed for their actual shape, such as Albert and Barabási's preferential attachment growth rule. However, many actual networks seem to have undergone some kind of Darwinian variation and selection. Thus, how these networks might have come to be selected is an interesting yet unanswered question. In the last part of this work, we show how a simple evolutionary algorithm can enable the emergence of these kinds of structures for two prototypical problems of the automata networks world, the majority classification and synchronisation problems.
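To illustrate the notion of selection pressure used above, here is a minimal takeover-time sketch under simplified assumptions (it is not the code used in this work): it tracks the spread of copies of a single best individual under selection only, comparing a synchronous 2D toroidal lattice with a von Neumann neighbourhood against a panmictic population with binary tournament selection. Population size, neighbourhood and selection operators are illustrative choices.

```python
# A minimal takeover-time sketch (simplified assumptions, not the thesis code):
# spread of a single best individual under selection only, on a synchronous 2D
# toroidal lattice versus a panmictic population with binary tournament selection.
import numpy as np

rng = np.random.default_rng(2)
SIDE = 32                                    # 32 x 32 = 1024 individuals

def takeover_lattice():
    """Generations needed for the best individual to take over a 2D torus."""
    pop = np.zeros((SIDE, SIDE), dtype=int)
    pop[SIDE // 2, SIDE // 2] = 1            # a single copy of the best individual
    gens = 0
    while pop.mean() < 1.0:
        neigh = np.stack([pop,
                          np.roll(pop, 1, 0), np.roll(pop, -1, 0),
                          np.roll(pop, 1, 1), np.roll(pop, -1, 1)])
        pop = neigh.max(axis=0)              # each cell copies the best of its neighbourhood
        gens += 1
    return gens

def takeover_panmictic(n=SIDE * SIDE):
    """Generations needed under binary tournament selection in a panmictic population."""
    pop = np.zeros(n, dtype=int)
    pop[0] = 1
    gens = 0
    while pop.mean() < 1.0:
        a, b = rng.integers(n, size=(2, n))
        pop = np.maximum(pop[a], pop[b])     # each slot receives the winner of a random pairing
        pop[0] = 1                           # elitism: never lose the best (keeps runs finite)
        gens += 1
    return gens

print("lattice takeover (generations):  ", takeover_lattice())
print("panmictic takeover (generations):", takeover_panmictic())
```

Running this sketch shows the qualitative difference discussed above: on the lattice the number of copies grows only quadratically with time, so takeover takes on the order of the lattice side length, whereas in the panmictic case growth is logistic and takeover takes only a few more generations than log2 of the population size.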

Relevance:

20.00%

Publisher:

Abstract:

At least since the Great Depression, explaining why there are business fluctuations has been one of the biggest challenges that the science of economics has had to face. The hope is that if we could better understand recessions, then we could also be more successful in overcoming them. This dissertation consists of three papers that are part of the general endeavor of economists to understand these fluctuations. The first paper discusses, for a particular model, whether a result related to fluctuations would still hold if time were modeled as continuous rather than discrete. The two other papers focus on price stickiness. The second paper discusses why, after a large devaluation, prices of non-tradables may change by only a small amount in comparison to the magnitude of the devaluation. The third paper examines price adjustment in a model in which information is imperfect and it is costly to change prices.

Relevance:

20.00%

Publisher:

Abstract:

This work concerns the experimental study of rapid granular shear flows in an annular Couette geometry. The flow is induced by continuous driving of the horizontal plate at the top of the granular bed in an annulus. The compressive pressure, driving torque, instantaneous bed height and rotational speed of the shearing plate are measured. Moreover, local stress fluctuations are measured in a medium made of steel spheres 2 and 3 mm in diameter. Both monodisperse and bidisperse packings are investigated to reveal the influence of size diversity on the intermittent features of granular materials. Experiments are conducted in an annulus that can contain up to 15 kg of spherical steel balls. Shearing of the granular medium takes place via the rotation of the upper plate, which compresses the material loaded inside the annulus. Fluctuations of the compressive force are measured locally at the bottom of the annulus using a piezoelectric sensor. Rapid shear flow experiments are carried out at different compressive forces and shear rates, and the sensitivity of the fluctuations is then investigated by different means for monodisperse and bidisperse packings. Another important feature of rapid granular shear flows is the formation of ordered structures upon shearing. Obtaining stable flows requires the amount of granular material (of uniform size distribution) loaded into the system to lie within a certain range. This is studied more deeply in this thesis. The results of the current work bring some new insights into deformation dynamics and intermittency in rapid granular shear flows. The experimental apparatus is modified in comparison to earlier investigations. The measurements produce data for various quantities sampled continuously from the start of shearing to the end. Static failure and dynamic shearing of a granular medium are investigated. The results of this work reveal some important features of failure dynamics and structure formation in the system. Furthermore, some computer simulations are performed in a 2D annulus to examine the nature of kinetic energy dissipation. It is found that turbulent flow models can statistically represent rapid granular flows with high accuracy. In addition to academic outcomes and scientific publications, our results have a number of technological applications in grinding, mining and large-scale grain storage.

Relevance:

20.00%

Publisher:

Abstract:

Building and sustaining competitive advantage through the creation of market imperfections is challenging in a constantly changing business environment, particularly since the sources of such advantages are increasingly knowledge-based. Facilitated by improved networks and communication, knowledge spills over to competitors more easily than before, thus creating an appropriability problem: the inability of an innovating firm to utilize its innovations commercially. Consequently, as the importance of intellectual assets increases, their protection also calls for new approaches. Companies have various means of protection at their disposal, and by taking advantage of them they can make intangibles more non-transferable and prevent, or at least delay, imitation of their most crucial intellectual assets. However, creating barriers against imitation has another side to it: the transfer of knowledge in situations requiring knowledge sharing may be unintentionally obstructed. The aim of this thesis is to increase understanding of how firms can balance knowledge protection and sharing so as to benefit most from their knowledge assets. Thus, knowledge protection is approached through an examination of the appropriability regime of a firm, i.e., the combination of available and effective means of protecting innovations, their profitability, and the increased rents due to R&D. A further aim is to provide a broader understanding of the formation and structure of the appropriability regime. The study consists of two parts. The first part introduces the research topic and the overall results of the study, and the second part consists of six complementary research publications covering various appropriability issues. The thesis contributes to the existing literature in several ways. Although there is a wide range of prior research on appropriability issues, much of it is restricted either to the study of individual appropriability mechanisms or to comparing certain features of them. These approaches are combined here, and the relevant theoretical concepts are clarified and developed. In addition, the thesis provides empirical evidence of the formation of the appropriability regime, which is consequently presented as an adaptive process. Thus, a framework is provided that better corresponds to the complex reality of the current business environment.

Relevance:

20.00%

Publisher:

Abstract:

Combustion of wood is increasing because of the need to decrease emissions of carbon dioxide and the amount of waste going to landfills. Wood-based fuels are often scattered over a large area. The transport distances should be short enough to prevent excessive costs, and so the size of heating and power plants using wood fuels is often rather small. Combustion technologies for small units have to be developed to achieve efficient and environmentally friendly energy production. Furnaces that use different packed-bed combustion or gasification techniques are often the most economical in small-scale energy production. The ignition front propagation rate affects the stability, heat release rate and emissions of packed-bed combustion. Ignition front propagation against the airflow in packed beds of wood fuels has been studied. The research has been carried out mainly experimentally. Theoretical aspects have been considered to draw conclusions from the experimental results. The effects of airflow rate, moisture content of the fuel, size, shape and density of the particles, and porosity of the bed on the propagation rate of the ignition front have been studied. The experiments were carried out in a pot furnace. The fuels used in the experiments were mainly real wood fuels that are often burned in energy production. The fuel types were thin wood chips, sawdust, shavings, wood chips, and pellets of different sizes. A few mixtures of the above were also tested. An increase in the moisture content of the fuel decreases the propagation rate of the ignition front and narrows the range of possible airflow rates, because of the energy needed for the evaporation of water and the dilution of the volatile gases by the evaporated steam. An increase in the airflow rate increases the ignition rate until a maximum rate of propagation is reached, after which it decreases. The maximum flame propagation rate is not always reached under stoichiometric combustion conditions. An increase in particle size and density shifts the optimum airflow rate towards fuel-lean conditions. Mixing of small and large particles is often advantageous, because small particles make it possible to reach the maximum ignition rate in fuel-rich conditions, and large particles widen the range of possible airflow rates. A correlation was found for the maximum rate of ignition front propagation in different wood fuels. According to the correlation, the maximum ignition mass flux increases when the sphericity of the particles and the porosity of the bed increase and the moisture content of the fuel decreases. Another fit was found between sphericity and porosity: an increase in sphericity decreases the porosity of the bed. The reasons for the observed results are discussed.
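As a small illustration of the two geometric quantities entering the reported correlation, the sketch below computes the standard Wadell sphericity of an idealised cylindrical pellet and the porosity of a packed bed from bulk and particle densities. The pellet dimensions and densities are illustrative assumptions, not values from these experiments, and the correlation itself is not reproduced here.

```python
# Illustrative helper calculations (assumed example values, not experimental data):
# Wadell sphericity of a cylindrical pellet and porosity of a packed bed.
import math

def sphericity_cylinder(d, l):
    """Wadell sphericity: surface area of the equal-volume sphere / particle surface area."""
    volume = math.pi * d**2 / 4 * l
    area = math.pi * d * l + 2 * math.pi * d**2 / 4
    return math.pi**(1 / 3) * (6 * volume)**(2 / 3) / area

def bed_porosity(bulk_density, particle_density):
    """Void fraction of the bed from bulk and particle densities."""
    return 1.0 - bulk_density / particle_density

# Example: an 8 mm x 20 mm pellet in a bed with assumed densities (kg/m^3).
print(f"sphericity = {sphericity_cylinder(0.008, 0.020):.2f}")   # ~0.80
print(f"porosity   = {bed_porosity(650.0, 1200.0):.2f}")         # ~0.46
```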

Relevance:

20.00%

Publisher:

Abstract:

It is shown that the sameness of the inertial mass and the gravitational mass is an assumption and not a consequence of the equivalence principle. In the context of Sciama's inertia theory, the sameness of the inertial and gravitational masses is discussed, and a condition that must be satisfied experimentally is given. The inertial force proposed by Sciama is, in a simple case, derived from Assis' inertia theory, which is based on the introduction of a Weber-type force. The origin of the inertial force is fully justified by taking into account that the Weber force is, in fact, an approximation of a simple retarded potential, see [18, 19]. It is also shown how inertial forces can be derived from some solutions of the general relativistic equations. We ask whether Assis' theory of inertia is included in the framework of General Relativity. In the context of the theory of inertia developed in the present paper, we establish the relation between the constant acceleration $a_0$ that appears in the classical Modified Newtonian Dynamics (MOND) theory and the Hubble constant $H_0$, i.e. $a_0 \approx c H_0$.
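As a quick numerical sanity check on the closing relation, the snippet below compares $c H_0$ with the commonly quoted MOND acceleration scale, using standard reference values ($H_0 \approx 70$ km/s/Mpc, $a_0 \approx 1.2 \times 10^{-10}$ m/s$^2$) rather than numbers taken from the paper; the two agree to within an order of magnitude.

```python
# A quick order-of-magnitude check using commonly quoted reference values (these numbers
# are not taken from the paper): compare c*H0 with the canonical MOND acceleration scale.
c = 2.998e8                      # speed of light, m/s
H0 = 70.0 * 1000.0 / 3.086e22    # Hubble constant: 70 km/s/Mpc expressed in 1/s
a0_mond = 1.2e-10                # commonly quoted MOND acceleration scale, m/s^2

print(f"c*H0      = {c * H0:.2e} m/s^2")      # ~6.8e-10 m/s^2
print(f"a0 (MOND) = {a0_mond:.2e} m/s^2")
print(f"ratio     = {c * H0 / a0_mond:.1f}")  # same order of magnitude
```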