25 results for Branching Processes with Immigration
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Recent research highlights the role played by emotions at the core of cognitive processes and, in particular, in creativity, understood not as a faculty but as a complex, dynamic process that depends on several factors intrinsic and extrinsic to the subject. In parallel, the pedagogical approach in the arts has changed considerably since the 1980s, shifting attention from production to the process of knowledge and its content. This change led to a paradigm shift: instead of blindly introducing students to the details and methods of artistic production, isolated from their social context, the aim is now to awaken their aesthetic gaze. As with Duchamp, the goal is a qualitative turn, from the object to the context.
Abstract:
Final Master's Project for the award of the degree of Master in Chemical and Biological Engineering
Abstract:
Final Master's Project for the award of the degree of Master in Chemical and Biological Engineering
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications, which has motivated many initiatives to develop scientific workflow tools. However, existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from the specification of workflow tasks, decentralizing the control of workflow activities, and allowing tasks to run autonomously on distributed infrastructures, for instance on Clouds. Furthermore, many workflow tools only support the execution of Directed Acyclic Graphs (DAGs), without the concept of iteration, in which activities execute over millions of iterations during long periods of time, and without support for dynamic workflow reconfiguration after a given iteration. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on the Process Networks model, in which the workflow activities (AWAs) are autonomic processes with independent control that can run in parallel on distributed infrastructures, e.g. on Clouds. Each AWA executes a Task developed as a Java class that implements a generic interface, allowing end-users to code their applications without concern for low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space, which also supports dynamic workflow reconfiguration and monitoring of workflow execution. We describe how AWARD supports dynamic reconfiguration and discuss typical workflow reconfiguration scenarios. For evaluation, we describe experimental results of AWARD workflow executions in several application scenarios, mapped to a small dedicated cluster and to the Amazon Elastic Compute Cloud (EC2).
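The abstract above describes AWA tasks as Java classes implementing a generic interface, coordinated through a shared tuple space. As an illustration only, the following minimal Python sketch shows that pattern; TupleSpace, Task, and all other names here are hypothetical, not the actual AWARD API.

import queue
from abc import ABC, abstractmethod

class TupleSpace:
    """Toy shared tuple space: tuples are keyed by a tag; take() blocks."""
    def __init__(self):
        self._channels = {}

    def _channel(self, tag):
        return self._channels.setdefault(tag, queue.Queue())

    def put(self, tag, value):
        self._channel(tag).put(value)

    def take(self, tag):
        return self._channel(tag).get()  # blocks until a tuple with this tag arrives

class Task(ABC):
    """Generic task interface: end-users implement execute() only."""
    @abstractmethod
    def execute(self, value):
        ...

class DoubleTask(Task):
    def execute(self, value):
        return 2 * value

def run_activity(task, space, in_tag, out_tag, iterations):
    """One activity loop: take input tuples, run the task, emit result tuples."""
    for _ in range(iterations):
        space.put(out_tag, task.execute(space.take(in_tag)))

space = TupleSpace()
for i in range(3):
    space.put("in", i)
run_activity(DoubleTask(), space, "in", "out", 3)
print([space.take("out") for _ in range(3)])  # [0, 2, 4]

In the real model each such activity would run as an independent process, possibly on a different machine, with the tuple space providing the data-driven coordination.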
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications. However, existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from task specification, decentralizing the control of workflow activities so that their tasks can run on distributed infrastructures, and supporting dynamic workflow reconfiguration. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on Process Networks, in which the workflow activities (AWAs) are autonomic processes with independent control that can run in parallel on distributed infrastructures. Each AWA executes a task developed as a Java class with a generic interface, allowing end-users to code their applications without dealing with low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space, which also enables dynamic workflow reconfiguration. For evaluation, we describe experimental results of AWARD workflow executions in several application scenarios, mapped to the Amazon Elastic Compute Cloud (EC2).
Abstract:
In this contribution, we investigate the low-temperature, low-density behaviour of dipolar hard-sphere (DHS) particles, i.e., hard spheres with dipoles embedded in their centres. We aim to describe the DHS fluid in terms of a network of chains and rings (the fundamental clusters) held together by branching points (defects) of different nature. We first introduce a systematic way of classifying inter-cluster connections according to their topology, and then employ this classification to analyse the geometric and thermodynamic properties of each class of defects, as extracted from state-of-the-art equilibrium Monte Carlo simulations. By computing the average density and energetic cost of each defect class, we find that the relevant contribution to inter-cluster interactions is indeed provided by (rare) three-way junctions and by four-way junctions arising from parallel or anti-parallel locally linear aggregates. All other (numerous) defects are either intra-cluster or associated with low cluster-cluster interaction energies, suggesting that they do not play a significant part in the thermodynamic description of the self-assembly processes of dipolar hard spheres.
Abstract:
In this article, we calibrate the Vasicek interest rate model under the risk-neutral measure by learning the model parameters using Gaussian process regression. The calibration is done by maximizing the likelihood of zero coupon bond log prices, using mean and covariance functions computed analytically, as well as likelihood derivatives with respect to the parameters. The maximization uses the conjugate gradient method. The only prices needed for calibration are zero coupon bond prices, and the parameters are obtained directly under the arbitrage-free risk-neutral measure.
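For reference (standard results, not restated in the abstract), the risk-neutral Vasicek dynamics and the closed-form zero coupon bond price whose log prices enter the likelihood are, with a the mean-reversion speed, b the long-run mean, and sigma the volatility:

\[ dr_t = a\,(b - r_t)\,dt + \sigma\,dW_t, \qquad P(t,T) = A(t,T)\,e^{-B(t,T)\,r_t}, \]
\[ B(t,T) = \frac{1 - e^{-a(T-t)}}{a}, \qquad \ln A(t,T) = \left(b - \frac{\sigma^2}{2a^2}\right)\bigl(B(t,T) - (T-t)\bigr) - \frac{\sigma^2\,B(t,T)^2}{4a}. \]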
Abstract:
Fluorescent protein microscopy imaging is nowadays one of the most important tools in biomedical research. However, the resulting images have a low signal-to-noise ratio and their intensity decays over time due to the photobleaching effect. This phenomenon is a consequence of the decrease in the radiation emission efficiency of the tagging protein, which permanently loses its ability to fluoresce due to photochemical reactions induced by the incident light. The Poisson multiplicative noise that corrupts these images, together with the quality degradation due to photobleaching, makes long-term biological observation very difficult. This paper describes a denoising algorithm for Poisson data in which the photobleaching effect is explicitly taken into account. The algorithm is designed in a Bayesian framework where the data-fidelity term models the Poisson noise generation process as well as the exponential intensity decay caused by photobleaching. The prior term is built from Gibbs priors and log-Euclidean potential functions, suited to the positivity-constrained nature of the parameters to be estimated. Monte Carlo tests with synthetic data are presented to characterize the performance of the algorithm, and one example with real data illustrates its application.
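A plausible form of the data-fidelity term sketched above (the notation is ours, not the paper's: x_i is the underlying intensity at pixel i, lambda the photobleaching decay rate, and y_{i,t} the count observed at time t):

\[ y_{i,t} \sim \mathrm{Poisson}\!\left(x_i\, e^{-\lambda t}\right), \qquad -\ln p(y \mid x, \lambda) = \sum_{i,t}\left( x_i\, e^{-\lambda t} - y_{i,t}\,\ln\!\left(x_i\, e^{-\lambda t}\right) \right) + \mathrm{const}, \]

to which the Gibbs prior terms are added before MAP estimation.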
Abstract:
We calculate the equilibrium thermodynamic properties, percolation threshold, and cluster distribution functions for a model of associating colloids, which consists of hard spherical particles having on their surfaces three short-ranged attractive sites (sticky spots) of two different types, A and B. The thermodynamic properties are calculated using Wertheim's perturbation theory of associating fluids. This also allows us to find the onset of self-assembly, which can be quantified by the maxima of the specific heat at constant volume. The percolation threshold is derived, under the no-loop assumption, for the correlated bond model: in all cases there are two percolated phases that become identical at a critical point, when one exists. Finally, the cluster size distributions are calculated by mapping the model onto an effective model characterized by a state-dependent functionality f̄ and a unique bonding probability p̄. The mapping is based on the asymptotic limit of the cluster distribution functions of the generic model, and the effective parameters are defined through the requirement that the equilibrium cluster distributions of the true and effective models have the same number-averaged and weight-averaged sizes at all densities and temperatures. We also study the model numerically in the case where BB interactions are missing. In this limit, AB bonds either provide branching between A-chains (Y-junctions) if ε_AB/ε_AA is small, or drive the formation of a hyperbranched polymer if ε_AB/ε_AA is large. We find that the theoretical predictions describe the numerical data quite accurately, especially in the region where Y-junctions are present. There is fairly good agreement between theoretical and numerical results both for the thermodynamic properties (number of bonds and phase coexistence) and for the connectivity properties of the model (cluster size distributions and percolation locus).
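For orientation: in the classical loopless, uncorrelated (Flory-Stockmayer) limit, a system of f̄-functional units with bonding probability p̄ percolates at

\[ \bar{p}_c = \frac{1}{\bar{f} - 1}; \]

the correlated-bond derivation in the paper generalizes this kind of threshold to sites of two distinct types.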
Abstract:
The majority of structures worldwide use concrete as their main material. This is because concrete is economically feasible, owing to its undemanding production technology and ease of use. However, it is widely recognized that concrete production has a strong environmental impact on the planet. The use of natural aggregates is one of the most important problems of concrete production nowadays, since they are obtained from limited, and in some countries scarce, resources. In Portugal, although there are enough stone quarries to cover coarse aggregate needs for several more years, supplies of fine aggregates are becoming scarcer, especially in the northern part of the country. On the other hand, as concrete structures' life cycles come to an end, an urgent need emerges to establish technically and economically viable solutions for demolition debris, other than use as road base and quarry fill. This paper presents a partial life cycle assessment (LCA) of concrete made with fine recycled concrete aggregates, performed with the EcoConcrete tool. EcoConcrete is a tailor-made, interactive learning and communication tool promoted by the Joint Project Group (JPG) on the LCA of concrete to qualify and quantify the overall environmental impact of concrete products. It consists of an interactive Excel spreadsheet in which several environmental inputs (material quantities, distances from origin to production site, production processes) and outputs (material, energy, emissions to air, water, soil, or waste) are collected in a life cycle inventory and then processed to determine the environmental impact (assessment) of the analysed concrete, in terms of ozone layer depletion, smog, or the "greenhouse" effect.
Abstract:
Magma flow in dykes is still not well understood; some reported magnetic fabrics are contradictory, and the potential effects of exsolution and metasomatism on magnetic properties remain open to debate. We therefore studied the Messejana-Plasencia dyke (MPD), a long dyke made of segments of different thicknesses that record distinct degrees of metasomatism. Oriented dolerite samples were collected along several cross-sections and characterized by means of microscopy and magnetic analyses. The results show that the effects of metasomatism on rock mineralogy are important, and that metasomatic processes can greatly influence the anisotropy degree and mean susceptibility only when rocks are strongly affected by metasomatism. Petrography, scanning electron microscopy (SEM) and bulk magnetic analyses reveal a high-temperature oxidation-exsolution event, experienced by the very early Ti-spinels during the early stages of magma cooling, observed mostly in the central domains of the thick dyke segments. Exsolution reduced the grain size of the magnetic carrier (multidomain to single-domain transformation), producing composite fabrics involving inverse fabrics. These are likely responsible for a significant number of the 'abnormal' fabrics, which make the interpretation of magma flow much more complex. By choosing to use only the 'normal' fabric for magma flow determination, we reduced the number of relevant sites by 50 per cent. In these sites, the imbrication angle of the magnetic foliation relative to the dyke wall strongly suggests flow, with end-members indicating vertical-dominated flow (seven sites) and horizontal-dominated flow (three sites).
Abstract:
When a paleomagnetic pole is sought in an igneous body, the host rocks should be subjected to a contact test to ensure that the determined paleopole has the age of the intrusion. If the contact test is positive, it precludes the possibility that the measured magnetization is a later effect. We therefore investigated the variations of the remanent magnetization along cross-sections of the rocks hosting the Foum Zguid dyke (southern Morocco) and of the dyke itself. A positive contact test was obtained, but it is mainly related to chemical/crystalline remanent magnetization acquired by the host rocks through metasomatic processes during magma intrusion and cooling, and not only to thermoremanent magnetization, as ordinarily assumed in standard studies. Paleomagnetic data obtained within the dyke therefore reflect the Earth's magnetic field during emplacement of this well-dated (196.9 +/- 1.8 Ma) intrusion.
Abstract:
The rapid growth in genetics and molecular biology, combined with the development of techniques for genetically engineering small animals, has led to increased interest in in vivo small animal imaging, applied most frequently to mice and rats, which are ubiquitous in modeling human diseases and testing treatments. The use of PET in small animals allows each subject to serve as its own control, reducing inter-animal variability; this makes longitudinal studies on the same animal possible and improves the accuracy of biological models. However, small animal PET still suffers from several limitations: the amounts of radiotracer needed, limited scanner sensitivity, image resolution, and image quantification could all clearly benefit from additional research. Because nuclear medicine imaging deals with radioactive decay, the emission of radiation energy through photons and particles, and the detection of these quanta and particles in different materials, the Monte Carlo method is an important simulation tool in both nuclear medicine research and clinical practice. To optimize the quantitative use of PET in clinical practice, data- and image-processing methods are also a field of intense interest and development. The evaluation of such methods often relies on simulated data and images, since these offer control of the ground truth. Monte Carlo simulations are widely used for PET simulation because they take into account all the random processes involved in PET imaging, from the emission of the positron to the detection of the photons by the detectors. Simulation techniques have thus become an important and indispensable complement for a wide range of problems that cannot be addressed by experimental or analytical approaches.
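As a flavour of what such simulations involve (a deliberately toy sketch, not GATE or any real PET simulator; the phantom radius and attenuation coefficient below are illustrative assumptions), the following Monte Carlo estimates the fraction of annihilation events whose two back-to-back 511 keV photons both escape a uniform water phantom unattenuated.

import math
import random

MU_WATER = 0.0096   # linear attenuation of water at 511 keV, ~0.096 cm^-1 (here per mm)
R_PHANTOM = 15.0    # illustrative mouse-sized phantom radius, in mm

def path_to_boundary(x, y, dx, dy, r=R_PHANTOM):
    """Distance from the interior point (x, y) to the circle of radius r
    along the unit direction (dx, dy)."""
    b = x * dx + y * dy
    c = x * x + y * y - r * r          # negative inside the phantom
    return -b + math.sqrt(b * b - c)

def coincidence_fraction(n_events, seed=0):
    """Fraction of annihilations whose two photons both escape unattenuated.
    Both photons survive with probability exp(-mu * (l1 + l2)), so the
    coincidence attenuation depends only on the total chord length."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_events):
        # uniform annihilation point inside the disc (rejection sampling)
        while True:
            x = rng.uniform(-R_PHANTOM, R_PHANTOM)
            y = rng.uniform(-R_PHANTOM, R_PHANTOM)
            if x * x + y * y <= R_PHANTOM * R_PHANTOM:
                break
        phi = rng.uniform(0.0, math.pi)        # back-to-back emission direction
        dx, dy = math.cos(phi), math.sin(phi)
        total_path = (path_to_boundary(x, y, dx, dy)
                      + path_to_boundary(x, y, -dx, -dy))
        if rng.random() < math.exp(-MU_WATER * total_path):
            detected += 1
    return detected / n_events

print(coincidence_fraction(100_000))

Real simulators add detector geometry and response, scatter, positron range, and photon non-collinearity on top of this attenuation step.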
Abstract:
This article describes work performed on the assessment of the levels of airborne ultrafine particles emitted by two welding processes, metal active gas (MAG) welding of carbon steel and friction stir welding (FSW) of aluminium, in terms of the surface area deposited in the alveolar tract of the lung, measured with a nanoparticle surface-area monitor. The results showed that the emission of ultrafine particles depends on the process parameters and clearly demonstrated the presence of ultrafine particles above background levels. Of the processes tested, FSW produced the lowest levels of alveolar-deposited surface area, unlike MAG. Nevertheless, all the tested processes resulted in significant doses of ultrafine particles liable to be deposited in the lungs of exposed workers.
Abstract:
Fluorescence confocal microscopy (FCM) is now one of the most important tools in biomedical research. It makes it possible to accurately study the dynamic processes occurring inside the cell and its nucleus by following the motion of fluorescent molecules over time. Due to the small amount of acquired radiation and the huge optical and electronic amplification, FCM images are usually corrupted by a severe type of Poisson noise. This noise may be even more damaging when very low-intensity incident radiation is used to avoid phototoxicity. In this paper, a Bayesian algorithm is proposed to remove the intensity-dependent Poisson noise corrupting FCM image sequences. The observations are organized in a 3-D tensor in which each plane is one of the images of a cell nucleus acquired over time using the fluorescence loss in photobleaching (FLIP) technique. The method removes the noise by jointly exploiting spatial and temporal correlations, using an anisotropic 3-D filter that may be tuned separately in the spatial and temporal dimensions. Tests using synthetic and real data are presented to illustrate the application of the algorithm, together with a comparison against several state-of-the-art algorithms.
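The Bayesian algorithm in the paper is considerably more elaborate; purely to illustrate the idea of an anisotropic 3-D filter with separately tuned spatial and temporal strengths, here is a minimal sketch (all parameter values are arbitrary) applying a separable Gaussian to a (time, y, x) stack.

import numpy as np
from scipy.ndimage import gaussian_filter

# Synthetic FLIP-like sequence: 64 frames of 128 x 128 pixels, Poisson-noisy,
# with an exponential intensity decay mimicking photobleaching.
rng = np.random.default_rng(0)
t = np.arange(64).reshape(-1, 1, 1)
clean = 50.0 * np.exp(-0.02 * t) * np.ones((64, 128, 128))
noisy = rng.poisson(clean).astype(float)

# Anisotropic 3-D smoothing: the Gaussian sigma differs along the time axis
# and the two spatial axes, so temporal and spatial correlations are
# exploited with independently chosen strengths.
sigma_time, sigma_space = 2.0, 1.0
denoised = gaussian_filter(noisy, sigma=(sigma_time, sigma_space, sigma_space))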