965 results for SIMULATING FLUIDS
Abstract:
The Greater Himalayan leucogranites are a discontinuous suite of intrusions emplaced in thickened crust during the Miocene southward ductile extrusion of the Himalayan metamorphic core. Melt-induced weakening is thought to have played a critical role in the strain localization that facilitated the extrusion. Recent advances in centrifuge analogue modelling techniques allow a broader range of crustal deformation behaviors to be replicated, enhancing our understanding of large hot orogens. Polydimethylsiloxane (PDMS) is commonly used in centrifuge experiments to model weak melt zones. Difficulties in handling PDMS had, until now, limited its emplacement to models prior to any deformation. A new modelling technique has been developed in which PDMS is emplaced into models that have already been subjected to some shortening. This technique aims to better characterize the effects of melt on strain localization and on potential decoupling between structural levels within an evolving orogenic system. Models are subjected to an early stage of shortening, followed by the introduction of PDMS, and then a final stage of shortening. Theoretical percentages of partial melt and their effect on rock strength are considered when adding a specific percentage of PDMS to each model. Because of the limited size of the models, only PDMS sheets of 3 mm thickness, varying in length and width, were used. Within undeformed packages, minimal surface and internal deformation occurs when PDMS is emplaced in the lower layer of the model, producing a vertical volume increase of ~20% within the package, whereas emplacement of PDMS into the middle layer drags the middle laminations into the lower layer and produces a vertical volume increase of ~30%. Emplacement of PDMS results in ~7% shortening for both undeformed and deformed models. Deformed models undergo ~20% additional shortening after two rounds of deformation.
Strain localization and decoupling between units occur in deformed models, where the degree of deformation varies with the amount of partial melt present. Surface deformation, visible as a bulge, mode 1 extension cracks, and varying surface strain ellipses, differs depending on whether PDMS is present. Better control during emplacement is achieved when PDMS is added to cooler models, resulting in reduced internal deformation within the middle layer.
Abstract:
An all-fiber-optical method to monitor the densities and viscosities of liquids using a steel cantilever (4 × 0.3 × 0.08 cm³) is presented. Actuation is performed by photothermally heating the cantilever at its base with an intensity-modulated 808 nm diode laser. The cantilever vibrations are picked up by an in-fiber Fabry-Pérot cavity sensor attached along the length of the cantilever. The fluid properties can be related to the resonance characteristics of the cantilever: a shift in the resonance frequency corresponds to a change in fluid density, and the width of the resonance peak gives information on the dynamic viscosity after calibration of the system. Aqueous glycerol, sucrose, and ethanol samples in the range of 0.79–1.32 g cm−3 (density) and 0.89–702 mPa·s (viscosity) were used to investigate the limits of the sensor. Good agreement with literature values was found, with an average deviation of around 10% for the dynamic viscosities and 5–16% for the mass densities. A variety of clear and opaque commercial spirits and a viscous sample of unknown properties (home-made maple syrup) were analyzed and compared with literature values. The unique detection mechanism allows the characterization of opaque samples, a capability beyond that of conventional microcantilever sensors. The method is expected to be beneficial in various industrial sectors, such as quality control of food samples.
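As a rough illustration of the calibration idea described above, the sketch below inverts a one-mode added-mass model to recover density from the resonance frequency and viscosity from the peak width. The model form and all constants (F0, ALPHA, BETA) are hypothetical stand-ins for a real calibration against reference liquids, not the authors' actual equations.

```python
import math

# Hypothetical calibration constants (would be fitted to reference liquids):
F0 = 1200.0    # resonance frequency in air (Hz), assumed
ALPHA = 0.35   # added-mass coefficient per unit density (cm^3/g), assumed
BETA = 2.0e-3  # damping coefficient linking peak width to sqrt(rho*eta), assumed

def resonance(rho):
    """Resonance frequency (Hz) in a fluid of density rho (g/cm^3),
    from a simple added-mass model f = F0 / sqrt(1 + ALPHA*rho)."""
    return F0 / math.sqrt(1.0 + ALPHA * rho)

def density_from_resonance(f):
    """Invert the added-mass model: density from the measured resonance."""
    return ((F0 / f) ** 2 - 1.0) / ALPHA

def viscosity_from_width(delta_f, rho):
    """Dynamic viscosity from the resonance peak width delta_f (Hz),
    assuming the model delta_f = sqrt(rho * eta) / BETA."""
    return (delta_f * BETA) ** 2 / rho
```

A denser fluid lowers the resonance, and a broader peak (at fixed density) indicates a higher viscosity, matching the detection mechanism described in the abstract.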
Abstract:
Simulating the efficiency of business processes can reveal crucial bottlenecks for manufacturing companies and lead to significant optimizations, resulting in decreased time to market, more efficient resource utilization, and higher profit. While such business optimization software is widely used by larger companies, SMEs typically lack the expertise and resources to exploit these advantages efficiently. The aim of this work is to explore how simulation software vendors and consultancies can extend their portfolio to SMEs by providing business process optimization based on a cloud computing platform. By executing simulation runs in the cloud, software vendors and associated business consultancies can access large computing power and data storage capacity on demand, run large simulation scenarios on behalf of their clients, analyze simulation results, and advise their clients on process optimization. The solution is mutually beneficial for both the vendor/consultant and the end-user SME. End-user companies pay only for the service, without large upfront costs for software licenses and expensive hardware. Software vendors can extend their business towards the SME market with potentially substantial benefits.
Abstract:
The modeling technique of Mackay et al. is applied to simulate the coronal magnetic field of NOAA active region AR10977 over a seven-day period (2007 December 2-10). The simulation is driven by a sequence of line-of-sight component magnetograms from SOHO/MDI and evolves the coronal magnetic field through a continuous series of non-linear force-free states. Comparison with Hinode/XRT observations shows that the simulation reproduces many features of the active region's evolution. In particular, it describes the formation of a flux rope across the polarity inversion line during flux cancellation. The flux rope forms at the same location as an observed X-ray sigmoid. After five days of evolution, the free magnetic energy contained within the flux rope was found to be 3.9 × 10³⁰ erg. This value is more than sufficient to account for the B1.4 GOES flare observed from the active region on 2007 December 7. At the time of the observed eruption, the flux rope was found to contain 20% of the active region flux. We conclude that the modeling technique proposed by Mackay et al., which directly uses observed magnetograms to energize the coronal field, is a viable method for simulating the evolution of the coronal magnetic field.
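The quasi-static evolution through force-free states described above is commonly achieved with a magnetofrictional relaxation, in which the plasma velocity is taken proportional to the Lorentz force; schematically (a generic form of the technique, not necessarily the exact equations of the paper):

```latex
\[
\mathbf{v} \;=\; \frac{1}{\nu}\,\frac{(\nabla\times\mathbf{B})\times\mathbf{B}}{B^{2}},
\qquad
\frac{\partial\mathbf{A}}{\partial t} \;=\; \mathbf{v}\times\mathbf{B},
\]
```

where $\nu$ is a frictional coefficient and $\mathbf{A}$ the vector potential; the velocity vanishes as the field relaxes towards a force-free state, $(\nabla\times\mathbf{B})\times\mathbf{B} = \mathbf{0}$, so driving the boundary with observed magnetograms carries the corona through a sequence of such states.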
Abstract:
Introduction. Intravascular papillary endothelial hyperplasia (Masson's hemangioma or Masson's tumor) is a benign vascular disease with exuberant endothelial proliferation in normal blood vessels. Although relatively uncommon, its correct diagnosis is important because it can clinically mimic both benign lesions and malignant neoplasms. We present a case of intravascular papillary endothelial hyperplasia simulating a tendon cyst both clinically and on ultrasound. Case report. A 74-year-old Caucasian female presented with a 4-month history of soreness and swelling in the fourth finger of the right hand. Ultrasound showed an oval mass with fluid content, suggestive of a tendon cyst. A wide surgical excision was subsequently performed. The final histological diagnosis was Masson's tumor. Discussion. The pathogenesis of intravascular papillary endothelial hyperplasia is still unclear, but the exuberant endothelial cell proliferation might be stimulated by an autocrine loop of endothelial basic fibroblast growth factor (bFGF) secretion. There are three types of papillary endothelial hyperplasia: primary, or intravascular; secondary, or mixed; and extravascular. The main differential diagnoses are pyogenic granuloma, Kaposi sarcoma, hemangioma, and angiosarcoma. Conclusions. Masson's tumor can mimic both benign lesions and malignant neoplasms clinically and on ultrasound. For this reason, a definitive diagnosis can be made only by histology, which reveals a papillary growth composed of hyperplastic endothelial cells supported by delicate fibrous stalks entirely confined within the vascular lumen.
Abstract:
Background: The appearance of symptoms compatible with systemic autoimmune diseases has been described in relation to several viral infections, such as HIV, cytomegalovirus, and especially PVB19, depending on the evolution of the immunological condition of the host and their age. We present a young immunocompetent male patient with clinical manifestations simulating systemic lupus erythematosus (SLE) and important activation of cytokines. Methods: For quantification of the different cytokines in plasma, a commercially available multiplex bead immunoassay based on the Luminex platform (Cat # HSCYTO-60SK-08, Milliplex® MAP High Sensitivity, Millipore) was used according to the manufacturer's instructions. All samples were run in duplicate and the data (mean fluorescence intensity) were analyzed using a Luminex reader. The mean concentration was calculated using a standard curve. Results: The clinical evolution was favourable without the need for any specific treatment, showing complete recovery after two months. While the symptoms and viral load were disappearing, the anti-DNA titre continued to increase, and we demonstrate important activation of the IL-10, IL-6, and TNFα cytokines as a result of a hyperstimulating response by an immunocompetent hyperfunctional system, which persists after clinical improvement. We should emphasize the behaviour of two cytokines, IL-12p70 and IL-2, which showed opposite tendencies. Conclusions: Viral infections, especially PVB19, can produce or simulate several autoimmune diseases as a hyperstimulation response of an immunocompetent hyperfunctional system. Consequently, a persistent increase in autoantibodies and important activation of cytokines can be demonstrated even after clinical improvement and seroconversion.
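The step in which "the mean concentration was calculated using a standard curve" is typically a four-parameter logistic (4PL) fit of fluorescence against the standards, inverted for each sample. A minimal sketch of that interpolation follows; the parameter values in the usage example are invented for illustration and are not from this study.

```python
# Sketch of 4PL standard-curve interpolation, as commonly used for
# multiplex bead immunoassay data. Parameters a (zero-dose response),
# d (saturation response), c (mid-curve concentration) and b (slope)
# would normally be fitted to the standards; the values below are invented.

def fourpl(x, a, d, c, b):
    """Expected mean fluorescence intensity at concentration x."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def concentration(y, a, d, c, b):
    """Invert the 4PL curve: concentration that produces response y."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

# Hypothetical fitted parameters and a simulated sample reading:
params = dict(a=50.0, d=30000.0, c=100.0, b=1.2)
mfi = fourpl(80.0, **params)            # response of a sample at 80 pg/mL
recovered = concentration(mfi, **params)  # should recover ~80 pg/mL
```

Running samples in duplicate, as the abstract describes, would mean averaging the duplicate MFI values before this inversion.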
Abstract:
Thermal Diagnostics experiments to be carried out on board LISA Pathfinder (LPF) will yield a detailed characterisation of how temperature fluctuations affect the performance of the LTP (LISA Technology Package) instrument, crucial information for future space-based gravitational wave detectors such as the proposed eLISA. Among them, the study of temperature gradient fluctuations around the test masses of the Inertial Sensors will also provide information on the contribution of Brownian noise, which is expected to limit the LTP sensitivity at frequencies close to 1 mHz during some LTP experiments. In this paper we report on how this kind of Thermal Diagnostics experiment was simulated in the last LPF Simulation Campaign (November 2013), which involved the whole LPF Data Analysis team and used an end-to-end simulator of the entire spacecraft. The simulation campaign was conducted in the framework of the preparation for LPF operations.
Abstract:
Objectives: to report a case of hypereosinophilic syndrome that presented clinically as acute coronary syndrome. Materials and methods: we describe the case of a 69-year-old woman with acute coronary syndrome and peripheral hypereosinophilia. Results: the condition rapidly evolved to severe heart failure. Coronary disease was excluded by cardiac catheterization. Systemic corticosteroid therapy was initiated and further secondary causes of hypereosinophilia were excluded.
Abstract:
Intelligent agents offer a new and exciting way of understanding the world of work. In this paper we apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between human resource management practices and retail productivity. Although we are working within a relatively novel and complex domain, it is clear that intelligent agents offer potential for fostering sustainable organizational capabilities in the future. Our research so far has led us to conduct case study work with a top-ten UK retailer, collecting data in four departments in two stores. Based on our case study data, we have built and tested a first version of a department store simulator. In this paper we report on the current development of our simulator, which includes new features concerning more realistic data on the pattern of footfall during the day and the week, a more differentiated view of customers, and the evolution of customers over time. This allows us to investigate more complex scenarios and to analyze the impact of various management practices.
Abstract:
Fire is a frequent process in the landscapes of northern Portugal. Previous studies have shown that holm oak (Quercus rotundifolia) woodlands persist after the passage of fire and help to reduce its intensity and rate of spread. The main objectives of this study were to understand and model the effect of holm oak woodlands on fire behavior at the landscape level in the upper Sabor River basin, located in northeastern Portugal. The impact of holm oak woodlands on fire behavior was tested in terms of area and configuration according to scenarios simulating the possible distribution of these vegetation units in the landscape, considering holm oak cover percentages of 2.2% (Low), 18.1% (Moderate), 26.0% (High), and 39.8% (Rivers). These scenarios were designed to test 1) the role of holm oak woodlands in fire behavior and 2) how the configuration of holm oak patches can help to reduce fireline intensity and burned area. Fire behavior was modeled with FlamMap, simulating fireline intensity and rate of fire spread based on fuel models associated with each land use and land cover class in the study area, as well as on topographic (elevation, slope, and aspect) and climatic (humidity and wind speed) factors. Two fuel models were used for the holm oak cover (interior and edge areas), developed from field data collected in the region. The FRAGSTATS software was used to analyze the spatial patterns of the fireline intensity classes, using the metrics Class Area (CA), Number of Patches (NP), and Largest Patch Index (LPI). The results indicated that fireline intensity and rate of fire spread varied between scenarios and between fuel models for the holm oak woodland.
Mean fireline intensity and mean rate of fire spread decreased as the percentage of holm oak woodland area in the landscape increased. The CA, NP, and LPI metrics also varied between scenarios and between fuel models for the holm oak woodland, decreasing as the percentage of holm oak woodland area increased. This study showed that variation in the cover percentage and spatial configuration of holm oak woodlands influences fire behavior, reducing, on average, fireline intensity and rate of spread, and suggesting that holm oak woodlands can be used as a preventive silvicultural measure to reduce fire risk in this region.
Abstract:
Background: Reduced-representation sequencing technology is widely used in genotyping for its economy and efficiency. A popular way to construct reduced-representation sequencing libraries is to digest the genomic DNA with restriction enzymes. A key step of this method is choosing the restriction enzyme(s), but few computer programs can evaluate the usability of restriction enzymes in reduced-representation sequencing. SimRAD is an R package that can simulate the digestion of a DNA sequence by restriction enzymes and return the number of enzyme loci as well as the number of fragments. For linkage mapping analysis, however, the distribution of enzyme loci is also an important factor in evaluating an enzyme, and for phylogenetic studies, comparing enzyme performance across multiple genomes is important. A simulation tool implementing these functions is therefore strongly needed. Results: Here, we introduce a Perl module named RestrictionDigest with more functions and improved performance. It can analyze multiple genomes in one run and generate a concise comparison of enzyme performance across the genomes. It can simulate single-enzyme digestion, double-enzyme digestion, and the size selection process, and it generates comprehensive information on the simulation, including the number of enzyme loci, the number of fragments, the sequences of the fragments, the positions of restriction sites on the genome, the coverage of digested fragments on different genome regions, and a detailed fragment length distribution. Conclusions: RestrictionDigest is an easy-to-use Perl module with flexible parameter settings. With the help of the information produced by the module, researchers can easily determine the most appropriate enzymes for constructing reduced-representation libraries that meet their experimental requirements.
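The kind of in-silico digestion described above can be illustrated with a short sketch (in Python rather than Perl; the enzyme choice and the size-selection window are arbitrary examples, not the module's defaults):

```python
# Illustrative single-enzyme digestion with size selection, in the spirit
# of reduced-representation library simulation. EcoRI (G^AATTC) and the
# 300-500 bp window are arbitrary choices for this example.

def digest(genome, site, cut_offset):
    """Cut `genome` at every occurrence of `site`, cutting `cut_offset`
    bases into the recognition sequence. Returns (cut positions, fragments)."""
    positions = []
    start = genome.find(site)
    while start != -1:
        positions.append(start + cut_offset)
        start = genome.find(site, start + 1)
    fragments, prev = [], 0
    for cut in positions:
        fragments.append(genome[prev:cut])
        prev = cut
    fragments.append(genome[prev:])
    return positions, fragments

def size_select(fragments, lo, hi):
    """Keep only fragments whose length falls within [lo, hi]."""
    return [f for f in fragments if lo <= len(f) <= hi]

# Toy genome with two EcoRI sites flanking a ~400 bp stretch:
genome = "AAAGAATTC" + "T" * 400 + "GAATTCCCC"
cuts, frags = digest(genome, "GAATTC", 1)   # EcoRI cuts after the G
selected = size_select(frags, 300, 500)
```

A real evaluation would additionally report the site positions per chromosome, fragment coverage of genome regions, and the full fragment-length distribution, as the module does.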
Abstract:
Female genital tuberculosis remains a major health problem in developing countries and is an important cause of infertility. As symptoms, laboratory data, and physical findings are non-specific, its diagnosis can be difficult. We describe the case of a 39-year-old woman suffering from peri-umbilical pain and increased abdominal size for one year, anorexia, asthenia, weight loss, occasional dysuria and dyspareunia, and four months of amenorrhea. Laboratory data revealed a cancer antigen 125 (CA-125) level of 132.3 U/mL, an erythrocyte sedimentation rate of 42 mm/h, and gamma-globulins of 2.66 g/dL. A Computed Tomography scan showed loculated ascites. A carcinomatous origin was initially suspected, but ascites evaluation was negative for malignant cells. Magnetic Resonance Imaging from another hospital showed endometrial heterogeneity. Therefore, an endometrial biopsy was performed, demonstrating an inflammatory infiltrate with Langhans-type giant cells, and bacteriological culture identified Mycobacterium tuberculosis.
Abstract:
This dissertation is devoted to the equations of motion governing the evolution of a fluid or gas at the macroscopic scale. The classical model is a PDE description known as the Navier-Stokes equations. The behavior of solutions is notoriously complex, leading many in the scientific community to describe fluid mechanics using a statistical language. In the physics literature, this is often done in an ad hoc manner with limited precision about the sense in which the randomness enters the evolution equation. The stochastic PDE community has begun proposing precise models, in which a random perturbation appears explicitly in the evolution equation. Although this has been an active area of study in recent years, the existing literature is almost entirely devoted to incompressible fluids. The purpose of this thesis is to take a step forward in addressing this statistical perspective in the setting of compressible fluids. In particular, we study the well-posedness of the corresponding system of stochastic Navier-Stokes equations, satisfied by the density, velocity, and temperature. The evolution of the momentum involves a random forcing which is Brownian in time and colored in space. We allow for multiplicative noise, meaning that spatial correlations may depend locally on the fluid variables. Our main result is a proof of global existence of weak martingale solutions to the Cauchy problem set within a bounded domain, emanating from large initial data. The proof involves a mix of deterministic and stochastic analysis tools. Fundamentally, the approach is based on weak compactness techniques from the deterministic theory combined with martingale methods. Four layers of approximate stochastic PDEs are built and analyzed. A careful study of the probability laws of our approximating sequences is required. We prove appropriate tightness results and appeal to a recent generalization of the Skorokhod theorem.
This ultimately allows us to deduce analogues of the weak compactness tools of Lions and Feireisl, appropriately interpreted in the stochastic setting.
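Schematically, and suppressing the temperature equation, the stochastic compressible system described above takes the following form (a simplified sketch with notation chosen here for illustration, not the thesis's exact formulation):

```latex
\[
d\rho + \operatorname{div}(\rho\mathbf{u})\,dt = 0,
\]
\[
d(\rho\mathbf{u}) + \bigl[\operatorname{div}(\rho\mathbf{u}\otimes\mathbf{u})
  + \nabla p - \operatorname{div}\,\mathbb{S}(\nabla\mathbf{u})\bigr]\,dt
  = \sigma(\rho,\rho\mathbf{u})\,dW_t,
\]
```

where $\rho$ is the density, $\mathbf{u}$ the velocity, $\mathbb{S}$ the viscous stress tensor, $W$ a Brownian forcing colored in space, and the multiplicative coefficient $\sigma$ may depend locally on the fluid variables, as described above.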
Abstract:
Despite the wide swath of applications where multiphase fluid contact lines exist, there is still no consensus on an accurate and general simulation methodology. Most prior numerical work has imposed one of the many dynamic contact-angle theories at solid walls. Such approaches are inherently limited by the theory accuracy. In fact, when inertial effects are important, the contact angle may be history dependent and, thus, any single mathematical function is inappropriate. Given these limitations, the present work has two primary goals: 1) create a numerical framework that allows the contact angle to evolve naturally with appropriate contact-line physics and 2) develop equations and numerical methods such that contact-line simulations may be performed on coarse computational meshes.
Fluid flows affected by contact lines are dominated by capillary stresses and require accurate curvature calculations. The level set method was chosen to track the fluid interfaces because it allows interface curvature to be calculated easily and accurately. Unfortunately, level set reinitialization suffers from an ill-posed mathematical problem at contact lines: a "blind spot" exists. Standard techniques to handle this deficiency are shown to introduce parasitic velocity currents that artificially deform freely floating (non-prescribed) contact angles. As an alternative, a new relaxation-equation reinitialization is proposed to remove these spurious velocity currents, and the concept is further explored with level-set extension velocities.
To capture contact-line physics, two classical boundary conditions, the Navier-slip velocity boundary condition and a fixed contact angle, are implemented in direct numerical simulations (DNS). DNS are found to converge only if the slip length is well resolved by the computational mesh. Unfortunately, since the slip length is often very small compared to the fluid structures, these simulations are not computationally feasible for large systems. To address the second goal, a new methodology is proposed which relies on the volumetric-filtered Navier-Stokes equations. Two unclosed terms, an average curvature and a viscous shear term, are proposed to represent the missing microscale physics on a coarse mesh.
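For reference, the Navier-slip condition mentioned above prescribes, at the wall, a tangential velocity proportional to the tangential shear, with the slip length $\lambda$ as the proportionality constant:

```latex
\[
\mathbf{u}_t \;=\; \lambda\,\frac{\partial \mathbf{u}_t}{\partial n}\bigg|_{\mathrm{wall}},
\qquad
\mathbf{u}\cdot\mathbf{n} \;=\; 0,
\]
```

where $\mathbf{u}_t$ is the tangential velocity and $\mathbf{n}$ the wall normal; resolving $\lambda$ on the mesh is precisely what makes converged DNS so expensive when $\lambda$ is much smaller than the fluid structures.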
All of these components are then combined into a single framework and tested for a water droplet impacting a partially wetting substrate. Very good agreement is found between the experimental measurements and the numerical simulation for the evolution of the contact diameter in time. Such a comparison would not be possible with prior methods, since the Reynolds number Re and capillary number Ca are large. Furthermore, the experimentally approximated slip length ratio is well outside the range currently achievable by DNS. This framework is a promising first step towards simulating complex physics in capillary-dominated flows at a reasonable computational expense.
Abstract:
Multi-agent systems offer a new and exciting way of understanding the world of work. We apply agent-based modeling and simulation to investigate a set of problems in a retail context. Specifically, we are working to understand the relationship between people management practices on the shop floor and retail performance. Although we are working within a relatively novel and complex domain, it is clear that an agent-based approach offers great potential for improving organizational capabilities in the future. Our multi-disciplinary research team has worked closely with one of the UK's top ten retailers to collect data and build an understanding of shop-floor operations and the key actors in a department (customers, staff, and managers). Based on this case study we have built and tested the first version of a retail branch agent-based simulation model, in which we have focused on how to simulate the effects of people management practices on customer satisfaction and sales. In our experiments we have looked at employee development and cashier empowerment as two examples of shop-floor management practices. In this paper we describe the underlying conceptual ideas and the features of our simulation model. We present a selection of experiments we have conducted to validate our simulation model and to show its potential for answering “what-if” questions in a retail context. We also introduce a novel performance measure which we have created to quantify customers’ satisfaction with service, based on their individual shopping experiences.
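To make the individual-experience satisfaction idea concrete, here is a deliberately minimal, hypothetical sketch: a single cashier serves a queue, each customer remembers how long they waited, and satisfaction is the fraction of served customers whose wait stayed within their patience. All parameters (arrival rate, service times, patience) are invented; the authors' simulator is far more detailed.

```python
import random

random.seed(42)  # deterministic toy run

def simulate(service_time, minutes=480, arrival_prob=0.3, patience=10):
    """Return the fraction of served customers whose wait stayed within
    their patience: a crude individual-experience satisfaction measure.
    Customers still queuing at closing time are not counted."""
    queue, satisfied, served = [], 0, 0
    busy_until = 0
    for t in range(minutes):
        if random.random() < arrival_prob:
            queue.append(t)  # record each customer's arrival time
        if queue and t >= busy_until:
            arrived = queue.pop(0)
            busy_until = t + service_time
            served += 1
            if t - arrived <= patience:
                satisfied += 1
    return satisfied / served if served else 0.0

# A faster cashier (e.g. better trained or more empowered) should score
# at least as well on this measure as a slower one:
slow = simulate(service_time=5)
fast = simulate(service_time=2)
```

Experiments like the employee-development and cashier-empowerment scenarios described above would, in this toy framing, correspond to changing the service-time distribution and observing the satisfaction measure.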