995 results for flow analyses
Abstract:
When a computer program requires legitimate access to confidential data, the question arises whether such a program may illegally reveal sensitive information. This paper proposes a policy model to specify what information flow is permitted in a computational system. The security definition, which is based on a general notion of information lattices, allows various representations of information to be used in the enforcement of secure information flow in deterministic or nondeterministic systems. A flexible semantics-based analysis technique is presented, which uses the input-output relational model induced by an attacker's observational power to compute the information released by the computational system. An illustrative attacker model demonstrates the use of the technique to develop a termination-sensitive analysis. The technique allows the development of various information flow analyses, parametrised by the attacker's observational power, which can be used to enforce 'what' declassification policies, i.e. policies specifying what information may be released.
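As a minimal illustration of the lattice-based view of permitted flows described above (not the paper's actual policy model), the sketch below checks whether information at one security level may flow to another in a simple two-point lattice; all names and levels are hypothetical.

```python
# Minimal sketch of a lattice-based flow check (illustrative only; the
# paper's general information lattices are richer than this two-point example).
from itertools import product

# Hypothetical two-level lattice: Low may flow to High, but not vice versa.
LEVELS = {"Low": 0, "High": 1}

def may_flow(source_level: str, sink_level: str) -> bool:
    """Permit a flow only if the source is no more confidential than the sink."""
    return LEVELS[source_level] <= LEVELS[sink_level]

if __name__ == "__main__":
    for src, dst in product(LEVELS, LEVELS):
        print(f"{src} -> {dst}: {'allowed' if may_flow(src, dst) else 'forbidden'}")
```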
Abstract:
Conservative medical treatment is commonly recommended as the first-line option for patients with uncomplicated Type-B aortic dissection (AD). However, if dissection-related complications occur, endovascular repair or open surgery is performed. Here we establish computational models of AD based on radiological three-dimensional images of a patient at initial presentation and after 4 years of best medical treatment (BMT). Computational fluid dynamics analyses are performed to quantitatively investigate the hemodynamic features of AD. The entry and re-entries (functioning as inlets and outlets) are identified in the initial and follow-up models, and marked variations in the inter-luminal flow exchange are revealed. Computational studies indicate that the reduction of blood pressure in BMT patients lowers pressure and wall shear stress in the thoracic aorta in general, and flattens the pressure distribution on the outer wall of the dissection, potentially reducing the progressive enlargement of the false lumen. Finally, scenario studies of endovascular aortic repair are conducted. The results indicate that, for patients with multiple tears, stent-grafts occluding all re-entries would be required to effectively reduce inter-luminal blood communication and thus induce thrombosis in the false lumen. This implies that computational flow analyses may identify entries and relevant re-entries between the true and false lumina and potentially assist in stent-graft planning.
Abstract:
A mathematical model is presented for steady fluid flow across microvessel walls through a serial pathway consisting of the endothelial surface glycocalyx and the intercellular cleft between adjacent endothelial cells, with junction strands and their discontinuous gaps. The three-dimensional flow through the pathway from the vessel lumen to the tissue space has been computed numerically based on a Brinkman equation with appropriate values of the Darcy permeability. The predicted values of the hydraulic conductivity Lp, defined as the ratio of the flow rate per unit surface area of the vessel wall to the pressure drop across it, are close to experimental measurements for rat mesentery microvessels. If the values of the Darcy permeability for the surface glycocalyx are determined from the regular arrangement of fibres with 6 nm radius and 8 nm spacing recently proposed on the basis of detailed structural measurements, then the present study suggests that the surface glycocalyx could be much less resistant to flow than previous one-dimensional flow analyses estimated, and that the intercellular cleft could be a major determinant of the hydraulic conductivity of the microvessel wall.
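Because the pathway is modelled as the glycocalyx and the intercellular cleft in series, their hydraulic resistances (1/Lp) add. The following is a minimal sketch of that series combination with purely illustrative conductivity values, not taken from the study, showing how the less conductive element can dominate the overall Lp.

```python
# Illustrative series-pathway calculation; the numbers below are invented
# for demonstration and are not the study's results.
def series_lp(lp_glycocalyx: float, lp_cleft: float) -> float:
    """Hydraulic conductivities in series: resistances (1/Lp) add."""
    return 1.0 / (1.0 / lp_glycocalyx + 1.0 / lp_cleft)

# Hypothetical values in cm/(s*cmH2O), chosen so the cleft is the less
# conductive (more resistant) element of the pathway.
lp_glx = 5e-7
lp_cleft = 1e-7

lp_wall = series_lp(lp_glx, lp_cleft)
resistance_share_cleft = (1 / lp_cleft) / (1 / lp_glx + 1 / lp_cleft)
print(f"Overall Lp ≈ {lp_wall:.2e} cm/(s*cmH2O)")
print(f"Fraction of total resistance in the cleft: {resistance_share_cleft:.0%}")
```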
Abstract:
This study describes the pedagogical impact of real-world experimental projects undertaken as part of an advanced undergraduate Fluid Mechanics subject at an Australian university. The projects have been organised to complement traditional lectures and introduce students to the challenges of professional design, physical modelling, data collection and analysis. The physical model studies combine experimental, analytical and numerical work in order to develop students' abilities to tackle real-world problems. A first study illustrates the differences between ideal- and real-fluid flow force predictions based upon model tests of buildings in a large wind tunnel used for research and professional testing. A second study introduces the complexity arising from unsteady non-uniform wave loading on a sheltered pile. The teaching initiative is supported by feedback from undergraduate students. The pedagogy of the course and projects is discussed with reference to experiential, project-based and collaborative learning. The practical work complements traditional lectures and tutorials, and provides opportunities that cannot be learnt in the classroom, real or virtual. Student feedback demonstrates a strong interest in the project phases of the course. This was associated with greater motivation for the course, leading in turn to lower failure rates. In terms of learning outcomes, the primary aim is to enable students to deliver a professional report as the final product, in which physical model data are compared to ideal-fluid flow calculations and real-fluid flow analyses. Thus the students are exposed to a professional design approach involving a high level of expertise in fluid mechanics, with sufficient academic guidance to achieve carefully defined learning goals, while retaining sufficient flexibility for students to construct their own learning goals. The overall pedagogy is a blend of problem-based and project-based learning, which reflects academic research and professional practice. The assessment is a mix of peer-assessed oral presentations and written reports that aims to maximise student reflection and development. Student feedback indicated a strong motivation for courses that include a well-designed project component.
Abstract:
This project aimed to use Lean tools and concepts to analyse and implement process improvements in a manufacturing environment, with the goal of improving the production system in several sections of the company SCHMITT+SOHN Elevadores. The first phase sought to improve communication between the various manufacturing sectors as well as the flow of materials, and to simplify the operators' tasks at their workstations. To this end, an initial phase was devoted to collecting and analysing the data required for the proposed work. Several VSMs and flow analyses were carried out in two sections of the company in order to study the current situation and subsequently implement a Kanban system. Problems regarding stocks and the storage of raw material and miscellaneous materials were also analysed, and several solutions were put in place. The work presented here was strongly constrained by requirements set by the company, with our study and practical applications supported for only 3 months. During the remaining period we were limited by the company to activities aimed at understanding its operation, training in Kaizen, and preparing for the VSM and Kanban implementations. At this stage it is still premature to validate the results obtained in the sections, since the implementation of the Kanban system is still under development.
Abstract:
The papermaking industry has been continuously developing intelligent solutions to characterize the raw materials it uses, to control the manufacturing process in a robust way, and to guarantee the desired quality of the end product. With much-improved imaging techniques and image-based analysis methods, it has become possible to look inside the manufacturing pipeline and propose more effective alternatives to human expertise. This study is focused on the development of image analysis methods for the pulping process of papermaking. Pulping starts with wood disintegration and the forming of the fiber suspension, which is subsequently bleached, mixed with additives and chemicals, and finally dried and shipped to the papermaking mills. At each stage of the process it is important to analyze the properties of the raw material to guarantee the product quality. In order to evaluate the properties of fibers, the main component of the pulp suspension, a framework for fiber characterization based on microscopic images is proposed in this thesis as the first contribution. The framework allows computation of fiber length and curl index that correlate well with ground truth values. The bubble detection method, the second contribution, was developed in order to estimate the gas volume at the delignification stage of the pulping process based on high-resolution in-line imaging. The gas volume was estimated accurately and the solution enabled just-in-time process termination, whereas the accurate estimation of bubble size categories remained challenging. As the third contribution of the study, optical flow computation was studied and the methods were successfully applied to pulp flow velocity estimation based on double-exposed images. Finally, a framework for classifying dirt particles in dried pulp sheets, including semisynthetic ground truth generation, feature selection, and a performance comparison of state-of-the-art classification techniques, was proposed as the fourth contribution. The framework was successfully tested on semisynthetic and real-world pulp sheet images. Together, these four contributions assist in developing integrated, factory-level, vision-based process control.
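As a rough illustration of the kind of optical flow computation mentioned above for pulp flow velocity estimation, the sketch below applies OpenCV's dense Farnebäck method to two consecutive frames; the file names and parameter values are hypothetical, and this is not the thesis's actual implementation.

```python
# Illustrative dense optical flow between two frames (hypothetical files);
# not the thesis's method, just a common off-the-shelf approach.
import cv2
import numpy as np

prev = cv2.imread("pulp_frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("pulp_frame_001.png", cv2.IMREAD_GRAYSCALE)

# Farnebäck dense optical flow: returns an (H, W, 2) array of per-pixel
# displacements, in pixels, between the two exposures.
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

speed = np.linalg.norm(flow, axis=2)  # displacement magnitude per pixel
print("Mean displacement (px/frame):", speed.mean())
# Converting to a physical velocity would require the pixel size and the
# inter-exposure time, both of which are setup-specific.
```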
Abstract:
The Southwest Indian Ridge segment that extends between 10° and 16° E has the slowest spreading rate of any oceanic ridge (about 8.4 mm/year). In 2013, during expedition ANTXXIX/8, seismology, geology, microbiology and heat flow analyses were carried out. No hydrothermal plumes or black smoker systems were found here, but the results of the survey allowed areas with peculiar characteristics to be identified: Area 1, with higher heat flux bsf; Area 2, where in 2002 the presence of hydrothermal emissions was hypothesized (Bach et al., 2002); and Area 3, with anomalies of methane, ammonium, sulphide and dissolved inorganic carbon in pore water sediment profiles, and recovery of vent fauna. All these aspects suggest the presence of a hydrothermal circulation. Using Illumina 16S gene tag sequencing, statistical tools and phylogenetic trees, I provide biological evidence for the presence of hydrothermal circulation in this ridge segment. At Area 3, alpha and beta diversity indices showed similarities with those described for venting microbial communities, and about 40-70% of the dominant microbial community was found to be phylogenetically related to clones isolated from hydrothermally driven environments. Although the majority of the chemosynthetic-environment-related taxa were not classified as autotrophic prokaryotes, some of them are key taxa supporting the presence of hydrothermal circulation, since they are partners of consortia, mediate specific reactions typically described for hydrothermal and seep environments, or are organisms specialized in exploiting labile organic substrates. In conclusion, these results are remarkable because they support the importance of ultraslow spreading ridge systems in contributing to global geochemical cycles and to the larval dispersion of vent fauna.
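To illustrate the kind of alpha diversity comparison mentioned above, the following is a minimal sketch computing the Shannon index from a table of 16S read counts; the counts are invented and this is not the study's pipeline or data.

```python
# Illustrative alpha diversity (Shannon index) from made-up 16S read counts;
# not the study's data or pipeline.
import numpy as np

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

samples = {
    "Area1_sediment": [120, 30, 5, 0, 2],    # hypothetical per-taxon read counts
    "Area3_sediment": [40, 38, 35, 30, 25],  # more even community -> higher H'
}
for name, counts in samples.items():
    print(f"{name}: H' = {shannon(counts):.2f}")
```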
Abstract:
A reliable assessment of relevant substance flows is very important for environmental risk assessments and for efficiency analyses of measures to reduce or avoid emissions of micropollutants such as drugs to water systems. Accordingly, a detailed preparation of monitoring campaigns should include an accuracy check of the sampling configuration to prove the reliability of the monitoring results and the subsequent data processing. The accuracy of substance flow analyses is expected to be particularly weak for substances with high short-term variations of concentrations in sewage. This is especially the case when observing substance flows close to the source in wastewater systems. The verification of a monitoring configuration in a hospital sewer in Luxembourg is the focus of the case study presented here. A tracer test in the sewer system under observation is an essential element of the suggested accuracy check and provides valuable information for an uncertainty analysis. The results illustrate the importance of accuracy checks as an essential element of the preparation of monitoring campaigns. Moreover, the study shows that continuous flow-proportional sampling enables a representative observation of short-term peak loads of the iodinated X-ray contrast medium iobitridol close to the source.
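As a minimal illustration of the flow-proportional load estimation underlying such monitoring, the sketch below integrates concentration times flow over sampling intervals to obtain a period load and a flow-weighted mean concentration; the values are invented and the calculation is not taken from the study.

```python
# Illustrative flow-proportional load calculation with invented values;
# not the study's monitoring data.
intervals = [
    # (interval duration in s, mean flow in L/s, concentration in µg/L)
    (3600, 2.0, 15.0),
    (3600, 3.5, 40.0),   # hypothetical short-term peak load
    (3600, 1.8, 8.0),
]

load_ug = sum(dt * q * c for dt, q, c in intervals)   # µg over the monitored period
volume_l = sum(dt * q for dt, q, _ in intervals)      # L over the monitored period
print(f"Load: {load_ug / 1e6:.2f} g, flow-weighted mean concentration: "
      f"{load_ug / volume_l:.1f} µg/L")
```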
Abstract:
Integrated master's dissertation in Biomedical Engineering (specialisation in Clinical Engineering)
Abstract:
BACKGROUND: According to recent guidelines, patients with coronary artery disease (CAD) should undergo revascularization if significant myocardial ischemia is present. Both cardiovascular magnetic resonance (CMR) and fractional flow reserve (FFR) allow for a reliable ischemia assessment, and in combination with the anatomical information provided by invasive coronary angiography (CXA), such a work-up sets the basis for the decision to revascularize or not. The cost-effectiveness ratios of these two strategies are compared. METHODS: Strategy 1) CMR to assess ischemia followed by CXA in ischemia-positive patients (CMR + CXA); Strategy 2) CXA followed by FFR in angiographically positive stenoses (CXA + FFR). The costs, evaluated from the third-party payer perspective in Switzerland, Germany, the United Kingdom (UK), and the United States (US), included public prices of the different outpatient procedures and costs induced by procedural complications and by diagnostic errors. The effectiveness criterion was the correct identification of hemodynamically significant coronary lesion(s) (= significant CAD) complemented by full anatomical information. Test performances were derived from the published literature. Cost-effectiveness ratios for both strategies were compared for hypothetical cohorts with different pretest likelihoods of significant CAD. RESULTS: CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 65% in Germany, 83% in the UK, and 82% in the US, with costs of CHF 5'794, euro 1'517, £ 2'680, and $ 2'179 per patient correctly diagnosed. Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. CONCLUSIONS: The CMR + CXA strategy is more cost-effective than CXA + FFR below a CAD prevalence of 62%, 65%, 83%, and 82% for the Swiss, German, UK, and US health care systems, respectively. These findings may help to optimize resource utilization in the diagnosis of CAD.
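To illustrate how such a break-even pretest likelihood can arise, the sketch below computes cost per correctly diagnosed patient for two generic strategies as a function of disease prevalence; all sensitivities, specificities and costs are hypothetical placeholders and are not the values or cost model used in the study.

```python
# Illustrative cost-effectiveness comparison; every number below is a
# hypothetical placeholder, not a figure from the study.
import numpy as np

def cost_per_correct(prevalence, sens, spec, cost):
    """Cost per correctly classified patient for a given test strategy."""
    correct = prevalence * sens + (1 - prevalence) * spec
    return cost / correct

prev = np.linspace(0.05, 0.95, 91)
strategy_a = cost_per_correct(prev, sens=0.70, spec=0.95, cost=1000.0)  # e.g. a noninvasive-first work-up
strategy_b = cost_per_correct(prev, sens=0.98, spec=0.80, cost=1300.0)  # e.g. an invasive-first work-up

crossover = prev[np.argmin(np.abs(strategy_a - strategy_b))]
print(f"Strategies break even near a pretest likelihood of {crossover:.0%} "
      f"(with these made-up inputs)")
```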
Abstract:
A new compact system encompassing an in-flow gas diffusion unit and a wall-jet amperometric FIA detector, coated with a supramolecular porphyrin film, was specially designed as an alternative to the time-consuming Monier-Williams method, allowing fast, reproducible and accurate analyses of free sulphite species in fruit juices. A linear response between 0.64 and 6.4 ppm of sodium sulphite, an LOD of 0.043 ppm, a relative standard deviation of ±1.5% (n = 10) and an analytical frequency of 85 analyses/h were obtained under optimised conditions. This superior analytical performance allows precise evaluation of the amount of free sulphite present in foods, and provides an important comparison between the standard addition and standard injection methods. Although the former is the more frequently used, it was strongly influenced by matrix effects because of the unexpected reactivity of sulphite ions with the juice matrices, leading to partial consumption of the analyte soon after addition. In contrast, the latter method was not susceptible to matrix effects and yielded accurate results, making it more reliable for analytical purposes. (C) 2011 Elsevier Ltd. All rights reserved.
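As a generic illustration of how such a linear calibration range and detection limit are typically established (not the paper's actual data), the sketch below fits a line to hypothetical standard responses and estimates the LOD with the common 3 x blank standard deviation / slope criterion.

```python
# Illustrative calibration and LOD estimation with invented standards;
# not the paper's measurements.
import numpy as np

conc = np.array([0.64, 1.6, 3.2, 4.8, 6.4])    # ppm sodium sulphite (hypothetical standards)
signal = np.array([0.9, 2.2, 4.4, 6.5, 8.7])   # hypothetical detector response (µA)
blank_sd = 0.02                                # hypothetical blank noise (µA)

slope, intercept = np.polyfit(conc, signal, 1)
r = np.corrcoef(conc, signal)[0, 1]
lod = 3 * blank_sd / slope                     # common 3*sigma/slope criterion

print(f"slope = {slope:.2f} µA/ppm, intercept = {intercept:.2f} µA, r = {r:.4f}")
print(f"estimated LOD ≈ {lod:.3f} ppm")
```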
Abstract:
As land is developed, the impervious surfaces that are created increase the amount of runoff during rainfall events, disrupting the natural hydrologic cycle and increasing both the volume of runoff and the pollutant loadings. Pollutants deposited on or derived from activities on the land surface, such as nutrients, sediment, heavy metals, hydrocarbons, gasoline additives, pathogens, deicers, herbicides and pesticides, are likely to end up in stormwater runoff in some concentration. Several of these pollutants are particulate-bound, so sediment removal can provide significant water-quality improvements, and it is important to know how well stormwater treatment devices retain particulate matter. For this reason, three different sediment-removal units have been tested in the laboratory. In particular, a roadside gully pot has been tested under steady hydraulic conditions, varying the characteristics of the influent solids (diameter, particle size distribution and specific gravity). The efficiency in terms of particles retained has been evaluated as a function of influent flow rate and particle characteristics; the results have been compared to the efficiency predicted by an overflow rate model. Furthermore, the role of particle settling velocity in determining efficiency has been investigated. After the experimental runs on the gully pot, a standard full-scale model of a hydrodynamic separator (HS) has been tested under unsteady influent flow rate conditions with constant solid concentration at the input. The results presented in this study illustrate that the particle separation efficiency of the unit is predominantly influenced by the operating flow rate, which strongly affects the particle and hydraulic residence times of the system. The efficiency data have been compared to results obtained from a modified overflow rate model; moreover, the residence time distribution has been experimentally determined through tracer analyses for several steady flow rates. Finally, three testing experiments have been performed for two different configurations of a full-scale model of a clarifier (linear and crenulated) under unsteady influent flow rate conditions with constant solid concentration at the input. The results illustrate that the particle separation efficiency of the unit is predominantly influenced by the configuration of the unit itself. Turbidity measurements have been compared with the suspended sediment concentration in order to find a correlation between the two values, which would allow the sediment concentration to be estimated simply by installing a turbidity probe.
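As a minimal illustration of the overflow rate (surface loading) model referred to above, the sketch below estimates the removal fraction of a particle class as the ratio of its settling velocity to the surface loading rate, capped at 1; the particle and unit parameters are invented, and this is not the modified model applied in the study.

```python
# Illustrative ideal overflow-rate (surface loading) removal model with
# invented parameters; not the modified model applied in the study.
def overflow_rate_efficiency(settling_velocity, flow_rate, surface_area):
    """Removal fraction = v_s / (Q / A), capped at 1 (ideal settling model)."""
    surface_loading = flow_rate / surface_area   # m/s
    return min(1.0, settling_velocity / surface_loading)

# Hypothetical unit: 1.2 m2 settling surface, 2 L/s inflow.
area_m2 = 1.2
q_m3s = 2.0e-3

for v_s in (1e-4, 5e-4, 2e-3):   # settling velocities in m/s (fine to coarse classes)
    eff = overflow_rate_efficiency(v_s, q_m3s, area_m2)
    print(f"v_s = {v_s:.1e} m/s -> removal ≈ {eff:.0%}")
```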
Abstract:
Long-term management plans for restoration of natural flow conditions through the Everglades increase the importance of understanding potential nutrient impacts of increased freshwater delivery on Florida Bay biogeochemistry. Planktonic communities respond quickly to changes in water quality, so spatial variability in community composition and its relationships to nutrient parameters must be understood in order to evaluate future downstream impacts of modifications to Everglades hydrology. Here we present initial results combining flow cytometry analyses of phytoplankton and bacterial populations (0.1–50 μm size fraction) with measurements of δ13C and δ15N composition and dissolved inorganic nutrient concentrations to explore proxies for planktonic species assemblage composition and nutrient cycling. Particulate organic material in the 0.1–50 μm size fraction was collected from five stations in Northeastern and Western Florida Bay to characterize spatial variability in species assemblage and stable isotopic composition. A dense bloom of the picocyanobacterium Synechococcus elongatus was observed at Western Florida Bay sites. Smaller Synechococcus sp. were present at Northeast sites in much lower abundance. Bacteria and detrital particles were also more abundant at Western Florida Bay stations than in the northeast region. The highest abundance of detritus occurred at Trout Creek, which receives freshwater discharge from the Everglades through Taylor Slough. In terms of nutrient availability and stable isotopic values, the S. elongatus population in the Western bay corresponded to low DIN concentrations (0.5 μM NH₄⁺; 0.2 μM NO₃⁻) and depleted δ15N signatures ranging from +0.3 to +0.8‰, suggesting that the bloom supported high productivity levels through N2 fixation. δ15N values from the Northeast bay were more enriched (+2.0 to +3.0‰), characteristic of N recycling. δ13C values were similar for all marine Florida Bay stations, ranging from −17.6 to −14.4‰, but were more depleted at the mangrove ecotone station (−25.5 to −22.3‰). The difference in the isotopic values reflects differences in carbon sources. These findings imply that variations in resource availability and nutrient sources exert significant control over planktonic community composition, which is reflected in the stable isotopic signatures.
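To illustrate how a difference in δ13C can be read as a difference in carbon sources, the sketch below applies a standard two-end-member isotope mixing calculation; the end-member values are generic placeholders (mangrove-derived vs. marine phytoplankton carbon) and are not the study's source assignments.

```python
# Illustrative two-end-member delta-13C mixing calculation; end-member
# values are generic placeholders, not the study's source assignments.
def mixing_fraction(delta_sample, delta_source_a, delta_source_b):
    """Fraction of carbon from source A assuming linear two-source mixing."""
    return (delta_sample - delta_source_b) / (delta_source_a - delta_source_b)

d13c_mangrove = -28.0   # hypothetical terrestrial/mangrove end-member (permil)
d13c_marine = -15.0     # hypothetical marine phytoplankton end-member (permil)

for sample_d13c in (-24.0, -16.0):
    f = mixing_fraction(sample_d13c, d13c_mangrove, d13c_marine)
    print(f"d13C = {sample_d13c} permil -> ~{f:.0%} mangrove-derived carbon "
          f"(with these placeholder end-members)")
```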
Field data, numerical simulations and probability analyses to assess lava flow hazards at Mount Etna
Abstract:
Improving lava flow hazard assessment is one of the most important and challenging fields of volcanology, and has an immediate and practical impact on society. Here, we present a methodology for the quantitative assessment of lava flow hazards based on a combination of field data, numerical simulations and probability analyses. With the extensive data available on historic eruptions of Mt. Etna, going back over 2000 years, it has been possible to construct two hazard maps, one for flank and the other for summit eruptions, allowing a quantitative analysis of the most likely future courses of lava flows. The effective use of hazard maps of Etna may help in minimizing the damage from volcanic eruptions through correct land use in a densely urbanized area with a population of almost one million people. Although this study was conducted on Mt. Etna, the approach used is designed to be applicable to other volcanic areas.
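As a rough sketch of how simulated lava flow footprints and vent activation probabilities can be combined into a hazard map (a generic scheme, not the paper's specific methodology), each grid cell's hazard can be taken as the probability-weighted sum of the simulated flows that reach it.

```python
# Generic probability-weighted inundation map from simulated flow footprints;
# the vents, probabilities and footprints below are invented for illustration.
import numpy as np

grid_shape = (5, 5)

# Hypothetical per-vent activation probabilities over the period of interest.
vent_probability = {"vent_A": 0.05, "vent_B": 0.02}

# Hypothetical boolean footprints: True where the simulated flow covers a cell.
footprints = {
    "vent_A": np.zeros(grid_shape, dtype=bool),
    "vent_B": np.zeros(grid_shape, dtype=bool),
}
footprints["vent_A"][2:, 1:3] = True
footprints["vent_B"][0:3, 3:] = True

hazard = np.zeros(grid_shape)
for vent, p in vent_probability.items():
    hazard += p * footprints[vent]   # probability that a flow from this vent reaches the cell

print("Cell-wise inundation probability:")
print(hazard)
```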
Abstract:
Hedgerows represent important components of agri-environment landscapes that are increasingly coming under threat from climate change, emergent diseases, invasive species and land use change. Given that population genetic data can be used to inform best-practice management strategies for woodland and hedgerow tree species, we carried out a study on hawthorn (Crataegus monogyna Jacq.), a key component of hedgerows, on a regional basis using a combination of nuclear and chloroplast microsatellite markers. We found that levels of genetic diversity were high and comparable to, or slightly higher than, other tree species from the same region. Levels of population differentiation for both sets of markers, however, were extremely low, suggesting extensive gene flow via both seed and pollen. These findings suggest that a holistic approach to woodland management, one which does not necessarily rely on the concept of “seed zones” previously suggested, but which also takes into account populations with high and/or rare chloroplast (i.e. seed-specific) genetic variation, might be the best approach to restocking and replanting.
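As a generic illustration of how population differentiation is quantified in studies of this kind (not the statistic or marker data actually used here), the sketch below computes Wright's F_ST for a single biallelic locus from hypothetical subpopulation allele frequencies; similar frequencies across populations give the kind of low differentiation reported above.

```python
# Minimal sketch of Wright's F_ST from allele frequencies at one biallelic
# locus; frequencies are invented and this is not the study's marker data.
import numpy as np

def fst(subpop_allele_freqs):
    """F_ST = (H_T - H_S) / H_T for one biallelic locus, equal subpopulation sizes."""
    p = np.asarray(subpop_allele_freqs, dtype=float)
    h_s = np.mean(2 * p * (1 - p))    # mean expected heterozygosity within subpopulations
    p_bar = p.mean()
    h_t = 2 * p_bar * (1 - p_bar)     # expected heterozygosity in the pooled population
    return (h_t - h_s) / h_t

print(f"F_ST ≈ {fst([0.48, 0.50, 0.53]):.3f}  # similar frequencies -> low differentiation")
print(f"F_ST ≈ {fst([0.10, 0.50, 0.90]):.3f}  # divergent frequencies -> high differentiation")
```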