946 results for Piecewise Interpolation
Abstract:
This thesis introduces the L1 Adaptive Control Toolbox, a set of tools implemented in Matlab that aid in the design of an L1 adaptive controller and enable the user to construct simulations of the closed-loop system to verify its performance. Following a brief review of the existing theory on L1 adaptive controllers, the interface of the toolbox is presented, including a description of the functions accessible to the user. Two novel algorithms for determining the required sampling period of a piecewise constant adaptive law are presented, and their implementation in the toolbox is discussed. A detailed description of the structure of the toolbox is provided, together with a discussion of how the creation of simulations is implemented. Finally, the graphical user interface is presented and described in detail, including the graphical design tools provided for the development of the filter C(s). The thesis closes with suggestions for further improvement of the toolbox.
Abstract:
fuzzySim is an R package for calculating fuzzy similarity in species occurrence patterns. It includes functions for data preparation, such as converting species lists (long format) to presence-absence tables (wide format), obtaining unique abbreviations of species names, or transposing (parts of) complex data frames, as well as sample data sets that provide practical examples. It can convert binary presence-absence to fuzzy occurrence data, using, for example, trend surface analysis, inverse distance interpolation or prevalence-independent environmental favourability modelling, for multiple species simultaneously. It then calculates fuzzy similarity among (fuzzy) species distributions and/or among (fuzzy) regional species compositions. Currently available similarity indices are Jaccard, Sørensen, Simpson, and Baroni-Urbani & Buser.
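The fuzzy indices named above generalise their binary counterparts by replacing presence counts with sums of membership values. As a minimal illustrative sketch (in Python rather than the package's own R, with made-up membership values), fuzzy Jaccard and Sørensen similarities between two fuzzy occurrence vectors can be computed as:

```python
import numpy as np

def fuzzy_jaccard(x, y):
    """Fuzzy Jaccard: shared membership over combined membership."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.minimum(x, y).sum() / np.maximum(x, y).sum()

def fuzzy_sorensen(x, y):
    """Fuzzy Sørensen (Dice): twice the shared membership over the total."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return 2.0 * np.minimum(x, y).sum() / (x.sum() + y.sum())

# Hypothetical fuzzy occurrence (e.g. favourability) of two species at 4 sites
sp1 = [0.9, 0.4, 0.0, 0.7]
sp2 = [0.8, 0.1, 0.2, 0.6]
print(fuzzy_jaccard(sp1, sp2), fuzzy_sorensen(sp1, sp2))
```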
Abstract:
Currently, remote sensors and high-performance computers are the main instruments used to collect and produce oceanographic data. With these data, it is possible to carry out studies that simulate and predict the behaviour of the ocean by means of regional numerical models. Among the important factors in oceanographic studies are those related to environmental impacts, anthropogenic contamination, the use of renewable energy, port operations, and so on. However, given the large volume of data generated by environmental institutions, in the form of results from global models such as HYCOM (Hybrid Coordinate Ocean Model) and from the Reanalysis programs of NOAA (National Oceanic and Atmospheric Administration), computational routines become necessary to process initial and boundary conditions so that they can be applied to regional models such as TELEMAC3D (www.opentelemac.org). Problems related to low resolution, missing data, and the need to interpolate to different meshes or vertical coordinate systems call for a computational mechanism that performs this processing adequately. To this end, routines were developed in the Python programming language, employing nearest-neighbour interpolators, so that initial and boundary conditions for a numerical test simulation could be prepared from raw data of the HYCOM model and the NOAA Reanalysis program. These results were compared against another numerical result whose conditions were built with a more sophisticated interpolation method, written in another language, which was already in use at the laboratory. The analysis of the results showed that the routine developed in this work performs adequately for generating initial and boundary conditions for the TELEMAC3D model. Nevertheless, a more sophisticated interpolator should be developed in order to increase the quality of the interpolations, optimise the computational cost, and produce conditions that are more realistic for use with the TELEMAC3D model.
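The routines themselves are not reproduced here; a minimal sketch of the nearest-neighbour strategy they rely on, written with hypothetical variable names around a generic SciPy k-d tree lookup, could look like this:

```python
import numpy as np
from scipy.spatial import cKDTree

def nearest_neighbour_interp(src_points, src_values, dst_points):
    """Assign to each target point (e.g. a TELEMAC3D mesh node) the value
    of the nearest source point (e.g. a HYCOM grid cell centre)."""
    tree = cKDTree(np.asarray(src_points, float))
    _, idx = tree.query(np.asarray(dst_points, float))
    return np.asarray(src_values)[idx]

# Hypothetical usage: (lon, lat) pairs for the source grid and target mesh
src = np.array([[-48.0, -32.0], [-47.9, -32.0], [-48.0, -31.9]])
vals = np.array([18.2, 18.5, 18.1])          # e.g. sea-surface temperature
mesh = np.array([[-47.95, -31.97]])
print(nearest_neighbour_interp(src, vals, mesh))
```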
Abstract:
Untreated effluents that reach surface waters affect aquatic life and humans. This study aimed to evaluate the toxicity of wastewaters (municipal, industrial and shrimp pond effluents) released into the Jundiaí-Potengi Estuarine Complex, Natal/RN, through chronic quantitative and qualitative toxicity tests using the test organism Mysidopsis juniae, CRUSTACEA, MYSIDACEA (Silva, 1979). For this, a new methodology for observing chronic effects on M. juniae was used (single renewal), based on an existing methodology for a very similar test organism, M. bahia (daily renewal). Toxicity tests of 7 days' duration were used to detect effects on survival and fecundity in M. juniae. The median lethal concentration (LC50) was determined by the Trimmed Spearman-Karber method; the median inhibition concentration (IC50) for fecundity was determined by Linear Interpolation. One-way ANOVA tests (p = 0.05) were used to determine the No Observed Effect Concentration (NOEC) and the Lowest Observed Effect Concentration (LOEC). Effluent flows were measured and the toxic load of the effluents was estimated. Multivariate analyses - Principal Component Analysis (PCA) and Correspondence Analysis (CA) - identified the physicochemical parameters that best explain the patterns of toxicity found in the survival and fecundity of M. juniae. We verified the feasibility of applying the single-renewal system in chronic tests with M. juniae. Most effluents proved toxic to the survival and fecundity of M. juniae, except for some shrimp pond effluents. The most toxic effluents were ETE Lagoa Aerada (LC50, 6.24%; IC50, 4.82%), ETE Quintas (LC50, 5.85%), Giselda Trigueiro Hospital (LC50, 2.05%), CLAN (LC50, 2.14%) and COTEMINAS (LC50, 38.51%; IC50, 6.94%). The greatest toxic load originated from the high-flow effluents of inefficient ETEs, textile effluents and CLAN. Organic load was related to the toxic effects of municipal wastewater and hospital effluents on the survival of M. juniae, as were heavy metals, total residual chlorine and phenols. In industrial effluents, a relationship was found between toxicity and organic load, phenols, oils and greases, and benzene. The effects on fecundity were related, in turn, to chlorine and heavy metals. Toxicity tests using other organisms of different trophic levels, as well as analyses of sediment toxicity, are recommended to confirm the patterns found with M. juniae. However, the results indicate the need to implement and improve the sewage treatment systems discharging into the Potengi estuary.
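For the IC50, the linear interpolation method estimates the concentration producing a 50% reduction relative to the control. A minimal sketch of the interpolation step (omitting the isotonic smoothing and bootstrap confidence intervals of the full ICp procedure, and using invented numbers) follows:

```python
import numpy as np

def icp_linear_interpolation(conc, response, p=50.0):
    """Concentration causing a p% reduction relative to the control
    (first entry), by linear interpolation between the two test
    concentrations that bracket the target response."""
    conc, response = np.asarray(conc, float), np.asarray(response, float)
    target = response[0] * (1.0 - p / 100.0)
    for i in range(len(conc) - 1):
        lo, hi = response[i], response[i + 1]
        if (lo - target) * (hi - target) <= 0 and lo != hi:  # bracketed
            frac = (target - lo) / (hi - lo)
            return conc[i] + frac * (conc[i + 1] - conc[i])
    return np.nan  # target reduction not reached in the tested range

conc = [0.0, 1.0, 2.5, 5.0, 10.0]    # effluent concentration (%)
fec  = [22.0, 20.5, 16.0, 9.5, 3.0]  # mean young per female
print(icp_linear_interpolation(conc, fec, p=50))  # IC50 estimate
```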
Abstract:
Generalised refraction is a topic which has, thus far, garnered far less attention than it deserves. The purpose of this thesis is to highlight the potential that generalised refraction has to offer with regard to imaging and its application to designing new passive optical devices. Specifically, in this thesis we will explore two types of generalised refraction which take place across a planar interface: refraction by generalised confocal lenslet arrays (gCLAs), and refraction by ray-rotation sheets. We will show that the corresponding laws of refraction for these interfaces produce, in general, light-ray fields with non-zero curl, and as such do not have a corresponding outgoing wavefront. We will then show that gCLAs perform integral, geometrical imaging, and that this enables them to be considered as approximate realisations of metric tensor interfaces. The concept of piecewise transformation optics will be introduced, and we will show that it is possible to use gCLAs along with other optical elements such as lenses to design simple piecewise transformation-optics devices such as invisibility cloaks and insulation windows. Finally, we shall show that ray-rotation sheets can be interpreted as performing geometrical imaging into complex space, and that as a consequence, ray-rotation sheets and gCLAs may in fact be more closely related than first realised. We conclude with a summary of potential future projects which lead naturally from the results of this thesis.
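The curl criterion invoked above is the standard integrability condition: rays are normal to wavefronts, so a ray-direction field corresponds to an outgoing wave only if, suitably scaled, it is the gradient of a phase function. Schematically:

```latex
% A light-ray direction field d(r) admits a wave description only if it
% can be scaled into the gradient of a phase (eikonal) function \phi:
\mathbf{d}(\mathbf{r}) \propto \nabla\phi(\mathbf{r}),
\qquad \nabla \times \nabla\phi = \mathbf{0},
% so a field whose suitably scaled curl cannot be made to vanish has no
% corresponding outgoing wavefront.
```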
Abstract:
Thesis (doctorate), Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Civil e Ambiental, 2015.
Numerical simulation of mixed convection in a cavity filled with heterogeneous and homogeneous porous media
Abstract:
This work presents mixed convection heat transfer inside a lid-driven cavity heated from below and filled with a heterogeneous or homogeneous porous medium. In the heterogeneous approach, the solid domain is represented by equally spaced heat-conducting blocks; the fluid phase surrounds the blocks and is bounded by the cavity walls. The homogeneous, or pore-continuum, approach is characterized by the cavity porosity and permeability. Generalized mass, momentum and energy conservation equations are obtained in dimensionless form to represent both the continuum and the pore-continuum models. The numerical solution is obtained via the finite volume method. The QUICK interpolation scheme is used for the numerical treatment of the advection terms, and the SIMPLE algorithm is applied for pressure-velocity coupling. Targeting the laminar regime, the flow parameters are kept in the ranges $10^2 \le Re \le 10^3$ and $10^3 \le Ra \le 10^6$ for both the heterogeneous and homogeneous approaches. In the configurations tested for the continuum model, 9, 16, 36, and 64 blocks are considered for each combination of Re and Ra, with the microscopic porosity kept constant at φ = 0.64. For the pore-continuum model, the Darcy number (Da) is set according to the number of blocks in the heterogeneous cavity and to φ. Numerical results of the comparative study between the microscopic and macroscopic approaches are presented. As a result, average Nusselt number correlations for the continuum and pore-continuum models as functions of Ra and Re are obtained.
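For reference, the QUICK scheme evaluates the advected face value from a quadratic profile through two upstream nodes and one downstream node; on a uniform grid the weights reduce to the familiar 6/8, 3/8, -1/8 combination. A minimal sketch (not the solver code used in the work):

```python
def quick_face_value(phi_uu, phi_u, phi_d):
    """QUICK face value on a uniform grid: quadratic profile through the
    far-upstream (phi_uu), upstream (phi_u) and downstream (phi_d) nodes."""
    return 6.0 / 8.0 * phi_u + 3.0 / 8.0 * phi_d - 1.0 / 8.0 * phi_uu

# For flow in the +x direction, the east-face value at node P uses W, P, E:
print(quick_face_value(phi_uu=1.0, phi_u=2.0, phi_d=4.0))  # -> 2.875
```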
Abstract:
When performing Particle Image Velocimetry (PIV) measurements in complex fluid flows with moving interfaces and two-phase flow, it is necessary to develop a mask to remove non-physical measurements. This is the case when studying, for example, the complex bubble sweep-down phenomenon observed on oceanographic research vessels. Indeed, in such a configuration, the presence of an unsteady free surface, of a solid-liquid interface and of bubbles in the PIV frame generates numerous laser reflections and therefore spurious velocity vectors. In this note, an image masking process is developed to successively identify the boundaries of the ship and the free-surface interface. As the presence of the solid hull surface induces laser reflections, the hull edge contours are simply detected in the first PIV frame and dynamically estimated for consecutive ones. The unsteady free surface is determined by a specific process: i) edge detection on the gradient magnitude of the PIV frame; ii) extraction of the particles by filtering out high-intensity large areas related to the bubbles and/or hull reflections; iii) extraction of the rough region containing these particles and their reflections; iv) removal of these reflections. The unsteady surface is finally obtained with a fifth-order polynomial interpolation. The resulting free surface is successfully validated by Fourier analysis and by visualizing selected PIV images containing numerous spurious high-intensity areas. This paper demonstrates how this data analysis process leads to a PIV image database free of reflections and to an automatic detection of both the free surface and the rigid body. An application of this new mask is finally detailed, allowing a preliminary analysis of the hydrodynamic flow.
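The fifth-order polynomial interpolation of the free surface amounts to a least-squares polynomial fit through the interface points surviving steps i)-iv). A minimal sketch with synthetic stand-in data (not the paper's implementation):

```python
import numpy as np

# Synthetic stand-in for interface points detected after steps i)-iv)
x = np.linspace(0.0, 1024.0, 200)              # image columns (px)
y = 300.0 + 20.0 * np.sin(x / 150.0)           # underlying interface shape
y += np.random.normal(0.0, 2.0, x.size)        # detection noise

coeffs = np.polyfit(x, y, deg=5)               # fifth-order least-squares fit
surface = np.polyval(coeffs, x)                # smooth free-surface estimate
```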
Abstract:
This work is concerned with the design and analysis of hp-version discontinuous Galerkin (DG) finite element methods for boundary-value problems involving the biharmonic operator. The first part extends the unified approach of Arnold, Brezzi, Cockburn & Marini (SIAM J. Numer. Anal. 39, 5 (2001/02), 1749-1779), developed for the Poisson problem, to the design of DG methods via an appropriate choice of numerical flux functions for fourth-order problems; as an example we retrieve the interior penalty DG method developed by Suli & Mozolevski (Comput. Methods Appl. Mech. Engrg. 196, 13-16 (2007), 1851-1863). The second part of this work is concerned with a new a priori error analysis of the hp-version interior penalty DG method, when the error is measured in terms of both the energy-norm and the L2-norm, as well as certain linear functionals of the solution, for elemental polynomial degrees $p\ge 2$. Moreover, provided that the solution is piecewise analytic in an open neighbourhood of each element, exponential convergence is proven for the p-version of the DG method. The sharpness of the theoretical developments is illustrated by numerical experiments.
Abstract:
Measuring the extent to which a piece of structural timber has distorted at a macroscopic scale is fundamental to assessing its viability as a structural component. From the sawmill to the construction site, as structural timber dries, distortion can render it unsuitable for its intended purposes. This rejection of unusable timber is a considerable source of waste to the timber industry and the wider construction sector. As such, ensuring accurate measurement of distortion is a key step in addressing inefficiencies within timber processing. Currently, the FRITS frame method is the established approach used to gain an understanding of timber surface profile. The method, while reliable, is dependent upon relatively few measurements taken across a limited area of the overall surface, with a great deal of interpolation required. Further, the process is unavoidably slow and cumbersome, the immobile scanning equipment limiting where and when measurements can be taken and constricting the process as a whole. This thesis seeks to introduce LiDAR scanning as a new, alternative approach to distortion feature measurement. Although the technique is in its infancy within timber research, the practicalities of using LiDAR scanning as a measurement method are herein demonstrated, exploiting many of the advantages the technology has over current approaches. LiDAR scanning creates a much more comprehensive image of a timber surface, generating input data several orders of magnitude larger than that of the FRITS frame. Set-up and scanning time for LiDAR is also much quicker and more flexible than for existing methods. With LiDAR scanning, the measurement process is freed from many of the constraints of the FRITS frame and can be carried out in almost any environment. For this thesis, surface scans were carried out on seven Sitka spruce samples of dimensions 48.5 x 102 x 3000 mm using both the FRITS frame and the LiDAR scanner. The samples used presented marked levels of distortion and were relatively free from knots. A computational measurement model was created to extract feature measurements from the raw LiDAR data, enabling an assessment of each piece of timber to be carried out in accordance with existing standards. Assessment of distortion features focused primarily on the measurement of twist due to its strong prevalence in spruce and the considerable concern it generates within the construction industry. Additional measurements of surface inclination and bow were also made with each method to further establish LiDAR's credentials as a viable alternative. Overall, feature measurements as generated by the new LiDAR method compared well with those of the established FRITS method. From these investigations, recommendations were made to address inadequacies within existing measurement standards, namely their reliance on generalised and interpretative descriptions of distortion. The potential for further uses of LiDAR scanning within timber research is also discussed.
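The computational measurement model itself is not reproduced here; as a rough sketch of how a twist estimate can be extracted from LiDAR surface points (assuming a hypothetical N x 3 point cloud with x along the board length, y across the width and z the surface height):

```python
import numpy as np

def twist_angle(points, n_slices=20, min_pts=10):
    """Twist estimate from a surface point cloud: fit a transverse line
    per slice along the length and take the change in its inclination
    between the two board ends (degrees)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    edges = np.linspace(x.min(), x.max(), n_slices + 1)
    angles = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (x >= lo) & (x <= hi)
        if m.sum() < min_pts:
            continue
        slope = np.polyfit(y[m], z[m], 1)[0]      # transverse surface slope
        angles.append(np.degrees(np.arctan(slope)))
    return angles[-1] - angles[0]
```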
Abstract:
Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more complex from the chemical perspective, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite the effectiveness of this tool, it can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is Electron Tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram that is achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of the post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method is proposed, named dictionary learning electron tomography (DLET). DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) in an adaptive way and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, the sparsity is applied on overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is based on an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulated and real ET experiments on several morphologies are performed with a variety of setups. Reconstruction results validate the method's efficiency in both noiseless and noisy cases and show that it yields an improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select or whether the images used strictly follow the pre-conditions of a certain transform (e.g. strictly piecewise constant for Total Variation minimisation). This can also avoid artifacts that can be introduced by specific sparsifying transforms (e.g. the staircase artifacts that may result when using Total Variation minimisation).
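The reconstruction algorithm itself goes well beyond an abstract; a toy two-dimensional sketch of the alternating structure described (patch-based dictionary-learning denoising, then a data-consistency step against the tilt series), built from generic scikit-learn/scikit-image components rather than the thesis code, might look like this:

```python
import numpy as np
from skimage.transform import radon, iradon
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)

def dlet_like_reconstruction(sinogram, angles, size, n_iter=5,
                             patch=(8, 8), n_atoms=64, step=0.1):
    """Schematic alternation: (a) learn a dictionary on overlapping
    patches and sparsely re-encode them; (b) take a gradient-like step
    towards agreement with the measured projections."""
    x = iradon(sinogram, theta=angles, output_size=size)   # initial FBP
    for _ in range(n_iter):
        patches = extract_patches_2d(x, patch)
        flat = patches.reshape(patches.shape[0], -1)
        mean = flat.mean(axis=1, keepdims=True)
        dico = MiniBatchDictionaryLearning(n_components=n_atoms,
                                           transform_algorithm='omp',
                                           transform_n_nonzero_coefs=4)
        code = dico.fit_transform(flat - mean)             # sparse coding
        denoised = (code @ dico.components_ + mean).reshape(patches.shape)
        x = reconstruct_from_patches_2d(denoised, (size, size))
        residual = radon(x, theta=angles) - sinogram       # data misfit
        x -= step * iradon(residual, theta=angles, output_size=size,
                           filter_name=None)               # heuristic step
    return x
```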
Moreover, this thesis shows how reliable, elementally sensitive tomography using EELS is possible with the aid of both the appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, to make the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy (EELS) from nanoparticles of an industrially important material. Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
Abstract:
The study of Quality of Life (QoL) has been conducted on various scales throughout the years, with a focus on assessing overall quality of living amongst citizens. The main focus in these studies has been on economic factors, with the purpose of creating a Quality of Life Index (QLI). When the focus is narrowed to the environment and factors like Urban Green Spaces (UGS) and air quality, the topic becomes one of assessing how each alternative meets certain criteria. With the benefits of UGS and a healthy environment in focus, a new Environmental Quality of Life Index (EQLI) is proposed by combining Multi-Criteria Analysis (MCA) and Geographical Information Systems (GIS). Applying MCA to complex environmental problems and integrating it with GIS is a challenging but rewarding task, and has proven to be an efficient approach among environmental scientists. Background information on three MCA methods is given: Analytical Hierarchy Process (AHP), Regime Analysis and PROMETHEE. A survey based on a previous study on the status of UGS within European cities was sent to 18 municipalities in the study area. The survey consists of evaluating the current status of UGS, as well as the planning and management of UGS within municipalities, for the purpose of obtaining criteria material for the selected MCA method. The current situation of UGS is assessed with GIS software, and change detection is performed over a 10-year period using the NDVI index for comparison against one of the criteria in the MCA. To complete the criteria, interpolation of nitrogen dioxide levels was performed with ordinary kriging and the results were transformed into indicator values. The final outcome is an EQLI map indicating environmentally attractive municipalities, ranked against predefined MCA criteria using PROMETHEE I pairwise comparison and PROMETHEE II complete ranking of alternatives. The proposed methodology is applied to Lisbon's Metropolitan Area, Portugal.
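For completeness, PROMETHEE II reduces to computing weighted pairwise preference degrees and net outranking flows. A minimal sketch with the "usual" preference function and invented scores (not the study's data or criteria weights):

```python
import numpy as np

def promethee_ii(scores, weights):
    """Net outranking flows (PROMETHEE II) with the 'usual' preference
    function: 1 if one alternative strictly beats another on a criterion,
    else 0. scores: (alternatives x criteria), higher is better;
    weights should sum to 1."""
    n = scores.shape[0]
    pi = np.zeros((n, n))                    # aggregated preference pi(a, b)
    for a in range(n):
        for b in range(n):
            pi[a, b] = np.sum(weights * (scores[a] - scores[b] > 0))
    phi_plus = pi.sum(axis=1) / (n - 1)      # how strongly a outranks others
    phi_minus = pi.sum(axis=0) / (n - 1)     # how strongly a is outranked
    return phi_plus - phi_minus              # rank alternatives descending

scores = np.array([[0.8, 0.6, 0.7],          # hypothetical municipalities
                   [0.5, 0.9, 0.4],          # scored on UGS, NDVI change,
                   [0.6, 0.3, 0.9]])         # inverted NO2 indicator
weights = np.array([0.4, 0.3, 0.3])
print(promethee_ii(scores, weights))
```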
Abstract:
The organophosphate temephos has been the main insecticide used against larvae of the dengue and yellow fever mosquito (Aedes aegypti) in Brazil since the mid-1980s. Reports of resistance date back to 1995; however, no systematic reports of widespread temephos resistance have occurred to date. As resistance investigation is paramount for strategic decision-making by health officials, our objective here was to investigate the spatial and temporal spread of temephos resistance in Ae. aegypti in Brazil over the last 12 years using discriminating temephos concentrations and the bioassay protocols of the World Health Organization. The mortality results obtained were subjected to spatial analysis with distance interpolation using semi-variance models to generate maps that depict the spread of temephos resistance in Brazil since 1999. The problem has been expanding: since 2002-2003, approximately half the country has exhibited mosquito populations resistant to temephos. The frequency of temephos resistance and, likely, of control failures, which start when the insecticide mortality level drops below 80%, has increased even further since 2004. Few parts of Brazil were able to achieve the target 80% efficacy threshold by 2010/2011, resulting in a significant risk of control failure by temephos in most of the country. The widespread resistance to temephos in Brazilian Ae. aegypti populations greatly compromises effective mosquito control efforts using this insecticide and indicates the urgent need to identify alternative insecticides, aided by the preventive elimination of potential mosquito breeding sites.
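The semi-variance models mentioned are fitted to an empirical semivariogram of the bioassay mortalities. As a minimal sketch of the classical (Matheron) estimator underlying such interpolated maps (with hypothetical inputs):

```python
import numpy as np

def empirical_semivariogram(coords, values, bin_edges):
    """Classical estimator: gamma(h) = (1 / 2N(h)) * sum of squared
    differences over the N(h) point pairs whose separation distance
    falls in each lag bin."""
    coords, values = np.asarray(coords, float), np.asarray(values, float)
    i, j = np.triu_indices(len(values), k=1)          # all point pairs
    dist = np.linalg.norm(coords[i] - coords[j], axis=1)
    sqdiff = (values[i] - values[j]) ** 2
    gamma = np.full(len(bin_edges) - 1, np.nan)
    for k in range(len(bin_edges) - 1):
        m = (dist >= bin_edges[k]) & (dist < bin_edges[k + 1])
        if m.any():
            gamma[k] = 0.5 * sqdiff[m].mean()
    return gamma
```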
Abstract:
One of the main activities in petroleum engineering is to estimate the oil production of existing oil reserves. The calculation of these reserves is crucial to determine the economic feasibility of their exploitation. Currently, the petroleum industry faces problems in analyzing production due to the exponentially increasing amount of data provided by the production facilities. Conventional reservoir modeling techniques, such as numerical reservoir simulation and visualization, are well developed and available. This work proposes intelligent methods, such as artificial neural networks, to predict oil production, and compares the results with those obtained by numerical simulation, a method widely used in practice for predicting oil production behavior. Artificial neural networks are used due to their learning, adaptation and interpolation capabilities.
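As an illustrative sketch only (not the network architecture or data of the work), a feed-forward neural network regressor for production prediction can be set up in a few lines:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: operating variables -> production rate
rng = np.random.default_rng(0)
X = rng.random((500, 4))                 # e.g. time, pressures, choke, cut
y = X @ np.array([1.5, -0.5, 0.8, 0.2]) + 0.05 * rng.standard_normal(500)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=2000, random_state=0))
model.fit(X, y)                          # train on historical data
print(model.predict(X[:3]))              # predicted production
```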
Abstract:
Room temperature electroreflectance (ER) spectroscopy has been used to study the fundamental properties of Al$_x$In$_y$Ga$_{1-x-y}$N/AlN/GaN heterostructures under different applied biases. The (0001)-oriented heterostructures were grown by metal-organic vapor phase epitaxy on sapphire. The band gap energy of the Al$_x$In$_y$Ga$_{1-x-y}$N layers has been determined from analysis of the ER spectra using Aspnes' model. The obtained values are in good agreement with a nonlinear band gap interpolation equation proposed earlier. Bias-dependent ER allows one to determine the sheet carrier density of the two-dimensional electron gas and the barrier field strength.
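The specific quaternary interpolation equation referred to is not reproduced in the abstract; for orientation, such schemes build on the standard bowing form for a ternary alloy band gap:

```latex
% Nonlinear (bowing) interpolation of a ternary alloy band gap; quaternary
% Al_x In_y Ga_{1-x-y} N schemes combine the AlGaN, InGaN and AlInN
% ternaries with composition-dependent weights:
E_g\!\left(A_{1-x}B_x\right) = (1-x)\,E_g^{A} + x\,E_g^{B} - b\,x(1-x)
% b: bowing parameter of the ternary alloy.
```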