866 results for the Fuzzy Colour Segmentation Algorithm


Relevance: 100.00%

Abstract:

Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more chemically complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite its effectiveness, this tool can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool to explore the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of the post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram. This motivates the development of an improved tomographic reconstruction process. In this thesis, a novel ET method was proposed, named dictionary learning electron tomography (DLET). DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) in an adaptive way and simultaneously reconstructs the tomogram from highly undersampled tilt series. In this method, sparsity is applied to overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm alternates between two steps: it learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and restores the tomogram data in the other. Simulated and real ET experiments on several morphologies were performed with a variety of setups. Reconstruction results validate the method's efficiency in both noiseless and noisy cases and show that it yields improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select, or whether the images strictly satisfy the preconditions of a particular transform (e.g. being strictly piecewise constant for Total Variation minimisation). This also avoids artifacts that specific sparsifying transforms can introduce (e.g. the staircase artifacts that may result from Total Variation minimisation).
Moreover, this thesis shows how reliable, elementally sensitive tomography using EELS is possible through the combined use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, which together make the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy (EELS) from nanoparticles of an industrially important material. Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
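The alternating structure of DLET can be illustrated with a short sketch. The following Python fragment is a minimal illustration under stated assumptions, not the thesis implementation: the projector is skimage's radon/iradon pair, the dictionary learner is scikit-learn's MiniBatchDictionaryLearning, and the patch size, number of atoms, gradient step and iteration counts are placeholder values.

import numpy as np
from skimage.transform import radon, iradon
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (
    extract_patches_2d, reconstruct_from_patches_2d)

def dlet_like_reconstruction(sinogram, angles, n_iter=5, patch=8, n_atoms=64):
    # Initial estimate from the undersampled tilt series (filtered back-projection).
    x = iradon(sinogram, theta=angles, filter_name='ramp')
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        # Step 1: adapt the dictionary to the current tomogram estimate and
        # denoise it patch-by-patch with sparse coding (OMP is the default).
        patches = extract_patches_2d(x, (patch, patch))
        flat = patches.reshape(len(patches), -1)
        mean = flat.mean(axis=1, keepdims=True)
        centred = flat - mean
        subset = centred[rng.choice(len(centred), size=min(2000, len(centred)),
                                    replace=False)]
        dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0,
                                           batch_size=32, random_state=0)
        codes = dico.fit(subset).transform(centred)
        denoised = (codes @ dico.components_ + mean).reshape(patches.shape)
        x = reconstruct_from_patches_2d(denoised, x.shape)
        # Step 2: restore consistency with the measured projections via an
        # unfiltered back-projection of the sinogram residual (a SIRT-like step).
        residual = sinogram - radon(x, theta=angles)
        x += 0.1 * iradon(residual, theta=angles, filter_name=None)
    return x

The two commented steps correspond to the alternation named in the abstract: dictionary learning plus artifact and noise removal, followed by restoration of the measured tomogram data.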

Relevance: 100.00%

Abstract:

What is the secret behind Nokia's phenomenal success in mobile phones? More generally, how does one succeed in a turbulent high-technology market? Historical and comparative studies bring understanding not only of past phenomena but also insights into current and coming ones. In this paper we compare Ford and General Motors in the 1920s and 1930s with Motorola and Nokia in the 1980s and 1990s. The insight from this comparison is how reinvention of the product category, relative to the incumbent's conceptualization of the market, is a recipe for success. Nokia's success in the 1990s was based on following the guidelines pioneered by General Motors in the 1920s: market segmentation, embracing the latest technological advances, and brand management in the pursuit of consumer repurchases. The article tells how Nokia addresses both GM's legacy and changes in the business environment. At the end of our paper, we discuss the generalizability of our findings.

Relevance: 100.00%

Abstract:

This paper presents a semi-parametric algorithm for parsing football video structures. The approach works as two interleaved processes that closely collaborate towards a common goal. The core of the proposed method performs fast automatic football video annotation by analysing the entropy variance within a series of shot frames. The entropy is computed on the hue channel of the HSV colour space, not as a global feature but in the spatial domain, to identify regions within a shot that characterise a certain activity during the shot period. The second part of the algorithm identifies dominant colour regions that could represent players and the playfield for further activity recognition. Experimental results show that the proposed football video segmentation algorithm performs with high accuracy.
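A minimal sketch of the two ingredients named above, under stated assumptions: the block size, histogram bin count and the green hue band are illustrative placeholders, not values from the paper.

import cv2
import numpy as np

def block_hue_entropy(frame_bgr, block=32, bins=32):
    # Entropy of the hue channel computed per spatial block, so that regions
    # tied to a certain activity stand out rather than one global value.
    hue = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)[:, :, 0]  # OpenCV hue: 0..179
    h, w = hue.shape
    ent = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            tile = hue[i*block:(i+1)*block, j*block:(j+1)*block]
            hist, _ = np.histogram(tile, bins=bins, range=(0, 180))
            p = hist[hist > 0] / hist.sum()
            ent[i, j] = -(p * np.log2(p)).sum()  # Shannon entropy in bits
    return ent

def playfield_mask(frame_bgr, lo=(35, 40, 40), hi=(85, 255, 255)):
    # Dominant-colour segmentation: keep pixels inside a hypothetical green
    # hue band that should cover the playfield.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    return cv2.inRange(hsv, np.array(lo), np.array(hi))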

Relevance: 100.00%

Abstract:

International audience

Relevance: 100.00%

Abstract:

A compiled set of in situ data is important to evaluate the quality of ocean-colour satellite data records. Here we describe the data compiled for the validation of the ocean-colour products from the ESA Ocean Colour Climate Change Initiative (OC-CCI). The data were acquired from several sources (MOBY, BOUSSOLE, AERONET-OC, SeaBASS, NOMAD, MERMAID, AMT, ICES, HOT, GeP&CO), span the years 1997 to 2012, and have a global distribution. Observations of the following variables were compiled: spectral remote-sensing reflectances, concentrations of chlorophyll a, spectral inherent optical properties and spectral diffuse attenuation coefficients. The data came from multi-project archives accessed via open internet services, and from individual projects acquired directly from data providers. Methodologies were implemented for homogenisation, quality control and merging of all data. No changes were made to the original data, other than averaging of observations that were close in time and space, elimination of some points after quality control, and conversion to a standard format. The final result is a merged table designed for the validation of satellite-derived ocean-colour products, available in text format. Metadata of each in situ measurement (original source, cruise or experiment, principal investigator) were preserved throughout the work and made available in the final table. Using all the data in a validation exercise increases the number of matchups and enhances the representativeness of different marine regimes. Because the metadata are available, each set of data can also be analysed separately. The compiled data are available at doi: 10.1594/PANGAEA.854832 (Valente et al., 2015).
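The averaging of observations that were close in time and space can be sketched as a simple space-time binning step. The fragment below is an assumption for illustration, not the OC-CCI processing code; the column names and the one-hour and 0.01-degree bin widths are placeholders.

import pandas as pd

def bin_average(df, dt='1h', dlat=0.01, dlon=0.01):
    # df must have a datetime 'time' column plus 'lat', 'lon' and variables.
    df = df.copy()
    df['t_bin'] = df['time'].dt.floor(dt)
    df['lat_bin'] = (df['lat'] / dlat).round() * dlat
    df['lon_bin'] = (df['lon'] / dlon).round() * dlon
    grouped = df.groupby(['t_bin', 'lat_bin', 'lon_bin'], as_index=False)
    # Average the measured variables; keep provenance metadata via 'first'
    # so the original source remains traceable in the merged table.
    return grouped.agg(chl=('chl', 'mean'),
                       rrs_443=('rrs_443', 'mean'),
                       source=('source', 'first'))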

Relevance: 100.00%

Abstract:

Understanding behaviour and its small variations arising from changes in the thermal environment, and developing models that simulate welfare from the birds' responses to the environment, constitute the first step towards a digital monitoring system for birds in production houses. In this work, a decision-support system based on fuzzy set theory was developed to estimate the welfare of broiler breeders as a function of the frequency and duration of the behaviours expressed by the birds. The development of the system went through five distinct stages: 1) organisation of the experimental data; 2) presentation of the videos in interviews with an expert; 3) creation of the membership functions based on the interviews and on a literature review; 4) simulation of the frequencies of occurrence and mean expression times of the behaviours classified as welfare indicators, using regression equations obtained from the literature; and 5) construction of the rules, simulation and validation of the system. The fuzzy system developed estimated the welfare of broiler breeders satisfactorily; in its final version, with the largest number of rules, it matched 77.8% of the experimental data when compared with the responses expected by an expert. The system can be used as a mathematical-computational tool to support decisions in broiler breeder production houses.
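The kind of fuzzy estimator described above can be sketched with triangular membership functions, a small rule base and centroid defuzzification. The behaviours, universes and rules below are illustrative assumptions; the actual membership functions in this work were built from expert interviews and the literature.

import numpy as np

def tri(x, a, b, c):
    # Triangular membership function; a == b or b == c gives a shoulder.
    x = np.asarray(x, dtype=float)
    left = (x - a) / (b - a) if b > a else np.ones_like(x)
    right = (c - x) / (c - b) if c > b else np.ones_like(x)
    return np.clip(np.minimum(left, right), 0.0, 1.0)

def welfare_estimate(dust_bathing_freq, panting_time):
    w = np.linspace(0.0, 1.0, 101)   # welfare score universe: 0 poor .. 1 good
    # Fuzzify the inputs (hypothetical per-hour behaviour ranges).
    high_db = tri(dust_bathing_freq, 3, 8, 8)   # frequent dust bathing
    low_db = tri(dust_bathing_freq, 0, 0, 5)
    high_pt = tri(panting_time, 5, 30, 30)      # long panting time
    low_pt = tri(panting_time, 0, 0, 10)
    # Rule base: min is AND, the rule strength clips the output set,
    # and max aggregates the clipped sets (Mamdani inference).
    good = np.minimum(min(high_db, low_pt), tri(w, 0.5, 1.0, 1.0))
    poor = np.minimum(max(low_db, high_pt), tri(w, 0.0, 0.0, 0.5))
    agg = np.maximum(good, poor)
    # Centroid defuzzification.
    return float((w * agg).sum() / (agg.sum() + 1e-9))

print(welfare_estimate(dust_bathing_freq=6, panting_time=4))  # ~0.81: good welfare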

Relevance: 100.00%

Abstract:

Given a 2-manifold triangular mesh \(M \subset {\mathbb {R}}^3\) with border, a parameterization of \(M\) is a FACE or trimmed surface \(F=\{S,L_0,\ldots, L_m\}\), where \(F\) is a connected subset or region of a parametric surface \(S\), bounded by a set of LOOPs \(L_0,\ldots ,L_m\) such that each \(L_i \subset S\) is a closed 1-manifold having no intersection with the other \(L_j\) LOOPs. The parametric surface \(S\) is a statistical fit of the mesh \(M\). \(L_0\) is the outermost LOOP bounding \(F\) and \(L_i\) is the LOOP of the i-th hole in \(F\) (if any). The problem of parameterizing triangular meshes is relevant for reverse engineering, tool path planning, feature detection, redesign, etc. State-of-the-art procedures parameterize rectangular meshes \(M\). To improve on such procedures, we report here the implementation of an algorithm which parameterizes meshes \(M\) presenting holes and concavities. We synthesize a parametric surface \(S \subset {\mathbb {R}}^3\) which approximates a superset of the mesh \(M\). Then, we compute a set of LOOPs trimming \(S\), thereby completing the FACE \(F=\{S,L_0,\ldots ,L_m\}\). Our algorithm gives satisfactory results for \(M\) having low Gaussian curvature (i.e., \(M\) being quasi-developable or developable). This assumption is a reasonable one, since \(M\) is the product of manifold segmentation pre-processing. Our algorithm computes: (1) a manifold learning mapping \(\phi : M \rightarrow U \subset {\mathbb {R}}^2\), and (2) an inverse mapping \(S: W \subset {\mathbb {R}}^2 \rightarrow {\mathbb {R}}^3\), with \(W\) being a rectangular grid containing and surpassing \(U\). To compute \(\phi\) we test Isomap, Laplacian Eigenmaps and Hessian locally linear embedding (best results with HLLE). For the back mapping (NURBS) \(S\), the crucial step is to find a control polyhedron \(P\), which is an extrapolation of \(M\). We calculate \(P\) by extrapolating radial basis functions that interpolate points inside \(\phi (M)\). We successfully test our implementation with several datasets presenting concavities and holes and being extremely non-developable. Ongoing work addresses manifold segmentation, which facilitates mesh parameterization.
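A minimal sketch (an assumption, not the authors' implementation) of the two mappings described above: a manifold-learning flattening of the mesh vertices followed by a radial-basis-function back mapping fitted on a rectangular grid W that surpasses U. Isomap stands in for the embedding; the grid resolution and 10% margin are placeholders.

import numpy as np
from sklearn.manifold import Isomap
from scipy.interpolate import RBFInterpolator

def parameterize(vertices, n_neighbors=10, grid_res=20):
    # (1) phi: M -> U in R^2. Isomap is used here; the paper reports best
    # results with Hessian LLE, available in scikit-learn as
    # LocallyLinearEmbedding(method='hessian').
    u = Isomap(n_neighbors=n_neighbors, n_components=2).fit_transform(vertices)
    # (2) S: W in R^2 -> R^3, an RBF fit on (u, vertices) that extrapolates
    # beyond U, in the spirit of the RBF extrapolation for the control
    # polyhedron P described above.
    s = RBFInterpolator(u, vertices, kernel='thin_plate_spline')
    # Rectangular grid W containing and surpassing U (assumed 10% margin).
    lo, hi = u.min(axis=0), u.max(axis=0)
    pad = 0.1 * (hi - lo)
    gx, gy = np.meshgrid(np.linspace(lo[0]-pad[0], hi[0]+pad[0], grid_res),
                         np.linspace(lo[1]-pad[1], hi[1]+pad[1], grid_res))
    w = np.column_stack([gx.ravel(), gy.ravel()])
    control_polyhedron = s(w).reshape(grid_res, grid_res, 3)
    return u, control_polyhedron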

Relevance: 100.00%

Abstract:

Rainflow counting methods convert a complex load time history into a set of load reversals for use in fatigue damage modeling. Rainflow counting methods were originally developed to assess fatigue damage associated with mechanical cycling, where creep of the material under load was not considered a significant contributor to failure. However, creep is a significant factor in some cyclic loading cases, such as solder interconnects under temperature cycling; in this case, fatigue life models require the dwell time to account for stress relaxation and creep. This study develops a new version of the multi-parameter rainflow counting algorithm that provides a range-based dwell-time estimation for use with time-dependent fatigue damage models. To demonstrate its applicability, the method is used to calculate the life of solder joints under a complex thermal cycling regime and is verified by experimental testing. An additional algorithm is developed in this study to reduce the volume of the rainflow counting results. This algorithm uses a damage model and a statistical test to determine which of the resultant cycles are statistically insignificant at a given confidence level. This makes the resulting data file smaller and allows a simplified load history to be reconstructed.
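For reference, the conventional three-point rainflow counting that the multi-parameter algorithm extends can be sketched as follows. This is the standard textbook method without the dwell-time estimation added in this study; the function names are illustrative.

def turning_points(series):
    # Reduce the load history to its local extrema (reversal points).
    tp = [series[0]]
    for x in series[1:]:
        if x == tp[-1]:
            continue
        if len(tp) >= 2 and (tp[-1] - tp[-2]) * (x - tp[-1]) > 0:
            tp[-1] = x          # same direction: extend the current excursion
        else:
            tp.append(x)
    return tp

def rainflow(series):
    # Returns (range, mean, count) triples; count is 1.0 (full) or 0.5 (half).
    cycles, stack = [], []
    for point in turning_points(series):
        stack.append(point)
        while len(stack) >= 3:
            x, y, z = stack[-3], stack[-2], stack[-1]
            if abs(y - x) > abs(z - y):
                break             # range X exceeds range Y: no cycle closes yet
            if len(stack) == 3:
                cycles.append((abs(y - x), (x + y) / 2, 0.5))
                stack.pop(0)      # half cycle at the start of the history
            else:
                cycles.append((abs(y - x), (x + y) / 2, 1.0))
                del stack[-3:-1]  # full cycle extracted: remove its two points
    for a, b in zip(stack, stack[1:]):
        cycles.append((abs(b - a), (a + b) / 2, 0.5))  # residue as half cycles
    return cycles

print(rainflow([-2, 1, -3, 5, -1, 3, -4, 4, -2]))  # classic ASTM E1049 example

The dwell-time extension described in the abstract would additionally attach to each counted cycle the time spent near its extremes, which this baseline does not track.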

Relevance: 100.00%

Abstract:

Doctorate in Forestry and Natural Resources Engineering - Instituto Superior de Agronomia - UL

Relevance: 100.00%

Abstract:

This work presents a proposal to detect the oil-water interface in atmospheric oil tanks by installing a differential pressure level transmitter to infer the interface position. The main goal of this project is to maximise the quantity of free water delivered to the drainage line by controlling the interface. A fuzzy controller has been implemented, using the interface transmitter as the process variable. Two ladder routines were created to perform the control: one calculates the error and the error variation, and the other implements the fuzzy controller itself. The fuzzy controller applies its rule base to these variables to set the output, which is the position variation of the drainage valve. Although the ladder routines were implemented on an Allen-Bradley PLC of the ControlLogix family, they can be implemented on any brand of PLC.
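The control structure described (error and error variation in, valve position change out) can be sketched outside ladder logic as follows. The membership spans and the rule table are illustrative assumptions, not the tuned values of this work.

LABELS = ('N', 'Z', 'P')  # negative, zero, positive

def memberships(x, span):
    # Triangular membership degrees of x in N, Z, P over [-span, span].
    n = max(min(-x / span, 1.0), 0.0)
    z = max(1.0 - abs(x) / span, 0.0)
    p = max(min(x / span, 1.0), 0.0)
    return dict(zip(LABELS, (n, z, p)))

# Rule table: (error label, delta-error label) -> valve position change (%).
RULES = {('N', 'N'): -4, ('N', 'Z'): -2, ('N', 'P'): 0,
         ('Z', 'N'): -1, ('Z', 'Z'):  0, ('Z', 'P'): 1,
         ('P', 'N'):  0, ('P', 'Z'):  2, ('P', 'P'): 4}

def fuzzy_step(setpoint, interface_level, prev_error, e_span=10.0, de_span=2.0):
    error = setpoint - interface_level
    d_error = error - prev_error          # the "error variation" routine
    mu_e, mu_de = memberships(error, e_span), memberships(d_error, de_span)
    # Weighted-average (Sugeno-style) defuzzification over all fired rules.
    num = sum(min(mu_e[a], mu_de[b]) * out for (a, b), out in RULES.items())
    den = sum(min(mu_e[a], mu_de[b]) for (a, b) in RULES) or 1.0
    return num / den, error               # (valve change, error for next scan)

Each call corresponds to one PLC scan: the returned error is stored and fed back as prev_error on the next scan.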

Relevance: 100.00%

Abstract:

Measurement and modeling techniques were developed to improve over-water gaseous air-water exchange measurements for persistent, bioaccumulative and toxic chemicals (PBTs). Analytical methods were applied to atmospheric measurements of hexachlorobenzene (HCB), polychlorinated biphenyls (PCBs), and polybrominated diphenyl ethers (PBDEs). Additionally, the sampling and analytical methods are well suited to studying semivolatile organic compounds (SOCs) in air, with applications related to secondary organic aerosol formation and to urban and indoor air quality. A novel gas-phase cleanup method is described for use with thermal desorption methods for analysis of atmospheric SOCs using multicapillary denuders. The cleanup selectively removed hydrogen-bonding chemicals from samples, including much of the background matrix of oxidized organic compounds in ambient air, and thereby improved precision and method detection limits for nonpolar analytes. A model is presented that predicts gas collection efficiency and particle collection artifact for SOCs in multicapillary denuders using polydimethylsiloxane (PDMS) sorbent. An approach is presented to estimate the equilibrium PDMS-gas partition coefficient (Kpdms) from an Abraham solvation parameter model for any SOC. A high-flow-rate (300 L min⁻¹) multicapillary denuder was designed for measurement of trace atmospheric SOCs. Overall method precision and detection limits were determined using field duplicates and compared to the conventional high-volume sampler method. The high-flow denuder is an alternative to high-volume or passive samplers when separation of gas and particle-associated SOCs upstream of a filter and a short sample collection time are advantageous. A Lagrangian internal boundary layer transport exchange (IBLTE) model is described. The model predicts the near-surface variation of several quantities with fetch in coastal, offshore flow: 1) the modification in potential temperature and gas mixing ratio, 2) the surface fluxes of sensible heat, water vapor, and trace gases, using the NOAA COARE Bulk Algorithm and Gas Transfer Model, and 3) the vertical gradients in potential temperature and mixing ratio. The model was applied to interpret micrometeorological measurements of the air-water exchange flux of HCB and several PCB congeners in Lake Superior. The IBLTE model can be applied to any scalar, including water vapor, carbon dioxide, dimethyl sulfide, and other scalar quantities of interest with respect to hydrology, climate, and ecosystem science.
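The Abraham solvation parameter model mentioned above has the general linear form log10 K = c + e*E + s*S + a*A + b*B + l*L, where E, S, A, B and L are solute descriptors and c, e, s, a, b, l are system coefficients fitted for the PDMS-gas phase pair. A trivial sketch follows; the default coefficients are placeholders, not the values fitted in this work.

def log10_k_pdms(E, S, A, B, L,
                 c=0.0, e=0.0, s=0.0, a=0.0, b=0.0, l=1.0):
    # Abraham model: log10(Kpdms) = c + e*E + s*S + a*A + b*B + l*L.
    # All default coefficient values here are placeholder assumptions;
    # real use requires the coefficients fitted for the PDMS-gas system.
    return c + e * E + s * S + a * A + b * B + l * L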

Relevance: 100.00%

Abstract:

Non-orthogonal multiple access (NOMA) is emerging as a promising multiple access technology for fifth-generation cellular networks to address the fast-growing mobile data traffic. It applies superposition coding at the transmitters, allowing simultaneous allocation of the same frequency resource to multiple intra-cell users, while successive interference cancellation (SIC) is used at the receivers to cancel intra-cell interference. User pairing and power allocation (UPPA) is a key design aspect of NOMA. Existing UPPA algorithms are mainly based on exhaustive search, whose extensive computational complexity can severely affect NOMA performance. A fast proportional fairness (PF) scheduling-based UPPA algorithm is proposed to address this problem. The novel idea is to form user pairs around the users with the highest PF metrics, with a pre-configured fixed power allocation. System-level simulation results show that the proposed algorithm is significantly faster than the existing exhaustive search algorithm (seven times faster in a scenario with 20 users) with negligible throughput loss.
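A minimal sketch of PF-metric-based pairing with a fixed power split, under stated assumptions: the paper forms pairs around the highest-PF users, and pairing each strong user with a weak partner plus the 0.2/0.8 power split are illustrative choices, not the paper's exact procedure.

import numpy as np

def pf_pairing(inst_rate, avg_throughput, p_strong=0.2, p_weak=0.8):
    pf = inst_rate / avg_throughput   # proportional fairness metric per user
    order = np.argsort(pf)[::-1]      # highest PF metric first
    pairs = []
    for k in range(len(order) // 2):  # with an odd count, one user is unpaired
        strong, weak = order[k], order[-(k + 1)]
        # The strong (near) user gets less power; the weak (far) user gets
        # more, so SIC at the strong user can cancel the weak user's signal.
        pairs.append((strong, weak, p_strong, p_weak))
    return pairs

Because the power allocation is pre-configured rather than searched, each pairing decision is O(1) after sorting, which is the source of the speed-up over exhaustive search.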

Relevance: 100.00%

Abstract:

Kiwi fruit is a highly nutritional fruit due to its high level of vitamin C and its strong antioxidant capacity, owed to a wide number of phytonutrients including carotenoids, lutein, phenolics, flavonoids and chlorophyll [1]. Drying is a complex process in which simultaneous heat and mass transfer occur. Several alterations occur during the drying of foods at many levels (physical, chemical, nutritional or sensory), influenced by a number of factors, including processing conditions [2]. Temperature is particularly important because of the effects it produces at the chemical and also at the physical level, particularly on colour and texture [3]. In the present work, the changes in sliced kiwi exposed to air drying at different temperatures (50, 60, 70, 80 ºC) were evaluated, namely in terms of chemical properties such as ascorbic acid and phenolic compounds, physical characteristics such as colour and texture, and also at the sensory level. All experiments followed standard established procedures, and several replicates were done to assess each property. The results obtained indicated that moisture was reduced with drying by 74 to 87%, depending on the temperature. Ascorbic acid also decreased with drying, by 7% at 50 ºC and by up to 28% at the highest temperature (80 ºC). The phenolic compounds and antioxidant activity were also strongly affected by the drying temperature. The water activity of the dried samples varied from 0.658 to 0.753, compatible with good preservation. Regarding colour, the total colour difference between the dried samples and the fresh sample was found to vary in the range 9.45 to 17.17. The textural parameters were also much affected by drying: hardness decreased by 45 to 72%, while all other parameters increased, namely cohesiveness (approximately doubled), springiness (increased 2 to 3 times) and chewiness (increased up to 2.5 times that of the fresh sample). Adhesiveness, which was observed for the fresh samples (-4.02 N·s), disappeared in all the dried samples. The sensory analysis of the dried samples allowed the sensory profiles to be established, as shown in Figure 1.
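The total colour difference reported above is presumably the usual CIELAB Delta E*ab; the paper does not give its formula, so the following sketch and the example values are assumptions for illustration.

import math

def delta_e(lab_fresh, lab_dried):
    # Total colour difference between two (L*, a*, b*) tuples:
    # Delta E*ab = sqrt(dL^2 + da^2 + db^2).
    return math.sqrt(sum((f - d) ** 2 for f, d in zip(lab_fresh, lab_dried)))

# Example with made-up CIELAB values:
print(delta_e((60.0, -5.0, 30.0), (52.0, 2.0, 40.0)))  # ~14.6, within 9.45-17.17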

Relevance: 100.00%

Abstract:

In this thesis, a machine learning approach was used to develop a predictive model for residual methanol concentration in industrial formalin produced at the Akzo Nobel factory in Kristinehamn, Sweden. The MATLAB™ computational environment, supplemented with the Statistics and Machine Learning Toolbox™ from MathWorks, was used to test various machine learning algorithms on the formalin production data from Akzo Nobel. The Gaussian Process Regression algorithm was found to provide the best results and was used to create the predictive model. The model was compiled into a stand-alone application with a graphical user interface using the MATLAB Compiler™.
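A minimal sketch of Gaussian Process Regression for this kind of soft-sensor task, written with scikit-learn rather than the MATLAB toolbox used in the thesis; the stand-in data, feature count and kernel choice are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# X: process variables (e.g. temperatures, flows); y: residual methanol (%).
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))                 # stand-in process data
y = 0.5 + X @ np.array([0.3, -0.2, 0.1, 0.4]) + 0.01 * rng.standard_normal(200)

# An anisotropic RBF kernel with a white-noise term, a common default choice.
kernel = 1.0 * RBF(length_scale=np.ones(4)) + WhiteKernel(noise_level=1e-4)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
pred, std = gpr.predict(X[:5], return_std=True)  # mean and uncertainty estimate

A practical advantage of GPR in this setting is the per-prediction uncertainty (std above), which a deployed soft sensor can surface to operators.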

Relevance: 100.00%

Abstract:

Organizations and their environments are complex systems. Such systems are difficult to understand and predict. Even so, prediction is a fundamental task for business management and for decision-making, which always involves risk. Classical prediction methods (among them linear regression, the autoregressive moving average and exponential smoothing) make assumptions such as linearity and stability in order to be mathematically and computationally tractable. By various means, however, the limitations of such methods have been demonstrated. In recent decades, new prediction methods have emerged that aim to embrace the complexity of organizational systems and their environments rather than avoid it. Among them, the most promising are bio-inspired prediction methods (e.g. neural networks, genetic/evolutionary algorithms and artificial immune systems). This article aims to establish a situational overview of the current and potential applications of bio-inspired prediction methods in management.