868 results for Rule-based techniques
Abstract:
Despite ethical and technical concerns, the in vivo method, more commonly referred to as the mouse bioassay (MBA), is employed globally as a reference method for phycotoxin analysis in shellfish. This is particularly the case for paralytic shellfish poisoning (PSP) and emerging toxin monitoring. A high-performance liquid chromatography method with fluorescence detection (HPLC-FLD) has been developed for PSP toxin analysis, but due to difficulties and limitations in the method, this procedure has not been fully implemented as a replacement. Detection of the diarrhetic shellfish poisoning (DSP) toxins has moved towards LC-mass spectrometry (MS) analysis, whereas the analysis of the amnesic shellfish poisoning (ASP) toxin domoic acid is performed by HPLC. Although alternative methods of detection to the MBA have been described, each procedure is specific to a particular toxin and its analogues, with each group of toxins requiring separate analysis utilising different extraction procedures and analytical equipment. In addition, any replacement of the MBA must take into consideration the detection of unregulated and emerging toxins. The ideal scenario for the monitoring of phycotoxins in shellfish and seafood would be to evolve towards multiple toxin detection on a single bioanalytical sensing platform, i.e. 'an artificial mouse'. Immunologically based techniques, and in particular surface plasmon resonance (SPR) technology, have been shown to be highly promising bioanalytical tools offering rapid, real-time detection while requiring minimal quantities of toxin standards. A Biacore Q and a prototype multiplex SPR biosensor were evaluated for fitness for purpose for the simultaneous detection of key regulated phycotoxin groups and the emerging toxin palytoxin. The prototype, deemed more applicable because of its separate flow channels, achieved detection limits (IC20) from calibration curves in shellfish of 4,000, 36, 144 and 46 μg/kg of mussel for domoic acid, okadaic acid, saxitoxin and palytoxin, respectively. A one-step extraction procedure demonstrated recoveries greater than 80 % for all toxins. For validation of the method at the 95 % confidence limit, the decision limits (CCα) determined from an extracted matrix curve were calculated to be 450, 36 and 24 μg/kg, and the detection capabilities (CCβ) as a screening method were ≤10 mg/kg, ≤160 μg/kg and ≤400 μg/kg, for domoic acid, okadaic acid and saxitoxin, respectively.
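As a worked illustration of how an IC20 detection limit is read off an SPR inhibition calibration curve, the sketch below fits a four-parameter logistic model and inverts it at 20 % inhibition. This is a minimal sketch: the model choice, concentrations and responses are illustrative assumptions, not data from the study.

```python
# Hypothetical sketch: estimating an IC20 from an SPR inhibition calibration
# curve, assuming a four-parameter logistic (4PL) model. All numbers invented.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, top, bottom, ic50, slope):
    """4PL inhibition curve: response falls from `top` to `bottom`."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** slope)

# Illustrative calibration points (toxin concentration in ug/kg vs. SPR response).
conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
resp = np.array([98, 95, 85, 62, 35, 15, 6], dtype=float)

params, _ = curve_fit(four_pl, conc, resp, p0=[100, 0, 50, 1])
top, bottom, ic50, slope = params

# IC20: concentration giving 20% inhibition, i.e. response = top - 0.2*(top - bottom).
target = top - 0.20 * (top - bottom)
ic20 = ic50 * ((top - bottom) / (target - bottom) - 1.0) ** (1.0 / slope)
print(f"IC20 ~ {ic20:.1f} ug/kg")
```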
Abstract:
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests in particular applied to the analysis of crime factors. The relationship between pairs of factors has also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for greater insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification, and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). The identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
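To make the reduct-search idea concrete, here is a minimal sketch of a particle swarm searching for small attribute subsets (reducts) that keep a toy decision table consistent. It uses a simplified, velocity-free binary update; the table, fitness weighting and swarm constants are invented for illustration and are not the authors' algorithm.

```python
# Hedged sketch: binary particle-swarm search for reducts in a toy decision table.
import random

# Toy decision table: rows of condition attributes + final decision value.
table = [
    (1, 0, 1, 0, 'violent'), (1, 1, 1, 0, 'violent'),
    (0, 0, 1, 1, 'nonviolent'), (0, 1, 0, 1, 'nonviolent'),
    (1, 0, 0, 1, 'violent'), (0, 0, 0, 0, 'nonviolent'),
]
N_ATTRS = 4

def dependency(subset):
    """Fraction of rows whose decision is determined by the chosen attributes."""
    groups = {}
    for row in table:
        key = tuple(row[i] for i in range(N_ATTRS) if subset[i])
        groups.setdefault(key, set()).add(row[-1])
    consistent = sum(1 for row in table
                     if len(groups[tuple(row[i] for i in range(N_ATTRS) if subset[i])]) == 1)
    return consistent / len(table)

def fitness(subset):
    # Reward consistency, penalise subset size (encourages minimal reducts).
    return dependency(subset) - 0.05 * sum(subset)

random.seed(0)
swarm = [[random.randint(0, 1) for _ in range(N_ATTRS)] for _ in range(10)]
best = max(swarm, key=fitness)[:]
for _ in range(50):                       # simplified, velocity-free update
    for p in swarm:
        for i in range(N_ATTRS):          # flip bits, biased toward the global best
            if random.random() < 0.3:
                p[i] = best[i] if random.random() < 0.7 else 1 - p[i]
        if fitness(p) > fitness(best):
            best = p[:]
print("candidate reduct (attribute indices):", [i for i, b in enumerate(best) if b])
```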
Abstract:
In this paper we present a new event recognition framework, based on the Dempster-Shafer theory of evidence, which combines the evidence from multiple atomic events detected by low-level computer vision analytics. The proposed framework employs evidential network modelling of composite events. This approach can effectively handle the uncertainty of the detected events, whilst inferring high-level events that have semantic meaning with high degrees of belief. Our scheme has been comprehensively evaluated against various scenarios that simulate passenger behaviour on public transport platforms such as buses and trains. The average accuracy rate of our method is 81%, compared with 76% for a standard rule-based method.
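The core fusion step in any Dempster-Shafer framework is Dempster's rule of combination, sketched below for two mass functions over atomic-event hypotheses. The frame of discernment and the mass values are illustrative assumptions, not taken from the paper's evidential network.

```python
# Dempster's rule of combination for two mass functions; focal elements
# are frozensets of hypotheses. Example values are illustrative only.
from itertools import product

def dempster_combine(m1, m2):
    """Fuse two mass functions, renormalising away conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two detectors reporting on whether a passenger is 'seated' or 'standing'.
theta = frozenset({'seated', 'standing'})            # frame of discernment
m_video = {frozenset({'seated'}): 0.7, theta: 0.3}
m_pressure = {frozenset({'seated'}): 0.6, frozenset({'standing'}): 0.2, theta: 0.2}
print(dempster_combine(m_video, m_pressure))
```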
Abstract:
A resazurin (Rz) based photocatalyst indicator ink is used to test the activity of a commercial self-cleaning glass, using UV–vis spectroscopy and digital photography to monitor the photocatalyst-driven change in colour of the ink. UV–vis spectroscopy allows the change in film absorbance, ΔAbs, to be monitored as a function of irradiation time, whereas digital photography is used to monitor the concomitant change in the red component of the RGB values, i.e. ΔRGB(red). Initial work reveals that the variations in ΔAbs(t) and ΔRGB(red)(t) as a function of irradiation time, t, are linearly correlated. The rates of change of these parameters are also linearly correlated with the rates of oxidative destruction of stearic acid on self-cleaning glass under different irradiances. This work demonstrates that a measure of the photocatalytic activity of self-cleaning glass, i.e. the time taken to change the colour of an Rz photocatalyst indicator ink, can be obtained using inexpensive digital photography as an alternative to more expensive lab-based techniques, such as UV–vis spectrophotometry.
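As a sketch of the correlation analysis described, the snippet below regresses illustrative ΔRGB(red) values against ΔAbs values. All numbers are invented; in practice the red-channel values would be averaged over pixels of each timed photograph (e.g. with Pillow).

```python
# Illustrative correlation of digital-photography red-channel changes against
# UV-vis absorbance changes for an Rz indicator ink. Data are made up.
import numpy as np

t = np.array([0, 30, 60, 90, 120, 150])                   # irradiation time, s
abs_rz = np.array([0.52, 0.44, 0.36, 0.29, 0.22, 0.15])   # Rz band absorbance
red = np.array([118, 131, 145, 158, 170, 183])            # mean RGB red component

d_abs = abs_rz[0] - abs_rz   # ΔAbs(t)
d_red = red - red[0]         # ΔRGB(red)(t)

slope, intercept = np.polyfit(d_abs, d_red, 1)
r = np.corrcoef(d_abs, d_red)[0, 1]
print(f"ΔRGB(red) ~ {slope:.1f}·ΔAbs + {intercept:.1f}  (r = {r:.3f})")
```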
Abstract:
There is increasing use of the discrete element method (DEM) to study cemented (e.g. concrete and rocks) and sintered particulate materials. The chief advantage of the DEM over continuum-based techniques is that it makes no assumptions about how cracking and fragmentation initiate and propagate, since the DEM system is naturally discontinuous. The ability of the DEM to produce a realistic representation of a cemented granular material depends largely on the implementation of an inter-particle bonded contact model. This paper presents a new bonded contact model based on Timoshenko beam theory, which considers the axial, shear and bending behaviour of the bond. The bond model was first verified by simulating both the bending and the dynamic response of a simply supported beam. The loading response of a concrete cylinder was then investigated and compared with the prediction of the Eurocode equation. The results show significant potential for the new model to produce satisfactory predictions for cementitious materials. A unique feature of this model is that it can also be used to accurately represent many deformable structures, such as frames and shells, so that particles, structures and deformable boundaries can all be described within the same DEM framework.
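To make the bond model concrete, the sketch below assembles the standard Timoshenko beam stiffness terms for a cylindrical bond and turns relative displacement/rotation increments into force/moment increments. The moduli, bond geometry and shear correction factor are illustrative assumptions; the paper's exact implementation may differ.

```python
# Sketch of Timoshenko-beam bond stiffnesses for a cylindrical bond between
# two particles. Material and geometry values are illustrative only.
import math

E, G = 30e9, 12.5e9   # bond Young's and shear moduli (Pa)
r, L = 0.5e-3, 2e-3   # bond radius and length (m)

A = math.pi * r**2                 # cross-sectional area
I = math.pi * r**4 / 4.0           # second moment of area
J = math.pi * r**4 / 2.0           # polar moment of area
kappa = 0.9                        # shear correction factor for a circular section
phi = 12.0 * E * I / (kappa * G * A * L**2)   # Timoshenko shear deformability term

k_axial = E * A / L                                 # axial stiffness
k_shear = 12.0 * E * I / ((1.0 + phi) * L**3)       # transverse stiffness (phi -> 0 gives Euler-Bernoulli)
k_twist = G * J / L                                 # torsional stiffness
k_bend = (4.0 + phi) * E * I / ((1.0 + phi) * L)    # rotational (bending) stiffness

def bond_increments(du_axial, du_shear, dtheta_twist, dtheta_bend):
    """Force/moment increments from relative displacement/rotation increments."""
    return (k_axial * du_axial, k_shear * du_shear,
            k_twist * dtheta_twist, k_bend * dtheta_bend)

print(bond_increments(1e-6, 1e-6, 1e-4, 1e-4))
```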
Abstract:
Complete supervised training algorithms for B-spline neural networks and fuzzy rule-based systems are discussed. By introducing the relationship between B-spline neural networks and certain types of fuzzy models, training algorithms developed initially for neural networks can be adapted for fuzzy systems.
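The link rests on the fact that the basis functions of a B-spline network can serve as fuzzy membership functions. Below is a minimal sketch of the standard Cox-de Boor recursion that constructs them; the knot vector and evaluation point are illustrative choices.

```python
# Cox-de Boor recursion for B-spline basis functions, which double as
# fuzzy membership functions in B-spline network / fuzzy model equivalences.
def bspline_basis(i, k, x, knots):
    """Order-k (degree k-1) B-spline basis N_{i,k}(x)."""
    if k == 1:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + k - 1] != knots[i]:
        left = ((x - knots[i]) / (knots[i + k - 1] - knots[i])
                * bspline_basis(i, k - 1, x, knots))
    if knots[i + k] != knots[i + 1]:
        right = ((knots[i + k] - x) / (knots[i + k] - knots[i + 1])
                 * bspline_basis(i + 1, k - 1, x, knots))
    return left + right

knots = [0, 0, 0, 1, 2, 3, 3, 3]   # clamped knot vector on [0, 3]
memberships = [bspline_basis(i, 3, 1.5, knots) for i in range(5)]
print(memberships)                  # quadratic basis values at x = 1.5; they sum to 1
```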
Abstract:
The objective of this thesis is to study the properties of the resistive switching effect in bistable resistive memories fabricated in the form of Al2O3/polymer diodes, and to contribute to the elucidation of resistive switching mechanisms. Resistive memories were characterized using a variety of electrical techniques, including current-voltage measurements, small-signal impedance, and electrical noise based techniques. All the measurements were carried out over a large temperature range. Fast voltage ramps were used to elucidate the dynamic response of the memory to rapidly varying electric fields. The temperature dependence of the current provided insight into the role of trapped charges in resistive switching. The analysis of fast current fluctuations using electrical noise techniques contributed to the elucidation of the kinetics involved in filament formation/rupture, the filament size, and the corresponding current-carrying capabilities. The results reported in this thesis provide insight into a number of issues, namely: (i) The fundamental limitations on the speed of operation of a bi-layer resistive memory are the time and voltage dependences of the switch-on mechanism. (ii) The results explain the wide spread in switching times reported in the literature and the apparently anomalous behaviour of the high-conductance state, namely the disappearance of the negative differential resistance region at high voltage scan rates, commonly attributed to a “dead time” phenomenon that had remained elusive since it was first reported in the 1960s. (iii) Assuming that the current is filamentary, Comsol simulations were performed and used to explain the observed dynamic properties of the current-voltage characteristics. Furthermore, the simulations suggest that filaments can interact with each other. (iv) The current-voltage characteristics were studied as a function of temperature. The findings indicate that the creation and annihilation of filaments is controlled by the filling and neutralizing of traps localized at the oxide/polymer interface. (v) Resistive switching was also studied in small-molecule OLEDs. It was shown that the degradation that leads to a loss of light output during operation is caused by the presence of a resistive switching layer, and a diagnostic tool that predicts premature failure of OLEDs was devised and proposed. Resistive switching is a property of oxides. Such layers can grow in a number of devices, including organic light-emitting diodes (OLEDs), spin-valve transistors and photovoltaic devices fabricated in different types of material. Under strong electric fields the oxides can undergo dielectric breakdown and become resistive switching layers. Resistive switching strongly modifies charge injection, causing a number of deleterious effects and eventually device failure. In this respect, the findings of this thesis are relevant to understanding reliability issues in devices across a very broad field.
Abstract:
Master's dissertation in Marine Biology, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2015
Abstract:
Doctoral thesis in Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
In the age of e-business, many companies are faced with massive data sets that must be analysed to gain a competitive edge. These data sets are in many instances incomplete and quite often not of very high quality. Although statistical analysis can be used to pre-process these data sets, this technique has its own limitations. In this paper we present a system, and its underlying model, that can be used to test the integrity of existing data and pre-process the data into cleaner data sets to be mined. LH5 is a rule-based system capable of self-learning, and it is illustrated using a medical data set.
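As a purely hypothetical illustration of rule-based integrity testing of this kind (the LH5 system itself, including its self-learning mechanism, is not detailed in the abstract), the sketch below screens records against explicit validity rules; all field names, rules and records are invented.

```python
# Hypothetical rule-based integrity screen for a medical data set.
records = [
    {"age": 34, "sex": "F", "systolic_bp": 128},
    {"age": -2, "sex": "M", "systolic_bp": 115},     # violates the age rule
    {"age": 51, "sex": "X", "systolic_bp": None},    # unknown sex code, missing value
]

rules = [
    ("age in plausible range",     lambda r: r["age"] is not None and 0 <= r["age"] <= 120),
    ("sex is a known code",        lambda r: r["sex"] in {"F", "M"}),
    ("systolic_bp present & sane", lambda r: r["systolic_bp"] is not None
                                             and 60 <= r["systolic_bp"] <= 250),
]

clean, rejected = [], []
for rec in records:
    failures = [name for name, check in rules if not check(rec)]
    (rejected if failures else clean).append((rec, failures))

print(f"{len(clean)} clean record(s); {len(rejected)} flagged for review")
for rec, fails in rejected:
    print(" flagged:", rec, "->", fails)
```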
Abstract:
Fabricating Ge and Si integrated structures with nanoscale accuracy is a challenging pursuit, essential for novel advances in electronics and photonics. While several scanning probe-based techniques have been proposed, no current technique offers control of nanostructure size, shape, placement, and chemical composition. To this end, atomic force microscope (AFM) direct write uses a high electric field (>10⁹ V m⁻¹) to create nanoscale features at speeds of up to 1 cm s⁻¹ by reacting a liquid precursor with a biased AFM tip. In this work, I present the first results on fabricating inorganic nanostructures via AFM direct write. Using diphenylgermane (DPG) and diphenylsilane (DPS), carbon-free germanium and silicon nanostructures are fabricated, as confirmed by SIMS and X-ray PEEM. For this chemistry, I propose a model that involves electron capture and precursor fragmentation under the high electric field; experimental data and simulations are presented to verify this model. High-field chemistry for DPG and DPS has also been demonstrated for both sequential deposition and the creation of nanoscale heterostructures, in addition to microscale deposition using a flexible stamp approach. This high-field-chemistry approach to the deposition of organometallic precursors could offer a low-cost, high-throughput alternative for future optical, electronic, and photovoltaic applications.
Abstract:
According to paragraph 46 of the SNC conceptual framework, financial statements should show a true and fair view of, or present fairly, the financial position, the performance, and the changes in the financial position of an entity. However, the SNC conceptual framework does not directly interpret these concepts, a situation that may be reflected in the quality of the financial reporting presented and disclosed. The Portuguese accounting framework treats as equivalent the notions of true and fair view, present fairly, fairly reflect, and fair reflection, remaining silent on the whole controversy surrounding these expressions. Indeed, the first expression is identified with the European Union, where the presentation and disclosure of financial information is based on a set of principles (principles-based standards) and presupposes the exercise of judgement, whereas the following three expressions convey the financial reporting standard of the United States, underpinned by the approach known as rule-based standards. The SNC's silence on this matter led us to reflect on the subject, presenting a multifaceted analysis of the concept.
Abstract:
Internship report presented to the Escola Superior de Educação de Lisboa to obtain the degree of Master in Teaching of the 1st and 2nd Cycles of Basic Education
Abstract:
With the growing number of mobile platforms available on the market and the constant increase in their computational power, the possibility of running applications, and in particular games with high performance requirements, has increased considerably. The video game market thus has an ever larger number of potential customers. In particular, the massive multiplayer online (MMO) game market has become very attractive to game development companies. These games support a large number of simultaneous players, who may be running the game on different platforms and be distributed across an extensive game "world". To encourage exploration of that "world", points of interest that can be explored by the player are distributed intelligently. This approach demands substantial effort in planning and building those worlds, consuming time and resources during the development phase. This represents a problem for game development companies and, in some cases, it is impractical for indie teams to bear such costs. This thesis presents an approach for creating worlds for MMO games. Several successful MMO games are studied in order to identify common properties of their worlds. The objective is to create a flexible framework capable of generating worlds whose structures respect sets of rules defined by game designers. So that the approach presented here can be used in several different applications, two main modules were developed. The first, called rule-based-map-generator, contains the logic and operations needed to create worlds. The second, called blocker, is a wrapper around the rule-based-map-generator module that manages the communication between server and clients. In short, the overall objective is to provide a framework that eases the generation of worlds for MMO games, normally a very time-consuming process that significantly increases production costs, through a semi-automatic approach combining the benefits of procedural content generation (PCG) with manually created graphical content.
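As an illustration of the kind of rule-driven placement a module like rule-based-map-generator performs, the sketch below places points of interest by rejection sampling under designer-defined rules. The rules and world representation are assumptions for illustration; the module's actual API is not described in the abstract.

```python
# Hypothetical sketch: rule-driven placement of points of interest in a game world.
import random

random.seed(42)
WORLD = 100          # square world of WORLD x WORLD units
MIN_SPACING = 15     # designer rule: points of interest must not cluster

# Each rule takes a candidate point and the points placed so far.
rules = [
    # keep a minimum spacing between points of interest
    lambda p, placed: all((p[0] - q[0])**2 + (p[1] - q[1])**2 >= MIN_SPACING**2
                          for q in placed),
    # keep points away from the world border
    lambda p, placed: 5 <= p[0] <= WORLD - 5 and 5 <= p[1] <= WORLD - 5,
]

pois, attempts = [], 0
while len(pois) < 20 and attempts < 5000:   # rejection sampling under the rules
    attempts += 1
    candidate = (random.uniform(0, WORLD), random.uniform(0, WORLD))
    if all(rule(candidate, pois) for rule in rules):
        pois.append(candidate)

print(f"placed {len(pois)} points of interest in {attempts} attempts")
```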
Abstract:
q-Space-based techniques such as diffusion spectrum imaging, q-ball imaging, and their variations have been used extensively in research for their desired capability to delineate complex neuronal architectures, such as multiple fiber crossings within individual image voxels. The purpose of this article was to provide an introduction to the q-space formalism and the principles of basic q-space techniques, together with a discussion of the advantages of, and the challenges in, translating these techniques into the clinical environment. A review of the q-space-based protocols currently used in clinical research is also provided.
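For orientation, the core of the q-space formalism (standard under the narrow-pulse approximation, and not specific to this article) is the definition of the q vector from the gyromagnetic ratio γ, the gradient pulse duration δ, and the gradient vector G, together with the Fourier relationship between the normalized diffusion signal E(q) and the ensemble-average displacement propagator P̄(R):

```latex
\[
  \mathbf{q} \;=\; \frac{\gamma\,\delta\,\mathbf{G}}{2\pi},
  \qquad
  E(\mathbf{q}) \;=\; \int \bar{P}(\mathbf{R})\,
      e^{\,i\,2\pi\,\mathbf{q}\cdot\mathbf{R}}\;\mathrm{d}\mathbf{R}
\]
```

Sampling E(q) densely over q-space and inverting this Fourier transform is what gives techniques such as diffusion spectrum imaging their ability to resolve crossing fibers within a voxel.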