257 results for authentication


Relevance:

10.00%

Publisher:

Abstract:

Stock planning and management is highly relevant in the business context, enabling an effective response to market fluctuations and, consequently, increasing a company's productivity and competitiveness. This study was carried out at a company in the Portuguese wine sector and aims to study its stock management processes in order to improve its operating results. More specifically, it seeks to draw up a stock management plan so that policies suited to each product can be defined, thereby avoiding stockouts. To achieve these objectives, the following methodology was adopted: (1) analysis of product demand; (2) understanding how demand behaves over the year; (3) definition of the type of planning policy to be adopted for each product group; (4) calculation of the stock quantities to produce and the time interval between production runs; and (5) verification of the operability of the intervention plan so as to improve production planning. The proposed interventions involved implementing stock management policies, namely the reorder point policy and the cyclic review policy, as well as studying the seasonality of sales of the different wine types to facilitate planning of sparkling wine preparation. Although the proposals were not put into practice, their advantages and disadvantages are discussed and improvement proposals presented.
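
As a rough illustration of the two policies proposed (the reorder point policy and the cyclic review policy), here is a minimal sketch using standard textbook formulas; the demand figures, lead times, review period, and service level below are invented for illustration and are not values from the study.

```python
# Hedged sketch of the two stock policies the abstract names, with invented
# example numbers (daily demand, lead time, review period are assumptions).

import math
from statistics import NormalDist

def reorder_point(mean_daily_demand, lead_time_days, sd_daily_demand,
                  service_level=0.95):
    """Reorder point policy: order when stock falls to expected
    lead-time demand plus safety stock."""
    z = NormalDist().inv_cdf(service_level)
    safety_stock = z * sd_daily_demand * math.sqrt(lead_time_days)
    return mean_daily_demand * lead_time_days + safety_stock

def order_up_to_level(mean_daily_demand, review_period_days, lead_time_days,
                      sd_daily_demand, service_level=0.95):
    """Cyclic (periodic) review policy: every R days, order up to a level
    covering demand over the review period plus the lead time."""
    protection = review_period_days + lead_time_days
    z = NormalDist().inv_cdf(service_level)
    return (mean_daily_demand * protection
            + z * sd_daily_demand * math.sqrt(protection))

# Illustrative only: 120 bottles/day, 7-day lead time, sd of 30 bottles/day.
print(reorder_point(120, 7, 30))          # continuous-review trigger level
print(order_up_to_level(120, 14, 7, 30))  # target level for a 14-day cycle
```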

Relevance:

10.00%

Publisher:

Abstract:

The evolution and maturation of Cloud Computing created an opportunity for the emergence of new Cloud applications. High-Performance Computing (HPC), a class of complex problem solving, arises as a new business consumer by taking advantage of what the Cloud offers and leaving behind expensive datacenter management and difficult grid development. Now at an advanced stage of maturity, today's Cloud has shed many of its drawbacks, becoming ever more efficient and widespread. Performance enhancements, price drops due to massification, and customisable on-demand services have drawn marked attention from other markets. HPC, despite being a very well established field, traditionally has a narrow frontier concerning its deployment and runs on dedicated datacenters or large computing grids. The problem with this conventional placement is mainly the initial cost, which not all research labs can afford, and the inability to fully use the resources. The main objective of this work was to investigate new technical solutions to allow the deployment of HPC applications on the Cloud, with particular emphasis on private on-premise resources, the lower end of the chain, which reduces costs. The work includes many experiments and analyses to identify obstacles and technology limitations. The feasibility of the objective was tested with new modelling, a new architecture, and the migration of several applications. The final application integrates a simplified incorporation of both public and private Cloud resources, as well as HPC application scheduling, deployment, and management. It uses a well-defined user-role strategy based on federated authentication, and a seamless procedure for daily usage with a balance of low cost and performance.
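
The abstract does not detail the scheduling logic; as a loose, hypothetical sketch of the placement decision it describes (prefer cheap private on-premise resources, burst to the public cloud when they run out), consider the following. Pool names, capacities, and costs are assumptions.

```python
# Minimal, hypothetical sketch of hybrid-cloud placement for HPC jobs:
# prefer private on-premise nodes (cheapest), burst to the public cloud when
# capacity runs out. Resource names and costs are invented for illustration.

from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    free_cores: int
    cost_per_core_hour: float  # private pools are assumed cheaper

def place(job_cores, pools):
    """Pick the cheapest pool that can host the whole job."""
    for pool in sorted(pools, key=lambda p: p.cost_per_core_hour):
        if pool.free_cores >= job_cores:
            pool.free_cores -= job_cores
            return pool.name
    return None  # queue the job until capacity frees up

pools = [Pool("on-premise", 64, 0.00), Pool("public-cloud", 512, 0.05)]
print(place(48, pools))   # -> on-premise
print(place(128, pools))  # -> public-cloud (private pool exhausted)
```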

Relevance:

10.00%

Publisher:

Abstract:

Seafood fraud, the misrepresentation of seafood products, has been discovered all around the world in different forms, such as false labelling, species substitution, short-weighting, or over-glazing, in order to hide the correct identity, origin, or weight of the product. Given the value of seafood products such as canned tuna, swordfish, or grouper, the commercial fraud affecting these species consists mainly of the replacement of valuable species with others of little or no value. A similar situation occurs with shelled shrimp or shellfish that are reduced into pieces for commercialisation. Food fraud by species substitution is an emerging risk, given the increasingly global food supply chain and the potential food safety issues. Economic food fraud is committed when food is deliberately placed on the market for financial gain, deceiving consumers (Woolfe & Primrose, 2004). As a result of the increased demand and the globalisation of the seafood supply, more fish species are encountered in the market. In this scenario, it becomes essential to identify species unequivocally. Traditional taxonomy, based primarily on identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, which are amplified when fish, crustaceans, or shellfish are commercially processed. Many fish species show a similar texture, so the certification of fish products is particularly important when fish have undergone procedures which affect the overall anatomical structure, such as heading, slicing, or filleting (Marko et al., 2004). The absence of morphological traits, the main characteristic usually used to identify animal species, represents a challenge, and molecular identification methods are required. Among them, DNA-based methods are the most frequently employed for food authentication (Lockley & Bardsley, 2000). In addition to food authentication and traceability, studies of taxonomy, population and conservation genetics, as well as analyses of dietary habits and prey selection, also rely on genetic analyses including DNA barcoding (Arroyave & Stiassny, 2014; Galimberti et al., 2013; Mafra, Ferreira, & Oliveira, 2008; Nicolé et al., 2012; Rasmussen & Morrissey, 2008), which consists of PCR amplification and sequencing of a specific region of the mitochondrial COI gene. The system proposed by Hebert et al. (2003) locates within the mitochondrial COI (cytochrome c oxidase subunit I) gene a bio-identification marker useful for the taxonomic identification of species (Lo Brutto et al., 2007). The COI region used for genetic identification, the DNA barcode, is short enough that current technology can decode its sequence (the nucleotide base pairs) in a single step. Although this region represents only a tiny fraction of the mitochondrial DNA content of each cell, it has sufficient variability to distinguish the majority of species (Biondo et al., 2016). This technique has already been employed to address the demand for assessing the actual identity and/or provenance of marketed products, as well as to unmask mislabelling and fraudulent substitutions, which are difficult to detect especially in processed seafood (Barbuto et al., 2010; Galimberti et al., 2013; Filonzi, Chiesa, Vaghi, & Nonnis Marzano, 2010).
Nowadays, research concerns the use of genetic markers to identify not only the species and/or varieties of fish, but also molecular characters able to trace the origin and to provide an effective control tool for producers and consumers along the supply chain, in agreement with local regulations.
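
As a loose illustration of the DNA-barcoding step described above (matching a query COI sequence against a reference library), here is a minimal sketch based on simple pairwise identity; real pipelines use alignment tools such as BLAST, and the sequences and the species-level threshold below are invented for illustration.

```python
# Hedged sketch of DNA-barcode matching: compare a query COI fragment
# against reference barcodes by simple pairwise identity and report the
# best match. Sequences and the identity threshold are invented examples.

def identity(a: str, b: str) -> float:
    """Fraction of matching positions over the equal-length region."""
    n = min(len(a), len(b))
    return sum(a[i] == b[i] for i in range(n)) / n

reference = {  # hypothetical reference library of COI fragments
    "Thunnus albacares": "ACTTGGTGCATGAGCAGGAATAGT",
    "Xiphias gladius":   "ACTCGGCGCATGAGCTGGAATAGT",
}

def assign(query: str, threshold: float = 0.98):
    """Return the best-matching species if identity clears the threshold."""
    best = max(reference, key=lambda sp: identity(query, reference[sp]))
    score = identity(query, reference[best])
    return (best, score) if score >= threshold else (None, score)

print(assign("ACTTGGTGCATGAGCAGGAATAGT"))  # -> ('Thunnus albacares', 1.0)
```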

Relevance:

10.00%

Publisher:

Abstract:

Recent years have seen an astronomical rise in SQL Injection Attacks (SQLIAs) used to compromise the confidentiality, authentication, and integrity of organisations' databases. Intruders becoming smarter at obfuscating web requests to evade detection, combined with increasing volumes of web traffic from the Internet of Things (IoT), cloud-hosted, and on-premise business applications, have made it evident that the existing, mostly static-signature approaches lack the ability to cope with novel signatures. A SQLIA detection and prevention solution can be achieved by exploring an alternative bio-inspired supervised learning approach that takes as input a labelled dataset of numerical attributes for classifying true positives and negatives. We present in this paper Numerical Encoding to Tame SQLIA (NETSQLIA), a proof of concept for scalable numerical encoding of features into dataset attributes with a labelled class, obtained from deep web-traffic analysis. For the numerical attribute encoding, the model leverages a proxy for the interception and decryption of web traffic. The intercepted web requests are then assembled for front-end SQL parsing and pattern matching by applying a traditional Non-Deterministic Finite Automaton (NFA). This paper presents a technique for extracting numerical attributes of any size, primed as an input dataset to an Artificial Neural Network (ANN) and to statistical Machine Learning (ML) algorithms, implemented using a Two-Class Averaged Perceptron (TCAP) and Two-Class Logistic Regression (TCLR) respectively. This methodology then forms the subject of an empirical evaluation of the model's suitability for the accurate classification of both legitimate web requests and SQLIA payloads.
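
The paper's exact feature set is not reproduced here; as a hedged sketch of the general idea (numerically encoding web requests and training a two-class logistic regression, analogous to the TCLR step), consider the following. The token list and the toy training data are assumptions.

```python
# Hedged sketch (not NETSQLIA's actual encoding, which the abstract does not
# detail): turn raw request strings into numerical attributes and train a
# two-class logistic regression, analogous to the paper's TCLR step.

from sklearn.linear_model import LogisticRegression

TOKENS = ("union", "select", "or 1=1", "--", "/*", ";", "'")

def encode(request: str):
    """Simple numerical features: request length plus token counts."""
    low = request.lower()
    return [len(request)] + [low.count(t) for t in TOKENS]

train = [
    ("id=42&sort=name", 0),                         # legitimate
    ("id=1' OR 1=1 --", 1),                         # SQLIA payload
    ("q=books&page=2", 0),
    ("u=a' UNION SELECT user,pass FROM t--", 1),
]
X = [encode(r) for r, _ in train]
y = [label for _, label in train]

clf = LogisticRegression().fit(X, y)
print(clf.predict([encode("id=7' OR 1=1 --")]))    # expected: [1] (SQLIA)
```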

Relevance:

10.00%

Publisher:

Abstract:

This thesis concerns the study, implementation, and development of the infrastructure for the SEMCABO-WIFI project. The project was carried out at the company SemCabo, with the objective of bringing the Internet to customers in the form of hotspots or last-mile access to residents' homes. The project also gave rise to the company itself, which has operated as an ISP since September 2007. In a first phase, Authentication, Authorization and Accounting (AAA) systems for wireless ISPs (WISPs) are surveyed, and possible commercial and open-source solutions are presented. Next, the company SemCabo is presented in its commercial and technological aspects. The SemCabo base system is then described, covering the supporting technology, the network, active equipment, Wi-Fi signal emission modules, security, monitoring, and the authentication portal. The base servers are covered afterwards, with a presentation of all the servers that support the project, including some configuration details. Finally, the equipment and systems used for network access control (NAS) are presented, and their configuration details are likewise described.
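
The thesis's actual server configurations are not reproduced here; as a generic, hypothetical illustration of the AAA flow a hotspot portal performs (authenticate the subscriber, authorize a service profile, account for the session), consider the following sketch. All usernames, secrets, and profiles are invented.

```python
# Generic, hypothetical AAA flow for a Wi-Fi hotspot portal: authenticate a
# user, authorize a bandwidth profile, and account for the session. This is
# an illustration of the concept, not SemCabo's actual configuration.

import hashlib, secrets, time

SALT = b"salt1"  # illustrative fixed salt; real systems use per-user salts
USERS = {  # hypothetical subscriber database: user -> (salted hash, profile)
    "alice": (hashlib.sha256(SALT + b"wifi-pass").hexdigest(), "residential-10M"),
}
PROFILES = {"residential-10M": {"down_kbps": 10_000, "up_kbps": 1_000}}
SESSIONS = {}

def authenticate(user, password):
    entry = USERS.get(user)
    return (entry is not None and
            hashlib.sha256(SALT + password.encode()).hexdigest() == entry[0])

def authorize(user):
    return PROFILES[USERS[user][1]]   # bandwidth limits pushed to the NAS

def start_accounting(user):
    sid = secrets.token_hex(8)        # session id logged for billing/audit
    SESSIONS[sid] = {"user": user, "start": time.time(), "bytes": 0}
    return sid

if authenticate("alice", "wifi-pass"):
    print(authorize("alice"), start_accounting("alice"))
```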

Relevance:

10.00%

Publisher:

Abstract:

Master's dissertation — Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2016.

Relevance:

10.00%

Publisher:

Abstract:

This study examined variations in Fulton's condition factor, chemical composition, and stable isotopes of carbon and nitrogen in the Brazilian freshwater fish cachara (Pseudoplatystoma fasciatum), comparing farmed and wild fish in different seasons. Values for energy, protein, moisture, and Fulton's condition factor were higher for farmed than for wild fish in the rainy season, indicating better nutritional quality; however, these differences were not observed in the dry season. Likewise, we found significant enrichment of δ15N in farmed fish in the rainy season but not in the dry season, whereas enrichment of δ13C was observed in both seasons. The combined measurement of δ13C and δ15N provided traceability under all conditions. Our findings show that stable isotope analysis of C and N can be used to trace cachara origin, and that seasonal variations need to be considered when applying chemical and isotopic authentication of fish and fish products.
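
For reference, the two quantitative measures the study relies on have standard definitions in the literature (reproduced here as general background, not from the paper itself): Fulton's condition factor and the delta notation for stable isotope ratios.

```latex
% Fulton's condition factor: body weight W (g) relative to cubed length L (cm)
K = 100 \, \frac{W}{L^{3}}

% Delta notation for stable isotopes (values in per mil), where R is the
% heavy-to-light isotope ratio (e.g. ^{15}N/^{14}N or ^{13}C/^{12}C):
\delta X = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000
```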

Relevance:

10.00%

Publisher:

Abstract:

This research aims to examine the role of Change.org as an electronic petition platform in Spain, where no alternatives administered by the public authorities exist. Through qualitative content analysis and a semi-structured interview, we investigate the site's business model, with the goal of understanding its data protection policy, its user verification system, and, more generally, the legislative framework in which it operates. The results show a project far removed from the Spanish right of petition, with a lax verification system, which bases its profits on cost per acquisition.

Relevance:

10.00%

Publisher:

Abstract:

The one-class support vector machine is an unsupervised algorithm capable of learning a decision function from data of a single class for anomaly detection. Given training data from a single class, it can determine whether a new sample is similar to the training set. In this thesis, we study keystroke-dynamics pattern recognition with the one-class support vector machine, for the authentication of students in a remote summative assessment system at Université Laval. Since every student at Université Laval has a short, unique identifier used for all secure access to computing resources, we chose this character string as the input for capturing users' keystroke dynamics and building our own database. After training a model for each student on their keystroke-dynamics data, we want to be able to identify them and possibly detect impostors. Three classification methods were tested and discussed, allowing us to observe the weaknesses of each method in this system. Evaluation of the recognition rates highlighted their dependence on the number of signatures as well as on the number of characters used to build the signatures. Finally, we showed that there are correlations between the recognition rate and the dispersion of the distributions of the keystroke-dynamics signature features.
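
A minimal sketch of the core idea, assuming per-key dwell times and inter-key flight times as features (the thesis's exact feature set, kernel, and parameters are not given here; all numbers below are invented):

```python
# Hedged sketch of one-class SVM keystroke authentication: train on one
# student's timing vectors, then score new samples as genuine (+1) or
# anomalous (-1). Feature layout and parameters are invented for illustration.

import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Hypothetical features: per-key dwell times and inter-key flight times (ms)
# for the student's short login identifier, one vector per login attempt.
genuine = rng.normal(loc=[90, 110, 85, 130, 95, 70], scale=8, size=(40, 6))
impostor = rng.normal(loc=[140, 80, 120, 90, 150, 60], scale=8, size=(5, 6))

model = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(genuine)

print(model.predict(genuine[:3]))   # mostly +1: accepted as the student
print(model.predict(impostor))      # mostly -1: flagged as impostors
```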

Relevance:

10.00%

Publisher:

Abstract:

Every year, global music piracy causes billions of dollars in economic losses, job losses, and lost worker earnings, as well as the loss of millions of dollars in tax revenue. Most music piracy is due to the rapid growth and ease of current technologies for copying, sharing, manipulating, and distributing musical data [Domingo, 2015], [Siwek, 2007]. Watermarking of audio signals has been proposed to protect authors' rights and to allow the localisation of the instants at which an audio signal has been tampered with. In this thesis, we propose to use the bio-inspired sparse spike-train representation (the spikegram) to design a new method for localising tampering in audio signals, a new copyright-protection method, and, finally, a new perceptual attack, using the spikegram, against audio watermarking systems.

We first propose a technique for localising tampering in audio signals. To this end, we combine a modified spread spectrum (MSS) method with a sparse representation. We use an adapted perceptual matching pursuit technique (PMP [Hossein Najaf-Zadeh, 2008]) to generate a sparse representation (spikegram) of the input audio signal that is invariant to time shifts [E. C. Smith, 2006] and that takes into account masking phenomena as observed in hearing. An authentication code is embedded within the coefficients of the spikegram representation, which are then combined with the masking thresholds. The watermarked signal is resynthesised from the modified coefficients, and the resulting signal is transmitted to the decoder. At the decoder, to identify a tampered segment of the audio signal, the authentication codes of all intact segments are analysed: if the codes cannot be detected correctly, the segment is known to have been tampered with. We propose to watermark according to the spread-spectrum principle (the MSS) in order to obtain a high capacity in the number of embedded watermark bits. In situations where the encoder and decoder are desynchronised, our method can still detect tampered pieces. Compared with the state of the art, our approach has the lowest error rate in detecting tampered pieces. We used the mean opinion score (MOS) test to measure the quality of the watermarked signals, and we evaluate the semi-fragile watermarking method by the bit error rate (number of erroneous bits divided by all submitted bits) under several attacks. The results confirm the superiority of our approach for localising tampered pieces in audio signals while preserving signal quality.

We then propose a new technique for protecting audio signals, based on the spikegram representation and using two dictionaries (TDA, Two-Dictionary Approach). The spikegram is used to code the host signal with a dictionary of gammatone filters. For watermarking, we use two different dictionaries, selected according to the input bit to be embedded and the signal content. Our approach finds the appropriate gammatones (called watermark kernels) on the basis of the value of the bit to embed, and embeds the watermark bits in the phase of the watermark gammatones. Moreover, the TDA is shown to be error-free in the absence of any attack, and the decorrelation of the watermark kernels is shown to enable the design of a very robust audio watermarking method. Experiments showed the best robustness for the proposed method when the watermarked signal is corrupted by 32 kbps MP3 compression, with a payload of 56.5 bps, compared with several recent techniques. We also studied the robustness of the watermark when the new USAC (Unified Speech and Audio Coding) codec at 24 kbps is used; the payload is then between 5 and 15 bps. Finally, we use spikegrams to propose three new attack methods, which we compare with recent attacks such as 32 kbps MP3 and 24 kbps USAC. These attacks comprise the PMP attack, the inaudible-noise attack, and the sparse replacement attack. In the PMP attack, the watermarked signal is represented and resynthesised with a spikegram. In the inaudible-noise attack, inaudible noise is generated and added to the spikegram coefficients. In the sparse replacement attack, within each signal segment the spectro-temporal features of the signal (the time spikes) are found using the spikegram, and similar time spikes are replaced with one another. To compare the effectiveness of the proposed attacks, we apply them to a spread-spectrum watermark decoder. The sparse replacement attack is shown to reduce the normalised correlation of the spread-spectrum decoder by a greater factor than when the decoder is attacked by 32 kbps MP3 or 24 kbps USAC transcoding.
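
As background to the spread-spectrum principle the thesis builds on (this is the textbook scheme, not the thesis's MSS/spikegram method itself), a minimal sketch follows; the frame length, embedding strength, and detection threshold are invented for illustration.

```python
# Hedged sketch of basic spread-spectrum watermarking: embed a bit by adding
# a scaled pseudo-random pattern, detect it by normalised correlation with
# the secret pattern. All parameters are invented for illustration.

import numpy as np

rng = np.random.default_rng(42)
N, alpha = 4096, 0.05                        # frame length, embed strength

pattern = rng.choice([-1.0, 1.0], size=N)    # secret spreading sequence
host = rng.normal(0.0, 0.2, size=N)          # stand-in for an audio frame

def embed(x, bit):
    """Add the pattern with a sign carrying the watermark bit."""
    return x + alpha * (1.0 if bit else -1.0) * pattern

def detect(y, threshold=0.1):
    """Decode by normalised correlation with the secret pattern."""
    corr = float(np.dot(y, pattern)) / (np.linalg.norm(y) * np.linalg.norm(pattern))
    if abs(corr) < threshold:
        return None, corr                    # no watermark detected
    return corr > 0, corr                    # sign of corr decodes the bit

print(detect(embed(host, True)))             # -> (True, corr well above 0)
print(detect(host))                          # -> (None, corr near 0)
```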

Relevance:

10.00%

Publisher:

Abstract:

In modern society, the use of new technologies and the corresponding computer applications raises several questions, the most important being, without doubt, the security of users and systems. The implementation of new processes using the available computing resources allows productivity gains and process simplification without loss of reliability, through dematerialisation, debureaucratisation, accessibility, speed of execution, convenience, and security. The introduction of the Portuguese citizen card, with all its potential, contributes to the implementation of the processes mentioned above, which have accompanied the evolution of the national and European legislative framework. Nevertheless, some gaps remain owing to its immaturity, and its development is still an ongoing process. The aim of this work is to create a solution open to several applications, which allows processes to be optimised through simplification, making use of digital signatures, authentication, and personal data, while taking into account the legislation in force, the current citizen card, and the applicable security requirements.
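
As a generic illustration of the signature verification step such an application performs, here is a sketch using the Python cryptography package with a throwaway RSA key; a real citizen-card application would obtain the signature from the card's middleware and the public key from the card's certificate.

```python
# Generic sketch of digital-signature sign/verify with the `cryptography`
# package. A real citizen-card application would obtain the signature from
# the card's middleware and the public key from its certificate; here we
# generate a throwaway RSA key purely for illustration.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

document = b"example payload to be signed"

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = key.sign(document, padding.PKCS1v15(), hashes.SHA256())

try:
    key.public_key().verify(signature, document,
                            padding.PKCS1v15(), hashes.SHA256())
    print("signature valid: document is authentic and unmodified")
except InvalidSignature:
    print("signature check failed: document or signature was altered")
```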

Relevance:

10.00%

Publisher:

Abstract:

Colourants are substances used to change the colour of something, and are classified into three typologies: a) pigments, b) dyes, and c) lakes and hybrid pigments. Their identification is very important when studying cultural heritage; it gives information about the artistic technique, can help in dating, and offers insights into the condition of the object. Moreover, the study of degradation phenomena provides a framework for preventive conservation strategies, provides evidence of the object's original appearance, and contributes to the authentication of works of art. However, the complexity of these systems makes it impossible to achieve a complete understanding using a single technique, necessitating a multi-analytical approach. This work focuses on the set-up and application of advanced spectroscopic methods for the study of colourants in cultural heritage. The first chapter presents the identification of modern synthetic organic pigments using Metal Underlayer-ATR (MU-ATR), and the characterisation of synthetic dyes extracted from wool fibres using Thin Layer Chromatography (TLC) coupled to MU-ATR with AgI@Au plates. The second chapter presents the study of the effect of metallic Ag in the photo-oxidation process of orpiment, and the influence of different factors such as light and relative humidity. We used a combination of vibrational and synchrotron radiation-based X-ray microspectroscopy techniques: µ-ATR-FT-IR, µ-Raman, SR-µ-XRF, µ-XANES at the S K-, Ag L3-, and As K-edges, and SR-µ-XRD. The third chapter presents the study of metal carboxylates in paintings, specifically the formation of Zn and Pb carboxylates in three different binders: stand linseed oil, whole egg, and beeswax. We used micro-ATR-FT-IR, macro FT-IR in total reflection (rMA-FT-IR), portable Near-Infrared spectroscopy (NIR), macro X-ray Powder Diffraction (MA-XRPD), XRPD, and Gas Chromatography-Mass Spectrometry (GC-MS). For data processing, we explored the rMA-FT-IR and NIR data with Principal Component Analysis (PCA).
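
As a hedged sketch of the PCA exploration mentioned for the rMA-FT-IR and NIR data (synthetic stand-in spectra, not the study's data), consider the following:

```python
# Hedged sketch of the PCA step mentioned for the rMA-FT-IR/NIR data: project
# spectra onto a few principal components to explore grouping by binder. The
# spectra here are synthetic stand-ins; the real data are not reproduced.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_points = 500                       # hypothetical number of spectral points

def fake_spectra(center, n=10):
    """Synthetic spectra: one Gaussian band plus noise (illustration only)."""
    x = np.arange(n_points)
    band = np.exp(-((x - center) ** 2) / (2 * 30 ** 2))
    return band + rng.normal(0, 0.02, size=(n, n_points))

spectra = np.vstack([fake_spectra(180), fake_spectra(320)])  # two "binders"

scores = PCA(n_components=2).fit_transform(
    StandardScaler().fit_transform(spectra))
print(scores[:3])   # PC1/PC2 scores: the two groups separate along PC1
```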

Relevance:

10.00%

Publisher:

Abstract:

Honey bees are considered a keystone species in ecosystems. The effect of pesticides harmful to honey bees and the action of extreme climatic events, with their consequences for honey bee health, can cause the loss of many colonies, which in turn reduces the effective population size and incentivises the use of non-autochthonous queens to replace dead colonies. Over the last decades, the use of non-ligustica bee subspecies in Italy has increased and, together with the phenomena mentioned above, has exposed native honey bees to hybridisation, leading to dramatic genetic erosion and admixture. Healthy genetic diversity within honey bee populations is critical to provide tolerance and resistance to current and future threats. It is now urgent to design strategies for the conservation of local subspecies and their valorisation on a productive scale. In this thesis, genomic tools applied to the analysis of the genetic diversity and genomic integrity of honey bee populations in Italy are described. mtDNA-based methods are presented that use honey bee DNA or honey eDNA as a source of information on the genetic diversity of A. mellifera at different levels. Taken together, the results derived from these studies should enlarge the knowledge of the genetic diversity and integrity of the honey bee populations in Italy, filling the information gap necessary to design efficient conservation programmes. Furthermore, the methods presented in these works will provide a tool for honey authentication, sustaining and valorising beekeeping products and the sector against fraud.