Development and validation of gold nanoprobes for human SNP detection towards commercial application
Abstract:
Conventional molecular techniques for the detection and characterization of relevant nucleic acid (i.e. DNA) sequences are currently cumbersome, expensive and offer limited portability. The main objective of this dissertation was the optimization and validation of a fast, low-cost colorimetric nanodiagnostic methodology for the detection of single nucleotide polymorphisms (SNPs). This was done considering SNPs associated with obesity that are of commercial interest to STAB VIDA, with subsequent evaluation of other clinically relevant targets. The methodology was also integrated into a microfluidic platform aimed at portability and application at points-of-care (POC). To pursue these objectives, the experimental work was divided into four sections: i) genetic association of SNPs with obesity in the Portuguese population; ii) optimization and validation of the non-cross-linking approach for complete genotype characterization of these SNPs; iii) incorporation into a microfluidic platform; and iv) translation to other commercially relevant targets. Carriers of FTO dbSNP rs#:9939609 had higher body mass index (BMI), total body fat mass and waist circumference, and a 2.5 times higher risk of obesity. AuNPs functionalized with thiolated oligonucleotides (Au-nanoprobes) were used via the non-cross-linking approach to validate a diagnostic method against the gold-standard technique, Sanger sequencing, with high sensitivity (87.50%) and specificity (91.67%). A proof-of-concept POC microfluidic device was assembled towards incorporation of the molecular detection strategy. In conclusion, a framework was successfully developed and validated for the detection of SNPs of commercial interest to STAB VIDA, towards future translation into a POC device.
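The sensitivity and specificity figures reported above follow the standard confusion-matrix definitions. A minimal sketch, using hypothetical true/false positive and negative counts chosen only so that they reproduce the reported percentages (the abstract does not give the actual counts):

```python
# Sensitivity = TP / (TP + FN); Specificity = TN / (TN + FP).
# The counts below are illustrative assumptions, picked to match the
# 87.50% sensitivity and 91.67% specificity reported in the abstract.

def sensitivity(tp, fn):
    """Fraction of true carriers correctly detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-carriers correctly classified."""
    return tn / (tn + fp)

tp, fn = 14, 2   # 14/16 = 87.50%
tn, fp = 11, 1   # 11/12 = 91.67% (rounded)

print(f"sensitivity = {sensitivity(tp, fn):.2%}")
print(f"specificity = {specificity(tn, fp):.2%}")
```

Any pair of counts with the same ratios would yield the same percentages; the point is only how the two metrics are computed against the Sanger-sequencing reference.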
Abstract:
The food industry has a basic duty to ensure the quality of its products and to guarantee food safety. For the production process to be effective, it must be controlled and optimized, and it is essential to identify problems, assess their origin and find adequate solutions. José Maria da Fonseca is one of the oldest companies in the wine sector; its mission is to produce and market wines of recognized quality. Focused on the quality of its products, the company is aware that it is essential to invest in the continuous identification of problems in the production process that may lead to opportunities for improvement. The objective of this work was to identify the causes of glass breakage in the bottling area, identify the root cause, and assess the impact that glass particles resulting from such breakage, especially breakage with explosion, may have on product safety for the consumer. On the bottling line, bottle breakage can occur at the depalletizer, the washing machine, the filling machine, and the corking or screw-capping (Pilfer-Proof) machine. Breakages at the depalletizer are not considered, since the bottles still go through a washing process intended to eliminate foreign objects inside the bottle, and this equipment is also validated. After the corking or Pilfer-Proof capping machine, the product is sealed and free from physical contamination, so the whole production process must be controlled, with appropriate monitoring measures and corrective actions to prevent contamination. In this context, the study was carried out on production lines 1, 2 and 3, since these show bottle breakages with glass explosion, which is the main cause of contamination of the product and the equipment.
This study identifies and explains the various causes of bottle breakage on the production line and identifies the root cause of the problem. Since the problem cannot be avoided at its root cause, the proposed improvements target the corrective actions carried out at the filling machine, in order to prevent product contamination and guarantee consumer safety.
Abstract:
The global and increasingly technological society requires States to adopt security measures that maintain the balance between freedom, on the one hand, and security and respect for the fundamental rights of a democratic state, on the other. A State can only achieve this aim if it has an effective judicial system and, in particular, a criminal procedure adequate to the new criminogenic realities. In this context, the national legislator has adopted, following other international legal systems, special means of obtaining evidence that are more restrictive of rights. Among those special means are covert actions which, although to be used sparingly, are a key element in the fight against violent and highly organized crime. The undercover agent, voluntary by nature, develops a set of activities that enables the investigation to use other means of taking evidence and/or to carry out probationary diligences itself, with the purpose of providing sufficient proof to the case file. In this milieu, given the high risks involved during the investigation, as well as after its completion, the undercover agent may act under a fictitious identity, and this measure can be maintained during the evidentiary phase of the trial. Similarly, given the latent threat the undercover agent faces due to his inclusion in criminal organizations, as well as the need for his inclusion in future covert actions, it is crucial that his participation as a witness at trial be properly shielded. Thus, when the undercover agent exceptionally provides statements at trial, he shall always do so through videoconference with voice and image distortion. This measure guarantees the anonymity of the undercover agent and, concomitantly, ensures that the adversarial principle and the right of the accused to a fair trial are not prejudiced since, in those circumstances, the diligence will be supervised in its entirety (in the courtroom and with the undercover agent) by a judge.
Abstract:
This Thesis aims to highlight and reflect upon the undoubted relationship between the Internet and the Justice System, based on the issue of digital evidence. The simultaneous crossing of juridical-legal implications and more technical computer issues is the actual trigger for the discussion of the issues established. The Convention on Cybercrime of the Council of Europe of 23rd November 2001 and the Council Framework Decision n.º 2005/222/JHA of 24th February 2005 were avant-garde in terms of international work on crimes in the digital environment. In addition, they enabled the harmonization of national legislation on the matter and, consequently, greater flexibility in international judicial cooperation. Portugal, in compliance with these international instruments, ratified and implemented them and approved Law n.º 109/2009 of 15th September, the Cybercrime Act, establishing more specific rules for the investigation and collection of evidence in electronic form when combating this type of crime, thereby reinforcing both substantive criminal law and procedural law. The constant debates about the New Technologies of Information and Communication have not neglected the positive role of these tools for the user; however, they express a particular concern about their counterproductive effects, and special caution prevails on the part of the judge in assessing digital evidence, especially circumstantial evidence, due to its fragility. Indisputably, the practice of crimes through the computer universe, given its inexorable technical complexity, entails many difficulties for forensic investigation, since the proofs are temporary, changeable, volatile and dispersed. In this context, after the consummation of the iter criminis, the Fundamental Rights of the suspects may be at stake in the course of the investigation and the construction of the iter probatorium.
This Thesis intends to contribute, in a reflective way, to the issues presented, in order to achieve greater technical and legal awareness regarding the collection of digital evidence, seeking a lighter approach to its suitability in terms of evidentiary value.
Abstract:
This Report aims to present the work carried out during the internship, within the non-teaching component of the master's degree in Educational Sciences, under a protocol between the Faculdade de Ciências Sociais e Humanas (FCSH) of the Universidade Nova de Lisboa (UNL) and the Escola Profissional Gustave Eiffel (EPGE), under the scientific supervision of Professor Doutor Luís Manuel Bernardo (FCSH) and the practical supervision of Mestre Maria Goreti Freitas (EPGE). The internship lasted three months, for a total of 400 hours. The internship practices are described, grounded in the knowledge acquired in the taught components of the master's degree in Educational Sciences. The report is structured in three chapters, ending with a reflection on how productive the internship was for our learning. The first chapter justifies the choice of the report's theme and characterizes the school and its administrative office, supported by a literature review and by interviews conducted during the internship. This chapter highlights the fact that the internship allowed interaction with the whole educational community, in its respective differences, at different levels and in different situations. The second chapter contextualizes educational management platforms and their importance for managing the educational community, highlighting in particular the role of management platforms in education as a link between the administrative office and the whole educational community. The third chapter describes the main activities carried out during the curricular internship. Finally, a critical reflection on the internship process is presented. The work refers throughout to annexed documents, which provide the complementary documentation. Some of these documents are original, namely the interviews conducted during the internship.
Abstract:
The use of stem cells is a promising therapeutic approach to the substantial challenge of regenerating cartilage. Considering the two prerequisites, namely the use of a 3D system to enable chondrogenic differentiation and growth factors to avoid dedifferentiation, the diffusion efficiency of essential biomolecules is an intrinsic issue. We have already proposed a liquified bioencapsulation system containing solid microparticles as cell adhesion sites [1]. Here, we intend to use the optimized system for chondrogenic differentiation by encapsulating stem cells and collagenII-TGF-β3 PLLA microparticles. As a proof of concept, magnetite nanoparticles were incorporated into the multilayered membrane. This can be a great advantage after implantation procedures, allowing the capsules to be fixated in situ with the help of an external magnetic patch and followed up through imaging. Results showed that the production of glycosaminoglycans and the expression of cartilage-relevant markers (collagen II, Sox9, aggrecan, and COMP) increased up to 28 days, while hypertrophic (collagen X) and fibrotic (collagen I) markers were downregulated. The presence of nanofibers in the newly deposited ECM, resembling the collagen fibrils of native cartilage, was visualized by SEM. The presence of the major constituent of cartilage, collagen II, was detected by immunocytochemistry, and safranin-O and alcian blue stainings revealed a basophilic ECM deposition, which is characteristic of neocartilage. These findings suggest that the proposed system may provide a suitable environment for chondrogenic differentiation.
Abstract:
Cell encapsulation within hydrogel microspheres shows great promise in the field of tissue engineering and regenerative medicine (TERM). However, assembling microspheres as building blocks to produce complex tissues is a difficult task because of the inability to position them along length scales in space. We propose a proof-of-concept strategy to produce 3D constructs using cell-encapsulating microspheres as building blocks via a perfusion-based LbL technique. This technique exploits the "binding" potential of multilayers beyond their coating role.
Abstract:
Author proof
Abstract:
Immune systems have in recent years inspired approaches to several computational problems. This paper focuses on enhancing the accuracy of behavioural biometric authentication algorithms by applying them more than once, with different thresholds, in order to first simulate the protection provided by the skin and then look for known outside entities, as lymphocytes do. The paper describes the principles that support the application of this approach to Keystroke Dynamics, a biometric authentication technology that decides on the legitimacy of a user based on the typing pattern captured as he enters the username and/or the password. As a proof of concept, the accuracy levels of one keystroke dynamics algorithm applied to five legitimate users of a system are calculated for both the traditional and the immune-inspired approaches, and the results are compared.
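The two-pass idea described above can be sketched as follows. This is only an illustrative Python sketch of the general scheme, not the paper's algorithm: the distance function, feature representation and threshold values are all assumptions made for the example. A loose outer threshold plays the role of the skin (a cheap first barrier), and a stricter inner threshold plays the role of lymphocytes inspecting whatever passes through.

```python
# Hedged sketch of an immune-inspired two-threshold keystroke check.
# All names and values below are illustrative, not from the paper.

def keystroke_distance(template, sample):
    """Mean absolute difference between enrolled and observed hold times (ms)."""
    return sum(abs(t - s) for t, s in zip(template, sample)) / len(template)

def authenticate(template, sample, skin_threshold=40.0, lymphocyte_threshold=15.0):
    d = keystroke_distance(template, sample)
    if d > skin_threshold:             # "skin": reject obvious impostors outright
        return False
    return d <= lymphocyte_threshold   # "lymphocyte": stricter second scrutiny

template = [105.0, 98.0, 112.0, 101.0]   # enrolled key-hold times (ms)
genuine  = [110.0, 95.0, 115.0, 99.0]    # close to the template
impostor = [160.0, 140.0, 170.0, 150.0]  # far from the template

print(authenticate(template, genuine))   # True
print(authenticate(template, impostor))  # False
```

In a real evaluation the two thresholds would be tuned separately, since the outer one governs how many samples ever reach the stricter check, which is what lets the combined scheme trade false acceptances against false rejections differently from a single-threshold classifier.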
Abstract:
Master's dissertation in Sustainable Construction and Rehabilitation
Abstract:
Doctoral thesis in Geography (Specialization Area: Geography and Regional Planning)
Abstract:
Master's dissertation in Tax and Fiscal Law
Abstract:
Professional activity report for the master's degree in Judiciary Law
Abstract:
Master's dissertation in Tax and Fiscal Law
Abstract:
This paper tries to remove what seem to be the remaining stumbling blocks on the way to a full understanding of the Curry-Howard isomorphism for sequent calculus, namely the questions: What do variables in proof terms stand for? What are co-control and a co-continuation? How can the dual of Parigot's mu-operator be defined so that it is a co-control operator? Answering these questions leads to the interpretation that sequent calculus is a formal vector notation with first-class co-control. But this is just the "internal" interpretation, which has to be developed simultaneously with, and is justified by, an "external" one, offered by natural deduction: the sequent calculus corresponds to a bi-directional computational lambda-calculus that is agnostic with respect to the call strategy. Next, the duality between control and co-control is studied and proved in the context of classical logic, where one discovers that the classical sequent calculus has a distortion towards control, and that sequent calculus is the de Morgan dual of natural deduction.