983 results for modern techniques
Abstract:
Standards have been established to regulate the microbial and preservative contents of foods to ensure that they are safe for the consumer. In the case of a food-related disease outbreak, it is crucial to detect and identify the cause of the disease quickly and accurately. In addition, for everyday monitoring of the microbial and preservative contents of foods, the detection methods must be easy to perform on large numbers of samples. In the present study, faster alternative methods were investigated for the identification of bacteria by DNA fingerprinting. A flow cytometry method was developed as an alternative to pulsed-field gel electrophoresis, the gold-standard method. DNA fragment sizing by an ultrasensitive flow cytometer discriminated species and strains reproducibly and comparably to pulsed-field gel electrophoresis. The new method was hundreds of times faster and 200,000 times more sensitive. In addition, another DNA fingerprinting method was developed based on single-enzyme amplified fragment length polymorphism (SE-AFLP). This method allowed the differentiation of genera, species, and strains of pathogenic bacilli, staphylococci, Yersinia, and Escherichia coli. The fingerprinting patterns obtained by SE-AFLP were simpler and easier to analyse than those obtained by traditional amplified fragment length polymorphism with double-enzyme digestion. Nisin (E234) is added as a preservative to different types of foods, especially dairy products, around the world. Various detection methods exist for nisin, but they lack sensitivity, speed, or specificity. In the present study, a sensitive nisin-induced green fluorescent protein (GFPuv) bioassay was developed using the Lactococcus lactis two-component signal system NisRK and the nisin-inducible nisA promoter. The bioassay was extremely sensitive, with a detection limit of 10 pg/ml in culture supernatant, and it was suitable for quantification from various food matrices, such as milk, salad dressings, processed cheese, liquid eggs, and canned tomatoes. Wine has good antimicrobial properties owing to its alcohol concentration, low pH, and organic contents, and it is therefore often assumed to be microbially safe to consume. Another aim of this thesis was to study the microbiota of wines returned by customers complaining of food-poisoning symptoms. By partial 16S rRNA gene sequence analysis, ribotyping, and the boar spermatozoa motility assay, one of the wines was found to contain Bacillus simplex BAC91, which produced a heat-stable substance toxic to the mitochondria of sperm cells. The antibacterial activity of wine was tested on the vegetative cells and spores of B. simplex BAC91, the B. cereus type strain ATCC 14579, and the cereulide-producing B. cereus F4810/72. Although the vegetative cells and spores of B. simplex BAC91 were sensitive to the antimicrobial effects of wine, the spores of B. cereus strains ATCC 14579 and F4810/72 remained viable for at least 4 months. According to these results, Bacillus spp., and more specifically their spores, can pose a risk to the wine consumer.
Abstract:
Medical fields require fast, simple, and noninvasive diagnostic methods. Several such methods have become possible thanks to technological advances that provide the means of collecting and processing signals. The present thesis details work in the field of voice signals. New methods of analysis, such as nonlinear dynamics, have been developed to understand the complexity of voice signals and to explore their dynamic nature. The purpose of this thesis is to characterize the complexity of pathological voice signals relative to healthy signals and to differentiate stuttering signals from healthy signals. The efficiency of various acoustic as well as nonlinear time-series methods is analysed. Three groups of samples are used: healthy individuals, subjects with vocal pathologies, and stuttering subjects. Individual vowels and continuous speech data for the utterance of the Malayalam sentence "iruvarum changatimaranu" (in English, "Both are good friends") were recorded using a microphone. The recorded audio was converted to digital signals and subjected to analysis. Acoustic perturbation measures such as fundamental frequency (F0), jitter, shimmer, and zero crossing rate (ZCR) were computed, and nonlinear measures such as the maximum Lyapunov exponent (λmax), correlation dimension (D2), Kolmogorov entropy (K2), and a newer entropy measure, permutation entropy (PE), were evaluated for all three groups of subjects. Permutation entropy is a nonlinear complexity measure that can efficiently distinguish the regular and complex nature of a signal and extract information about changes in the dynamics of the process by indicating sudden changes in its value. The results show that nonlinear dynamical methods are a suitable technique for voice signal analysis, owing to the chaotic component of the human voice. Permutation entropy is well suited because of its sensitivity to uncertainties, since the pathologies are characterized by an increase in signal complexity and unpredictability. Pathological groups have higher entropy values than the normal group, whereas the stuttering signals have lower entropy values than the normal signals. PE is effective in characterizing the level of improvement after two weeks of speech therapy in stuttering subjects, and in characterizing the dynamical difference between healthy and pathological subjects. This suggests that PE can improve and complement the voice analysis methods currently available to clinicians. The work establishes the application of the simple, inexpensive, and fast PE algorithm to the diagnosis of vocal disorders and the assessment of stuttering subjects.
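A minimal Python sketch of the Bandt-Pompe permutation entropy described above may help make the measure concrete; the embedding order, delay, and normalization below are illustrative choices, not the settings used in the thesis.

```python
import numpy as np
from math import factorial

def permutation_entropy(signal, order=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D signal (Bandt-Pompe ordinal patterns)."""
    x = np.asarray(signal, dtype=float)
    n = len(x) - (order - 1) * delay
    if n <= 0:
        raise ValueError("signal too short for the chosen order/delay")
    # Each row of the embedding is one window; argsort gives its ordinal pattern.
    idx = np.arange(order) * delay + np.arange(n)[:, None]
    patterns = np.argsort(x[idx], axis=1)
    # Relative frequency of each distinct pattern.
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))          # Shannon entropy of the pattern distribution
    if normalize:
        pe /= np.log2(factorial(order))   # scale to [0, 1]
    return pe

# A regular tone yields low PE; a noisy (complex) signal yields PE near 1.
t = np.linspace(0, 1, 8000)
print(permutation_entropy(np.sin(2 * np.pi * 120 * t)))
print(permutation_entropy(np.random.default_rng(0).normal(size=8000)))
```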
Abstract:
In recent years, Caenorhabditis elegans has emerged as a new model to investigate the relationships between nuclear architecture, cellular differentiation, and organismal development. On one hand, C. elegans with its fixed lineage and transparent body is a great model organism to observe gene functions in vivo in specific cell types using microscopy. On the other hand, two different techniques have been applied in nematodes to identify binding sites for chromatin-associated proteins genome-wide: chromatin immunoprecipitation (ChIP), and Dam-mediated identification (DamID). We summarize here all three techniques together as they are complementary. We also highlight strengths and differences of the individual approaches.
Abstract:
The study investigates whether there is an association between different combinations of emphasis on generic strategies (product differentiation and cost efficiency) and the perceived usefulness of management accounting techniques. Previous research has found that cost leadership is associated with traditional accounting techniques and product differentiation with a variety of modern management accounting approaches. The present study focuses on the possible existence of a strategy that mixes these generic strategies. The empirical results suggest that (a) there is no difference in attitudes towards the usefulness of traditional management accounting techniques between companies that adhere either to a single strategy or to a mixed strategy; (b) there is no difference in attitudes towards modern and traditional techniques between companies that adhere to a single strategy, whether this is product differentiation or cost efficiency; and (c) companies that favour a mixed strategy seem to have a more positive attitude towards modern techniques than companies adhering to a single strategy.
Abstract:
The nematode Caenorhabditis elegans is characterized by many features that make it highly attractive for studying nuclear pore complexes (NPCs) and nucleocytoplasmic transport. NPC composition and structure are highly conserved in nematodes, and because the organism is amenable to a variety of genetic manipulations, key aspects of nuclear envelope dynamics can be observed in great detail during breakdown, reassembly, and interphase. In this chapter, we provide an overview of some of the most relevant modern techniques that allow researchers unfamiliar with C. elegans to embark on studies of nucleoporins in an intact organism through its development from zygote to aging adult. We focus on methods relevant to generating loss-of-function phenotypes and their analysis by advanced microscopy. Extensive references to available reagents, such as mutants, transgenic strains, and antibodies, are equally useful to scientists with or without prior C. elegans or nucleoporin experience.
Abstract:
Since 1995, Brazil has experienced a new wave of professionalization of the public service. In the legislative sphere, this movement has led to the creation of legislative schools and to a concern with adopting modern public management techniques. One must ask, however, what profile is desired for the legislative professional. The technological undergraduate programme in Legislative Administration, offered by Unisul as an opportunity for specific training of legislative civil servants, puts forward a definition of this professional that is critically analysed in this study.
Abstract:
While traditional methods for processing ceramic restorations have become notorious for their complexity, more modern techniques favour simplicity of execution and automation. Among these, injection moulding (heat-pressing) stands out, and it has recently been combined with CAD-CAM methods. Previous studies demonstrated the feasibility of using a low-thermal-expansion feldspathic glass, Alpha (Vita Zahnfabrik), for pressing, but information is lacking on the mechanical properties and microstructure of this material when it is pressed. The objectives of this study were: to produce glass-ceramic ingots for heat-pressing from Alpha and from mixtures of this glass with alumina and zirconia particles; to evaluate the flexural strength of the processed materials and compare it with a compatible material available on the market (PM9, Vita Zahnfabrik); to study the microstructure of the materials and correlate it with their mechanical properties; and to identify, by X-ray diffraction, the crystalline phases formed during the different processing stages. Pressing increased the strength of the Alpha glass by reducing the number and size of internal defects, mainly porosity. Although crystal nucleation was observed in both materials during processing, it was not possible to determine how this phenomenon affected their mechanical properties. No change was detected in the distribution pattern of the crystalline phases observed under scanning electron microscopy before and after pressing. No statistically significant difference was found between the flexural strength of pressed Alpha and PM9. The addition of alumina and zirconia particles to the Alpha glass reduced its strength, owing to the formation of agglomerates during ingot fabrication and the inability of the pressing process to disperse them. These agglomerates acted as stress concentrators, weakening the material.
Abstract:
The present study critically examines the way legislative competences are interpreted in Brazil. In particular, it seeks to demonstrate that the subject can and should benefit from the modern techniques and instruments developed by contemporary constitutional law doctrine. The work is structured in three parts. The First Part sets out some theoretical premises on constitutional interpretation, federalism, and the judicial reviewability of competences, which guide the development of the study. The Second Part examines the processes of characterizing statutes and of interpreting legislative competences. Starting from the outline of a theory of legislative competences, it argues for parameters under which, in principle, all competence provisions should be interpreted as broadly as possible, with any restrictions imposed by other competence rules being considered and justified argumentatively. In its Third and final Part, the study identifies the phenomenon of conflicts between legislative competences in general, largely neglected by Brazilian doctrine, and then examines some criteria for their resolution. Having ruled out recourse to the supremacy of federal law and to the principle of subsidiarity, as well as to preferences on the merits, it develops two formal parameters and one substantive parameter for resolving otherwise irreducible inconsistencies between competences.
Abstract:
Pelagic resources around Sri Lanka may be categorized into three major groups: (1) small pelagic varieties such as the sprats (halmessa), sardines (salaya, soodaya), and herrings (hurulla); (2) medium-sized pelagic species such as the mackerels (kumbala and bolla), barracuda (jeela), seer/Spanish mackerel (thora), frigate mackerel (alagoduwa), mackerel tuna (atawalla), and skipjack (balaya); and (3) large fishes such as yellowfin tuna (kelawalla), bigeye tuna, marlins (koppora and gappara), sailfish (thalapath), sharks (mora), and rays (maduwa). Production levels of exploited resources are noted, and seasonal and annual patterns in their abundance are considered. On the basis of observations and estimates from the existing fisheries, and the results of experimental fishing, figures are presented for the potential yield of those species already exploited. The development of that potential depends on the development of modern pole-and-line fishing techniques, the application of tuna and shark longlining, an increase in the number of drift-net units, and the introduction of a bait fishery for the longline and pole-and-line fisheries. Some features on which the success of any venture to exploit such resources depends are noted, particularly those relating to the nature of the fishing vessels used.
Abstract:
© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
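As a concrete instance of the optimization-program view described above, the sketch below writes PCA as trace maximization over orthonormal projections and solves it in closed form by eigendecomposition; it is an illustration only, not the authors' generic manifold solver, and the data are synthetic.

```python
import numpy as np

def pca_projection(X, d):
    """Orthonormal basis M (D x d) solving max_{M^T M = I} tr(M^T C M)."""
    Xc = X - X.mean(axis=0)            # center the data
    C = Xc.T @ Xc / (len(Xc) - 1)      # sample covariance, D x D
    evals, evecs = np.linalg.eigh(C)   # eigenvalues in ascending order
    return evecs[:, ::-1][:, :d]       # top-d eigenvectors as columns

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10))   # correlated data
M = pca_projection(X, d=2)
Y = (X - X.mean(axis=0)) @ M           # optimal 2-D projection of the data
print(M.shape, Y.shape, np.allclose(M.T @ M, np.eye(2)))
```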
Cleanup and quantification of polychlorinated dibenzo-p-dioxins/furans and polychlorinated biphenyls
Abstract:
Using modern techniques of isotope dilution, high-resolution gas chromatography/high-resolution mass spectrometry, and multiple ion detection, an effective cleanup, qualitative, and quantitative method was developed for the analysis of polychlorinated dibenzo-p-dioxins/furans (PCDD/F) and polychlorinated biphenyls. Based on the chromatographic relative retentions of PCDD/F, software was developed for automatic peak recognition of all isomers from tetra- to octachlorinated PCDD/F. It ensured good reliability and accuracy of the analytical data.
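A hypothetical sketch of peak recognition by chromatographic relative retention may illustrate the idea; the isomer names, reference values, tolerance, and data structures below are invented for illustration and are not taken from the software described in the abstract.

```python
# Reference relative retentions versus an internal standard (illustrative values).
REFERENCE_RR = {
    "2,3,7,8-TCDD": 1.000,
    "1,2,3,7,8-PeCDD": 1.182,
    "OCDD": 1.655,
}

def assign_peaks(peaks, standard_rt, tolerance=0.005):
    """Assign observed peaks (retention time, area) to isomers by relative retention."""
    assignments = []
    for rt, area in peaks:
        rr = rt / standard_rt                      # relative retention of this peak
        for isomer, ref in REFERENCE_RR.items():
            if abs(rr - ref) <= tolerance:
                assignments.append((isomer, rt, area))
                break
    return assignments

# Example run with made-up retention times (minutes) and peak areas.
print(assign_peaks([(25.02, 1.3e5), (29.58, 4.1e4), (41.40, 9.8e3)],
                   standard_rt=25.0))
```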
Abstract:
The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered in the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific to each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider when using different media.
Abstract:
From colonization to the most recent conflicts affecting the "Middle East", the visual has contributed to creating a racialized and sexualized image of the Muslim world in which the "archaic" Orient is represented as the moral opposite of the "modern" West. This discourse, termed Orientalism (Saïd 1994), turns the Orient into an object of Western knowledge and of the Western gaze (Nochlin 1989). Painting, photography, and world's fairs were among the means of establishing a privileged Western viewpoint on the Orient. With Web 2.0 and mobile technologies, image sharing has become an integral part of everyday life, and images now reach us from anywhere and from anyone. Considering that Orientalism is inseparable from modern techniques of representing the world (Mitchell 2013), this thesis interrogates the impact of these new technologies on the production, circulation, and reception of images of the so-called Orient. The study focuses on images captured and shared during the protests of the Iranian Green Revolution of June 2009, including the video of the killing of the young Neda Agha Soltan that made headlines in Western media. Taking Edward Saïd's writings as a starting point, revisited through feminist readings, it shows that visual Orientalism mutates with the political, cultural, and technological changes it encounters. Beyond illuminating the images in our corpus, the feminist question allows us to broaden the definition and mechanisms of Orientalism proposed by Saïd. We demonstrate that while Web 2.0 has the potential to unsettle the image the West constructs of the Orient, it also updates visual Orientalism under new modes of knowledge production.
Abstract:
One of the basic functions of management is to employ capital efficiently so as to provide maximum customer service and earn a profit in the process. It is possible to achieve these objectives in different ways with a given amount of capital, either by maximising output, by maximising the margin of profit, or by a combination of both methods. This means that management must try to make this capital work as fast as possible, which is often difficult to achieve under present conditions of the factors of production. It is also not possible to increase the margin of profit substantially, owing to competition in business, and as a result capital turnover and the productivity of capital often become ineffective. Several modern techniques have been developed and employed by managers to remedy this situation. Among these, materials management has become one of the most effective methods of achieving both of the above goals. Materials management enables a manager to improve the productivity of capital by reducing material costs, preventing large amounts of working capital from being blocked for long periods, and improving capital turnover. This study examines the working of materials management departments in public sector undertakings in India and suggests methods to improve their efficiency.
Abstract:
One possible approach to municipal solid waste management is energy recovery, that is, incineration with recovery of energy. It is, however, very important to control the incineration process properly in order to avoid, as far as possible, the release of pollutants into the atmosphere that could cause industrial pollution problems. Ensuring that both the incineration process and the flue-gas treatment operate under optimal conditions presupposes a good knowledge of the dependencies among the process variables. Suitable methods are needed for measuring the most important variables and for processing the measured values with appropriate models so as to transform them into control quantities. A classical control model looks unpromising in this case because of the complexity of the processes, the lack of a quantitative description, and the need to perform the calculations in real time. This can only be achieved with the help of modern data-processing and computing techniques, such as simulation, mathematical models, knowledge-based systems, and intelligent interfaces. [Ono, 1989] describes a control system based on fuzzy logic applied to municipal waste incineration. At the FZK research centre in Karlsruhe, applications combining fuzzy logic with neural networks [Jaeschke, Keller, 1994] are being developed for the control of the TAMARA pilot waste-incineration plant. This thesis proposes the application of a knowledge-acquisition method for the control of complex systems inspired by human behaviour. When we face an unknown situation, at first we do not know how to act, except by extrapolating from previous experiences that may be useful. By applying trial-and-error procedures, reinforcement of hypotheses, and so on, we gradually acquire and refine knowledge and build a mental model. An analogous method, implementable in a computer system, can be designed using Artificial Intelligence techniques. In a complex process we often have a set of process data that, a priori, is not structured enough to be useful. Knowledge acquisition then proceeds through a series of stages:
- Make a first selection of the variables of interest.
- System state. We can begin by applying classification techniques (unsupervised learning) to group the data and obtain a representation of the state of the plant. A classification can be established, but normally almost all the data fall into a single class, corresponding to normal operation. After this, and to refine the knowledge, classical statistical methods are used to look for correlations between variables (principal component analysis) so as to simplify and shorten the variable list.
- Signal analysis. To analyse and classify signals (for example, the furnace temperature), methods better able to describe the nonlinear behaviour of the system, such as neural networks, can be used. A further step is to establish causal relations between the variables, for which analytical models are helpful.
- As the final outcome of the process, the knowledge-based system is designed.
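A minimal sketch of the first two knowledge-acquisition stages just described (unsupervised classification of operating states, then principal component analysis to shorten the variable list) is given below; the synthetic data, variable count, and parameter choices are assumptions, and the sketch does not reproduce the thesis' LINNEO+/CASSD tools.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Rows = operating snapshots; columns = process variables
# (e.g. furnace temperature, O2, CO, steam flow, grate speed, ...).
X = rng.normal(size=(1000, 8))

# Stage 1: unsupervised classification of operating states.
states = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print("snapshots per discovered state:", np.bincount(states))

# Stage 2: principal component analysis to find correlated variables
# and reduce the variable list.
pca = PCA(n_components=3).fit(X)
print("variance explained:", pca.explained_variance_ratio_.round(2))
```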
The main objective is to apply the method to the specific case of controlling a municipal solid waste treatment plant with energy recovery. First, Chapter 2, Municipal solid waste, addresses the overall problem of waste management, giving an overview of the existing alternatives and of the current national and international situation. The problems of waste incineration are analysed in greater detail, with particular attention to those characteristics of the waste that matter most for the combustion process. Chapter 3, Description of the process, gives a general description of the incineration process and of the various elements of an incineration plant: from the reception and storage of the waste, through the different types of furnaces and the requirements of good combustion practice codes, to the combustion-air system and the flue-gas system. The different flue-gas cleaning systems are also presented, and finally the ash and slag removal system. Chapter 4, The Girona municipal solid waste treatment plant, describes the main systems of the Girona incineration plant: the waste feed, the type of furnace, the energy recovery system, and the flue-gas cleaning system. It also describes the control system, the operation, the plant's operating data, the instrumentation, and the variables of interest for controlling the combustion process. Chapter 5, Techniques used, provides an overview of knowledge-based systems and expert systems. The different techniques used are explained: neural networks, classification systems, qualitative models, and expert systems, illustrated with some application examples. With regard to knowledge-based systems, the conditions for their applicability and the forms of knowledge representation are analysed first. The different forms of reasoning are then described: neural networks, expert systems, and fuzzy logic, and a comparison is made between them. An application of neural networks to the analysis of temperature time series is presented. The analysis of operating data by statistical techniques and the use of classification techniques are also addressed. Another section is devoted to the different types of models, including a discussion of qualitative models. The computer-aided design system for supervision systems, CASSD, used in this thesis is described, together with the analysis tools used to obtain qualitative information about process behaviour: Abstractors and ALCMEN. An example of applying these techniques to find the relations between the temperature and the operator's actions is included. Finally, the main characteristics of expert systems in general, and of the expert system CEES 2.0, which is also part of the CASSD system used here, are analysed. Chapter 6, Results, presents the results obtained by applying the different techniques (neural networks and classification), the development of the combustion process model, and the generation of rules.
In the data-analysis section, a neural network is used to classify a temperature signal, and the use of the LINNEO+ method for classifying the plant's operating states is described. In the modelling section, a combustion model is developed that serves as the basis for analysing the behaviour of the furnace under steady-state and dynamic conditions. A parameter, the flame surface, related to the extent of the fire on the grate, is defined. The dynamic response of the incineration process is analysed by means of a linearized model. Qualitative relations between the variables are then defined and used to build a qualitative model, and a new qualitative model is subsequently developed on the basis of the analytical dynamic model. Finally, the knowledge base of the expert system is developed through rule generation. Chapter 7, Control system of an incineration plant, analyses the objectives of a control system for an incineration plant and its design and implementation. The basic objectives of the combustion control system, its configuration, and its implementation in Matlab/Simulink using the various tools developed in the previous chapter are described. Lastly, to show how the different methods developed in this thesis can be applied, an expert system is built to keep the furnace temperature constant by acting on the waste feed. The final chapter, Conclusions, presents the conclusions and results of this thesis.
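To make the final application concrete, a deliberately simplified, hypothetical rule base of the kind such an expert system applies (keeping the furnace temperature near its setpoint by acting on the waste feed) might look as follows; the setpoint, thresholds, and step sizes are invented for illustration and the real system is implemented with CEES 2.0 and Matlab/Simulink as described above.

```python
def adjust_feed(temperature_c, feed_rate, setpoint_c=1000.0):
    """Return a new waste feed rate (t/h) from simple expert-style rules."""
    error = temperature_c - setpoint_c
    if error > 50:        # furnace clearly too hot -> cut the feed strongly
        return feed_rate * 0.90
    elif error > 15:      # slightly hot -> small reduction
        return feed_rate * 0.97
    elif error < -50:     # clearly too cold -> increase the feed strongly
        return feed_rate * 1.10
    elif error < -15:     # slightly cold -> small increase
        return feed_rate * 1.03
    return feed_rate      # inside the dead band: leave the feed unchanged

print(adjust_feed(1062.0, feed_rate=12.0))   # hot furnace -> reduced to 10.8 t/h
print(adjust_feed(995.0, feed_rate=12.0))    # within the dead band -> 12.0 t/h
```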