938 results for System-based


Relevance: 60.00%

Publisher:

Abstract:

The objective of this work was to evaluate nitrogen dynamics in heterotrophic cultivation of the cyanobacterium Aphanothece microscopica Nägeli within the scope of a biorefinery. To this end, the contribution of non-protein nitrogen compounds to the distribution of nitrogen in the biomass generated by the micro-organism under study was evaluated for cultivation in both autotrophic and heterotrophic systems. For cultivation under autotrophic conditions, the standard BG-11 medium was used, while for cultivation under heterotrophic conditions, dairy-industry effluent was employed. Initially, the contribution of pigments to the non-protein nitrogen fraction was evaluated on the basis of two experiments. In the first experiment, the best condition for pigment production in the heterotrophic system, expressed as chlorophyll-a, was selected based on the parameters C/N ratio (20, 40 and 60), N/P ratio (5, 10 and 15) and inoculum concentration (100, 200 and 300 mg.L-1), using a 2³ factorial design. The experiments were conducted in a heterotrophic bioreactor at 20 °C, pH 7.6 and continuous aeration of 1 VVM. The best pigment-production condition was found to be a cell concentration of 200 mg.L-1 with ratios of C/N 20 and N/P 10. Based on these results, a second experiment was designed to evaluate the contribution of pigments to the non-protein nitrogen fraction, as well as the production of chlorophyll-a and phycobiliproteins (phycocyanin, allophycocyanin and phycoerythrin) under the influence of light and culture medium. Higher phycobiliprotein contents were observed in the biomass generated by heterotrophic cultivation, but with a notable difference (p≤0.05) in chlorophyll-a content when the concentrations in biomass from autotrophic (10.7 mg.g-1) and heterotrophic (1.0 mg.g-1) media are compared. This is offset by the shorter cultivation time required to reach the end of the experiment when the micro-organism is grown under heterotrophic conditions. These results also demonstrate the important contribution of pigments to the non-protein nitrogen fraction. Subsequently, third and fourth experiments were designed to evaluate the influence of intracellular inorganic nitrogen on the non-protein fraction and on protein production, as well as to characterize the amino acid profile of the protein fraction. The study of intracellular nitrogen dynamics showed that N-NH4+ was the predominant nitrogen form, making up an important share of the non-protein nitrogen (N-NP) fraction; N-NP contents were therefore significantly dependent on pigment and intracellular nitrogen contents. The amino acid profiles of the biomasses generated by autotrophic and heterotrophic cultivation indicated glutamic and aspartic acid as the major amino acids, followed by valine, leucine and isoleucine, with lysine, glycine and methionine as minor ones. The profile was characterized by essential amino acids such as isoleucine, methionine + cysteine, phenylalanine + tyrosine, valine and threonine at concentrations above those recommended by the FAO/WHO. The amino acid characterization of the protein fraction qualified this biomass as a potential protein source. The results obtained in this work demonstrate the influence and distribution dynamics of nitrogen compounds in Aphanothece microscopica Nägeli. They further show that implementing the biorefinery concept in the type of agro-industry studied could open up important opportunities for the sustainable use of the effluent generated.

Relevance: 60.00%

Publisher:

Abstract:

Due to the increasing integration density and operating frequency of today's high-performance processors, the temperature of a typical chip can easily exceed 100 degrees Celsius. However, the runtime thermal state of a chip is very hard to predict and manage due to the random nature of computing workloads, as well as process, voltage and ambient temperature variability (together called PVT variability). The uneven nature (both in time and space) of the chip's heat dissipation can lead to severe reliability issues and error-prone chip behavior (e.g. timing errors). Many dynamic power/thermal management techniques have been proposed to address this issue, such as dynamic voltage and frequency scaling (DVFS) and clock gating. However, most such techniques require accurate knowledge of the runtime thermal state of the chip to make efficient and effective control decisions. In this work we address the problem of tracking and managing the temperature of microprocessors, which includes the following sub-problems: (1) how to design an efficient sensor-based thermal tracking system for a given design that provides accurate real-time temperature feedback; (2) what statistical techniques can be used to estimate the full-chip thermal profile from very limited (and possibly noise-corrupted) sensor observations; (3) how to adapt to changes in the underlying system's behavior, since such changes can impact the accuracy of the thermal estimates. The thermal tracking methodology proposed in this work is enabled by on-chip sensors, which are already implemented in many modern processors. We first investigate the underlying relationship between heat distribution and power consumption, then introduce an accurate thermal model of the chip system. Based on this model, we characterize the temperature correlation that exists among different chip modules and explore statistical approaches (such as those based on the Kalman filter) that can exploit this correlation to estimate accurate chip-level thermal profiles in real time. The estimation is performed from limited sensor information because sensors are usually resource-constrained and noise-corrupted. We also take a further step and extend the standard Kalman filter approach to account for (1) nonlinear effects such as the leakage-temperature interdependency and (2) varying statistical characteristics of the underlying system model. The proposed thermal tracking infrastructure and estimation algorithms consistently generate accurate thermal estimates even when the system switches among workloads with very distinct characteristics. In experiments, our approaches demonstrated promising results with much higher accuracy than existing approaches. Such results can be used to ensure thermal reliability and improve the effectiveness of dynamic thermal management techniques.
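The core of the estimation approach described above is the standard Kalman filter predict/update cycle. The following is a minimal sketch of that cycle for a hypothetical two-module chip; the thermal coupling matrix, noise covariances and heat-input vector are invented for illustration and are not the paper's calibrated model.

```python
import numpy as np

# Minimal Kalman-filter sketch for sensor-based thermal tracking.
# Assumed model (not from the paper): a two-module chip with discretized
# linear thermal dynamics x[k+1] = A x[k] + u + w and one noisy on-chip
# sensor per module, z[k] = H x[k] + v.

A = np.array([[0.95, 0.03],
              [0.03, 0.95]])   # hypothetical inter-module thermal coupling
H = np.eye(2)                  # one direct (noisy) sensor per module
Q = 0.5 * np.eye(2)            # process-noise covariance (workload variability)
R = 2.0 * np.eye(2)            # sensor-noise covariance
u = np.array([1.5, 1.4])       # assumed average power-induced heating per step

def kalman_step(x, P, z):
    """One predict/update cycle; returns the new estimate and covariance."""
    # Predict: propagate the estimate through the thermal model.
    x_pred = A @ x + u
    P_pred = A @ P @ A.T + Q
    # Update: correct the prediction with the sensor reading.
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy run: track a synthetic two-module temperature trace from noisy readings.
rng = np.random.default_rng(0)
x_true = np.array([70.0, 65.0])                    # degrees Celsius
x_est, P = np.array([50.0, 50.0]), 10.0 * np.eye(2)
for _ in range(50):
    x_true = A @ x_true + u + rng.multivariate_normal(np.zeros(2), Q)
    z = H @ x_true + rng.multivariate_normal(np.zeros(2), R)
    x_est, P = kalman_step(x_est, P, z)
print("true:", x_true.round(1), "estimated:", x_est.round(1))
```

The paper's extensions (leakage-temperature nonlinearity, workload-dependent statistics) would replace the fixed A, Q and u above with state- and time-dependent quantities, e.g. via an extended or adaptive Kalman filter.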

Relevance: 60.00%

Publisher:

Abstract:

Doctoral thesis, Environmental Sciences (Land-Use Planning), 5 April 2013, Universidade dos Açores.

Relevance: 60.00%

Publisher:

Abstract:

Master's thesis, Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Mecânica, 2015.

Relevance: 60.00%

Publisher:

Abstract:

Dendritic cells are antigen presenting cells that provide a vital link between the innate and adaptive immune system. Research into this family of cells has revealed that they perform the role of coordinating T-cell based immune responses, both reactive and for generating tolerance. We have derived an algorithm based on the functionality of these cells, and have used the signals and differentiation pathways to build a control mechanism for an artificial immune system. We present our algorithmic details in addition to some preliminary results, where the algorithm was applied for the purpose of anomaly detection. We hope that this algorithm will eventually become the key component within a large, distributed immune system, based on sound immunological concepts.

Relevance: 60.00%

Publisher:

Abstract:

Artificial immune systems have previously been applied to the problem of intrusion detection. The aim of this research is to develop an intrusion detection system based on the function of Dendritic Cells (DCs). DCs are antigen presenting cells and key to the activation of the human immune system, behaviour which has been abstracted to form the Dendritic Cell Algorithm (DCA). In algorithmic terms, individual DCs perform multi-sensor data fusion, asynchronously correlating the fused data signals with a secondary data stream. Aggregate output of a population of cells is analysed and forms the basis of an anomaly detection system. In this paper the DCA is applied to the detection of outgoing port scans using TCP SYN packets. Results show that detection can be achieved with the DCA, yet some false positives can be encountered when simultaneously scanning and using other network services. Suggestions are made for using adaptive signals to alleviate this uncovered problem.

Relevance: 60.00%

Publisher:

Abstract:

Artificial immune systems, more specifically the negative selection algorithm, have previously been applied to intrusion detection. The aim of this research is to develop an intrusion detection system based on a novel concept in immunology, the Danger Theory. Dendritic Cells (DCs) are antigen presenting cells and key to the activation of the human immune system. DCs perform the vital role of combining signals from the host tissue and correlate these signals with proteins known as antigens. In algorithmic terms, individual DCs perform multi-sensor data fusion based on time-windows. The whole population of DCs asynchronously correlates the fused signals with a secondary data stream. The behaviour of human DCs is abstracted to form the DC Algorithm (DCA), which is implemented using an immune inspired framework, libtissue. This system is used to detect context switching for a basic machine learning dataset and to detect outgoing portscans in real-time. Experimental results show a significant difference between an outgoing portscan and normal traffic.
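As a rough illustration of the multi-sensor fusion and asynchronous antigen correlation described in this and the preceding abstracts, here is a toy DCA-style sketch. The signal weights, migration thresholds, signal values and the MCAV-style anomaly score are invented placeholders, not the published algorithm's actual parameters, and nothing here uses the real libtissue framework.

```python
import random
from collections import defaultdict

WEIGHTS = {                       # (PAMP, danger, safe) weights per output;
    "csm":  (2.0, 1.0, 2.0),      # costimulation: drives migration
    "semi": (0.0, 0.0, 3.0),      # semi-mature ("normal") context signal
    "mat":  (2.0, 1.0, -2.0),     # mature ("anomalous") context signal
}                                 # values are illustrative, not the paper's

class DendriticCell:
    def __init__(self, threshold):
        self.threshold = threshold              # migration threshold
        self.csm = self.semi = self.mat = 0.0
        self.antigens = []

    def sample(self, antigen, pamp, danger, safe):
        """Fuse one time step of signals; return True when ready to migrate."""
        self.antigens.append(antigen)
        for attr, (wp, wd, ws) in WEIGHTS.items():
            setattr(self, attr, getattr(self, attr) + wp*pamp + wd*danger + ws*safe)
        return self.csm >= self.threshold

    def context(self):
        return "mature" if self.mat > self.semi else "semi-mature"

def run_dca(stream, n_cells=10, seed=1):
    """stream: iterable of (antigen_id, pamp, danger, safe) tuples."""
    rng = random.Random(seed)
    cells = [DendriticCell(rng.uniform(5, 15)) for _ in range(n_cells)]
    tally = defaultdict(lambda: [0, 0])         # antigen -> [mature, total]
    for antigen, pamp, danger, safe in stream:
        i = rng.randrange(len(cells))           # one cell samples this antigen
        if cells[i].sample(antigen, pamp, danger, safe):
            ctx = cells[i].context()            # migrate: present its antigens
            for ag in cells[i].antigens:
                tally[ag][0] += ctx == "mature"
                tally[ag][1] += 1
            cells[i] = DendriticCell(rng.uniform(5, 15))   # fresh replacement
    # MCAV-style score: fraction of presentations made in the mature context.
    return {ag: m / t for ag, (m, t) in tally.items()}

# Toy stream: "scan" events carry PAMP/danger signals, "normal" carries safe.
stream = [("scan", 2, 2, 0) if i % 3 else ("normal", 0, 0, 2) for i in range(300)]
print(run_dca(stream))   # expect a much higher score for "scan" than "normal"
```

Because cells collect antigens from a mixed stream before migrating, normal antigens sampled alongside scan traffic can inherit the mature context, which mirrors the false positives the port-scan abstract reports when scanning and normal services run simultaneously.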

Relevance: 60.00%

Publisher:

Abstract:

Dendritic cells are antigen presenting cells that provide a vital link between the innate and adaptive immune system. Research into this family of cells has revealed that they perform the role of coordinating T-cell based immune responses, both reactive and for generating tolerance. We have derived an algorithm based on the functionality of these cells, and have used the signals and differentiation pathways to build a control mechanism for an artificial immune system. We present our algorithmic details in addition to some preliminary results, where the algorithm was applied for the purpose of anomaly detection. We hope that this algorithm will eventually become the key component within a large, distributed immune system, based on sound immunological concepts.

Relevance: 60.00%

Publisher:

Abstract:

When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but is run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics rather than just predicting the output of a system based on specific inputs. Simulation is not a decision making tool but a decision support tool, allowing better informed decisions to be made.

Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system's performance. It can be viewed as an artificial white room which allows one to gain insight, but also to test new theories and practices, without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is well summarised by FIRMA (2000). The idea is that if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, the model allows you to answer questions such as:

· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?

The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. These predictions involve showing trends, rather than giving precise and absolute predictions of the target system's performance. The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. The results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments.

The goal of this chapter is to introduce the newcomer to what we think is a valuable addition to the toolset of analysts and decision makers. We will give you a summary of information we have gathered from the literature, and of the first-hand experience we have gained during the last five years while obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we have unwittingly encountered.
Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements to prepare you for Section 4 where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further studies and finally in Section 7 we will conclude the chapter with a short summary.
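To make the idea of a multi-agent simulation model concrete before diving into those sections, here is a minimal toy sketch of our own (it is not the chapter's Section 5 example): each agent follows a simple local decision rule, and the population-level adoption curve emerges from their interactions.

```python
import random

# Toy multi-agent simulation: each "shopper" agent decides to buy with a
# base probability plus social influence from a few randomly observed
# peers. The aggregate adoption curve emerges from these local rules,
# which is exactly the kind of insight simulation provides over analytics.

class Shopper:
    def __init__(self, rng):
        self.rng = rng
        self.bought = False

    def step(self, neighbours):
        """Buy this step with a base probability, raised by buying peers."""
        influence = sum(n.bought for n in neighbours) / max(len(neighbours), 1)
        self.bought = self.rng.random() < 0.05 + 0.5 * influence

def simulate(n_agents=100, n_steps=20, seed=42):
    rng = random.Random(seed)
    agents = [Shopper(rng) for _ in range(n_agents)]
    history = []
    for _ in range(n_steps):
        for agent in agents:
            # Each agent observes 5 randomly chosen peers (its "neighbours").
            agent.step(rng.sample(agents, 5))
        history.append(sum(a.bought for a in agents))
    return history

print(simulate())   # buyers per time step: watch adoption build up and settle
```

Even this toy illustrates the explanatory use of simulation discussed above: one can vary the base probability or influence weight and observe which qualitative behaviours the system can display, without claiming precise predictions.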

Relevance: 60.00%

Publisher:

Abstract:

Master's dissertation in Integrated Management of Quality, Environment and Safety.

Relevance: 60.00%

Publisher:

Abstract:

The volume of data in libraries has grown enormously in recent years, as has the complexity of its sources and formats, making the data harder to manage and access, especially as support for decision making. Given that good library management involves the integration of strategic indicators, implementing a Data Warehouse (DW) that adequately manages such a quantity of information, as well as its complex mix of data sources, becomes an interesting alternative to consider. This article describes the design and implementation of a decision support system (DSS) based on DW techniques for the library of the Universidad de Cuenca. To this end, the study uses a holistic methodology proposed by Siguenza-Guzman et al. (2014) for the comprehensive evaluation of libraries. This methodology evaluates the collection and the services, incorporating important elements for library management, such as service performance, quality control, collection use and interaction with the user. Based on this analysis, a DW architecture is proposed that integrates, processes and stores the data. Finally, the stored data are analysed and visualised through online analytical processing (OLAP) tools. Initial implementation tests confirm the feasibility and effectiveness of the proposed approach, successfully integrating multiple heterogeneous data sources and formats, enabling library directors to generate customised reports, and even allowing the transactional processes carried out daily to mature.
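As a rough sketch of what the OLAP layer on top of such a warehouse does, the snippet below consolidates hypothetical loan records into a single fact table and aggregates them along dimensions, the way a cube roll-up or slice would. The column names and figures are invented for illustration and are not the article's actual schema.

```python
import pandas as pd

# Invented fact table: loan events already integrated from heterogeneous
# sources into one flat structure (the DW's job).
facts = pd.DataFrame({
    "year":       [2013, 2013, 2014, 2014, 2014],
    "faculty":    ["Engineering", "Medicine", "Engineering", "Medicine", "Arts"],
    "collection": ["print", "digital", "digital", "print", "digital"],
    "loans":      [120, 80, 150, 60, 40],
})

# "Roll-up": total loans per year and faculty, across all collections.
cube = facts.pivot_table(index="year", columns="faculty",
                         values="loans", aggfunc="sum", fill_value=0)
print(cube)

# "Slice": digital-collection loans only, totalled per year.
print(facts[facts["collection"] == "digital"].groupby("year")["loans"].sum())
```

In a production DW the fact table would live in a database with dimension tables around it (a star schema), and the OLAP tool would issue the equivalent aggregation queries; the pandas calls above only mimic that behaviour in miniature.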

Relevance: 60.00%

Publisher:

Abstract:

The reform of the pension system, consisting in the change from a defined-benefit to a defined-contribution system, reduced the level of benefits that future pensioners will receive. This has forced those concerned to look for additional sources of income. One such source may be the supplementary pension system, referred to as the third pension pillar. Its characteristic feature is that participation is voluntary. For it to develop, an enhanced level of protection of the insured is essential. These measures concern general consumer policy as well as civil-law and administrative-law regulation. In practice, two levels of protection of the insured can be distinguished within the third pillar: the first applies to all insured persons, while the second comprises regulations specific to the supplementary pension system.

Relevance: 60.00%

Publisher:

Abstract:

Understanding the organization of a company as a set of interrelated processes that are managed systematically and form the basis of continuous improvement is essential to obtain the best performance from its resources. On that basis, the purpose of this thesis is to design a process-based management system for the Business Division (Dirección de Negocios) of Cooperativa de Ahorro y Crédito CREA Ltda., in order to achieve better coordination among the different processes carried out in the business area, with respect to both savings deposits and credit placements, by standardizing those processes, achieving effective communication among those involved, and establishing management indicators for the processes under study.

Relevance: 60.00%

Publisher:

Abstract:

Fuzzy logic admits infinitely many logical values between false and true. Based on this principle, this work developed a fuzzy rule-based system that indicates the body mass index of ruminant animals, with the objective of determining the best moment for slaughter. The fuzzy system takes the variables mass and height as inputs, and outputs a new body mass index, called the Fuzzy Body Mass Index (Fuzzy BMI), which can serve as a system for detecting the moment of slaughter of cattle, comparing the animals among themselves through the linguistic variables "Very Low", "Low", "Medium", "High" and "Very High". To demonstrate and apply the system, 147 Nelore cows were analysed, determining the Fuzzy BMI values for each animal and indicating the body-mass status of the whole herd. Validation of the system was based on a statistical analysis using the Pearson correlation coefficient, which reached 0.923, representing a high positive correlation and indicating that the proposed method is adequate. The method therefore makes it possible to evaluate the herd, comparing each animal with its peers in the group, thus providing the farmer with a quantitative decision-making method. It can also be concluded that this work established a computational method based on fuzzy logic capable of imitating part of human reasoning and interpreting the body mass index of any bovine breed in any region of the country.
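To illustrate the kind of inference such a system performs, here is a minimal Mamdani-style sketch: mass and height in, a defuzzified fuzzy BMI out. The universes, membership functions and the four-rule base are invented for illustration; the paper's actual rule base and ranges for Nelore cattle are not reproduced here.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function (works on scalars and arrays)."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

imc = np.linspace(0, 100, 501)   # output universe: the fuzzy BMI scale
out_sets = {                      # the five linguistic output terms
    "very_low":  tri(imc, -25,   0,  25),
    "low":       tri(imc,   0,  25,  50),
    "medium":    tri(imc,  25,  50,  75),
    "high":      tri(imc,  50,  75, 100),
    "very_high": tri(imc,  75, 100, 125),
}

def fuzzy_imc(mass_kg, height_m):
    """Fuzzify mass and height, fire the rules, defuzzify by centroid."""
    # Input fuzzification (invented ranges, for illustration only).
    m = {"light": tri(mass_kg, 150, 300, 450), "heavy": tri(mass_kg, 300, 450, 600)}
    h = {"short": tri(height_m, 1.0, 1.2, 1.4), "tall": tri(height_m, 1.2, 1.4, 1.6)}
    rules = [                                   # (strength, output term)
        (min(m["heavy"], h["short"]), "very_high"),
        (min(m["heavy"], h["tall"]),  "high"),
        (min(m["light"], h["short"]), "medium"),
        (min(m["light"], h["tall"]),  "very_low"),
    ]
    agg = np.zeros_like(imc)
    for strength, label in rules:               # clip each set, aggregate by max
        agg = np.maximum(agg, np.minimum(strength, out_sets[label]))
    return float(np.sum(agg * imc) / np.sum(agg)) if agg.any() else float("nan")

print(fuzzy_imc(420.0, 1.25))   # one animal's fuzzy BMI on the 0-100 scale
```

Ranking every animal in the herd by this defuzzified score is what allows each one to be compared with its peers, as the abstract describes.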

Relevance: 60.00%

Publisher:

Abstract:

Harvest efficiency is defined as the percentage of fruits harvested out of total production. The percentage of fruits harvested is less than 100% when trunk shakers are used to detach olives, so it is important to increase this percentage in order to increase farmers' income. This objective can be achieved by knowing the evolution of the main factors affecting fruit detachment: the fruit removal force (FRF), the fruit weight (P) and the ratio between them are important for harvest efficiency. Field trials took place over two years (2013-2014) in Vilariça Valley, northeast Portugal, in an olive orchard of the 'Cobrançosa Transmontana' cultivar. A mechanical harvesting system based on a trunk shaker was adopted to detach the fruits, with an inverted umbrella to collect them. Elementary operation times were measured in seconds to evaluate work rates, and FRF and P were measured during the ripening period to evaluate their evolution. This paper presents preliminary results on the evolution of the FRF/P ratio during the ripening period and on the equipment work rate (trees h-1). The FRF/P ratio predominantly declines in the weeks before harvest, from 140 to 80, as a result of a downward variation of FRF from 4.9 to 2.94 N and an upward variation of P from 0.0294 to 0.0637 N. The decline in the FRF/P ratio levels off in the last week of November, just before harvesting, in some cases registering a slight increase because FRF increased more than P (contrary to the tendency of previous weeks). The equipment work rate showed values between 40 and 57 trees h-1, confirming previous results.