946 results for Balanced Nested Designs


Relevance:

20.00%

Publisher:

Abstract:

The production of bioactive peptides from different protein sources has been gaining ground in scientific and technological research and attracting interest from industry. In parallel, because of the high protein concentration in the biomass of the microalgae Spirulina and Chlorella, these organisms show great potential for the extraction of high-value biocompounds such as microalgal biopeptides. Proteins are an important source of bioactive peptides, but the peptides are not active within the precursor protein and must be released in order to exert the desired physiological effects. This release can be accomplished by enzymatic hydrolysis with proteases, one of the most widely used methods for producing these biocompounds. In this context, several studies have reported the use of spray drying to obtain nanoparticles containing bioactive compounds; the technique is widely used to convert liquids into powders and can be applied to temperature-sensitive materials. The aim of this study was to obtain bioactive peptides by enzymatic reaction, using Spirulina sp. LEB 18 and Chlorella pyrenoidosa biomass as substrate, and subsequently to obtain nanoparticles containing the biopeptides. First, three commercial proteases (Protemax 580 L, Protemax N 200 and pepsin) were tested for the production of microalgal protein hydrolysates; for this purpose, three central composite designs were carried out for each microalga under study (Chlorella and Spirulina). The designs were of the 2³ type with three replicates at the center point, varying the enzyme concentration (5 to 10 U mL⁻¹), the substrate concentration (5 to 10%) and the reaction time (60 to 240 min).
Next, two rotational central composite designs of the 2² type with center points were carried out, one for each microalga, using the enzyme Protemax 580 L (5 U mL⁻¹) for the hydrolysis and varying the substrate concentration and reaction time; for all runs, solubility, water-holding capacity, antioxidant activity and digestibility were studied. One run was selected for each microalga on the basis of the best results. A new enzymatic hydrolysis was then performed, with a reaction system composed of the enzyme Protemax 580 L (5 U mL⁻¹) and Spirulina sp. LEB 18 or Chlorella pyrenoidosa biomass (4% protein) for 200 min. The hydrolysates were purified by vacuum filtration with Millipore membranes of different pore sizes (0.45, 0.2 and 0.1 µm) and with Amicon® Ultra 0.5 vertical-membrane columns (3K and 10K); after each step, antioxidant activity was assayed by the reducing power, DPPH and ABTS methods to verify that the activity was retained. A Büchi model B-90 nano spray dryer was used to dry the samples, and the size of the obtained particles was analyzed by scanning electron microscopy (SEM). It is concluded that microalgal biomass can be used as a source for the production of bioactive peptides with high antioxidant activity and that, among the microalgae studied, Spirulina sp. LEB 18 gave better results than Chlorella pyrenoidosa in all analyses performed. This study also aimed to use nanobiotechnology to obtain nanoparticles containing the biopeptides; to this end, the Büchi Nano Spray Dryer B-90 was used, which produced nanometric particles of 14 to 18 nm for the Spirulina hydrolysate and of 72 to 108 nm for the Chlorella hydrolysate.
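The 2³ design with center points described above can be written down as a coded design matrix. The sketch below is illustrative: only the level ranges come from the abstract, while the factor names and the decoding helper are invented for clarity.

```python
from itertools import product

# Coded 2^3 full factorial with three center-point replicates, mapped to the
# real ranges reported above (enzyme 5-10 U/mL, substrate 5-10 %, time 60-240 min).
# Factor names are hypothetical labels, not taken from the study.
ranges = {"enzyme_U_per_mL": (5, 10), "substrate_pct": (5, 10), "time_min": (60, 240)}

def decode(coded, lo, hi):
    # coded level -1 -> lo, 0 -> midpoint, +1 -> hi
    return (lo + hi) / 2 + coded * (hi - lo) / 2

coded_runs = list(product([-1, 1], repeat=3)) + [(0, 0, 0)] * 3
design = [
    {name: decode(c, *ranges[name]) for name, c in zip(ranges, run)}
    for run in coded_runs
]

for run in design:
    print(run)
```

This yields 8 factorial runs plus 3 center points (11 runs per microalga), the usual layout for estimating main effects, interactions and pure error in such a screening design.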

Relevance:

20.00%

Publisher:

Abstract:

The Balanced Scorecard (BSC) is a performance evaluation and strategic management tool whose purpose is to support the management of organizations. The main objective of this work was to propose a BSC for a public polytechnic school, in order to provide the institution's management with a tool for improving organizational performance. The study was carried out at the Escola Superior de Tecnologia e Gestão de Águeda (ESTGA-UA), a polytechnic school integrated into the Universidade de Aveiro, a public higher education institution. The BSC was developed through documentary analysis of internal documents and legislation and through semi-structured interviews with the Director of ESTGA-UA, the course directors, non-teaching staff and students. Participant observation was also used. Given the relevance of this tool in supporting performance improvement and organizational management, the implementation of the proposed BSC at ESTGA-UA is recommended.

Relevance:

20.00%

Publisher:

Abstract:

Background: The aim of this study was the evaluation of a fast Gradient Spin Echo technique (GraSE) for cardiac T2 mapping, combining a robust estimation of T2 relaxation times with short acquisition times. The sequence was compared against two previously introduced T2-mapping techniques in a phantom and in vivo. Methods: Phantom experiments were performed at 1.5 T using a commercially available cylindrical gel phantom. Three T2-mapping techniques were compared: a Multi Echo Spin Echo sequence (MESE; serving as a reference), a T2-prepared balanced Steady State Free Precession sequence (T2prep) and a Gradient Spin Echo sequence. For the subsequent in vivo study, 12 healthy volunteers were examined on a clinical 1.5 T scanner. The three T2-mapping sequences were performed at three short-axis slices. Global myocardial T2 relaxation times were calculated and statistical analysis was performed. To assess pixel-by-pixel homogeneity, the number of segments showing an inhomogeneous T2 value distribution, defined by a pixel SD exceeding 20% of the corresponding observed T2 time, was counted. Results: Phantom experiments showed a greater difference of measured T2 values between T2prep and MESE than between GraSE and MESE, especially for species with low T1 values. Both GraSE and T2prep overestimated T2 times compared to MESE. In vivo, significant differences between mean T2 times were observed. In general, T2prep yielded the lowest (52.4 ± 2.8 ms) and GraSE the highest T2 estimates (59.3 ± 4.0 ms). Analysis of pixel-by-pixel homogeneity revealed the smallest number of segments with inhomogeneous T2 distribution for GraSE-derived T2 maps. Conclusions: The GraSE sequence is fast and robust, combining advantages of both the MESE and T2prep techniques, and promises to improve the clinical applicability of T2 mapping in the future. Our study revealed significant differences in derived mean T2 values when applying different sequence designs.
Therefore, a systematic comparison of different cardiac T2-mapping sequences and the establishment of dedicated reference values should be the goal of future studies.
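The core computation behind any of the T2-mapping sequences compared above is a per-pixel mono-exponential fit, S(TE) = S0·exp(-TE/T2). The following sketch is not the GraSE pipeline itself, only a minimal log-linear least-squares fit on synthetic data with an assumed T2 inside the reported in vivo range.

```python
import math

def fit_t2(echo_times_ms, signals):
    # Log-linear least squares: log S = log S0 - TE/T2, so the slope is -1/T2.
    xs = echo_times_ms
    ys = [math.log(s) for s in signals]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    t2 = -1.0 / slope
    s0 = math.exp(mean_y - slope * mean_x)
    return t2, s0

# Synthetic noiseless decay with T2 = 55 ms (illustrative value).
tes = [10, 20, 30, 40, 50, 60]
sig = [1000 * math.exp(-te / 55.0) for te in tes]
t2, s0 = fit_t2(tes, sig)
print(round(t2, 1))  # → 55.0
```

In practice the fit is repeated for every myocardial pixel, and the per-segment SD of the resulting T2 values drives the homogeneity criterion described in the Methods.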

Relevance:

20.00%

Publisher:

Abstract:

Over the last decade, the rapid development of additive manufacturing techniques has allowed the fabrication of innovative and complex designs. One field that can benefit from this technology is heat exchanger fabrication, as heat exchanger designs have become more and more complex due to the demand for higher performance, particularly on the air side of the heat exchanger. By employing additive manufacturing, a heat exchanger design was realized that would otherwise have been very difficult to fabricate using conventional technologies. In this dissertation, additive manufacturing was used to fabricate an advanced design combining a heat transfer surface with a fluid distribution system. Although the application selected here is power plant dry cooling, the results of this study can directly and indirectly benefit other sectors as well, since the air side is often the limiting side in liquid or single-phase cooling applications. Two heat exchanger designs were studied: an advanced metallic heat exchanger based on manifold-microchannel technology, and a polymer heat exchanger based on prime surface technology. Polymer heat exchangers offer several advantages over metallic ones, such as fouling and corrosion resistance and light weight, and they are often less expensive. Numerical modeling and optimization were performed to find a design yielding optimum performance. The optimization results show a significant performance enhancement over conventional heat exchanger surfaces such as wavy fins and plain plate fins. Both heat exchangers were then scaled down, fabricated using additive manufacturing, and experimentally tested.
The manifold-microchannel design demonstrated that, despite some fabrication inaccuracies, a 15-50% increase in heat transfer coefficient over a conventional wavy-fin surface was possible at the same pressure drop. If the fabrication inaccuracies can be eliminated, an even larger performance enhancement is predicted. Since metal-based additive manufacturing is still in the developmental stage, it is anticipated that further refinement of the manufacturing process will improve fabrication accuracy in future designs. For the polymer heat exchanger, by fabricating a very thin wall (150 µm), the wall thermal resistance, which is usually the limiting resistance for polymer heat exchangers, was calculated to account for only up to 3% of the total thermal resistance. A comparison of the air-side heat transfer coefficient of the polymer heat exchanger with some commercially available plain plate fin surfaces shows that the polymer heat exchanger's performance is equal or superior. This shows the promising potential for polymer heat exchangers to compete with conventional metallic heat exchangers when additive-manufacturing-enabled fabrication is utilized. The major contributions of this study are as follows: (1) it demonstrated, for the first time, the potential of additive manufacturing in metal printing of heat exchangers that benefit from a sophisticated design to yield performance substantially above that of the respective conventional systems; such heat exchangers cannot be fabricated with conventional techniques. (2) It demonstrated, for the first time, the potential of additive manufacturing to produce polymer heat exchangers that by design minimize the role of thermal conductivity and deliver thermal performance equal to or better than that of their metallic counterparts, in addition to the other advantages of polymer over metal, such as fouling and corrosion resistance and light weight. Details of the work are documented in the respective chapters of this thesis.
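The claim that a thin polymer wall contributes only a small share of the total thermal resistance can be checked with a one-dimensional resistance network. In the sketch below, only the 150 µm wall thickness comes from the dissertation; the conductivity and the two convective coefficients are illustrative assumptions, so the printed percentage is indicative only.

```python
# Back-of-envelope check of the wall-resistance claim (per unit area, 1-D).
t_wall = 150e-6     # wall thickness [m], from the dissertation
k_polymer = 0.25    # thermal conductivity of a typical polymer [W/m-K] (assumed)
h_air = 150.0       # air-side heat transfer coefficient [W/m^2-K] (assumed)
h_water = 5000.0    # liquid-side heat transfer coefficient [W/m^2-K] (assumed)

r_wall = t_wall / k_polymer   # conduction resistance of the wall [m^2-K/W]
r_air = 1.0 / h_air           # air-side convective resistance
r_water = 1.0 / h_water       # liquid-side convective resistance
r_total = r_wall + r_air + r_water

wall_fraction = r_wall / r_total
print(f"wall share of total resistance: {wall_fraction:.1%}")
```

Even with these rough numbers the air-side convection dominates the series network, which is why a thin polymer wall can compete with metal despite its low conductivity.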

Relevance:

20.00%

Publisher:

Abstract:

The goal of Vehicle Routing Problems (VRP) and their variants is to transport a set of orders with the minimum number of vehicles at least cost. Most approaches are designed to solve specific problem variants independently, whereas real-world applications must handle different constraints concurrently. This research extends solutions obtained for the traveling salesman problem with time windows to a much wider class of route-planning problems in logistics. The work describes a novel approach that:

- supports a heterogeneous fleet of vehicles
- dynamically reduces the number of vehicles
- respects individual capacity restrictions
- satisfies pickup and delivery constraints
- takes Hamiltonian paths (rather than cycles)

The proposed approach uses Monte-Carlo Tree Search, in particular Nested Rollout Policy Adaptation. For the evaluation of the work, real data was obtained from industry and tested, and the results are reported.
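A minimal sketch of Nested Rollout Policy Adaptation (the search method named above) on a toy five-city routing instance. The fleet, capacity, time-window and pickup-and-delivery constraints are omitted, and the instance and hyperparameters are invented for illustration; note that the rollout builds a Hamiltonian path rather than a cycle, matching the approach described.

```python
import math
import random

random.seed(0)
CITIES = [(0, 0), (3, 0), (3, 4), (0, 4), (6, 2)]  # hypothetical coordinates

def dist(a, b):
    return math.dist(CITIES[a], CITIES[b])

def tour_cost(tour):
    return sum(dist(tour[i], tour[i + 1]) for i in range(len(tour) - 1))

def rollout(policy):
    # Sample a Hamiltonian path move-by-move with softmax weights from the policy.
    tour, remaining = [0], set(range(1, len(CITIES)))
    while remaining:
        moves = list(remaining)
        weights = [math.exp(policy.get((tour[-1], m), 0.0)) for m in moves]
        nxt = random.choices(moves, weights)[0]
        tour.append(nxt)
        remaining.remove(nxt)
    return tour_cost(tour), tour

def adapt(policy, tour, alpha=1.0):
    # Raise the weight of each move on the best tour; lower competing moves
    # in proportion to their current softmax probability.
    new = dict(policy)
    remaining = set(range(1, len(CITIES)))
    for i in range(len(tour) - 1):
        cur, best_move = tour[i], tour[i + 1]
        moves = list(remaining)
        z = sum(math.exp(policy.get((cur, m), 0.0)) for m in moves)
        for m in moves:
            p = math.exp(policy.get((cur, m), 0.0)) / z
            new[(cur, m)] = new.get((cur, m), 0.0) - alpha * p
        new[(cur, best_move)] = new.get((cur, best_move), 0.0) + alpha
        remaining.remove(best_move)
    return new

def nrpa(level, policy, iterations=20):
    if level == 0:
        return rollout(policy)
    best_cost, best_tour = float("inf"), None
    for _ in range(iterations):
        cost, tour = nrpa(level - 1, policy)
        if cost < best_cost:
            best_cost, best_tour = cost, tour
        policy = adapt(policy, best_tour)
    return best_cost, best_tour

cost, tour = nrpa(level=2, policy={})
print(round(cost, 2), tour)
```

The nesting is what distinguishes NRPA from plain Monte-Carlo search: each level keeps its own best sequence and repeatedly biases the shared rollout policy toward it, which works well for sequencing problems like routing.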

Relevance:

20.00%

Publisher:

Abstract:

The goal of this research was to analyze the strategic management model of the MPRN in light of the methodological guidelines of the Balanced Scorecard. It is based on a theoretical framework covering the themes of new public management, strategic management and the Balanced Scorecard, with a focus on applying the methodology in the public sector. The research is classified as descriptive and exploratory; as to method, it is a case study with a qualitative approach. The subjects of the research are members of the institution involved in its strategic management process. Data were collected by means of semi-structured interviews and document analysis and examined using content analysis. Regarding the research goal, the findings show that the MPRN has not completed the Balanced Scorecard implementation cycle; moreover, important flaws were identified in the organizational alignment steps, especially regarding communication policy, the implementation of incentive actions, and training focused on developing competences. The findings also reveal that the BSC implementation has allowed changes in the institution's dynamics in pursuit of better results; however, the MPRN has faced, and has not adequately overcome, the same difficulties reported in various cases of BSC implementation in public organizations.

Relevance:

20.00%

Publisher:

Abstract:

Toppling analysis of a precariously balanced rock (PBR) can provide insights into the nature of ground motion that has not occurred at that location in the past and, by extension, realistic constraints on peak ground motions for use in engineering design. Earlier approaches have targeted simplistic 2-D models of the rock or modeled the rock-pedestal contact using spring-damper assemblies that require re-calibration for each rock. These analyses also assume that the rock does not slide on the pedestal. Here, a method to model PBRs in three dimensions is presented. The 3-D model is created from a point cloud of the rock, the pedestal, and their interface, obtained using Terrestrial Laser Scanning (TLS). The dynamic response of the model under earthquake excitation is simulated using a rigid body dynamics algorithm. The veracity of this approach is demonstrated by comparisons against data from shake table experiments. Fragility maps for toppling probability of the Echo Cliff PBR and the Pacifico PBR as a function of various ground motion parameters, rock-pedestal interface friction coefficient, and excitation direction are presented. The seismic hazard at these PBR locations is estimated using these maps. Additionally, these maps are used to assess whether the synthetic ground motions at these locations resulting from scenario earthquakes on the San Andreas Fault are realistic (toppling would indicate that the ground motions are unrealistically high).
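The simplistic 2-D treatment that the work above improves upon reduces, in the quasi-static limit, to Housner's rocking-initiation criterion: a rigid rectangular block begins to rock when the peak ground acceleration exceeds g·tan(α), where α is the block's slenderness angle. The sketch below uses illustrative dimensions, not those of the Echo Cliff or Pacifico PBRs.

```python
import math

# Quasi-static rocking-initiation check for an idealized 2-D rectangular block.
g = 9.81           # gravitational acceleration [m/s^2]
half_width = 0.5   # b [m] (illustrative)
half_height = 2.0  # h [m] (illustrative)

alpha = math.atan2(half_width, half_height)  # slenderness angle alpha = atan(b/h)
a_crit = g * math.tan(alpha)                 # PGA needed to initiate rocking

print(f"rocking initiates above {a_crit:.2f} m/s^2 ({a_crit / g:.2f} g)")
```

This criterion only tells you when rocking starts; whether the rock actually topples depends on the full 3-D dynamics, interface friction and excitation direction, which is precisely what the rigid-body simulation described above captures.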

Relevance:

20.00%

Publisher:

Abstract:

Coprime and nested sampling are well-known deterministic sampling techniques that operate at rates significantly lower than the Nyquist rate, yet allow perfect reconstruction of the spectra of wide-sense stationary signals. However, the theoretical guarantees for these samplers assume ideal conditions, such as synchronous sampling and the ability to compute statistical expectations perfectly. This thesis studies the performance of coprime and nested samplers in the spatial and temporal domains when these assumptions are violated. In the spatial domain, the robustness of these samplers is studied by considering arrays with perturbed sensor locations (with unknown perturbations). Simplified expressions for the Fisher Information matrix of perturbed coprime and nested arrays are derived, which explicitly highlight the role of the co-array. It is shown that even in the presence of perturbations, it is possible to resolve $O(M^2)$ sources under appropriate conditions on the size of the grid. The assumption of small perturbations leads to a novel "bi-affine" model in terms of source powers and perturbations. The redundancies in the co-array are then exploited to eliminate the nuisance perturbation variables and reduce the bi-affine problem to a linear underdetermined (sparse) problem in the source powers. This thesis also studies the robustness of coprime sampling to a finite number of samples and to sampling jitter, by analyzing their effects on the quality of the estimated autocorrelation sequence. A variety of bounds on the error introduced by such non-ideal sampling schemes are computed by considering a statistical model for the perturbation. They indicate that coprime sampling leads to stable estimation of the autocorrelation sequence in the presence of small perturbations. Under appropriate assumptions on the distribution of WSS signals, sharp bounds on the estimation error are established, indicating that the error decays exponentially with the number of samples.
The theoretical claims are supported by extensive numerical experiments.
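As a small illustration of the co-array idea invoked above, the sketch below builds a two-level nested array and verifies that its difference co-array is hole-free, which is what allows $O(M^2)$ sources to be resolved with $M$ sensors. The parameters (N1 = N2 = 3) are illustrative.

```python
def diff_set(positions):
    # All pairwise differences: the (virtual) difference co-array.
    return {a - b for a in positions for b in positions}

# Two-level nested array: level 1 at {1..N1}, level 2 at multiples of (N1 + 1).
n1, n2 = 3, 3
nested = list(range(1, n1 + 1)) + [(n1 + 1) * k for k in range(1, n2 + 1)]
# nested = [1, 2, 3, 4, 8, 12]: 6 physical sensors

lags = diff_set(nested)
max_lag = n2 * (n1 + 1) - 1  # = 11 for this configuration
hole_free = all(l in lags for l in range(-max_lag, max_lag + 1))
print(nested, hole_free)
```

Six physical sensors thus yield 23 contiguous co-array lags, and it is this quadratic growth of the contiguous segment, not the physical aperture alone, that survives (in degraded form) under the sensor-location perturbations analyzed in the thesis.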

Relevance:

20.00%

Publisher:

Abstract:

Compressed covariance sensing using quadratic samplers is gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high-dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of Wide-Sense Stationary (WSS) signals, as though they were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we focus on developing novel versions of nested sampling for low-rank Toeplitz covariance estimation and for phase retrieval, where the latter problem finds many applications in high-resolution optical imaging, X-ray crystallography and molecular imaging. The problem of low-rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler, called the Generalized Nested Sampler (GNS), that can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high-dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear-norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value.
The idea of nested sampling also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times for its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that with probability one, it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1-minimization based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
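The sub-Nyquist autocorrelation-recovery property referenced above can be demonstrated numerically: keep only samples at nested positions inside each block of a WSS signal, yet estimate every lag from the difference set. The process (an AR(1) signal) and the parameters (N1 = N2 = 3, block averaging only) are illustrative assumptions, not the thesis's estimators.

```python
import random

random.seed(1)

positions = [1, 2, 3, 4, 8, 12]  # two-level nested sampler (N1 = N2 = 3)
block = 13                       # block length; 6 of every 13 samples are kept
num_blocks = 4000

# Generate an AR(1) process x[t] = 0.8 x[t-1] + w[t], w ~ N(0, 1).
x, prev = [], 0.0
for _ in range(block * num_blocks):
    prev = 0.8 * prev + random.gauss(0.0, 1.0)
    x.append(prev)

# Average lag products over blocks, using only the kept samples.
sums, counts = {}, {}
for b in range(num_blocks):
    kept = [(p, x[b * block + p - 1]) for p in positions]
    for pi, xi in kept:
        for pj, xj in kept:
            lag = pi - pj
            sums[lag] = sums.get(lag, 0.0) + xi * xj
            counts[lag] = counts.get(lag, 0) + 1

r_hat = {lag: sums[lag] / counts[lag] for lag in sums}

# Every lag from -11 to 11 is recovered despite the sub-Nyquist sampling;
# for this AR(1) process, r[k]/r[0] should be close to 0.8**|k|.
print(round(r_hat[1] / r_hat[0], 2))
```

For this process the true values are r[0] = 1/(1 - 0.8²) ≈ 2.78 and r[1]/r[0] = 0.8, and the estimates converge to them as the number of blocks grows, which is the sense in which the signal behaves "as though it was sampled at the Nyquist rate".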

Relevance:

20.00%

Publisher:

Abstract:

Managing an organization, regardless of its nature, is a major challenge today. Initially, success in an organization's management process depends essentially on a well-formulated and well-implemented strategy; in the long run, however, this success is only possible if there is a monitoring mechanism that allows performance to be evaluated over time. The Balanced Scorecard (BSC), created by Robert Kaplan and David Norton in the 1990s, emerged initially as a performance evaluation methodology and quickly came to be used as a strategic management system as well. The tool can be used in the most varied types of organizations, since it can be adapted to the characteristics and specificities of each one. The present study aims to adapt the Balanced Scorecard to a sports organization, the swimming pools of the Associação Humanitária de Bombeiros Voluntários de Colares; with respect to methodology, it is therefore a case study. To support the study coherently, a set of interviews and documentary analysis were used as data collection instruments. The main conclusion drawn from this study is that the BSC is indeed a very useful tool for developing an organization's strategy and evaluating its performance.

Relevance:

20.00%

Publisher:

Abstract:

In the current context, in which society demands quality services, performance evaluation systems are essential for organizations to tread the path to success. The Balanced Scorecard methodology is therefore well suited to addressing a set of problems common to many organizations, one of which is management based exclusively on financial indicators. The methodology employed in this work is descriptive research in the form of a case study. The research instruments used for data collection were unstructured interviews, satisfaction surveys of the school's users and employees, and documentary analysis. The main conclusion of this study is that the Balanced Scorecard methodology can bring improvements across its perspectives, learning and growth, internal processes, financial, and customers, giving the manager a clearer and more precise view of the path to be followed in order to achieve all objectives without ever straying from the mission.

Relevance:

20.00%

Publisher:

Abstract:

Master's dissertation in Corporate Finance, Faculdade de Economia, Universidade do Algarve, 2014

Relevance:

20.00%

Publisher:

Abstract:

Master's dissertation in Business Management, Faculdade de Economia, Universidade do Algarve, 2004