941 results for Biofertilizer and optimization
Abstract:
This document presents and analyzes information on how companies in the natural gas extraction and gas-fired power generation sector make investment decisions, focusing on the logic they apply when undertaking this process. This responds to the constant need for processes that support sounder decisions, drawing on every tool available for that purpose. Logic is one such tool, since it links factors together in order to obtain positive results. It is therefore important to understand how, and in which contexts, this tool is used. To give the study a sharper focus, it concentrates on a specific sector: oil and natural gas extraction. This reflects the existing need for a theoretical foundation that clearly establishes the appropriate way to make decisions in a sector as diverse and complex as this one. The current business context demands a global vision, not one based on the linear causal logic that serves as today's reference. The oil and natural gas extraction sector is a distinctive example of how investment decisions are made, since most of its companies are capital-intensive and maintain a high flow of monetary resources.
Abstract:
The first part of this work focused on the characterization and optimization of the activation process of recombinant onconase in order to obtain an enzyme identical to the native form. To this end, the reactions removing Met-1 and cyclizing Glu1, which are required to generate the pyroglutamate residue, were followed by MALDI-TOF MS. The second part of this work focused on the contribution of the 30-75 disulfide bond of onconase to its biological properties. The results suggest that the redox potential of the cell cytosol could reduce the 30-75 disulfide bond of wild-type onconase, affecting the binding of onconase to the protein ribonuclease inhibitor. The third part consisted of constructing variants of HP-RNase and onconase with bactericidal activity. For this, the bactericidal determinant (YRWR) described for the eosinophil cationic protein was introduced into both enzymes. The results showed that the two ribonucleases carrying the bactericidal determinant are cytotoxic preferentially against gram-negative bacteria.
Abstract:
We have designed a highly parallel architecture for a simple genetic algorithm using a pipeline of systolic arrays. The systolic design provides high throughput and unidirectional pipelining by exploiting the implicit parallelism in the genetic operators. The design is significant because, unlike other hardware genetic algorithms, it is independent of both the fitness function and the particular chromosome length used in a problem. We have designed and simulated a version of the mutation array using Xilinx FPGA tools to investigate the feasibility of hardware implementation. A simple 5-chromosome mutation array occupies 195 CLBs and is capable of performing more than one million mutations per second. I. Introduction. Genetic algorithms (GAs) are established search and optimization techniques which have been applied to a range of engineering and applied problems with considerable success [1]. They operate by maintaining a population of trial solutions, encoded using a suitable encoding scheme.
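The mutation operator that the array implements in hardware is, in software terms, an independent test-and-flip on each gene. A minimal sketch, assuming bit-string chromosomes; the 16-gene length and 10% rate are illustrative values, not figures from the paper:

```python
import random

def mutate(chromosome, rate=0.01, rng=random):
    """Flip each bit of a binary chromosome independently with probability `rate`."""
    return [bit ^ 1 if rng.random() < rate else bit for bit in chromosome]

# The 5-chromosome systolic array processes one gene per cell per clock cycle;
# here a whole 16-gene chromosome is processed at once in software.
rng = random.Random(42)
population = [[rng.randint(0, 1) for _ in range(16)] for _ in range(5)]
mutated = [mutate(c, rate=0.1, rng=rng) for c in population]
```

Because the flip test depends only on the local gene and a random draw, each array cell can work independently, which is the implicit parallelism the abstract refers to.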
Abstract:
A parallel hardware random number generator for use with a VLSI genetic algorithm processing device is proposed. The design uses a systolic array of mixed congruential random number generators. The generators are constantly reseeded with the outputs of the preceding generators to avoid significant biasing of the randomness of the array, which would result in longer times for the algorithm to converge to a solution. 1 Introduction. In recent years there has been growing interest in developing hardware genetic algorithm devices [1, 2, 3]. A genetic algorithm (GA) is a stochastic search and optimization technique which attempts to capture the power of natural selection by evolving a population of candidate solutions through a process of selection and reproduction [4]. In keeping with the evolutionary analogy, the solutions are called chromosomes, with each chromosome containing a number of genes. Chromosomes are commonly simple binary strings, the bits being the genes.
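The reseeding scheme can be sketched in software as a ring of mixed (linear) congruential generators, each reseeded after every step with the output of its predecessor. The LCG constants and ring topology below are illustrative assumptions, not the VLSI design itself:

```python
class MixedCongruential:
    """Mixed congruential generator: x_{n+1} = (a*x_n + c) mod m, with c != 0."""
    def __init__(self, seed, a=1664525, c=1013904223, m=2**32):
        self.state, self.a, self.c, self.m = seed % m, a, c, m

    def step(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

def array_step(gens):
    """One systolic step: every generator emits a value, then each is
    reseeded with the output of the preceding generator in the ring."""
    outputs = [g.step() for g in gens]
    for i, g in enumerate(gens):
        g.state = outputs[i - 1]
    return outputs

gens = [MixedCongruential(seed) for seed in (1, 2, 3, 4)]
stream = [v for _ in range(8) for v in array_step(gens)]
```

The cross-reseeding decorrelates the per-cell streams, which is the point the abstract makes: without it, identically parameterized generators would drift into correlated sequences and bias the GA.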
Abstract:
Measurement or prediction of the mechanical and fracture properties of foods is very important in the design, operation and optimization of processes, as well as for the control of quality of food products. This paper describes the measurement of yield stress of frozen sucrose solutions under indentation tests using a spherical indenter. Effects of composition, temperature and strain rate on yield stress of frozen sucrose solutions have also been investigated.
Abstract:
Control and optimization of flavor is the ultimate challenge for the food and flavor industry. The major route to flavor formation during thermal processing is the Maillard reaction, a complex cascade of interdependent reactions initiated by the reaction between a reducing sugar and an amino compound. The complexity of the reaction means that researchers turn to kinetic modeling in order to understand the control points of the reaction and to manipulate the flavor profile. Studies of the kinetics of flavor formation have developed over the past 30 years from single-response empirical models of binary aqueous systems to sophisticated multi-response models in food matrices, based on the underlying chemistry, with the power to predict the formation of some key aroma compounds. This paper discusses in detail the development of kinetic models of thermal flavor generation and looks at the challenges involved in predicting flavor.
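As a toy illustration of the single-response empirical models the field started from, a first-order formation step with an Arrhenius rate constant can be integrated numerically. The pre-exponential factor and activation energy below are made-up values, not kinetic parameters from the paper:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

def first_order_formation(c0, A, Ea, T, t_end, dt=1.0):
    """Euler integration of precursor -> flavor compound, d[P]/dt = -k[P].
    Returns (remaining precursor, flavor formed) after t_end seconds."""
    k = arrhenius(A, Ea, T)
    p, f = c0, 0.0
    for _ in range(int(t_end / dt)):
        dp = -k * p * dt
        p += dp
        f -= dp
    return p, f

# Illustrative comparison: heating for 10 min at 100 C vs 140 C.
p100, f100 = first_order_formation(1.0, A=1e8, Ea=1.0e5, T=373.15, t_end=600)
p140, f140 = first_order_formation(1.0, A=1e8, Ea=1.0e5, T=413.15, t_end=600)
```

The multi-response models the abstract describes couple many such steps with shared intermediates, but the temperature sensitivity driving process control is already visible in this single-response form.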
Abstract:
This work investigates a type of wireless power system whose analysis yields the construction of a prototype modeled as a single technological artifact. Exploration of the artifact forms the intellectual basis not only for its prototypical forms but also suggests variant forms not yet discovered. The process clarifies the role of the artifact, its most suitable application given the constraints of the delivery problem, and the optimization strategies that improve it. To improve maturity and contribute to a body of knowledge, this document proposes research using efficient mid-field inductive transfer to remove wired connections and electrical contacts. While that description states the purpose of this work, it does not convey the compromise of having to redraw the lines of demarcation between near and far field drawn by the traditional treatment of broadcasting. Two striking scenarios are addressed in this thesis: first, the mathematical explanation of wireless power follows from J.C. Maxwell's original equations; second, the behavior of wireless power in the circuit follows from Joseph Larmor's fundamental work on the dynamics of the field concept. A model of propagation is presented which matches experimental observations, together with a modified model of the dipole that addresses the phenomena observed in theory and experiment. Two distinct sets of experiments test the concepts of single and two coupled modes, and in the more esoteric context of the zero- and first-order magnetic field, a third coupled mode is suggested. Through the remaking of wireless power in this context, the author intends to show the reader that ideas lost to history, bound to a path of complete obscurity, are once again innovative and useful.
Abstract:
We report the design and operation of a device for AC magnetic susceptibility measurements that can operate down to 1 mK. The device, a modification of the standard mutual inductance bridge, is designed with detailed consideration of the thermalization and optimization of each element. First, in order to reduce local heating, the primary coil is made with superconducting wire. Second, a low-temperature transformer, thermally anchored to the mixing chamber of a dilution refrigerator, is used to match the output of the secondary coil to a high-sensitivity bridge detector. Careful thermal anchoring of the secondary coil and the matching transformer is required to reduce the overall noise temperature and maximize sensitivity. The sample is immersed in liquid 3He to minimize the Kapitza thermal resistance. The magnetic susceptibility of several magnetic compounds, such as the well-known spin-gap compound NiCl2·4SC(NH2)2 and other powdered samples, has been successfully measured at temperatures well below 10 mK.
Abstract:
This work describes the development and optimization of a sequential injection method to automate the determination of paraquat by square-wave voltammetry employing a hanging mercury drop electrode. Automation by sequential injection enhanced the sampling throughput, improving the sensitivity and precision of the measurements as a consequence of the highly reproducible and efficient conditions of mass transport of the analyte toward the electrode surface. For instance, 212 analyses can be made per hour if the sample/standard solution is prepared off-line and the sequential injection system is used just to inject the solution into the flow cell. In-line sample conditioning reduces the sampling frequency to 44 h⁻¹. Experiments were performed in 0.10 M NaCl, which was the carrier solution, using a frequency of 200 Hz, a pulse height of 25 mV, a potential step of 2 mV, and a flow rate of 100 μL s⁻¹. For a concentration range between 0.010 and 0.25 mg L⁻¹, the current (i_p, μA) read at the potential corresponding to the peak maximum fitted the following linear equation in the paraquat concentration (mg L⁻¹): i_p = (-20.5 ± 0.3)·C_paraquat - (0.02 ± 0.03). The limits of detection and quantification were 2.0 and 7.0 μg L⁻¹, respectively. The accuracy of the method was evaluated by recovery studies using spiked water samples that were also analyzed by molecular absorption spectrophotometry after reduction of paraquat with sodium dithionite in an alkaline medium. No evidence of statistically significant differences between the two methods was observed at the 95% confidence level.
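The reported calibration line can be inverted directly to back-calculate a paraquat concentration from a measured peak current. A small sketch using the slope and intercept quoted in the abstract (current in μA, concentration in mg/L); the helper names are my own:

```python
def paraquat_conc(i_p, slope=-20.5, intercept=-0.02):
    """Invert the calibration i_p = slope * C + intercept to get C (mg/L)
    from a peak current i_p (microamperes; negative, cathodic peak)."""
    return (i_p - intercept) / slope

def within_linear_range(c, low=0.010, high=0.25):
    """Check a back-calculated concentration against the reported linear range."""
    return low <= c <= high

# On the reported line, a peak current of -2.07 uA corresponds to 0.1 mg/L.
c = paraquat_conc(-2.07)
```

Uncertainties on the slope and intercept (±0.3 and ±0.03) would propagate into the concentration; they are omitted here for brevity.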
Abstract:
Random effect models have been widely applied in many fields of research. However, models with uncertain design matrices for random effects have received little attention. In some applications with this problem, an expectation method has been used for simplicity, but this method discards the extra information about the uncertainty in the design matrix. A closed-form solution to this problem is generally difficult to attain. We therefore propose a two-step algorithm for estimating the parameters, especially the variance components in the model. The implementation is based on Monte Carlo approximation and a Newton-Raphson-based EM algorithm. As an example, a simulated genetics dataset was analyzed. The results showed that the proportion of the total variance explained by the random effects was accurately estimated, whereas it was highly underestimated by the expectation method. By introducing heuristic search and optimization methods, the algorithm could be developed further to infer the 'model-based' best design matrix and the corresponding best estimates.
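The full Monte Carlo EM with an uncertain design matrix is beyond an abstract, but the EM flavor of the algorithm can be illustrated on the simplest random-effects model with a known design, a balanced one-way layout y_ij = mu + u_i + e_ij. The exact-posterior E-step below is what a Monte Carlo approximation would replace; everything here is a generic sketch, not the authors' algorithm:

```python
import random
import statistics

def em_one_way(y, n_iter=200):
    """EM estimates of (mu, var_u, var_e) for y[i][j] = mu + u_i + e_ij,
    u_i ~ N(0, var_u), e_ij ~ N(0, var_e), balanced groups."""
    m, n = len(y), len(y[0])
    mu = statistics.mean(v for row in y for v in row)
    var_u, var_e = 1.0, 1.0
    for _ in range(n_iter):
        # E-step: exact posterior mean/variance of each random effect u_i
        post_var = 1.0 / (n / var_e + 1.0 / var_u)
        post_mean = [post_var * n * (statistics.mean(row) - mu) / var_e
                     for row in y]
        # M-step: maximize the expected complete-data log-likelihood
        mu = statistics.mean(y[i][j] - post_mean[i]
                             for i in range(m) for j in range(n))
        var_u = statistics.mean(pm * pm + post_var for pm in post_mean)
        var_e = statistics.mean((y[i][j] - mu - post_mean[i]) ** 2 + post_var
                                for i in range(m) for j in range(n))
    return mu, var_u, var_e

# Simulated data: 200 groups of 5 observations, true mu=2, var_u=4, var_e=1.
rng = random.Random(1)
y = [[2.0 + u + rng.gauss(0.0, 1.0) for _ in range(5)]
     for u in [rng.gauss(0.0, 2.0) for _ in range(200)]]
mu_hat, var_u_hat, var_e_hat = em_one_way(y)
```

When the design matrix itself is uncertain, the posterior moments in the E-step no longer have a closed form, which is where the Monte Carlo approximation of the abstract comes in.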
Abstract:
In the current scenario, where globalization, combined with more demanding customers, forces companies to compete harder, agility in product development and optimization becomes crucial for their survival in the market. In this context, several techniques used in Quality Engineering were compiled into an integrated method for the Experimental Optimization of Mixtures. These techniques deliver much faster and cheaper results than the traditional practice of varying one mixture component at a time, because fewer trials are needed. However, although they are not that recent, the tools applicable to mixture optimization have not been used by their main beneficiary (industry), probably for lack of awareness of their existence or, above all, because of the complexity of the calculations involved. Therefore, in addition to the proposed method, a software tool was also developed that implements all the suggested steps, in order to make them even easier to apply by people who are not specialists in statistical techniques. Using this software (OptiMix), the method was tested in a real situation and in a comparative study against a report from the literature, in order to assess its validity, the need for adaptations, and the consistency of its results. The evaluation of the case studies showed that the proposed method yields results consistent with those of other alternative techniques, with the advantage that the user does not need to perform calculations, thus avoiding errors and speeding up the optimization process.
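OptiMix itself is not described in enough detail to reproduce, but the standard experimental design behind mixture optimization, the {q, m} simplex-lattice, is easy to sketch: it enumerates every blend whose proportions are multiples of 1/m and sum to 1, which is why so few trials are needed compared with varying one component at a time. A minimal generator, assuming nothing about the thesis's own method:

```python
from fractions import Fraction
from itertools import product

def simplex_lattice(q, m):
    """All q-component blends with proportions i/m (i = 0..m) summing to 1,
    i.e. the points of the {q, m} simplex-lattice design."""
    return [tuple(Fraction(i, m) for i in combo)
            for combo in product(range(m + 1), repeat=q)
            if sum(combo) == m]

# {3,2} lattice: 6 blends (3 pure components + 3 binary 50/50 mixtures),
# enough to fit a quadratic Scheffe model for a 3-component mixture.
design = simplex_lattice(3, 2)
```

The lattice has C(q+m-1, m) points, so a quadratic model for three components needs only six runs.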
Abstract:
Service outsourcing is considered a management tool in modern times. This mode of contracting services is expanding in both the public and private spheres. In the latter, the globalization of the economy and unbridled worldwide competition drive productivity and the optimization of production stages, replacing fixed costs with variable ones. In the public sphere, from the 1970s onward, the fiscal crisis dominated most discussions, suggesting the neoliberal idea of limiting state intervention in the economy to contain the public deficit. The reformist solution emerged of isolating in a small core the principal activities that are exclusive to the State and cannot be transferred to third parties. Through privatization, one of the axes of the reform, social services became transferable to the non-state public sector and the production of public goods and services was handed over to the market. This work focuses on producing results that strategically support the Finance Secretariat of the State of Pernambuco, in financial terms and in terms of administrative efficiency, in choosing the more advantageous option for the Administration: hiring permanent civil servants through public examinations to perform ancillary support activities, or outsourcing those services.
Abstract:
Balancing global warming, food scarcity and growing energy needs has become today's great worldwide challenge. Several agricultural crops can be exploited strategically to help solve this problem; among them, the sunflower (Helianthus annuus) stands out. The sunflower is one of the four largest oilseed crops in the world, grown successfully on five continents over a cultivated area exceeding 22 million hectares. Brazil's share of that total is below 1%, a small participation believed to stem from socio-economic and technological factors. It should be noted, however, that Brazil, thanks to its natural comparative advantages and its constructed competitive advantages, enjoys favorable conditions for the crop's development. Given these facts, the objective of this work is to deepen knowledge of the sunflower production chain and, through its use as a competitiveness strategy, to assess systemically its impacts on the country's agricultural matrix. Among the many advantages of this crop are: agronomic, physical, chemical and organoleptic characteristics, and a versatility that allows the use and optimization of production factors already available; planting season (adaptability to different soil and climate conditions), allowing cultivation from Rio Grande do Sul to the State of Roraima; a root system (roots can reach two meters deep) that makes better use of soil nutrients and water and promotes nutrient recycling; high oil content in the seeds (30% to 55%); and the high commercial value of its co-products. This set of characteristics is analyzed through the lens of the theory of competitive advantage and of economies of scale and scope, showing that, with intelligence and pragmatism, the sunflower crop can repeat, with advantages, what soybean represented for Brazilian agribusiness.
Abstract:
Multiphase flow is common throughout the fluid path in the oil and gas industry: production, transportation and refining. Multiphase flow is defined as the simultaneous flow of two or more immiscible phases with different properties. An important computational tool for the design, planning and optimization of production systems is the simulation of multiphase flow in pipelines and porous media, usually performed by commercial multiphase flow simulators. The main purpose of these simulators is to predict pressure and temperature at any point in the production system. This work proposes the development of a multiphase flow simulator able to predict the dynamic pressure and temperature gradients in vertical, directional and horizontal wells. The pressure and temperature profiles were predicted by numerical integration using a marching algorithm, with empirical correlations and a mechanistic model to predict the pressure gradient. The development of this tool involved a set of routines implemented in Embarcadero C++ Builder® 2010, which allowed the creation of an executable compatible with Microsoft Windows® operating systems. The simulator was validated through computational experiments, comparing the results with PIPESIM®. In general, the developed simulator achieved excellent results compared with those obtained by PIPESIM and can be used as a tool to assist the development of production systems.
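A marching algorithm steps along the well, evaluating a local pressure-gradient model and accumulating the pressure station by station. A single-phase, incompressible sketch with hydrostatic and Darcy-Weisbach friction terms; a real multiphase simulator would replace `dpdz` with an empirical correlation or mechanistic model, and all parameters below are illustrative:

```python
def pressure_profile(p_wellhead, depth, rho, v, f, d, n_steps=100, g=9.81):
    """March pressure from wellhead to bottomhole in a vertical well using
    dp/dz = rho*g + f*rho*v**2 / (2*d)   (hydrostatic + friction, SI units).
    Returns the list of pressures at each station, wellhead first."""
    dz = depth / n_steps
    p = p_wellhead
    profile = [p]
    for _ in range(n_steps):
        dpdz = rho * g + f * rho * v * v / (2.0 * d)
        p += dpdz * dz
        profile.append(p)
    return profile

# Static 1000 m water column (no flow): bottomhole pressure exceeds the
# wellhead pressure by rho*g*h, about 9.81 MPa.
profile = pressure_profile(p_wellhead=1.0e5, depth=1000.0,
                           rho=1000.0, v=0.0, f=0.02, d=0.1)
```

In the dynamic case the gradient depends on local pressure and temperature through the fluid properties, so each step re-evaluates the correlation at the current state, which is exactly what makes the marching structure necessary.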
Abstract:
Tuberculosis is a serious disease, but curable in practically 100% of new cases, provided the principles of modern chemotherapy are followed. Isoniazid (ISN), rifampicin (RIF), pyrazinamide (PYR) and ethambutol hydrochloride (ETA) are considered first-line drugs in the treatment of tuberculosis, combining the highest level of efficiency with an acceptable degree of toxicity. According to USP 33 - NF28 (2010), the chromatographic analysis of three of the four drugs (ISN, PYR and RIF) takes on average 15 minutes, plus 10 more minutes to determine the fourth drug (ETA) using a different column and mobile-phase mixture, which makes its industrial application unfavorable. Thus, many studies have been carried out to minimize this problem. An alternative is UFLC, which is based on the same principles as HPLC but uses stationary phases with particles smaller than 2 μm. This study therefore aims to develop and validate new analytical methods to determine the four drugs simultaneously by HPLC/DAD and UFLC/DAD. An analytical screening was first carried out, which showed that a gradient of mobile phases A (acetate buffer:methanol, 94:6 v/v) and B (acetate buffer:acetonitrile, 55:45 v/v) is necessary. After development and optimization of the method on HPLC and UFLC, with system-suitability values within the criteria required for both techniques, validation began. Standard solutions and tablet test solutions containing 0.008 mg/mL ISN, 0.043 mg/mL PYR, 0.030 mg/mL ETA and 0.016 mg/mL RIF were prepared and injected into the HPLC and UFLC. The analytical methods were validated on both instruments by determining specificity/selectivity, the analytical curve, linearity, precision, limits of detection and quantification, accuracy and robustness. The methods proved adequate for determining the four drugs separately, without mutual interference.
They were precise, since both the inter-day variation and the repeatability values stayed within the levels required by the regulatory agency. They were linear (R > 0.99), producing results directly proportional to the analyte concentration within the specified range. They were accurate, with coefficients of variation and recovery percentages within the required limits (98 to 102%). The very low LOD and LOQ values showed the high sensitivity of the methods for the four drugs. Robustness was evaluated against temperature and flow changes; the methods were robust only under the previously established conditions, and abrupt changes may influence the results.
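The accuracy criterion quoted above (recoveries between 98 and 102%) is simple to encode. A small helper with made-up spiked/found values for illustration, not data from the study:

```python
def recovery_percent(found, added):
    """Recovery (%) of a spiked amount: 100 * found / added."""
    return 100.0 * found / added

def passes_accuracy(recoveries, low=98.0, high=102.0):
    """True if every mean recovery lies within the required limits."""
    return all(low <= r <= high for r in recoveries)

# Illustrative spiked-sample recoveries for the four drugs (not real data).
recoveries = [recovery_percent(f, a)
              for f, a in [(99.1, 100.0), (100.8, 100.0),
                           (98.4, 100.0), (101.5, 100.0)]]
ok = passes_accuracy(recoveries)
```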