992 results for Algorithm transfer


Relevance:

20.00%

Publisher:

Abstract:

This study presents an experimental program to assess the tensile strain distribution along prestressed carbon fiber reinforced polymer (CFRP) reinforcement applied flexurally to the tensile surface of RC beams according to the near surface mounted (NSM) technique. Moreover, the current study proposes an analytical formulation, within a design framework, for predicting the distribution of CFRP tensile strain and bond shear stress and, additionally, the prestress transfer length. After demonstrating the good predictive performance of the proposed analytical approach, parametric studies were carried out to evaluate analytically the influence of the main material properties, and of the CFRP and groove cross sections, on the distribution of CFRP tensile strain and bond shear stress, and on the prestress transfer length. The proposed analytical approach can also predict the evolution of the prestress transfer length during the curing time of the adhesive by considering the variation of its elasticity modulus during this period.
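As a rough illustration of the transfer-length idea, the sketch below assumes the CFRP strain builds up exponentially from the free end with a hypothetical bond parameter `beta`; this is not the paper's formulation, only a common simplified bond model. A stiffer, fully cured adhesive corresponds to a larger `beta` and hence a shorter transfer length.

```python
import math

def strain_profile(x, eps_p, beta):
    """CFRP tensile strain at distance x from the free end, assuming an
    exponential build-up toward the applied prestrain eps_p
    (illustrative bond model, not the paper's formulation)."""
    return eps_p * (1.0 - math.exp(-beta * x))

def transfer_length(beta, fraction=0.95):
    """Distance needed for the strain to reach `fraction` of the prestrain."""
    return -math.log(1.0 - fraction) / beta

# A stiffer adhesive (larger beta, e.g. after curing) shortens the
# prestress transfer length; beta values here are hypothetical [1/m].
lt_fresh = transfer_length(beta=5.0)   # soft, early-cure adhesive
lt_cured = transfer_length(beta=20.0)  # fully cured adhesive
```

This mirrors the abstract's observation that the transfer length evolves with the adhesive's elasticity modulus during curing.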

Relevance:

20.00%

Publisher:

Abstract:

Universities are increasingly institutionalizing activities related to technology transfer, and one of the main institutional mechanisms that has emerged is the “technology transfer unit” (TTU). Many TTUs focus their activities on managing the university's intellectual property. Studies have investigated factors that appear to affect their performance, but few have looked in detail at the internal procedures and techniques used in their technology evaluation and licensing processes. The aim of this paper is to provide a comprehensive overview of the steps that comprise the technology evaluation and licensing processes, with an analysis of the critical issues that affect each step. A literature review was carried out, complemented by interviews with seven university TTUs; the interviews served as a check on and complement to the literature review, and as a way of perceiving, from an insider perspective, the problems and issues this paper aims to emphasize and state clearly.

Relevance:

20.00%

Publisher:

Abstract:

The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of various thermal phenomena developing during the manufacturing sequence, including convection and radiation to the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments, and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
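Two of the phenomena ranked in such analyses, convection and radiation to the environment, can be sketched with a lumped-capacitance cooling model of a single deposited filament. All property values below are illustrative round numbers, not data from the paper.

```python
# Lumped-capacitance cooling of one deposited filament: convection and
# radiation to the environment only (conduction to the support and to
# adjacent filaments, also discussed in the study, are omitted here).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant [W/m^2 K^4]

def cool_filament(T0, T_env, h, eps, d, rho, cp, dt, steps):
    """Explicit Euler integration of dT/dt for a cylindrical filament.
    For a cylinder of diameter d, surface area / volume = 4 / d."""
    T, av = T0, 4.0 / d
    for _ in range(steps):
        q_conv = h * (T - T_env)                 # convection [W/m^2]
        q_rad = eps * SIGMA * (T**4 - T_env**4)  # radiation  [W/m^2]
        T -= dt * av * (q_conv + q_rad) / (rho * cp)
    return T

# An ABS-like filament extruded at 540 K cooling in 300 K still air
# (h, eps and the material data are assumed, not measured values):
T_end = cool_filament(T0=540.0, T_env=300.0, h=30.0, eps=0.9,
                      d=0.3e-3, rho=1050.0, cp=2000.0, dt=1e-3, steps=2000)
```

Comparing the relative sizes of `q_conv` and `q_rad` over the run is exactly the kind of term-by-term bookkeeping the study performs to decide which phenomena matter for numerical modelling.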

Relevance:

20.00%

Publisher:

Abstract:

The artificial fish swarm algorithm has recently emerged in continuous global optimization. It uses points of a population in space to identify the position of fish in the school. Many real-world optimization problems are described by 0-1 multidimensional knapsack problems, which are NP-hard. In the last decades, several exact as well as heuristic methods have been proposed for solving these problems. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviors, which are randomly selected by using two user-defined probability values. In order to make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure aiming to increase the profit through the addition of more items to the knapsack. A cyclic reinitialization of 50% of the population, and a simple local search that allows the progress of a small percentage of points towards optimality and afterwards refines the best point in the population, greatly improve the quality of the solutions. The presented method is tested on a set of benchmark instances, and a comparison with other methods available in the literature is shown. The comparison shows that the proposed method is a viable alternative for solving these problems.
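The feasibility repair the abstract describes — a random drop-item pass followed by a profit-driven add-item pass — might be sketched as follows. The exact selection and ordering rules are assumptions for illustration, not the paper's procedure.

```python
import random

def repair(point, profits, weights, capacity, rng=None):
    """Make a 0/1 knapsack point feasible: randomly drop packed items
    until the capacity constraint holds, then greedily add profitable
    items that still fit (illustrative reading of the abstract)."""
    rng = rng or random.Random(1)
    x = list(point)
    # Drop randomly chosen packed items until the knapsack is feasible.
    while sum(w for w, b in zip(weights, x) if b) > capacity:
        x[rng.choice([i for i, b in enumerate(x) if b])] = 0
    # Add items (most profitable first) while they still fit.
    load = sum(w for w, b in zip(weights, x) if b)
    for i in sorted(range(len(x)), key=lambda i: -profits[i]):
        if not x[i] and load + weights[i] <= capacity:
            x[i], load = 1, load + weights[i]
    return x

profits, weights, cap = [10, 7, 5, 3], [6, 4, 3, 2], 8
feasible = repair([1, 1, 1, 1], profits, weights, cap)  # start infeasible
```

The drop pass restores feasibility cheaply; the add pass recovers profit lost by over-dropping, matching the abstract's two-stage description.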

Relevance:

20.00%

Publisher:

Abstract:

The Electromagnetism-like (EM) algorithm is a population-based stochastic global optimization algorithm that uses an attraction-repulsion mechanism to move sample points towards the optimum. In this paper, an implementation of the EM algorithm in the Matlab environment is proposed as a useful function for practitioners and for those who want to experiment with a new global optimization solver. A set of benchmark problems is solved in order to evaluate the performance of the implemented method when compared with other stochastic methods available in the Matlab environment. The results confirm that our implementation is a competitive alternative both in terms of numerical results and performance. Finally, a case study based on a parameter estimation problem of a biological system shows that the EM implementation can be applied with promising results in the control optimization area.
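The attraction-repulsion mechanism can be sketched roughly as below (in Python rather than Matlab, for a self-contained example). The charge formula follows the commonly cited EM scheme, but treat the details as illustrative, not as this paper's implementation.

```python
import math

def em_charges(points, values):
    """Charge of each point grows with its quality relative to the best
    point (minimization); better points get charges closer to 1."""
    best, n = min(values), len(points)
    denom = sum(v - best for v in values) or 1.0
    return [math.exp(-n * (v - best) / denom) for v in values]

def em_force(i, points, values, charges):
    """Total force on point i: attracted by better points, repelled by
    worse ones, scaled by the charge product over squared distance."""
    force = [0.0] * len(points[i])
    for j, p in enumerate(points):
        if j == i:
            continue
        diff = [pj - pi for pj, pi in zip(p, points[i])]
        d2 = sum(c * c for c in diff) or 1e-12
        s = charges[i] * charges[j] / d2
        sign = 1.0 if values[j] < values[i] else -1.0  # attract or repel
        force = [f + sign * s * c for f, c in zip(force, diff)]
    return force

pts = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
vals = [3.0, 1.0, 5.0]  # point 1 is best, point 2 is worst
f0 = em_force(0, pts, vals, em_charges(pts, vals))
```

Point 0 is pulled toward the better point at (1, 0) and pushed away from the worse point at (0, 1), which is the mechanism that moves the population toward the optimum.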

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we propose an extension of the firefly algorithm (FA) to multi-objective optimization. FA is a swarm intelligence optimization algorithm inspired by the flashing behavior of fireflies at night that is capable of computing global solutions to continuous optimization problems. Our proposal relies on a fitness assignment scheme that gives lower fitness values to the positions of fireflies that correspond to non-dominated points with smaller aggregation of objective function distances to the minimum values. Furthermore, FA randomness is based on the spread metric to reduce the gaps between consecutive non-dominated solutions. The obtained results from the preliminary computational experiments show that our proposal gives a dense and well distributed approximated Pareto front with a large number of points.
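A minimal reading of the fitness assignment scheme, assuming minimization and using the sum of per-objective distances to the minima as the aggregation the abstract mentions (the exact scheme in the paper may differ):

```python
def dominates(a, b):
    """Pareto dominance for minimization."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def assign_fitness(objs):
    """Lower fitness is better: non-dominated points are ranked by the
    sum of their objective distances to the per-objective minima;
    dominated points rank last (illustrative reading of the abstract)."""
    mins = [min(col) for col in zip(*objs)]
    fit = []
    for i, a in enumerate(objs):
        if any(dominates(b, a) for j, b in enumerate(objs) if j != i):
            fit.append(float('inf'))
        else:
            fit.append(sum(x - m for x, m in zip(a, mins)))
    return fit

objs = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]  # last is dominated
fitness = assign_fitness(objs)
```

Non-dominated points closer to the ideal point (the vector of objective minima) receive lower fitness and therefore attract other fireflies more strongly.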

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a single-phase Series Active Power Filter (Series APF) for mitigation of the load voltage harmonic content, while maintaining the voltage on the DC side regulated without the support of a voltage source. The proposed control algorithm eliminates the additional voltage source used to regulate the DC voltage, and with the adopted topology no coupling transformer is needed to interface the series active power filter with the electrical power grid. The paper describes the control strategy, which encapsulates the grid synchronization scheme, the compensation voltage calculation, the damping algorithm and the dead-time compensation. The topology and control strategy of the series active power filter have been evaluated in simulation software, and simulation results are presented. Experimental results, obtained with a laboratory prototype, validate the theoretical assumptions and fall within the harmonic spectrum limits imposed by the international recommendations of the IEEE-519 Standard.

Relevance:

20.00%

Publisher:

Abstract:

Olive oils may be commercialized as intense, medium or light, according to the perceived intensity of the fruitiness, bitterness and pungency attributes as assessed by a sensory panel. In this work, the capability of an electronic tongue to correctly classify olive oils according to these sensory intensity levels was evaluated. Cross-sensitivity, non-specific lipid polymeric membranes were used as sensors. The sensor device was first tested using quinine monohydrochloride standard solutions. Mean sensitivities of 14±2 to 25±6 mV/decade, depending on the type of plasticizer used in the lipid membranes, were obtained, showing the device's capability for evaluating bitterness. Then, linear discriminant models based on sub-sets of sensors, selected by a meta-heuristic simulated annealing algorithm, were established, correctly classifying 91% of olive oils according to their sensory intensity grade (leave-one-out cross-validation). This capability was further evaluated using a repeated K-fold cross-validation procedure, showing that the electronic tongue allowed an average correct classification of 80% of the olive oils used for internal validation. The electronic tongue can thus be seen as a taste sensor, able to differentiate olive oils with different sensory intensities, and could be used as a preliminary, complementary and practical tool for panelists during olive oil sensory analysis.
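The sensor-subset selection step can be sketched as a simulated annealing search that maximizes leave-one-out accuracy. Here a nearest-centroid classifier stands in for the paper's linear discriminant models, and the data are synthetic toy values, not e-tongue readings; everything below is an assumption-laden illustration of the approach, not the paper's code.

```python
import math
import random

def loo_accuracy(X, y, subset):
    """Leave-one-out accuracy of a nearest-centroid classifier restricted
    to the sensor columns in `subset` (stand-in for LDA models)."""
    hits = 0
    for i in range(len(X)):
        groups = {}
        for j in range(len(X)):
            if j != i:
                groups.setdefault(y[j], []).append([X[j][k] for k in subset])
        cents = {lab: [sum(c) / len(c) for c in zip(*rows)]
                 for lab, rows in groups.items()}
        probe = [X[i][k] for k in subset]
        pred = min(cents, key=lambda lab: math.dist(probe, cents[lab]))
        hits += pred == y[i]
    return hits / len(X)

def sa_select(X, y, n_sensors, iters=200, t0=0.5, rng=None):
    """Simulated annealing over sensor subsets, maximizing LOO accuracy."""
    rng = rng or random.Random(0)
    cur = set(range(n_sensors))
    cur_acc = loo_accuracy(X, y, sorted(cur))
    best, best_acc = set(cur), cur_acc
    for t in range(iters):
        cand = set(cur) ^ {rng.randrange(n_sensors)}  # flip one sensor
        if not cand:
            continue
        acc = loo_accuracy(X, y, sorted(cand))
        temp = t0 * (1 - t / iters) + 1e-9
        if acc >= cur_acc or rng.random() < math.exp((acc - cur_acc) / temp):
            cur, cur_acc = cand, acc
        if cur_acc > best_acc:
            best, best_acc = set(cur), cur_acc
    return sorted(best), best_acc

# Toy data: sensors 0 and 1 separate the two intensity grades; sensor 2
# is weakly informative noise. Values are synthetic.
data_rng = random.Random(7)
X = [[i + data_rng.random(), i + data_rng.random(), data_rng.random()]
     for i in (0, 0, 0, 5, 5, 5)]
y = ["light", "light", "light", "intense", "intense", "intense"]
sensors, acc = sa_select(X, y, n_sensors=3)
```

The Metropolis acceptance rule lets the search occasionally accept a worse subset early on, which is what lets simulated annealing escape locally optimal sensor combinations.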

Relevance:

20.00%

Publisher:

Abstract:

Natural selection favors the survival and reproduction of organisms that are best adapted to their environment. The selection mechanism in evolutionary algorithms mimics this process, aiming to create environmental conditions in which artificial organisms can evolve, solving the problem at hand. This paper proposes a new selection scheme for evolutionary multiobjective optimization. The similarity measure that defines the concept of neighborhood is a key feature of the proposed selection. Contrary to commonly used approaches, usually defined on the basis of distances between either individuals or weight vectors, it is suggested to define similarity and neighborhood by the angle between individuals in the objective space: the smaller the angle, the more similar the individuals. This notion is exploited during the mating and environmental selections. Convergence is ensured by minimizing the distances from individuals to a reference point, whereas diversity is preserved by maximizing the angles between neighboring individuals. Experimental results reveal a highly competitive performance and useful characteristics of the proposed selection. Its strong diversity-preserving ability allows it to produce significantly better performance on some problems when compared with state-of-the-art algorithms.
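The angle-based similarity at the heart of the scheme is straightforward to sketch. Here the objective vectors are taken relative to an assumed reference point at the origin; the paper's reference-point handling may differ.

```python
import math

def angle(a, b):
    """Angle between two objective vectors in radians; a smaller angle
    means more similar individuals under the proposed selection."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Three individuals in a bi-objective space (vectors measured from an
# assumed reference point at the origin):
a, b, c = (1.0, 0.1), (1.0, 0.2), (0.1, 1.0)
# a and b point in nearly the same direction -> small angle -> neighbors;
# c trades off the objectives differently -> large angle -> diverse.
```

Maximizing the angles between neighboring individuals then spreads the population across different objective trade-offs, which is how the scheme preserves diversity.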

Relevance:

20.00%

Publisher:

Abstract:

The Amazon várzeas are an important component of the Amazon biome, but anthropic and climatic impacts have been leading to forest loss and interruption of essential ecosystem functions and services. The objectives of this study were to evaluate the capability of the Landsat-based Detection of Trends in Disturbance and Recovery (LandTrendr) algorithm to characterize changes in várzea forest cover in the Lower Amazon, and to analyze the potential of spectral and temporal attributes to classify forest loss as either natural or anthropogenic. We used a time series of 37 Landsat TM and ETM+ images acquired between 1984 and 2009. We used the LandTrendr algorithm to detect forest cover change and the attributes of "start year", "magnitude" and "duration" of the changes, as well as "NDVI at the end of series". Detection was restricted to areas identified as having forest cover at the start and/or end of the time series. We used the Support Vector Machine (SVM) algorithm to classify the extracted attributes, differentiating between anthropogenic and natural forest loss. Detection reliability was consistently high for change events along the Amazon River channel, but variable for changes within the floodplain. Spectral-temporal trajectories faithfully represented the nature of changes in floodplain forest cover, corroborating field observations. We estimated anthropogenic forest losses (1,071 ha) to be larger than natural losses (884 ha), with a global classification accuracy of 94%. We conclude that the LandTrendr algorithm is a reliable tool for studies of forest dynamics throughout the floodplain.
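The change attributes fed to the classifier can be illustrated on a toy annual NDVI series. The sketch below extracts "start year", "magnitude", "duration" and "NDVI at the end of series" from the largest monotonic loss segment; it is a deliberately simplified stand-in, not the LandTrendr temporal-segmentation algorithm itself, and the threshold `drop` is an assumption.

```python
def change_attributes(years, ndvi, drop=0.15):
    """Extract LandTrendr-style attributes from an annual NDVI series:
    start year, magnitude and duration of the largest monotonic loss
    segment, plus NDVI at the end of the series (simplified sketch)."""
    best, i = None, 0
    while i < len(ndvi) - 1:
        if ndvi[i + 1] < ndvi[i]:          # a loss segment starts here
            j = i
            while j < len(ndvi) - 1 and ndvi[j + 1] < ndvi[j]:
                j += 1
            mag = ndvi[i] - ndvi[j]
            if mag >= drop and (best is None or mag > best["magnitude"]):
                best = {"start_year": years[i],
                        "magnitude": round(mag, 3),
                        "duration": years[j] - years[i]}
            i = j
        else:
            i += 1
    return best, ndvi[-1]

years = list(range(1984, 1992))
ndvi = [0.80, 0.81, 0.82, 0.45, 0.30, 0.42, 0.55, 0.60]  # loss starts 1986
event, ndvi_end = change_attributes(years, ndvi)
```

An abrupt, deep loss (large magnitude, short duration) would suggest anthropogenic clearing, while a gradual, shallow one would suggest natural change; in the study this discrimination is learned by the SVM rather than hand-coded.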

Relevance:

20.00%

Publisher:

Abstract:

Production of citric acid from crude glycerol, a by-product of the biodiesel industry, in batch cultures of Yarrowia lipolytica W29 was carried out in a lab-scale stirred tank bioreactor in order to assess the effect of the oxygen mass transfer rate on this bioprocess. An empirical correlation was proposed to describe the oxygen volumetric mass transfer coefficient (kLa) as a function of the operating conditions (stirring speed and specific air flow rate) and cellular density. kLa increased as a power function of specific power input and superficial gas velocity, and decreased slightly with cellular density. Increasing the initial kLa from 7 h-1 to 55 h-1 led to a 7.8-fold increase in the final citric acid concentration. Experiments were also performed at controlled dissolved oxygen (DO), and citric acid concentration increased with DO up to 60% of saturation. Since setting an optimal kLa is operationally simpler than controlling DO, it can be concluded that kLa is an adequate parameter for optimizing citric acid production from crude glycerol by Y. lipolytica and for consideration in bioprocess scale-up. Our empirical correlation, accounting for the operating conditions and cellular density, will be a valid tool for this purpose.
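A correlation of the form the abstract describes can be sketched as below. The functional form (power law in P/V and superficial gas velocity, mild decrease with cell density) follows the abstract; the coefficient and exponents are purely illustrative, not the fitted values.

```python
def kla(p_over_v, vs, biomass, a=1.2e-2, b=0.6, c=0.4, d=0.05):
    """Empirical kLa correlation of the form described in the abstract:
    kLa grows as a power function of specific power input P/V [W/m^3]
    and superficial gas velocity vs [m/s], and decreases slightly with
    cell density X [g/L]. Coefficients are illustrative, not fitted."""
    return a * p_over_v**b * vs**c * (1.0 - d * biomass)

# Raising stirring (P/V) and aeration (vs) raises kLa; cells lower it.
low = kla(p_over_v=100.0, vs=0.002, biomass=2.0)
high = kla(p_over_v=1000.0, vs=0.006, biomass=2.0)
```

In scale-up, one would invert such a correlation: pick the target kLa found optimal at lab scale, then solve for the stirring speed and air flow that reproduce it in the larger vessel.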

Relevance:

20.00%

Publisher:

Abstract:

Early recognition of abnormalities in the passive transfer of immunity in horses is important for satisfactory management of foals. The mare's placenta, being epitheliochorial, does not allow the passage of immunoglobulins (Igs). Colostrum intake is therefore vital, as it provides the Igs needed to reach a serum IgG concentration above 800 mg%. Partial failure is defined as IgG levels between 400 and 800 mg%, and total failure as levels below 400 mg%. Absorption of Igs is maximal up to 8 h after birth and declines to zero by 24 h post-partum. The objectives are: a) to study the kinetics of passive Ig transfer by determining serum IgG concentration in foals during the first trimester of life; b) to relate the IgG concentration of the mare's serum and colostrum to the serum IgG concentration of the foal; c) to relate, in colostrum, IgG concentration to specific gravity and to a semi-quantitative IgG determination; d) to relate, in foal serum at 18-24 h post-partum, IgG concentration to specific gravity and to a semi-quantitative IgG determination. Materials and methods: Study design: observational, descriptive cohort study. Animals: 70 mares and 70 foals of the Puro Polo breed. Colostrum: 70 samples. Sampling: Mares: one blood sample in the peripartum period and one colostrum sample post-partum, before the foal's first colostrum intake. Foals: serial blood samples at birth (pre-colostrum), at 6, 12, 18 and 24 h post-partum, and at 21, 60 and 90 days post-partum. IgG determination (serum and colostrum): a) single radial immunodiffusion, with results expressed in mg% for serial samples at preset time intervals; b) refractometry (RHC-200/ATC Arcano refractometer); c) glutaraldehyde coagulation test (Immuno-G test). Statistical analysis: comparisons of means with a paired t-test or difference-of-means test, with p < 0.05 considered significant; a principal component analysis will also be performed. The IgG concentration of the mare's serum and colostrum will be correlated with the concentration in the foal's serum. The results of this work will establish immunoglobulin values in mares and foals and their behaviour over time, and will validate the sensitivity and specificity of the diagnostic techniques used. The results will provide knowledge for rational, immunologically informed management of foals by establishing, through quantitative and semi-quantitative techniques, the serum Ig levels attained, favouring early diagnosis of immunodeficiency due to failure of antibody transfer, which would put the foal's life at risk.
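The diagnostic thresholds stated in the abstract (serum IgG above 800 mg% adequate, 400-800 mg% partial failure, below 400 mg% total failure) translate directly into a classification rule:

```python
def passive_transfer_status(igg_mg_pct):
    """Classify a foal's serum IgG at 18-24 h post-partum using the
    thresholds given in the abstract (values in mg%)."""
    if igg_mg_pct > 800:
        return "adequate transfer"
    if igg_mg_pct >= 400:
        return "partial failure"
    return "total failure"
```

Applied to the 18-24 h serial sample, such a rule flags foals needing intervention while Ig absorption through the gut is still possible.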

Relevance:

20.00%

Publisher:

Abstract:

Today's advances in computing power are driven by the parallel processing capabilities of current hardware architectures. These architectures accelerate algorithms when the algorithms are properly parallelized to exploit the specific processing power of the underlying hardware; converting an algorithm into its parallel form is, however, complex, and the resulting form is specific to each type of parallel hardware. Most current general-purpose processors are multicore parts that integrate several cores on a single chip, forming a Symmetric Multi-Processor (SMP). Today it is hard to find a desktop processor without some SMP-style parallelism, and the industry trend is to integrate ever more cores as technology matures. Graphics Processor Units (GPU), originally designed to handle only video processing, have meanwhile developed their computing power by integrating multiple processing units, to the point that current boards support on the order of 200 to 400 parallel processing threads. These processors are very fast and specialized for the task they were designed for, chiefly video processing; because that task has much in common with scientific computing, these devices have been reoriented as General Processing Graphics Processor Units (GPGPU). Unlike the SMP processors above, GPGPUs are not general-purpose: the limited memory available on each board and the kind of parallel processing required for productive use complicate their general adoption. Finally, Field Programmable Gate Arrays (FPGA) are programmable devices capable of performing large numbers of operations in parallel, with low latency and deep pipelines, so they can be used to implement specific algorithms that must run at very high speed; their drawback is the complexity of programming and testing the algorithm instantiated on the device. Given this diversity of parallel processors, our work aims to analyze the specific characteristics of each, and their impact on the structure of algorithms, so that their use yields processing performance commensurate with the resources employed and so that the different processors can be combined to complement one another beneficially. Specifically, starting from the hardware characteristics, we seek to determine the properties a parallel algorithm must have in order to be accelerated; those properties in turn determine which of these types of hardware is most suitable for its implementation. In particular, we consider the degree of data dependence, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.

Relevance:

20.00%

Publisher:

Abstract:

This thesis details the findings of a study of the transfer of 238U, 228Ra (232Th), 226Ra and 137Cs from soil to vegetation in an Atlantic blanket bog, an upland blanket bog and a semi-natural grassland situated along the north-west coast of Ireland. The results of this study provide information on the uptake of these radionuclides by the indigenous vegetation found in these ecosystems. The ecosystems chosen are internationally recognizable and provide a wide variety of vegetation species and contrasting soil physiochemical properties, which allow the influence of these parameters on radionuclide uptake to be assessed. The levels of radionuclides in the soil and vegetation were measured using gamma spectrometry, alpha spectrometry and ICP-MS. The nutrient status of the vegetation and the soil physiochemical properties were measured using atomic absorption, flame photometry and other analytical techniques. The results of the study indicate that the uptake of 238U and 228Ra (232Th) by vegetation from all three ecosystems was negligible, as the levels in all vegetation were below the limits of detection of the methods used in this study. These results appear to indicate that the vegetation studied does not possess the ability to accumulate significant levels of these radionuclides; however, this assumption cannot be upheld in the case of the Atlantic blanket bog, as the levels in the soil of this ecosystem were too low for detection. Similar results were obtained for 226Ra uptake in both the Atlantic blanket bog and the grassland for all vegetation, with the exception of H. lanatus from the grassland ecosystem. Radium-226 uptake in the upland blanket bog was higher and was detectable in the majority of vegetation indigenous to this ecosystem. Transfer factor (TF) values ranged from 0.07 to 2.35, and the TF values for E. tetralix were significantly higher than those of all other vegetation studied.
This species of heather demonstrated the ability to accumulate 226Ra to a greater extent than all other vegetation. The uptake of 226Ra by upland blanket bog vegetation appears to be significantly influenced by a range of soil physiochemical properties. The nutrient status of the vegetation, in particular its calcium content, appears to have a negative impact on the uptake of this radionuclide. Potassium-40 was detectable in all vegetation present in the three ecosystems, and the levels in the grassland soil were significantly higher than the levels in both bogland soils. Transfer factor values for Atlantic blanket bog vegetation ranged from 0.9 to 13.8 and were significantly higher in E. vaginatum in comparison to C. vulgaris. Potassium-40 TF values for upland blanket bog vegetation on average ranged from 1.4 for C. vulgaris (stems) to 5.2 for E. vaginatum and were statistically similar for all species of vegetation. Transfer factor values for grassland vegetation ranged from 0.7 to 3.8 and were also statistically similar for all species of vegetation, indicating that the transfer of 40K to vegetation within the upland bog and grassland ecosystems is not dependent on plant species. Comparisons of 40K TF values for all three ecosystems indicate that uptake by E. vaginatum from the Atlantic blanket bog was statistically higher than by all other vegetation studied. This appears to indicate that E. vaginatum has the ability to accumulate 40K; however, this species was also present in the upland blanket bog and did not demonstrate the same behaviour there. The uptake of 40K by vegetation from all three ecosystems was significantly affected by a range of soil physiochemical properties, and in some cases the results were contradictory in nature, possibly indicating that the effect of these parameters on 40K uptake is species dependent. The most obvious trend in the data was the influence of soil CEC and of magnesium levels in vegetation on 40K TF values.
A positive correlation was apparent between the CEC of the soil and 40K uptake in vegetation from both the Atlantic blanket bog and the grassland ecosystem. A similar trend was apparent between magnesium levels in vegetation and 40K TF values for the upland blanket bog and grassland vegetation. Caesium-137 levels were found to be significantly higher in the two bogland soils in comparison to the grassland soil, and levels of 137Cs decreased with increasing soil depth. Transfer factor values for Atlantic blanket bog vegetation ranged from 1.9 to 9.6, and TF values were significantly higher in the leaves of C. vulgaris in comparison to all other vegetation from this ecosystem. Caesium-137 TF values for the upland blanket bog vegetation on average ranged from 0.29 for E. tetralix to 1.6 for C. vulgaris. Uptake by the leaves of C. vulgaris was significantly higher than by all other vegetation present, thereby supporting the trend found within the Atlantic blanket bog vegetation. These results appear to indicate that the leaves of C. vulgaris have the ability to accumulate significant quantities of 137Cs, and also that 137Cs uptake by this vegetation is dependent on plant compartment, as the stems contained significantly lower levels than the leaves in both ecosystems. The uptake of 137Cs by grassland vegetation was very low and was only detectable in a fraction of the vegetation sampled. Caesium-137 TF values for grassland vegetation were in general lower than 0.02. The impact of soil physiochemical properties and of the nutrient status of vegetation on 137Cs uptake appears to be complex and in some cases contradictory. The most apparent trend in the data was the positive influence of vegetation nutrients on 137Cs uptake, in particular the magnesium levels present in the vegetation and, to a lesser extent, the calcium levels.
The results in general indicate that the uptake of 226Ra, 40K and 137Cs by the chosen vegetation is varied and complex and is significantly dependent on the species of vegetation, soil radionuclide concentration, soil physiochemical properties and the nutrient status of the vegetation.
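The transfer factor used throughout is simply the activity concentration in vegetation divided by that in the soil. The numbers below are hypothetical, chosen only to reproduce a value inside the 226Ra TF range (0.07-2.35) reported above; they are not measurements from the thesis.

```python
def transfer_factor(c_veg, c_soil):
    """Soil-to-plant transfer factor: activity concentration in the
    vegetation divided by that in the soil (same units, e.g. Bq/kg dry)."""
    return c_veg / c_soil

# Hypothetical Bq/kg values only: a strong 226Ra accumulator such as
# E. tetralix versus a weaker one growing in the same soil.
tf_e_tetralix = transfer_factor(47.0, 20.0)
tf_c_vulgaris = transfer_factor(4.0, 20.0)
```

Because the TF normalizes by the soil concentration, it lets species growing on soils with very different radionuclide inventories be compared directly, which is how the accumulator species above were identified.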

Relevance:

20.00%

Publisher:

Abstract:

Magdeburg, University, Faculty of Process and Systems Engineering (Fak. für Verfahrens- und Systemtechnik), Dissertation, 2011