1000 results for CUDA technology


Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Materials Science and Technology - FC

Relevance:

60.00%

Publisher:

Abstract:

New trends in sharing multimedia files over open networks demand better encryption techniques that guarantee integrity, availability and confidentiality while maintaining or improving the efficiency of the encryption process. Today it is common to transfer images through technological media, which makes it necessary to update existing encryption techniques and, better still, to search for new alternatives. Classic cryptographic algorithms are widely known within the information society, which increases their vulnerability, and they also suffer from high processing times; together these factors raise the probability of the cipher being broken and reduce the immediate availability of resources. To lower these odds, chaos theory emerges as a good option: an algorithm can take advantage of the chaotic behavior of dynamical systems and exploit the properties of logistic maps to raise the robustness of the cipher. This work therefore proposes a cryptographic system built on an architecture divided into two stages, confusion and diffusion. Each stage uses a logistic equation to generate pseudorandom numbers that scramble pixel positions and change grayscale intensities, and the iterative process is governed by the total number of pixels in the image. Finally, the entire encryption logic is executed on CUDA technology, which enables parallel processing. As a substantial contribution, the work proposes a new encryption technique with high sensitivity to external noise that preserves not only the confidentiality of the image but also its availability and the efficiency of processing times.
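A minimal sketch of how such a scheme maps onto CUDA, assuming nothing about the authors' actual code: the logistic map x_{n+1} = r·x_n·(1 − x_n) is iterated on the host, one sequence yielding a pixel permutation (confusion) and a second one a byte keystream (diffusion), and a kernel then applies both to all pixels in parallel. All names (logisticSequence, confuseDiffuse) and key values are illustrative.

    // Illustrative sketch, not the authors' implementation.
    #include <cuda_runtime.h>
    #include <algorithm>
    #include <cstdint>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    // Logistic map x_{n+1} = r * x_n * (1 - x_n); chaotic for r near 4.
    static std::vector<double> logisticSequence(double x0, double r, size_t n) {
        std::vector<double> s(n);
        double x = x0;
        for (size_t i = 0; i < n; ++i) { x = r * x * (1.0 - x); s[i] = x; }
        return s;
    }

    // Confusion + diffusion in one parallel pass: each thread fetches the
    // pixel the permutation assigns to its output slot and XORs it with
    // the keystream byte for that slot.
    __global__ void confuseDiffuse(const uint8_t* in, uint8_t* out,
                                   const uint32_t* perm, const uint8_t* ks,
                                   int nPixels) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < nPixels)
            out[i] = in[perm[i]] ^ ks[i];   // relocate pixel, mask intensity
    }

    int main() {
        const int n = 512 * 512;              // grayscale image, 1 byte/pixel
        std::vector<uint8_t> img(n, 128);     // dummy input image

        // Two logistic maps keyed by (x0, r); illustrative key values.
        std::vector<double> s1 = logisticSequence(0.3141, 3.9999, n);
        std::vector<double> s2 = logisticSequence(0.2718, 3.9876, n);

        // Confusion: permutation obtained by argsorting the first sequence.
        std::vector<uint32_t> perm(n);
        std::iota(perm.begin(), perm.end(), 0u);
        std::sort(perm.begin(), perm.end(),
                  [&](uint32_t a, uint32_t b) { return s1[a] < s1[b]; });

        // Diffusion: keystream quantized from the second sequence.
        std::vector<uint8_t> ks(n);
        for (int i = 0; i < n; ++i) ks[i] = (uint8_t)(s2[i] * 256.0);

        uint8_t *dIn, *dOut, *dKs; uint32_t *dPerm;
        cudaMalloc(&dIn, n); cudaMalloc(&dOut, n); cudaMalloc(&dKs, n);
        cudaMalloc(&dPerm, n * sizeof(uint32_t));
        cudaMemcpy(dIn, img.data(), n, cudaMemcpyHostToDevice);
        cudaMemcpy(dKs, ks.data(), n, cudaMemcpyHostToDevice);
        cudaMemcpy(dPerm, perm.data(), n * sizeof(uint32_t),
                   cudaMemcpyHostToDevice);

        confuseDiffuse<<<(n + 255) / 256, 256>>>(dIn, dOut, dPerm, dKs, n);
        cudaDeviceSynchronize();

        std::vector<uint8_t> enc(n);
        cudaMemcpy(enc.data(), dOut, n, cudaMemcpyDeviceToHost);
        printf("first encrypted byte: %d\n", (int)enc[0]);
        cudaFree(dIn); cudaFree(dOut); cudaFree(dKs); cudaFree(dPerm);
        return 0;
    }

Generating the chaotic sequences on the host reflects the fact that the logistic recurrence is inherently sequential; the per-pixel relocation and masking, which dominate the work, are what the GPU parallelizes.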

Relevance:

60.00%

Publisher:

Abstract:

The Muskingum-Cunge method, with more than 45 years of history, is still one of the most widely used procedures for computing flood routing along a channel. Once calibrated, it yields precise results and is much faster than methods that solve the full hydraulic equations. For this reason, this research analyzed its accuracy by comparing it against a two-dimensional hydraulic model; in parallel, its limitations were examined and a practical application methodology was tested. With this motivation, more than 200 routing simulations were carried out in prismatic and natural channels. The calculations were performed with HEC-HMS, using the Muskingum-Cunge 8-point cross-section method, and with the two-dimensional hydraulic tool InfoWorks ICM. HEC-HMS was chosen for its widespread use and InfoWorks ICM for its computation speed, since it exploits CUDA technology (Compute Unified Device Architecture). The two-dimensional hydraulic model was first validated against the one-dimensional formulation in uniform and varied flow, and against analytical unsteady-flow formulae, with very satisfactory results. A mesh-sensitivity analysis of the two-dimensional routing model was also conducted, producing charts of recommended 2D element sizes that quantify the committed error. Using dimensional analysis, a correlation between the results of the two methods was established, weighing their accuracy and defining validity intervals for the best use of the Muskingum-Cunge method. Simultaneously, a methodology was developed to obtain the representative average 8-point cross-section for a routing calculation from a series of simplified two-dimensional simulations, with the aim of easing the use and correct definition of hydrological models.
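For reference, the routing scheme behind the method is the standard Muskingum recurrence, whose coefficients the Cunge variant derives from hydraulic quantities instead of calibration (a textbook formulation, given here for orientation rather than as the study's exact notation):

    O_{t+\Delta t} = C_0 I_{t+\Delta t} + C_1 I_t + C_2 O_t, \qquad C_0 + C_1 + C_2 = 1

    C_0 = \frac{\Delta t - 2KX}{2K(1-X) + \Delta t}, \quad
    C_1 = \frac{\Delta t + 2KX}{2K(1-X) + \Delta t}, \quad
    C_2 = \frac{2K(1-X) - \Delta t}{2K(1-X) + \Delta t}

    K = \frac{\Delta x}{c}, \qquad X = \frac{1}{2}\left(1 - \frac{Q}{B S_0 c \Delta x}\right)

where I and O are inflow and outflow, K and X the storage and weighting parameters, c the kinematic wave celerity, B the top width, S_0 the bed slope, \Delta x the reach length and \Delta t the time step.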

Relevance:

60.00%

Publisher:

Abstract:

Tool path generation is one of the most complex problems in Computer Aided Manufacturing. Although some efficient strategies have been developed, most of them are only useful for standard machining, and the algorithms used for tool path computation demand high computational performance, which makes their implementation on many existing systems very slow or even impractical. Hardware acceleration is an incremental solution that can be cleanly added to these systems while keeping everything else intact: it is completely transparent to the user, its cost is much lower, and the development time is much shorter than replacing the computers with faster ones. This paper presents an optimisation that exploits the power of multi-core Graphics Processing Units (GPUs) to accelerate tool path computation. The improvement is applied to a highly accurate and robust tool path generation algorithm. As a case study, the paper presents a fully implemented algorithm used for turning-lathe machining of shoe lasts, and a comparative study shows the gain achieved in total computing time: execution is almost two orders of magnitude faster than on modern PCs.
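A minimal sketch of the kind of data-parallel step such a GPU approach relies on (illustrative only; the paper's actual algorithm and kernels are not reproduced here): one thread per sampled surface point computes a tool-centre position by offsetting the point along its unit surface normal by the tool radius.

    // Illustrative sketch, not the paper's implementation.
    #include <cuda_runtime.h>
    #include <cstdio>

    struct P3 { float x, y, z; };

    // One thread per surface sample: displace the contact point along its
    // (unit) normal so the tool centre clears the surface by one radius.
    __global__ void offsetSurface(const P3* pts, const P3* normals,
                                  P3* toolPath, int n, float toolRadius) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        toolPath[i].x = pts[i].x + toolRadius * normals[i].x;
        toolPath[i].y = pts[i].y + toolRadius * normals[i].y;
        toolPath[i].z = pts[i].z + toolRadius * normals[i].z;
    }

    int main() {
        const int n = 1 << 20;                // one million surface samples
        P3 *pts, *normals, *toolPath;         // unified memory for brevity
        cudaMallocManaged(&pts, n * sizeof(P3));
        cudaMallocManaged(&normals, n * sizeof(P3));
        cudaMallocManaged(&toolPath, n * sizeof(P3));
        for (int i = 0; i < n; ++i) {         // dummy flat patch, +z normals
            pts[i] = { (float)(i % 1024), (float)(i / 1024), 0.0f };
            normals[i] = { 0.0f, 0.0f, 1.0f };
        }
        offsetSurface<<<(n + 255) / 256, 256>>>(pts, normals, toolPath, n, 3.0f);
        cudaDeviceSynchronize();
        printf("tool z at first point: %.2f\n", toolPath[0].z);
        cudaFree(pts); cudaFree(normals); cudaFree(toolPath);
        return 0;
    }

Because every sample is independent, the step is embarrassingly parallel, which is exactly the property that lets GPU acceleration pay off by orders of magnitude over a sequential CPU loop.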

Relevance:

60.00%

Publisher:

Abstract:

AMS Subj. Classification: 49J15, 49M15

Relevance:

20.00%

Publisher:

Abstract:

There are several tools in the literature that support innovation in organizations. Some of the most cited are the so-called technology roadmapping methods, also known as TRM. However, these methods are designed primarily for organizations that adopt the market pull strategy of technology-product integration; organizations that adopt the technology push integration strategy are neglected in the literature. Furthermore, with the advent of open innovation, there is a clear need to consider partnerships in the innovation process. Thus, this study proposes a technology roadmapping method, identified as the method for technology push (MTP), applicable to organizations that adopt the technology push integration strategy, such as SMEs and independent research centers in an open-innovation environment. The method was developed through action-research and was assessed from two analytical standpoints: externally, via a specific literature review on its theoretical contributions, and internally, through the analysis of potential users' perceptions of the feasibility of applying MTP. The results indicate both the unique character of the method and its perceived implementation feasibility. Future research is suggested in order to validate the method in different types of organizations. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Many authors point out that the front-end of new product development (NPD) is a critical success factor in the NPD process and that numerous companies face difficulties in carrying it out appropriately. Therefore, it is important to develop new theories and proposals that support the effective implementation of this earliest phase of NPD. This paper presents a new method to support the development of front-end activities based on integrating technology roadmapping (TRM) and project portfolio management (PPM). This new method, called the ITP Method, was implemented at a small Brazilian high-tech company in the nanotechnology industry to explore the integration proposal. The case study demonstrated that the ITP Method provides a systematic procedure for the fuzzy front-end and integrates innovation perspectives into a single roadmap, which allows for a better alignment of business efforts and communication of product innovation goals. Furthermore, the results indicated that the method may also improve quality, functional integration and strategy alignment. (C) 2010 Elsevier Inc. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a proposal for a Quality Management System for a generic GNSS surveying company as an alternative for management and service quality improvement. As a result of the increased demand for GNSS measurements, a large number of new or restructured companies were established to operate in this market. Considering that GNSS surveying is a new process, some changes must be made to adapt the old surveying techniques and old-fashioned management to the new reality; this requires a new management model based on a well-described sequence of procedures aiming at Total Quality Management for the company. The proposed Quality Management System was based on the requirements of the ISO 9000:2000 quality system, applied to the whole company, focusing on the productive process of GNSS surveying work.

Relevance:

20.00%

Publisher:

Abstract:

Considering the increasing popularity of network-based control systems and the widespread adoption of IP networks (such as the Internet), this paper studies the influence of network quality of service (QoS) parameters on quality of control parameters. An example control loop is implemented using two LonWorks networks (CEA-709.1) interconnected by an emulated IP network, in which important QoS parameters such as delay and delay jitter can be completely controlled. Mathematical definitions are provided according to the literature, and the results of the network-based control loop experiment are presented and discussed.
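For orientation, the two QoS parameters named above are commonly defined as follows (the jitter estimator shown is the RFC 3550 one; this is a standard formulation, not necessarily the exact definitions adopted in the paper):

    D_i = r_i - s_i

    J_i = J_{i-1} + \frac{\lvert D_i - D_{i-1} \rvert - J_{i-1}}{16}

where s_i and r_i are the send and receive timestamps of packet i, D_i is its one-way delay, and J_i is the running interarrival-jitter estimate, a smoothed mean deviation of consecutive delay differences.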

Relevance:

20.00%

Publisher:

Abstract:

This work presents a case study on technology assessment for power quality improvement devices. A system compatibility test protocol for power quality mitigation devices was developed in order to evaluate the functionality of three-phase voltage restoration devices. To validate this test protocol, the micro-DVR, a reduced-power development platform for DVR (dynamic voltage restorer) devices, was tested, and the results are discussed in light of voltage disturbance standards. (C) 2011 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

A green ceramic tape micro-heat exchanger was developed using Low Temperature Co-fired Ceramics (LTCC) technology. The device was designed using Computer-Aided Design software, and simulations were run with a Computational Fluid Dynamics package (COMSOL Multiphysics) to evaluate the homogeneity of fluid distribution in the microchannels. Four geometries were proposed and simulated in two and three dimensions, showing that geometric details directly affect the velocity distribution in the micro-heat exchanger channels. The simulation results proved quite useful for the design of the microfluidic device. The micro-heat exchanger was then built with the LTCC technology; it is composed of five thermal exchange plates in a cross-flow arrangement and two connecting plates, all stacked to form a device with external dimensions of 26 × 26 × 6 mm³.

Relevance:

20.00%

Publisher:

Abstract:

The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment and relatively low complexity explain the good results in terms of efficiency, and the evolution of GPS and inertial measurement technologies, together with the lower assessment costs observed when these technologies are applied to large-scale studies, explains the increasing dissemination of ALS technologies. The good quality of the results can be expressed by estimates of volume and basal area with estimated errors below 8.4%, depending on the size of the sampled area, the number of laser pulses per square meter and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp with precision equal or superior to conventional methods. The statistics of interest were: volume, basal area, mean height and mean height of dominant trees. The ALS flight for data assessment covered two strips of approximately 2 × 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m; plots were sampled in different parts of the strips to cover different stand ages. From the clouds of points generated by the ALS assessment the following statistics were calculated: overall mean height, standard error, five percentiles (the heights below which 10%, 30%, 50%, 70% and 90% of the above-ground ALS points lie), and the density of points above ground level in each percentile. These ALS statistics were used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area and volume, with conventional forest inventory sample plots providing the ground-truth data. For volume, an exploratory assessment involving different combinations of ALS statistics allowed the definition of the most promising relationships and fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30% percentile to estimate mean diameter (R² = 0.88 and MQE% = 0.0004); the 10% and 90% percentiles to estimate mean height (R² = 0.94 and MQE% = 0.0003); the 90% percentile to estimate dominant height (R² = 0.96 and MQE% = 0.0003); the 10% percentile and the mean height of ALS points to estimate basal area (R² = 0.92 and MQE% = 0.0016); and age plus the 30% and 90% percentiles to estimate volume (R² = 0.95 and MQE% = 0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher model using age and the 90% percentile, the modified Clutter model using age, the mean height of ALS points and the 70% percentile, and the modified Buckman model using age, the mean height of ALS points and the 10% percentile.
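For orientation, the height percentiles are defined on the distribution of above-ground return heights, and the classical Schumacher form illustrates how age and a percentile can enter a volume model (the exact "modified" model forms are the authors'; the covariate placement below is an illustrative assumption):

    P_p = \min\{\, h : \hat{F}(h) \ge p/100 \,\}

    \ln V = \beta_0 + \beta_1 \frac{1}{A} + \beta_2 \ln P_{90}

where \hat{F} is the empirical cumulative distribution of above-ground ALS return heights, V is stand volume, A is stand age and P_{90} is the 90% height percentile.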

Relevance:

20.00%

Publisher: