53 results for Pentium


Relevance:

10.00%

Publisher:

Abstract:

Model Predictive Control (MPC) is increasingly being proposed for application to miniaturized devices, fast and/or embedded systems. A major obstacle to this is its computation time requirement. Continuing our previous studies of implementing constrained MPC on Field Programmable Gate Arrays (FPGA), this paper begins to exploit the possibilities of parallel computation, with the aim of speeding up the MPC implementation. Simulation studies on a realistic example show that it is possible to implement constrained MPC on an FPGA chip with a 25MHz clock and achieve MPC implementation rates comparable to those achievable on a Pentium 3.0 GHz PC. Copyright © 2007 International Federation of Automatic Control All Rights Reserved.

Relevance:

10.00%

Publisher:

Abstract:

In scientific computing, sparse matrix-vector multiplication (SpMV, y = Ax) is a critically important and frequently invoked computational kernel, widely used in scientific computing and practical applications such as information retrieval, meteorology, aerospace, reservoir simulation, astrophysics, and data mining. In real engineering applications the SpMV kernel is often called thousands of times, yet on modern cache-based platforms its performance is poor. If the speed of SpMV could be improved, the running efficiency of the whole computation would improve greatly and the computation time would drop substantially; optimizing SpMV is therefore a key problem of real practical importance.

The conventional SpMV implementation runs inefficiently, mainly because the ratio of floating-point operations to memory accesses is very low, and the irregular distribution of nonzeros makes the memory access pattern complex. The register-blocking algorithm, together with heuristic block-size selection, adaptively chooses the best-performing block size, partitions the sparse matrix into small dense blocks, and processes the nonzero blocks in sequence; this allows elements of the vector x to be reused from registers, reducing the number and cost of memory accesses and so improving the performance of this important kernel. On the Pentium IV, Alpha EV6, and AMD Athlon platforms we measured ten matrices under two implementations (compressed sparse row and register blocking), obtaining average speedups of 1.69, 1.90, and 1.48, respectively. We also timed repeated SpMV calls for both implementations and found that, in practical iterative solvers, the heuristic register-blocking algorithm generally needs on the order of a hundred iterations before it yields a net speedup.

The DRAM(h) model is a parallel computation model based on the memory hierarchy. It holds that an algorithm's complexity comprises both computational complexity and memory-access complexity: different implementations of the same algorithm, with nearly identical time and space complexity, can have different memory-access complexity, which leads to differences in measured performance. Analyzing and comparing the memory-access complexity of different implementations under DRAM(h) makes it possible to judge which implementation is better and thus guides the choice of the higher-performing one. No prior work had applied DRAM(h) to the memory-access complexity of SpMV, whose performance and access behavior depend on the specific sparse matrix and are known only at run time. In this paper we propose two methods for analyzing the memory-access complexity of SpMV: a template method and a dynamic statistical analysis method. On the Pentium IV and Alpha EV6 platforms we used the DRAM(h) model to analyze and compute the memory-access complexity of the two SpMV implementations (compressed sparse row and register blocking); analyzing the access complexity of all data touched during SpMV shows that memory-access behavior directly affects the overall performance of the program. We also measured SpMV performance for seven sparse matrices on the Pentium IV platform and recorded the L1, L2, and TLB miss rates of both algorithms; the experimental results agree with the model analysis.
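The two implementations compared above can be sketched as follows. This is a pure-Python illustration of the storage formats (a real kernel would be written in C or Fortran); the 2x2 block size is one of the choices a heuristic selector might pick.

```python
# Sketch of the two SpMV variants compared in the abstract: compressed
# sparse row (CSR) and a 2x2 register-blocked (BCSR) layout.

def spmv_csr(n, rowptr, colind, vals, x):
    """y = A*x with A stored in CSR format."""
    y = [0.0] * n
    for i in range(n):
        s = 0.0
        for k in range(rowptr[i], rowptr[i + 1]):
            s += vals[k] * x[colind[k]]   # one load of x per nonzero
        y[i] = s
    return y

def spmv_bcsr2x2(nb, browptr, bcolind, bvals, x):
    """y = A*x with A stored as 2x2 dense blocks (BCSR).

    Each block reuses x[j] and x[j+1] across two rows, which is what lets
    a compiled version keep them in registers instead of reloading them.
    """
    y = [0.0] * (2 * nb)
    for ib in range(nb):
        y0 = y1 = 0.0
        for k in range(browptr[ib], browptr[ib + 1]):
            j = 2 * bcolind[k]
            v = bvals[k]                   # flat block [a00, a01, a10, a11]
            x0, x1 = x[j], x[j + 1]        # loaded once, reused twice
            y0 += v[0] * x0 + v[1] * x1
            y1 += v[2] * x0 + v[3] * x1
        y[2 * ib], y[2 * ib + 1] = y0, y1
    return y
```

The trade-off the abstract measures is visible here: BCSR halves the loads of x per block at the cost of storing explicit zeros inside partially filled blocks, which is why the block size must be chosen per matrix and per platform.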

Relevance:

10.00%

Publisher:

Abstract:

We describe an interface circuit for the ISA bus of Pentium-class machines. The circuit offers high speed and high reliability, and was designed specifically for the control and data-acquisition system of an apparatus for spectral measurement and energy-level lifetime measurement.

Relevance:

10.00%

Publisher:

Abstract:

As the demands of physics experiments grow, upgrading existing experimental equipment with advanced computer technology is of real significance. We therefore designed and built a data-acquisition electronics system for spectral measurement and energy-level lifetime measurement. This thesis describes the structure of the system in full. The system is controlled by a microcomputer through an interface circuit based on the ISA bus of a Pentium machine, which drives the data-acquisition process; it makes the measurement equipment intelligent and offers high speed and high reliability. Together with the accompanying software, it allows experiments to run fully unattended. The first part of the thesis introduces the background and significance of the system's development. The second part covers the system's composition, hardware structure, and performance; its main functions are (1) acquiring and processing data, (2) controlling the motion of peripheral motors, and (3) displaying system status. The third part describes the debugging process, together with the solutions and preventive measures devised for the practical problems that arose during debugging. After the design work was completed, the system was tested in a simulated experiment in the accelerator hall of the Institute of Modern Physics in Lanzhou, where some data were acquired; processing of the acquired data gave satisfactory results. The final part of the thesis presents the experimental results obtained from actual operation and notes aspects of the system that could be improved.

Relevance:

10.00%

Publisher:

Abstract:

Server performance has become a crucial issue for improving the overall performance of the World-Wide Web. This paper describes Webmonitor, a tool for evaluating and understanding server performance, and presents new results for a realistic workload. Webmonitor measures activity and resource consumption, both within the kernel and in HTTP processes running in user space. Webmonitor is implemented using an efficient combination of sampling and event-driven techniques that exhibit low overhead. Our initial implementation is for the Apache World-Wide Web server running on the Linux operating system. We demonstrate the utility of Webmonitor by measuring and understanding the performance of a Pentium-based PC acting as a dedicated WWW server. Our workload uses a file size distribution with a heavy tail. This captures the fact that Web servers must concurrently handle some requests for large audio and video files, and a large number of requests for small documents, containing text or images. Our results show that in a Web server saturated by client requests, over 90% of the time spent handling HTTP requests is spent in the kernel. Furthermore, keeping TCP connections open, as required by TCP, causes a factor of 2-9 increase in the elapsed time required to service an HTTP request. Data gathered from Webmonitor provide insight into the causes of this performance penalty. Specifically, we observe a significant increase in resource consumption along three dimensions: the number of HTTP processes running at the same time, CPU utilization, and memory utilization. These results emphasize the important role of operating system and network protocol implementation in determining Web server performance.
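The measurement approach the abstract names — combining low-overhead sampling with event-driven instrumentation — can be sketched in miniature. The structure below is hypothetical (Webmonitor itself instruments the Linux kernel and Apache); it only illustrates the two styles and how they complement each other.

```python
# Sketch of the two measurement styles Webmonitor combines: event-driven
# timestamps around each request handler, and periodic sampling of a
# resource counter.  Hypothetical structure, not Webmonitor's API.

import time

class Monitor:
    def __init__(self):
        self.events = []    # (name, elapsed_seconds), one per handled request
        self.samples = []   # periodic snapshots of some resource counter

    def timed(self, name, fn):
        """Event-driven: wrap a handler so every call records its duration."""
        def wrapper(*args):
            t0 = time.perf_counter()
            result = fn(*args)
            self.events.append((name, time.perf_counter() - t0))
            return result
        return wrapper

    def sample(self, counter):
        """Sampling: take a cheap snapshot of a counter (e.g. live processes).

        In a real monitor this would run on a timer, trading accuracy for
        overhead, while the event log captures exact per-request costs.
        """
        self.samples.append(counter())

monitor = Monitor()
handler = monitor.timed("GET /", lambda doc: doc.upper())  # toy "request"
handler("index.html")
monitor.sample(lambda: len(monitor.events))
```

Events give exact per-request accounting (the source of the "90% of time in the kernel" figure), while samples bound the cost of tracking aggregate state such as concurrent HTTP processes, CPU, and memory.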

Relevance:

10.00%

Publisher:

Abstract:

Tony Mann provides a review of the book: Simon Biggs, Book of Shadows, Ellipsis (Electric Art Series: 1), 64pp. with CD-Rom, 1996, ISBN 1-899858-156. £15. [Needs Multimedia PC (Windows, 486 or Pentium processor), or Macintosh (68040 or PowerPC)]

Relevance:

10.00%

Publisher:

Abstract:

Parallel processing techniques have been used in the past to provide high-performance computing resources for activities such as fire-field modelling. This has traditionally been achieved using specialized hardware and software, the expense of which would be difficult to justify for many fire engineering practices. In this article we demonstrate how typical office-based PCs attached to a Local Area Network have the potential to offer the benefits of parallel processing with minimal costs associated with the purchase of additional hardware or software. It was found that good speedups could be achieved on homogeneous networks of PCs: for example, a problem composed of ~100,000 cells ran 9.3 times faster on a network of 12 800 MHz PCs than on a single 800 MHz PC, and a network of eight 3.2 GHz Pentium 4 PCs ran 7.04 times faster than a single 3.2 GHz Pentium computer. A dynamic load balancing scheme was also devised to allow the effective use of the software on heterogeneous PC networks. This scheme also ensured that the impact of the parallel processing task on other users of the network was minimized.
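The quoted speedups translate directly into parallel efficiency (speedup divided by processor count), and a load balancer on a heterogeneous network must split the cell count in proportion to each host's measured throughput. The sketch below shows both calculations; the proportional split is an illustrative rule, not the article's actual balancing scheme.

```python
# The abstract's figures expressed as parallel efficiency, plus a simple
# proportional work split of the kind a dynamic load balancer converges
# to on a heterogeneous network.  The balancing rule is illustrative only.

def efficiency(speedup, n_procs):
    """Parallel efficiency: fraction of ideal linear speedup achieved."""
    return speedup / n_procs

def split_cells(total_cells, cell_rates):
    """Assign cells to hosts in proportion to their measured cells/s rate."""
    total_rate = sum(cell_rates)
    shares = [int(total_cells * r / total_rate) for r in cell_rates]
    shares[0] += total_cells - sum(shares)   # rounding remainder to host 0
    return shares

eff_12 = efficiency(9.3, 12)   # 12 x 800 MHz network: ~78% efficiency
eff_8 = efficiency(7.04, 8)    # 8 x 3.2 GHz Pentium 4 network: 88% efficiency
```

The gap between these figures and 100% is the communication and synchronization overhead that grows with the number of hosts on a shared LAN.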

Relevance:

10.00%

Publisher:

Abstract:

A FORTRAN 90 program is presented which calculates the total cross sections, and the electron energy spectra of the singly and doubly differential cross sections, for single target ionization of neutral atoms ranging from hydrogen up to and including argon. The code is applicable to both high- and low-Z projectile impact in fast ion-atom collisions. The theoretical models provided for the program user are based on two quantum mechanical approximations which have proved very successful in the study of ionization in ion-atom collisions: the continuum-distorted-wave (CDW) and continuum-distorted-wave eikonal-initial-state (CDW-EIS) approximations. The code presented here extends previously published codes for single ionization of target hydrogen [Crothers and McCartney, Comput. Phys. Commun. 72 (1992) 288], target helium [Nesbitt, O'Rourke and Crothers, Comput. Phys. Commun. 114 (1998) 385] and target atoms ranging from lithium to neon [O'Rourke, McSherry and Crothers, Comput. Phys. Commun. 131 (2000) 129]. Cross sections for all of these target atoms may be obtained as limiting cases from the present code.

Title of program: ARGON
Catalogue identifier: ADSE
Program summary URL: http://cpc.cs.qub.ac.uk/cpc/summaries/ADSE
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Computers for which the program is designed and others on which it is operable: four by 200 MHz Pentium Pro Linux server; DEC Alpha 21164; four by 400 MHz Pentium II Xeon 450 Linux server; IBM SP2; SUN Enterprise 3500
Installations: Queen's University, Belfast
Operating systems under which the program has been tested: Red Hat Linux 5.2, Digital UNIX Version 4.0d, AIX, Solaris SunOS 5.7
Compilers: PGI workstations, DEC CAMPUS
Programming language used: FORTRAN 90 with MPI directives
No. of bits in a word: 64, except on Linux servers 32
Number of processors used: any number
Has the code been vectorized or parallelized? Parallelized using MPI
No. of bytes in distributed program, including test data, etc.: 32 189
Distribution format: tar gzip file
Keywords: single ionization, cross sections, continuum-distorted-wave model, continuum-distorted-wave eikonal-initial-state model, target atoms, wave treatment
Nature of physical problem: the code calculates total and differential cross sections for the single ionization of target atoms ranging from hydrogen up to and including argon by both light and heavy ion impact.
Method of solution: ARGON allows the user to calculate the cross sections using either the CDW or CDW-EIS [J. Phys. B 16 (1983) 3229] models within the wave treatment.
Restrictions on the complexity of the program: both the CDW and CDW-EIS models are two-state perturbative approximations.
Typical running time: times vary according to input data and number of processors. For one processor, the test input data for double differential cross sections (40 points) took less than one second, whereas the test input for total cross sections (20 points) took 32 minutes.
Unusual features of the program: none
(C) 2003 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

This thesis on Thermal Flow Drilling and Flowtap in thin metal sheets and pipes of copper and copper alloys had the following objectives: to understand the behaviour of copper and copper-alloy sheet metal during Thermal Flow Drilling with standard tools; to find the machine speed and feed settings that give the best bushing quality; to find the best speed for the form-tapping process; and to find the best bushing length in pure copper pipes for a solar water heat exchanger. Thermal Flow Drilling (TFD) and Form Tapping (FT) is one of the research lines of the Institute of Production and Logistics (IPL) at the University of Kassel. In December 1995, a working meeting of members of the IPL, Santa Catarina University (Brazil), Buenos Aires University (Argentina) and Tarapacá University (UTA, Chile) with the CEO of Flowdrill B.V. was held in Brazil. The group decided that the Manufacturing Laboratory (ML) of UTA would work with pure copper and brass-alloy sheet metal and pure copper pipes in order to develop a solar water heat exchanger. Flowdrill B.V. sent tools to Tarapacá University in 1996. In 1999, the IPL and the ML carried out an ALECHILE research project, promoted by the DAAD and CONICyT, on copper sheet metal, copper pipes and a-brass alloy sheet metal. The standard tool is a lobed, conical tungsten-carbide tool: when rotated at high speed and pressed with high axial force into sheet metal or thin-walled tube, the heat generated softens the metal and allows the drill to feed forward, producing a hole and simultaneously forming a bushing from the displaced material. Many tool variants exist on the market, but this thesis uses only short and long standard TFD tools. To meet the objectives, four quality classes of the frayed-end bushing were taken as references, of which the best is quality class I. Pure copper and a-brass alloy sheet metals of different thicknesses were used, with different TFD drill diameters for four thread sizes, from M5 to M10.

As in earlier studies on aluminium sheet metal, a predrilling process with HSS drills of around 30% of the TFD diameter (1.5-3.0 mm) was used. In the next step, only 2.0 mm thick sheet metal was used, with a 9.2 mm TFD drill for the M10 thread. For commercial pure copper pipes, ¾" diameter pipe was used with a 12.8 mm (3/8") TFD drill for holes for 3/8" pipes, and various standard HSS drills for predrilling. The chemical composition of the sheet metal was taken as the reference for material behaviour: the Chilean pure copper has 99.35% Cu and 0.163% zinc, and the Chilean a-brass alloy has 75.6% Cu and 24.0% zinc. Two German a-brass alloys were used: No. 1 has 61.6% Cu, 36.03% zinc and 2.2% Pb; No. 2 has 63.1% Cu, 36.7% zinc and no Pb. The equipment used comprised a HAAS CNC milling centre, a Kistler dynamometer, a Pentium II PC with an acquisition card and the TESTPOINT and XAct software, a 3D measuring machine, a microhardness tester, a universal testing machine and a metallographic microscope. During the tests, the feed-force and torque curves were recorded, showing the material behaviour during the TFD process; in general, three phases can be distinguished. The best machining data were obtained for the various thicknesses of the Chilean copper and a-brass alloy sheets with quality class I bushings. In the case of the a-brass alloys, the chemical composition and the TFD process temperature have a large influence: the temperature reaches about 400 °C during the process, and when the a-brass alloy contains only zinc the bushing reaches quality class I, but when it contains some percentage of lead, with its much lower melting point, no bushing can be formed, because the lead gasifies and the metallographic structure breaks. During the TFD process, recrystallized structures form around the copper and a-brass bushing, which increases the hardness in these zones.

When threads were produced by form tapping with Flowtap tools, this added hardness gave the threads a high load limit when tested in a special support developed for the purpose. To eliminate the predrilling step with standard HSS drills, a compound tool was developed; with this new tool, the best machining data for quality class I bushings could be obtained. For the copper pipes, bushings made without predrilling reached only quality class IV, whereas with predrilling, quality class I bushings were obtained. Different HSS drill diameters then gave bushings of different lengths, which were soldered with four types of solder between the 3/8" pipes and the larger ¾" pipe. These soldered joints were tested in tension: all the 3/8" pipes broke, while the soldered zones showed no failure. Finally, several solar water heat exchangers were built and tested. In conclusion, this thesis shows that Thermal Flow Drilling in thin copper and copper-alloy sheet needs a predrilling process to obtain frayed-end quality class I bushings, as with thin aluminium sheet. The compound tool developed can produce quality class I bushings and eliminates the predrilling step. The recrystallization of the bushing, a product of the friction between tool and material, increases hardness, which is advantageous for form tapping. The methodology developed for commercial copper pipes makes it possible to build solar water heat exchangers.

Relevance:

10.00%

Publisher:

Abstract:

Title on the CD-ROM sleeve: Comunicación. Also listed: Educación Secundaria para Personas Adultas: Distancia; Educación de Personas Adultas: Aprendizaje permanente. The minimum recommended requirements for reading the CD-ROM are: a Pentium II computer with 32 MB of RAM; a CD-ROM drive; a sound card and speakers (not essential); a video card, preferably at a resolution of 1024x768 pixels; Windows 98 or later; and a web browser (preferably Internet Explorer 5.5 or later).

Relevance:

10.00%

Publisher:

Abstract:

Objectives: 1. To carry out an exhaustive study of Discriminant Analysis in order to evaluate its robustness and make the appropriate recommendations to the applied psychologist; 2. To determine statistical criteria that support heuristic interpretation of the most relevant coefficients, for evaluating the contribution of each variable to the discriminant functions. First study: a 4x2x3x2x2 factorial design was used, giving 96 experimental conditions. The five factors were: (a) normality of the variables, (b) group variances, (c) number of variables, (d) number of groups, (e) number of subjects per group. Dependent variable: for each of the 200 Monte Carlo replications, Wilks' lambda, Bartlett's V and its associated probability were obtained as indices of the significance of the discriminant criterion. Second study: the design of the first study was replicated — the 96 experimental conditions with all factors — now imposing a given profile of group differences for the three-group and the six-group conditions. The correlations between variables were held constant and equal to those of the first study, 0.70. Parameter values were obtained with the DISCRIMINANT procedure of SPSS/PC+. Hardware: the simulation work was carried out on eight clone PCs, Pentium/100 MHz with 16 MB of RAM. Software: the procedures required for the study were written in the GAUSS 386i programming language, version 3.1 (Aptech Systems, 1994). Conclusions: 1. Simulation methods, and specifically bootstrap sampling, are very useful for robustness studies of statistical techniques, as well as for statistical inference, e.g. computing confidence intervals; 2. Discriminant Analysis is a robust technique provided that the homogeneity-of-variances condition holds; 3. Discriminant Analysis is not robust to heterogeneity under the following conditions: with six or fewer variables, when the group sizes differ, and, for equal group sizes, when skewness and kurtosis are jointly altered; 4. When the violation of the homogeneity assumption arises because the largest variance belongs to the group with the fewest subjects, the technique becomes too liberal, i.e. the type I error rate is inflated; 5. Structure coefficients are more stable and less biased than standardized ones; 6. Confidence intervals for the structure coefficients can be determined with the procedure suggested by Dalgleish (1994). Recommendations: 1. Discriminant Analysis may be used whenever the homogeneity-of-variances condition holds; it is therefore essential to check this assumption before running a Discriminant Analysis, using any of the appropriate statistics, in particular Box's test; 2. Under heterogeneity of variances, if the number of independent variables is six or fewer, the group sizes must be equal and the variables must not show joint alterations of skewness and kurtosis, so the distribution of the variables should first be checked to detect such alterations; under any other condition, the technique cannot be used in the presence of heterogeneous variances. When there are nine or more predictor variables, the technique can always be used, except with unequal group sizes and non-normal variables. The applied researcher should be aware of the statistical support we propose for decision making.
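The first conclusion recommends bootstrap resampling for confidence intervals. A minimal percentile-bootstrap sketch is shown below; it is a stdlib-only illustration of the general technique, not the GAUSS code used in the study, and the sample data are invented.

```python
# Minimal percentile-bootstrap confidence interval for a sample statistic.
# Illustrative stdlib-only sketch of the resampling technique the study
# recommends; not the study's own GAUSS implementation.

import random

def bootstrap_ci(sample, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap CI: resample with replacement, take quantiles."""
    rng = random.Random(seed)
    n = len(sample)
    boots = sorted(
        stat([sample[rng.randrange(n)] for _ in range(n)])  # one resample
        for _ in range(n_boot)
    )
    lo = boots[int((alpha / 2) * n_boot)]          # 2.5th percentile
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]  # 97.5th percentile
    return lo, hi

data = [4.1, 5.0, 4.7, 5.3, 4.9, 5.1, 4.6, 5.2]   # invented sample
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(data, mean)                  # 95% CI for the mean
```

The same pattern applies to the structure coefficients discussed in conclusion 6: `stat` would then refit the discriminant function on each resample and return the coefficient of interest.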

Relevance:

10.00%

Publisher:

Abstract:

The final objective of this innovation is to produce a document that can help the IT administrators of schools adapt their local networks to offer their users a new range of services, more in line with modern intranet and extranet philosophies. The work has two main strands: a first part documenting different ways of making use of the schools' older machines (Pentium, 486 or even 386); and a second part developing the process for creating, administering and maintaining an intranet/extranet in a school.

Relevance:

10.00%

Publisher:

Abstract:

Attached to the report are the minutes of the meetings, the list of expenses, the original invoices, the final account and the certificate of expenses. Abstract taken from the authors.

Relevance:

10.00%

Publisher:

Abstract:

The activity took place in the Computing Department of the Instituto de Enseñanza Secundaria Ribera de Castilla. The practical work was carried out in the computing classrooms; the objectives were to install web, ftp, news and chat servers and to control Internet access. The working method was to distribute tasks and then pool the results in the group. A programme was drawn up with the points to cover; tasks were assigned, information and resources were gathered, results were shared in the group, and the tasks were implemented and put into operation. The results were the creation of a network segmented into five subnets, with several servers (web, ftp, news, chat). The materials were: hardware — Pentium II and AMD PCs with 32 or 64 MB of RAM, a 500 MHz PC server, a dual board with 194 MB, and an ADSL line; software — various operating systems (Windows 98 SE, Windows 2000, Linux) and other applications (Sntpbeamer, Outlook Express).