940 results for Computer Organization
Abstract:
Advances in computer memory technology justify research into new and different views of computer organization. This paper proposes a novel memory-centric computing architecture that merges memory and processing elements in order to provide better conditions for parallelization and performance. The paper introduces the architectural concepts and then presents the design and implementation of a corresponding assembler and simulator.
Abstract:
This paper describes a PC-based mainframe computer emulator called VisibleZ and its use in teaching mainframe Computer Organization and Assembly Programming classes. VisibleZ models IBM's z/Architecture and allows direct interpretation of mainframe assembly language object code in a graphical user interface environment developed in Java. The VisibleZ emulator acts as an interactive visualization tool that simulates enterprise computer architecture. The provided architectural components include main storage, the CPU, registers, the Program Status Word (PSW), and I/O channels. Particular attention is given to providing visual cues to the user by color-coding screen components, machine instruction execution, and animation of the machine architecture components. Students interact with VisibleZ by executing machine instructions in a step-by-step mode, simultaneously observing the contents of memory, registers, and changes in the PSW during the fetch-decode-execute machine instruction cycle. The object-oriented design and implementation of VisibleZ allows students to develop their own instruction semantics by coding Java for specific existing z/Architecture machine instructions, or to design and implement new machine instructions. The use of VisibleZ in lectures, labs, and assignments is described in the paper and supported by a website that hosts an extensive collection of related materials. VisibleZ has proven to be a useful tool in mainframe Assembly Language Programming and Computer Organization classes. Using VisibleZ, students develop a better understanding of mainframe concepts and components and of how the mainframe computer works. ACM Computing Classification System (1998): C.0, K.3.2.
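The pluggable, object-oriented instruction semantics described above can be illustrated with a minimal sketch. VisibleZ itself is implemented in Java; the Python sketch below only mirrors the general pattern of a machine-state object plus one semantics class per instruction, and every class, field, and method name here is hypothetical rather than VisibleZ's actual API.

```python
class Machine:
    """Toy machine state: simplified stand-ins for storage, registers, PSW."""
    def __init__(self):
        self.memory = bytearray(4096)   # main storage
        self.regs = [0] * 16            # general-purpose registers
        self.psw = {"cc": 0, "ia": 0}   # condition code, instruction address

class LoadRegister:
    """One pluggable semantics class per instruction: an LR-style
    register-to-register copy, in the spirit of the pattern above."""
    def execute(self, machine, r1, r2):
        machine.regs[r1] = machine.regs[r2]
        machine.psw["ia"] += 2          # RR-format instructions are 2 bytes

m = Machine()
m.regs[2] = 42
LoadRegister().execute(m, 1, 2)
print(m.regs[1], m.psw["ia"])           # 42 2
```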
Abstract:
This research project seeks to use a computational system based on agent-based modeling to measure the brand perception of an organization across a heterogeneous population. It is expected to provide information that enables solutions for an organization regarding the behavior of its consumers and the associated brand perception. The purpose of this system is to model the perception-reasoning-action process, simulating reasoning as the result of an accumulation of perceptions that culminate in consumer actions. This outcome defines the consumer's acceptance or rejection of the company's brand. A data collection process was carried out for a specific organization in the marketing field. After compiling and processing the information obtained from the company, the brand perception analysis was applied through simulation processes. The results of the experiment were delivered to the organization as a report with marketing-level conclusions and recommendations for improving consumers' brand perception.
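As a rough illustration of the perception-reasoning-action loop described above, the following minimal agent-based sketch accumulates noisy brand perceptions per agent until an acceptance or rejection threshold is crossed. The signal model, thresholds, and heterogeneity term are illustrative assumptions, not the project's actual model.

```python
import random

class Consumer:
    """Agent that accumulates brand perceptions and acts on a threshold.
    Thresholds, signal model, and bias term are illustrative assumptions."""
    def __init__(self):
        self.bias = random.gauss(0.0, 0.5)   # heterogeneity across agents
        self.perception = 0.0
        self.decision = None                 # "accept", "reject", or None

    def perceive_reason_act(self, signal):
        self.perception += signal + self.bias    # perceptions accumulate
        if self.perception > 3.0:
            self.decision = "accept"             # action: brand acceptance
        elif self.perception < -3.0:
            self.decision = "reject"             # action: brand rejection

random.seed(1)
population = [Consumer() for _ in range(1000)]   # heterogeneous population
for _ in range(50):                              # simulation time steps
    for agent in population:
        if agent.decision is None:
            agent.perceive_reason_act(random.gauss(0.1, 1.0))
rate = sum(a.decision == "accept" for a in population) / len(population)
print(f"brand acceptance rate: {rate:.0%}")
```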
Abstract:
"August 10, 1954"
Abstract:
"This work was supported in part by the Atomic Energy Commission and the Office of Naval Research under AEC Contract AT(11-1)-415."
Abstract:
Little consensus exists in the literature regarding methods for determining the onset of electromyographic (EMG) activity. The aim of this study was to compare the relative accuracy of a range of computer-based techniques with respect to EMG onset determined visually by an experienced examiner. Twenty-seven methods were compared, varying in EMG processing (low-pass filtering at 10, 50 and 500 Hz), threshold value (1, 2 and 3 SD beyond the mean of baseline activity) and the duration for which the mean must exceed the defined threshold (20, 50 and 100 ms). Three hundred randomly selected trials of a postural task were evaluated using each technique. The visual determination of EMG onset was found to be highly repeatable between days. Linear regression equations calculated for the values selected by each computer method indicated that the onset values selected by the majority of the parameter combinations deviated significantly from the visually derived onset values. Several methods accurately selected the time of onset of EMG activity and are recommended for future use. Copyright (C) 1996 Elsevier Science Ireland Ltd.
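The family of methods compared above reduces to a threshold-plus-duration rule: mark onset at the first point where the processed EMG stays above the baseline mean plus k standard deviations for a given duration. A minimal sketch of that rule follows; the rectification step, default parameter values, and function names are illustrative (the study's methods additionally low-pass filter the signal first), not the authors' code.

```python
import numpy as np

def emg_onset(signal, fs, baseline_ms=200, k_sd=2.0, min_dur_ms=50):
    """Return the sample index where the rectified EMG's running mean over
    a min_dur_ms window first exceeds baseline mean + k_sd * SD, or None."""
    x = np.abs(np.asarray(signal, dtype=float))  # full-wave rectification
    n_base = int(fs * baseline_ms / 1000)        # baseline window (samples)
    thresh = x[:n_base].mean() + k_sd * x[:n_base].std()
    n_win = int(fs * min_dur_ms / 1000)          # duration criterion (samples)
    # running mean of each n_win-sample window, starting at each index
    means = np.convolve(x, np.ones(n_win) / n_win, mode="valid")
    above = np.nonzero(means[n_base:] > thresh)[0]
    return n_base + above[0] if above.size else None
```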
Abstract:
The modern computer systems in use today are mostly processor-dominant: their memory is treated as a slave element whose major task is to serve the data requirements of the execution units. This organization is based on the classical von Neumann computer model, proposed seven decades ago in the 1950s. The model suffers from a substantial processor-memory bottleneck because of the huge disparity between processor and memory speeds. To address this problem, in this paper we propose a novel architecture and organization of processors and computers that provides a stronger match between the processing and memory elements in the system. The proposed model utilizes a memory-centric architecture, wherein execution hardware is added to the memory code blocks, allowing them to perform instruction scheduling and execution, manage data requests and responses, and communicate directly with the data memory blocks without using registers. This organization allows concurrent execution of all threads, processes or program segments that fit in memory at a given time. Therefore, in this paper we describe several possibilities for organizing the proposed memory-centric system with multiple data blocks and merged logic-memory blocks, utilizing a high-speed interconnection switching network.
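A toy sketch of the organization's central idea follows: code-holding blocks with their own execution hardware operate directly on shared data blocks, with no register file, and all resident blocks advance concurrently. The three-operand instruction format and all names are illustrative assumptions, not the paper's design.

```python
class DataBlock:
    """Shared data memory block, addressed directly (no register file)."""
    def __init__(self, size):
        self.cells = [0] * size

class LogicMemoryBlock:
    """A code block with its own execution hardware: it schedules and
    executes its instructions against data-block addresses directly.
    Instructions are (op, dst, src1, src2) tuples of data addresses."""
    def __init__(self, program, data):
        self.program, self.data, self.pc = program, data, 0

    def step(self):
        if self.pc >= len(self.program):
            return False                     # block has finished
        op, dst, a, b = self.program[self.pc]
        c = self.data.cells
        if op == "add":
            c[dst] = c[a] + c[b]
        elif op == "mul":
            c[dst] = c[a] * c[b]
        self.pc += 1
        return True

# All resident blocks advance concurrently (interleaved here), mimicking
# concurrent execution of everything that fits in memory at a given time.
data = DataBlock(8)
data.cells[:2] = [3, 4]
blocks = [LogicMemoryBlock([("add", 2, 0, 1)], data),
          LogicMemoryBlock([("mul", 3, 0, 1)], data)]
while sum(b.step() for b in blocks):
    pass
print(data.cells[2], data.cells[3])  # 7 12
```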
Abstract:
Little is known about the relation between genome organization and gene expression in Leishmania. Bioinformatic analysis can be used to predict genes and find homologies with known proteins. A model was proposed in which genes are organized into large clusters and transcribed from only one strand, in the form of large polycistronic primary transcripts. To verify the validity of this model, we studied gene expression at the transcriptional, post-transcriptional and translational levels in a unique 34 kb locus located on chr27 and represented by cosmid L979. Sequence analysis revealed 115 ORFs on either DNA strand. Using computer programs developed for Leishmania genes, only nine of these ORFs, localized on the same strand, were predicted to code for proteins, some of which show homologies with known proteins. Additionally, one pseudogene was identified. We verified the biological relevance of these predictions: mRNAs from nine predicted genes, and proteins from seven, were detected. Nuclear run-on analyses confirmed that the top strand is transcribed by RNA polymerase II and suggested that there is no polymerase entry site. Low levels of transcription were detected in regions of the bottom strand, and stable transcripts were identified for four ORFs on this strand not predicted to be protein-coding. In conclusion, the transcriptional organization of the Leishmania genome is complex, raising the possibility that computer predictions may not be comprehensive.
Abstract:
This paper analyzes the formation of Research Corporations as an alternative governance structure for performing R&D, compared to pursuing in-house R&D projects. Research Corporations are private, for-profit research centers that bring together several firms with similar research goals. In a Research Corporation, formal authority over the choice of projects is jointly exercised by the top management of the member firms. A private for-profit organization cannot commit not to interfere with the project choices of its researchers. However, increasing the number of member firms of the Research Corporation reduces each member firm's incentive to meddle with researchers' projects, because exercising formal authority over the choice of research projects is a public good. The Research Corporation thus offers researchers greater autonomy than a single firm pursuing an identical research program in its in-house R&D department, and therefore attracts higher-ability researchers than the internal R&D department does. The paper uses the theoretical model to analyze the organization of the Microelectronics and Computer Technology Corporation (MCC). The facts of this case confirm a tension between control over the choice of research projects and the ability of the researchers that the organization can attract or retain.
Abstract:
This report describes the work accomplished to date on research project HR-173, A Computer Based Information System for County Equipment Cost Records, and presents the initial design for this system. The specific topics discussed are the findings from the analysis of information needs, the system specifications developed from those findings, and the proposed system design based upon the specifications. The initial system design includes tentative input designs for capturing input data, output designs showing the output formats and the items to be output for use in decision making, a file design showing the organization of the information to be kept on each piece of equipment in the computer data file, and a general system design explaining how the entire system will operate. The Steering Committee appointed by the Iowa Highway Research Board is asked to study this report, make appropriate suggestions, and approve the proposed design subject to those suggestions. This approval will permit the designer to proceed promptly with the computer program implementation phase of the design.
Abstract:
We uncover the global organization of clustering in real complex networks. To this end, we ask whether triangles in real networks organize as in maximally random graphs with given degree and clustering distributions, or as in maximally ordered graph models where triangles are forced into modules. The answer comes by way of exploring m-core landscapes, where the m-core is defined, akin to the k-core, as the maximal subgraph whose edges each participate in at least m triangles. This property defines a set of nested subgraphs that, unlike k-cores, is able to distinguish between hierarchical and modular architectures. We find that the clustering organization in real networks is neither completely random nor ordered, although, surprisingly, it is more random than modular. This supports the idea that the structure of real networks may in fact be the outcome of self-organized processes based on local optimization rules, rather than global optimization principles.
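The m-core can be computed by edge peeling, analogous to the k-core: repeatedly delete edges that participate in fewer than m triangles until none remain. A small Python sketch under that definition follows; it is an illustration of the stated definition, not the authors' implementation.

```python
def m_core(edges, m):
    """Maximal subgraph in which every edge participates in at least m
    triangles, obtained by iterative edge peeling. Node labels must be
    orderable (e.g., integers)."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    changed = True
    while changed:
        changed = False
        for u in list(adj):
            for v in list(adj[u]):
                # triangles through edge (u, v) = common neighbours of u, v
                if len(adj[u] & adj[v]) < m:
                    adj[u].discard(v)
                    adj[v].discard(u)
                    changed = True
    return sorted((u, v) for u in adj for v in adj[u] if u < v)

# Two triangles sharing edge (1, 2): every edge lies in >= 1 triangle,
# but peeling with m = 2 removes everything once the outer edges go.
tri = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
print(m_core(tri, 1))  # all five edges
print(m_core(tri, 2))  # []
```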
Abstract:
Monte Carlo (MC) simulations have been used to study the structure of an intermediate thermal phase of poly(α-octadecyl γ,D-glutamate). This is a comb-like poly(γ-peptide) able to adopt a biphasic structure that has been described as a layered arrangement of backbone helical rods immersed in a paraffinic pool of polymethylene side chains. Simulations were performed at two different temperatures (348 and 363 K), both of them above the melting point of the paraffinic phase, using the configurational bias MC algorithm. Results indicate that layers are constituted by a side-by-side packing of 17/5 helices. The organization of the interlayer paraffinic region is described in atomistic terms by examining the torsional angles and the end-to-end distances of the octadecyl side chains. Comparison with previously reported comb-like poly(β-peptide)s revealed significant differences in the organization of the alkyl side chains.
Abstract:
The computer game industry has grown steadily for years, and in revenues it is comparable to the music and film industries. The industry has been moving to digital distribution. Computer gaming and the concept of the business model are discussed among industrial practitioners and the scientific community. The significance of the business model concept has increased in the scientific literature recently, although discussion of the concept is still ongoing. This thesis studies the role of the business model in the computer game industry. Computer game developers, designers, project managers and organization leaders in 11 computer game companies were interviewed. The data were analyzed to identify the important elements of the computer game business model, how the business model concept is perceived, and how the growth of an organization affects its business model. Human capital was identified as crucial to the business. As games are partly a product of creative thinking, innovation and the creative process are also highly valued, as are the technical skills needed to perform various activities. Marketing and customer relationships are likewise key elements of the computer game business model. Financing and partners are especially important for startups, when the organization depends on external funding and third-party assets. The results of this study provide organizations with an improved understanding of how the organization is built and which business model elements carry the most weight.
Abstract:
Modern organisms are adapted to a wide variety of habitats and lifestyles. The processes of evolution have led to the complex, interdependent, well-designed mechanisms of today's world, and the research challenge is to transpose these innovative solutions to problems in architectural design practice, i.e., to relate design by nature to design by human. In a design-by-human environment, design synthesis can be performed with rapid prototyping techniques that transform almost instantaneously any 2D design representation into a physical three-dimensional model through a rapid prototyping printer. Rapid prototyping processes add layers of material one on top of another until a complete model is built; an analogy can be drawn with design by nature, where the natural laying down of earth layers shapes the earth's surface, a process occurring repeatedly over long periods of time. Concurrence in design will particularly benefit from rapid prototyping techniques, as the prime purpose of physical prototyping is to promptly assist iterative design, enabling design participants to work with a three-dimensional hard copy and use it to validate their design ideas. Concurrent design is a systematic approach that aims to facilitate the simultaneous involvement and commitment of all participants in the building design process, enabling both an effective reduction of time and costs in the design phase and an improvement in the quality of the design product. This paper presents the results of an exploratory survey investigating both how computer-aided design systems help designers fully define the shape of their design ideas and the extent to which design practice applies rapid prototyping technologies coupled with Internet facilities. The findings suggest that design practitioners recognize that these technologies can greatly enhance concurrence in design, while acknowledging a lack of knowledge about rapid prototyping.
Abstract:
An unaltered rearrangement, at the algorithmic level, of the original computation of a neural-based predictor is introduced as a new organization. Its FPGA implementation generates circuits that are 1.7 times faster than a direct implementation of the original algorithm. This faster clock rate makes it possible to implement predictors with longer history lengths using nearly the same hardware budget.