922 results for Computer arithmetic and logic units


Relevance:

100.00%

Publisher:

Abstract:

In this concluding chapter, we bring together the threads and reflections on the chapters contained in this text and show how they relate to multi-level issues. The book has focused on the world of Human Resource Management (HRM) and the systems and practices it must put in place to foster innovation. Many of the contributions argue that in order to bring innovation about, organisations have to think carefully about the way in which they will integrate what is, in practice, organisationally relevant — but socially distributed — knowledge. They need to build a series of knowledge-intensive activities and networks, both within their own boundaries and across other important external inter-relationships. In so doing, they help to co-ordinate important information structures. They have, in effect, to find ways of enabling people to collaborate with each other at lower cost, by reducing both the costs of their co-ordination and the levels of unproductive search activity. They have to engineer these behaviours by reducing the risks for people that might be associated with incorrect ideas and help individuals, teams and business units to advance incomplete ideas that are so often difficult to codify. In short, a range of intangible assets must flow more rapidly throughout the organisation and an appropriate balance must be found between the rewards and incentives associated with creativity, novelty and innovation, versus the risks that innovation may also bring.

Relevance:

100.00%

Publisher:

Abstract:

Regional climate models (RCMs) provide reliable climatic predictions for the next 90 years with high horizontal and temporal resolution. In the 21st century, a northward latitudinal and upward altitudinal shift in the distribution of plant species and phytogeographical units is expected. It is discussed how the modeling of phytogeographical units can be reduced to modeling plant distributions. The predicted shift of the Moesz line is studied as a case study (with three different modeling approaches) using 36 parameters of the REMO regional climate data set, ArcGIS geographic information software, and the periods 1961-1990 (reference period), 2011-2040, and 2041-2070. The disadvantages of this relatively simple climate envelope modeling (CEM) approach are then discussed and several ways of improving the model are suggested. Several statistical and artificial intelligence (AI) methods (logistic regression, cluster analysis and other clustering methods, decision trees, evolutionary algorithms, artificial neural networks) can support development of the model. Among them, artificial neural networks (ANNs) seem to be the most suitable algorithm for this purpose, providing a black-box method for distribution modeling.
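The statistical step from a climate parameter to a presence/absence prediction can be sketched with a minimal logistic regression fitted by stochastic gradient descent; the single centred temperature predictor and the training data below are invented for illustration and are not drawn from the REMO data set.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit presence/absence (ys) against one climate predictor (xs)."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = sigmoid(w * x + b)
            w += lr * (y - p) * x   # gradient step on the log-likelihood
            b += lr * (y - p)
    return w, b

# Invented centred temperature anomalies and observed presences
temps    = [-4.0, -3.0, -2.0, -1.0, 1.0, 2.0, 3.0, 4.0]
presence = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(temps, presence)
predict = lambda x: sigmoid(w * x + b)   # probability of presence
```

An ANN replaces the single linear term `w * x + b` with one or more hidden layers but is trained by the same gradient principle, which is why it can act as a drop-in, if less interpretable, upgrade of the envelope model.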

Relevance:

100.00%

Publisher:

Abstract:

The spread of computer-based work can be examined from many perspectives and with many methods. Starting from the need to establish harmony between the human, the machine, and the environment, the author's research asks which factors influence the efficiency with which computer-based activities are organized and carried out. The study gives an overview of the main results of the opening phase of the research, namely computer-usage habits, which is essential for uncovering the critical factors tied to office and administrative activities and for grounding the subsequent measurement and development tasks. Beyond ergonomic considerations in the narrow sense, the author also brought the question of digital competences into the work, regarding it as relevant to measuring efficiency, since the choice of computer and the design of the workplace cannot be evaluated without the suitability of the human factor and the content of the task to be performed. __________ Office and administrative work, business correspondence, private contacts, and learning are increasingly supported by computers. Moreover, the technical possibilities of correspondence are wider than using a PC: it is accessible on the go from a cell phone. The author used a survey to analyse the characteristics of the devices used, the working environment, satisfaction factors connected with computer work, and digital competence. In his opinion, development from an ergonomic standpoint is important not only for adopting technological novelties but also for exploiting the present possibilities of hardware and environment, since many people cannot (or do not want to) keep up with the rapid technological development of computers by buying the newest devices. The study compares the characteristics of computer work at home and in the workplace. This research was carried out as part of the "TAMOP-4.2.1.B-10/2/KONV-2010-0001" project with support from the European Union, co-financed by the European Social Fund.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation describes the research carried out in developing an MPS (Multipurpose Portable System), which consists of an instrument and several accessories. The instrument is portable, hand-held, and rechargeable-battery operated, and it measures the temperature, absorbance, and concentration of samples using optical principles. The system also performs auxiliary functions such as incubation and mixing, and can be used in environmental, industrial, and medical applications. The research emphasis is on system modularity, easy configuration, measurement accuracy, power management schemes, reliability, low cost, computer interfacing, and networking. The instrument can send data to a computer for analysis and presentation, or to a printer. This dissertation presents a full working system, which involved integrating hardware, firmware for the micro-controller written in assembly language, software in C, and other application modules. The instrument contains the optics, transimpedance amplifiers, voltage-to-frequency converters, LCD display, lamp driver, battery charger, battery manager, timer, interface port, and micro-controller. The accessories are a printer, a data acquisition adapter (to transfer measurements to a computer via the printer port and expand the analog-to-digital conversion capability), a car plug adapter, and an AC transformer. The system has been fully evaluated for fault tolerance, and those schemes are also presented.
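The absorbance-to-concentration arithmetic such an optical instrument performs follows the Beer-Lambert law; a sketch in Python (the firmware itself is in assembly and C, and the coefficient values here are purely illustrative):

```python
import math

def absorbance(I0, I):
    """A = log10(I0 / I): reference intensity vs. transmitted intensity."""
    return math.log10(I0 / I)

def concentration(A, epsilon, path_cm):
    """Beer-Lambert law A = epsilon * c * l, solved for concentration c."""
    return A / (epsilon * path_cm)

# Illustrative values: 90% of the light is absorbed over a 1 cm path
A = absorbance(100.0, 10.0)                       # -> 1.0
c = concentration(A, epsilon=0.5, path_cm=1.0)    # -> 2.0
```

In the instrument, `I0` and `I` would come from the photodiode/transimpedance-amplifier chain via the voltage-to-frequency converters; the division and logarithm above are what the micro-controller firmware ultimately computes.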

Relevance:

100.00%

Publisher:

Abstract:

This dissertation is a study of customer relationship management theory and practice. Customer Relationship Management (CRM) is a business strategy whereby companies build strong relationships with existing and prospective customers with the goal of increasing organizational profitability. It is also a learning process that involves managing change in processes, people, and technology. CRM implementation and its ramifications are not yet completely understood, as evidenced by the high number of failed CRM implementations in organizations and the resulting disappointments. The goal of this dissertation is to study emerging issues and trends in CRM, including the effect of computer software and the accompanying new management processes on organizations, and the dynamics of aligning marketing, sales, services, and all other functions responsible for delivering customers a satisfying experience. To understand CRM better, a content analysis of more than one hundred articles and documents from academic and industry sources was undertaken, using a new methodological twist on the traditional method. An Internet domain (http://crm.fiu.edu) was created for this research by uploading an initial one hundred plus abstracts of articles and documents to form a knowledge database. Once the database was formed, a search engine was developed to enable searching the abstracts with relevant CRM keywords and so reveal the emergent dominant CRM topics. The ultimate aim of this website is to serve as an information hub for CRM research, as well as a search engine where interested parties can enter CRM-relevant keywords or phrases to access abstracts, and submit abstracts to enrich the knowledge hub. Research questions were investigated and answered by content-analyzing the interpretation and discussion of the dominant CRM topics and amalgamating the findings. This was supported by comparisons within and across individual, paired, and sets-of-three occurrences of CRM keywords in the article abstracts. The results show a lack of holistic thinking and discussion of CRM in both academia and industry, which is required to understand how the people, processes, and technology in CRM affect each other and determine successful implementation. Industry has to come to grips with CRM and holistically understand how these important dimensions interact. Only then will organizational learning occur and, over time, result in superior processes leading to strong, profitable customer relationships and a hard-to-imitate competitive advantage.
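The comparison of individual, paired, and sets-of-three keyword occurrences can be sketched as a co-occurrence count over abstracts; the keyword list and toy abstracts below are invented stand-ins for the actual crm.fiu.edu database:

```python
from collections import Counter
from itertools import combinations

# Hypothetical CRM keyword vocabulary (not the study's actual list)
KEYWORDS = {"crm", "technology", "process", "people", "strategy"}

def cooccurrence(abstracts, size=2):
    """Count how often each size-n set of keywords appears together in an abstract."""
    counts = Counter()
    for text in abstracts:
        found = sorted(KEYWORDS & set(text.lower().split()))
        for combo in combinations(found, size):
            counts[combo] += 1
    return counts

abstracts = [
    "crm is a strategy aligning people process and technology",
    "technology alone does not make crm succeed",
]
pairs = cooccurrence(abstracts, size=2)    # ("crm", "technology") appears in both
```

Setting `size=1` or `size=3` gives the individual and sets-of-three tallies described above; ranking the counters surfaces the dominant topics.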

Relevance:

100.00%

Publisher:

Abstract:

Today, over 15,000 Ion Mobility Spectrometry (IMS) analyzers are employed at security checkpoints worldwide to detect explosives and illicit drugs. Current portal IMS instruments and other electronic-nose technologies detect explosives and drugs by analyzing samples containing the headspace air and loose particles residing on a surface. Canines can outperform these systems at sampling and detecting low-vapor-pressure explosives and drugs, such as RDX, PETN, cocaine, and MDMA, because these biological detectors target the volatile signature compounds available in the headspace rather than the non-volatile parent compounds of the explosives and drugs. In this dissertation research, volatile signature compounds available in the headspace over explosive and drug samples were detected using SPME as a headspace sampling tool coupled to an IMS analyzer. A Genetic Algorithm (GA) technique was developed to optimize the operating conditions of a commercial IMS (GE Itemizer 2), leading to the successful detection of plastic explosives (Detasheet, Semtex H, and C-4) and illicit drugs (cocaine, MDMA, and marijuana). Short sampling times (10 sec to 5 min) were adequate to extract and preconcentrate sufficient analytes (> 20 ng) representing the volatile signatures in the headspace of a 15 mL glass vial or a quart-sized can containing ≤ 1 g of the bulk explosive or drug. Furthermore, a research-grade IMS with the flexibility to change operating conditions and physical configurations was designed and fabricated to accommodate future research into different analytes or configurations. The design and construction of the FIU-IMS were facilitated by computer modeling and simulation of ion behavior within an IMS. The simulation method developed uses SIMION/SDS and was evaluated against experimental data collected with a commercial IMS (PCP Phemto Chem 110). The FIU-IMS instrument has performance comparable to the GE Itemizer 2 (average resolving power of 14, resolution of 3 between two drugs and two explosives, and LODs ranging from 0.7 to 9 ng). The results of this dissertation further advance the concept of targeting volatile components to presumptively detect concealed bulk explosives and drugs by SPME-IMS, and the new FIU-IMS provides a flexible platform for future IMS research projects.
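The quoted figures of merit have standard IMS definitions: resolving power is drift time over the peak's full width at half maximum, and two-peak resolution follows from the drift-time separation and the peak widths. A sketch with illustrative numbers, not the dissertation's raw data:

```python
def resolving_power(drift_time_ms, fwhm_ms):
    """Single-peak resolving power: drift time / full width at half maximum."""
    return drift_time_ms / fwhm_ms

def resolution(t1_ms, w1_ms, t2_ms, w2_ms):
    """Two-peak resolution from drift-time separation and peak widths."""
    return 2.0 * (t2_ms - t1_ms) / (w1_ms + w2_ms)

# Illustrative peaks: a 14 ms drift time with a 1 ms wide peak,
# and two peaks 3 ms apart, each 1 ms wide
rp = resolving_power(14.0, 1.0)         # -> 14.0
rs = resolution(10.0, 1.0, 13.0, 1.0)   # -> 3.0
```

The GA in the dissertation would, in effect, search IMS operating conditions (drift gas flow, temperature, gate width, and so on) to maximize metrics of exactly this kind.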

Relevance:

100.00%

Publisher:

Abstract:

Given the importance of color processing in computer vision and computer graphics, estimating and rendering the illumination and spectral reflectance of image scenes is important to advancing a large class of applications such as scene reconstruction, rendering, surface segmentation, object recognition, and reflectance estimation. Consequently, this dissertation proposes effective methods for separating and rendering reflection components in single scene images. Based on the dichromatic reflectance model, a novel decomposition technique, the Mean-Shift Decomposition (MSD) method, is introduced to separate the specular from the diffuse reflectance components. This technique provides direct access to surface shape information through diffuse shading pixel isolation. More importantly, the process does not require any local color segmentation, which distinguishes it from traditional methods that operate by aggregating color information along each image plane. Exploiting the merits of the MSD method, a scene illumination rendering technique is designed to estimate the relative contribution of the specular reflectance attributes of a scene image. The targeted image feature subset provides direct access to the surface illumination information, while a newly introduced efficient rendering method reshapes the dynamic range distribution of the specular reflectance components over each image color channel. This image enhancement technique renders the scene illumination reflection effectively without altering the scene's diffuse surface attributes, contributing to realistic rendering effects. As an ancillary contribution, an effective color constancy algorithm based on the dichromatic reflectance model was also developed. This algorithm selects image highlights in order to extract the prominent surface reflectance that reproduces the exact illumination chromaticity, using a novel voting scheme based on histogram analysis. In each of the three main contributions, empirical evaluations were performed on synthetic and real-world image scenes taken from three different color image datasets. The experimental results show over 90% accuracy in illumination estimation, contributing to near-real-world illumination rendering effects.
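Under the dichromatic reflectance model, each pixel is a linear mix of a diffuse (surface) color and a specular (illuminant) color. The MSD method itself is more involved, but the underlying per-pixel decomposition can be sketched as a two-unknown least-squares fit; the chromaticity vectors below are made up for illustration:

```python
def decompose(pixel, diffuse, specular):
    """Least-squares solve for (m_d, m_s) in pixel ~= m_d*diffuse + m_s*specular."""
    # Normal equations for a 3-equation, 2-unknown linear system
    a11 = sum(d * d for d in diffuse)
    a12 = sum(d * s for d, s in zip(diffuse, specular))
    a22 = sum(s * s for s in specular)
    b1 = sum(p * d for p, d in zip(pixel, diffuse))
    b2 = sum(p * s for p, s in zip(pixel, specular))
    det = a11 * a22 - a12 * a12
    m_d = (b1 * a22 - b2 * a12) / det
    m_s = (a11 * b2 - a12 * b1) / det
    return m_d, m_s

# Made-up surface and illuminant chromaticities (RGB)
D = (0.8, 0.5, 0.2)
S = (0.577, 0.577, 0.577)
pixel = tuple(2.0 * d + 0.5 * s for d, s in zip(D, S))
m_d, m_s = decompose(pixel, D, S)    # recovers ~ (2.0, 0.5)
```

Subtracting `m_s * S` from each pixel leaves the diffuse layer, which is what gives direct access to shading, and hence shape, information.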

Relevance:

100.00%

Publisher:

Abstract:

The trend of green consumerism and the increased standardization of environmental regulations have driven multinational corporations (MNCs) to standardize environmental practices, or at least to seek to be associated with such behavior. In fact, many firms seek to free-ride on this global green movement without having the ecological footprint to substantiate their environmental claims. While scholars have articulated the benefits of such uniform global green operations, the challenges MNCs face in controlling and implementing them are understudied. For firms to translate environmental commitment into actual performance, the obstacles are substantial, particularly for the MNC. This is attributed to headquarters' (HQ) control challenges in (1) managing core elements of the corporate environmental management (CEM) process, specifically matching verbal commitment and policy with ecological performance, and (2) the fact that the MNC operates in multiple markets and HQ must implement policy across complex subsidiary networks consisting of diverse and distant units. Drawing on the literature on HQ challenges of MNC management and control, this study examines (1) how core components of the CEM process affect the optimization of global environmental performance (GEP) and (2), using network theory, how a subsidiary network's dimensions can impede the implementation of green management policies. It presents a framework for CEM comprising (1) the MNC's Verbal environmental commitment, (2) green policy Management, which sets standards for operations, (3) actual environmental Performance, reflected in a firm's ecological footprint, and (4) corporate environmental Reputation (VMPR). It then explains how an MNC's key subsidiary network dimensions (density, diversity, and dispersion) create challenges that weaken the relationship between green policy management and actual environmental performance. The study combines content analysis, multiple regression, and post-hoc hierarchical cluster analysis of US manufacturing MNCs. The findings support a significant positive effect of verbal environmental commitment and green policy management on actual global environmental performance and environmental reputation, as well as a direct impact of verbal environmental commitment on green policy management. Unexpectedly, the network dimensions were not found to moderate the relationship between green management policy and GEP.

Relevance:

100.00%

Publisher:

Abstract:

Fueled by the increasing human appetite for high computing performance, semiconductor technology has now marched into the deep sub-micron era. As transistor sizes keep shrinking, more and more transistors are integrated into a single chip. This has tremendously increased the power consumption and heat generation of IC chips. The rapidly growing heat dissipation greatly increases packaging and cooling costs and adversely affects the performance and reliability of a computing system. It also shortens the processor's life span and may even crash the entire computing system. Dynamic thermal management (DTM) has therefore become a critical problem in modern computer system design. Extensive theoretical research has been conducted on the DTM problem; however, most of it is based on theoretically idealized assumptions or simplified models. While these models and assumptions greatly simplify a complex problem and make it theoretically manageable, practical computer systems and applications must deal with many factors and details beyond them. The goal of our research was to develop a test platform that can be used to validate theoretical results on DTM under well-controlled conditions, to identify the limitations of existing theoretical results, and to develop new, practical DTM techniques. This dissertation details the background and our research efforts in this endeavor. Specifically, we first developed a customized test platform based on an Intel desktop. We then tested a number of related theoretical works and examined their limitations in a practical hardware environment. With these limitations in mind, we developed a new reactive thermal management algorithm for single-core computing systems that optimizes throughput under a peak temperature constraint. We further extended our research to a multicore platform and developed an effective proactive DTM technique for throughput maximization on multicore processors based on task migration and dynamic voltage and frequency scaling (DVFS). The significance of our research lies in the fact that it complements the extensive current theoretical research in dealing with increasingly critical thermal problems, enabling the continued evolution of high-performance computing systems.
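A reactive policy of this general kind can be sketched against a toy first-order thermal model: throttle the frequency when the measured temperature nears the peak constraint, and restore it when there is headroom. All constants are invented; a real platform would read on-die temperature sensors and switch P-states rather than update a simulated temperature.

```python
def simulate(steps=2000, dt=0.1, t_amb=40.0, t_max=80.0):
    """Toy model: dT/dt = heating(f) - convective loss; reactive throttling."""
    T, work = t_amb, 0.0
    peak = T
    for _ in range(steps):
        # Reactive decision: halve frequency when within 5 degrees of the cap
        f = 0.5 if T >= t_max - 5.0 else 1.0
        T += dt * (60.0 * f - (T - t_amb))   # heating minus cooling
        peak = max(peak, T)
        work += f * dt                       # accumulated throughput proxy
    return peak, work

peak, work = simulate()   # peak stays below the 80-degree constraint
```

At full frequency this model would settle at 100 degrees, well over the cap; the reactive rule instead oscillates just under the threshold, trading some throughput for a guaranteed peak temperature. A proactive multicore scheme, as in the dissertation, would predict the temperature trajectory and migrate tasks or scale voltage/frequency before the threshold is reached.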

Relevance:

100.00%

Publisher:

Abstract:

The local area network (LAN) interconnecting computer systems and software can make a significant contribution to the hospitality industry. The author discusses the advantages and disadvantages of such systems.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this research was to investigate the relationship of computer anxiety to selected demographic variables (learning style, age, gender, ethnicity, teaching/professional area, educational level, and school type) among vocational-technical educators. The subjects (n = 202) were randomly selected vocational-technical educators from the Dade County Public School System, Florida, stratified across teaching/professional areas. All subjects received the same survey package in the spring of 1996. Subjects self-reported their learning style and level of computer anxiety by completing Kolb's Learning Style Inventory (LSI) and Oetting's Computer Anxiety Scale (COMPAS, Short Form). Subjects' general demographic information and their experience with computers were collected through a self-reported Participant Inventory Form. The distribution of scores suggested that some educators (25%) experienced some overall computer anxiety. There were significant correlations between computer anxiety and computer-related experience, as indicated by self-ranked computer competence and computer-based training. One-way analyses of variance (ANOVA) indicated no significant differences in computer anxiety or computer-related experience across learning style, age, and ethnicity. There were significant differences in computer anxiety and/or computer-related experience across educational level, teaching area, and school type. T-tests indicated significant gender differences in computer-related experience, but no gender difference in computer anxiety. Analyses of covariance (ANCOVA) were performed for each independent variable on computer anxiety, with computer-related experience (self-ranked computer competence and computer-based training) as the covariates. There were significant main effects of educational level and school type on computer anxiety; all other variables were non-significant. ANCOVA also revealed that the effect of learning style on computer anxiety varied notably. All analyses were conducted at the .05 level of significance.
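The one-way ANOVA F statistic used throughout can be computed from first principles as the ratio of between-group to within-group mean squares; the scores below are fabricated and are not the survey data:

```python
def one_way_anova_F(groups):
    """F = between-group mean square / within-group mean square."""
    n = sum(len(g) for g in groups)            # total observations
    k = len(groups)                            # number of groups
    grand = sum(sum(g) for g in groups) / n    # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Fabricated anxiety scores for two hypothetical learning-style groups
F = one_way_anova_F([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]])   # -> 1.5
```

The computed F would then be compared against the critical F value for (k-1, n-k) degrees of freedom at the .05 level to decide significance.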


Relevance:

100.00%

Publisher:

Abstract:

One of the major problems in the analysis of beams whose moment of inertia varies along their length is finding the fixed-end moments, stiffness, and carry-over factors. To determine the fixed-end moments, the non-prismatic member must be treated as composed of a large number of small sections of constant moment of inertia, and the M/EI value found for each individual section. This process takes a great deal of a designer's or structural engineer's time. The object of this thesis is to design a computer program that simplifies this repetitive process, obtaining the final moments and shears in continuous non-prismatic beams rapidly and effectively. For this purpose, the Column Analogy and Moment Distribution methods of Professor Hardy Cross are used as the principles behind the methodical computer solutions. The program is specifically designed to analyze continuous beams of up to four spans of any length, composed of symmetrical members with rectangular cross sections and rectilinear variation of the moment of inertia. Any load or combination of uniform and concentrated loads can be considered. Finally, sample problems are solved with the new computer program and with traditional methods to determine the accuracy and applicability of the program.
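The repetitive M/EI summation the program automates can be sketched numerically with the column analogy: the analogous column has width 1/EI(x) and is loaded with the simple-beam moment diagram divided by EI. For a prismatic member under a uniform load, the sketch should recover the textbook fixed-end moment wL^2/12. EI is supplied as a function so a rectilinear variation of the moment of inertia can be plugged in; the discretisation mirrors the small constant-I sections described above.

```python
def fixed_end_moments(L, w, EI, n=2000):
    """Column analogy: column of width 1/EI(x) loaded with M_s(x)/EI(x)."""
    dx = L / n
    xs = [(i + 0.5) * dx for i in range(n)]            # midpoints of sections
    widths = [1.0 / EI(x) for x in xs]                 # analogous column width
    A = sum(widths) * dx                               # column area
    xbar = sum(x * wd for x, wd in zip(xs, widths)) * dx / A   # centroid
    Ic = sum((x - xbar) ** 2 * wd for x, wd in zip(xs, widths)) * dx
    Ms = [w * x * (L - x) / 2.0 for x in xs]           # simple-beam moments (UDL)
    elastic = [m * wd for m, wd in zip(Ms, widths)]    # elastic load M_s/EI
    P = sum(elastic) * dx                              # total elastic load
    Mc = sum(e * (x - xbar) for e, x in zip(elastic, xs)) * dx
    # Fibre "stresses" at the two ends give the fixed-end moment magnitudes
    fem_A = P / A + Mc * (0.0 - xbar) / Ic
    fem_B = P / A + Mc * (L - xbar) / Ic
    return fem_A, fem_B

# Prismatic check: constant EI, w = 10, L = 6  ->  wL^2/12 = 30 at each end
fa, fb = fixed_end_moments(6.0, 10.0, lambda x: 1.0e4)
```

For a tapered member, only the `EI` function changes, e.g. `lambda x: E * I_of(x)` with a rectilinear I(x); the asymmetry then shows up through the nonzero moment term `Mc`.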

Relevance:

100.00%

Publisher:

Abstract:

Minimum Student Performance Standards in Computer Literacy and Science were passed by the Florida Legislature through the Educational Reform Act of 1983. This act mandated that all Florida high school graduates receive training in computer literacy, and schools and school systems were charged with determining the best methods of delivering this instruction to their students. The scope of this study is to evaluate one school's response to the state of Florida's computer literacy mandate. The study was conducted at Miami Palmetto Senior High School in Dade County, Florida, whose administration chose to develop and implement a new program to comply with the state mandate: integrating computer literacy into the existing biology curriculum. The study evaluated the curriculum to determine whether computer literacy could be integrated successfully while meeting both the biology and the computer literacy objectives. The findings showed no significant differences between the biology scores of students taking the integrated curriculum and those taking a traditional biology curriculum. Students in the integrated curriculum not only met the biology objectives as well as those in the traditional curriculum, they also successfully completed the intended objectives for computer literacy. Two sets of objectives were completed in the integrated classes in the same amount of time used to complete one set of objectives in the traditional biology classes. The integrated curriculum was therefore the more efficient means of meeting the intended objectives of both biology and computer literacy.

Relevance:

100.00%

Publisher:

Abstract:

Geo-referenced catch and fishing effort data for the bigeye tuna fisheries in the Indian Ocean over 1952-2014 were analysed and standardized to facilitate population dynamics modelling studies. During this sixty-two-year historical period of exploitation, many changes occurred in both the fishing techniques and the monitoring of fishing activity. This study includes a series of processing steps used to standardize spatial resolution, convert and standardize catch and effort units, raise geo-referenced catch to the nominal catch level, screen and correct outliers, and detect major catchability changes over long time series of fishing data, i.e., the Japanese longline fleet operating in the tropical Indian Ocean. A total of thirty fisheries were finally determined from the longline, purse seine, and other-gear data sets, of which ten longline and four purse seine fisheries represented 96% of the whole historical catch. The geo-referenced records consist of catch, fishing effort, and associated length-frequency samples for all fisheries.
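The raising step, i.e. scaling geo-referenced catch so that its total matches the nominal catch level, can be sketched as a simple proportional adjustment; the grid cells and tonnages below are invented:

```python
def raise_catch(geo_catch, nominal_total):
    """Scale geo-referenced cell catches so they sum to the nominal catch."""
    geo_total = sum(geo_catch.values())
    factor = nominal_total / geo_total     # raising factor, applied per cell
    return {cell: c * factor for cell, c in geo_catch.items()}

# Invented grid-cell catches (tonnes) covering 80% of a 500 t nominal catch
cells = {("55E", "10S"): 120.0, ("60E", "10S"): 200.0, ("60E", "15S"): 80.0}
raised = raise_catch(cells, nominal_total=500.0)
```

In practice the raising factor would be computed per fleet, gear, and time stratum before the cell-level scaling, but the proportional principle is the same.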