898 results for "Time and hardware redundancy"


Relevance: 100.00%

Abstract:

This paper provides concordance procedures for product-level trade and production data in the EU and examines the implications of changing product classifications on measured product adding and dropping at Belgian firms. Using the algorithms developed by Pierce and Schott (2012a, 2012b), the paper develops concordance procedures that allow researchers to trace changes in coding systems over time and to translate product-level production and trade data into a common classification that is consistent both within a single year and over time. Separate procedures are created for the eight-digit Combined Nomenclature system used to classify international trade activities at the product level within the European Union and for the eight-digit Prodcom categories used to classify products in European domestic production data. The paper further highlights important differences in coverage between the Prodcom and Combined Nomenclature classifications which need to be taken into account when generating combined domestic production and international trade data at the product level. The use of consistent product codes over time results in less product adding and dropping at continuing firms in the Belgian export and production data.
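The core of such a concordance procedure can be sketched as a connected-components computation: any codes ever linked by a classification revision are collapsed into one time-consistent "synthetic" family, so a recode no longer looks like a product being dropped and added. The union-find sketch below is illustrative (the code values and helper names are hypothetical), not the authors' actual implementation.

```python
# Sketch of building time-consistent "synthetic" product codes from
# year-to-year concordances, in the spirit of the Pierce-Schott
# approach (illustrative union-find, not their actual implementation).

class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def synthetic_codes(concordances):
    """concordances: iterable of (old_code, new_code) pairs across all
    revisions. Returns {code: family_id} mapping every code that ever
    appears to a single time-consistent family."""
    uf = UnionFind()
    codes = set()
    for old, new in concordances:
        uf.union(old, new)
        codes.update((old, new))
    # Label each family by its smallest member so ids are stable.
    families = {}
    for c in codes:
        families.setdefault(uf.find(c), []).append(c)
    label = {root: min(members) for root, members in families.items()}
    return {c: label[uf.find(c)] for c in codes}

# Hypothetical example: code 11111111 splits into 22222222 and
# 33333333 in a revision; all three fall into one synthetic family.
mapping = synthetic_codes([("11111111", "22222222"),
                           ("11111111", "33333333")])
```

Because both successor codes inherit the family label of the original code, a firm that keeps producing the same good across the revision shows no spurious product adding or dropping.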

Relevance: 100.00%

Abstract:

"September 1994."

Relevance: 100.00%

Abstract:

Automaticity (defined in this essay as short response time) and fluency in language use are closely connected, and some research has been conducted on several of the aspects involved. The notion of automaticity is in fact still debated, and many definitions of and opinions on what automaticity is have been suggested (Andersson, 1987, 1992, 1993; Logan, 1988; Segalowitz, 2010). One aspect that still needs more research is the correlation between vocabulary proficiency (a person's knowledge about words and ability to use them correctly) and response time in word recognition. The aim of this study has therefore been to investigate this correlation using two different tests: a vocabulary size test (Paul Nation) and a lexical decision task (SuperLab) that measures both response time and accuracy. Twenty-three Swedish students taking the English 7 course in Swedish upper secondary school were tested. The data were analyzed using a quantitative method in which the average values and correlations from the tests were used to compare the results. The correlations were calculated as Pearson's correlation coefficients. The empirical study indicates that vocabulary proficiency is not strongly correlated with shorter response times in word recognition. Rather, the data indicate that L2 learners are instead sensitive to the frequency levels of the vocabulary: accuracy (the number of correctly recognized words) and response times correlate with the frequency level of the tested words. This indicates that factors other than vocabulary proficiency are important for the ability to recognize words quickly.
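The correlation analysis described can be reproduced in a few lines; the sketch below computes Pearson's r from scratch, and the vocabulary scores and response times are invented for illustration, not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: vocabulary-size scores vs. mean response times (ms).
vocab = [6200, 7400, 8100, 8800, 9500]
rt_ms = [710, 695, 702, 688, 699]
r = pearson_r(vocab, rt_ms)  # weak correlation would mean |r| is small
```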

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 100.00%

Abstract:

The present research investigated the separate and interactive effects of the minor tranquilliser, temazepam, and a low dose of alcohol on the amplitude and latency of P300 and on reaction time. Twenty-four participants completed four drug treatments in a repeated measures design. The four drug treatments, organised as a fully repeated 2 x 2 design, included a placebo condition, an alcohol only condition, a temazepam only condition, and an alcohol and temazepam combined condition. Event-related potentials were recorded from midline sites Fz, Cz, and Pz within an oddball paradigm. The results indicated that temazepam, with or without the presence of alcohol, reduced P300 amplitude. Alcohol, on the other hand, with or without the presence of temazepam, affected processing speed and stimulus evaluation as indexed by reaction time and P300 latency. At the low dose levels used in this experiment alcohol and temazepam appear not to interact, which suggests that they affect different aspects of processing in the central nervous system. (C) 2003 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

Mechanisms that produce behavior that increases future survival chances provide an adaptive advantage. The flexibility of human behavior is at least partly the result of one such mechanism: our ability to travel mentally in time and entertain potential future scenarios. We can study mental time travel in children using language. Current results suggest that key developments occur between the ages of three and five. However, linguistic performance can be misleading as language itself is developing. We therefore advocate the use of methodologies that focus on future-oriented action. Mental time travel required profound changes in humans' motivational system, so that current behavior could be directed at securing not just present but individually anticipated future needs. Such behavior should be distinguishable from behavior based on current drives or on other mechanisms. We propose an experimental paradigm that provides subjects with an opportunity to act now to satisfy a need not currently experienced. This approach may be used to assess mental time travel in nonhuman animals. We conclude by describing a preliminary study employing an adaptation of this paradigm for children. (c) 2005 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

The purpose of this study was to examine the effects of different methods of measuring training volume, controlled in different ways, on selected variables that reflect acute neuromuscular responses. Eighteen resistance-trained males performed three fatiguing protocols of dynamic constant external resistance exercise, involving the elbow flexors, that manipulated either time-under-tension (TUT) or volume load (VL), defined as the product of training load and repetitions. Protocol A provided a standard for TUT and VL. Protocol B involved the same VL as Protocol A but only 40% of its concentric TUT; Protocol C was equated to Protocol A for TUT but involved only 50% of its VL. Fatigue was assessed by changes in maximum voluntary isometric contraction (MVIC), interpolated doublet (ID), and muscle twitch characteristics (peak twitch, time to peak twitch, half-relaxation time, and mean rates of force development and twitch relaxation). All protocols produced significant changes (P
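The volume-load definition and the equating logic can be made concrete with hypothetical numbers (the loads and repetition schemes below are illustrative, not the study's protocols):

```python
def volume_load(load_kg, reps, sets=1):
    """Volume load (VL) = training load x repetitions (x sets)."""
    return load_kg * reps * sets

# Protocol A: reference VL (hypothetical 3 sets of 10 reps at 30 kg).
vl_a = volume_load(load_kg=30, reps=10, sets=3)

# Protocol C keeps Protocol A's TUT but only 50% of its VL, e.g. by
# halving the load while keeping the same repetition scheme and tempo.
vl_c = volume_load(load_kg=15, reps=10, sets=3)
```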

Relevance: 100.00%

Abstract:

Bang-bang phase detector based PLLs are simple to design, suffer no systematic phase error, and can run at the highest speed at which a process can make a working flip-flop. For these reasons designers are employing them in the design of very high speed Clock Data Recovery (CDR) architectures. The major drawback of this class of PLL is the inherent jitter due to quantized phase and frequency corrections. Reducing the loop gain can proportionally improve jitter performance, but also reduces locking time and pull-in range. This paper presents a novel PLL design that dynamically scales its gain in order to achieve fast lock times while improving jitter performance in lock. Under certain circumstances the design also demonstrates improved capture range. This paper also analyses the behaviour of a bang-bang type PLL when far from lock, and demonstrates that the pull-in range is proportional to the square root of the PLL loop gain.
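The gain-scaling idea can be illustrated with a toy first-order model (not the paper's circuit): shrink the gain when the phase detector's early/late decisions alternate (a signature of lock, where a large gain only adds jitter) and grow it when the decisions keep the same sign (a signature of a frequency offset the loop is still chasing). All parameter values here are assumptions for the sketch.

```python
def bangbang_pll(freq_offset, steps=2000, gain=0.5,
                 shrink=0.9, grow=1.05, gmin=1e-4, gmax=1.0):
    """Toy first-order bang-bang PLL with dynamic gain scaling.
    Each step the reference phase advances by freq_offset; the loop
    bumps its own phase up or down by `gain` based only on the sign
    of the phase error (the bang-bang decision)."""
    phase_err, prev_sign = 0.0, 0
    errs = []
    for _ in range(steps):
        phase_err += freq_offset            # reference moves ahead
        s = 1 if phase_err > 0 else -1      # early/late decision
        phase_err -= gain * s               # bang-bang correction
        if s == prev_sign:
            gain = min(gain * grow, gmax)   # far from lock: speed up
        else:
            gain = max(gain * shrink, gmin) # in lock: reduce jitter
        prev_sign = s
        errs.append(abs(phase_err))
    return errs

# Large initial gain gives fast pull-in; the gain then decays toward
# the frequency offset, so the residual (jitter-like) error shrinks.
errs = bangbang_pll(freq_offset=0.02)
```

The same trade-off the abstract describes is visible in the model: a fixed small gain would lock slowly, a fixed large gain would hunt with large quantized corrections, and scaling the gain gets both fast lock and small in-lock error.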

Relevance: 100.00%

Abstract:

A major impediment to developing real-time computer vision systems has been the computational power and level of skill required to process video streams in real-time. This has meant that many researchers have either analysed video streams off-line or used expensive dedicated hardware acceleration techniques. Recent software and hardware developments have greatly eased the development burden of real-time image analysis, leading to the development of portable systems using cheap PC hardware and software exploiting the Multimedia Extension (MMX) instruction set of the Intel Pentium chip. This paper describes the implementation of a computationally efficient computer vision system for recognizing hand gestures, using efficient coding and MMX acceleration to achieve real-time performance on low cost hardware.

Relevance: 100.00%

Abstract:

In this thesis work we develop a new generative model of social networks belonging to the family of Time-Varying Networks. Correctly modelling the mechanisms shaping the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign, or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality and time-resolved datasets. This wealth of digital data has allowed researchers to deepen our understanding of the structure and properties of many real-world networks. Moreover, the empirical evidence of a temporal dimension in networks prompted the switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modeling tool belonging to the family of Time-Varying Networks) to develop a general dynamical model that encodes two fundamental mechanisms shaping a social network's topology and its temporal structure: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random, but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We then empirically measure the properties of these two mechanisms in seven real-world datasets, develop a data-driven model, and solve it analytically. We check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two.
Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of Time-Varying Networks. This model is inspired by Kauffman's adjacent-possible theory and is based on a generalized version of the Pólya urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
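A minimal activity-driven network with a social-capital (memory) mechanism can be sketched as follows. The reinforcement rule c/(c+k), the activity distribution, and all parameter values are illustrative assumptions for the sketch, not the thesis's calibrated model.

```python
import random

def activity_driven_memory(n_nodes=200, steps=500, eta=10.0,
                           gamma=2.1, a_min=1e-3, c=1.0, seed=7):
    """Activity-driven temporal network with memory: an active node
    contacts a brand-new node with probability c/(c+k), where k is the
    number of distinct partners it already has, and otherwise returns
    to a known partner (social capital allocation)."""
    random.seed(seed)
    # Heavy-tailed activities a ~ a^(-gamma) via inverse-CDF sampling.
    acts = [min(eta * a_min * (1 - random.random()) ** (-1 / (gamma - 1)), 1.0)
            for _ in range(n_nodes)]
    partners = [set() for _ in range(n_nodes)]
    events = []                       # time-stamped contact list
    for t in range(steps):
        for i in range(n_nodes):
            if random.random() < acts[i]:
                k = len(partners[i])
                if k == 0 or random.random() < c / (c + k):
                    j = random.randrange(n_nodes)           # explore
                else:
                    j = random.choice(sorted(partners[i]))  # reinforce
                if j != i:
                    partners[i].add(j)
                    partners[j].add(i)
                    events.append((t, i, j))
    return events, partners

# Generate a small synthetic contact sequence (hypothetical parameters).
events, partners = activity_driven_memory()
```

The time-stamped event list is exactly the kind of object on which an epidemic process can then be simulated, which is why such generators are useful for forecasting spreading outcomes.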

Relevance: 100.00%

Abstract:

The role of technology management in achieving improved manufacturing performance has been receiving increased attention as enterprises become more exposed to competition from around the world. In the modern market for manufactured goods the demand is now for more product variety, better quality, shorter delivery and greater flexibility, while the financial and environmental cost of resources has become an urgent concern to manufacturing managers. This issue of the International Journal of Technology Management addresses the question of how the diffusion, implementation and management of technology can improve the performance of manufacturing industries. The authors come from a large number of different countries and their contributions cover a wide range of topics within this general theme. Some papers are conceptual; others report on research carried out in a range of different industries including steel production, iron founding, electronics, robotics, machinery, precision engineering, metal working and motor manufacture. In some cases they describe situations in specific countries. Several are based on presentations made at the UK Operations Management Association's Sixth International Conference, held at Aston University, at which the conference theme was 'Achieving Competitive Edge: Getting Ahead Through Technology and People'.

The first two papers deal with questions of advanced manufacturing technology implementation and management. Firstly, Beatty describes a three-year longitudinal field study carried out in ten Canadian manufacturing companies using CAD/CAM and CIM systems. Her findings relate to speed of implementation, choice of system type, the role of individuals in implementation, and organization and job design. This is followed by a paper by Bessant in which he argues that a more strategic approach should be taken towards the management of technology in the 1990s and beyond. Also considered in his paper are the capabilities necessary to deploy advanced manufacturing technology as a strategic resource and the way such capabilities might be developed within the firm. These two papers, which deal largely with the implementation of hardware, are supplemented by Samson and Sohal's contribution, in which they argue that a much wider perspective should be adopted, based on a new approach to manufacturing strategy formulation.

Technology transfer is the topic of the following two papers. Pohlen again takes the case of advanced manufacturing technology and reports on his research into the factors contributing to the successful realisation of AMT transfer. The paper by Lee then provides a more detailed account of technology transfer in the foundry industry: using a case study of a firm which has implemented a number of transferred innovations, a model is illustrated in which the 'performance gap' can be identified and closed.

The diffusion of technology is addressed in the next two papers. In the first of these, by Lowe and Sim, the managerial technologies of 'Just in Time' and 'Manufacturing Resource Planning' (or MRP II) are examined. A study is described from which a number of factors, including rate of diffusion and size, are found to influence the adoption process. Dahlin then considers the case of a specific item of hardware technology, the industrial robot. Her paper reviews the history of robot diffusion since the early 1960s and then attempts to predict how the industry will develop in the future.

The following two papers deal with the future of manufacturing in a more general sense. The future implementation of advanced manufacturing technology is the subject explored by de Haan and Peters, who describe the results of their Dutch Delphi forecasting study conducted among a panel of experts including scientists, consultants, users and suppliers of AMT. Busby and Fan then consider a type of organisational model, the 'extended manufacturing enterprise', which would represent a distinct alternative to pure market-led and command structures by exploiting the shared knowledge of suppliers and customers.

The three country-based papers consider strategic issues relating to manufacturing technology. In a paper based on investigations conducted in China, He, Liff and Steward report their findings from strategy analyses carried out in the steel and watch industries with a view to assessing technology needs and organizational change requirements. This is followed by Tang and Nam's paper, which examines the case of the machinery industry in Korea and its emerging importance as a key sector in the Korean economy. In his paper focusing on Venezuela, Ernst then considers the particular problem of how that country can address falling oil revenues. He sees manufacturing as an important contributor to Venezuela's future economy and proposes a means whereby government and private enterprise can co-operate in the development of the manufacturing sector.

The last six papers all deal with specific topics relating to the management of manufacturing. Firstly, Youssef looks at the question of manufacturing flexibility, introducing and testing a conceptual model that relates computer-based technologies to flexibility. Dangerfield's paper, which follows, is based on research conducted in the steel industry; he considers the question of scale and proposes a modelling approach for determining the plant configuration necessary to meet market demand. Engstrom presents the results of a detailed investigation into the need for reorganising material flow where group assembly of products has been adopted. Sherwood, Guerrier and Dale then report the findings of a study into the effectiveness of Quality Circle implementation. Stillwagon and Burns consider how manufacturing competitiveness can be improved in individual firms, describing how the application of 'human performance engineering' can be used to motivate individual performance as well as to integrate organizational goals. Finally, Sohal, Lewis and Samson describe, using a case study example, how just-in-time control can be applied within the context of computer numerically controlled flexible machining lines.

The papers in this issue of the International Journal of Technology Management cover a wide range of topics relating to the general question of improving manufacturing performance through the dissemination, implementation and management of technology. Although they differ markedly in content and approach, they share the collective aim of addressing the concepts, principles and practices which provide a better understanding of the technology of manufacturing and assist in achieving and maintaining a competitive edge.

Relevance: 100.00%

Abstract:

Three different stoichiometric forms of RbxMn[Fe(CN)6]y·zH2O [x = 0.96, y = 0.98, z = 0.75 (1); x = 0.94, y = 0.88, z = 2.17 (2); x = 0.61, y = 0.86, z = 2.71 (3)] Prussian blue analogues were synthesized and investigated by magnetic, calorimetric, Raman spectroscopic, X-ray diffraction, and 57Fe Mössbauer spectroscopic methods. Compounds 1 and 2 show a hysteresis loop between the high-temperature (HT) Fe(S = 1/2)-CN-Mn(S = 5/2) and the low-temperature (LT) Fe(S = 0)-CN-Mn(S = 2) forms, of 61 and 135 K width centered at 273 and 215 K, respectively, whereas the third compound remains in the HT phase down to 5 K. The splitting of the quadrupolar doublets in the 57Fe Mössbauer spectra reveals the electron-transfer-active centers. Refinement of the X-ray powder diffraction profiles shows that electron-transfer-active materials have the majority of the Rb ions on only one of the two possible interstitial sites, whereas non-electron-transfer-active materials have the Rb ions equally distributed. Moreover, the stability of the compounds over time and following heat treatment is also discussed. © Wiley-VCH Verlag GmbH & Co. KGaA, 2009.

Relevance: 100.00%

Abstract:

Deep hole drilling is one of the most complicated metal cutting processes and one of the most difficult to perform on CNC machine-tools or machining centres under conditions of limited manpower or unmanned operation. This research work investigates aspects of the deep hole drilling process with small diameter twist drills and presents a prototype system for real time process monitoring and adaptive control; two main research objectives are fulfilled. The first objective is the experimental investigation of the mechanics of the deep hole drilling process, using twist drills without internal coolant supply, in the range of diameters Ø2.4 to Ø4.5 mm and working lengths up to 40 diameters. This includes the definition of the problems associated with the low strength of these tools and the study of the mechanisms of catastrophic failure, which manifest themselves well before and along with the classic mechanism of tool wear. The relationships of drilling thrust and torque with the depth of penetration and the various machining conditions are also investigated, and the experimental evidence suggests that the process is inherently unstable at depths beyond a few diameters. The second objective is the design and implementation of a system for intelligent CNC deep hole drilling, the main task of which is to ensure the integrity of the process and the safety of the tool and the workpiece. This task is achieved by interfacing the CNC system of the machine tool to an external computer which performs the following functions: on-line monitoring of the drilling thrust and torque; adaptive control of feed rate, spindle speed and tool penetration (Z-axis); indirect monitoring of tool wear by pattern recognition of variations of the drilling thrust with cumulative cutting time and drilled depth; operation as a database for tools and workpieces; and finally the issuing of alarms and diagnostic messages.
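One step of the kind of adaptive feed-rate rule described (back off when the measured thrust approaches a safety limit for the slender drill, cautiously recover productivity otherwise) might look like the sketch below; all thresholds and rates are illustrative assumptions, not the thesis's values.

```python
def adaptive_feed(thrust_history, feed, thrust_limit=180.0,
                  back_off=0.8, recover=1.05,
                  feed_min=0.02, feed_max=0.12):
    """One control step of a threshold-based adaptive feed-rate rule
    (units hypothetical: thrust in N, feed in mm/rev).  Cuts the feed
    when the latest drilling thrust exceeds a safety limit for the
    tool, and cautiously restores it otherwise."""
    latest = thrust_history[-1]
    if latest > thrust_limit:
        feed = max(feed * back_off, feed_min)   # protect the drill
    else:
        feed = min(feed * recover, feed_max)    # regain productivity
    return feed

# Thrust has risen past the limit as the hole deepens -> reduce feed.
feed = adaptive_feed([150.0, 175.0, 190.0], 0.10)
```

In the prototype system this kind of rule would run on the external computer, which closes the loop by commanding the CNC's feed-rate override.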

Relevance: 100.00%

Abstract:

The research, which was given the terms of reference 'To cut the lead time for getting new products into volume production', was sponsored by a company which develops and manufactures telecommunications equipment. The research described was based on studies of the development of two processors designed to control telephone exchanges in the public network. It was shown that for each of these products, which were large electronic systems containing both hardware and software, most of the lead time was taken up by development. About half of this time was consumed by activities associated with redesign resulting from changes found to be necessary after the original design had been built. Analysis of the causes of design changes showed the most significant to be design faults. The reasons why these predominated were investigated by seeking the collective opinion of design staff and their management using a questionnaire. Using the results from these studies to build upon the work of other authors, a model of the development process of large hierarchical systems is derived. An important feature of this model is its representation of the iterative loops due to design changes. In order to reduce the development time, two closely related philosophies are proposed: first, that by spending more time at the early stages of development (detecting and remedying faults in the design), even greater savings can be made later on; and second, that the collective performance of the development organisation would be improved by increasing the amount and speed of feedback about that performance. A trial was performed to test these philosophies using readily available techniques for design verification. It showed that a saving of about 11 per cent would be made on the development time and that the philosophies might be equally successfully applied to other products and techniques.

Relevance: 100.00%

Abstract:

Despite the availability of various control techniques and project control software, many construction projects still do not achieve their cost and time objectives. Research in this area has so far mainly been devoted to identifying the causes of cost and time overruns. There is limited research geared towards studying the factors inhibiting the ability of practitioners to effectively control their projects. To fill this gap, a survey was conducted of 250 construction project organizations in the UK, followed by face-to-face interviews with experienced practitioners from 15 of these organizations. The common factors that inhibit both time and cost control during construction projects were first identified. Subsequently, 90 mitigating measures were developed and recommended for the top five inhibiting factors: design changes, risks/uncertainties, inaccurate evaluation of project time/duration, complexities, and non-performance of subcontractors. These mitigating measures were classified as preventive, predictive, corrective and organizational measures. They can be used as a checklist of good practice and help project managers to improve the effectiveness of control of their projects.