964 results for Computer input-output equipment.
Abstract:
This paper is concerned with the integration of voice and data on an experimental local area network used by the School of Automation of the Indian Institute of Science. SALAN (School of Automation Local Area Network) consists of a number of microprocessor-based communication nodes linked to a shared coaxial cable transmission medium. The communication nodes handle the various low-level functions associated with computer communication, and interface user data equipment to the network. SALAN at present provides a file transfer facility between an Intel Series III microcomputer development system and a Texas Instruments Model 990/4 microcomputer system. Further, a packet voice communication system has also been implemented on SALAN. The various aspects of the design and implementation of the above two utilities are discussed.
Abstract:
Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory. Nevertheless, in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking. 
In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, like temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. In another study it was demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM are utilized because access is too slow. These findings imply a change in thinking about the design of interfaces. Several novel principles of design are presented, based on the idea of supporting the deployment of LTWM in the main task.
Abstract:
The present study examined how personality and social psychological factors affect third and fourth graders' computer-mediated communication. Personality was analysed in terms of the following strategies: optimism, pessimism and defensive pessimism. Students worked either individually or in dyads which were paired homogeneously or heterogeneously according to the strategies. Moreover, the present study compared horizontal and vertical interaction. The study also examined the role that popularity plays, and students were divided into groups based on their popularity level. The results show that an optimistic strategy is useful. Optimism was found to be related to the active production and processing of ideas. Although previous research has identified drawbacks to pessimism in achievement settings, this study shows that the pessimistic strategy is not as debilitating a strategy as is usually assumed. Pessimistic students were able to process their ideas. However, defensive pessimists were somewhat cautious in introducing or changing ideas. Heterogeneous dyads were not beneficial configurations with respect to producing, introducing, or changing ideas. Moreover, many differences were found to exist between the horizontal and vertical interaction; specifically, the students expressed more opinions and feelings when teachers took no part in the discussions. Strong emotions were observed especially in the horizontal interaction. Further, group working skills were found to be more important for boys than for girls, while rejected students were not at a disadvantage compared to popular ones. Schools can encourage emotional and social learning. The present study shows that students can use computers to express their feelings. In addition, students who are unpopular in non-computer contexts or students who use pessimism can benefit from computers. Participation in computer discussions can give unpopular children a chance to develop confidence when relating to peers.
Abstract:
Although incidence matrix representation has been used to analyze Petri net based models of a system, it has the limitation that it does not preserve reflexive properties (i.e., the presence of self-loops) of Petri nets. But in many practical applications self-loops play very important roles. This paper proposes a new representation scheme for general Petri nets. This scheme defines a matrix called the "reflexive incidence matrix (RIM) C," which is a combination of two matrices: a "base matrix Cb" and a "power matrix Cp." This scheme preserves the reflexive and other properties of the Petri nets. Through a detailed analysis it is shown that the proposed scheme requires less memory space and less processing time for answering commonly encountered net queries compared to other schemes. Algorithms to generate the RIM from the given net description and to decompose the RIM into input and output function matrices are also given. The proposed Petri net representation scheme is very useful to model and analyze systems having shared resources, chemical processes, network protocols, etc., and to evaluate the performance of asynchronous concurrent systems.
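The limitation the paper addresses is easy to demonstrate. In the standard incidence matrix C = O - I (output function matrix minus input function matrix), a self-loop contributes +1 and -1 to the same entry and cancels out. A minimal sketch (the tiny net below is illustrative, not from the paper):

```python
# Sketch: the standard incidence matrix C = O - I loses self-loops.
# Net: place p0 has a self-loop on transition t0 (t0 both consumes from
# and produces to p0); t0 also has an ordinary output arc to p1.
# Rows = places, columns = transitions.

I = [[1],   # input function: arc p0 -> t0 (one half of the self-loop)
     [0]]   # p1 has no arc into t0
O = [[1],   # output function: arc t0 -> p0 (other half of the self-loop)
     [1]]   # ordinary arc t0 -> p1

C = [[O[p][t] - I[p][t] for t in range(1)] for p in range(2)]
print(C)  # [[0], [1]] -- the self-loop on p0 has vanished (entry is 0)
```

The RIM scheme is motivated by exactly this cancellation: the base and power matrices keep the self-loop information that the plain difference O - I destroys.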
Abstract:
This paper describes an algorithm to compute the union, intersection and difference of two polygons using a scan-grid approach. Basically, in this method, the screen is divided into cells and the algorithm is applied to each cell in turn. The output from all the cells is integrated to yield a representation of the output polygon. In most cells, no computation is required and thus the algorithm is a fast one. The algorithm has been implemented for polygons but can be extended to polyhedra as well. The algorithm is shown to take O(N) time in the average case where N is the total number of edges of the two input polygons.
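A coarse, cell-level illustration of the scan-grid idea (classify each cell against both input polygons, then combine the classifications with boolean logic) can be sketched as follows. This is only the cell-classification skeleton under assumed toy polygons; the paper's algorithm is edge-exact within each cell:

```python
# Coarse sketch of the scan-grid idea: classify each grid cell's centre
# against both polygons, then combine cells with boolean logic.
# (The paper's algorithm is edge-exact per cell; this sketch only
# approximates each cell as entirely in or out.)

def inside(pt, poly):
    """Ray-casting point-in-polygon test."""
    x, y = pt
    hit = False
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the ray's level
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                hit = not hit
    return hit

def scan_grid(poly_a, poly_b, op, size=8):
    """Return a size x size boolean grid for op in {'union', 'inter', 'diff'}."""
    combine = {'union': lambda a, b: a or b,
               'inter': lambda a, b: a and b,
               'diff':  lambda a, b: a and not b}[op]
    grid = []
    for j in range(size):
        row = []
        for i in range(size):
            c = ((i + 0.5) / size, (j + 0.5) / size)   # cell centre in the unit square
            row.append(combine(inside(c, poly_a), inside(c, poly_b)))
        grid.append(row)
    return grid

# Two overlapping axis-aligned rectangles in the unit square:
A = [(0.0, 0.0), (0.6, 0.0), (0.6, 1.0), (0.0, 1.0)]
B = [(0.4, 0.0), (1.0, 0.0), (1.0, 1.0), (0.4, 1.0)]
inter = scan_grid(A, B, 'inter')   # True only where the rectangles overlap
```

The speed claim in the abstract corresponds to the observation that most cells are entirely inside or outside both polygons, so the per-cell work degenerates to a trivial classification.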
Abstract:
Deep convolutional network models have dominated recent work in human action recognition as well as image classification. However, these methods are often unduly influenced by the image background, learning and exploiting the presence of cues in typical computer vision datasets. For unbiased robotics applications, the degree of variation and novelty in action backgrounds is far greater than in computer vision datasets. To address this challenge, we propose an “action region proposal” method that, informed by optical flow, extracts image regions likely to contain actions for input into the network both during training and testing. In a range of experiments, we demonstrate that manually segmenting the background is not enough; but through active action region proposals during training and testing, state-of-the-art or better performance can be achieved on individual spatial and temporal video components. Finally, we show by focusing attention through action region proposals, we can further improve upon the existing state-of-the-art in spatio-temporally fused action recognition performance.
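The core of a flow-informed region proposal can be illustrated very simply: threshold an optical-flow magnitude map and crop to the bounding box of the moving pixels. The function name, threshold heuristic, and padding below are assumptions for illustration; the paper's proposal mechanism is more involved:

```python
import numpy as np

# Illustrative sketch only: derive a crop ("action region proposal")
# from an optical-flow magnitude map by thresholding motion and taking
# the bounding box of the moving pixels. The threshold rule and names
# here are assumptions, not the paper's exact method.

def flow_region_proposal(flow_mag, thresh=None, pad=2):
    """Return (top, bottom, left, right) bounding the high-motion region."""
    h, w = flow_mag.shape
    if thresh is None:
        thresh = flow_mag.mean() + flow_mag.std()   # assumed heuristic
    ys, xs = np.nonzero(flow_mag > thresh)
    if len(ys) == 0:                                # no motion: full frame
        return 0, h, 0, w
    return (max(ys.min() - pad, 0), min(ys.max() + pad + 1, h),
            max(xs.min() - pad, 0), min(xs.max() + pad + 1, w))

# Synthetic example: a 64x64 frame with motion in a 10x10 patch.
mag = np.zeros((64, 64))
mag[20:30, 40:50] = 5.0
t, b, l, r = flow_region_proposal(mag)   # crop around rows 20-29, cols 40-49
```

In the paper's setting such a crop would be fed to the network during both training and testing, so the classifier never sees (and cannot latch onto) the static background.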
Abstract:
The potential of beef producers to profitably produce 500-kg steers at 2.5 years of age in northern Australia's dry tropics to meet specifications of high-value markets, using a high-input management (HIM) system was examined. HIM included targeted high levels of fortified molasses supplementation, short seasonal mating and the use of growth promotants. Using herds of 300-400 females plus steer progeny at three sites, HIM was compared at a business level to prevailing best-practice, strategic low-input management (SLIM) in which there is a relatively low usage of energy concentrates to supplement pasture intake. The data presented for each breeding-age cohort within management system at each site includes: annual pregnancy rates (range: 14-99%), time of conception, mortalities (range: 0-10%), progeny losses between confirmed pregnancy and weaning (range: 0-29%), and weaning rates (range: 14-92%) over the 2-year observation. Annual changes in weight and relative net worth were calculated for all breeding and non-breeding cohorts. Reasons for outcomes are discussed. Compared with SLIM herds, both weaning weights and annual growth were >= 30 kg higher, enabling 86-100% of HIM steers to exceed 500 kg at 2.5 years of age. Very few contemporary SLIM steers reached this target. HIM was most profitably applied to steers. Where HIM was able to achieve high pregnancy rates in yearlings, its application was recommended in females. Well managed, appropriate HIM systems increased profits by around $15/adult equivalent at prevailing beef and supplement prices. However, a 20% supplement price rise without a commensurate increase in values for young slaughter steers would generally eliminate this advantage. This study demonstrated the complexity of profitable application of research outcomes to commercial business, even when component research suggests that specific strategies may increase growth and reproductive efficiency and/or be more profitable. 
Because of the higher level of management required, higher costs and returns, and higher susceptibility to market changes and disease, HIM systems should only be applied after SLIM systems are well developed. To increase profitability, any strategy must ultimately either increase steer growth and sale values and/or enable a shift to high pregnancy rates in yearling heifers.
Abstract:
It has been suggested that semantic information processing is modularized according to the input form (e.g., visual, verbal, non-verbal sound). A great deal of research has concentrated on detecting a separate verbal module. Also, it has traditionally been assumed in linguistics that the meaning of a single clause is computed before integration to a wider context. Recent research has called these views into question. The present study explored whether it is reasonable to assume separate verbal and nonverbal semantic systems in the light of the evidence from event-related potentials (ERPs). The study also provided information on whether the context influences processing of a single clause before the local meaning is computed. The focus was on an ERP called the N400. Its amplitude is assumed to reflect the effort required to integrate an item into the preceding context. For instance, if a word is anomalous in its context, it will elicit a larger N400. The N400 has been observed in experiments using both verbal and nonverbal stimuli. Contents of a single sentence were not hypothesized to influence the N400 amplitude. Only the combined contents of the sentence and the picture were hypothesized to influence the N400. The subjects (n = 17) viewed pictures on a computer screen while hearing sentences through headphones. Their task was to judge the congruency of the picture and the sentence. There were four conditions: 1) the picture and the sentence were congruent and sensible, 2) the sentence and the picture were congruent, but the sentence ended anomalously, 3) the picture and the sentence were incongruent but sensible, 4) the picture and the sentence were incongruent and anomalous. Stimuli from the four conditions were presented in a semi-randomized sequence. The subjects' electroencephalography (EEG) was recorded simultaneously. ERPs were computed for the four conditions. The amplitude of the N400 effect was largest in the incongruent sentence-picture pairs. 
The anomalously ending sentences did not elicit a larger N400 than the sensible sentences. The results suggest that there is no separate verbal semantic system, and that the meaning of a single clause is not processed independent of the context.
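The ERP analysis the study relies on reduces to averaging EEG epochs time-locked to stimulus onset, per condition, so that stimulus-locked activity survives while unrelated activity averages out. A minimal sketch on purely synthetic data (sampling rate, trial counts, amplitudes, and the analysis window are illustrative, not the study's):

```python
import numpy as np

# Minimal sketch of ERP extraction: average EEG epochs time-locked to
# stimulus onset, per condition. The N400 appears as a negative
# deflection around 400 ms post-stimulus. All data here is synthetic.

rng = np.random.default_rng(0)
fs = 250                             # assumed sampling rate (Hz)
t = np.arange(-0.1, 0.8, 1 / fs)     # epoch: -100 ms .. 800 ms

def make_epochs(n_trials, n400_amp):
    """Synthetic epochs: noise plus a negative deflection near 400 ms."""
    component = -n400_amp * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    return component + rng.normal(0, 5, size=(n_trials, t.size))

erp_congruent = make_epochs(40, n400_amp=1.0).mean(axis=0)
erp_incongruent = make_epochs(40, n400_amp=6.0).mean(axis=0)

# Mean amplitude in the typical N400 window (300-500 ms): the
# incongruent condition should be more negative than the congruent one.
win = (t >= 0.3) & (t <= 0.5)
n400_effect = erp_incongruent[win].mean() - erp_congruent[win].mean()
```

Comparing such window means across conditions is what underlies statements like "the amplitude of the N400 effect was largest in the incongruent sentence-picture pairs."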
Abstract:
We present a fast algorithm for computing a Gomory-Hu tree or cut tree for an unweighted undirected graph G = (V, E). The expected running time of our algorithm is Õ(mc), where |E| = m and c is the maximum u-v edge connectivity over all u, v ∈ V. When the input graph is also simple (i.e., it has no parallel edges), the u-v edge connectivity for each pair of vertices u and v is at most n - 1; so the expected running time of our algorithm for simple unweighted graphs is Õ(mn). All the algorithms currently known for constructing a Gomory-Hu tree [8, 9] use n - 1 minimum s-t cut (i.e., max flow) subroutines. This, in conjunction with the current fastest Õ(n^(20/9)) max flow algorithm due to Karger and Levine [11], yields the current best running time of Õ(n^(20/9) · n) for Gomory-Hu tree construction on simple unweighted graphs with m edges and n vertices. Thus we present the first Õ(mn) algorithm for constructing a Gomory-Hu tree for simple unweighted graphs. We do not use a max flow subroutine here; instead we present an efficient tree packing algorithm for computing Steiner edge connectivity and use it as our main subroutine. The advantage of using a tree packing algorithm for constructing a Gomory-Hu tree is that the work done in computing a minimum Steiner cut for a Steiner set S ⊆ V can be reused for computing a minimum Steiner cut for certain Steiner sets S' ⊆ S.
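For contrast with the paper's tree-packing approach, the classical construction it improves on (n - 1 max-flow calls) can be sketched with Gusfield's simplification of the Gomory-Hu algorithm, here on a simple unweighted graph using a plain Edmonds-Karp max flow. This is the baseline method, not the paper's algorithm:

```python
from collections import deque

# Classical Gomory-Hu construction via n-1 max-flow calls (Gusfield's
# variant), on a simple unweighted undirected graph. This is the
# max-flow-based baseline the abstract contrasts with, not the paper's
# tree-packing algorithm.

def max_flow(n, edges, s, t):
    """Edmonds-Karp on an undirected unit-capacity graph.
    Returns (flow value, set of vertices on s's side of a min cut)."""
    cap = [[0] * n for _ in range(n)]
    for u, v in edges:
        cap[u][v] += 1
        cap[v][u] += 1
    flow = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:          # no augmenting path: done
            break
        v = t                        # augment one unit along the path
        while v != s:
            u = parent[v]
            cap[u][v] -= 1
            cap[v][u] += 1
            v = u
        flow += 1
    side = {v for v in range(n) if parent[v] != -1}
    return flow, side

def gomory_hu(n, edges):
    """Gusfield's algorithm: returns tree edges (u, parent_of_u, weight)."""
    parent = [0] * n
    tree = []
    for i in range(1, n):
        f, side = max_flow(n, edges, i, parent[i])
        tree.append((i, parent[i], f))
        for j in range(i + 1, n):
            if parent[j] == parent[i] and j in side:
                parent[j] = i
    return tree

# A 4-cycle: every pairwise edge connectivity is 2.
tree = gomory_hu(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
```

The minimum edge weight on the tree path between any two vertices equals their edge connectivity; the paper's contribution is achieving this without any max-flow subroutine.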
Abstract:
This paper demonstrates the application of an inverse filtering technique to power systems. To implement this method, the control objective should be based on a system variable that needs to be set to a specific value at each sampling time. A control input is calculated to generate the desired output of the plant, and the relationship between the two is used to design an auto-regressive model. The auto-regressive model is converted to a moving average model to calculate the control input based on the future values of the desired output. Therefore, the future values required to construct the output are predicted to generate the appropriate control input for the next sampling time.
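The core idea can be shown on a first-order example: inverting the plant model expresses the control input in terms of the future desired output. The model structure and coefficients below are illustrative assumptions, not taken from the paper:

```python
# Sketch of the inverse-filtering idea on a first-order plant:
#     y[k+1] = a*y[k] + b*u[k]
# Inverting it gives the control input from the *future* desired output:
#     u[k] = (y_des[k+1] - a*y[k]) / b
# Plant parameters are illustrative assumptions.

a, b = 0.9, 0.5

def inverse_filter_control(y_now, y_des_next):
    """Control input that drives the plant to y_des_next in one step."""
    return (y_des_next - a * y_now) / b

def plant_step(y_now, u):
    return a * y_now + b * u

# Track a ramp reference exactly, one step ahead:
y = 0.0
y_desired = [0.2 * k for k in range(1, 6)]
outputs = []
for y_des_next in y_desired:
    u = inverse_filter_control(y, y_des_next)
    y = plant_step(y, u)
    outputs.append(round(y, 10))
print(outputs)  # tracks [0.2, 0.4, 0.6, 0.8, 1.0]
```

Because u[k] depends on y_des[k+1], a real controller must predict the desired output one step ahead, which is exactly the prediction requirement the abstract describes.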
Abstract:
In this paper, we develop a cipher system based on finite field transforms. In this system, blocks of the input character-string are enciphered using congruence or modular transformations with respect to either primes or irreducible polynomials over a finite field. The polynomial system is shown to be clearly superior to the prime system for conventional cryptographic work.
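The prime-modulus ("congruence") variant can be illustrated with a toy block cipher: pack a block of characters into an integer, encipher with an affine map modulo a prime, and decipher with the modular inverse. The prime, key, and block size below are illustrative assumptions, and this is a demonstration of the arithmetic only, not a secure cipher:

```python
# Toy sketch of the congruence (prime-modulus) variant: pack a block of
# characters into an integer m, encipher with c = (A*m + B) mod P, and
# decipher with the modular inverse of A. Parameters are illustrative;
# not a secure cipher.

P = 2 ** 61 - 1               # a Mersenne prime, large enough for 7-byte blocks
A, B = 123456789, 987654321   # assumed secret key; gcd(A, P) = 1 since P is prime
A_INV = pow(A, -1, P)         # modular inverse (Python 3.8+)

def encipher(block):          # block: up to 7 bytes, so its value < 2**56 < P
    m = int.from_bytes(block, 'big')
    return (A * m + B) % P

def decipher(c, length):
    m = (A_INV * (c - B)) % P
    return m.to_bytes(length, 'big')

msg = b'ATTACK '
c = encipher(msg)
print(decipher(c, len(msg)))  # b'ATTACK '
```

The polynomial variant the paper favours replaces arithmetic modulo a prime with arithmetic modulo an irreducible polynomial over a finite field; the encipher/decipher symmetry via an inverse element is the same.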
Abstract:
This paper presents a new algorithm for the step-size change of an instantaneous adaptive delta modulator. The present strategy is such that the step-size at any sampling instant can increase or decrease by either of two constant factors or can remain the same, depending upon the combination of the three or four most recent output bits. The quantizer has been simulated on a digital computer, and its performance compared with other quantizers. The figure of merit used is the SNR with Gaussian signals as the input. The results indicate that the new design can give an improved SNR over a wider dynamic range and a fast response to step inputs, as compared to earlier systems.
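The general mechanism can be sketched as follows: at each sample the step size is multiplied by one of two constant factors, or held, based on the pattern of recent output bits. The specific rule (three equal bits grow the step, alternating bits shrink it) and the factor values below are illustrative assumptions, not the paper's exact design:

```python
# Sketch of adaptive delta modulation: the step size is multiplied by
# one of two constant factors (or held) based on the most recent output
# bits. The rule and factors here are illustrative assumptions.

GROW, SHRINK = 1.5, 0.66
MIN_STEP, MAX_STEP = 0.01, 2.0

def adaptive_delta_modulate(signal, step=0.1):
    est, bits, steps = 0.0, [], []
    for x in signal:
        bit = 1 if x >= est else 0          # 1: estimate below input
        bits.append(bit)
        if len(bits) >= 3 and bits[-1] == bits[-2] == bits[-3]:
            step = min(step * GROW, MAX_STEP)    # slope overload: speed up
        elif len(bits) >= 2 and bits[-1] != bits[-2]:
            step = max(step * SHRINK, MIN_STEP)  # granular noise: slow down
        est += step if bit else -step
        steps.append(step)
    return bits, steps

# A step input: the quantizer should grow its step size to catch up,
# which is the "fast response to step inputs" property.
bits, steps = adaptive_delta_modulate([0.0] * 5 + [1.0] * 20)
```

On the flat segment the alternating bits shrink the step (reducing granular noise); after the jump the run of 1-bits repeatedly grows it, so the estimate catches up quickly.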
Abstract:
The international traveller needs to plan ahead to ensure medicines are available and used as directed for an optimal therapeutic outcome. The planning needs to take account of legal and customs requirements for travelling with medicines for personal use. The standard advice from travel health providers is that travellers should check with the country of destination for requirements when travelling into the country with medicines for personal use. This is akin to introducing a barrier to care for this category of travellers. An innovative method of care for this group of travellers is needed.
Abstract:
In 2002, AFL Queensland and the Brisbane Lions Football Club approached the Department of Primary Industries and Fisheries (Queensland) for advice on improving their Premier League sports fields. They were concerned about player safety and dissatisfaction with playing surfaces, particularly uneven turf cover and variable under-foot conditions. They wanted to get the best from new investments in ground maintenance equipment and irrigation infrastructure. Their sports fields were representative of community-standard, multi-use venues throughout Australia; generally ‘natural’ soil fields, with low maintenance budgets, managed by volunteers. Improvements such as reconstruction, drainage, or regular re-turfing are generally not affordable. Our project aimed to: (a) Review current world practice and performance benchmarks; (b) Demonstrate best-practice management for community-standard fields; (c) Adapt relevant methods for surface performance testing; (d) Assess current soils, and investigate useful amendments; (e) Improve irrigation system performance; and (f) Build industry capacity and encourage patterns for ongoing learning. Most global sports field research focuses on elite, sand-based fields. We adjusted elite standards for surface performance (hardness, traction, soil moisture, evenness, sward cover/height) and maintenance programs, to suit community-standard fields with lesser input resources. In regularly auditing ground conditions across 12 AFLQ fields in SE QLD, we discovered surface hardness (measured by Clegg Hammer) was the No. 1 factor affecting player safety and surface performance. Other important indices were turf coverage and surface compaction (measured by penetrometer). AFLQ now regularly audits affiliated fields, and closes grounds with hardness readings greater than 190 Gmax. Aerating every two months was the primary mechanical practice improving surface condition and reducing hardness levels to < 110 Gmax on the renovated project fields. 
With irrigation installation, these fields now record surface conditions comparable to elite fields. These improvements encouraged many other sporting organisations to seek advice and assistance from the project team. AFLQ have since substantially invested in an expanded ground improvement program to cater for this increased demand. In auditing irrigation systems across project fields, we identified poor maintenance (with < 65% of sprinklers operating optimally) as a major problem. Retrofitting better nozzles and adjusting sprinklers improved irrigation distribution uniformity to 75-80%. Research showed that reducing irrigation frequency to weekly, and a preparedness to withhold irrigation longer after rain, reduced irrigation requirement by 30-50%, compared to industry benchmarks of 5-6 ML/ha/annum. Project team consultation with regulatory authorities enhanced irrigation efficiency under imposed regional water restrictions. Laboratory studies showed that incorporated biosolids/composts, or topdressed crumb rubber, improved the compaction resistance of soils. Field evaluations confirmed compost incorporation significantly reduced surface hardness of high-wear areas in dry conditions, whilst crumb rubber assisted turf persistence into early winter. Neither amendment was a panacea for poor agronomic practices. Under the project trade mark Sureplay®, we published > 80 articles and held > 100 extension activities involving > 2,000 participants. Sureplay® has developed a multi-level curator training structure and resource materials, subject to commercial implementation. The partnerships with industry bodies (particularly AFLQ), frequent extension activities, and engagement with government/regulatory sectors have been very successful, and are encouraged for any future work. 
Specific aspects of sports field management for further research include: (a) Understanding of factors affecting turf wear resistance and recovery, to improve turf persistence under wear; (b) Simple tests for pinpointing areas of fields with high hardness risk; and (c) Evaluation of new irrigation infrastructure, ‘water-saving’ devices, and irrigation protocols, in improving water use and turf cover outcomes.
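The distribution-uniformity figures quoted above are conventionally reported as "low-quarter DU": the mean of the lowest 25% of catch-can readings divided by the overall mean. A minimal sketch of that calculation, using made-up catch-can depths purely for illustration:

```python
# Low-quarter distribution uniformity (DU): mean of the lowest 25% of
# catch-can readings divided by the overall mean. Catch-can depths
# below are made-up illustrative values, not project data.

def low_quarter_du(readings):
    ordered = sorted(readings)
    quarter = ordered[:max(1, len(ordered) // 4)]   # lowest 25% of cans
    return (sum(quarter) / len(quarter)) / (sum(readings) / len(readings))

catch_cans = [10, 12, 8, 11, 9, 13, 12, 5]   # mm collected per can
du_percent = round(low_quarter_du(catch_cans) * 100)
print(du_percent)  # 65
```

A DU of 75-80%, as achieved after retrofitting nozzles, means the driest quarter of the field receives at least three-quarters of the average applied depth, which is why uniformity improvements translate directly into water savings.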