920 results for network of workstations
Abstract:
Everyday human behaviour relies on our ability to predict outcomes on the basis of moment-by-moment information. Long-range neural phase synchronization has been hypothesized as a mechanism by which ‘predictions’ can exert an effect on the processing of incoming sensory events. Using magnetoencephalography (MEG), we have studied the relationship between the modulation of phase synchronization in a cerebral network of areas involved in visual target processing and the predictability of target occurrence. Our results reveal a striking increase in the modulation of phase synchronization associated with an increased probability of target occurrence. These observations are consistent with the hypothesis that long-range phase synchronization plays a critical functional role in humans' ability to employ predictive heuristics effectively.
Abstract:
This paper models how the structure and function of a network of firms affect their aggregate innovativeness. Each firm has the potential to innovate, either through in-house R&D or through innovation spillovers from neighboring firms. The nature of innovation spillovers depends upon network density, the commonality of knowledge between firms, and the learning capability of firms. Innovation spillovers are modelled in detail using ideas from organizational theory. Two main results emerge: (i) the marginal effect of spillover intensity on innovativeness is non-monotonic, and (ii) network density can affect innovativeness, but only when firms are heterogeneous.
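As an illustration of the kind of model described in this abstract, the following is a minimal simulation sketch: firms on a random network either innovate in-house or absorb innovations from linked neighbours with a given spillover intensity. All parameters, update rules, and names here are chosen for illustration and are not taken from the paper.

```java
import java.util.*;

// Minimal sketch of an innovation-spillover model on a firm network.
// Parameters and update rules are illustrative, not the paper's specification.
public class SpilloverSim {
    public static void main(String[] args) {
        int nFirms = 50;
        double density = 0.1;        // probability that any two firms are linked
        double inHouseRate = 0.05;   // chance a firm innovates from its own R&D in a period
        double spillover = 0.3;      // chance of absorbing an innovation from a linked innovator
        Random rng = new Random(42);

        // Random network whose connectivity is controlled by the density parameter
        boolean[][] link = new boolean[nFirms][nFirms];
        for (int i = 0; i < nFirms; i++)
            for (int j = i + 1; j < nFirms; j++)
                link[i][j] = link[j][i] = rng.nextDouble() < density;

        boolean[] innovated = new boolean[nFirms];
        for (int t = 0; t < 100; t++) {
            boolean[] next = innovated.clone();
            for (int i = 0; i < nFirms; i++) {
                if (innovated[i]) continue;
                if (rng.nextDouble() < inHouseRate) { next[i] = true; continue; }
                for (int j = 0; j < nFirms; j++)   // spillovers from innovating neighbours
                    if (link[i][j] && innovated[j] && rng.nextDouble() < spillover) {
                        next[i] = true; break;
                    }
            }
            innovated = next;
        }
        long total = 0;
        for (boolean b : innovated) if (b) total++;
        System.out.println("Innovating firms after 100 periods: " + total + "/" + nFirms);
    }
}
```

Varying the density and spillover parameters in such a toy model is one way to explore, informally, the kind of non-monotonic effects the paper reports.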
Abstract:
High strength low alloy steels have been shown to be adversely affected by the existence of regions of poor impact toughness within the heat affected zone (HAZ) produced during multipass welding. One of these regions is the intercritically reheated coarse grained HAZ, or intercritical zone. Since this region is generally narrow and discontinuous, of the order of 0.5 mm in width, weld simulators are often employed to produce a larger volume of uniform microstructure suitable for toughness assessment. The steel used for this study was a commercial quenched and tempered steel of 450 MN m⁻² yield strength. Specimen blanks were subjected to a simulated welding cycle to produce a coarse grained structure of upper bainite during the first thermal cycle, followed by a second thermal cycle in which the peak temperature Tp2 was controlled. Charpy tests carried out for Tp2 values in the range 650-850°C showed low toughness for Tp2 values between 760 and 790°C, in the intercritical regime. Microstructural investigation of the development of the grain boundary martensite-retained austenite (MA) phase has been coupled with image analysis to measure the volume fraction of MA formed. Most of the MA constituent appears at the prior austenite grain boundaries during intercritical heating, resulting in a 'necklace' appearance. For values of Tp2 greater than 790°C the necklace appearance is lost and the second phase areas are observed throughout the structure. Concurrent with this is the development of the fine grained, predominantly ferritic structure that is associated with the improvement in toughness. At this stage the microstructure is transforming from the intercritical regime structure to the supercritically reheated coarse grained HAZ structure. The toughness improvement occurs even though the MA phase is still present, suggesting that the embrittlement is associated with the presence of a connected grain boundary network of the MA phase. The nature of the second phase particles can be controlled by the cooling rate during the second cycle and varies from MA phase at high cooling rates to a pearlitic structure at low cooling rates. The lowest toughness of the intercritical zone is observed only when MA phase is present. The suggested reason for this is that only the MA particles debond readily, with a number of debonded particles in close proximity providing sufficient stress concentration to initiate local cleavage. © 1993 The Institute of Materials.
Abstract:
Determining an appropriate research methodology is considered an important element of a research study, especially a doctoral research study. It involves the approach to the entire research process, starting from theoretical underpinnings, spanning data collection and analysis, and extending to developing solutions for the problems investigated. Research methodology is, in essence, focused on the problems to be investigated in a study and therefore varies according to those problems. Thus, identifying the research methodology that best suits the research at hand is important, not only because it helps achieve the set objectives of the research, but also because it serves to establish the credibility of the work. Research philosophy, approach, strategy, choice, and techniques are inherent components of the methodology. Research strategy provides the overall direction of the research, including the process by which the research is conducted. Case study, experiment, survey, action research, grounded theory and ethnography are examples of such research strategies. Case study is documented as an empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between phenomenon and context are not clearly evident. Case study was adopted as the overarching research strategy in a doctoral study developed to investigate the resilience of construction Small and Medium-sized Enterprises (SMEs) in the UK to extreme weather events (EWEs). The research sought to investigate how construction SMEs are affected by EWEs, how they respond to the risk of EWEs, and the means of enhancing their resilience to future EWEs. It is argued, by comparing and contrasting with the alternative strategies available, that utilising the case study strategy benefits the research in achieving its set objectives and answering the research questions raised. It is also claimed that the selected strategy contributes towards addressing the call for improved methodological pluralism in construction management research, enhancing the understanding of the complex network of relationships pertinent to the industry and the phenomenon being studied.
Abstract:
How can companies help change people's behaviour in order to benefit society? Organizations have the resources and market influence to effect positive change. Through product labeling, supply chain management, cause marketing, corporate philanthropy, employee volunteerism and NGO (non-governmental organization) partnerships, companies are helping society get active, eat healthy foods, dispose of products properly, use less energy and generally live more sustainable lives. This report reveals the three conditions necessary for changing people's behaviour in ways that benefit society. The report also includes 19 mechanisms companies can use to motivate people to change and to create the capabilities and opportunities for change.
Abstract:
Smart cameras allow video data to be pre-processed on the camera instead of being sent to a remote server for further analysis. A network of smart cameras allows various vision tasks to be processed in a distributed fashion. While cameras may have different tasks, we concentrate on distributed tracking in smart camera networks. This application introduces several highly interesting problems. Firstly, how can conflicting goals be satisfied, such as when cameras in the network try to track objects while also trying to keep communication overhead low? Secondly, how can cameras in the network self-adapt in response to the behavior of objects and changes in scenarios, to ensure continued efficient performance? Thirdly, how can cameras organise themselves to improve the overall network's performance and efficiency? This paper presents a simulation environment, called CamSim, that allows distributed self-adaptation and self-organisation algorithms to be tested without setting up a physical smart camera network. The simulation tool is written in Java and hence is highly portable between different operating systems. Relaxing various computer vision and network communication problems enables a focus on implementing and testing new self-adaptation and self-organisation algorithms for cameras to use.
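To illustrate the first trade-off mentioned above (tracking quality versus communication overhead), the following is a minimal, hypothetical sketch of a handover decision between two cameras. The class and method names are invented for this sketch and do not reflect the actual CamSim API.

```java
import java.util.*;

// Illustrative sketch of a smart-camera handover decision: keep tracking an object
// locally, or hand it over to the neighbour that sees it best, minus a fixed
// penalty for the communication required. Names are hypothetical, not CamSim's.
class Camera {
    final int id;
    final Map<Camera, Double> neighbourVisibility = new HashMap<>(); // how well each neighbour sees the object
    double ownVisibility;                 // how well this camera currently sees the object
    static final double COMM_COST = 0.2;  // penalty charged for a handover message

    Camera(int id) { this.id = id; }

    // Decide whether to keep the object or hand it to the best-placed neighbour.
    Camera decideOwner() {
        Camera best = this;
        double bestUtility = ownVisibility;
        for (Map.Entry<Camera, Double> e : neighbourVisibility.entrySet()) {
            double utility = e.getValue() - COMM_COST;  // neighbour utility minus communication overhead
            if (utility > bestUtility) { bestUtility = utility; best = e.getKey(); }
        }
        return best;
    }
}

public class HandoverDemo {
    public static void main(String[] args) {
        Camera a = new Camera(1), b = new Camera(2);
        a.ownVisibility = 0.4;
        a.neighbourVisibility.put(b, 0.9);   // camera 2 sees the object much better
        System.out.println("Object should be tracked by camera " + a.decideOwner().id);
    }
}
```

Raising COMM_COST in such a sketch makes cameras hold on to objects longer, which is exactly the kind of self-adaptation parameter a simulator like CamSim lets researchers experiment with.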
Abstract:
The entorhinal cortex (EC) is a key brain area controlling both hippocampal input and output via neurones in layer II and layer V, respectively. It is also a pivotal area in the generation and propagation of epilepsies involving the temporal lobe. We have previously shown that within the network of the EC, neurones in layer V are subject to powerful synaptic excitation but weak inhibition, whereas the reverse is true in layer II. The deep layers are also highly susceptible to acutely provoked epileptogenesis. Considerable evidence now points to a role of spontaneous background synaptic activity in control of neuronal, and hence network, excitability. In the present article we describe results of studies where we have compared background release of the excitatory transmitter, glutamate, and the inhibitory transmitter, GABA, in the two layers, the role of this background release in the balance of excitability, and its control by presynaptic auto- and heteroreceptors on presynaptic terminals. © The Physiological Society 2004.
Abstract:
This paper focuses on a parallel Java implementation of a processor defined in a Network of Evolutionary Processors. The processor description is based on JDom, which provides a complete, Java-based solution for accessing, manipulating, and outputting XML data from Java code. Communication among different processors, needed to obtain a fully functional simulation of a Network of Evolutionary Processors, will be treated in future work. A thread-safe model of the processor performs all parallel operations, such as applying rules and filters. Non-deterministic behavior of the processor is achieved with a thread for each rule and for each filter (input and output). Different results of a processor evolution are shown.
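As a rough illustration of the "one thread per rule" idea, the sketch below runs two substitution rules in separate threads over a shared word pool, so the order in which the rules fire, and therefore the evolved pool, can differ between runs. The rules, data, and class names are illustrative and are not taken from the paper's implementation.

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal sketch: each evolutionary rule runs in its own thread and rewrites the
// words in a shared pool. Because the rules interact (a->b feeds b->c), the final
// pool depends on which rule thread runs first, giving non-deterministic results.
public class RuleThreadsDemo {
    public static void main(String[] args) throws InterruptedException {
        Set<String> pool = Collections.synchronizedSet(new HashSet<>(List.of("abc", "bba")));

        // Two simple substitution rules, each executed by its own thread.
        Runnable ruleAtoB = () -> rewrite(pool, "a", "b");
        Runnable ruleBtoC = () -> rewrite(pool, "b", "c");

        ExecutorService executor = Executors.newFixedThreadPool(2);
        executor.submit(ruleAtoB);
        executor.submit(ruleBtoC);
        executor.shutdown();
        executor.awaitTermination(1, TimeUnit.SECONDS);

        System.out.println("Evolved pool: " + pool);  // differs depending on rule ordering
    }

    // Apply one substitution rule to every word currently in the pool.
    static void rewrite(Set<String> pool, String from, String to) {
        synchronized (pool) {
            Set<String> evolved = new HashSet<>();
            for (String w : pool) evolved.add(w.replace(from, to));
            pool.clear();
            pool.addAll(evolved);
        }
    }
}
```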
Abstract:
The article presents the theoretical aspects of designing systems for the remote diagnosis of students' knowledge levels, drawing on modern pedagogical, theoretical and technological approaches. Practical results from building systems of this type are presented for organising testing both within the local networks of higher educational institutions and with access through the global Internet.
Abstract:
Retailers increasingly recognize that environmental responsibility is a strategic imperative. However, little research has investigated or identified the factors that facilitate the successful implementation of environmentally responsible strategies across a network of customer-facing sales units (stores). We propose that a store manager’s ability to lead by example facilitates this process by fostering a supportive climate for store environmental stewardship (SENS-climate). By examining the influence of store managers’ actions on sales associates’ perceptions of the SENS-climate, as well as the subsequent impact on their performance—measured by margins, as well as sales of green and regular products—this study demonstrates that store managers can foster a SENS-climate by articulating their prioritization of environmental responsibility in their operational decisions. These positive effects are sustained by relational factors, such as the moderating effect of the store manager–sales associate dyadic tenure. In contrast, when store managers display high variability in their environmental orientation, it hinders the development of SENS-climate perceptions among sales associates. If sales associates perceive an enabling SENS-climate, they achieve higher margins and more green but fewer regular sales.
Abstract:
Distributed and/or composite web applications are driven by intercommunication via web services, which employ application-level protocols such as SOAP. However, these protocols usually rely on classic HTTP for transport. HTTP is quite efficient at what it was designed for, delivering web page content, but it was never intended to carry complex web-service-oriented communication. Today there are modern protocols that are a much better fit for the job. One such candidate is XMPP. It is an XML-based, asynchronous, open protocol that has built-in security and authentication mechanisms and utilizes a network of federated servers. Sophisticated asynchronous multi-party communication patterns can be established, effectively aiding web service developers. This paper's purpose is to demonstrate, through facts, comparisons, and practical examples, that XMPP is not only better suited than HTTP to serve as middleware for web service protocols, but can also contribute to the overall development state of web services.
Abstract:
Reliability modelling and verification is indispensable in modern manufacturing, especially for reducing product development risk. After discussing the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework for manufacturing process reliability and product quality is presented together with a product development and reliability verification process. According to their roles in manufacturing processes, key characteristics (KCs) are organised into four clusters, that is, product KCs, material KCs, operation KCs and equipment KCs, which together represent the process knowledge network of the manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure, showing how the methodology is applied to manage and deploy production resources.
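For context, the standard ANP formulation, which this kind of method builds on (the paper's exact model is not given in the abstract), assembles local priority vectors from pairwise comparisons of the clustered elements (here, the KC clusters) into a supermatrix and derives global priorities from its limit:

```latex
% Standard ANP sketch; the paper's specific model may differ.
\[
W =
\begin{pmatrix}
W_{11} & W_{12} & \cdots & W_{1n}\\
W_{21} & W_{22} & \cdots & W_{2n}\\
\vdots & \vdots  & \ddots & \vdots\\
W_{n1} & W_{n2} & \cdots & W_{nn}
\end{pmatrix},
\qquad
W^{\infty} = \lim_{k \to \infty} W^{k},
\]
% where the blocks W_{ij} hold local priority vectors between clusters, W is weighted
% to be column-stochastic, and the columns of the limit matrix give the global
% priorities that can then weight the reliability requirements of the KCs.
```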
Abstract:
This paper details work carried out to verify the dimensional measurement performance of the Indoor GPS (iGPS) system, a network of Rotary-Laser Automatic Theodolites (R-LATs). Initially, tests were carried out to determine the angular uncertainties of an individual R-LAT transmitter-receiver pair. A method is presented for determining the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer, with the multilateration technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. The method was found to be practical and able to establish that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. Further tests carried out on a highly optimized version of the iGPS system have shown that the coordinate uncertainty can be reduced to 0.25 mm at a 95% confidence level.
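For reference, the usual GUM convention behind an "expanded uncertainty at a 95% confidence level" is given below; the abstract does not state the authors' exact evaluation procedure, so this is only the standard relation rather than their specific method:

```latex
% Expanded uncertainty per the GUM: coverage factor times combined standard uncertainty.
\[
U = k\,u_c(y), \qquad k \approx 2 \;\;\text{for a coverage probability of about } 95\%,
\]
% so a reported U of 1 mm corresponds to a combined standard uncertainty u_c of roughly 0.5 mm.
```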
Abstract:
This paper details a method of estimating the uncertainty of dimensional measurement for a three-dimensional coordinate measurement machine. An experimental procedure was developed to compare three-dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer, with a multilateration-like technique employed to establish three-dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. Specifically, a distributed coordinate measurement device was tested, consisting of a network of Rotary-Laser Automatic Theodolites (R-LATs); this system is known commercially as indoor GPS (iGPS). The method was found to be practical and was used to estimate that the uncertainty of measurement for the basic iGPS system is approximately 1 mm at a 95% confidence level throughout a measurement volume of approximately 10 m × 10 m × 1.5 m. © 2010 IOP Publishing Ltd.
Abstract:
Advances in the area of industrial metrology have generated new technologies that are capable of measuring components with complex geometry and large dimensions. However, no standard or best-practice guides are available for the majority of such systems. Therefore, these new systems require appropriate testing and verification in order for the users to understand their full potential prior to their deployment in a real manufacturing environment. This is a crucial stage, especially when more than one system can be used for a specific measurement task. In this paper, two relatively new large-volume measurement systems, the mobile spatial co-ordinate measuring system (MScMS) and the indoor global positioning system (iGPS), are reviewed. These two systems utilize different technologies: the MScMS is based on ultrasound and radiofrequency signal transmission and the iGPS uses laser technology. Both systems have components with small dimensions that are distributed around the measuring area to form a network of sensors allowing rapid dimensional measurements to be performed in relation to large-size objects, with typical dimensions of several decametres. The portability, reconfigurability, and ease of installation make these systems attractive for many industries that manufacture large-scale products. In this paper, the major technical aspects of the two systems are briefly described and compared. Initial results of the tests performed to establish the repeatability and reproducibility of these systems are also presented. © IMechE 2009.