770 results for Forestry machine manufacturing


Relevance: 20.00%

Abstract:

Structural, organizational, and technological changes in British industry during the interwar years led to a decline in skilled and physically demanding work, while there was a dramatic expansion in unskilled and semiskilled employment. Previous authors have noted that the new un/semiskilled jobs were generally filled by “fresh” workers recruited from outside the core manufacturing workforce, though there is considerable disagreement regarding the composition of this new workforce. This paper examines labour recruitment patterns and strategies using national data and case studies of eight rapidly expanding industrial centres. The new industrial workforce is shown to have been recruited from a “reserve army” of workers with the common features of relative cheapness, flexibility, and weak unionization. These included women, juveniles, local workers in poorly paid nonindustrial sectors, such as agriculture, and (where these other categories were in short supply) relatively young long-distance internal migrants from declining industrial areas.

Relevance: 20.00%

Abstract:

This paper examines how implant and electrode technology can be employed to create biological brains for robots, to enable human enhancement, and to diminish the effects of certain neural illnesses. In all cases the end result is to increase the range of abilities of the recipients. It indicates a number of areas in which such technology has already had a profound effect, a key element being the need for a clear interface linking a biological brain directly with computer technology. The emphasis is on practical scientific studies that have been, and are being, undertaken and reported. The focus is on electrode technology, where either a connection is made directly with the cerebral cortex and/or nervous system, or implants into the human body are involved. The paper also considers robots with biological brains, in which human neurons are employed as the sole thinking machine for a real-world robot body.

Relevance: 20.00%

Abstract:

This paper explores a novel tactile human-machine interface based on the controlled stimulation of mechanoreceptors by a subdermal magnetic implant manipulated through an external electromagnet. The selection of a suitable implant magnet and implant site is discussed and an external interface for manipulating the implant is described. The paper also reports on the basic properties of such an interface, including magnetic field strength sensitivity and frequency sensitivity obtained through experimentation on two participants. Finally, the paper presents two practical application scenarios for the interface.

Relevance: 20.00%

Abstract:

The invention relates to immunoassays, methods for carrying out immunoassays, immunoassay kits and methods for manufacturing immunoassay kits. In particular, the invention has relevance to capillary (especially microcapillary) immunoassay technology.

Relevance: 20.00%

Abstract:

This paper investigates the regional characteristics of Indian manufacturing industry. Its aim is to assess whether geography plays any major role in determining the performance or characteristics of Indian manufacturing firms, and in order to do this, it presents the results of cross-section regressions estimated on the basis of a balanced sample of 1607 firms across the 30 Indian states. The results suggest that firm performance and characteristics are related to many of the expected industrial organization variables. However, there is also evidence of significant region–state influences on both the performance and characteristics of Indian manufacturing industry. As such, the results demonstrate that analyses which focus solely on standard non-spatial industrial organization variables will fail to explain much of the cross-sectional variation in firm performance and characteristics. In particular, while there are no systematic simple centre–periphery variations in the Indian regional economic system, there is evidence to suggest that industrial spatial concentration, regional specialization, and regional market size play a key role in determining the performance and characteristics of Indian manufacturing industry.
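
To make the estimation strategy concrete, the following is a minimal sketch of a cross-section regression of firm performance on industrial-organization controls plus region–state dummies. The column names (`performance`, `firm_size`, `firm_age`, `capital_intensity`, `state`) and the data file are hypothetical and do not reproduce the paper's actual specification.

```python
# Illustrative cross-section regression with region-state dummies.
# Column names and data file are hypothetical, not the paper's specification.
import pandas as pd
import statsmodels.formula.api as smf

firms = pd.read_csv("indian_firms.csv")  # hypothetical balanced firm-level sample

# Industrial-organization controls plus C(state) fixed effects capturing
# region-state influences on firm performance.
model = smf.ols(
    "performance ~ firm_size + firm_age + capital_intensity + C(state)",
    data=firms,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

print(model.summary())
```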

Relevance: 20.00%

Abstract:

Purpose – The purpose of this study is to examine the relationship between business-level strategy and organisational performance and to test the applicability of Porter's generic strategies in explaining differences in the performance of organisations. Design/methodology/approach – The study focussed on UK manufacturing firms in the electrical and mechanical engineering sectors. Data were collected through a postal survey of 124 organisations, with all respondents at CEO level. Both objective and subjective measures were used to assess performance. Non-response bias was assessed statistically and was not found to be a major problem affecting this study. Appropriate measures were taken to guard against common method variance (CMV), and statistical tests indicated that CMV did not affect the results. Findings – The results indicate that firms adopting one of the generic strategies, namely cost-leadership or differentiation, perform better than "stuck-in-the-middle" firms which do not have a dominant strategic orientation. The integrated strategy group shows lower performance than cost-leaders and differentiators on financial performance measures, supporting Porter's view that combination strategies are unlikely to be effective in organisations. However, the cost-leadership and differentiation strategies were not strongly correlated with the financial performance measures, indicating the limitations of Porter's generic strategies in explaining performance heterogeneity across organisations. Originality/value – This study contributes to the literature by identifying gaps through a systematic literature review and addressing them.

Relevance: 20.00%

Abstract:

In the ten years since the first edition of this book appeared there have been significant developments in food process engineering, notably in biotechnology and membrane application. Advances have been made in the use of sensors for process control, and the growth of information technology and on-line computer applications continues apace. In addition, plant investment decisions are increasingly determined by quality assurance considerations and have to incorporate a greater emphasis on health and safety issues. The content of this edition has been rearranged to include descriptions of recent developments and to reflect the influence of new technology on the control and operations of automated plant. Original examples have been retained where relevant and these, together with many new illustrations, provide a comprehensive guide to good practice.

Relevance: 20.00%

Abstract:

In this paper we consider transcripts originating from a practical series of Turing's Imitation Game held on 23rd June 2012 at Bletchley Park, England. In some cases the tests involved a 3-participant simultaneous comparison of two hidden entities, whereas others were the result of a direct 2-participant interaction. Each of the transcripts considered here resulted in a human interrogator being fooled, by a machine, into concluding that they had been conversing with a human. Particular features of the conversation are highlighted, successful ploys on the part of each machine are discussed, and likely reasons for the interrogator being fooled are considered. Subsequent feedback from the interrogators involved is also included.

Relevance: 20.00%

Abstract:

Understanding how and why one set of business resources, with its structural arrangements and mechanisms, delivers capability better than another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson's theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the 'how' and 'why' of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition, and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the capabilities of resources required to inject a drug and anaesthetise a patient.
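
As a toy illustration of the Petri-net view of capability described above, the sketch below models the inject-and-anaesthetise capability as a single transition that can fire only when tokens for the required people, object, and patient-state resources are all present. The place and transition names are invented for illustration and are not taken from the paper's CAM or CPN model.

```python
# Minimal token/place/transition sketch of a capability as a Petri-net firing rule.
# Place and token names are illustrative, not taken from the paper's CAM model.

places = {
    "clinician_available": 1,    # people resource
    "syringe_prepared": 1,       # physical object resource
    "patient_conscious": 1,      # patient state
    "patient_anaesthetised": 0,  # outcome place
}

# A transition consumes tokens from its input places and produces tokens in its
# output places; it is enabled only if every input place currently holds a token.
inject_and_anaesthetise = {
    "inputs": ["clinician_available", "syringe_prepared", "patient_conscious"],
    "outputs": ["patient_anaesthetised", "clinician_available"],
}

def enabled(transition, marking):
    return all(marking[p] > 0 for p in transition["inputs"])

def fire(transition, marking):
    if not enabled(transition, marking):
        raise ValueError("transition not enabled: required resources missing")
    for p in transition["inputs"]:
        marking[p] -= 1
    for p in transition["outputs"]:
        marking[p] += 1
    return marking

places = fire(inject_and_anaesthetise, places)
print(places)  # patient_anaesthetised becomes 1 only if all affordances were present
```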

Relevance: 20.00%

Abstract:

This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem involving spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required in a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
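
For orientation, the sketch below shows the core of a plain real-valued extreme learning machine: a fixed random hidden layer followed by a regularised least-squares solve for the output weights. It uses synthetic data and is only a simplified stand-in for the complex-valued classifier evaluated in the paper.

```python
# Minimal real-valued ELM sketch: random hidden layer + least-squares output weights.
# Synthetic data; illustrative only, not the paper's complex-valued classifier.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))               # 200 "spectra", 16 features
y = (X[:, 0] + X[:, 1] > 0).astype(int)      # synthetic binary labels
T = np.eye(2)[y]                             # one-hot targets

n_hidden = 64
W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (never trained)
b = rng.normal(size=n_hidden)                # random biases

def hidden(X):
    return np.tanh(X @ W + b)                # hidden-layer activations H

H = hidden(X)
C = 1.0                                      # regularisation parameter
# Output weights from the regularised least-squares solution (the ELM training step):
beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ T)

pred = hidden(X) @ beta
print("training accuracy:", np.mean(pred.argmax(axis=1) == y))
```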

Relevance: 20.00%

Abstract:

We extend extreme learning machine (ELM) classifiers to complex Reproducing Kernel Hilbert Spaces (RKHS) in which the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM), suitable for complex-valued multiple-input–multiple-output processing is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger calculus approach formulated as a constrained optimization problem, similarly to the conventional ELM classifier formulation. When training the CELM, the Karush–Kuhn–Tucker (KKT) theorem is used to solve the dual optimization problem, which consists of simultaneously minimizing the training error and the norm of the output weights. The proposed formulation also addresses aspects of quaternary classification within a Clifford algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyper-planes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyper-planes through orthogonal projections; the six hyper-planes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, which are induced by projecting the chosen complex kernel across the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers compared to their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts, in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because of their ability to perform classification tasks fast, the proposed formulations are of interest for real-time applications.
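
A rough sketch of the kernel-based training step referred to above (output weights obtained from a regularised linear system) is given below, with complex-valued inputs handled by the simple expedient of stacking real and imaginary parts into a real feature vector. This is an assumption-laden simplification for illustration and does not reproduce the Wirtinger-calculus, induced-RKHS construction of the CELM.

```python
# Kernel-ELM-style training sketch for complex-valued inputs.
# The complex inputs are mapped to real vectors by stacking real and imaginary
# parts; this is a simplification, not the paper's induced-RKHS construction.
import numpy as np

rng = np.random.default_rng(1)
Z = rng.normal(size=(150, 8)) + 1j * rng.normal(size=(150, 8))  # complex features
y = (np.abs(Z[:, 0]) > 1.0).astype(int)                          # synthetic labels
T = np.eye(2)[y]                                                 # one-hot targets

X = np.hstack([Z.real, Z.imag])            # real representation of complex inputs

def gaussian_kernel(A, B, gamma=0.1):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

C = 10.0                                   # regularisation parameter
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(np.eye(len(X)) / C + K, T)  # dual (KKT) solution

pred = K @ alpha                           # decision values on the training set
print("training accuracy:", np.mean(pred.argmax(axis=1) == y))
```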

Relevance: 20.00%

Abstract:

Sponge cakes have traditionally been manufactured using multistage mixing methods to enhance potential foam formation by the eggs. Today, use of all-in (single-stage) mixing methods is superseding multistage methods for large-scale batter preparation to reduce costs and production time. In this study, multistage and all-in mixing procedures and three final high-speed mixing times (3, 5, and 15 min) for sponge cake production were tested to optimize a mixing method for pilot-scale research. Mixing for 3 min produced batters with higher relative density values than did longer mixing times. These batters generated well-aerated cakes with high volume and low hardness. In contrast, after 5 and 15 min of high-speed mixing, batters with lower relative density and higher viscosity values were produced. Although higher bubble incorporation and retention were observed, longer mixing times produced better developed gluten networks, which stiffened the batters and inhibited bubble expansion during mixing. As a result, these batters did not expand properly and produced cakes with low volume, dense crumb, and high hardness values. Results for all-in mixing were similar to those for the multistage mixing procedure in terms of the physical properties of batters and cakes (i.e., relative density, elastic moduli, volume, total cell area, hardness, etc.). These results suggest the all-in mixing procedure with a final high-speed mixing time of 3 min is an appropriate mixing method for pilot-scale sponge cake production. The advantages of this method are reduced energy costs and production time.