59 results for computer processing of language

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Native speakers learn their mother tongue slowly, from birth, by the constant repetition of common words and phrases in a variety of contexts and situations, within the language community. As foreign language learners, we face considerable disadvantages compared to children learning their mother tongue. Foreign language learners start later in life, have less time, have fewer opportunities to experience the language, and learn in the restricted environment of the classroom. Teachers and books give us information about many words and phrases, but it is difficult for us to know what we need to focus on and learn thoroughly, and what is less important. The rules and explanations are often difficult for us to understand. A large language corpus represents roughly the amount and variety of language that a native speaker experiences in a whole lifetime. Learners can now access language corpora. We can check which words and phrases are important, and quickly discover their common meanings, collocations, and structural patterns. It is easier to remember things that we find out ourselves than things that teachers or books tell us. Each click on the computer keyboard can show us the same information in different ways, so we can understand it more easily. We can also get many more examples from a corpus. Teachers and native speakers can also use corpora to confirm and enhance their own knowledge of a language and to prepare exercises to guide their students. Each of us can learn at our own level and at our own speed.
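
The kind of corpus look-up described above can be sketched with standard tools. The following is a minimal example using NLTK's concordance and collocation facilities; the corpus file name, the query word and the frequency cut-off are illustrative assumptions, not details from the original work.

```python
# Minimal sketch of a learner's corpus query: concordance lines for one word,
# then the strongest bigram collocations in the corpus. The file "corpus.txt",
# the query word "run" and the cut-off of 5 are hypothetical choices.
import nltk
from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

with open("corpus.txt", encoding="utf-8") as f:   # hypothetical plain-text corpus
    words = f.read().lower().split()              # simple whitespace tokenisation

# Concordance: see a word in its contexts, as a learner would in a corpus tool.
nltk.Text(words).concordance("run", lines=10)

# Collocations: which word pairs occur together more often than chance?
finder = BigramCollocationFinder.from_words(words)
finder.apply_freq_filter(5)                       # ignore rare pairs
print(finder.nbest(BigramAssocMeasures().likelihood_ratio, 20))
```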

Relevance:

100.00%

Publisher:

Abstract:

The thesis describes an investigation into methods for the specification, design and implementation of computer control systems for flexible manufacturing machines comprising multiple, independent, electromechanically driven mechanisms. An analysis is made of the elements of conventional mechanically coupled machines so that the operational functions of these elements may be identified. This analysis is used to define the scope of requirements necessary to specify the format, function and operation of a flexible, independently driven mechanism (IDM) machine. A discussion is presented of how this type of machine can accommodate the modern manufacturing needs of high speed and flexibility. A sequential method of capturing requirements for such machines is detailed, based on a hierarchical partitioning of machine requirements from product to independent drive mechanism. A classification of mechanisms is described using notations, including data flow diagrams and Petri nets, which supports capture and allows validation of requirements. A generic design for a modular IDM machine controller is derived, based upon the hierarchy of control identified in these machines. A two-mechanism experimental machine is detailed and used to demonstrate the application of the specification, design and implementation techniques. A prototype computer controller and a fully flexible implementation for the IDM machine, based on Petri-net models described using the concurrent programming language Occam, are detailed. The ability of this modular computer controller to support flexible, safe and fault-tolerant operation of the two intermittent-motion, discrete-synchronisation independent drive mechanisms is presented. The application of the machine development methodology to industrial projects is established.
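
The controller is based on Petri-net models implemented in the concurrent language Occam; the short Python sketch below illustrates only the underlying token-firing idea for synchronising two independently driven mechanisms, with place and transition names invented for illustration.

```python
# Minimal Petri-net sketch (illustrative only): two independent drive mechanisms
# must both reach a "ready" place before a shared "synchronise" transition fires.
marking = {"mech1_ready": 0, "mech2_ready": 0, "cycle_done": 0}

transitions = {
    "mech1_completes": {"inputs": [], "outputs": ["mech1_ready"]},
    "mech2_completes": {"inputs": [], "outputs": ["mech2_ready"]},
    "synchronise":     {"inputs": ["mech1_ready", "mech2_ready"],
                        "outputs": ["cycle_done"]},
}

def enabled(t):
    # A transition is enabled when every input place holds at least one token.
    return all(marking[p] > 0 for p in transitions[t]["inputs"])

def fire(t):
    assert enabled(t), f"transition {t} not enabled"
    for p in transitions[t]["inputs"]:
        marking[p] -= 1
    for p in transitions[t]["outputs"]:
        marking[p] += 1

fire("mech1_completes")
fire("mech2_completes")
fire("synchronise")    # only possible once both mechanisms are ready
print(marking)         # {'mech1_ready': 0, 'mech2_ready': 0, 'cycle_done': 1}
```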

Relevance:

100.00%

Publisher:

Abstract:

This study was concerned with the computer automation of land evaluation. This is a broad subject with many issues to be resolved, so the study concentrated on three key problems: knowledge-based programming; the integration of spatial information from remote sensing and other sources; and the inclusion of socio-economic information in the land evaluation analysis. Land evaluation and land use planning were considered in the context of overseas projects in the developing world. Knowledge-based systems were found to provide significant advantages over conventional programming techniques for some aspects of the land evaluation process. Declarative languages, in particular Prolog, were ideally suited to the integration of social information, which changes with every situation. Rule-based expert system shells were also found to be suitable for this role, including knowledge acquisition at the interview stage. All the expert system shells examined imposed severe limits on problem size, but new products now overcome this. Inductive expert system shells were useful as a guide to knowledge gaps and possible relationships, but the number of examples required was unrealistic for typical land use planning situations. The accuracy of classified satellite imagery was significantly enhanced by integrating spatial information on soil distribution for data from Thailand. Estimates of the rice-producing area were substantially improved (a 30% change in area) by the addition of soil information. Image processing work on Mozambique showed that satellite remote sensing was a useful tool for stratifying vegetation cover at provincial level to identify key development areas, but its full utility could not be realised on typical planning projects without treating it as part of a complete spatial information system.
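
As a rough illustration of the declarative, rule-based style described above (realised in the study with Prolog and expert-system shells), the following sketch holds land-suitability rules as data; the crops, soil attributes and thresholds are invented for illustration.

```python
# Rough sketch of rule-based land suitability evaluation. All crops, attributes
# and thresholds are invented; the study itself used Prolog and expert-system shells.
rules = {
    "paddy_rice": lambda land: land["soil_drainage"] == "poor" and land["rainfall_mm"] >= 1000,
    "maize":      lambda land: land["soil_drainage"] == "good" and land["slope_pct"] < 8,
}

def evaluate(land):
    # Return every crop whose suitability rule is satisfied by this land unit.
    return [crop for crop, rule in rules.items() if rule(land)]

land_unit = {"soil_drainage": "poor", "rainfall_mm": 1400, "slope_pct": 2}
print(evaluate(land_unit))   # ['paddy_rice']
```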

Relevance:

100.00%

Publisher:

Abstract:

A sieve plate distillation column has been constructed and interfaced to a minicomputer with the necessary instrumentation for dynamics, estimation and control studies, with particular emphasis on low-cost, noise-free instrumentation. A dynamic simulation of the column with a binary liquid system has been compiled using deterministic models that include fluid dynamics via Brambilla's equation for tray liquid holdup calculations. The simulation predictions have been tested experimentally under steady-state and transient conditions. The simulator's predictions of the tray temperatures have shown reasonably close agreement with the measured values under steady-state conditions and in the face of a step change in the feed rate. A method of extending linear filtering theory to highly nonlinear systems with very nonlinear measurement relationships has been proposed and tested by simulation on binary distillation. The simulation results show that the proposed methodology can overcome the instability problems typically associated with Kalman filters. Three extended Kalman filters have been formulated and tested by simulation. The filters have been used to refine a much-simplified model sequentially and to estimate parameters such as the unmeasured feed composition using information from the column simulation. It is first assumed that corrupted tray composition measurements are made available to the filter, and then corrupted tray temperature measurements are used instead. The simulation results demonstrate the powerful capability of the Kalman filters to overcome the typical hardware problems associated with the operation of on-line analyzers in distillation dynamics and control by, in effect, replacing them. A method of implementing estimator-aided feedforward (EAFF) control schemes has been proposed and tested by simulation on binary distillation. The results show that the EAFF scheme provides much better control and energy conservation than conventional feedback temperature control in the face of a sustained step change in the feed rate or multiple changes in the feed rate, composition and temperature. Further extensions of this work are recommended with regard to simulation, estimation and EAFF control.
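
For readers unfamiliar with the technique, the following is a generic discrete-time extended-Kalman-filter sketch, not the thesis's column model: the state-transition and measurement functions are simple placeholders standing in for the simplified column dynamics and a tray temperature measurement.

```python
# Generic extended Kalman filter sketch (placeholder model, not the thesis's
# distillation simulator): estimate a slowly varying unmeasured input such as
# feed composition from a noisy temperature-like measurement.
import numpy as np

def f(x):   # placeholder nonlinear state transition: [tray comp., feed comp.]
    return np.array([x[0] + 0.1 * (x[1] - x[0]), x[1]])

def h(x):   # placeholder nonlinear measurement: a temperature depending on composition
    return np.array([370.0 - 20.0 * x[0]])

def jacobian(func, x, eps=1e-6):
    # Numerical Jacobian by forward differences.
    n, m = len(x), len(func(x))
    J = np.zeros((m, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (func(x + dx) - func(x)) / eps
    return J

x = np.array([0.5, 0.5])        # initial state estimate
P = np.eye(2) * 0.1             # initial estimate covariance
Q = np.eye(2) * 1e-4            # process noise covariance
R = np.array([[0.25]])          # measurement noise covariance

def ekf_step(x, P, z):
    # Predict.
    F = jacobian(f, x)
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update with measurement z.
    H = jacobian(h, x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

x, P = ekf_step(x, P, np.array([361.0]))   # one update with a noisy measurement
print(x)
```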

Relevance:

100.00%

Publisher:

Abstract:

Caffeine is known to increase arousal, attention, and information processing, all factors implicated in facilitating persuasion. In a standard attitude-change paradigm, participants consumed an orange-juice drink that either contained caffeine (3.5 mg/kg body weight) or did not (placebo) before reading a counterattitudinal communication (anti-voluntary euthanasia). Participants then completed a thought-listing task and a number of attitude scales. The first experiment showed that those who consumed caffeine showed greater agreement with the communication (direct attitude: voluntary euthanasia) and on an issue related to, but not contained in, the communication (indirect attitude: abortion). The order in which direct and indirect attitudes were measured did not affect the results. A second experiment manipulated the quality of the arguments in the message (strong vs. weak) to determine whether systematic processing had occurred. There was evidence that systematic processing occurred in both drink conditions, but it was greater for those who had consumed caffeine. In both experiments, the amount of message-congruent thinking mediated persuasion. These results show that caffeine can increase the extent to which people systematically process, and are influenced by, a persuasive communication.

Relevance:

100.00%

Publisher:

Abstract:

Two experiments investigated the conditions under which majority and minority sources instigate systematic processing of their messages. Both experiments crossed source status (majority vs. minority) with message quality (strong vs. weak arguments). In each experiment, message elaboration was manipulated by varying either motivational (outcome relevance, Experiment 1) or cognitive (orientating tasks, Experiment 2) factors. The results showed that when either motivational or cognitive factors encouraged low message elaboration, there was heuristic acceptance of the majority position without detailed message processing. When the level of message elaboration was intermediate, there was message processing only for the minority source. Finally, when message elaboration was high, there was message processing for both source conditions. These results show that majority and minority influence is sensitive to motivational and cognitive factors that constrain or enhance message elaboration and that both sources can lead to systematic processing under specific circumstances.

Relevance:

100.00%

Publisher:

Abstract:

The rectum has a unique physiological role as a sensory organ and differs in its afferent innervation from other gut organs that do not normally mediate conscious sensation. We compared the central processing of human esophageal, duodenal, and rectal sensation using cortical evoked potentials (CEP) in 10 healthy volunteers (age range 21-34 yr). Esophageal and duodenal CEP had similar morphology in all subjects, whereas rectal CEP had two different but reproducible morphologies. The rectal CEP latency to the first component P1 (69 ms) was shorter than both the duodenal (123 ms; P = 0.008) and the esophageal (106 ms; P = 0.004) CEP latencies. The duodenal CEP amplitude of the P1-N1 component (5.0 µV) was smaller than that of the corresponding esophageal component (5.7 µV; P = 0.04) but similar to that of the corresponding rectal component (6.5 µV; P = 0.25). This suggests either that rectal sensation is mediated by faster-conducting afferent pathways or that there is a difference in the orientation or volume of cortical neurons representing the different gut organs. In conclusion, the physiological and anatomic differences between gut organs are reflected in differences in the characteristics of their afferent pathways and cortical processing.

Relevance:

100.00%

Publisher:

Abstract:

Background/Aims: Positron emission tomography has been applied to study cortical activation during human swallowing, but it employs radio-isotopes, precluding repeated experiments, and has to be performed supine, making the task of swallowing difficult. Here we describe Synthetic Aperture Magnetometry (SAM) as a novel method of localising and imaging the brain's neuronal activity from magnetoencephalographic (MEG) signals in order to study the cortical processing of human volitional swallowing in the more physiological seated position. Methods: In 3 healthy male volunteers (age 28–36), 151-channel whole-cortex MEG (Omega-151, CTF Systems Inc.) was recorded whilst seated during the conditions of repeated volitional wet swallowing (5 ml boluses at 0.2 Hz) or rest. SAM analysis was then performed using varying spatial filters (5–60 Hz) before co-registration with individual MRI brain images. Activation areas were then identified using standard stereotactic-space neuro-anatomical maps. In one subject, repeat studies were performed to confirm the initial findings. Results: In all subjects, cortical activation maps for swallowing could be generated using SAM, the strongest activations being seen with 10–20 Hz filter settings. The main cortical activations associated with swallowing were in the sensorimotor cortex (BA 3,4), insular cortex and lateral premotor cortex (BA 6,8). Of relevance, each cortical region displayed consistent inter-hemispheric asymmetry to one or other hemisphere, this being different for each region and each subject. Intra-subject comparisons of activation localisation and asymmetry showed impressive reproducibility. Conclusion: SAM analysis of MEG is an accurate, repeatable, and reproducible method for studying the brain processing of human swallowing in a more physiological manner and provides novel opportunities for future studies of the brain-gut axis in health and disease.
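
The weight calculation at the heart of SAM-style beamforming can be sketched in a few lines; the example below uses random stand-ins for the band-limited sensor covariance and the lead field of one candidate source, whereas a real analysis would use measured MEG data, a head model and regularisation.

```python
# Toy sketch of the beamformer weight calculation underlying SAM-style source
# imaging. Random stand-ins replace the band-pass-filtered MEG data and the
# lead field; only the 151-channel count is taken from the study description.
import numpy as np

rng = np.random.default_rng(0)
n_sensors = 151
data = rng.standard_normal((n_sensors, 5000))   # stand-in for filtered sensor data

C = np.cov(data)                                # sensor covariance in the chosen band
C_inv = np.linalg.pinv(C)

lead_field = rng.standard_normal(n_sensors)     # stand-in forward field of one source

# LCMV/SAM-style weights: w = C^-1 L / (L^T C^-1 L)
w = C_inv @ lead_field / (lead_field @ C_inv @ lead_field)
source_power = w @ C @ w                        # estimated power at this source location
print(source_power)
```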

Relevance:

100.00%

Publisher:

Abstract:

In the introduction to the special issue “Languaging the worker: globalized governmentalities in/of language in peripheral spaces”, we take up the notion of governmentality as a means to interrogate the complex relationship between language, labor, power and subjectivity in peripheral multilingual spaces. Our aim here is to argue for the study of governmentality as a viable and growing approach in critical sociolinguistic research. As such, in this introduction, we first discuss key concepts germane to our interrogations, including the notions of governmentality, languaging, peripherality and language worker. We proceed to map out five ethnographically and discourse-analytically informed case studies. These examine diverse actors in different settings pertaining to the domain of work. Finally we chart how the case studies construe the issue of languaging the worker through a governmentality frame.

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes the computer modelling of radio-frequency capacitively coupled methane/hydrogen plasmas and the consequences for the reactive ion etching of (100) GaAs surfaces. In addition, a range of etching experiments was undertaken over a matrix of pressure, power and methane concentration. The resulting surfaces were investigated using X-ray photoelectron spectroscopy, and the results were discussed in terms of physical and chemical models of particle/surface interactions, in addition to the predictions for the energies, angles and relative fluxes to the substrate of the various plasma species. The model consisted of a Monte Carlo code which followed electrons and ions through the plasma and sheath potentials whilst taking account of collisions with background neutral gas molecules. The ionisation profile output from the electron module was used as input for the ionic module. Momentum-scattering interactions of ions with gas molecules were investigated via different models and compared against results given by a quantum mechanical code. The interactions were treated as central-potential scattering events and the resulting neutral cascades were followed. The resulting predictions for ion energies at the cathode compared well with experimental ion energy distributions, and this verified the particular form of the electrical potentials used and their applicability to the particular plasma cell geometry used in the etching experiments. The final code was used to investigate the effect of external plasma parameters on the mass distribution, energy and angles of all species impinging on the electrodes. Comparisons of electron energies in the plasma also agreed favourably with measurements made using a Langmuir electric probe. The surface analysis showed the surfaces all to be depleted in arsenic due to its preferential removal, and the resultant Ga:As ratio in the surface was found to be directly linked to the etch rate. The etch rate itself was determined by the methane flux, which was predicted by the code.
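
A heavily simplified, one-dimensional sketch of the Monte Carlo idea is shown below: ions fall through a uniform sheath field and suffer charge-exchange-like collisions at exponentially distributed intervals. The sheath voltage, thickness, mean free path and ion mass are invented numbers; the thesis model itself tracks both electrons and ions through the full plasma and sheath potentials with momentum-scattering collisions.

```python
# Simplified 1-D Monte Carlo sketch of ions crossing a collisional sheath.
# All numerical values are illustrative assumptions, not taken from the thesis.
import numpy as np

rng = np.random.default_rng(1)
e, m_ion = 1.602e-19, 16 * 1.66e-27     # elementary charge; ~CH4+ ion mass (kg)
V_sheath, d_sheath = 300.0, 2e-3        # assumed sheath drop (V) and thickness (m)
mfp = 5e-4                              # assumed ion mean free path (m)

def ion_energy_at_cathode():
    x, v = 0.0, 0.0
    E_field = V_sheath / d_sheath       # uniform-field approximation
    while x < d_sheath:
        # Distance to the next collision drawn from an exponential distribution.
        step = min(rng.exponential(mfp), d_sheath - x)
        v = np.sqrt(v**2 + 2 * e * E_field * step / m_ion)
        x += step
        if x < d_sheath:
            v = 0.0                     # charge-exchange-like collision: restart from rest
    return 0.5 * m_ion * v**2 / e       # impact energy in eV

energies = [ion_energy_at_cathode() for _ in range(10000)]
print(np.mean(energies), np.max(energies))   # broad energy distribution below e*V_sheath
```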

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes work carried out to improve the fundamental modelling of liquid flows on distillation trays. A mathematical model is presented based on the principles of computational fluid dynamics. It models the liquid flow in the horizontal directions, allowing for the effects of the vapour through the use of an increased liquid turbulence, modelled by an eddy viscosity, and a resistance to liquid flow caused by the vapour being accelerated horizontally by the liquid. The resultant equations are similar to the Navier-Stokes equations with the addition of a resistance term. A mass-transfer model is used to calculate liquid concentration profiles and tray efficiencies. A heat and mass transfer analogy is used to compare theoretical concentration profiles to experimental water-cooling data obtained from a 2.44 metre diameter air-water distillation simulation rig. The ratios of air to water flow rates are varied in order to simulate three pressures: vacuum, atmospheric pressure and moderate pressure. For simulated atmospheric and moderate-pressure distillation, the fluid mechanical model consistently over-predicts tray efficiencies, by between +1.7% and +11.3%. This compares to -1.8% to -10.9% for the stagnant regions model (Porter et al. 1972) and +12.8% to +34.7% for the plug flow plus back-mixing model (Gerster et al. 1958). The model fails to predict the flow patterns and tray efficiencies for the vacuum simulation because of the change in the mechanism of liquid transport, from a continuous liquid layer to a spray, as the liquid flow rate is reduced. This spray is not taken into account in the development of the fluid mechanical model. A sensitivity analysis has shown that the fluid mechanical model is relatively insensitive to the prediction of the average height of clear liquid, and that a reduction in the resistance term results in a slight loss of tray efficiency, but these effects are not great. The model is quite sensitive to the prediction of the eddy viscosity term: variations can produce up to a 15% decrease in tray efficiency. The fluid mechanical model has been incorporated into a column model so that statistical optimisation techniques can be employed to fit a theoretical column concentration profile to experimental data. Through the use of this work, mass-transfer data can be obtained.
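
The abstract states that the governing equations resemble the Navier-Stokes equations with an added resistance term; one plausible depth-averaged form of such a horizontal momentum balance (the exact resistance and eddy-viscosity formulation used in the thesis may differ) is:

```latex
% Plausible depth-averaged horizontal momentum balance of the kind described above
% (illustrative form only; the thesis's exact terms may differ):
%   u = horizontal liquid velocity, p = pressure, rho = liquid density,
%   nu_e = eddy viscosity (vapour-induced turbulence), R = vapour resistance coefficient
\[
  \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu_e \nabla^{2}\mathbf{u} - R\,\mathbf{u}
\]
```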