909 results for Concept-based Terminology
Abstract:
Our eyes are input sensors which provide our brains with streams of visual data. They have evolved to be extremely efficient, constantly darting to and fro to rapidly build up a picture of the salient entities in a viewed scene. These actions are almost subconscious. However, they can provide telling signs of how the brain is decoding the visuals, and can indicate emotional responses before the viewer becomes aware of them. In this paper we discuss a method of tracking a user's eye movements and using these to calculate their gaze within an immersive virtual environment. We investigate how these gaze patterns can be captured and used to identify viewed virtual objects, and discuss how this can serve as a natural method of interacting with the virtual environment. We describe a flexible tool that has been developed to achieve this, and detail initial validating applications that prove the concept.
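One common way to turn a computed gaze direction into an identified virtual object is a ray intersection test against the scene. The sketch below (not the paper's actual tool; object names and bounding-sphere geometry are illustrative assumptions) casts the gaze ray from the eye position and returns the nearest object whose bounding sphere it hits:

```python
import math

def identify_gazed_object(eye_pos, gaze_dir, objects):
    """Return the name of the nearest object whose bounding sphere the
    gaze ray hits, or None. `objects` maps name -> (centre, radius).
    Illustrative sketch only; real scenes would use mesh-level tests."""
    norm = math.sqrt(sum(c * c for c in gaze_dir))
    d = tuple(c / norm for c in gaze_dir)  # unit gaze direction
    best = None
    for name, (centre, radius) in objects.items():
        # Vector from the eye to the sphere centre.
        oc = tuple(c - e for c, e in zip(centre, eye_pos))
        t = sum(a * b for a, b in zip(oc, d))       # projection onto the ray
        if t < 0:
            continue                                # object is behind the viewer
        closest2 = sum(a * a for a in oc) - t * t   # squared ray-to-centre distance
        if closest2 <= radius * radius and (best is None or t < best[0]):
            best = (t, name)
    return best[1] if best else None
```

Logging the returned names over time yields the gaze pattern from which viewed-object statistics can be accumulated.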
Abstract:
Many weeds occur in patches but farmers frequently spray whole fields to control the weeds in these patches. Given a geo-referenced weed map, technology exists to confine spraying to these patches. Adoption of patch spraying by arable farmers has, however, been negligible, partly due to the difficulty of constructing weed maps. Building on previous DEFRA and HGCA projects, this proposal aims to develop and evaluate a machine vision system to automate the weed mapping process. The project thereby addresses the principal technical stumbling block to widespread adoption of site-specific weed management (SSWM). The accuracy of weed identification by machine vision based on a single field survey may be inadequate to create herbicide application maps. We therefore propose to test the hypothesis that sufficiently accurate weed maps can be constructed by integrating information from geo-referenced images captured automatically at different times of the year during normal field activities. Accuracy of identification will also be increased by utilising a priori knowledge of weeds present in fields. To prove this concept, images will be captured from arable fields on two farms and processed offline to identify and map the weeds, focussing especially on black-grass, wild oats, barren brome, couch grass and cleavers. As advocated by Lutman et al. (2002), the approach uncouples the weed mapping and treatment processes and builds on the observation that patches of these weeds are quite stable in arable fields. There are three main aspects to the project. 1) Machine vision hardware. Hardware component parts of the system are one or more cameras connected to a single board computer (Concurrent Solutions LLC) and interfaced with an accurate Global Positioning System (GPS) supplied by Patchwork Technology. The camera(s) will take separate measurements for each of the three primary colours of visible light (red, green and blue) in each pixel.
The basic proof of concept can be achieved in principle using a single camera system, but in practice systems with more than one camera may need to be installed so that larger fractions of each field can be photographed. Hardware will be reviewed regularly during the project in response to feedback from other work packages and updated as required. 2) Image capture and weed identification software. The machine vision system will be attached to toolbars of farm machinery so that images can be collected during different field operations. Images will be captured at different ground speeds, in different directions and at different crop growth stages, as well as in different crop backgrounds. Having captured geo-referenced images in the field, image analysis software will be developed by Murray State and Reading Universities, with advice from The Arable Group, to identify weed species. A wide range of pattern recognition techniques, and in particular Bayesian networks, will be used to advance the state of the art in machine vision-based weed identification and mapping. Weed identification algorithms used by others are inadequate for this project as we intend to collect and correlate images collected at different growth stages. Plants grown for this purpose by Herbiseed will be used in the first instance. In addition, our image capture and analysis system will include plant characteristics such as leaf shape, size, vein structure, colour and textural pattern, some of which are not detectable by other machine vision systems or are omitted by their algorithms. Using such a list of features observable using our machine vision system, we will determine those that can be used to distinguish weed species of interest. 3) Weed mapping. Geo-referenced maps of weeds in arable fields (Reading University and Syngenta) will be produced with advice from The Arable Group and Patchwork Technology.
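A naive Bayes classifier is the simplest member of the Bayesian-network family the project proposes to use, and it also accommodates the a priori knowledge of which weeds occur in a field via the class priors. The sketch below (feature values and species labels are hypothetical, not project data) fits per-class Gaussians to leaf features and classifies a new measurement:

```python
import math
from collections import defaultdict

def train_gnb(samples):
    """samples: list of (feature_vector, species). Returns per-class
    priors plus per-feature (mean, variance) under a Gaussian naive
    Bayes assumption (features conditionally independent given class)."""
    by_class = defaultdict(list)
    for x, y in samples:
        by_class[y].append(x)
    model, total = {}, len(samples)
    for y, xs in by_class.items():
        n = len(xs)
        stats = []
        for j in range(len(xs[0])):
            col = [x[j] for x in xs]
            mu = sum(col) / n
            var = sum((v - mu) ** 2 for v in col) / n + 1e-6  # smoothed variance
            stats.append((mu, var))
        model[y] = (n / total, stats)
    return model

def classify(model, x, priors=None):
    """`priors` can override the trained class priors with a priori
    knowledge of which weed species are present in this field."""
    best, best_lp = None, -math.inf
    for y, (p, stats) in model.items():
        lp = math.log(priors.get(y, p) if priors else p)
        for v, (mu, var) in zip(x, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (v - mu) ** 2 / (2 * var)
        if lp > best_lp:
            best, best_lp = y, lp
    return best
```

With richer features (vein structure, texture) the same structure generalises to the full Bayesian-network setting.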
Natural infestations will be mapped in the fields but we will also introduce specimen plants in pots to facilitate more rigorous system evaluation and testing. Manual weed maps of the same fields will be generated by Reading University, Syngenta and Peter Lutman so that the accuracy of automated mapping can be assessed. The principal hypothesis and concept to be tested is that by combining maps from several surveys, a weed map with acceptable accuracy for end-users can be produced. If the concept is proved and can be commercialised, systems could be retrofitted at low cost onto existing farm machinery. The outputs of the weed mapping software would then link with the precision farming options already built into many commercial sprayers, allowing their use for targeted, site-specific herbicide applications. Immediate economic benefits would, therefore, arise directly from reducing herbicide costs. SSWM will also reduce the overall pesticide load on the crop and so may reduce pesticide residues in food and drinking water, and reduce adverse impacts of pesticides on non-target species and beneficials. Farmers may even choose to leave unsprayed some non-injurious, environmentally-beneficial, low density weed infestations. These benefits fit very well with the anticipated legislation emerging in the new EU Thematic Strategy for Pesticides which will encourage more targeted use of pesticides and greater uptake of Integrated Crop (Pest) Management approaches, and also with the requirements of the Water Framework Directive to reduce levels of pesticides in water bodies. The greater precision of weed management offered by SSWM is therefore a key element in preparing arable farming systems for the future, where policy makers and consumers want to minimise pesticide use and the carbon footprint of farming while maintaining food production and security.
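One simple way to combine maps from several surveys, consistent with the hypothesis above, is to grid the field and require a weed to be flagged in a minimum number of surveys before it enters the combined map, suppressing one-off misidentifications. This fusion rule and its threshold are illustrative assumptions, not the project's specified method:

```python
def combine_weed_maps(maps, min_detections=2):
    """maps: list of sets of (easting, northing) grid cells where a weed
    was detected in one geo-referenced survey. A cell enters the combined
    map once it has been flagged in at least `min_detections` surveys."""
    counts = {}
    for survey in maps:
        for cell in survey:
            counts[cell] = counts.get(cell, 0) + 1
    return {cell for cell, n in counts.items() if n >= min_detections}
```

Because the target weed patches are reported to be stable between seasons, repeated detections in the same cell are strong evidence of a genuine patch.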
The mapping technology could also be used on organic farms to identify areas of fields needing mechanical weed control, thereby reducing both carbon footprints and damage to crops by, for example, spring tines. Objectives: (i) to develop a prototype machine vision system for automated image capture during agricultural field operations; (ii) to prove the concept that images captured by the machine vision system over a series of field operations can be processed to identify and geo-reference specific weeds in the field; (iii) to generate weed maps from the geo-referenced weed plants/patches identified in objective (ii).
Abstract:
The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation and model validation. For a given model architecture such as associative memory networks, a common problem in nonlinear modelling is "the curse of dimensionality". A series of complementary data-based constructive identification schemes, mainly based on but not limited to operating-point-dependent fuzzy models, are introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture-of-experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious Delaunay input-space partition based piecewise local linear modelling concept; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on basis functions that are Bézier-Bernstein polynomial functions and the additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.
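The forward selection idea behind scheme (iii) can be illustrated in simplified form: at each step, add the candidate basis function (regressor column) that most reduces the residual sum of squares, stopping when no candidate helps. This sketch omits the explicit orthogonalisation and optimal-experimental-design criteria of the actual algorithm and refits by ordinary least squares instead:

```python
import numpy as np

def forward_selection(Phi, y, max_terms=3, tol=1e-6):
    """Greedy forward subset selection over the columns of the regressor
    matrix Phi: repeatedly add the column that most reduces the residual
    sum of squares of the least-squares fit. Returns selected indices.
    Simplified stand-in for forward orthogonal least squares."""
    n, m = Phi.shape
    selected = []
    resid = float(y @ y)  # RSS of the empty model
    for _ in range(max_terms):
        best_j, best_resid = None, resid
        for j in range(m):
            if j in selected:
                continue
            A = Phi[:, selected + [j]]
            theta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = float(np.sum((y - A @ theta) ** 2))
            if rss < best_resid - tol:
                best_j, best_resid = j, rss
        if best_j is None:
            break  # no candidate reduces the residual further
        selected.append(best_j)
        resid = best_resid
    return selected
```

Because only the few selected basis functions are kept, the model size grows with the data's structure rather than with the input dimension, which is the parsimony argument against the curse of dimensionality.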
Abstract:
This paper presents the theoretical development of a nonlinear adaptive filter based on the concept of filtering by approximated densities (FAD). The most common procedures for nonlinear estimation apply the extended Kalman filter. As opposed to conventional techniques, the proposed recursive algorithm does not require any linearisation. The prediction uses a maximum entropy principle subject to constraints. Thus, the densities created are of an exponential type and depend on a finite number of parameters. The filtering yields recursive equations involving these parameters. The update applies Bayes' theorem. Through simulation on a generic exponential model, the proposed nonlinear filter is implemented and the results prove to be superior to those of the extended Kalman filter and a class of nonlinear filters based on partitioning algorithms.
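For reference, the extended Kalman filter baseline against which FAD is compared linearises the state and measurement models at each step. A minimal scalar sketch (standard textbook EKF, not the paper's filter; `f`, `h` are the transition and measurement functions and `F`, `H` their derivatives):

```python
def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of a scalar extended Kalman filter.
    x, P: current state estimate and variance; z: new measurement;
    Q, R: process and measurement noise variances."""
    # Predict through the (possibly nonlinear) transition, linearised via F.
    x_pred = f(x)
    P_pred = F(x) * P * F(x) + Q
    # Update with the measurement, linearised via H.
    Hp = H(x_pred)
    K = P_pred * Hp / (Hp * P_pred * Hp + R)  # Kalman gain
    x_new = x_pred + K * (z - h(x_pred))
    P_new = (1.0 - K * Hp) * P_pred
    return x_new, P_new
```

The FAD approach replaces exactly this linearisation step: instead of propagating a Gaussian through local derivatives, it propagates the parameters of an exponential-type density chosen by maximum entropy.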
Abstract:
The study explores what happens to teachers' practice and professional identity when they adopt a collaborative action research approach to teaching and involve external creative partners and a university mentor. The teachers aim to nurture and develop the creative potential of their learners by empowering them to make decisions for themselves about their own progress and learning directions. The teachers worked creatively and collaboratively, designing creative teaching and learning methods in support of pupils with language and communication difficulties. The respondents are from an English special school, a primary school and a girls' secondary school. A mixed-methods methodology is adopted. Gains in teacher confidence and capability were identified, in addition to shifts in values that impacted directly on their self-concept of what it is to be an effective teacher promoting effective learning. The development of their professional identities within a team ethos included being able to make decisions about learning based on the educational potential of learners, which they demonstrated resulted in elevated standards achieved by this group of learners. They were able to justify their actions on established educational principles. Tensions, however, were revealed between what they perceived as the normal required professionalism imposed by external agencies and the enhanced professionalism experienced working through the project, where they were able to integrate theory and practice.
Abstract:
It is now established that native language affects one's perception of the world. However, it is unknown whether this effect is merely driven by conscious, language-based evaluation of the environment or whether it reflects fundamental differences in perceptual processing between individuals speaking different languages. Using brain potentials, we demonstrate that the existence in Greek of 2 color terms—ghalazio and ble—distinguishing light and dark blue leads to greater and faster perceptual discrimination of these colors in native speakers of Greek than in native speakers of English. The visual mismatch negativity, an index of automatic and preattentive change detection, was similar for blue and green deviant stimuli during a color oddball detection task in English participants, but it was significantly larger for blue than green deviant stimuli in native speakers of Greek. These findings establish an implicit effect of language-specific terminology on human color perception.
Abstract:
The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models ‘prosumer’ agents (i.e., producers and/or consumers of energy) and ‘aggregator’ agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile.
Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
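The demand-flattening result can be illustrated with a toy aggregator model (this is a deliberately minimal sketch, not CASCADE's actual agents): each household contributes a fixed half-hourly profile plus a shiftable load that the aggregator may move to any time slot, and the aggregator greedily places each shiftable load in the currently lowest-demand slot:

```python
def flatten_demand(household_profiles, shiftable_loads):
    """Toy aggregator: household_profiles is a list of fixed per-slot
    demand lists (all the same length); shiftable_loads is a list of
    movable loads (e.g. storage heaters, in kW). Each movable load is
    greedily assigned to the currently lowest-demand slot, flattening
    the aggregate profile. Returns the resulting per-slot totals."""
    slots = len(household_profiles[0])
    total = [sum(p[t] for p in household_profiles) for t in range(slots)]
    for load in shiftable_loads:
        t_min = min(range(slots), key=lambda t: total[t])
        total[t_min] += load
    return total
```

The instability finding mentioned above corresponds to the alternative design in which every household responds independently to the same wholesale price signal, so all shiftable loads pile into the same cheap slot at once; mediating through an aggregator avoids that herding.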
Abstract:
The present study aims to contribute to an understanding of the complexity of lobbying activities within the accounting standard-setting process in the UK. The paper reports detailed content analysis of submission letters to four related exposure drafts. These preceded two accounting standards that set out the concept of control used to determine the scope of consolidation in the UK, except for reporting under international standards. Regulation on the concept of control provides rich patterns of lobbying behaviour due to its controversial nature and its significance to financial reporting. Our examination is conducted by dividing lobbyists into two categories, corporate and non-corporate, which are hypothesised (and demonstrated) to lobby differently. In order to test the significance of these differences we apply ANOVA techniques and univariate regression analysis. Corporate respondents are found to devote more attention to issues of specific applicability of the concept of control, whereas non-corporate respondents tend to devote more attention to issues of general applicability of this concept. A strong association between the issues raised by corporate respondents and their line of business is revealed. Both categories of lobbyists are found to advance conceptually-based arguments more often than economic consequences-based or combined arguments. However, when economic consequences-based arguments are used, they come exclusively from the corporate category of respondents.
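The ANOVA techniques referred to above reduce, in the one-way case, to comparing between-group and within-group variation. A minimal implementation (the observation values below are hypothetical, not the study's data; groups could be, e.g., counts of control-related issues raised per letter by corporate vs non-corporate respondents):

```python
def anova_f(groups):
    """One-way ANOVA F statistic for a list of groups of observations:
    ratio of between-group mean square to within-group mean square."""
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n  # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F indicates that the two categories of lobbyists differ more between groups than within them, which is the pattern the study reports.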
Abstract:
This contribution proposes a novel probability density function (PDF) estimation based over-sampling (PDFOS) approach for two-class imbalanced classification problems. The classical Parzen-window kernel function is adopted to estimate the PDF of the positive class. Then, according to the estimated PDF, synthetic instances are generated as additional training data. The essential concept is to re-balance the class distribution of the original imbalanced data set under the principle that the synthetic data samples follow the same statistical properties as the original data. Based on the over-sampled training data, the radial basis function (RBF) classifier is constructed by applying the orthogonal forward selection procedure, in which the classifier’s structure and the parameters of RBF kernels are determined using a particle swarm optimisation algorithm based on the criterion of minimising the leave-one-out misclassification rate. The effectiveness of the proposed PDFOS approach is demonstrated by the empirical study on several imbalanced data sets.
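The core over-sampling idea can be sketched in one dimension (a minimal illustration, not the paper's full multivariate procedure): sampling from a Gaussian Parzen-window density estimate amounts to picking a stored minority sample uniformly at random and perturbing it with Gaussian noise of the kernel bandwidth, so the synthetic points follow the estimated PDF by construction:

```python
import random
import statistics

def pdfos_oversample(minority, n_new, bandwidth=None, rng=None):
    """Generate n_new synthetic 1-D samples from a Gaussian-kernel
    Parzen-window estimate of the minority class. If no bandwidth is
    given, use Silverman's rule-of-thumb (an assumption here; the paper
    has its own bandwidth selection)."""
    rng = rng or random.Random(0)
    if bandwidth is None:
        bandwidth = 1.06 * statistics.stdev(minority) * len(minority) ** -0.2
    # Sampling from the KDE: choose a kernel centre, add kernel noise.
    return [rng.choice(minority) + rng.gauss(0.0, bandwidth)
            for _ in range(n_new)]
```

Unlike interpolation-based over-sampling, this can place synthetic points slightly outside the convex hull of the minority samples, matching the estimated density's tails.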
Abstract:
Purpose – The aim of this paper is to investigate how values from within Abrahamic religions could be adopted to improve liberal market economies’ (LMEs’) corporate governance business practices. Design/methodology/approach – The concept of spiritual capitalism is explained from an Islamic perspective by adopting three universal Abrahamic values to critically analyse LMEs and offer an ethical alternative to current capitalism concerns. Findings – It is found that LMEs can be improved by considering all stakeholders, putting ethics before economics, and introducing shared risk/reward plus lower debt. Originality/value – The paper compares LMEs/Co-ordinated market economies (CMEs)/Islamic countries economies (ICEs) within an ethical framework for LMEs.
Abstract:
Home-based online business ventures are an increasingly pervasive yet under-researched phenomenon. The experiences and mindset of entrepreneurs setting up and running such enterprises require better understanding. Using data from a qualitative study of 23 online home-based business entrepreneurs, we propose the augmented concept of ‘mental mobility’ to encapsulate how they approach their business activities. Drawing on Howard P. Becker's early theorising of mobility, together with Victor Turner's later notion of liminality, we conceptualise mental mobility as the process through which individuals navigate the liminal spaces between the physical and digital spheres of work and the overlapping home/workplace, enabling them to manipulate and partially reconcile the spatial, temporal and emotional tensions that are present in such work environments. Our research also holds important applications for alternative employment contexts and broader social orderings because of the increasingly pervasive and disruptive influence of technology on experiences of remunerated work.
Abstract:
ESA’s first multi-satellite mission, Cluster, is unique in its concept of four satellites orbiting in controlled formations. This will give an unprecedented opportunity to study the structure and dynamics of the magnetosphere. In this paper we discuss ways in which ground-based remote-sensing observations of the ionosphere can be used to support the multipoint in-situ satellite measurements. There are a very large number of potentially useful configurations between the satellites and any one ground-based observatory; however, the number of ideal occurrences for any one configuration is low. Many of the ground-based instruments cannot operate continuously, and Cluster will take data only for a part of each orbit, depending on how much high-resolution (‘burst-mode’) data are acquired. In addition, there are a great many instrument modes and the formation, size and shape of the cluster of the four satellites to consider. These circumstances create a clear and pressing need for careful planning to ensure that the scientific return from Cluster is maximised by additional coordinated ground-based observations. For this reason, ESA established a working group to coordinate the observations on the ground with Cluster. We give a number of examples of how the combined spacecraft and ground-based observations can address outstanding questions in magnetospheric physics. An online computer tool has been prepared to allow for the planning of conjunctions and advantageous constellations between the Cluster spacecraft and individual or combined ground-based systems. During the mission, a ground-based database containing index and summary data will help to identify interesting datasets and allow the selection of intervals for coordinated studies. We illustrate the philosophy of our approach using a few important examples of the many possible configurations between the satellites and the ground-based instruments.
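At the heart of any conjunction-planning tool is a geometric test of whether a spacecraft's footprint falls within an observatory's field of view. The sketch below uses a hypothetical criterion (great-circle distance under a fixed threshold; both the threshold and the use of a simple haversine distance are illustrative assumptions, not the working group's actual tool):

```python
import math

def within_field_of_view(footprint, station, max_km=500.0):
    """Rough conjunction test: is the spacecraft's mapped footprint
    (lat, lon in degrees) within max_km great-circle distance of a
    ground-based observatory? Uses the haversine formula on a
    spherical Earth of mean radius 6371 km."""
    R = 6371.0
    lat1, lon1 = map(math.radians, footprint)
    lat2, lon2 = map(math.radians, station)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a)) <= max_km
```

Scanning such a test along predicted orbits for each instrument, mode and formation shape yields candidate intervals for coordinated observations.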
Abstract:
Advances in hardware and software technologies allow streaming data to be captured. The area of Data Stream Mining (DSM) is concerned with the analysis of these vast amounts of data as they are generated in real time. Data stream classification is one of the most important DSM techniques, allowing previously unseen data instances to be classified. Unlike traditional classifiers for static data, data stream classifiers need to adapt to concept changes (concept drift) in the stream in real time in order to reflect the most recent concept in the data as accurately as possible. A recent addition to the data stream classifier toolbox is eRules, which induces and updates a set of expressive rules that can easily be interpreted by humans. However, like most rule-based data stream classifiers, eRules exhibits poor computational performance when confronted with continuous attributes. In this work, we propose an approach to deal with continuous data effectively and accurately in rule-based classifiers by using the Gaussian distribution as a heuristic for building rule terms on continuous attributes. We show, on the example of eRules, that incorporating our method for continuous attributes indeed speeds up the real-time rule induction process while maintaining a similar level of accuracy compared with the original eRules classifier. We term this new version of eRules with our approach G-eRules.
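The Gaussian heuristic can be sketched as follows (a minimal illustration of the idea, not the G-eRules implementation; the choice of k is an assumption): instead of searching many candidate cut points on a continuous attribute, fit a Gaussian to the attribute's values within the target class and take mean ± k standard deviations as the rule term's interval:

```python
import statistics

def gaussian_rule_term(values, k=2.0):
    """Build an interval rule term for a continuous attribute from the
    values it takes within one class: fit a Gaussian and cover
    mean +/- k standard deviations. Returns a predicate on new values."""
    mu = statistics.fmean(values)
    sigma = statistics.pstdev(values)
    lo, hi = mu - k * sigma, mu + k * sigma
    return lambda v: lo <= v <= hi
```

Fitting two running moments per attribute is far cheaper than evaluating every candidate split, which is where the reported speed-up on continuous attributes comes from.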
Abstract:
An important application of Big Data Analytics is the real-time analysis of streaming data. Streaming data imposes unique challenges on data mining algorithms, such as concept drifts, the need to analyse the data on the fly due to unbounded data streams, and the need for scalable algorithms due to the potentially high throughput of data. Real-time classification algorithms that are adaptive to concept drifts and fast do exist; however, most approaches are not naturally parallel and are thus limited in their scalability. This paper presents work on the Micro-Cluster Nearest Neighbour (MC-NN) classifier. MC-NN is based on an adaptive statistical data summary built from Micro-Clusters. MC-NN is very fast and adaptive to concept drift whilst maintaining the parallel properties of the base KNN classifier. MC-NN is also competitive with existing data stream classifiers in terms of accuracy and speed.
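The micro-cluster summary that makes this fast can be sketched as follows (a simplified illustration of the data structure; the full MC-NN algorithm also tracks error counters to split and retire clusters under drift, which is omitted here): each cluster stores only a count and per-dimension linear sums, so absorbing a point and recomputing the centroid are both O(d), and classification is nearest-centroid:

```python
import math

class MicroCluster:
    """Statistical summary of labelled stream points: a count plus a
    linear sum per dimension, so the centroid is updatable in O(d)
    without storing the points themselves."""
    def __init__(self, point, label):
        self.n = 1
        self.ls = list(point)  # linear sum per dimension
        self.label = label

    def add(self, point):
        self.n += 1
        for i, v in enumerate(point):
            self.ls[i] += v

    def centroid(self):
        return [s / self.n for s in self.ls]

def mcnn_classify(clusters, point):
    """Nearest-centroid classification over the micro-cluster summaries,
    replacing the base KNN's scan over all stored instances."""
    return min(clusters, key=lambda c: math.dist(point, c.centroid())).label
```

Because each incoming point touches only the micro-cluster summaries, the clusters can be partitioned across workers, which is the source of the natural parallelism claimed above.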
Abstract:
Objectives: In this study a prototype of a new health forecasting alert system is developed, aligned to the approach used in the Met Office’s (MO) National Severe Weather Warning Service (NSWWS). The aim is to improve the information available to responders in the health and social care system by linking temperatures more directly to risks of mortality, and by developing a system more coherent with other weather alerts. The prototype is compared to the current system in the Cold Weather and Heatwave plans via a case-study approach to verify its potential advantages and shortcomings.
Method: The prototype health forecasting alert system introduces an “impact vs likelihood matrix” for the health impacts of hot and cold temperatures, similar to those used operationally for other weather hazards as part of the NSWWS. The impact axis of this matrix is based on existing epidemiological evidence, which shows an increasing relative risk of death at extremes of outdoor temperature beyond a threshold which can be identified epidemiologically. The likelihood axis is based on a probability measure associated with the temperature forecast. The new method is tested on two case studies (one during summer 2013, one during winter 2013) and compared to the performance of the current alert system.
Conclusions: The prototype shows some clear improvements over the current alert system. It allows for a much greater degree of flexibility, provides more detailed regional information about the health risks associated with periods of extreme temperatures, and is more coherent with other weather alerts, which may make it easier for front-line responders to use. It will require validation and engagement with stakeholders before it can be considered for use.
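An impact-vs-likelihood matrix is, operationally, just a lookup from the two axis scores to an alert level. The sketch below uses illustrative thresholds in the general style of such risk matrices; the actual cell assignments of the MO matrix are not reproduced here:

```python
def alert_level(impact, likelihood):
    """Toy impact-vs-likelihood matrix: both axes run 1 (lowest) to 4
    (highest); the returned colour is the alert level. Thresholds are
    illustrative assumptions, not the Met Office's operational matrix."""
    if not (1 <= impact <= 4 and 1 <= likelihood <= 4):
        raise ValueError("axes run from 1 to 4")
    score = impact * likelihood
    if impact == 1 or score <= 2:
        return "green"   # very low risk: no alert
    if score <= 6:
        return "yellow"  # be aware
    if score <= 11:
        return "amber"   # be prepared
    return "red"         # take action
```

The flexibility noted in the conclusions comes from varying the two inputs independently: a severe but unlikely cold spell and a mild but near-certain one land in different cells rather than triggering the same fixed-threshold alert.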