Abstract:
This paper describes the design and implementation of an agent-based network for the support of collaborative switching tasks within the control room environment of the National Grid Company plc. The work draws on several research disciplines, including operational analysis, human-computer interaction, finite state modelling techniques, intelligent agents and computer-supported co-operative work. Techniques from these disciplines have been used in the analysis of collaborative tasks to produce distributed local models for all involved users. These models have served as the basis for the production of local finite state automata, which have then been embedded within an agent network together with behavioural information extracted from the task and user analysis phase. The resulting support system is capable of task and communication management within the transmission despatch environment.
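By way of illustration only, the sketch below shows one way a local finite state automaton might be represented inside a task agent; the states, events and class name are invented for this sketch and are not taken from the paper.

```python
class TaskAgent:
    """Minimal local finite state automaton for one user's part of a switching task.
    State and event names are illustrative only."""

    # transition table: (current state, event) -> next state
    TRANSITIONS = {
        ("idle", "switch_requested"): "planning",
        ("planning", "plan_agreed"): "executing",
        ("planning", "plan_rejected"): "idle",
        ("executing", "switch_confirmed"): "complete",
    }

    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        """Advance the automaton; events with no defined transition are ignored."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state


agent = TaskAgent()
for event in ["switch_requested", "plan_agreed", "switch_confirmed"]:
    print(event, "->", agent.handle(event))
```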
Abstract:
Classical risk assessment approaches for animal diseases are influenced by the probability of release, exposure and consequences of a hazard affecting a livestock population. Once a pathogen enters domestic livestock, potential risks of exposure and infection, both to animals and to people, extend through a chain of economic activities related to producing, buying and selling animals and animal products. Therefore, in order to understand the economic drivers of animal diseases in different ecosystems and to devise effective and efficient measures to manage disease risks for a country or region, the entire value chain and the related markets for animals and products need to be analysed, so as to arrive at practical and cost-effective risk management options agreed upon by the actors in those value chains. Value chain analysis enriches disease risk assessment by providing a framework for interdisciplinary collaboration, which seems to be in increasing demand for problems concerning infectious livestock diseases. The best way to achieve this is to ensure that veterinary epidemiologists and social scientists work together throughout the process at all levels.
Abstract:
Top Down Induction of Decision Trees (TDIDT) is the most commonly used method of constructing a model from a dataset in the form of classification rules to classify previously unseen data. Alternative algorithms have been developed, such as the Prism algorithm. Prism constructs modular rules that are qualitatively better than the rules induced by TDIDT. However, with the increasing size of databases, many existing rule learning algorithms have proved to be computationally expensive on large datasets. To tackle the problem of scalability, parallel classification rule induction algorithms have been introduced. Because TDIDT is the most popular classifier, even though strongly competitive alternative algorithms exist, most parallel approaches to inducing classification rules are based on TDIDT. In this paper we describe work on a distributed classifier, based on Prism, that induces classification rules in parallel.
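As a point of reference, the sketch below shows the general (serial) Prism rule-induction scheme on which such a distributed classifier builds: for each class, grow a rule by repeatedly adding the attribute-value term with the highest target-class probability, then remove the covered instances and repeat. This is a simplified illustration, not the paper's distributed implementation, and all names are ours.

```python
def prism_rules_for_class(instances, target):
    """Induce modular rules for one class, in the spirit of Prism.
    `instances` is a list of (attribute_dict, class_label) pairs."""
    rules, remaining = [], list(instances)
    while any(label == target for _, label in remaining):
        subset, rule = remaining, []           # start a new rule on the remaining data
        while any(label != target for _, label in subset):
            used = {a for a, _ in rule}
            candidates = {(a, v) for attrs, _ in subset
                          for a, v in attrs.items() if a not in used}
            if not candidates:                 # no term left to specialise with
                break
            best, best_prob, best_cov = None, -1.0, None
            for a, v in candidates:            # pick the term maximising P(target | term)
                covered = [(x, y) for x, y in subset if x.get(a) == v]
                prob = sum(1 for _, y in covered if y == target) / len(covered)
                if prob > best_prob:
                    best, best_prob, best_cov = (a, v), prob, covered
            rule.append(best)
            subset = best_cov
        rules.append(rule)
        # discard instances covered by the finished rule and continue
        remaining = [(x, y) for x, y in remaining
                     if not all(x.get(a) == v for a, v in rule)]
    return rules
```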
Abstract:
Induction of classification rules is one of the most important technologies in data mining. Most of the work in this field has concentrated on the Top Down Induction of Decision Trees (TDIDT) approach. However, alternative approaches have been developed such as the Prism algorithm for inducing modular rules. Prism often produces qualitatively better rules than TDIDT but suffers from higher computational requirements. We investigate approaches that have been developed to minimize the computational requirements of TDIDT, in order to find analogous approaches that could reduce the computational requirements of Prism.
Abstract:
Equilibrium theory occupies an important position in chemistry, and it is traditionally based on thermodynamics. A novel mathematical approach to chemical equilibrium theory for gaseous systems at constant temperature and pressure is developed. Six theorems which illustrate the power of mathematics to explain chemical observations are presented and combined logically to create a coherent system. This mathematical treatment provides more insight into chemical equilibrium and creates more tools that can be used to investigate complex situations. Although some of the issues covered have previously been addressed in the literature, new mathematical representations are provided. Compared to traditional treatments, the new approach relies on straightforward mathematics and less on thermodynamics, thus giving a new and complementary perspective on equilibrium theory. It provides a new theoretical basis for a thorough and deep presentation of traditional chemical equilibrium. This work demonstrates that new research in a traditional field such as equilibrium theory, generally thought to have been completed many years ago, can still offer new insights, and that more efficient ways to present its content can be established. The work presented here can be considered appropriate as part of a mathematical chemistry course at university level.
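For context, the traditional thermodynamic statement of gaseous equilibrium at constant temperature and pressure, which the mathematical treatment described above is intended to complement (these are standard textbook relations, not the paper's six theorems):

```latex
\[
\Delta G^{\circ} = -RT \ln K,
\qquad
K = \prod_i \left(\frac{p_i}{p^{\circ}}\right)^{\nu_i},
\]
```
where the nu_i are the stoichiometric coefficients (negative for reactants), p_i the equilibrium partial pressures and p° the standard pressure.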
Abstract:
The flexibility of information systems (IS) has been studied to improve adaptation in support of business agility, understood as the set of capabilities to compete more effectively and adapt to rapid changes in market conditions (Glossary of business agility terms, 2003). However, most work on IS flexibility has been limited to systems architecture, ignoring the analysis of interoperability as a part of flexibility derived from the requirements. This paper reports a PhD project, which proposes an approach to developing IS with flexibility features, considering some of the challenges of flexibility in small and medium enterprises (SMEs), such as the lack of interoperability and the agility of their business. The motivations for this research are the high prices of IS in developing countries and the usefulness of organizational semiotics to support the analysis of requirements for IS (Liu, 2005).
Abstract:
Advances in hardware technologies allow data to be captured and processed in real time, and the resulting high-throughput data streams require novel data mining approaches. The research area of Data Stream Mining (DSM) is developing data mining algorithms that allow us to analyse these continuous streams of data in real time. The creation and real-time adaptation of classification models from data streams is one of the most challenging DSM tasks. Current classifiers for streaming data address this problem by using incremental learning algorithms. However, even though these algorithms are fast, they are challenged by high-velocity data streams, in which data instances arrive at a fast rate. This is problematic for applications that require little or no delay between a change in the patterns of the stream and the absorption of these patterns by the classifier. The scalability problems that traditional data mining algorithms for static (non-streaming) datasets face with Big Data have been addressed through the development of parallel classifiers. However, there is very little work on the parallelisation of data stream classification techniques. In this paper we investigate K-Nearest Neighbours (KNN) as the basis for a real-time, adaptive and parallel methodology for scalable data stream classification tasks.
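A minimal sketch of the kind of data-parallel KNN this points towards: the stored window of labelled instances is partitioned across worker processes, each worker returns its local k nearest neighbours to the query, and the local candidates are merged before the majority vote. It assumes a fixed in-memory window and Euclidean distance and is an illustration, not the paper's methodology.

```python
import heapq
from collections import Counter
from multiprocessing import Pool


def local_knn(args):
    """Return the k nearest (distance, label) pairs within one partition."""
    partition, query, k = args
    dists = [(sum((a - b) ** 2 for a, b in zip(x, query)), label)
             for x, label in partition]
    return heapq.nsmallest(k, dists)


def parallel_knn_predict(window, query, k=5, workers=4):
    """Data-parallel KNN over a window of (feature_tuple, label) pairs."""
    chunk = max(1, len(window) // workers)
    partitions = [window[i:i + chunk] for i in range(0, len(window), chunk)]
    with Pool(workers) as pool:
        local = pool.map(local_knn, [(p, query, k) for p in partitions])
    # merge the per-worker candidates and keep the global k nearest
    merged = heapq.nsmallest(k, (c for cands in local for c in cands))
    return Counter(label for _, label in merged).most_common(1)[0][0]


if __name__ == "__main__":
    window = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((1.0, 1.1), "b"), ((0.9, 1.0), "b")]
    print(parallel_knn_predict(window, (0.05, 0.1), k=3, workers=2))
```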
Abstract:
Accurate and reliable rain rate estimates are important for various hydrometeorological applications. Consequently, rain sensors of different types have been deployed in many regions. In this work, measurements from different instruments, namely, rain gauge, weather radar, and microwave link, are combined for the first time to estimate with greater accuracy the spatial distribution and intensity of rainfall. The objective is to retrieve the rain rate that is consistent with all these measurements while incorporating the uncertainty associated with the different sources of information. Assuming the problem is not strongly nonlinear, a variational approach is implemented and the Gauss–Newton method is used to minimize the cost function containing proper error estimates from all sensors. Furthermore, the method can be flexibly adapted to additional data sources. The proposed approach is tested using data from 14 rain gauges and 14 operational microwave links located in the Zürich area (Switzerland) to correct the prior rain rate provided by the operational radar rain product from the Swiss meteorological service (MeteoSwiss). A cross-validation approach demonstrates the improvement of rain rate estimates when assimilating rain gauge and microwave link information.
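In generic variational form (our notation, not necessarily the paper's), the rain-rate field x is obtained by minimising a cost function that penalises departures from the radar prior x_b and from each sensor's observations y_i, weighted by the respective error covariances; Gauss–Newton then proceeds by repeatedly linearising the observation operators H_i about the current iterate:

```latex
\[
J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
\;+\; \tfrac{1}{2}\sum_{i}\bigl(\mathbf{y}_i - H_i(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}_i^{-1}\bigl(\mathbf{y}_i - H_i(\mathbf{x})\bigr),
\]
```
where the sum runs over the rain gauge and microwave link observations, B is the error covariance of the radar prior, and R_i is the error covariance of sensor i.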
Abstract:
This chapter explores the spatialities of children's rights through a focus on how children's paid and unpaid work in Sub-Saharan Africa intersects with wider debates about child labor, child domestic work and young caregiving. Several tensions surround the universalist and individualistic nature of the rights discourse in the context of Sub-Saharan Africa, and policymakers, practitioners, children and community members have emphasized children's responsibilities to their families and communities, as well as their rights. The limitations of ILO definitions of child labor and child domestic work, and of UNCRC concerns about 'hazardous' and 'harmful' work, are highlighted through examining the situation of children providing unpaid domestic and care support to family members in the private space of their own or a relative's home. Differing perspectives towards young caregiving have been adopted to date by policymakers and practitioners in East Africa, ranging from a child labor/child protection/abolitionist approach to a 'young carers'/child-centered rights perspective. These differing perspectives influence the level and nature of support and resources that children involved in care work may be able to access. A contextual, multi-sectoral approach to young caregiving is needed that seeks to understand children's, family members' and community members' perceptions of what constitutes inappropriate caring responsibilities within particular cultural contexts and how these should best be alleviated.
Abstract:
For the diagnosis and prognosis of problems of quality of life, a multidisciplinary ecosystemic approach encompasses four dimensions of being-in-the-world, as donors and recipients: intimate, interactive, social and biophysical. Social, cultural and environmental vulnerabilities are understood and dealt with, in different circumstances of space and time, as the conjugated effect of all dimensions of being-in-the-world, as they induce the events (deficits and assets), cope with consequences (desired or undesired) and contribute to change. Instead of fragmented and reduced representations of reality, the diagnosis and prognosis of cultural, educational, environmental and health problems consider the connections (assets) and ruptures (deficits) between the different dimensions, providing a planning model to develop and evaluate research, teaching programmes, public policies and field projects. The methodology is participatory, experiential and reflexive; heuristic-hermeneutic processes unveil the cultural and epistemic paradigms that orient subject-object relationships, giving people the opportunity to reflect on their own realities, engage in new experiences and find new ways to live better in a better world. The proposal is a creative model for thought and practice, providing many opportunities for discussion, debate and the development of holistic projects integrating different scientific domains (social sciences, psychology, education, philosophy, etc.).
Abstract:
Promoting the inclusion of students with disabilities in e-learning systems has brought many challenges for researchers and educators. The use of synchronous communication tools such as interactive whiteboards has been regarded as an obstacle to inclusive education. In this paper, we present a proposal for an inclusive approach that enables blind students to participate in live learning sessions using whiteboard software. The approach is based on the provision of accessible textual descriptions by a live mediator. With the accessible descriptions, students are able to navigate through the elements and explore the content of the class using screen readers. The method used for this study consisted of the implementation of a software prototype within a virtual learning environment and a case study with the participation of a blind student in a live distance class. The results from the case study show that this approach can be very effective and may be a starting point for providing blind students with resources they had previously been deprived of. The proof of concept implemented shows that many further possibilities may be explored to enhance the interaction of blind users with educational content in whiteboards, and further pedagogical approaches can be investigated from this proposal.
Abstract:
There is increasing interest in the application of Evolutionary Algorithms (EAs) to induce classification rules. This hybrid approach can benefit areas where classical methods for rule induction have not been very successful. One example is the induction of classification rules in imbalanced domains. Imbalanced data occur when one or more classes heavily outnumber the other classes. Frequently, classical machine learning (ML) classifiers are not able to learn in the presence of imbalanced data sets, inducing classification models that always predict the most numerous classes. In this work, we propose a novel hybrid approach to deal with this problem. We create several balanced data sets, each containing all minority class cases and a random sample of majority class cases. These balanced data sets are fed to classical ML systems that produce rule sets. The rule sets are combined into a pool of rules, and an EA is used to build a classifier from this pool. This hybrid approach has some advantages over undersampling, since it reduces the amount of discarded information, and some advantages over oversampling, since it avoids overfitting. The proposed approach was analysed experimentally, and the results show an improvement in classification performance measured as the area under the receiver operating characteristic (ROC) curve.
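A minimal sketch of the data preparation step described above: each balanced subset keeps every minority-class case and draws a fresh random sample of majority-class cases, a classical rule learner is run on each subset, and the resulting rule sets are pooled for the EA to work on. The function names and the `rule_learner` callback are placeholders of ours; the EA itself is not shown.

```python
import random


def make_balanced_subsets(data, minority_label, n_subsets=5, seed=0):
    """Build balanced training sets: all minority-class cases plus an
    equally sized random sample of majority-class cases per subset.
    `data` is a list of (features, label) pairs."""
    rng = random.Random(seed)
    minority = [d for d in data if d[1] == minority_label]
    majority = [d for d in data if d[1] != minority_label]
    return [minority + rng.sample(majority, min(len(minority), len(majority)))
            for _ in range(n_subsets)]


def build_rule_pool(subsets, rule_learner):
    """Run a classical rule learner on every balanced subset and pool the rules;
    an EA would then select and combine rules from this pool into a classifier."""
    pool = []
    for subset in subsets:
        pool.extend(rule_learner(subset))
    return pool
```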
Abstract:
Over the useful life of a LAN, network downtime has a negative impact on organizational productivity that is not accounted for in current Network Topological Design (NTD) problems. We propose a new approach to LAN topological design that includes the impact of these productivity losses in the network design, minimizing not only the CAPEX but also the expected cost of lost productivity attributable to network downtime over a given period of network operation.
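One way to write the resulting design objective (our notation; the paper's exact formulation is not given in the abstract): over the set of feasible topologies, minimise the capital expenditure plus the expected productivity cost of downtime over the operating horizon:

```latex
\[
\min_{T \in \mathcal{T}} \; C_{\mathrm{CAPEX}}(T) \;+\; \mathbb{E}\!\left[\sum_{t=1}^{T_{\mathrm{op}}} c_{\mathrm{prod}}\, D_t(T)\right],
\]
```
where D_t(T) is the network downtime in period t under topology T and c_prod the productivity loss per unit of downtime.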
Abstract:
In the present work, a new approach for the determination of the partition coefficient at different interfaces, based on density functional theory, is proposed. Our results for log P(ow) for the n-octanol/water interface in a large supercell, -0.30 for acetone and 0.95 for methane, are comparable with the experimental data given in parentheses (-0.24 and 0.78, respectively). We believe that these differences are mainly related to the absence of van der Waals interactions and the limited number of molecules considered in the supercell. The numerical deviations are smaller than those observed for interpolation-based tools. As the proposed model is parameter-free, it is not limited to the n-octanol/water interface.
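For reference, the standard thermodynamic relation linking the partition coefficient to the free energy of transferring the solute from water to n-octanol (whether the paper evaluates it in exactly this form is not stated in the abstract):

```latex
\[
\log_{10} P_{\mathrm{ow}} \;=\; -\,\frac{\Delta G_{\mathrm{w}\rightarrow\mathrm{o}}}{RT \ln 10}
\;=\; \frac{\Delta G_{\mathrm{solv}}^{\mathrm{water}} - \Delta G_{\mathrm{solv}}^{\mathrm{octanol}}}{RT \ln 10},
\]
```
so a solute that is better solvated in n-octanol than in water has a positive log P(ow).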