33 results for "competence-based approach"
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This paper proposes a new methodology to compute Value at Risk (VaR) for quantifying losses in credit portfolios. We approximate the cumulative distribution of the loss function by a finite combination of Haar wavelet basis functions and calculate the coefficients of the approximation by inverting its Laplace transform. The Wavelet Approximation (WA) method is especially suitable for non-smooth distributions, which often arise in small or concentrated portfolios when the hypotheses of the Basel II formulas are violated. To test the methodology we take the Vasicek one-factor portfolio credit loss model as our framework. WA is an accurate, robust and fast method, estimating VaR much more quickly than a Monte Carlo (MC) method at the same level of accuracy and reliability.
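The WA method itself (Haar coefficients via Laplace-transform inversion) is beyond the scope of an abstract, but the model framework it is tested on can be sketched. The following is a minimal simulation of the Vasicek one-factor credit loss model and the MC VaR baseline the paper compares against; all parameter values (default probability, asset correlation, portfolio size) are hypothetical, chosen only for illustration.

```python
import numpy as np
from statistics import NormalDist

def vasicek_loss_scenarios(n_obligors=50, pd=0.02, rho=0.2,
                           n_scenarios=100_000, seed=0):
    """Simulate fractional portfolio losses under the Vasicek one-factor
    model: obligor i defaults when
    sqrt(rho)*Z + sqrt(1-rho)*eps_i < Phi^{-1}(pd)."""
    rng = np.random.default_rng(seed)
    threshold = NormalDist().inv_cdf(pd)                  # default barrier
    Z = rng.standard_normal((n_scenarios, 1))             # systemic factor
    eps = rng.standard_normal((n_scenarios, n_obligors))  # idiosyncratic
    defaults = np.sqrt(rho) * Z + np.sqrt(1 - rho) * eps < threshold
    return defaults.mean(axis=1)                          # loss per scenario

losses = vasicek_loss_scenarios()
var_999 = float(np.quantile(losses, 0.999))               # 99.9% VaR
```

A small, concentrated portfolio like this one (50 obligors) produces the staircase-shaped, non-smooth loss distribution the abstract mentions, which is exactly where a wavelet basis outperforms smooth approximations.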
Abstract:
Uncertainties not considered in the analytical model of a plant dramatically degrade the performance of fault detection in practice. To cope with this prevalent problem, in this paper we develop a methodology using Modal Interval Analysis that takes those uncertainties in the plant model into account. A fault detection method built on this model is robust to uncertainty and produces no false alarms. As soon as a fault is detected, an ANFIS model is trained online to capture the major behaviour of the occurring fault, which can then be used for fault accommodation. The simulation results clearly demonstrate the capability of the proposed method to accomplish both tasks.
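Modal Interval Analysis involves machinery well beyond an abstract, but the core detection idea can be sketched: propagate interval-valued parameter uncertainty to an output envelope and raise an alarm only when the measurement leaves that envelope, so uncertainty alone never triggers a false alarm. The plant below (a first-order model with interval parameters `a`, `b`) and all numeric bounds are hypothetical, not taken from the paper.

```python
def envelope_step(x_lo, x_hi, u, a=(0.88, 0.92), b=(0.95, 1.05)):
    """Propagate an interval state through x+ = a*x + b*u where a and b
    are interval-valued (hypothetical bounds for illustration).
    Interval products are bounded by their endpoint combinations."""
    ax = [ai * xi for ai in a for xi in (x_lo, x_hi)]
    bu = [bi * u for bi in b]
    return (min(ax) + min(bu), max(ax) + max(bu))

def is_fault(y_measured, x_lo, x_hi):
    """Alarm only when the measurement falls outside the envelope."""
    return not (x_lo <= y_measured <= x_hi)

# Healthy plant (true a=0.9, b=1.0) stays inside the envelope.
lo, hi = envelope_step(1.0, 1.0, 0.5)   # envelope for the next state
healthy = 0.9 * 1.0 + 1.0 * 0.5        # 1.4, consistent
faulty = 1.6                            # outside, flagged as a fault
```

In the paper's scheme a detected fault would then trigger online ANFIS training for accommodation; that adaptive stage is not sketched here.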
Abstract:
This paper describes a failure alert system and a methodology for content reuse in a new instructional design system called InterMediActor (IMA). IMA provides an environment for instructional content design, production and reuse, and for student evaluation, based on content specification through a hierarchical structure of competences. The student assessment process and the information extraction process for content reuse are explained.
Abstract:
Student guidance is an always-desired characteristic of any educational system, but it is especially difficult to deploy in an automated way within a computer-supported educational tool. In this paper we explore possible avenues relying on machine learning techniques, to be included in the near future, in the form of a tutoring navigational tool, in a tele-education platform, InterMediActor, currently under development. Since no data from that platform are available yet, the preliminary experiments presented in this paper are built by interpreting every subject in the Telecommunications Degree at Universidad Carlos III de Madrid as an aggregated macro-competence (following the methodological considerations in InterMediActor), such that the marks achieved by students can be used as data for the models, to be replaced in the near future by real data measured directly inside InterMediActor. We evaluate the predictability of students' qualifications, and we deploy a preventive early detection system, a failure alert, to identify those students more prone to fail a certain subject, so that corrective means can be deployed with sufficient anticipation.
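The abstract does not state which machine learning technique the failure alert uses, so the following is only a plausible sketch: a logistic regression that predicts failure risk from marks in earlier subjects. The data, rescaling, and threshold are all invented for illustration; real data would come from the platform.

```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_failure_alert(X, y, lr=0.5, epochs=2000):
    """Logistic regression by batch gradient descent.
    X: (n, d) rescaled marks, y: (n,) 1.0 = student failed."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = _sigmoid(Xb @ w)
        w -= lr * Xb.T @ (p - y) / len(y)       # gradient of log-loss
    return w

def failure_risk(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return _sigmoid(Xb @ w)

# Synthetic illustration (no platform data): low marks in earlier
# subjects raise the probability of failing the target subject.
rng = np.random.default_rng(1)
marks = rng.uniform(0, 10, size=(400, 3))
fail = (marks.mean(axis=1) + rng.normal(0, 1, 400) < 4.5).astype(float)
X = (marks - 5.0) / 5.0                         # rescale to roughly [-1, 1]
w = train_failure_alert(X, fail)
risk = failure_risk(w, X)
accuracy = float(((risk > 0.5) == fail.astype(bool)).mean())
```

Students whose predicted `risk` exceeds a chosen threshold would be flagged early, so corrective action can be taken before the exam.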
Abstract:
We present an agent-based model with the aim of studying how macro-level dynamics of spatial distances among interacting individuals in a closed space emerge from micro-level dyadic and local interactions. Our agents moved on a lattice (referred to as a room) using a model implemented in a computer program called P-Space in order to minimize their dissatisfaction, defined as a function of the discrepancy between the real distance and the ideal, or desired, distance between agents. Ideal distances evolved in accordance with the agent's personal and social space, which changed throughout the dynamics of the interactions among the agents. In the first set of simulations we studied the effects of the parameters of the function that generated ideal distances, and in a second set we explored how group macro-level behavior depended on model parameters and other variables. We learned that certain parameter values yielded consistent patterns in the agents' personal and social spaces, which in turn led to avoidance and approaching behaviors in the agents. We also found that the spatial behavior of the group of agents as a whole was influenced by the values of the model parameters, as well as by other variables such as the number of agents. Our work demonstrates that the bottom-up approach is a useful way of explaining macro-level spatial behavior. The proposed model is also shown to be a powerful tool for simulating the spatial behavior of groups of interacting individuals.
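The micro-level rule described above can be sketched in a few lines. The sketch below assumes a squared-discrepancy dissatisfaction and a greedy one-cell move on the lattice; the actual P-Space update rule, neighbourhood, and dissatisfaction function may differ, and the ideal-distance dynamics are omitted.

```python
import itertools
import math

def dissatisfaction(pos, others, ideal):
    """Sum of squared discrepancies between an agent's actual distance
    to each other agent and its ideal distance to that agent."""
    return sum((math.dist(pos, q) - ideal[j]) ** 2
               for j, q in enumerate(others))

def best_move(pos, others, ideal, room=20):
    """Greedy step: evaluate the 8 neighbouring cells (plus staying
    put) and keep the position with the lowest dissatisfaction."""
    moves = [(pos[0] + dx, pos[1] + dy)
             for dx, dy in itertools.product((-1, 0, 1), repeat=2)]
    moves = [p for p in moves if 0 <= p[0] < room and 0 <= p[1] < room]
    return min(moves, key=lambda p: dissatisfaction(p, others, ideal))
```

Iterating this step for every agent, while the ideal distances themselves evolve with personal and social space, is what produces the emergent approach and avoidance patterns reported in the paper.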
Abstract:
This document calls into question the conventional way of delineating tourism destinations. It presents a model of spatial analysis intended to find new, more balanced and better-optimized interpretations of reality, in comparison with other territorial views, most of them based on administrative boundaries. The paper describes a methodological exercise that aims to structure tourism geographies into new tourism areas on the basis of visitors' consumption patterns, which would better fit the needs of tourist demand.
Abstract:
The inverse scattering problem concerning the determination of the joint time-delay-Doppler-scale reflectivity density characterizing continuous target environments is addressed by recourse to generalized frame theory. A reconstruction formula, involving the echoes of a frame of outgoing signals and its corresponding reciprocal frame, is developed. A "realistic" situation, in which only a finite number of signals is transmitted, is further considered. In such a case, our reconstruction formula is shown to yield the orthogonal projection of the reflectivity density onto a subspace generated by the transmitted signals.
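The finite-signal result above is, at its core, an orthogonal projection computed through a frame and its canonical reciprocal (dual) frame. A minimal finite-dimensional sketch, with a made-up pair of signals standing in for the transmitted waveforms:

```python
import numpy as np

def project_onto_span(F, x):
    """Orthogonal projection of x onto the subspace spanned by the rows
    of F (the finitely many transmitted signals). Analysing x against
    the frame and resynthesising with the canonical reciprocal frame
    amounts to solving with the Gram matrix."""
    G = F @ F.T                          # Gram matrix of the signals
    coeffs = np.linalg.solve(G, F @ x)   # reciprocal-frame coefficients
    return F.T @ coeffs

# Hypothetical example: two signals in R^3 spanning the xy-plane.
F = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0]])
x = np.array([2.0, 3.0, 4.0])
x_proj = project_onto_span(F, x)         # component recoverable from F
```

The component of `x` outside the span (here, the third coordinate) is exactly the part of the reflectivity density that a finite set of transmitted signals cannot recover.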
Abstract:
JXTA is a mature set of open protocols, with more than 10 years of history, that enables the creation and deployment of peer-to-peer (P2P) networks, allowing services to be executed in a distributed manner. Throughout its lifecycle, it has slowly evolved in order to appeal to a broad set of different applications. Part of this evolution includes providing basic security capabilities in its protocols in order to achieve some degree of message privacy and authentication. However, in some contexts, more advanced security requirements should be met, such as anonymity. There are several methods to attain anonymity in generic P2P networks. In this paper, we propose how to adapt a replicated message-based approach to JXTA, taking advantage of its idiosyncrasies and capabilities.
Abstract:
The paper presents the results of a pilot test of an e-portfolio in a virtual classroom, carried out in the 2005-2006 academic year with students of the Doctorate in the Information Society at the Open University of Catalonia. The electronic portfolio is a strategy for competence-based assessment. This experience illustrates the types of e-portfolios, from those in which students display their work without interaction to an interactive portfolio system that applies competence-based learning theories. The real process of learning develops in the competence-based system: the portfolio is not only a basic biographical document but has become a real space for learning within a competence model. The paper brings out new ideas and possibilities: competence-based learning promotes closer relationships between universities and companies and redesigns the pedagogic act.
Abstract:
This article presents preliminary research, from an instructional design perspective, on the design of the case method as an integral part of pedagogy and technology. Key features and benefits of using this teaching and learning strategy in a Virtual Teaching and Learning Environment (VTLE) are identified, taking into account the requirements of the European Higher Education Area (EHEA) for competence-based curriculum design. The implications of these findings for a learning object approach, exploring the possibilities of learning personalization, reusability and interoperability through IMS LD, are also analyzed.
Abstract:
In this work we present a proposal for a course in translation from German into Spanish following the task-based approach known from second language acquisition. The aim is to improve the translation competence of translation students. We start from the hypothesis that some students select inappropriate translation strategies when faced with certain translation problems, leading them to translation errors. In order to avoid these translation errors, which originate from the wrong application of such strategies, we propose a didactic method that helps to prevent them by a) raising awareness of the different subcompetences required while translating, b) improving the ability to identify translation problems and relate them to the different subcompetences, and c) encouraging the use of the most adequate strategy according to the characteristics of each problem. With regard to translation and how translation competence is acquired, our work follows the communicative approach to translation theory as defended, among others, by Hatim & Mason (1990), Lörscher (1992) and Kiraly (1995), where translation is seen as a communicative activity that can be analyzed from a psycholinguistic perspective. In this sense we give operative definitions of what we understand by "translation problem", "translation strategy", "translation error", "translation competence" and "translation". Our didactic approach adapts recent developments in second language teaching within the communicative paradigm, such as the task-based approach of Nunan (1989), to translation. Meeting the requirements of this pedagogic approach, we present a plan for a translation course that is compatible with current translation studies.
Abstract:
A model-based approach for fault diagnosis is proposed, where fault detection is based on checking the consistency of the Analytical Redundancy Relations (ARRs) using an interval tool. The tool takes into account the uncertainty in the parameters and the measurements using intervals. Faults are explicitly included in the model, which allows additional information to be exploited. This information is obtained from partial derivatives computed from the ARRs. The signs of the residuals are used to prune the candidate space when performing the fault diagnosis task. The method is illustrated on a two-tank example, in which these aspects are shown to have an impact on the diagnosis and fault discrimination, since the proposed method goes beyond structural methods.
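The sign-based pruning step can be sketched independently of the interval machinery. Below, each fault candidate carries a signature of predicted residual signs (in the paper these come from the partial derivatives of the ARRs); candidates whose signatures contradict the observed signs are discarded. The two-tank fault names and signatures are invented for illustration, not taken from the paper.

```python
def consistent(residual_interval):
    """An ARR is consistent when its residual interval contains zero."""
    lo, hi = residual_interval
    return lo <= 0.0 <= hi

def prune_candidates(signatures, observed_signs):
    """Keep fault candidates whose predicted residual signs match the
    observed ones; a 0 in a signature means 'no constraint'."""
    return [fault for fault, signs in signatures.items()
            if all(s == 0 or s == o
                   for s, o in zip(signs, observed_signs))]

# Hypothetical two-tank signatures: predicted signs per (ARR1, ARR2).
signatures = {"leak_tank1":   (+1, 0),
              "leak_tank2":   (0, +1),
              "clogged_pipe": (+1, -1)}
candidates = prune_candidates(signatures, observed_signs=(+1, -1))
```

A purely structural method would keep every fault that appears in a violated ARR; using the residual signs, `leak_tank2` is ruled out here, which is the extra discrimination the abstract refers to.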
Abstract:
The paper presents a competence-based instructional design system and a way to provide a personalization of navigation in the course content. The navigation aid tool builds on the competence graph and the student model, which includes the elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is more prepared to study. We use fuzzy set theory for dealing with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used for assigning values to linguistic variables. For each competence, the level of difficulty and the level of knowing its prerequisites are calculated based on the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence regarding its level of recommendation.
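The fuzzy recommendation step can be sketched as follows. The membership breakpoints, linguistic terms, and IF-THEN rules below are made up for illustration; the paper's actual rule base and the mapping from assessment marks to linguistic variables are not given in the abstract.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def linguistic(mark):
    """Fuzzify a 0-10 mark into linguistic terms (made-up breakpoints)."""
    return {"low":    tri(mark, -0.1, 0.0, 5.0),
            "medium": tri(mark,  2.5, 5.0, 7.5),
            "high":   tri(mark,  5.0, 10.0, 10.1)}

def recommend(prereq_mark, difficulty_mark):
    """Fuzzy IF-THEN rules: each rule fires with the min of its
    antecedent memberships, rules for the same category combine with
    max, and the crisp winning category is returned."""
    p = linguistic(prereq_mark)      # how well prerequisites are known
    d = linguistic(difficulty_mark)  # how manageable the difficulty is
    rules = {"recommended":     min(p["high"], d["high"]),
             "neutral":         max(min(p["medium"], d["medium"]),
                                    min(p["high"], d["medium"])),
             "not_recommended": max(p["low"], d["low"])}
    return max(rules, key=rules.get)
```

Applying such a rule to every competence in the graph yields the per-competence recommendation categories from which the individualized navigation graph is built.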
Abstract:
Most sedimentary modelling programs developed in recent years focus on either terrigenous or carbonate marine sedimentation. Only a few programs have attempted to consider mixed terrigenous-carbonate sedimentation, and most of these are two-dimensional, which is a major restriction since geological processes take place in 3D. This paper presents the basic concepts of a new 3D mathematical forward simulation model for clastic sediments, which was developed from SIMSAFADIM, a previous 3D carbonate sedimentation model. The new extended model, SIMSAFADIM-CLASTIC, simulates processes of autochthonous marine carbonate production and accumulation, together with clastic transport and sedimentation in three dimensions of both carbonate and terrigenous sediments. Other models and modelling strategies may also provide realistic and efficient tools for predicting the stratigraphic architecture and facies distribution of sedimentary deposits. However, SIMSAFADIM-CLASTIC is an innovative model that attempts to simulate different sediment types using a process-based approach, making it a useful tool for 3D prediction of stratigraphic architecture and facies distribution in sedimentary basins. This model is applied to the Neogene Vallès-Penedès half-graben (western Mediterranean, NE Spain) to show the capacity of the program when applied to a realistic geologic situation involving interactions between terrigenous clastics and carbonate sediments.