63 results for off-design piste


Relevance:

20.00%

Publisher:

Abstract:

Two important challenges that teachers currently face are the sharing and the collaborative authoring of their learning design solutions, such as didactical units and learning materials. On the one hand, there are tools for creating design solutions, only some of which facilitate co-editing; however, they do not incorporate mechanisms that support the sharing of designs between teachers. On the other hand, there are tools that serve as repositories of educational resources, but they do not enable the authoring of designs. In this paper we present LdShake, a web tool whose novelty lies in its combined support for the social sharing and co-editing of learning design solutions within communities of teachers. Teachers can create and share learning designs with other teachers using different access rights, so that others can read, comment on or co-edit the designs. Each design solution is therefore associated with one group of teachers able to work on its definition and another group that can only see the design. The tool is generic in that it allows the creation of designs based on any pedagogical approach. However, it can be particularized in instances providing pre-formatted designs structured according to a specific didactic method (such as Problem-Based Learning, PBL). A particularized LdShake instance has been used in the context of Human Biology studies, where teams of teachers are required to work together on the design of PBL solutions. A controlled user study has also been carried out, comparing a generic LdShake with a Moodle system configured to enable the creation and sharing of designs. The combined results of the real and controlled studies show that the social structure and the commenting, co-editing and publishing features of LdShake provide a useful, effective and usable approach to facilitating teachers' teamwork.

Relevance:

20.00%

Publisher:

Abstract:

The 2×2 MIMO profiles included in Mobile WiMAX specifications are Alamouti's space-time code (STC) for transmit diversity and spatial multiplexing (SM). The former has full diversity and the latter has full rate, but neither has both of these desired features. An alternative 2×2 STC that is both full rate and full diversity is the Golden code. It is the best known 2×2 STC, but it has a high decoding complexity. Recently, attention has turned to decoder complexity: this issue has been included in the STC design criteria, and different STCs have been proposed. In this paper, we first present a full-rate, full-diversity 2×2 STC design leading to substantially lower optimum-detector complexity than the Golden code, with only a slight performance loss. We provide the general optimized form of this STC and show that this scheme achieves the diversity-multiplexing frontier for square QAM signal constellations. We then present a variant of the proposed STC that further decreases detection complexity at the cost of a 25% rate reduction, and show that it provides an interesting trade-off between the Alamouti scheme and SM.
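For context (standard textbook background, not a result of the paper above), Alamouti's code maps two symbols $s_1, s_2$ onto two antennas (rows) and two time slots (columns) as

$$
\mathbf{X} =
\begin{pmatrix}
s_1 & -s_2^{*} \\
s_2 & s_1^{*}
\end{pmatrix},
$$

whose orthogonal columns yield full transmit diversity at a rate of one symbol per channel use; SM instead sends independent symbols from each antenna, doubling the rate but sacrificing transmit diversity.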

Relevance:

20.00%

Publisher:

Abstract:

This workshop paper states that fostering active student participation, both in face-to-face lectures and seminars and outside the classroom (personal and group study at home, in the library, etc.), requires a certain level of teacher-led inquiry. The paper presents a set of strategies drawn from real practice in higher education with teacher-led inquiry ingredients that promote active learning. These practices highlight the role of the syllabus, the importance of iterative learning designs, explicit teacher-led inquiry, and the implications of context, sustainability and practitioners' creativity. The strategies discussed in this paper can serve as input to the workshop as real cases that need to be represented in design and supported in enactment (with and without technologies).

Relevance:

20.00%

Publisher:

Abstract:

In multiuser detection, the set of users active at any time may be unknown to the receiver. Under these conditions, optimum reception consists of simultaneously detecting the set of active users and their data, a problem that can be solved exactly by applying random-set theory (RST) and Bayesian recursions (BRs). However, the implementation of optimum receivers may be limited by their complexity, which grows exponentially with the number of potential users. In this paper we examine three strategies leading to reduced-complexity receivers. In particular, we show how a simple approximation of the BRs enables the use of the Sphere Detection (SD) algorithm, which exhibits satisfactory performance with limited complexity.
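As a minimal illustration of the complexity barrier mentioned above (a generic sketch, not the paper's receiver; the function name is made up for illustration): with K potential users, the receiver must weigh every possible subset of active users, i.e. 2^K activity hypotheses, before even considering the transmitted data.

```python
from itertools import chain, combinations

def active_user_hypotheses(num_users):
    """Enumerate every possible set of active users.

    An optimum joint activity-and-data detector must consider all of
    these sets, so its hypothesis space grows exponentially (2**K).
    """
    users = range(num_users)
    return list(chain.from_iterable(
        combinations(users, k) for k in range(num_users + 1)))

# The hypothesis count doubles with each additional potential user.
for k in (2, 4, 8):
    print(k, len(active_user_hypotheses(k)))
```

Reduced-complexity strategies such as the sphere-detection approximation aim to prune this search rather than enumerate it exhaustively.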

Relevance:

20.00%

Publisher:

Abstract:

Collaborative activities, in which students actively interact with each other, have been shown to provide significant learning benefits. In Computer-Supported Collaborative Learning (CSCL), these collaborative activities are assisted by technologies. However, the use of computers does not guarantee collaboration, since free collaboration does not necessarily lead to fruitful learning. Practitioners therefore need to design CSCL scripts that structure collaborative settings so that they promote learning. However, not all teachers have the technical and pedagogical background needed to design such scripts. With the aim of assisting teachers in designing effective CSCL scripts, we propose a model to support the selection of reusable good practices (formulated as patterns) that can be used as a starting point for their own designs. This model is based on a pattern ontology that computationally represents the knowledge captured in a pattern language for the design of CSCL scripts. A preliminary evaluation of the proposed approach is provided through two examples based on a set of meaningful interrelated patterns computationally represented with the pattern ontology, and a paper-prototyping experience carried out with two teachers. The results offer interesting insights towards the implementation of the pattern ontology in software tools.

Relevance:

20.00%

Publisher:

Abstract:

Optimum experimental designs depend on the design criterion, the model and the design region. The talk will consider the design of experiments for regression models in which there is a single response with the explanatory variables lying in a simplex. One example is experiments on various compositions of glass, such as those considered by Martin, Bursnall, and Stillman (2001). Because of the highly symmetric nature of the simplex, the class of models that are of interest, typically Scheffé polynomials (Scheffé 1958), are rather different from those of standard regression analysis. The optimum designs are also rather different, inheriting a high degree of symmetry from the models. In the talk I hope to discuss a variety of models for such experiments. I will then discuss constrained mixture experiments, where not all of the simplex is available for experimentation. Other important aspects include mixture experiments with extra non-mixture factors and the blocking of mixture experiments. Much of the material is in Chapter 16 of Atkinson, Donev, and Tobias (2007). If time and my research allow, I hope to finish with a few comments on design when the responses, rather than the explanatory variables, lie in a simplex.

References

Atkinson, A. C., A. N. Donev, and R. D. Tobias (2007). Optimum Experimental Designs, with SAS. Oxford: Oxford University Press.

Martin, R. J., M. C. Bursnall, and E. C. Stillman (2001). Further results on optimal and efficient designs for constrained mixture experiments. In A. C. Atkinson, B. Bogacka, and A. Zhigljavsky (Eds.), Optimal Design 2000, pp. 225–239. Dordrecht: Kluwer.

Scheffé, H. (1958). Experiments with mixtures. Journal of the Royal Statistical Society, Ser. B 20, 344–360.
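For reference (standard background on mixture models, not specific to this talk), the second-order Scheffé canonical polynomial for a mixture of $q$ components is

$$
E(y) = \sum_{i=1}^{q} \beta_i x_i + \sum_{i<j} \beta_{ij} x_i x_j,
\qquad \sum_{i=1}^{q} x_i = 1, \; x_i \ge 0,
$$

where the mixture constraint absorbs the intercept and pure quadratic terms that would appear in an ordinary polynomial regression, which is why these models, and their optimum designs, differ from the standard case.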

Relevance:

20.00%

Publisher:

Abstract:

The Person Trade-Off (PTO) is a methodology aimed at measuring the social value of health states; other methodologies measure individual utility and are less appropriate for resource-allocation decisions. However, few studies have been conducted to test the validity of the method. We present a pilot study with this objective, based on the results of interviews with 30 undergraduate students in Economics. We judge the validity of PTO answers by their adequacy to three hypotheses of rationality. First, we show that, given certain rationality assumptions, PTO answers should be predictable from answers to Standard Gamble questions. This first hypothesis is not verified. The second hypothesis is that PTO answers should not vary across different frames of equivalent PTO questions. This second hypothesis is also not verified. Our third hypothesis is that PTO values should predict social preferences for allocating resources between patients. This hypothesis is verified. The evidence on the validity of the method is therefore conflicting.

Relevance:

20.00%

Publisher:

Abstract:

Cost systems have developed considerably in recent years, and activity-based costing (ABC) has been shown to contribute to cost management, particularly in service businesses. The public sector is composed to a very great extent of service functions, yet considerably less has been reported on the use of ABC to support cost management in this sector. In Spain, cost systems are essential for city councils, as they are obliged to calculate the cost of the services subject to taxation (e.g. waste collection), and they are legally required not to profit from these services. This paper examines the development of systems to support cost management in the Spanish public sector. Through semi-structured interviews with 28 subjects within one City Council, it presents a case study of cost management. The paper contains extracts from the interviews, and a number of factors are identified that contribute to the successful development of the cost management system. Following the case study, a number of other City Councils were identified where activity-based techniques had either failed or stalled. Based on the factors identified in the single case study, a further enquiry is reported. The paper includes a summary using statistical analysis which draws attention to change management, funding and political incentives as factors that influenced system success or failure.

Relevance:

20.00%

Publisher:

Abstract:

We lay out a tractable model for fiscal and monetary policy analysis in a currency union, and study its implications for the optimal design of such policies. Monetary policy is conducted by a common central bank, which sets the interest rate for the union as a whole. Fiscal policy is implemented at the country level, through the choice of government spending. The model incorporates country-specific shocks and nominal rigidities. Under our assumptions, the optimal cooperative policy arrangement requires that inflation be stabilized at the union level by the common central bank, while fiscal policy is used by each country for stabilization purposes. By contrast, when the fiscal authorities act in a non-coordinated way, their joint actions lead to a suboptimal outcome, and make the common central bank face a trade-off between inflation and output gap stabilization at the union level.

Relevance:

20.00%

Publisher:

Abstract:

We present a new unifying framework for investigating throughput-WIP (Work-in-Process) optimal control problems in queueing systems, based on reformulating them as linear programming (LP) problems with special structure. We show that if a throughput-WIP performance pair in a stochastic system satisfies the Threshold Property we introduce in this paper, then the problem of optimizing a linear objective of throughput-WIP performance can be reformulated as a (semi-infinite) LP problem over a polygon with special structure (a threshold polygon). The strong structural properties of such polygons explain the optimality of threshold policies for optimizing linear performance objectives: their vertices correspond to the performance pairs of threshold policies. Within this framework we analyze the versatile input-output queueing intensity control model introduced by Chen and Yao (1990), obtaining a variety of new results, including (a) an exact reformulation of the control problem as an LP problem over a threshold polygon; (b) an analytical characterization of the Min WIP function (giving the minimum WIP level required to attain a target throughput level); (c) an LP Value Decomposition Theorem that relates the objective value under an arbitrary policy to that of a given threshold policy (thus revealing the LP interpretation of Chen and Yao's optimality conditions); (d) diminishing-returns and invariance properties of throughput-WIP performance, which underlie threshold optimality; and (e) a unified treatment of the time-discounted and time-average cases.

Relevance:

20.00%

Publisher:

Abstract:

Firms compete by choosing both a price and a design from a family of designs that can be represented as demand rotations. Consumers engage in costly sequential search among firms; each time a consumer pays a search cost, he observes a new offering. An offering consists of a price quote and a new good, where goods may vary in the extent to which they are good matches for the consumer. In equilibrium, only two design styles arise: either the most niche, where consumers are likely to either love or loathe the product, or the broadest, where consumers are likely to have similar valuations. In equilibrium, different firms may simultaneously offer both design styles. We perform comparative statics on the equilibrium and show that a fall in search costs can lead to higher industry prices and profits and lower consumer surplus. Our analysis is related to discussions of how the internet has led to the prevalence of niche goods and the "long tail" phenomenon.

Relevance:

20.00%

Publisher:

Abstract:

This paper tests the internal consistency of time trade-off utilities. We find significant violations of consistency in the direction predicted by loss aversion. The violations disappear for longer gauge durations. We show that loss aversion can also explain why, for short gauge durations, time trade-off utilities exceed standard gamble utilities. Our results suggest that time trade-off measurements that use relatively short gauge durations, like the widely used EuroQol algorithm (Dolan 1997), are affected by loss aversion and lead to utilities that are too high.

Relevance:

20.00%

Publisher:

Abstract:

We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean squared distortion of a vector quantizer designed from $n$ i.i.d. data points using any design algorithm is at least $\Omega(n^{-1/2})$ away from the optimal distortion for some distribution on a bounded subset of ${\cal R}^d$. Together with existing upper bounds, this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically on the order of $n^{-1/2}$. We also derive a new upper bound for the performance of the empirically optimal quantizer.
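As an illustration of what "empirically designed" means here (a generic one-dimensional Lloyd-style sketch, not the paper's algorithm or its bounds), a codebook can be fitted to training points and its empirical mean squared distortion measured:

```python
import random

def design_quantizer(data, codebook_size, iterations=50):
    """1-D Lloyd's algorithm: alternate nearest-codeword assignment
    and centroid update to reduce distortion on the training sample."""
    codebook = random.sample(data, codebook_size)
    for _ in range(iterations):
        cells = [[] for _ in codebook]
        for x in data:
            nearest = min(range(len(codebook)),
                          key=lambda i: (x - codebook[i]) ** 2)
            cells[nearest].append(x)
        codebook = [sum(c) / len(c) if c else codebook[i]
                    for i, c in enumerate(cells)]
    return sorted(codebook)

def distortion(data, codebook):
    """Empirical mean squared distortion of nearest-codeword quantization."""
    return sum(min((x - c) ** 2 for c in codebook) for x in data) / len(data)

random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(1000)]
four_level = design_quantizer(sample, 4)
# A learned 4-level codebook beats the trivial 1-level codebook {0}.
print(distortion(sample, four_level) < distortion(sample, [0.0]))
```

The paper's result concerns how far such an empirically fitted codebook's distortion must remain from the optimum as a function of the training-sample size $n$.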

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states, and these states were then embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models; the optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased, and that this bias can be avoided by using a power QALY model.
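As one concrete reading of a power QALY model (a hypothetical sketch: the functional form value = utility × years^power and the names below are assumptions, with only the coefficient 0.65 taken from the abstract):

```python
def qaly_value(utility, years, power=0.65):
    """Power QALY model sketch: value = utility * years**power.

    power=1.0 recovers the linear QALY model; powers below 1 weight
    additional years of life less than proportionally.
    """
    return utility * years ** power

# Under the linear model, 10 years at utility 0.8 are worth 8.0 QALYs;
# the power model with coefficient 0.65 assigns a smaller value.
linear = qaly_value(0.8, 10, power=1.0)
curved = qaly_value(0.8, 10)
print(round(linear, 2), round(curved, 2))
```

Because the curved value is below the linear one for long durations, valuations computed linearly from short-duration TTO responses would overstate QALYs, which is the bias the abstract describes.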