68 results for Performance-Based Design
Abstract:
A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detections are operated in the same array, which can be hardware-efficient. The all-swap lattice reduction (ASLR) algorithm is considered for the systolic design. ASLR is a variant of the LLL algorithm that processes all lattice basis vectors within one iteration. Lattice-reduction-aided linear detection based on the ASLR and LLL algorithms has very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
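A minimal sketch of what lattice-reduction-aided linear detection computes (textbook LLL with a compact per-neighbour size reduction, not the paper's ASLR variant or its systolic mapping; the real-valued channel and integer transmit symbols are illustrative assumptions):

import numpy as np

def lll_reduce(H, delta=0.75):
    """Compact LLL: size-reduce each column against its predecessor and
    swap when the Lovasz condition fails. Returns the reduced basis Hr
    and the unimodular matrix T with Hr = H @ T. The paper's ASLR
    variant instead swaps all eligible column pairs in one iteration."""
    Hr = H.astype(float).copy()
    n = Hr.shape[1]
    T = np.eye(n, dtype=int)
    k = 1
    while k < n:
        _, R = np.linalg.qr(Hr)
        # size reduction of column k against column k-1
        mu = int(np.round(R[k - 1, k] / R[k - 1, k - 1]))
        Hr[:, k] -= mu * Hr[:, k - 1]
        T[:, k] -= mu * T[:, k - 1]
        _, R = np.linalg.qr(Hr)
        # Lovasz condition: advance if it holds, otherwise swap and back up
        if delta * R[k - 1, k - 1] ** 2 <= R[k, k] ** 2 + R[k - 1, k] ** 2:
            k += 1
        else:
            Hr[:, [k - 1, k]] = Hr[:, [k, k - 1]]
            T[:, [k - 1, k]] = T[:, [k, k - 1]]
            k = max(k - 1, 1)
    return Hr, T

def lr_zf_detect(H, y):
    """LR-aided zero forcing: equalize in the reduced basis, round to
    integers there, and map back through T."""
    Hr, T = lll_reduce(H)
    z = np.round(np.linalg.pinv(Hr) @ y)
    return T @ z.astype(int)

rng = np.random.default_rng(1)
H = rng.standard_normal((4, 4))
x = rng.integers(-3, 4, size=4)            # integer data, e.g. shifted QAM
y = H @ x + 0.05 * rng.standard_normal(4)  # received signal
print(x, lr_zf_detect(H, y))               # the two coincide at this SNR

Rounding in the reduced, nearly orthogonal basis is what recovers most of the bit-error-rate loss of plain linear detection.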
Abstract:
The paper presents a competence-based instructional design system and a way to personalize navigation through the course content. The navigation aid tool builds on the competence graph and the student model, which includes the elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is best prepared to study. We use fuzzy set theory for dealing with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used for assigning values to linguistic variables. For each competence, the level of difficulty and the level of knowledge of its prerequisites are calculated from the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence regarding its level of recommendation.
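A minimal sketch of this fuzzy recommendation step (the 0-10 scale, membership functions, and rule base are illustrative assumptions, not the paper's): assessment-derived values are fuzzified into linguistic terms, IF-THEN rules are evaluated, and the strongest rule yields the crisp category:

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(value):
    """Membership degrees of a 0-10 value in three linguistic terms."""
    return {"low": tri(value, -1, 0, 5),
            "medium": tri(value, 3, 5, 7),
            "high": tri(value, 5, 10, 11)}

def recommend(prereq_level, difficulty_level):
    """Map two linguistic variables to a crisp recommendation category."""
    prereq, difficulty = fuzzify(prereq_level), fuzzify(difficulty_level)
    # Hypothetical IF-THEN rule base; a rule fires with min() of its antecedents.
    rules = {
        "recommended": min(prereq["high"],
                           max(difficulty["low"], difficulty["medium"])),
        "neutral": min(prereq["medium"], difficulty["medium"]),
        "not_yet": max(prereq["low"],
                       min(prereq["medium"], difficulty["high"])),
    }
    return max(rules, key=rules.get)  # crisp output = strongest rule

print(recommend(prereq_level=8.5, difficulty_level=3.0))  # -> recommended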
Abstract:
When applying a Collaborative Learning Flow Pattern (CLFP) to structure sequences of activities in real contexts, one of the tasks is to organize groups of students according to the constraints imposed by the pattern. Sometimes, unexpected events occurring at runtime force this pre-defined distribution to be changed. In such situations, the group structures need to be adjusted to the new context. If the collaborative pattern is complex, this group redefinition might be difficult and time-consuming to carry out in real time. In this context, technology can help by notifying the teacher of the incompatibilities between the actual context and the constraints imposed by the pattern. This chapter presents a flexible solution for supporting teachers in group organization that profits from the intrinsic constraints defined by CLFPs codified in IMS Learning Design. A prototype of a web-based tool for the TAPPS and Jigsaw CLFPs and the preliminary results of a controlled user study are also presented as a first step towards flexible technological systems that support grouping tasks in this context.
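A minimal sketch of the kind of constraint checking such a tool performs (the data model is a hypothetical simplification, not the chapter's IMS Learning Design encoding): verify the intrinsic Jigsaw constraint that every jigsaw group keeps one expert per topic, and list the incompatibilities the teacher would be notified about after a runtime change:

def jigsaw_violations(groups, expertise, topics):
    """groups: {group_name: [student, ...]};
    expertise: {student: topic}; topics: required topics."""
    violations = []
    for name, members in groups.items():
        covered = {expertise[s] for s in members}
        for topic in topics:
            if topic not in covered:
                violations.append((name, topic))
    return violations

# A student dropping out at runtime leaves group J2 without a T2 expert.
groups = {"J1": ["ana", "ben"], "J2": ["eva", "dan"]}
expertise = {"ana": "T1", "ben": "T2", "eva": "T1", "dan": "T1"}
print(jigsaw_violations(groups, expertise, topics=["T1", "T2"]))
# -> [('J2', 'T2')], i.e. group J2 violates the expert-coverage constraint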
Abstract:
Collaborative activities, in which students actively interact with each other, have been shown to provide significant learning benefits. In Computer-Supported Collaborative Learning (CSCL), these collaborative activities are assisted by technologies. However, the use of computers does not guarantee collaboration, as free collaboration does not necessarily lead to fruitful learning. Therefore, practitioners need to design CSCL scripts that structure the collaborative settings so that they promote learning. However, not all teachers have the technical and pedagogical background needed to design such scripts. With the aim of assisting teachers in designing effective CSCL scripts, we propose a model to support the selection of reusable good practices (formulated as patterns) so that they can be used as a starting point for their own designs. This model is based on a pattern ontology that computationally represents the knowledge captured in a pattern language for the design of CSCL scripts. A preliminary evaluation of the proposed approach is provided with two examples based on a set of meaningful interrelated patterns computationally represented with the pattern ontology, and a paper-prototyping experience carried out with two teachers. The results offer interesting insights towards the implementation of the pattern ontology in software tools.
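A minimal sketch of the underlying idea of computationally represented, interrelated patterns (the schema and relations are hypothetical and far simpler than an ontology): starting from a teacher's chosen pattern, related patterns are suggested as a starting point for the design:

patterns = {
    "Jigsaw": {
        "intent": "solve a problem that can be split into subproblems",
        "refined_by": ["ExpertGroup", "JigsawGroup"],
    },
    "Pyramid": {
        "intent": "reach agreement on a common solution",
        "refined_by": [],
    },
    "ExpertGroup": {"intent": "develop expertise on one subproblem",
                    "refined_by": []},
    "JigsawGroup": {"intent": "integrate the subproblem expertise",
                    "refined_by": []},
}

def suggest(starting_pattern):
    """Breadth-first walk over the 'refined_by' relation: collect the
    patterns a teacher could reuse together with the starting choice."""
    seen, queue = [], [starting_pattern]
    while queue:
        p = queue.pop(0)
        if p in seen:
            continue
        seen.append(p)
        queue.extend(patterns[p]["refined_by"])
    return seen

print(suggest("Jigsaw"))  # -> ['Jigsaw', 'ExpertGroup', 'JigsawGroup']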
Abstract:
Cost systems have been shown to have developed considerably in recent years, and activity-based costing (ABC) has been shown to be a contribution to cost management, particularly in service businesses. The public sector is composed to a very great extent of service functions, yet considerably less has been reported of the use of ABC to support cost management in this sector. In Spain, cost systems are essential for city councils as they are obliged to calculate the cost of the services subject to taxation (e.g. waste collection). City councils must have a cost system in place to calculate the cost of services, as they are legally required not to profit from these services. This paper examines the development of systems to support cost management in the Spanish public sector. Through semi-structured interviews with 28 subjects within one city council, it presents a case study of cost management. The paper contains extracts from the interviews, and a number of factors are identified which contribute to the successful development of the cost management system. Following the case study, a number of other city councils were identified where activity-based techniques had either failed or stalled. Based on the factors identified in the single case study, a further enquiry is reported. The paper includes a summary using statistical analysis which draws attention to change management, funding and political incentives as factors which had an influence on system success or failure.
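For readers unfamiliar with ABC, a minimal illustration of the two-stage allocation it relies on (all figures are invented, not the case-study data): overhead is first traced to activity cost pools, then charged to services through cost-driver rates:

# Activity cost pools: annual overhead traced to each activity
activity_cost = {"route_planning": 120_000, "collection": 480_000,
                 "disposal": 200_000}

# Cost-driver volumes consumed by each taxed service
driver_use = {
    "household_waste": {"route_planning": 300, "collection": 9_000,
                        "disposal": 4_000},
    "commercial_waste": {"route_planning": 100, "collection": 3_000,
                         "disposal": 6_000},
}

# Rate per driver unit = activity cost / total driver volume
total_use = {a: sum(s[a] for s in driver_use.values()) for a in activity_cost}
rate = {a: activity_cost[a] / total_use[a] for a in activity_cost}

# Service cost = sum over activities of driver units consumed x driver rate
for service, use in driver_use.items():
    cost = sum(use[a] * rate[a] for a in activity_cost)
    print(f"{service}: {cost:,.0f}")
# household_waste: 530,000 and commercial_waste: 270,000, which together
# exhaust the 800,000 overhead, the figure a council would recover via fees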
Abstract:
The emphasis on integrated care implies new incentives that promote coordination between levels of care. Considering a population as a whole, the resource allocation system has to adapt to this environment. This research aims to design a model that allows for morbidity-related prospective and concurrent capitation payment. The model can be applied in publicly funded health systems and managed competition settings. Methods: We analyze the application of hybrid risk adjustment versus either prospective or concurrent risk adjustment formulae in the context of funding total health expenditures for the population of an integrated healthcare delivery organization in Catalonia during the years 2004 and 2005. Results: The hybrid model reimburses integrated care organizations while avoiding excessive risk transfer and maximizing incentives for efficiency in provision. At the same time, it eliminates incentives for risk selection for a specific set of high-risk individuals through the use of concurrent reimbursement, in order to assure a proper classification of patients. Conclusion: Prospective risk adjustment is used to transfer financial risk to the health provider and therefore provide incentives for efficiency. Within the context of a National Health System, such a transfer of financial risk is illusory, and the government has to cover the deficits. Hybrid risk adjustment is useful to provide the right combination of incentives for efficiency and an appropriate level of risk transfer for integrated care organizations.
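A minimal sketch of a hybrid capitation rule of this shape (the rate, risk scores, and high-risk set are illustrative assumptions, not the paper's estimates): most individuals are paid prospectively from prior-year morbidity, while a designated high-risk set is reimbursed concurrently, which removes the incentive to avoid those patients:

def hybrid_capitation(population, rate, high_risk):
    """population: list of dicts with 'id', 'prior_score' and
    'current_score' (morbidity-based relative-risk weights, mean 1.0);
    rate: base payment per risk unit; high_risk: ids paid concurrently."""
    payments = {}
    for person in population:
        if person["id"] in high_risk:
            # concurrent: risk observed in the payment year itself, so
            # there is no financial gain in selecting against this patient
            payments[person["id"]] = rate * person["current_score"]
        else:
            # prospective: prior-year risk, transferring financial risk
            # (and the efficiency incentive) to the provider
            payments[person["id"]] = rate * person["prior_score"]
    return payments

population = [
    {"id": "a", "prior_score": 0.8, "current_score": 0.9},
    {"id": "b", "prior_score": 1.1, "current_score": 4.2},  # became high cost
]
print(hybrid_capitation(population, rate=1000.0, high_risk={"b"}))
# -> {'a': 800.0, 'b': 4200.0}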
Abstract:
Agent-based computational economics is becoming widely used in practice. This paper explores the consistency of some of its standard techniques. We focus in particular on prevailing wholesale electricity trading simulation methods. We include different supply and demand representations and propose the Experience-Weighted Attraction method to include several behavioural algorithms. We compare the results across assumptions and against economic theory predictions. The match is good under best-response and reinforcement learning, but not under fictitious play. The simulations perform well under flat and upward-sloping supply bidding, and also for plausible demand elasticity assumptions. Learning is influenced by the number of bids per plant and by the initial conditions. The overall conclusion is that agent-based simulation assumptions are far from innocuous. We link their performance to underlying features, and identify those that are better suited to model wholesale electricity markets.
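A minimal Experience-Weighted Attraction learner in the standard Camerer-Ho form (a generic sketch with placeholder parameters, not the paper's calibration); the delta parameter interpolates between reinforcement learning (delta = 0) and belief-based, fictitious-play-style learning (delta = 1, phi = rho), the regimes the paper compares:

import numpy as np

class EWA:
    def __init__(self, n_actions, phi=0.9, rho=0.9, delta=0.5, lam=2.0):
        self.A = np.zeros(n_actions)   # attractions
        self.N = 1.0                   # experience weight
        self.phi, self.rho, self.delta, self.lam = phi, rho, delta, lam

    def choose(self, rng):
        """Logit response: choice probabilities from the attractions."""
        p = np.exp(self.lam * self.A)
        p /= p.sum()
        return rng.choice(len(self.A), p=p)

    def update(self, chosen, payoffs):
        """payoffs[j] = payoff action j would have earned this round."""
        n_prev = self.N
        self.N = self.rho * n_prev + 1.0
        chosen_mask = np.eye(len(self.A))[chosen]
        # foregone payoffs are discounted by delta, realized ones are not
        weight = self.delta + (1.0 - self.delta) * chosen_mask
        self.A = (self.phi * n_prev * self.A + weight * payoffs) / self.N

rng = np.random.default_rng(0)
agent = EWA(n_actions=3)
for _ in range(100):
    a = agent.choose(rng)
    agent.update(a, payoffs=np.array([1.0, 2.0, 0.5]))  # toy stationary game
print(agent.A.round(2))  # attractions concentrate on the best action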
Abstract:
We continue the development of a method for the selection of a bandwidth or a number of design parameters in density estimation. We provide explicit non-asymptotic density-free inequalities that relate the $L_1$ error of the selected estimate with that of the best possible estimate, and study in particular the connection between the richness of the class of density estimates and the performance bound. For example, our method allows one to pick the bandwidth and kernel order in the kernel estimate simultaneously and still assure that, for {\it all densities}, the $L_1$ error of the corresponding kernel estimate is not larger than about three times the error of the estimate with the optimal smoothing factor and kernel, plus a constant times $\sqrt{\log n/n}$, where $n$ is the sample size and the constant only depends on the complexity of the family of kernels used in the estimate. Further applications include multivariate kernel estimates, transformed kernel estimates, and variable kernel estimates.
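Schematically, the quoted guarantee is an oracle inequality of the following form (the notation is assumed for illustration, not copied from the paper: $f$ is the unknown density, $f_{n,\theta}$ the kernel estimate with smoothing parameters $\theta = (h, K)$ ranging over a class $\Theta$, and $\hat\theta$ the data-driven selection):

\[
  \mathbb{E}\,\| f_{n,\hat\theta} - f \|_1
  \;\le\; 3\, \inf_{\theta \in \Theta} \mathbb{E}\,\| f_{n,\theta} - f \|_1
  \;+\; C \sqrt{\frac{\log n}{n}},
\]

with $C$ depending only on the complexity of the family of kernels, uniformly over all densities $f$.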
Abstract:
The proposal to work on this final project came after several discussions held with Dr. Elzbieta Malinowski Gadja, who in 2008 published the book entitled Advanced Data Warehouse Design: From Conventional to Spatial and Temporal Applications (Data-Centric Systems and Applications). The project was carried out under the technical supervision of Dr. Malinowski, and the direct beneficiary was the University of Costa Rica (UCR), where Dr. Malinowski is a professor at the Department of Computer Science and Informatics. The purpose of this project was twofold: first, to translate chapter III of said book with the intention of generating educational material for the use of the UCR and, second, to venture into the field of technical translation related to data warehousing. For the first component, the goal was to generate a final product that would eventually serve as an educational tool for the post-graduate courses of the UCR. For the second component, this project allowed me to acquire new skills and put into practice techniques that have helped me not only to perform better in my current job as an Assistant Translator at the Inter-American Development Bank (IDB), but also to use them in similar projects. The process was lengthy and required thorough research and constant communication with the author. The investigation focused on the search for terms and definitions to prepare the glossary, which was the basis for starting the translation project. The translation process itself was carried out in phases, so that comments and corrections by the author could be taken into account in subsequent stages. Later, based on the glossary and the translated text, the illustrations that had been created in the Visio software were translated. In addition to the technical revision by the author, professor Carme Mangiron was in charge of revising the non-technical text. The result was a high-quality document that is currently used as reference and study material by the Department of Computer Science and Informatics of the UCR.
Abstract:
This paper presents a case study that explores the advantages that can be derived from the use of a design support system during the design of wastewater treatment plants (WWTP). With this objective in mind, a simplified but plausible WWTP design case study has been generated with KBDS, a computer-based support system that maintains a historical record of the design process. The study shows how, by employing such a historical record, it is possible to: (1) rank different design proposals responding to a design problem; (2) study the influence of changing the weight of the arguments used in the selection of the most adequate proposal; (3) take advantage of keywords to assist the designer in the search for specific items within the historical records; (4) automatically evaluate the compliance of alternative design proposals with respect to the design objectives; (5) verify the validity of previous decisions after the modification of the current constraints or specifications; (6) re-use the design records when upgrading an existing WWTP or when designing similar facilities; (7) generate documentation of the decision-making process; and (8) associate a variety of documents as annotations to any component in the design history. The paper also shows one possible future role of design support systems as they outgrow their current reactive role as repositories of historical information and start to proactively support the generation of new knowledge during the design process.
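A minimal sketch of capabilities (1) and (2) above (the scoring model and numbers are hypothetical, not KBDS's argumentation model): proposals are ranked by the weighted sum of the arguments for and against them, so changing an argument's weight can re-order them:

def rank_proposals(proposals, weights):
    """proposals: {name: {argument: score}}, score in [-1, 1]
    (negative = argument against); weights: {argument: importance}."""
    totals = {
        name: sum(weights[arg] * score for arg, score in args.items())
        for name, args in proposals.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

proposals = {
    "activated_sludge": {"effluent_quality": 0.9, "capital_cost": -0.6},
    "trickling_filter": {"effluent_quality": 0.5, "capital_cost": -0.2},
}
print(rank_proposals(proposals, {"effluent_quality": 2.0, "capital_cost": 1.0}))
# -> activated_sludge ranks first; raising the cost weight to 3.0 flips
#    the order, which is exactly the sensitivity study of point (2)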
Abstract:
A method of making a multiple matched filter that allows the recognition of different characters in successive planes under simple conditions is proposed. The filter is generated by recording on the same plate the Fourier transforms of the different patterns to be recognized, each affected by a different spherical phase factor because the patterns were placed at different distances from the lens. This is demonstrated by experiments with a triple filter, which achieves satisfactory recognition of three characters.
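A minimal numerical sketch of the multiplexing idea (an idealized scalar model, not the reported optical experiment; the grid size, toy patterns, and phase coefficients are assumptions): each reference spectrum is stored with a different quadratic "spherical" phase, and a sharp correlation peak is recovered only when the readout compensates the matching phase, i.e. in that pattern's output plane:

import numpy as np

N = 256
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]

def quadratic_phase(alpha):
    """Spherical phase factor from placing a pattern at a different
    distance from the lens; alpha encodes that defocus."""
    return np.exp(1j * alpha * (x ** 2 + y ** 2))

def make_pattern(seed):
    """Toy zero-mean 'character' confined to a 64x64 window."""
    rng = np.random.default_rng(seed)
    p = np.zeros((N, N))
    p[96:160, 96:160] = rng.standard_normal((64, 64))
    return p

patterns = [make_pattern(s) for s in range(3)]
alphas = [0.0, 5e-4, 1e-3]

# Multiple matched filter: conjugate spectra of all patterns recorded on
# the same plate, each tagged with its own quadratic phase.
H = sum(np.conj(np.fft.fft2(p)) * quadratic_phase(a)
        for p, a in zip(patterns, alphas))

# Present pattern 1 at the input; compensating a given phase factor is
# the scalar analogue of observing a given output plane.
S = np.fft.fft2(patterns[1])
for a in alphas:
    out = np.fft.ifft2(S * H * np.conj(quadratic_phase(a)))
    print(f"alpha={a:.0e}  peak={np.abs(out).max():.0f}")
# the peak is by far the largest for the alpha recorded with pattern 1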
Abstract:
We report the design and validation of simple magnetic tweezers for oscillating ferromagnetic beads in the piconewton and nanometer scales. The system is based on a single pair of coaxial coils operating in two sequential modes: permanent magnetization of the beads through a large and brief pulse of magnetic field, and generation of magnetic gradients to produce uniaxial oscillatory forces. By using this two-step method, the magnetic moment of the beads remains constant during measurements. Therefore, the applied force can be computed and varies linearly with the driving signal. No feedback control is required to produce well-defined force oscillations over a wide bandwidth. The design of the coils was optimized to obtain high magnetic fields (280 mT) and gradients (2 T/m) with high homogeneity (5% variation) within the sample. The magnetic tweezers were implemented in an inverted optical microscope with a videomicroscopy-based multiparticle tracking system. The apparatus was validated with 4.5 µm magnetite beads, obtaining forces up to ~2 pN and subnanometer resolution. The applicability of the device includes microrheology of biopolymers and cell cytoplasm, molecular mechanics, and mechanotransduction in living cells.
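A minimal sketch of why the applied force is computable and linear in the drive (the moment and coil calibration below are assumed values, chosen only to be consistent with the reported ~2 pN at 2 T/m): once the magnetizing pulse fixes the bead's remanent moment, the force is simply moment times field gradient:

MOMENT = 1.0e-12          # A*m^2, assumed remanent moment after the pulse
GRADIENT_PER_VOLT = 0.5   # (T/m)/V, assumed coil calibration (2 T/m at 4 V)

def force_pN(drive_volts):
    """Uniaxial force on the bead, F = m * dB/dz, in piconewtons."""
    gradient = GRADIENT_PER_VOLT * drive_volts  # T/m
    return MOMENT * gradient * 1e12             # N -> pN

for v in (1.0, 2.0, 4.0):
    print(f"{v:.0f} V -> {force_pN(v):.1f} pN")
# 0.5, 1.0 and 2.0 pN: linear in the driving signal, reaching ~2 pN at
# the maximum 2 T/m gradient, with no feedback loop involved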
Abstract:
InAlAs/InGaAs/InP-based high electron mobility transistor devices have been structurally and electrically characterized using transmission electron microscopy, Raman spectroscopy, and Hall mobility measurements. The lattice-matched InGaAs channels, with an In molar fraction of 53%, grown at temperatures lower than 530 °C exhibit alloy decomposition driving an anisotropic InGaAs surface roughness oriented along [1-10]. Conversely, lattice-mismatched channels with an In molar fraction of 75% do not present this lateral decomposition but rather a strain-induced roughness, which grows stronger as the channel growth temperature increases beyond 490 °C. In both cases the presence of the roughness implies low and anisotropic Hall mobilities of the two-dimensional electron gas.
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
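A minimal numerical check of the claim for Grover's algorithm (standard textbook Grover, not the paper's derivation): at each iteration, the probability distribution over computational-basis outcomes majorizes the previous one:

import numpy as np

def majorizes(p, q):
    """True if p majorizes q: the partial sums of p sorted in
    decreasing order dominate those of q."""
    ps = np.cumsum(np.sort(p)[::-1])
    qs = np.cumsum(np.sort(q)[::-1])
    return np.all(ps >= qs - 1e-12)

n, target = 8, 3                      # 8-item search space, marked item 3
state = np.full(n, 1 / np.sqrt(n))    # uniform superposition
prev = np.abs(state) ** 2
for step in range(2):                 # ~ (pi/4) * sqrt(8) iterations
    state[target] *= -1               # oracle: flip the marked amplitude
    state = 2 * state.mean() - state  # diffusion: inversion about the mean
    prob = np.abs(state) ** 2
    print(f"step {step + 1}: P(target) = {prob[target]:.3f},",
          "majorizes previous:", majorizes(prob, prev))
    prev = prob
# P(target) climbs 0.125 -> 0.781 -> 0.945 and each step majorizes the last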