933 results for non-trivial data structures


Relevance:

100.00%

Publisher:

Abstract:

This paper describes the process of wrapping existing scientific codes in the domain of plasma physics simulations using Sun's Java Native Interface (JNI). We have created a Java front end for particular functionality offered by legacy native libraries, in order to achieve reusability and interoperability without rewriting those libraries. The technique introduced in this paper includes two approaches: one-to-one mapping for wrapping individual native functions, and peer classes for wrapping native data structures.
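The two approaches can be sketched as follows. This is a minimal illustration with hypothetical names; in the real wrapper the methods below would be declared `native` and backed by a JNI library loaded via `System.loadLibrary`, and the peer class would hold an opaque handle to the native struct.

```java
public class PlasmaWrapper {

    // Approach 1: one-to-one mapping. Each native function is exposed as a
    // static Java method with a matching signature. In the real wrapper this
    // would be `static native double computeDensity(...)`; it is stubbed in
    // pure Java here so the sketch is self-contained.
    static double computeDensity(double temperature, double pressure) {
        return pressure / temperature; // stand-in for the native computation
    }

    // Approach 2: a peer class mirrors a native data structure. A real peer
    // would hold an opaque handle (e.g. a `long` pointer) to the C struct
    // and delegate its methods to native calls.
    static class GridPeer {
        private final int nx, ny; // mirrored fields of the native grid
        GridPeer(int nx, int ny) { this.nx = nx; this.ny = ny; }
        int cellCount() { return nx * ny; }
    }

    public static void main(String[] args) {
        System.out.println(computeDensity(2.0, 6.0));       // 3.0
        System.out.println(new GridPeer(4, 5).cellCount()); // 20
    }
}
```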

Relevance:

100.00%

Publisher:

Abstract:

In this article we review the means for visualizing the syntax, semantics, and source code of programming languages that support the procedural and/or object-oriented paradigm. We examine how the structure of source code in the structural and object-oriented programming styles has influenced different approaches to teaching them. We maintain a thesis, valid for the object-oriented programming paradigm, that the design and programming of classes are done by the same specialist, whose training should therefore include design as well as programming skills and knowledge of modeling abstract data structures. We ask how the high level of abstraction of the object-oriented paradigm can be presented in a simple model at the design stage, so that complexity at the programming stage stays low and is easily learnable. We answer this question by building models in UML notation, taking a concrete example from teaching practice that includes programming techniques for inheritance and polymorphism.
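A minimal example of the kind of teaching material discussed above (the class names are ours, not taken from the article): inheritance fixes the contract designed in the UML model, and polymorphism dispatches each call to the subclass implementation at run time.

```java
// Class names are illustrative, not the article's.
abstract class Shape {
    abstract double area(); // the contract, designed in the UML model first
}

class Rectangle extends Shape {
    final double w, h;
    Rectangle(double w, double h) { this.w = w; this.h = h; }
    @Override
    double area() { return w * h; }
}

// Inheritance in the problem domain: a square is a special rectangle.
class Square extends Rectangle {
    Square(double side) { super(side, side); }
}

public class ShapesDemo {
    public static void main(String[] args) {
        Shape[] shapes = { new Rectangle(2, 3), new Square(4) };
        double total = 0;
        for (Shape s : shapes) {
            total += s.area(); // polymorphic dispatch to the subclass
        }
        System.out.println(total); // 22.0
    }
}
```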

Relevance:

100.00%

Publisher:

Abstract:

We have devised a general scheme that reveals multiple duality relations valid for all multi-channel Luttinger liquids. The relations are universal and should be used for establishing phase diagrams and searching for new non-trivial phases in low-dimensional strongly correlated systems. The technique developed provides a universal correspondence between the scaling dimensions of local perturbations in different phases. These multiple relations between scaling dimensions lead to a connection between different inter-phase boundaries on the phase diagram. The dualities, in particular, constrain the phase diagram and allow predictions of the emergence and observation of new phases without explicit model-dependent calculations. As an example, we demonstrate the impossibility of a non-trivial phase existing for fermions coupled to phonons in one dimension. © 2013 EPLA.

Relevance:

100.00%

Publisher:

Abstract:

This paper analyzes difficulties with the introduction of object-oriented concepts in introductory computing education and then proposes a two-language, two-paradigm curriculum model that alleviates these difficulties. The model begins with teaching imperative programming using the Python programming language, continues with teaching object-oriented computing using Java, and concludes with teaching object-oriented data structures in Java.
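To illustrate the progression (an example devised for this note, not taken from the paper), here is the same small task written first in the imperative style the Python stage emphasizes, then as an object-oriented data structure of the kind the Java stages cover; both are shown in Java so the contrast is purely one of paradigm.

```java
public class TwoParadigms {

    // Imperative style (what the first stage teaches): a procedure over data.
    static int sum(int[] xs) {
        int total = 0;
        for (int x : xs) total += x;
        return total;
    }

    // Object-oriented style (the later stages): state and behavior bundled
    // into a class, here a tiny growable integer collection.
    static class IntBag {
        private int[] data = new int[4];
        private int size = 0;

        void add(int x) {
            if (size == data.length) data = java.util.Arrays.copyOf(data, 2 * size);
            data[size++] = x;
        }

        int sum() {
            int total = 0;
            for (int i = 0; i < size; i++) total += data[i];
            return total;
        }
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[] {1, 2, 3})); // 6
        IntBag bag = new IntBag();
        bag.add(1); bag.add(2); bag.add(3);
        System.out.println(bag.sum());                // 6
    }
}
```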

Relevance:

100.00%

Publisher:

Abstract:

AMS subject classification: 68Q22, 90C90

Relevance:

100.00%

Publisher:

Abstract:

Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2016

Relevance:

100.00%

Publisher:

Abstract:

We determine the endogenous order of moves in which the firms set their prices in the framework of a capacity-constrained Bertrand-Edgeworth triopoly. A three-period timing game that determines the period in which each firm announces its price precedes the price-setting stage. We show for the non-trivial case (in which the Bertrand-Edgeworth triopoly has an equilibrium only in non-degenerate mixed strategies) that the firm with the largest capacity sets its price first, while the other two firms set their prices later. Our result extends a finding by Deneckere and Kovenock (1992) from duopolies to triopolies. This extension was made possible by Hirata's (2009) recent advancements on the mixed-strategy equilibria of Bertrand-Edgeworth games.

Relevance:

100.00%

Publisher:

Abstract:

The paper addresses questions concerning the use of intensity-based modeling in the pricing of credit derivatives. As specifying the distribution of the loss process is a non-trivial exercise, the well-known technique for this task utilizes the inversion of the Laplace transform. A popular choice of model is the class of doubly stochastic processes, since their Laplace transforms can be determined easily. Unfortunately, these processes lack several key features supported by empirical observations; e.g., they cannot replicate the self-exciting nature of defaults. The aim of the paper is to show that, by using an appropriate change of measure, the Laplace transform of the compound loss and default process can be calculated not only for a doubly stochastic process, but for an arbitrary point process with intensity as well. To support the application of the technique, we investigate the effect of the change of measure on the stochastic nature of the underlying process.
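For orientation, the standard result that makes doubly stochastic (Cox) processes convenient (textbook background, not a formula taken from the paper): conditionally on the intensity path the counting process is Poisson, so the Laplace transform of the compound loss is explicit.

```latex
% For a Cox process $N$ with intensity $(\lambda_s)$ and compound loss
% $L_t = \sum_{i=1}^{N_t} X_i$, the $X_i$ i.i.d.\ and independent of $N$:
\mathbb{E}\!\left[e^{-u L_t}\right]
  = \mathbb{E}\!\left[\exp\!\big(-\Lambda_t\,(1-\varphi_X(u))\big)\right],
\qquad
\Lambda_t = \int_0^t \lambda_s \,\mathrm{d}s,
\quad
\varphi_X(u) = \mathbb{E}\!\left[e^{-u X_1}\right].
```

For a general point process with intensity, no such closed conditional form exists, which is why the paper's change-of-measure argument is needed.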

Relevance:

100.00%

Publisher:

Abstract:

Computers have dramatically changed the way we live, conduct business, and deliver education. They have infiltrated the Bahamian public school system to the extent that many educators now feel the need for a national plan. The development of such a plan is a challenging undertaking, especially in developing countries where physical, financial, and human resources are scarce. This study assessed the situation with regard to computers within the Bahamian public school system, and provided recommended guidelines to the Bahamian government based on the results of a survey, the body of knowledge about trends in computer usage in schools, and the country's needs. This was a descriptive study for which an extensive review of literature in the areas of computer hardware, software, teacher training, research, curriculum, support services, and local context variables was undertaken. One objective of the study was to establish what should or could be, relative to the state of the art in educational computing. A survey was conducted involving 201 teachers and 51 school administrators from 60 randomly selected Bahamian public schools. A random stratified cluster sampling technique was used. This study used both quantitative and qualitative research methodologies. Quantitative methods were used to summarize the data about numbers and types of computers, categories of software available, peripheral equipment, and related topics through the use of forced-choice questions in a survey instrument. Results were displayed in tables and charts. Qualitative methods, data synthesis and content analysis, were used to analyze the non-numeric data obtained from open-ended questions on teachers' and school administrators' questionnaires, such as those regarding teachers' perceptions and attitudes about computers and their use in classrooms. Also, interpretative methodologies were used to analyze the qualitative results of several interviews conducted with senior public school system officials. Content analysis was used to gather data from the literature on topics pertaining to the study. Based on the literature review and the data gathered for this study, a number of recommendations are presented. These recommendations may be used by the government of the Commonwealth of The Bahamas to establish policies with regard to the use of computers within the public school system.

Relevance:

100.00%

Publisher:

Abstract:

The Semantic Binary Data Model (SBM) is a viable alternative to the now-dominant relational data model. SBM would be especially advantageous for applications dealing with complex interrelated networks of objects, provided that a robust, efficient implementation can be achieved. This dissertation presents an implementation design method for SBM, algorithms, and their analytical and empirical evaluation. Our method allows building a robust and flexible database engine with a wider applicability range and improved performance. Extensions to SBM are introduced, and an implementation of these extensions is proposed that allows the database engine to efficiently support applications with a predefined set of queries. A new Record data structure is proposed. Trade-offs of employing Fact, Record, and Bitmap data structures for storing information in a semantic database are analyzed. A clustering ID distribution algorithm and an efficient algorithm for object ID encoding are proposed. Mapping to an XML data model is analyzed, and a new XML-based XSDL language facilitating interoperability of the system is defined. Solutions to issues associated with making the database engine multi-platform are presented. An improvement to the atomic update algorithm suitable for certain scenarios of database recovery is proposed. Specific guidelines are devised for implementing a robust and well-performing database engine based on the extended Semantic Data Model.
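As a rough illustration of fact-based storage in a semantic database (names and layout are ours; the dissertation's actual Fact, Record, and Bitmap structures are more elaborate), facts can be kept as sorted (object, relation, value) triples so that queries reduce to range scans over the sorted order:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.TreeSet;

public class FactStore {

    // A fact: (object ID, relation, value), ordered lexicographically so
    // that all facts about one object's relation are contiguous.
    static final class Fact implements Comparable<Fact> {
        final long object;
        final String relation;
        final String value;

        Fact(long object, String relation, String value) {
            this.object = object;
            this.relation = relation;
            this.value = value;
        }

        @Override
        public int compareTo(Fact o) {
            int c = Long.compare(object, o.object);
            if (c != 0) return c;
            c = relation.compareTo(o.relation);
            return c != 0 ? c : value.compareTo(o.value);
        }
    }

    private final TreeSet<Fact> facts = new TreeSet<>();

    void store(long object, String relation, String value) {
        facts.add(new Fact(object, relation, value));
    }

    /** All values of `relation` for `object`, via one sorted range scan. */
    List<String> query(long object, String relation) {
        List<String> out = new ArrayList<>();
        for (Fact f : facts.tailSet(new Fact(object, relation, ""))) {
            if (f.object != object || !f.relation.equals(relation)) break;
            out.add(f.value);
        }
        return out;
    }

    public static void main(String[] args) {
        FactStore db = new FactStore();
        db.store(1L, "color", "red");
        db.store(1L, "color", "blue");
        db.store(2L, "color", "green");
        System.out.println(db.query(1L, "color")); // [blue, red]
    }
}
```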

Relevance:

100.00%

Publisher:

Abstract:

Proofs by induction are central to many computer science areas such as data structures, theory of computation, programming languages, program efficiency (time complexity), and program correctness. Proofs by induction can also improve students' understanding of and performance with computer science concepts such as programming languages, algorithm design, and recursion, as well as serve as a medium for teaching them. Even though students are exposed to proofs by induction in many courses of their curricula, they still have difficulties understanding and performing them. This impacts the whole course of their studies, since proofs by induction are omnipresent in computer science. Specifically, students do not gain a conceptual understanding of induction early in the curriculum and, as a result, have difficulties applying it to more advanced areas later in their studies. The goal of my dissertation is twofold: (1) identifying sources of computer science students' difficulties with proofs by induction, and (2) developing a new approach to teaching proofs by induction by way of an interactive and multimodal electronic book (e-book). For the first goal, I undertook a study to identify possible sources of computer science students' difficulties with proofs by induction. Its results suggest that there is a close correlation between students' understanding of inductive definitions and their understanding and performance of proofs by induction. For designing and developing my e-book, I took into consideration the results of my study, as well as the drawbacks of the current methodologies of teaching proofs by induction for computer science. I designed my e-book to be used as a standalone and complete educational environment. I also conducted a study on the effectiveness of my e-book in the classroom.
The results of that study suggest that, unlike the current methodologies of teaching proofs by induction for computer science, my e-book helped students overcome many of their difficulties and gain a conceptual understanding of proofs by induction.
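As a canonical instance of the kind of proof the dissertation concerns (standard textbook material, not an example taken from the e-book):

```latex
\textbf{Claim.} For all $n \ge 1$, $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$.

\textbf{Base case.} For $n = 1$: $\sum_{i=1}^{1} i = 1 = \frac{1 \cdot 2}{2}$.

\textbf{Inductive step.} Assume the claim holds for some $n = k$. Then
\[
  \sum_{i=1}^{k+1} i
    = \sum_{i=1}^{k} i + (k+1)
    = \frac{k(k+1)}{2} + (k+1)
    = \frac{(k+1)(k+2)}{2},
\]
so the claim holds for $n = k+1$, and by induction for all $n \ge 1$.
```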

Relevance:

100.00%

Publisher:

Abstract:

This dissertation analyzes the obstacles to further cooperation in international economic relations. The first essay explains the gradual nature of trade liberalization. I show that the existence of asymmetric information between governments provides a sufficient reason for gradualism to exist. Governments prefer starting small to reduce the cost of a partner's betrayal when there is a sufficient degree of information asymmetry regarding the partner's type. Learning about the partner's incentive structure enhances expectations, encouraging governments to increase their current level of cooperation. Specifically, the uninformed government's subjective belief that the trading partner is good improves as the partner acts cooperatively. This updated belief, in turn, lowers the subjective probability of future betrayal, enabling further progress in cooperation. The second essay analyzes the relationship between two countries facing two policy dilemmas in an environment with two-way goods and capital flows. When issues are independent and countries are symmetric, signing separate agreements for tariffs (Free Trade Agreements, FTAs) and for taxes (Tax Treaties, TTs) provides an identical level of enforcement as signing a linked agreement. However, linkage can still improve joint welfare by transferring slack enforcement power in the case of asymmetric issues or countries. I report non-results in two cases where the policy issues are interconnected through the technological spillover effect of FDI. Moreover, I show that linking the agreements actually reduces enforcement when agreements are linked under a limited punishment rule and the policy variables are strategic substitutes. The third essay investigates the welfare and enforcement consequences of linking trade and environmental agreements. In the standard literature, linking the agreements generates non-trivial results only when there is a structural relation between the issues. I focus on the institutional design of the linkage and show that, even if the environmental aspects of international trade are negligible, linking the agreements might still have interesting welfare implications under current GATT rules. Specifically, when traded goods are substitutes in consumption, linking the environmental agreement with the trade agreement under the Withdrawal of Equivalent Concession Rule (Article XXVIII) will reduce enforcement. However, enforcement of the environmental issue increases when the same rule is implemented in the absence of linkage.

Relevance:

100.00%

Publisher:

Abstract:

Developing scientifically credible tools for measuring the success of ecological restoration projects is a difficult and non-trivial task. Yet, reliable measures of the general health and ecological integrity of ecosystems are critical for assessing the success of restoration programs. The South Florida Ecosystem Restoration Task Force (Task Force), which helps coordinate a multi-billion-dollar, multi-organizational effort between federal, state, local, and tribal governments to restore the Florida Everglades, is using a small set of system-wide ecological indicators to assess the restoration efforts. A team of scientists and managers identified eleven ecological indicators from a field of several hundred through a selection process using 12 criteria to determine their applicability as part of a system-wide suite. The 12 criteria are: (1) Is the indicator relevant to the ecosystem? (2) Does it respond to variability at a scale that makes it applicable to the entire system? (3) Is the indicator feasible to implement, and is it measurable? (4) Is the indicator sensitive to system drivers, and is it predictable? (5) Is the indicator interpretable in a common language? (6) Are there situations where an optimistic trend with regard to an indicator might suggest a pessimistic restoration trend? (7) Are there situations where a pessimistic trend with regard to an indicator may be unrelated to restoration activities? (8) Is the indicator scientifically defensible? (9) Can clear, measurable targets be established for the indicator to allow for assessments of success? (10) Does the indicator have enough specificity to be able to result in corrective action? (11) What level of ecosystem process or structure does the indicator address? (12) Does the indicator provide early warning signs of ecological change?
In addition, a two-page stoplight report card was developed to assist in communicating the complex science inherent in ecological indicators in a common language for resource managers, policy makers, and the public. The report card employs a universally understood stoplight symbol: green indicates that targets are being met; yellow indicates that targets have not been met and corrective action may be needed; and red indicates that targets are far from being met and corrective action is required. This paper presents the scientific process and the results of the development and selection of the criteria, the indicators, and the stoplight report card format and content. The detailed process and results for the individual indicators are presented in companion papers in this special issue of Ecological Indicators.
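The stoplight mapping can be sketched as follows; the numeric thresholds here are hypothetical, since the actual report card assigns colors by expert assessment of each indicator against its targets.

```java
public class Stoplight {

    enum Color { GREEN, YELLOW, RED }

    /**
     * Maps the fraction of an indicator's target achieved to a stoplight
     * color. The 0.9 and 0.5 cutoffs are illustrative placeholders.
     */
    static Color grade(double fractionOfTarget) {
        if (fractionOfTarget >= 0.9) return Color.GREEN;  // targets being met
        if (fractionOfTarget >= 0.5) return Color.YELLOW; // corrective action may be needed
        return Color.RED;                                 // corrective action required
    }

    public static void main(String[] args) {
        System.out.println(grade(0.95)); // GREEN
        System.out.println(grade(0.70)); // YELLOW
        System.out.println(grade(0.20)); // RED
    }
}
```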


Relevance:

100.00%

Publisher:

Abstract:

The performance of algorithms for fault location in transmission lines is directly related to the accuracy of their input data. Thus, factors such as errors in the line parameters, failures in the synchronization of oscillographic records, and errors in measurements of voltage and current can significantly influence the accuracy of algorithms that use such bad data to indicate the fault location. This work presents a new methodology for fault location in transmission lines based on the theory of state estimation, in order to determine the location of faults more accurately by considering realistic systematic errors that may be present in measurements of voltage and current. The methodology was implemented in two stages: pre-fault and post-fault. In the first stage, assuming non-synchronized data, the synchronization angle and the positive-sequence line parameters are estimated; in the second, the fault distance is estimated. Besides calculating the most likely fault distance obtained from the measurements, the variance associated with that distance is also determined using the theory of errors. This is one of the main contributions of this work, since, with the proposed algorithm, it is possible to determine a most likely zone of fault incidence with approximately 95.45% confidence. Tests for evaluation and validation of the proposed algorithm were carried out on actual fault records and on simulations of fictitious transmission systems using the ATP software. The obtained results show that the proposed estimation approach works even when adopting realistic variances compatible with the errors of real equipment.
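A minimal sketch of the variance-propagation idea (illustrative only, not the thesis algorithm): if two independent distance estimates with known variances are fused by inverse-variance weighting, the fused variance yields a most likely fault zone at roughly 95.45% confidence as plus or minus two standard deviations, which is where that figure comes from for normally distributed errors.

```java
public class FaultZone {

    /** Inverse-variance weighted fusion of two independent estimates. */
    static double fuse(double d1, double var1, double d2, double var2) {
        double w1 = 1.0 / var1, w2 = 1.0 / var2;
        return (w1 * d1 + w2 * d2) / (w1 + w2);
    }

    /** Variance of the fused estimate. */
    static double fusedVariance(double var1, double var2) {
        return 1.0 / (1.0 / var1 + 1.0 / var2);
    }

    public static void main(String[] args) {
        // Hypothetical distance estimates (km) with their variances (km^2).
        double d = fuse(42.0, 4.0, 44.0, 1.0);             // 43.6 km
        double sigma = Math.sqrt(fusedVariance(4.0, 1.0)); // ~0.894 km
        // +/- 2 sigma covers ~95.45% of a normal error distribution.
        System.out.printf("zone: [%.2f, %.2f] km%n", d - 2 * sigma, d + 2 * sigma);
    }
}
```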