897 results for Programming tasks


Relevância:

20.00%

Publicador:

Resumo:

This bachelor's thesis examines introductory programming instruction, a central topic in basic computer science education, and the problems associated with it. The work reviews methods and approaches for teaching the fundamentals of programming, as well as solutions that can make the teaching more effective. These solutions include, among other things, choosing the programming language, finding a suitable development environment, and identifying teaching aids that support the course. Selecting the activities involved in running the course, such as exercises and possible weekly assignments, is also part of this work. The thesis itself approaches the topic by studying the suitability of Python for introductory programming instruction, for example by comparing it with other common teaching languages such as C, C++ and Java. It examines the strengths and weaknesses of the language and investigates whether Python can naturally serve as the primary teaching language. In addition, the thesis considers what should be taught on the course, how the course would most effectively be run, and what kind of technical infrastructure it would be sensible to choose for implementing it.
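As a hedged illustration of the kind of comparison such a thesis makes (this example is not taken from the thesis itself), a typical beginner exercise in Python needs no type declarations, compilation step, or class/main boilerplate, which is one common argument for using it as a first teaching language:

# Hypothetical beginner exercise: sum the even numbers in a list.
# The whole program fits in a few readable lines.

def sum_of_evens(numbers):
    """Return the sum of the even values in `numbers`."""
    return sum(n for n in numbers if n % 2 == 0)

if __name__ == "__main__":
    print(sum_of_evens([1, 2, 3, 4, 5, 6]))  # prints 12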

Relevância:

20.00%

Publicador:

Resumo:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climate studies. As historians of the Earth, "reconstructers" try to decipher its past. Knowing that continents move, geologists try to retrace their distribution through the ages. If Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly across the oceanic realm but belong to a larger whole combining continental and oceanic crust: the tectonic plates. Unfortunately, mainly for historical and technical reasons, this idea still does not receive sufficient echo within the reconstruction community. We are nevertheless convinced that, by applying specific methods and principles, it is possible to move beyond the traditional "Wegenerian" approach and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by presenting, in full detail, our methods and tools. Starting from the paleomagnetic and paleogeographic data classically used in reconstruction studies, we developed a methodology that places the tectonic plates and their kinematics at the centre of the problem.
Using continental assemblies (referred to as "key assemblies") as anchor points distributed over the whole time span of our study (from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past towards the present. In between, lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolised by synthetic isochrons) to the major continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution, and always form an interconnected network in space. This "dynamic plate boundaries" approach integrates plate buoyancy, ridge spreading rates, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It offers good control on plate kinematics and provides strong constraints on the model. This multi-source approach requires efficient data management. Prior to this study, the critical mass of data needed had become a barely surmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to storing, analysing and managing spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS we converted this mass of scattered data into valuable geodynamic information that is easily accessible for creating reconstructions. At the same time, by programming dedicated tools, we both facilitated the reconstruction work (task automation) and improved the model by greatly increasing the kinematic control of plate motions through plate velocity models. Based on 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. Using this unique dataset we can now tackle major issues of modern geology, such as global sea-level variations and climate change. We started by studying another major (and still unsolved) issue of modern tectonics: the driving mechanism of plate motions. We observed that, throughout Earth's history, plate rotation poles (which describe plate motions across the Earth's surface) tend to lie along a band running from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Essentially, this means that plates tend to flee this median plane. Unless a methodological bias has escaped us, we interpret this as reflecting a possible secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we paid particular attention to reconstructing it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next, with the crustal material symbolised by synthetic isochrons of known age. We also reconstruct the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this detailed oceanic dataset, we developed unique 3-D bathymetric models offering much better precision than previously existing ones.
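The plate velocity models mentioned above rest on standard rigid-plate kinematics: a plate's motion on the sphere is described by an Euler (rotation) pole and an angular rate, and the linear velocity of any surface point is the cross product of the angular velocity vector with the point's position vector. The sketch below assumes only that textbook relation; it is not the PaleoDyn tooling.

# Minimal sketch of rigid-plate kinematics (not the PaleoDyn implementation):
# velocity at a surface point = omega_vector x position_vector.
import math

EARTH_RADIUS_KM = 6371.0

def to_cartesian(lat_deg, lon_deg, radius=EARTH_RADIUS_KM):
    """Convert geographic coordinates to Earth-centred Cartesian coordinates."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    return (radius * math.cos(lat) * math.cos(lon),
            radius * math.cos(lat) * math.sin(lon),
            radius * math.sin(lat))

def plate_velocity_mm_per_yr(point_lat, point_lon, pole_lat, pole_lon, omega_deg_per_myr):
    """Speed (mm/yr) of a point on a rigid plate rotating about an Euler pole."""
    px, py, pz = to_cartesian(point_lat, point_lon)
    # Unit vector along the rotation axis, scaled by the angular rate in rad/Myr.
    ax, ay, az = to_cartesian(pole_lat, pole_lon, radius=1.0)
    w = math.radians(omega_deg_per_myr)
    wx, wy, wz = w * ax, w * ay, w * az
    # Cross product omega x r, in km/Myr (numerically equal to mm/yr).
    vx = wy * pz - wz * py
    vy = wz * px - wx * pz
    vz = wx * py - wy * px
    return math.sqrt(vx * vx + vy * vy + vz * vz)

# Example: a point on the equator, pole at the north pole, 0.5 deg/Myr.
print(round(plate_velocity_mm_per_yr(0.0, 0.0, 90.0, 0.0, 0.5), 1))  # ~55.6 mm/yr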

Relevância:

20.00%

Publicador:

Resumo:

This study focuses on developing the procurement function as part of the execution of plant investment projects. Its empirical background lies in the project business of Pöyry Oyj, and the chosen perspective is that of the company responsible for project management. The research is highly practical: it starts from the problems of procurement and its monitoring and seeks to offer new kinds of solutions to them. At its core the study belongs to industrial engineering and management, although information systems science plays a strong supporting role. In line with industrial engineering and management, the objectives and results concern the development of the company's operations, while the tools and solutions used exploit the possibilities offered by information systems science. The study follows a constructive research approach: innovative constructs were created to solve genuine real-world problems, thereby producing contributions to industrial engineering and management. The goal was to organise procurement and its monitoring in large plant projects more efficiently. To this end, the project management and procurement guidelines were first revised to better meet present-day requirements. Based on these guidelines, a procurement application was then implemented to cover all the features described in them. Finally, the application itself introduced new features to project management and procurement, and these were incorporated back into the guidelines. This development work was undertaken so that project management and procurement in plant projects would work better, that is, produce the results needed in projects faster, more accurately and with higher quality, at lower cost. The research has three kinds of results: improved procurement methods, the operating and calculation models underlying the procurement application, and the application itself as an implementation. The revised project and procurement guidelines describe the improved procurement methods. The descriptions produced while designing and developing the procurement application contain new models both for the procurement process and for monitoring procurement in large plant projects. The application itself is an implementation based on the improved procurement methods and on the new operating and calculation models. The revised project and procurement guidelines have been in use at Pöyry Oyj since 1991. Over the years they have supported the execution of hundreds of plant projects and maintained Pöyry Oyj's competitiveness as an international project house. The procurement application has been used in numerous projects and has been found to reduce the direct labour costs of procurement in plant projects. It is also considered to bring indirect cost savings in the form of better procurement decisions, but the magnitude of these savings cannot be reliably estimated.

Relevância:

20.00%

Publicador:

Resumo:

Ignoring irrelevant visual information aids efficient interaction with task environments. We studied how people, after practice, start to ignore the irrelevant aspects of stimuli. For this we focused on how information reduction transfers to rarely practised and novel stimuli. In Experiment 1, we compared competing mathematical models on how people cease to fixate on irrelevant parts of stimuli. Information reduction occurred at the same rate for frequent, infrequent, and novel stimuli. Once acquired with some stimuli, it was applied to all. In Experiment 2, simplification of task processing also occurred in a once-for-all manner when spatial regularities were ruled out so that people could not rely on learning which screen position is irrelevant. Apparently, changes in eye movements were an effect of a once-for-all strategy change rather than a cause of it. Overall, the results suggest that participants incidentally acquired knowledge about regularities in the task material and then decided to voluntarily apply it for efficient task processing. Such decisions should be incorporated into accounts of information reduction and other theories of strategy change in skill acquisition.

Relevância:

20.00%

Publicador:

Resumo:

Possibilistic Defeasible Logic Programming (P-DeLP) is a logic programming language which combines features from argumentation theory and logic programming, incorporating the treatment of possibilistic uncertainty at the object-language level. In spite of its expressive power, an important limitation in P-DeLP is that imprecise, fuzzy information cannot be expressed in the object language. One interesting alternative for solving this limitation is the use of PGL+, a possibilistic logic over Gödel logic extended with fuzzy constants. Fuzzy constants in PGL+ allow expressing disjunctive information about the unknown value of a variable, in the sense of a magnitude, modelled as a (unary) predicate. The aim of this article is twofold: firstly, we formalize DePGL+, a possibilistic defeasible logic programming language that extends P-DeLP through the use of PGL+ in order to incorporate fuzzy constants and a fuzzy unification mechanism for them. Secondly, we propose a way to handle conflicting arguments in the context of the extended framework.
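As a rough illustration of the kind of object-level information DePGL+ is meant to capture (the representation below is invented for this sketch and is not the paper's notation), one can picture weighted rules whose conclusions inherit a necessity degree along a derivation, and fuzzy constants as intervals of plausible values whose overlap acts as a unification degree:

# Toy sketch (invented notation, not DePGL+ syntax): weighted rules whose
# conclusions inherit the minimum necessity degree along a derivation, plus a
# crude "fuzzy constant" modelled as an interval of values for a magnitude.
from dataclasses import dataclass

@dataclass
class FuzzyConst:
    low: float
    high: float
    def unifies_with(self, other):
        """Two fuzzy constants 'unify' here if their intervals overlap;
        the overlap ratio plays the role of a unification degree."""
        overlap = min(self.high, other.high) - max(self.low, other.low)
        if overlap <= 0:
            return 0.0
        return overlap / (max(self.high, other.high) - min(self.low, other.low))

# Weighted defeasible rules: (head, body, necessity degree).
rules = [
    ("engine_ok", ["temp_normal"], 0.9),
    ("temp_normal", [], 0.7),          # weighted fact
]

def necessity(goal):
    """Necessity degree of a goal: max over rules, min along each body."""
    best = 0.0
    for head, body, alpha in rules:
        if head == goal:
            best = max(best, min([alpha] + [necessity(b) for b in body]))
    return best

print(necessity("engine_ok"))                                 # 0.7 = min(0.9, 0.7)
print(FuzzyConst(80, 95).unifies_with(FuzzyConst(90, 110)))   # partial overlap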

Relevância:

20.00%

Publicador:

Resumo:

In the last decade defeasible argumentation frameworks have evolved to become a sound setting to formalize commonsense, qualitative reasoning. The logic programming paradigm has proved to be particularly useful for developing different argument-based frameworks on the basis of different variants of logic programming which incorporate defeasible rules. Most such frameworks, however, cannot deal with explicit uncertainty or with vague knowledge, as defeasibility is directly encoded in the object language. This paper presents Possibilistic Defeasible Logic Programming (P-DeLP), a new logic programming language which combines features from argumentation theory and logic programming, incorporating as well the treatment of possibilistic uncertainty. Such features are formalized on the basis of PGL, a possibilistic logic based on Gödel fuzzy logic. One of the applications of P-DeLP is providing an intelligent agent with non-monotonic, argumentative inference capabilities. In this paper we also provide a better understanding of such capabilities by defining two non-monotonic operators which model the expansion of a given program P by adding new weighted facts associated with argument conclusions and warranted literals, respectively. Different logical properties of the proposed operators are studied.
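To make the idea of such expansion operators concrete (this is an illustrative sketch under assumed behaviour, not the paper's formal definitions), expanding a program by its warranted conclusions simply means adding them back as new weighted facts, after which further inferences can build on them:

# Illustrative sketch of a program-expansion operator (assumed behaviour,
# not the paper's formalism): warranted conclusions are added back into the
# program as new weighted facts.

def warranted_conclusions(program):
    """Pretend warrant procedure: every derivable literal, with its necessity
    degree, counts as warranted here (no attack/defeat analysis)."""
    facts = {head: alpha for head, body, alpha in program if not body}
    changed = True
    while changed:
        changed = False
        for head, body, alpha in program:
            if body and all(b in facts for b in body):
                degree = min([alpha] + [facts[b] for b in body])
                if facts.get(head, 0.0) < degree:
                    facts[head] = degree
                    changed = True
    return facts

def expand(program, conclusions):
    """Expansion operator: add warranted literals as weighted facts."""
    return program + [(lit, [], alpha) for lit, alpha in conclusions.items()]

program = [("b", [], 0.8), ("a", ["b"], 0.6)]
expanded = expand(program, warranted_conclusions(program))
print(expanded)   # original rules plus weighted facts for "b" (0.8) and "a" (0.6)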

Relevância:

20.00%

Publicador:

Resumo:

After incidentally learning about a hidden regularity, participants can either continue to solve the task as instructed or, alternatively, apply a shortcut. Past research suggests that the amount of conflict implied by adopting a shortcut biases the decision for or against continuing instruction-coherent task processing. We explored whether this decision might transfer from one incidental learning task to the next. Theories that conceptualize strategy change in incidental learning as a learning-plus-decision phenomenon suggest that high demands to adhere to instruction-coherent task processing in Task 1 will impede shortcut usage in Task 2, whereas low control demands will foster it. We sequentially applied two established incidental learning tasks differing in stimuli, responses and hidden regularity (the alphabet verification task followed by the serial reaction task, SRT). While some participants experienced a complete redundancy in the task material of the alphabet verification task (low demands to adhere to instructions), for others the redundancy was only partial, so that shortcut application would have led to errors (high demands to follow instructions). The low control demand condition showed the strongest usage of the fixed and repeating sequence of responses in the SRT. The transfer results are in line with the learning-plus-decision view of strategy change in incidental learning, rather than with resource theories of self-control.

Relevância:

20.00%

Publicador:

Resumo:

The main objective of this master's thesis is to study robot programming using simulation software, and how to embed the simulation software into the company's own robot controlling software. A further goal is to study a new communication interface to the assembly line's components, more precisely how to connect the robot cell to this new communication system. Conveyor lines that use the new communication standard are already available, but the robot cell is not yet capable of communicating with other devices using the new communication protocols. The main problem among robot manufacturers is that they all have their own communication systems and programming languages. There was no common language for programming the robots of different manufacturers until the RRS (Realistic Robot Simulation) standards were developed. RRS-II makes it possible to create robot programs in the simulation software and provides a common user interface for robots from different manufacturers. This thesis presents the RRS-II standard and the robot manufacturers' current level of support for it. It also presents how the simulation software can be embedded into the company's own robot controlling software and how the robot cell can be connected to the CAMX (Computer Aided Manufacturing using XML) communication system.
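The abstract does not give the CAMX message formats; purely as a hedged illustration, connecting a cell to an XML-based factory messaging system typically amounts to serialising equipment events as XML and posting them to a message broker over HTTP. The element names and endpoint below are invented placeholders, not the actual CAMX/IPC schema.

# Hedged illustration only: element names, endpoint and event fields are
# invented placeholders, not the real CAMX message schema.
import urllib.request
import xml.etree.ElementTree as ET

def build_event_xml(cell_id, state):
    """Serialise a robot-cell state change as a small XML document."""
    root = ET.Element("EquipmentEvent")
    ET.SubElement(root, "SourceCell").text = cell_id
    ET.SubElement(root, "State").text = state
    return ET.tostring(root, encoding="unicode")

def post_event(broker_url, xml_payload):
    """POST the XML message to a (hypothetical) message broker."""
    req = urllib.request.Request(
        broker_url,
        data=xml_payload.encode("utf-8"),
        headers={"Content-Type": "text/xml"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example (requires a broker listening at this invented address):
# post_event("http://broker.example/events", build_event_xml("robot-cell-1", "IDLE"))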

Relevância:

20.00%

Publicador:

Resumo:

This research examines the impacts of the Swiss reform of the allocation of tasks, accepted in 2004 and implemented in 2008, which "re-assigned" responsibilities between the federal government and the cantons. The public tasks were redistributed according to the leading and fundamental principle of subsidiarity. Seven tasks came under exclusive federal responsibility; ten came under the control of the cantons; and twenty-two "common tasks" were allocated to both the Confederation and the cantons. For these common tasks it was not possible to separate management from implementation. For nineteen of them, the reform introduced the conventions-programs (CPs), which are public law contracts signed by the Confederation with each canton. These CPs are generally valid for periods of four years (2008-11, 2012-15 and 2016-19, respectively); the third period is currently being prepared. Using principal-agent theory, I examine how contracts can improve political relations between a principal (the Confederation) and an agent (a canton). I also provide a first qualitative analysis of the impacts of these contracts on vertical cooperation and on the involvement of different actors, focusing on five CPs - protection of cultural heritage and conservation of historic monuments, encouragement of the integration of foreigners, economic development, protection against noise, and protection of nature and landscape - applied in five cantons, which yields twenty-five case studies.

Relevância:

20.00%

Publicador:

Resumo:

Demyelinating diseases are characterized by a loss of oligodendrocytes leading to axonal degeneration and impaired brain function. Current strategies used for the treatment of demyelinating disease such as multiple sclerosis largely rely on modulation of the immune system. Only limited treatment options are available for treating the later stages of the disease, and these treatments require regenerative therapies to ameliorate the consequences of oligodendrocyte loss and axonal impairment. Directed differentiation of adult hippocampal neural stem/progenitor cells (NSPCs) into oligodendrocytes may represent an endogenous source of glial cells for cell-replacement strategies aiming to treat demyelinating disease. Here, we show that Ascl1-mediated conversion of hippocampal NSPCs into mature oligodendrocytes enhances remyelination in a diphtheria-toxin (DT)-inducible, genetic model for demyelination. These findings highlight the potential of targeting hippocampal NSPCs for the treatment of demyelinated lesions in the adult brain.

Relevância:

20.00%

Publicador:

Resumo:

PURPOSE OF REVIEW: To provide an overview of available evidence of the potential role of epigenetics in the pathogenesis of hypertension and vascular dysfunction. RECENT FINDINGS: Arterial hypertension is a highly heritable condition. Surprisingly, however, genetic variants only explain a tiny fraction of the phenotypic variation and the term 'missing heritability' has been coined to describe this phenomenon. Recent evidence suggests that phenotypic alteration that is unrelated to changes in DNA sequence (thereby escaping detection by classic genetic methodology) offers a potential explanation. Here, we present some basic information on epigenetics and review recent work consistent with the hypothesis of epigenetically induced arterial hypertension. SUMMARY: New technologies that enable the rigorous assessment of epigenetic changes and their phenotypic consequences may provide the basis for explaining the missing heritability of arterial hypertension and offer new possibilities for treatment and/or prevention.

Relevância:

20.00%

Publicador:

Resumo:

Concurrent aims to be a different type of task distribution system compared to what MPI-like systems do. It adds a simple but powerful application abstraction layer that distributes the logic of an entire application onto a swarm of clusters, bearing similarities to volunteer computing systems. Traditional task distribution systems simply run individual tasks on the distributed system and wait for the results. Concurrent goes one step further by letting the tasks and the application decide what to do. The programming paradigm is therefore fully asynchronous: there is no waiting for results; instead, the application is notified once a computation has been performed.
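A minimal sketch of the notification-based, wait-free style the abstract describes (names and API invented for illustration; this is not Concurrent's actual interface): work is submitted together with a callback, and the caller never blocks on the result.

# Minimal sketch of a notification-based task API (invented names, not
# Concurrent's real interface): submit work with a callback, never block.
from concurrent.futures import ThreadPoolExecutor

class NotifyingScheduler:
    """Runs tasks on a pool and notifies a callback when each one finishes."""
    def __init__(self, workers=4):
        self._pool = ThreadPoolExecutor(max_workers=workers)

    def submit(self, task, on_done, *args):
        future = self._pool.submit(task, *args)
        # The callback fires on completion; the caller does not wait here.
        future.add_done_callback(lambda f: on_done(f.result()))

    def shutdown(self):
        self._pool.shutdown(wait=True)

def heavy_computation(n):
    return sum(i * i for i in range(n))

def notify(result):
    print("computation finished:", result)

scheduler = NotifyingScheduler()
scheduler.submit(heavy_computation, notify, 100_000)
scheduler.shutdown()   # only to end the demo; the paradigm itself never waits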

Relevância:

20.00%

Publicador:

Resumo:

Virtual Laboratories are an indispensable space for carrying out practical activities in a Virtual Environment. In the field of Computer and Software Engineering, different types of practical activities have to be performed in order to acquire basic competences that cannot be achieved by other means. This paper specifies an ontology for a general virtual laboratory. The proposed ontology provides a mechanism to select the best resources needed in a Virtual Laboratory once a specific practical activity has been defined and the main competences that students have to achieve in the learning process have been fixed. Furthermore, the proposed ontology can be used to develop an automatic wizard tool that creates a Moodle classroom from the practical activity specification and the related competences.
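The abstract does not give the ontology's vocabulary; purely as a hedged sketch, the mapping it describes - from a practical activity and its target competences to the laboratory resources that cover them - can be pictured as a small selection function over ontology-like data (the class and property names below are invented):

# Hedged sketch with invented names (not the paper's ontology): select the
# lab resources whose covered competences intersect those required by a
# given practical activity, preferring resources that cover more of them.

RESOURCES = {
    "remote_linux_host": {"covers": {"shell_usage", "process_management"}},
    "java_ide_image":    {"covers": {"oop_design", "unit_testing"}},
    "network_simulator": {"covers": {"routing_basics"}},
}

def select_resources(required_competences):
    """Return matching resources, ranked by how many competences they cover."""
    scored = [
        (len(props["covers"] & required_competences), name)
        for name, props in RESOURCES.items()
        if props["covers"] & required_competences
    ]
    return [name for score, name in sorted(scored, reverse=True)]

activity_competences = {"oop_design", "unit_testing", "shell_usage"}
print(select_resources(activity_competences))
# ['java_ide_image', 'remote_linux_host'] - these would then seed the Moodle classroom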