845 results for Notation musicale. catalane


Relevância:

10.00%

Publicador:

Resumo:

Electronic auctions are virtual marketplaces located somewhere on the internet. Electronic auctions are held between businesses (B2B), between businesses and consumers (B2C), and between consumers (C2C). In this thesis, electronic auction refers to the first of these, business-to-business trading. The purpose of the thesis is to study the suitability of a workflow engine as the motor of an electronic auction system. The thesis examines the open-source ActiveBPEL engine, and the study is carried out by designing, modelling and testing a business process that registers a buyer's and a seller's details in the system. The implemented process is one part of an electronic auction, but a complete auction could be built on the same principle. The thesis considers an electronic auction that is based on web services and has a clear coordinator, which directs the other participating web services and the operations they execute. High-level models are described in BPMN notation, while the process itself is implemented in the BPEL language. The ActiveBPEL Designer application is used for modelling and simulating the process. The goal of the thesis is not only to implement part of the auction but also to give the reader an understanding of the business environment to which the auction belongs and to shed light on the technologies behind it; in particular, web services and the related concepts are made familiar to the reader.
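The orchestration idea described above — a coordinator invoking partner operations in sequence — can be sketched in ordinary code. A BPEL process is written in XML, so the following Python sketch is only an analogy for the registration step; all names and fields are hypothetical:

```python
def register_party(store, role, name, contact):
    """One partner operation of the auction process: validate and persist a party."""
    if not name or not contact:
        raise ValueError("name and contact details are required")
    party_id = len(store) + 1
    store[party_id] = {"role": role, "name": name, "contact": contact}
    return party_id

def registration_process(store, buyer, seller):
    """Coordinator: register buyer and seller in sequence, return their identifiers."""
    buyer_id = register_party(store, "buyer", *buyer)
    seller_id = register_party(store, "seller", *seller)
    return buyer_id, seller_id

parties = {}
buyer_id, seller_id = registration_process(
    parties, ("Alice", "alice@example.com"), ("Bob", "bob@example.com"))
```

In the actual thesis the coordinator role is played by the BPEL engine, and the partner operations are web-service calls rather than local functions.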


Nicolas Collins is a musician, born in New York in 1954, whose work notably employs feedback to probe the musical experience in depth. An heir to punk culture and American experimental music, he subverts existing technologies to reinvent their sonic resources. His writings recount the history of the musicians with whom he has been in dialogue (John Cage, Alvin Lucier, David Tudor), the works he has created and the instruments he has built, inviting everyone to transform in turn the very objects of sound transmission (microphones, networks, synthesizers, etc.). Includes a previously unreleased DVD composed by Nicolas Collins, Salvaged.


The value of forensic results crucially depends on the propositions and the information under which they are evaluated. For example, if a full single DNA profile for a contemporary marker system matching the profile of Mr A is assessed, given the propositions that the DNA came from Mr A and that it came from an unknown person, the strength of evidence can be overwhelming (e.g., on the order of a billion). In contrast, if we assess the same result given that the DNA came from Mr A and given that it came from his twin brother (i.e., a person with the same DNA profile), the strength of evidence will be 1, and therefore neutral, unhelpful and irrelevant to the case at hand. While this understanding is probably uncontroversial and obvious to most, if not all, practitioners dealing with DNA evidence, the practical precept of not specifying an alternative source with the same characteristics as the one considered under the first proposition may be much less clear in other circumstances. During discussions with colleagues and trainees, cases have come to our attention where forensic scientists have difficulty with the formulation of propositions. It is particularly common to observe that results (e.g., observations) are included in the propositions, whereas, as argued throughout this note, they should not be. A typical example could be a case where a shoe-mark with a logo and the general pattern characteristics of a Nike Air Jordan shoe is found at the scene of a crime. A Nike Air Jordan shoe is then seized at Mr A's house and control prints of this shoe are compared to the mark. The results (e.g., a trace with this general pattern and acquired characteristics corresponding to the sole of Mr A's shoe) are then evaluated given the propositions 'The mark was left by Mr A's Nike Air Jordan shoe-sole' and 'The mark was left by an unknown Nike Air Jordan shoe'.
As a consequence, the footwear examiner will not evaluate part of the observations (i.e., that the mark presents the general pattern of a Nike Air Jordan), even though it can be highly informative. Such examples can be found in all forensic disciplines. In this article, we present a few such examples and discuss aspects that will help forensic scientists with the formulation of propositions. In particular, we emphasise the usefulness of notation to distinguish results that forensic scientists should evaluate from case information that the Court will evaluate.
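The evaluation principle behind the DNA example is the likelihood ratio. A minimal sketch, with illustrative numbers (the one-in-a-billion random match probability only echoes the order of magnitude mentioned above):

```python
def likelihood_ratio(p_given_h1: float, p_given_h2: float) -> float:
    """Strength of evidence: P(results | H1) / P(results | H2)."""
    return p_given_h1 / p_given_h2

# Proposition pair 1: the DNA came from Mr A vs. from an unknown person.
# A matching profile is certain under H1; under H2 it occurs with the
# random match probability (one in a billion, for illustration).
lr_unknown = likelihood_ratio(1.0, 1e-9)   # overwhelming support for H1

# Proposition pair 2: the DNA came from Mr A vs. from his identical twin.
# The same profile is certain under both propositions.
lr_twin = likelihood_ratio(1.0, 1.0)       # 1: neutral, unhelpful
```

The note's point can be restated in these terms: the propositions H1 and H2 must not themselves contain the results being evaluated, or part of the evidence silently drops out of the ratio.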


Occupational hygiene practitioners typically assess the risk posed by occupational exposure by comparing exposure measurements to regulatory occupational exposure limits (OELs). In most jurisdictions, OELs are only available for exposure by the inhalation pathway. Skin notations are used to indicate substances for which dermal exposure may lead to health effects. However, these notations are either present or absent and provide no indication of acceptable levels of exposure. Furthermore, the methodology and framework for assigning skin notations differ widely across jurisdictions, resulting in inconsistencies in the substances that carry notations. The UPERCUT tool was developed in response to these limitations. It helps occupational health stakeholders to assess the hazard associated with dermal exposure to chemicals. UPERCUT integrates dermal quantitative structure-activity relationships (QSARs) and toxicological data to provide users with a skin hazard index called the dermal hazard ratio (DHR) for the substance and scenario of interest. The DHR is the ratio between the estimated 'received' dose and the 'acceptable' dose. The 'received' dose is estimated using physico-chemical data and information on the exposure scenario provided by the user (body parts exposed and exposure duration), and the 'acceptable' dose is estimated using inhalation OELs and toxicological data. The uncertainty surrounding the DHR is estimated with Monte Carlo simulation. Additional information on the selected substances includes the substance's intrinsic skin permeation potential and the existence of skin notations. UPERCUT is the only available tool that estimates the absorbed dose and compares it to an acceptable dose. In the absence of dermal OELs, it provides a systematic and simple approach for screening dermal exposure scenarios for 1686 substances.
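The DHR computation and its Monte Carlo uncertainty treatment can be sketched as follows. The distributional assumption (a normal 'received' dose) and all numbers are illustrative; this is not UPERCUT's actual model:

```python
import random

def dermal_hazard_ratio(received_dose: float, acceptable_dose: float) -> float:
    """DHR = received / acceptable; a value above 1 flags a scenario of concern."""
    return received_dose / acceptable_dose

def simulate_dhr(received_mean, received_sd, acceptable_dose, n=10_000, seed=42):
    """Monte Carlo sketch: propagate uncertainty in the received dose and
    summarize the DHR distribution by its median and 95th percentile."""
    rng = random.Random(seed)
    draws = sorted(
        dermal_hazard_ratio(max(rng.gauss(received_mean, received_sd), 0.0),
                            acceptable_dose)
        for _ in range(n))
    return draws[n // 2], draws[int(0.95 * n)]

# Hypothetical scenario: received dose 2.0 +/- 0.5 units, acceptable dose 4.0.
median_dhr, p95_dhr = simulate_dhr(2.0, 0.5, 4.0)
```

In this sketch the median DHR sits around 0.5 (below the concern threshold of 1), while the upper percentile conveys how much the conclusion could shift under uncertainty.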


This doctoral thesis studies the sarabande as a masterpiece of music, rhetoric and poetry in three of J. S. Bach's Sechs Partiten für Klavier (BWV 825-830). The study is fundamentally rhetorical: it identifies and interprets the oratorical-poetic structure of the literary-musical discourse in the three chosen dances and highlights their most salient aspects of rhythm and ornatus. The approach makes it possible to observe, comparatively, a rhetorical path common to the three sarabandes, one that evolves gradually and differentiates them clearly: this brings out the full expressive force and efficacy of the dance, which makes it the movement of greatest emotional intensity in each Partita.


This paper is a translation of an IUPAC nomenclature document by K. Danzer and L. A. Currie (Pure Appl. Chem., 1998, 70(4), 993-1014). Its goal is to establish a uniform and meaningful approach to terminology (in Portuguese), notation, and formulation for calibration in analytical chemistry. In this first part, general fundamentals of calibration are presented, covering relationships of both qualitative and quantitative variables (relations between variables characterizing certain types of analytes and the measured function, on the one hand, and between variables characterizing the amount or concentration of the chemical species and the intensities of the measured signals, on the other hand). On this basis, the fundamentals of common single-component (univariate) calibration, which models the relationship y = f(x) between the signal intensities y and the amounts or concentrations x of the analyte under given conditions, are presented. Additional papers will be prepared dealing with extended relationships between several intensities and analyte contents, namely with multivariate calibration and with optimization and experimental design.
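In practice the univariate calibration function y = f(x) is most often a straight line fitted to standards by least squares and then inverted to estimate an unknown concentration. A minimal sketch with made-up calibration standards:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of the calibration line y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def predict_concentration(a, b, signal):
    """Analytical evaluation: invert the calibration function, x = (y - a) / b."""
    return (signal - a) / b

# Hypothetical standards: analyte concentrations vs. measured signal intensities.
conc   = [0.0, 1.0, 2.0, 3.0, 4.0]
signal = [0.1, 2.1, 4.0, 6.1, 8.0]
a, b = fit_line(conc, signal)
x_unknown = predict_concentration(a, b, 5.0)  # sample measured at signal 5.0
```

The fitted intercept and slope here come out near 0.1 and 1.98, so the unknown sample is estimated at roughly 2.5 concentration units.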


In this article we analyse the characteristics of Catalan olive-oil cooperatives in order to highlight not only their most important management features but also their main weaknesses. Cooperatives are regarded as useful instruments for pursuing policies of economic development and social welfare. Nevertheless, there is a wide-open debate about the efficiency of these organizations, which raises the question of whether they are in a position to compete successfully in an increasingly open, liberalized and global economy. Through surveys of all the olive-oil cooperatives of Catalunya, we have been able to identify their distinctive characteristics, as well as the main disadvantages and problems they face. We believe it is of interest to analyse the characteristics of these organizations, which, from organizational, human-resource and financial points of view, differ from other types of organization but must nonetheless face an ever more demanding and competitive market.


This paper is a translation of an IUPAC document by K. Danzer, M. Otto and L. A. Currie (Pure Appl. Chem., 2004, 76(6), 1215-1225). Its goal is to establish a uniform and meaningful standard for terminology (in Portuguese), notation, and formulation concerning multispecies calibration in analytical chemistry. Calibration in analytical chemistry refers to the relation between the sample domain and the measurement domain (signal domain), expressed by an analytical function x = f_s(Q) representing a pattern of chemical species Q and their amounts or concentrations x in a given test sample, and a measured function y = f(z) that may be a spectrum, chromatogram, etc. Simultaneous multispecies analyses are carried out mainly by spectroscopic and chromatographic methods in a more or less selective way. For the determination of n species Q_i (i = 1, 2, ..., n), at least n signals must be measured, which should be well separated in the ideal case. In analytical practice, the situation can be different.
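When each of the n species contributes linearly to the n measured signals, multispecies calibration reduces to solving a linear system of sensitivity coefficients. A two-species sketch with hypothetical coefficients (real spectra would call for a least-squares solver over many wavelengths):

```python
def solve_2x2(k, y):
    """Solve K @ x = y for two species by Cramer's rule.
    k[i][j] is the sensitivity of signal i to species j."""
    det = k[0][0] * k[1][1] - k[0][1] * k[1][0]
    x0 = (y[0] * k[1][1] - k[0][1] * y[1]) / det
    x1 = (k[0][0] * y[1] - y[0] * k[1][0]) / det
    return x0, x1

# Hypothetical sensitivities: each signal responds mostly to "its" species,
# with some spectral overlap from the other one.
K = [[2.0, 0.3],
     [0.4, 1.5]]
y = [2.3, 1.9]          # measured intensities at the two wavelengths
x = solve_2x2(K, y)     # estimated amounts of species Q1 and Q2
```

With these numbers both species come out at 1.0 unit; the off-diagonal coefficients are exactly the "not well separated" overlap the abstract alludes to.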


The skill of programming is a key asset for every computer science student. Many studies have shown that it is a hard skill to learn, and the outcomes of programming courses have often been substandard. Thus, a range of methods and tools have been developed to assist students' learning processes. One of the biggest fields in computer science education is the use of visualizations as a learning aid, and many visualization-based tools have been developed to aid the learning process during the last few decades. The studies conducted in this thesis focus on two visualization-based tools, TRAKLA2 and ViLLE. This thesis includes results from multiple empirical studies of the effects that the introduction and usage of these tools have on students' opinions and performance, and of the implications from a teacher's point of view. The results show that students preferred to do web-based exercises and felt that those exercises contributed to their learning. The usage of the tools motivated students to work harder during their course, which was reflected in overall course performance and drop-out statistics. We have also shown that visualization-based tools can be used to enhance the learning process, one of the key factors being a higher and more active level of engagement (see the Engagement Taxonomy by Naps et al., 2002). Automatic grading accompanied by immediate feedback helps students to overcome obstacles during the learning process and to grasp the key elements of the learning task. Such tools can help us cope with the fact that many programming courses are overcrowded and teaching resources limited. They allow us to tackle this problem by using automatic assessment in exercises that are most suitable for the web (such as tracing and simulation), since it supports students' independent learning regardless of time and place.
In summary, we can use a course's resources more efficiently to increase the quality of the learning experience of the students and the teaching experience of the teacher, and even to increase the performance of the students. There are also methodological results from this thesis that contribute to developing insight into the conduct of empirical evaluations of new tools or techniques. When we evaluate a new tool, especially one accompanied by visualization, we need to give a proper introduction to it and to the graphical notation used by the tool. The standard procedure should also include capturing the screen with audio to confirm that the participants of the experiment are doing what they are supposed to do. By taking such measures when studying the learning impact of visualization support, we can avoid drawing false conclusions from our experiments. As computer science educators, we face two important challenges. Firstly, we need to start delivering the message, in our own institutions and all over the world, about new, scientifically proven innovations in teaching such as TRAKLA2 and ViLLE. Secondly, we have relevant experience of conducting teaching-related experiments, and thus we can support our colleagues in learning the essential know-how of research-based improvement of their teaching. This approach can turn academic teaching into publications, and by utilizing it we can significantly increase the adoption of new tools and techniques and the overall knowledge of best practices. In the future, we need to combine our forces and tackle these universal and common problems together by creating multi-national and multi-institutional research projects. We need to create a community and a platform in which we can share these best practices and at the same time easily conduct multi-national research projects.
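The automatic assessment with immediate feedback discussed above can be sketched as a toy grader; this is a generic illustration, not TRAKLA2's or ViLLE's actual mechanism, and the exercise is invented:

```python
def grade(submission, test_cases):
    """Run a student's function against test cases and collect immediate feedback."""
    feedback, passed = [], 0
    for args, expected in test_cases:
        try:
            got = submission(*args)
        except Exception as exc:           # a crash is feedback too
            feedback.append(f"{args}: raised {type(exc).__name__}")
            continue
        if got == expected:
            passed += 1
        else:
            feedback.append(f"{args}: expected {expected}, got {got}")
    return passed, len(test_cases), feedback

# Hypothetical exercise: sum of the first n positive integers.
def student_answer(n):
    return n * (n + 1) // 2

score, total, notes = grade(student_answer, [((1,), 1), ((4,), 10), ((10,), 55)])
```

The pedagogical point is in the `feedback` list: the student sees which case failed and how, at the moment of submission rather than days later.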


The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not been evaluated in large scale studies yet. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. 
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
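The idea of keeping a program consistent with its invariants at every step can be illustrated by checking a loop invariant at run time. This is only a dynamic approximation of the static proofs Socos performs, using the sorting example mentioned above:

```python
def sorted_prefix(a, k):
    """Invariant predicate: the prefix a[0..k) is sorted."""
    return all(a[i] <= a[i + 1] for i in range(k - 1))

def insertion_sort(a):
    """Insertion sort with its loop invariant asserted on every iteration.
    In invariant-based programming the invariant is written down first and
    the code is grown around it; here we merely check it dynamically."""
    for j in range(1, len(a)):
        assert sorted_prefix(a, j)       # invariant holds on loop entry
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]              # shift larger elements right
            i -= 1
        a[i + 1] = key                   # insert, restoring the invariant
    assert sorted_prefix(a, len(a))      # postcondition: whole array sorted
    return a
```

A verification-condition generator turns each such assertion into a proof obligation to be discharged once and for all, instead of a check executed on every run.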


Programming and mathematics are core areas of computer science (CS) and consequently also important parts of CS education. Introductory instruction in these two topics is, however, not without problems. Studies show that CS students find programming difficult to learn and that teaching mathematical topics to CS novices is challenging. One reason for the latter is the disconnection between mathematics and programming found in many CS curricula, which results in students not seeing the relevance of the subject for their studies. In addition, reports indicate that students' mathematical capability and maturity levels are dropping. The challenges faced when teaching mathematics and programming at CS departments can also be traced back to gaps in students' prior education. In Finland the high school curriculum does not include CS as a subject; instead, focus is on learning to use the computer and its applications as tools. Similarly, many of the mathematics courses emphasize application of formulas, while logic, formalisms and proofs, which are important in CS, are avoided. Consequently, high school graduates are not well prepared for studies in CS. Motivated by these challenges, the goal of the present work is to describe new approaches to teaching mathematics and programming aimed at addressing these issues: Structured derivations is a logic-based approach to teaching mathematics, where formalisms and justifications are made explicit. The aim is to help students become better at communicating their reasoning using mathematical language and logical notation at the same time as they become more confident with formalisms. The Python programming language was originally designed with education in mind, and has a simple syntax compared to many other popular languages. 
The aim of using it in instruction is to address algorithms and their implementation in a way that allows the focus to be put on learning algorithmic thinking and programming instead of on learning a complex syntax. Invariant-based programming is a diagrammatic approach to developing programs that are correct by construction. The approach is based on elementary propositional and predicate logic, and makes explicit the underlying mathematical foundations of programming. The aim is also to show how mathematics in general, and logic in particular, can be used to create better programs.
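As an illustration of the point about syntax (not an example from the thesis), Euclid's algorithm in Python stays close to its mathematical definition, gcd(a, b) = gcd(b, a mod b) with gcd(a, 0) = a, leaving little syntactic overhead between the mathematics and the code:

```python
def gcd(a: int, b: int) -> int:
    """Greatest common divisor by Euclid's algorithm."""
    while b != 0:
        a, b = b, a % b   # the recurrence gcd(a, b) = gcd(b, a mod b)
    return a
```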


Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, be easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. Here the quality of a system is an essential issue, since any deficiencies may lead to considerable financial loss or endanger lives. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at the early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are often considered difficult to utilise in traditional development processes. Therefore, it is important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that certain techniques for the evaluation of rigorous developments must be established. Since we study various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting.
Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as the related artefacts, e.g. models, which have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to improve the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that they can facilitate the process of creating software systems by, e.g., controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, based on structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perception of domain experts. It is our aspiration to promote measurements as an indispensable part of the quality control process and as a strategy towards quality improvement.


This paper presents the design for a graphical parameter editor for Testing and Test Control Notation 3 (TTCN-3) test suites. This work was done in the context of OpenTTCN IDE, a TTCN-3 development environment built on top of the Eclipse platform. The design presented relies on an additional parameter editing tab added to the launch configurations for test campaigns. This parameter editing tab shows the list of editable parameters and allows opening editing components for the different parameters. Each TTCN-3 primitive type will have a specific editing component providing tools to ease modification of values of that type.


Business functions use several separate information systems. Operational processes include tasks performed by several different business functions. Ensuring a smooth flow of the information that these tasks require and produce calls for the integration of data and information systems, which has traditionally been implemented through direct connections between systems. This leads to an inflexible IT architecture. Service Oriented Architecture (SOA) promises greater flexibility for the IT architecture as well as cost savings. The thesis surveys the theoretical background of service-oriented architecture and the idea of the service-oriented process-description language BPMN. In the empirical part, theme interviews were used to gather the views of the target company and its system suppliers on service-oriented architecture and the factors affecting it. In addition, the thesis examines how service-oriented architecture answers the goals set out in the target company's IT strategy. As a result, a process description and a service description were analysed following a service-based modelling method, and the SOA services supporting them were identified. Using the outcomes of the method, an implementation solution based on an enterprise service bus was proposed. Finally, a proposal was drafted for how the target company could begin applying service-oriented architecture.


This dissertation examined skill development in music reading by focusing on the visual processing of music notation in different music-reading tasks. Each of the three experiments of this dissertation addressed one of the three types of music reading: (i) sight-reading, i.e. reading and performing completely unknown music, (ii) rehearsed reading, during which the performer is already familiar with the music being played, and (iii) silent reading with no performance requirements. The eye-tracking methodology allowed the readers' eye movements to be recorded with great precision during music reading. Owing to the lack of coherence in the small number of prior studies on eye movements in music reading, the dissertation also had a strong methodological emphasis. The present dissertation thus aimed to advance two major issues: (1) it investigated the eye-movement indicators of skill and skill development in sight-reading, rehearsed reading and silent reading, and (2) it developed and tested suitable methods for future studies on the topic. Experiment I focused on the eye-movement behaviour of adults during their first steps in learning to read music notation. The longitudinal experiment spanned a nine-month music-training period, during which 49 participants (university students taking part in a compulsory music course) sight-read and performed a series of simple melodies in three measurement sessions. Participants with no musical background were termed 'novices', whereas 'amateurs' had had musical training prior to the experiment. The main issue of interest was the change in the novices' eye movements and performances across the measurements, while the amateurs offered a point of reference for assessing the novices' development. The experiment showed that the novices tended to sight-read in a more stepwise fashion than the amateurs, the latter group manifesting more back-and-forth eye movements.
The novices' skill development was reflected in the faster identification of note symbols involved in larger melodic intervals. Across the measurements, the novices also began to show sensitivity to the melodies' metrical structure, which the amateurs demonstrated from the very beginning. The stimulus melodies consisted of quarter notes, making the effects of meter and larger melodic intervals distinguishable from effects caused by, say, different rhythmic patterns. Experiment II explored the eye movements of 40 experienced musicians (music education students and music performance students) during temporally controlled rehearsed reading. This cross-sectional experiment focused on the eye-movement effects of one-bar melodic alterations placed within a familiar melody. Synchronizing the performance and eye-movement recordings enabled the investigation of the eye-hand span, i.e., the temporal gap between a performed note and the point of gaze. The eye-hand span was typically found to remain around one second. Music performance students demonstrated increased processing efficiency through their shorter average fixation durations as well as in the two examined eye-hand span measures: they used larger eye-hand spans more frequently and inspected more of the musical score during the performance of one metrical beat than the students of music education. Although all participants produced performances almost indistinguishable in their auditory characteristics, the altered bars indeed affected the reading of the score: the general effects of expertise in the two eye-hand span measures, demonstrated by the music performance students, disappeared in the face of the melodic alterations. Experiment III was a longitudinal experiment designed to examine the differences between adult novice and amateur musicians' silent reading of music notation, as well as the changes the 49 participants manifested during a nine-month music course.
From a methodological perspective, a new contribution to research on eye movements in music reading was the inclusion of a verbal protocol in the research design: after viewing the musical image, the readers were asked to describe what they had seen. A two-way categorization of the verbal descriptions was developed in order to assess the quality of the extracted musical information. More extensive musical background was related to shorter average fixation durations, more linear scanning of the musical image, and more sophisticated verbal descriptions of the music in question. No apparent effects of skill development were observed for the novice music readers alone, but all participants improved their verbal descriptions towards the last measurement. Apart from the background-related differences between groups of participants, combining verbal and eye-movement data in a cluster analysis identified three styles of silent reading. This finding demonstrated individual differences in how the freely defined silent-reading task was approached. This dissertation is among the first presentations of a series of experiments systematically addressing the visual processing of music notation in various types of music-reading tasks, focusing especially on the eye-movement indicators of developing music-reading skill. Overall, the experiments demonstrate that music-reading processes are affected not only by 'top-down' factors, such as musical background, but also by the 'bottom-up' effects of specific features of music notation, such as pitch heights, metrical division, rhythmic patterns and unexpected melodic events. From a methodological perspective, the experiments emphasize the importance of systematic stimulus design, temporal control during performance tasks, and the development of complementary methods to ease the interpretation of eye-movement data.
To conclude, this dissertation suggests that advances in comprehending the cognitive aspects of music reading, the nature of expertise in this musical task, and the development of educational tools can be attained through the systematic application of the eye-tracking methodology also in this specific domain.