11 results for automatic music analysis
in Helda - Digital Repository of University of Helsinki
Abstract:
This thesis explores melodic and harmonic features of heavy metal and, in doing so, examines various methods of music analysis, together with their applicability and limitations regarding the study of heavy metal music. The study is built on three general hypotheses, according to which 1) acoustic characteristics play a significant role in chord construction in heavy metal, 2) heavy metal has strong ties and similarities with other Western musical styles, and 3) theories and analytical methods of Western art music may be applied to heavy metal. It seems evident that in heavy metal some chord structures appear far more frequently than others. It is suggested here that the fundamental reason for this is the use of the guitar distortion effect. Subsequently, theories as to how and under what principles heavy metal is constructed need to be put under discussion; analytical models regarding the classification of consonance and dissonance and the categorization of chords are here revised to meet the common practices of this music. It is evident that heavy metal is not an isolated style of music; it is seen here as a cultural fusion of various musical styles. Moreover, it is suggested that the theoretical background to the construction of Western music and its analysis can offer invaluable insights into heavy metal. However, the analytical methods need to be reformed to some extent to meet the characteristics of the music. This reformation includes an accommodation of linear and functional theories that has rarely been found in music theory and musicology.
Abstract:
Music as the Art of Anxiety: A Philosophical Approach to the Existential-Ontological Meaning of Music. The present research studies music as an art of anxiety from the points of view of both Martin Heidegger's thought and phenomenological philosophy in general. In the Heideggerian perspective, anxiety is understood as a fundamental mode of being (Grundbefindlichkeit) in human existence. Taken as an existential-ontological concept, anxiety is conceived philosophically and not psychologically. The central research questions are: what is the relationship between music and existential-ontological anxiety? In what way can music be considered as an art of anxiety? In thinking of music as a channel and manifestation of anxiety, what makes it a special case? What are the possible applications of phenomenology and Heideggerian thought in musicology? The main aim of the research is to develop a theory of music as an art of existential-ontological anxiety and to apply this theory to musicologically relevant phenomena. Furthermore, the research will contribute to contemporary musicological debates and research as it aims to outline the phenomenological study of music as a field of its own; the development of a specific methodology is implicit in these aims. The main subject of the study, a theory of music as an art of anxiety, integrates Heideggerian and phenomenological philosophies with critical and cultural theories concerning violence, social sacrifice, and mimetic desire (René Girard), music, noise and society (Jacques Attali), and the affect-based charme of music (Vladimir Jankélévitch). Thus, in addition to the subjective mood (Stimmung) of emptiness and meaninglessness, the philosophical concept of anxiety also refers to a state of disorder and chaos in general; for instance, to noise in the realm of sound and total (social) violence at the level of society.
In this study, music is approached as conveying the existentially crucial human compulsion for signifying, i.e. organizing, chaos. In music, this happens primarily at the immediate level of experience, i.e. in affectivity, and also in relation to all of the aforementioned dimensions (sound, society, consciousness, and so on). Thus, music's existential-ontological meaning in human existence, Dasein, is in its ability to reveal different orders of existence as such. Indeed, this makes music the art of anxiety: more precisely, music can be existentially significant at the level of moods. The study proceeds from outlining the relevance of phenomenology and Heidegger's philosophy in musicology to the philosophical development of a theory of music as the art of anxiety. The theory is developed further through the study of three selected specific musical phenomena: the concept of a musical work, guitar smashing in the performance tradition of rock music, and Erik Bergman's orchestral work Colori ed improvvisazioni. The first example illustrates the level of the individual human subject in music as the art of anxiety, as a means of signifying chaos, while the second example focuses on the collective need to socio-culturally channel violence. The third example, being music-analytical, studies contemporary music's ability to mirror the structures of anxiety at the level of a specific musical text. The selected examples illustrate that, in addition to the philosophical orientation, the research also contributes to music analysis, popular music studies, and the cultural-critical study of music. Key words: music, anxiety, phenomenology, Martin Heidegger, ontology, guitar smashing, Erik Bergman, musical work, affectivity, Stimmung, René Girard
Abstract:
The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and rising labour costs. As the level of automation increases, the time that the cattle keeper uses for monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in the loose housing of cattle. Lameness causes losses in milk production and leads to the early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method and the results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking in order to detect lameness was developed and set up at the University of Helsinki research farm Suitia.
The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed. A probabilistic neural network (PNN) classifier model was chosen for the task. The data was divided into two parts, and 5,074 measurements from 37 cows were used to train the model. The model was then evaluated for its ability to detect lameness in the validation dataset, which had 4,868 measurements from 36 cows. The model was able to classify 96% of the measurements correctly as sound or lame, and 100% of the lameness cases in the validation data were identified. The number of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
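The PNN classification step can be sketched as below. This is a generic Parzen-window formulation in Python, not the thesis's actual model; the feature vectors (standing in for leg-load ratios), smoothing width and class labels are illustrative assumptions:

```python
import numpy as np

def pnn_classify(X_train, y_train, X_test, sigma=0.5):
    """Classify test samples with a probabilistic neural network (PNN).

    A PNN is essentially a Parzen-window density estimator per class:
    each training sample contributes a Gaussian kernel, and a test
    sample is assigned to the class with the highest mean activation.
    """
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        # Squared Euclidean distance from x to every training sample.
        d2 = np.sum((X_train - x) ** 2, axis=1)
        activations = np.exp(-d2 / (2.0 * sigma ** 2))
        # Average the kernel activations within each class.
        scores = [activations[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# Toy example: two well-separated clusters standing in for
# "sound" (0) and "lame" (1) leg-load distributions (invented values).
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[0.25, 0.25, 0.25, 0.25], scale=0.02, size=(20, 4))
X1 = rng.normal(loc=[0.40, 0.30, 0.20, 0.10], scale=0.02, size=(20, 4))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 20 + [1] * 20)

X_test = np.array([[0.25, 0.25, 0.25, 0.25],
                   [0.40, 0.30, 0.20, 0.10]])
print(pnn_classify(X_train, y_train, X_test))  # [0 1]
```

A PNN suits this task because training amounts to storing the measurements, so the model can be refreshed as new milkings accumulate.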
Abstract:
The subject of the thesis is the mediated construction of author images in popular music. In the study, the construction of images is treated as a process in which artists, the media and the members of the audience participate. The notions of presented, mediated and compiled author images are used to explain the mediation process and the various authorial roles of the agents involved. In order to explore the issue more closely, I analyse the author images of a group of popular music artists representing the genres of rock, pop and electronic dance music. The analysed material consists mostly of written media texts through which the artists' authorial roles and creative responsibilities are discussed. Theoretically speaking, the starting points for the examination lie in cultural studies and discourse analysis. Even though author images may be conceived as intertextual constructions, the artist is usually presented as a recognizable figure whose purpose is to give the music its public face. This study does not, then, deal with musical authors as such, but rather with their public images and mediated constructions. Because of the author-based functioning of popular music culture and the idea of the artist's individual creative power, the collective and social processes involved in the making of popular music are often superseded by the belief in a single, originating authorship. In addition to the collective practices of music making, the roles of the media and the marketing machinery complicate attempts to clarify the sharing of authorial contributions. As the case studies demonstrate, the differences between the examined author images are connected with a number of themes ranging from issues of auteurism and stardom to the use of masked imagery and the blending of authorial voices. The emergence of new music technologies has also affected not only the ways in which music is made, but also how the artist's authorial status and artistic identity are understood.
In the study at hand, the author images of auteurs, stars, DJs and sampling artists are discussed alongside such varied topics as collective authorship, evaluative hierarchies, visual promotion and generic conventions. Taken together, the examined case studies shed light on the functioning of popular music culture and the ways in which musical authorship is (re)defined.
Abstract:
The dissertation focuses on the development of music education in Estonian kindergartens and the factors influencing it, analysed from a historical perspective relying on a post-positivist paradigm. The study is based on the factors and subjects’ views on kindergarten music education from 1905 to 2008, recorded in written sources or ascertained by means of a questionnaire and interviews. The dissertation deals with music’s functions, music education in retrospect, the factors influencing a child’s musical aptitude and development, and the teacher’s role in it through the prism of history. The formation of Estonian kindergarten music education and the phenomenon of its development have been researched in stages: the first manifestations of music in kindergarten in 1905 - 1940; the formation of the concept of music education in 1941 - 1967; and the application of a unified system in 1968 - 1990. The work also outlines innovative trends in music education at the end of the last millennium and the beginning of this century, in 1991 - 2008. The study relies on a combined design and an analysis of historical archival material and empirical data. The empirical part of the study is based on the questionnaire (n=183) and interviews (n=18) carried out with kindergarten music teachers. The data has been analysed using both qualitative and quantitative methods. The subject of the research is the content and activity types of kindergarten music education and the role of the music teacher in their implementation. The study confirmed that fundamental changes took place in Estonian kindergarten music education due to the change in political power in the 1940s. Following the example of the Soviet system of education, music in kindergarten became an independent music-educational orientation and the position of a professionally trained music teacher was established (1947).
It was also confirmed that in the newly independent Estonian Republic, under the influence of innovative trends, a new paradigm of music education arose, moving from the traditional singing-centred education towards a more balanced use of music activity types (attaching importance to the child-centred approach, with an increase in the number and variety of activity types). The most important conclusions of the dissertation are that the clear concept that has evolved in Estonian kindergarten music education over a century shows improvement and development deriving from contemporary trends; that professionally trained music teachers have had a crucial role in shaping it; and that kindergarten music education is firmly positioned as a part of preschool education in the Estonian system of education. Key words: early childhood music education, history of music education, kindergarten music education, early childhood music teachers
Abstract:
Digital elevation models (DEMs) have been an important topic in geography and the surveying sciences for decades due to their geomorphological importance as the reference surface for gravitation-driven material flow, as well as their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on the decision-making process based on interpretations and applications of terrain analysis. Additionally, it may have an indirect influence on data acquisition and DEM generation. The focus of the thesis was on fine toposcale DEMs, which are typically represented as a 5-50 m grid and used at application scales of 1:10 000-1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, performing analytical and simulation-based error propagation analyses and interpreting the error propagation analysis results. The DEM error model was built using geostatistical methods. The results show that appropriate and exhaustive reporting of the various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and to morphological gross errors, which are detectable with the presented visualisation methods. In addition, a global characterisation of DEM error is a gross generalisation of reality due to the small extent of the areas in which the assumption of stationarity is not violated. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning and local semivariogram analysis.
The error propagation analysis revealed that, as expected, an increase in the DEM vertical error increases the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the model appears to have varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this opinion is now challenged because none of the DEM derivatives investigated in the study had maximum variation with spatially uncorrelated random error. Significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution in generating realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
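The simulation-based approach can be sketched as a small Monte Carlo experiment: draw spatially autocorrelated error realisations by process convolution (here, smoothing white noise with a Gaussian kernel), add them to the DEM, and measure how a derivative such as slope varies. The synthetic DEM, error magnitude and correlation range below are illustrative assumptions, not values from the thesis:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correlated_error(shape, sigma_z, corr_range, rng):
    """One realisation of a spatially autocorrelated DEM error field by
    process convolution: smooth white noise with a Gaussian kernel,
    then rescale to the target error standard deviation."""
    noise = rng.normal(size=shape)
    field = gaussian_filter(noise, sigma=corr_range)
    return field * (sigma_z / field.std())

def slope_deg(dem, cell=10.0):
    """Finite-difference slope in degrees for a given cell size (m)."""
    gy, gx = np.gradient(dem, cell)
    return np.degrees(np.arctan(np.hypot(gx, gy)))

# Synthetic "true" DEM: a gentle ramp with a hill on top.
ny, nx = 64, 64
y, x = np.mgrid[0:ny, 0:nx]
dem = 0.5 * x + 20.0 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)

rng = np.random.default_rng(1)
n_sim, sigma_z = 100, 1.0   # 100 realisations, 1 m vertical error
slopes = np.stack([slope_deg(dem + correlated_error((ny, nx), sigma_z, 3.0, rng))
                   for _ in range(n_sim)])
slope_sd = slopes.std(axis=0)   # per-cell slope uncertainty (degrees)
print(slope_sd.mean())
```

Setting the correlation range to near zero reproduces the spatially uncorrelated case, so the same harness can compare correlated and uncorrelated error models on any derivative.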
Abstract:
This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, this data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a large new ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text mining and decision tree based method for the automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability in this large cross-laboratory integrated dataset, and the minimisation of the systematic measurement errors characteristic of each laboratory, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology.
A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
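The exploratory step (PCA followed by hierarchical clustering) can be sketched on a toy expression matrix. The group structure and dimensions below are invented for illustration; the thesis worked on the integrated GEO/ArrayExpress data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Toy expression matrix: rows = samples, columns = genes.
# Two simulated "biological groups" differ in the first 20 genes.
rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 1.0, size=(10, 100))
group_b = rng.normal(0.0, 1.0, size=(10, 100))
group_b[:, :20] += 4.0          # shift a block of genes in group B
X = np.vstack([group_a, group_b])

# PCA via SVD on the centred matrix: sample coordinates on the
# principal components and the variance explained by each.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = U * S
explained = S ** 2 / np.sum(S ** 2)

# Average-linkage hierarchical clustering on Euclidean distances,
# cut into two clusters.
Z = linkage(pdist(X), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```

On real cross-laboratory data the interesting question is whether clusters like these track biology (cell type, disease state) or laboratory of origin, which is why the quality metrics and sample ontology described above are needed.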
Abstract:
Wireless access is expected to play a crucial role in the future of the Internet. The demands of the wireless environment are not always compatible with the assumptions that were made in the era of wired links. At the same time, new services that take advantage of advances in many areas of technology are being invented. These services include the delivery of mass media like television and radio, Internet phone calls, and video conferencing. The network must be able to deliver these services with acceptable performance and quality to the end user. This thesis presents an experimental study measuring the performance of bulk data TCP transfers, streaming audio flows, and HTTP transfers which compete for the limited bandwidth of a GPRS/UMTS-like wireless link. The wireless link characteristics are modeled with a wireless network emulator. We analyze how different competing workload types behave with regular TCP and how active queue management, Differentiated Services (DiffServ), and a combination of TCP enhancements affect the performance and the quality of service. We test on four link types, including an error-free link and links with different Automatic Repeat reQuest (ARQ) persistency. The analysis consists of comparing the resulting performance in different configurations based on defined metrics. We observed that DiffServ and Random Early Detection (RED) with Explicit Congestion Notification (ECN) are useful, and in some conditions necessary, for quality of service and fairness, because without DiffServ and RED a long queuing delay and congestion-related packet losses cause problems. However, we observed situations where there is still room for significant improvement if the link level is aware of the quality of service. Only a very error-prone link diminishes the benefits to nil. The combination of TCP enhancements improves performance; these include an initial window of four segments, Control Block Interdependence (CBI) and Forward RTO recovery (F-RTO).
The initial window of four segments helps a later-starting TCP flow to start faster but generates congestion under some conditions. CBI prevents slow-start overshoot and balances slow start in the presence of error drops, and F-RTO successfully reduces unnecessary retransmissions.
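The RED mechanism the study relies on can be sketched as follows. The thresholds and queue weight are illustrative defaults, and this toy simplifies the classic algorithm (no idle-time handling or drop-spacing counter) and drops rather than ECN-marks packets; a real RED+ECN gateway would mark ECN-capable flows instead:

```python
import random

class RedQueue:
    """Minimal Random Early Detection (RED) sketch: an exponentially
    weighted moving average (EWMA) of the queue length drives an
    early-drop probability that rises linearly between two thresholds,
    signalling congestion before the buffer overflows."""

    def __init__(self, min_th=5, max_th=15, max_p=0.1, wq=0.2, limit=30):
        self.min_th, self.max_th, self.max_p = min_th, max_th, max_p
        self.wq, self.limit = wq, limit
        self.queue = []
        self.avg = 0.0

    def enqueue(self, pkt, rng=random.random):
        # Update the EWMA of the queue size on each arrival.
        self.avg = (1 - self.wq) * self.avg + self.wq * len(self.queue)
        if len(self.queue) >= self.limit or self.avg >= self.max_th:
            return False                # forced drop (buffer/avg limit)
        if self.avg >= self.min_th:
            # Drop probability grows linearly from 0 to max_p
            # as the average moves from min_th to max_th.
            p = self.max_p * (self.avg - self.min_th) / (self.max_th - self.min_th)
            if rng() < p:
                return False            # probabilistic early drop
        self.queue.append(pkt)
        return True

    def dequeue(self):
        return self.queue.pop(0) if self.queue else None

# Demo: with early drops disabled (rng always 1.0), packets pile up
# until the average queue crosses max_th and forced drops begin.
q = RedQueue()
accepted = sum(q.enqueue(i, rng=lambda: 1.0) for i in range(50))
print(accepted, "of 50 packets accepted before forced drops")
```

Using the smoothed average rather than the instantaneous queue length is what lets RED tolerate short bursts while still reacting to sustained congestion, the property behind the fairness results above.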
Abstract:
Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories, which must analyze them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to research newly available methodologies. In this thesis, an accurate-mass-based analysis employing liquid chromatography - time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation, the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution.
Hydrophilic interaction liquid chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing performance comparable to or better than that of other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for the screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethyl starch (HES). The specificity of the assay was improved by removing interfering matrix compounds with size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for the comprehensive screening of chemically different low and high molecular weight compounds, for the quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore, LC-TOFMS can be exploited widelyly in doping control, reducing the need for several separate analysis techniques.
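The core of a reverse database search (matching measured peaks against a target list on retention time and mass accuracy) can be sketched as below. The database entries, retention times and tolerances are illustrative assumptions, not data from the thesis, and the isotope-match criterion is omitted for brevity:

```python
# Hypothetical target database: name -> (retention time in min,
# [M+H]+ m/z). The m/z values are monoisotopic protonated masses;
# the retention times are invented for this sketch.
DATABASE = {
    "morphine": (2.10, 286.1438),
    "codeine":  (3.45, 300.1594),
}

def reverse_search(peaks, rt_tol=0.2, mass_tol_mda=1.0):
    """Match measured (rt, mz) peaks against the target database.

    A hit requires both the retention-time window and the mass
    accuracy criterion (expressed in mDa, as in TOFMS screening)
    to be satisfied simultaneously.
    """
    hits = []
    for name, (db_rt, db_mz) in DATABASE.items():
        for rt, mz in peaks:
            mass_err_mda = abs(mz - db_mz) * 1000.0   # Da -> mDa
            if abs(rt - db_rt) <= rt_tol and mass_err_mda <= mass_tol_mda:
                hits.append((name, rt, round(mass_err_mda, 2)))
    return hits

# One peak within 0.3 mDa of morphine, one unknown peak.
peaks = [(2.12, 286.1441), (5.00, 310.2000)]
print(reverse_search(peaks))  # [('morphine', 2.12, 0.3)]
```

Because only database entries can produce hits, the search is "reverse": untargeted matrix signals are ignored, which is what keeps a sub-mDa mass window selective even in a complex urine extract.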