11 results for Qualitative data analysis software

in Helda - Digital Repository of the University of Helsinki


Relevance:

100.00%

Publisher:

Abstract:

The starting point of this study was to find out how historical consciousness manifests itself in the conceptions and experiences of Chilean refugees and their descendants. Previous research on historical consciousness has shown that powerful experiences, such as a revolution or becoming a refugee, may have an effect on historical consciousness. The purpose of this study is to determine how those experiences in the past have influenced the Chilean refugees' and their descendants' interpretations of the present and expectations for the future. The research material was collected by interviewing four Chilean refugees who fled to Finland between 1973 and 1976 and four young adults representing the second generation. All second-generation interviewees were born in Finland, and one or both of their parents were Chilean refugees. The two groups were not related to each other. The empirical part of the research was carried out using qualitative methods. The research material was collected through focused interviews and was analysed with the qualitative data analysis software Atlas.ti 6.0. Content analysis was the main research tool. Previous theory of historical consciousness and the research questions were used to create seven categories in which historical consciousness manifests itself: biographical memory, collective memory, experiences of living between two cultures, the idea of man, the essence of history and the reason for living, value conceptions, and expectations of the future. Content analysis was based on these categories, while subcategories were derived from the research material and created during the analysis. The results of the study are presented through these categories. The study revealed that the experiences of revolution and of being a refugee have a significant role in the historical consciousness of the Chilean refugees. This became evident in their biographical memory being divided into three parts, in their values, and in their belief in an individual's possibility to govern her own life. The second generation was also exposed to their parents' experiences of the past. The collective trauma in their parents' past has been an indirect part of their lives and has affected the way they think of themselves, their conceptions, and their place in the present world. The active and regular retrospection by Chilean adults in Finland, together with the activities of the Gabriela Mistral club, has played a major part in the construction of their historical consciousness.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in 2006 about 4,000 farms worldwide used over 6,000 milking robots. There is a worldwide movement towards fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper uses for monitoring animals often decreases. This has created a need for systems that automatically monitor the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, it is labour-intensive as an on-farm method and its results are subjective. A four-balance system for measuring the leg load distribution of dairy cows during milking, in order to detect lameness, was developed and installed at the University of Helsinki research farm Suitia. The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed; a probabilistic neural network (PNN) classifier was chosen for the task. The data was divided into two parts: 5,074 measurements from 37 cows were used to train the model, and the model was evaluated on its ability to detect lameness in a validation dataset of 4,868 measurements from 36 cows. The model classified 96% of the measurements correctly as sound or lame, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
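A probabilistic neural network of the kind named above is essentially a Gaussian kernel-density classifier: each class density is estimated from its training patterns and a new measurement is assigned to the class with the highest average kernel activation. The sketch below illustrates this; the leg-load features, the synthetic data and the smoothing parameter sigma are illustrative assumptions, not values from the thesis.

```python
# Minimal PNN (Gaussian kernel-density) classifier sketch for per-milking
# leg-load features. Feature names, data and sigma are hypothetical.
import numpy as np

def pnn_predict(X_train, y_train, X_new, sigma=0.5):
    """Classify rows of X_new with a Gaussian-kernel PNN trained on (X_train, y_train)."""
    classes = np.unique(y_train)
    scores = np.zeros((X_new.shape[0], classes.size))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]                        # pattern units of class c
        # squared distances between every new sample and every training pattern
        d2 = ((X_new[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        # summation unit: average Gaussian kernel activation per class
        scores[:, j] = np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)
    return classes[np.argmax(scores, axis=1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # toy features: [leg-load asymmetry, kicks per milking] (hypothetical)
    sound = rng.normal([0.1, 0.5], 0.1, size=(50, 2))
    lame = rng.normal([0.4, 2.0], 0.2, size=(50, 2))
    X = np.vstack([sound, lame])
    y = np.array([0] * 50 + [1] * 50)
    print(pnn_predict(X, y, np.array([[0.12, 0.6], [0.45, 1.8]])))  # expect [0, 1]
```

In a PNN the only tunable quantity is the kernel width, which is why this architecture suits on-farm data where retraining must be simple.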

Relevance:

100.00%

Publisher:

Abstract:

This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-oriented software engineering, Monte Carlo simulation, cluster computing, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation toolkit, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled simulation-to-data-analysis cycle: typically, a Geant4 computer experiment is used to understand test beam measurements. Another aspect of this thesis is therefore a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, the full CMS detector description, and event reconstruction. Using the ROOT data analysis framework, we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.
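The b-tagging network in the thesis was built within the ROOT framework; purely as an illustration of the generic "train a network, cut on its output" signal/background separation idea, here is a minimal sketch using scikit-learn on toy data. The two "jet variables", the Gaussian toy distributions and the 0.5 cut are assumptions, not the thesis's discriminants.

```python
# Generic ANN signal/background separation sketch (not the ROOT-based tool
# described in the thesis). Toy variables and data are hypothetical.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2000
# hypothetical jet variables, e.g. an impact-parameter-like and a vertex-mass-like quantity
signal = rng.normal([2.0, 1.5], [1.0, 0.5], size=(n, 2))
background = rng.normal([0.0, 0.8], [1.0, 0.5], size=(n, 2))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0)
net.fit(X_tr, y_tr)

# use the network output as a tagging discriminant and cut on it
scores = net.predict_proba(X_te)[:, 1]
efficiency = (scores[y_te == 1] > 0.5).mean()   # fraction of signal kept
mistag = (scores[y_te == 0] > 0.5).mean()       # fraction of background kept
print(f"signal efficiency {efficiency:.2f}, background mistag rate {mistag:.2f}")
```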

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, we develop theory and methods for computational data analysis. The problems in data analysis are approached from three perspectives: statistical learning theory, the Bayesian framework, and the information-theoretic minimum description length (MDL) principle. The contributions in statistical learning theory address the possibility of generalization to unseen cases, and regression analysis with partially observed data, with an application to mobile device positioning. In the second part of the thesis, we discuss so-called Bayesian network classifiers and show that they are closely related to logistic regression models. In the final part, we apply the MDL principle to tracing the history of old manuscripts and to noise reduction in digital signals.
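The connection between Bayesian network classifiers and logistic regression can be illustrated with the simplest member of the family, naive Bayes over binary features; this is a standard special case given here for intuition, not the thesis's general result.

```latex
% Naive Bayes with binary features as a special case of logistic regression.
% For Y \in \{0,1\}, x = (x_1,\dots,x_n) with x_i \in \{0,1\} and
% P(x_i = 1 \mid Y = y) = \theta_{iy}, the posterior log-odds are linear in x:
\[
\log \frac{P(Y=1 \mid x)}{P(Y=0 \mid x)}
  = \log \frac{P(Y=1)}{P(Y=0)}
  + \sum_{i=1}^{n} \left[
      x_i \log \frac{\theta_{i1}}{\theta_{i0}}
    + (1 - x_i) \log \frac{1-\theta_{i1}}{1-\theta_{i0}}
    \right]
  = w_0 + \sum_{i=1}^{n} w_i x_i ,
\]
\[
\text{so that} \quad P(Y=1 \mid x) = \frac{1}{1 + e^{-(w_0 + w^{\top} x)}} ,
\]
% i.e. exactly the functional form of a logistic regression model, with the
% weights fixed by the naive Bayes parameters rather than fitted directly.
```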

Relevance:

100.00%

Publisher:

Abstract:

Accelerator mass spectrometry (AMS) is an ultrasensitive technique for measuring the concentration of a single isotope. The electric and magnetic fields of an electrostatic accelerator system are used to filter out other isotopes from the ion beam. The high ion velocity means that molecules can be destroyed and removed from the measurement background. As a result, concentrations down to one atom in 10^16 atoms are measurable. This thesis describes the construction of the new AMS system in the Accelerator Laboratory of the University of Helsinki. The system is described in detail along with the relevant ion optics, and its performance and some of the 14C measurements done with it are reported. In the second part of the thesis, a novel statistical model for the analysis of AMS data is presented. Bayesian methods are used in order to make the best use of the available information. In the new model, instrumental drift is modelled with a continuous first-order autoregressive process. This enables rigorous normalization to standards measured at different times. The Poisson statistical nature of a 14C measurement is also taken into account properly, so that uncertainty estimates are much more stable. It is shown that, overall, the new model improves both the accuracy and the precision of AMS measurements. In particular, the results can be improved for samples with very low 14C concentrations or samples measured only a few times.
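The two model ingredients named above, an autoregressive instrumental drift and Poisson counting statistics, can be sketched with a short simulation. The sketch below only generates synthetic data and applies a naive run-by-run normalisation; the thesis's Bayesian model instead infers the drift jointly from all runs. All parameter values are illustrative assumptions, not those of the Helsinki AMS system.

```python
# Sketch of an AR(1) instrumental drift combined with Poisson 14C counts.
# Parameters and count rates are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

n_runs = 200          # consecutive measurement runs
phi = 0.95            # AR(1) correlation between successive runs
sigma = 0.02          # innovation standard deviation of the drift

# multiplicative instrumental drift d_t with AR(1) structure:
# d_t = phi * d_{t-1} + eps_t,   eps_t ~ N(0, sigma^2)
d = np.zeros(n_runs)
for t in range(1, n_runs):
    d[t] = phi * d[t - 1] + rng.normal(0.0, sigma)
efficiency = np.exp(d)                      # drifting detection efficiency (relative)

# Poisson-distributed counts for a standard and for an unknown sample
counts_standard = rng.poisson(500.0 * efficiency)
counts_sample = rng.poisson(50.0 * efficiency)

# naive normalisation of the sample against the standard, run by run
ratio = counts_sample / np.maximum(counts_standard, 1)
print(f"mean normalised ratio {ratio.mean():.3f} "
      f"+/- {ratio.std(ddof=1) / np.sqrt(n_runs):.3f}")
```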

Relevance:

100.00%

Publisher:

Abstract:

Aims: To develop and validate tools for estimating the residual noise covariance in Planck frequency maps, to quantify signal error effects, and to compare different techniques for producing low-resolution maps. Methods: We derive analytical estimates of the covariance of the residual noise contained in low-resolution maps produced with a number of map-making approaches. We test these analytical predictions using Monte Carlo simulations and study their impact on angular power spectrum estimation. We use simulations to quantify the level of signal error incurred in the different resolution downgrading schemes considered in this work. Results: We find excellent agreement between the optimal residual noise covariance matrices and the Monte Carlo noise maps. For destriping map-makers, the extent of the agreement is dictated by the knee frequency of the correlated noise component and the chosen baseline offset length. Signal striping is shown to be insignificant when properly dealt with. In map resolution downgrading, we find that a carefully selected window function is required to reduce aliasing to the sub-percent level at multipoles ell > 2*Nside, where Nside is the HEALPix resolution parameter. We show that sufficient characterization of the residual noise is unavoidable if one is to draw reliable constraints on large-scale anisotropy. Conclusions: We have described how to compute low-resolution maps with a controlled sky signal level and a reliable estimate of the residual noise covariance. We have also presented a method for smoothing the residual noise covariance matrices to describe the noise correlations in smoothed, bandwidth-limited maps.
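The Monte Carlo validation step described above boils down to estimating the pixel-pixel covariance from simulated noise maps and comparing it with the analytical prediction. The sketch below shows that comparison on a toy covariance; the pixel count, the number of realisations and the toy "analytical" matrix are synthetic assumptions standing in for the map-making error propagation, not Planck quantities.

```python
# Compare a Monte Carlo noise-covariance estimate with an analytical one.
# Toy covariance, pixel count and realisation count are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
npix, nmc = 48, 5000                        # low-resolution pixels, MC realisations

# toy "analytical" residual-noise covariance: white noise plus one long mode
pix = np.arange(npix)
mode = np.cos(2 * np.pi * pix / npix)
analytic_cov = 1e-4 * np.eye(npix) + 5e-5 * np.outer(mode, mode)

# draw Monte Carlo noise maps with that covariance and estimate it back
L = np.linalg.cholesky(analytic_cov)
noise_maps = rng.standard_normal((nmc, npix)) @ L.T    # each row is one noise map
mc_cov = noise_maps.T @ noise_maps / nmc

# relative Frobenius-norm difference shrinks roughly as 1/sqrt(nmc)
rel_err = np.linalg.norm(mc_cov - analytic_cov) / np.linalg.norm(analytic_cov)
print(f"relative difference between MC and analytical covariance: {rel_err:.3f}")
```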

Relevance:

100.00%

Publisher:

Abstract:

The aim of the study was to become acquainted with the activity of Näppärät Mummot, a Lahti-based crafts society, and its importance to the well-being of the members of the group. The selected aim, i.e. analysing well-being, largely shaped the whole research process and its results. According to earlier studies in the field, different forms of craft and expressive activity promote one's well-being and support the work for one's identity. Based on my theoretical knowledge, my research set out to 1) form a general view of the crafts culture within Näppärät Mummot and 2) find out how recollective craft that promotes well-being is perceived through communality, experiential activity, work for one's identity, and divided as well as undivided craft. The qualitative fieldwork was governed by an ethnographic research strategy, according to which I set out to become thoroughly familiar with the society I was studying. The methods I used to collect data were participant observation and thematic interviews. I used a field diary for writing down all data acquired through observation. The interviewee group consisted of seven members of Näppärät Mummot. An MP3 recorder was used to record the interviews, which I transcribed later. The method for data analysis was qualitative content analysis, for which I used Weft QDA, a qualitative data analysis software application. Using coding and theory-driven analysis, I formed themes from the data that shed light on the research tasks, and I kept the literature and the collected data in dialogue throughout the analysis process. Lastly, drawing on the classes of meaning of therapeutic craft that I sketched by means of summarizing and classifying, I presented the central concepts that describe the main results of the study. The main results were six concepts that describe Näppärät Mummot's crafts culture and recollective craft with its beneficial effect on well-being: 1) autobiographical craft, 2) shared work for one's identity, 3) shared intention for craft, 4) craft as a partner, 5) an individual manner of craft, and 6) shared improvement. Craft promoted well-being in many ways. It was used to support inner life management in difficult times, and it also provided experiences of empowerment through the pleasure of craft. Expressive, shared craft also served as a means of reinforcing one's identity in various ways. Expressive work for one's identity through autobiographical themes of craft represented rearranging one's life through holistic craft. A personal way of doing things also served as expressive action and work for one's identity even with divided craft. Shared work for identities meant reinforcing the identities of the members through discourses of craft and interaction with their close ones. The interconnection between communality and craft, and their shared meaning, is shown by the fact that communality motivated the members to work on their craft projects, while craft served as a means of communication between the members: communication through craft was easier than verbal communication. The results cannot be generalized to other groups; they describe the versatile ways in which recollective craft promotes well-being within the crafts society Näppärät Mummot. However, the results do introduce a new perspective to the social discussion on how cultural activities promote well-being.

Relevance:

100.00%

Publisher:

Abstract:

A key trait of Free and Open Source Software (FOSS) development is its distributed nature. Nevertheless, two project-level operations, the fork and the merge of program code, are among the least well understood events in the lifespan of a FOSS project. Some projects have explicitly adopted these operations as the primary means of concurrent development. In this study, we examine the effect of highly distributed software development, as found in the Linux kernel project, on the collection and modelling of software development data. We find that distributed development calls for sophisticated temporal modelling techniques where several versions of the source code tree can exist at once. Attention must be turned towards the methods of quality assurance and peer review that projects employ to manage these parallel source trees. Our analysis indicates that two new metrics, fork rate and merge rate, could be useful for determining the role of distributed version control systems in FOSS projects. The study presents a preliminary data set consisting of version control and mailing list data.
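One plausible way to compute the proposed fork-rate and merge-rate metrics from version control data is to read them off the commit graph of a git repository: a "fork" as a commit with more than one child (a branch point) and a "merge" as a commit with more than one parent. Whether this matches the study's exact operationalisation is an assumption; the sketch below only illustrates the idea on a local repository.

```python
# Sketch: fork rate and merge rate from a local git commit graph.
# "Fork" is read here as a branch point; this interpretation is an assumption.
import subprocess
from collections import Counter

def commit_graph(repo_path="."):
    """Return a list of (commit, parents) pairs for every commit in the repository."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log", "--all", "--pretty=%H %P"],
        capture_output=True, text=True, check=True,
    ).stdout
    graph = []
    for line in out.splitlines():
        sha, *parents = line.split()
        graph.append((sha, parents))
    return graph

def fork_and_merge_rate(graph):
    """Fraction of commits that are branch points (forks) and merge commits."""
    child_count = Counter(p for _, parents in graph for p in parents)
    n = len(graph)
    forks = sum(1 for sha, _ in graph if child_count[sha] > 1)
    merges = sum(1 for _, parents in graph if len(parents) > 1)
    return forks / n, merges / n

if __name__ == "__main__":
    fork_rate, merge_rate = fork_and_merge_rate(commit_graph("."))
    print(f"fork rate {fork_rate:.3f}, merge rate {merge_rate:.3f} per commit")
```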
