948 results for university standards
Abstract:
This research is connected with an education development project for the four-year officer education program at the National Defence University. In this curriculum, physics was studied in two alternative course plans, scientific and general. Observations connected to the latter, e.g. student feedback and learning outcomes, indicated that action was needed to support the course. The reform work focused on the production of aligned, course-related instructional material. The learning material project produced a customized textbook set for the students of the general basic physics course. The research adapts phases that are typical of Design-Based Research (DBR). It analyses the feature requirements for a physics textbook aimed at a specific sector, frames supporting instructional material development, and summarizes the experiences gained in the learning material project when the selected frames were applied. The quality of instructional material is an essential part of qualified teaching. The goal of instructional material customization is to increase the product's customer-centric nature and to enhance its function as a support medium for the learning process. Textbooks are still one of the core elements of physics teaching. The idea of a textbook will remain, but its form and appearance may change according to the prevailing technology. The work deals with substance-related frames (demands on a physics textbook from the PER viewpoint, quality thinking in educational material development), frames of university pedagogy, and instructional material production processes. A wide knowledge and understanding of the different frames are useful in development work if they are utilized to aid inspiration without limiting new reasoning and new kinds of models. Applying customization even in frame utilization supports creative, situation-aware design and diminishes the gap between theory and practice.
Generally, physics teachers produce their own supplementary instructional material. Even though customization thinking is not unknown, the threshold for producing an entire textbook may be high. Although the observations here are from the general physics course at the NDU, the research also provides tools for development in other discipline-related educational contexts. This research is an example of instructional material development work, together with the questions it uncovers, and presents thoughts on when textbook customization is rewarding. At the same time, the research aims to further creative customization thinking in instruction and development. Key words: Physics textbook, PER (Physics Education Research), Instructional quality, Customization, Creativity
Abstract:
Examines the symbolic significance of major events and their security provision in the historical and contemporary context of the European Code of Police Ethics. Stresses the potential of major events to set new practical policing and security standards of technology and, in doing so, to necessitate the maintenance of professional ethical standards for policing in Europe.
Abstract:
As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on some of his family members' computers, on some of his own computers, but also at some online services which he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure. Thus the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable to users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, which suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for the content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database.
Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports the development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
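The cryptographically verifiable references described in the abstract can be illustrated with plain content addressing. This is a minimal sketch of the general idea, not Peerscape's actual naming scheme; the function names are our own:

```python
import hashlib

def content_ref(data: bytes) -> str:
    # The reference is the SHA-256 digest of the content, so anyone
    # holding the data can recompute the digest and verify it.
    return hashlib.sha256(data).hexdigest()

def verify(ref: str, data: bytes) -> bool:
    # A replica fetched from any peer is authentic exactly when its
    # digest matches the reference it was requested under.
    return content_ref(data) == ref

photo = b"...family photo bytes..."
ref = content_ref(photo)
```

Because a reference is derived from the content itself, every replica, whether on a relative's machine or at an online service, can be checked independently, and a tampered copy fails verification.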
Abstract:
One of the effects of the Internet is that the dissemination of scientific publications has, within a few years, migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying the content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.
Abstract:
One of the central issues in making efficient use of IT in the design, construction and maintenance of buildings is the sharing of digital building data across disciplines and lifecycle stages. One technology which enables data sharing is CAD layering, which, to be of real use, requires the definition of standards. This paper focuses on the background, objectives and effectiveness of the international standard ISO 13567, Organisation and naming of layers for CAD. The focus is on the efficiency and effectiveness of the standardisation and standard implementation process, rather than on the technical details. The study was conducted as a qualitative study with a number of experts who responded to a semi-structured mail questionnaire, supplemented by personal interviews. The main results were that CAD layer standards based on the ISO standard have been implemented, particularly in northern European countries, but are not very widely used. A major problem identified was the lack of resources for marketing and implementing the standard as national variants once it had been formally accepted.
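The kind of structured layer naming the ISO standard defines can be illustrated with a small parser. This is a hypothetical sketch assuming the commonly cited fixed-width layout of the mandatory fields (agent responsible, element, presentation); the standard text and its national variants define further optional fields, so a real implementation should be checked against them:

```python
def parse_layer(name: str) -> dict:
    # Hypothetical split of an ISO 13567-style layer name into its
    # mandatory fields, assuming a 2 + 6 + 2 character layout.
    return {
        "agent": name[0:2],         # responsible party, e.g. architect
        "element": name[2:8],       # building element classification code
        "presentation": name[8:10], # graphical presentation code
    }

fields = parse_layer("A-21----D-")
```

A shared, fixed-width naming scheme like this is what makes layered CAD files machine-filterable across disciplines, which is exactly the data-sharing benefit the standardisation effort aimed at.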
Abstract:
This dissertation empirically explores the relations among three theoretical perspectives: university students' approaches to learning, self-regulated learning, and cognitive and attributional strategies. The relations were quantitatively studied from both variable- and person-centered perspectives. In addition, the meaning that students gave to their disciplinary choices was examined. The general research questions of the study were: 1) What kinds of relationships exist among approaches to learning, regulation of learning, and cognitive and attributional strategies? 2) What kinds of cognitive-motivational profiles can be identified among university students, and how are such profiles related to study success and well-being? 3) How do university students explain their disciplinary choices? Four empirical studies addressed these questions. Studies I, II, and III were quantitative, applying self-report questionnaires, and Study IV was qualitative in nature. Study I explored relations among cognitive strategies, approaches to learning, regulation of learning, and study success by using correlations and a K-means cluster analysis. The participants were 366 students from various faculties at different phases of their studies. The results showed that all the measured constructs were logically related to each other in both variable- and person-centered approaches. Study II further examined what kinds of cognitive-motivational profiles could be identified among first-year university students (n=436) in arts, law, and agriculture and forestry. Differences in study success, exhaustion, and stress among students with differing profiles were also examined. Using a latent class cluster analysis (LCCA), three groups of students were identified: non-academic (34%), self-directed (35%), and helpless students (31%). Helpless students reported the highest levels of stress and exhaustion. Self-directed students received the highest grades.
In Study III, cognitive-motivational profiles were identified among novice teacher students (n=213) using LCCA. Well-being, epistemological beliefs, and study success were examined in relation to the profiles. Three groups of students were found: non-regulating (50%), self-directed (35%), and non-reflective (22%). Self-directed students again received the best grades. Non-regulating students reported the highest levels of stress and exhaustion, the lowest level of interest, and showed the strongest preference for certain and practical knowledge. Study IV, which was qualitative in nature, explored how first-year students (n=536) in three fields of study (arts, law, and veterinary medicine) explained their disciplinary choices. Content analyses showed that interest appeared to be a common concept in students' descriptions of their choices across the three faculties. However, the freshmen's objects of interest appeared rather unspecified. Veterinary medicine and law students most often referred to future work or a profession, whereas only one-fifth of the arts students did so. The dissertation showed that combining different theoretical perspectives and methodologies enabled us to build a rich picture of university students' cognitive and motivational predispositions towards studying and learning. Further, cognitive-emotional aspects played a significant role in studying, not only in relation to study success, but also in terms of well-being. Keywords: approaches to learning, self-regulation, cognitive and attributional strategies, university students
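The person-centered, cluster-analytic approach used in Study I can be sketched with a toy K-means run. The scores, variable names, and cluster count below are purely illustrative; the dissertation's own analyses applied K-means and latent class clustering to real questionnaire data:

```python
def dist2(a, b):
    # squared Euclidean distance between two score profiles
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, init, iters=20):
    # Plain K-means: centers start at the rows named in `init`.
    centers = [list(points[i]) for i in init]
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each student to the nearest profile center
        labels = [min(range(len(centers)), key=lambda j: dist2(p, centers[j]))
                  for p in points]
        # move each center to the mean of its assigned students
        for j in range(len(centers)):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centers[j] = [sum(c) / len(members) for c in zip(*members)]
    return labels

# Hypothetical standardized scores (deep approach, self-regulation,
# optimism); rows are students.
scores = [[1.0, 1.1, 0.9],
          [0.9, 1.0, 1.2],
          [-1.0, -0.9, -1.1],
          [-1.1, -1.2, -0.8]]
labels = kmeans(scores, init=(0, 3))
```

Students with similar score profiles end up in the same cluster, which is what allows groups such as "self-directed" or "helpless" students to be identified and then compared on outcomes like grades, stress, and exhaustion.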
Abstract:
Study orientations in higher education consist of various dimensions, such as approaches to learning, conceptions of learning and knowledge (i.e. epistemologies), self-regulation, and motivation. They have also been measured in different ways. The main orientations typically reported are reproducing and meaning orientations. The present study explored dimensions of study orientations, focusing in particular on pharmacy and medicine. New versions of self-report instruments were developed and tested in various contexts and in two countries. Furthermore, the linkages between study orientations and students' epistemological development were explored. The context of problem-based learning (PBL) small groups was investigated in order to better understand how collaboration contributes to the quality of learning. The participants of Study I (n=66) were pharmacy students, who were followed in terms of their study orientations and epistemologies during a three-year professionally oriented program. A reproducing orientation to studying diminished during the studies, whereas only a few students maintained their original level of meaning orientation. Dualism was found to be associated with a reproducing orientation. In Study II, practices associated with deep and surface approaches to learning were measured in two differing ways, in order to better distinguish between what students believed to be useful in studying and the extent to which they applied their beliefs in practice when preparing for examinations. Differences between domains were investigated by including a sample of Finnish and Swedish medical students (n=956) and a Finnish non-medical sample of university students (n=865). Memorizing and rote learning appeared as differing components of a surface approach to learning, while understanding, relating, and critical evaluation of knowledge emerged as aspects of a deep approach to learning. A structural model confirmed these results in both student samples.
Study III explored a wide variety of dimensions of learning in medical education. Swedish medical students (n=280) answered the questionnaire. The deep approach to learning was strongly related to collaboration and reflective learning, whereas the surface approach was associated with novice-like views of knowledge and the valuing of certain and directly applicable knowledge. PBL students aimed at understanding, but also valued the role of memorization. Study IV investigated 12 PBL tutorial groups of students (n=116) studying microbiology and pharmacology in a medical school. The educational application was expected to support a deep approach to learning: group members' course grades in a final examination were related to the perceived functioning of the PBL tutorial groups. Further, the quality of the cases that had been used as triggers for learning was associated with the quality of small-group functioning. New dimensions of study orientations were discovered. In particular, novel, finer distinctions were found within the deep approach component. In medicine, critical evaluation of knowledge appeared to be less valued than understanding and relating. Further, collaboration appeared to be closely related to the deep approach, and it was also important in terms of successful PBL studying. The results of the studies confirmed the previously found associations between approaches to learning and study success, but showed interesting context- and subgroup-related differences in this respect. Students' ideas about the nature of knowledge and their approaches to learning were shown to be closely related. The present study expanded our understanding of the dimensions of study orientations, of their development, and of their contextual variability in pharmacy and medicine.
Abstract:
For the past two centuries, nationalism has been among the most influential legitimizing principles of political organization. According to its simple definition, nationalism is a principle or a way of thinking and acting which holds that the world is divided into nations, and that national and political units should be congruent. Nationalism can thus be divided into two aspects: internal and external. Internally, the political units, i.e., states, should be made up of only one nation. Externally, each nation-state should be sovereign. Transnational governance of the rights of national minorities violates both these principles. This study explores the formation, operation, and effectiveness of the European post-Cold War minorities system. The study identifies two basic approaches to minority rights: security and justice. These approaches have been used to legitimize international minority politics, and they also inform the practice of transnational governance. The security approach is based on the recognition that the norm of national self-determination cannot be fulfilled in all relevant cases, and so minority rights are offered as compensation to the dissatisfied national groups, reducing their aspiration to challenge the status quo. From the justice perspective, minority rights are justified as a compensatory strategy against discrimination caused by majority nation-building. The research concludes that the post-Cold War minorities system was justified on the basis of a particular version of the security approach, according to which only Eastern European minority situations are threatening because of the ethnic variant of nationalism that exists in that region. This security frame was essential in internationalising minority issues and justifying the swift development of norms and institutions to deal with them. However, from the justice perspective this approach is problematic, since it justified double standards in European minority politics.
Even though majority nation-building is often detrimental to minorities in Western Europe as well, Western countries can treat their minorities more or less however they choose. One of the main contributions of this thesis is the detailed investigation of the operation of the post-Cold War minorities system. For the first decade after its creation in the early 1990s, the system operated mainly through its security track, which is based on the field activities of the OSCE, supported by the EU. The study shows how the effectiveness of this track was based on inter-organizational cooperation in which various transnational actors compensate for each other's weaknesses. After the enlargement of the EU and the dissolution of membership conditionality, this track, which was limited to Eastern Europe from the start, has become increasingly ineffective. Since the EU enlargement, the focus of the minorities system has shifted more and more towards its legal track, which is based on the Framework Convention for the Protection of National Minorities (Council of Europe). The study presents in detail how a network of like-minded representatives of governments, international organizations, and independent experts was able to strengthen the framework convention's (originally weak) monitoring system considerably. The development of the legal track allows for a more universal and consistent, justice-based approach to minority rights in contemporary Europe, but the nationalist principle of organization still severely hinders the materialization of this possibility.
Abstract:
This thesis explores the particular framework of evidentiary assessment in three selected appellate national asylum procedures in Europe and discusses the relationship between these procedures, on the one hand, and between these procedures and other legal systems, including the EU legal order and international law, on the other. A theme running throughout the thesis is the EU's striving towards approximation of national asylum procedures, and the study analyses the evidentiary assessment of national procedures with the aim of pinpointing similarities and differences, and the influences which affect these distinctions. The thesis first explores the frames constructed for national evidentiary solutions by studying the object of decision-making and the impact of legal systems outside the national one. Second, the study analyses the factual evidentiary assessment of three national procedures: German, Finnish and English. Third, the study explores the interrelationship between these procedures and the legal systems influencing them, and poses questions in relation to the strivings of the EU and methods of convergence. The thesis begins by stating the framework and starting points of the research. It moves on to establish keys of comparison concerning four elements of evidentiary assessment that are of importance to any appellate asylum procedure, and that can be compared between national procedures, on the one hand, and between international, regional and national frameworks, on the other. Four keys of comparison are established: the burden of proof, demands for evidentiary robustness, the standard of proof, and requirements for the methods of evidentiary assessment. These keys of comparison are then identified in three national appellate asylum procedures, and in order to reach conclusions on the evidentiary standards of the appellate asylum procedures, relevant elements of the asylum procedures in general are presented.
Further, institutional, formal and procedural matters which have an impact on the evidentiary standards in the national appellate procedures are analysed. From there, the thesis moves on to establish the relationship between national evidentiary standards and the legal systems which affect them, and gives reasons for similarities and divergences. Further, the thesis studies the impact of the national frameworks on the regional and international level. Lastly, the dissertation makes a de lege ferenda survey of the relationship between EU developments, the goal of harmonization in relation to national asylum procedures, and the particular features of evidentiary standards in national appellate asylum procedures. Methodology: The thesis follows legal dogmatic methods. The aim is to analyse legal norms and legal constructions and to give them content and context. The study takes as its starting point an understanding of the purposes of legal research, also regarding evidence and asylum, to determine the contents of valid law through analysis and systematization. However, as evidentiary issues are traditionally vaguely defined in normative terms, a strict traditional normative dogmatic approach is not applied. For the same reason a traditionalist and strict legal positivism is not applied. The dogmatics applied in the analysis is supported by practical analysis. The aim is not only to reach conclusions concerning the contents of legal norms and the requirements of law, but also to study the use and practical functioning of these norms, giving them a practical context. Further, the study relies on a comparative method. A functionalist comparative method is employed, and keys of comparison are found in the evidentiary standards of three selected national appellate asylum procedures. The functional equivalences of the German, Finnish and English evidentiary standards of appellate asylum procedures are compared, and they are positioned in a European and international legal setting.
Research results: The thesis provides results regarding the use of evidence in national appellate asylum procedures. It is established that evidentiary solutions do indeed have an impact on the asylum procedure and that the results of the procedure depend on the evidentiary solutions made in it. Variations in, among other things, the interpretation of the burden of proof, the applied standard of proof, and the method for determining evidentiary value are analysed. It is established that national factors play an important role in the adaptation of national appellate procedures to external requirements. Further, it is established that the impact of national procedures both on the international framework and on EU law varies between the studied countries, partly depending on the position of the Member State in legislative advances at the EU level. In this comparative study it is further established that the impact of EU requirements concerning evidentiary issues may have positive as well as negative effects with regard to the desired harmonization. It is also concluded that harmonization using means of convergence that primarily target legal frameworks may not in all instances be optimal in relation to evidentiary standards, and that more varied and pragmatic means of convergence must be introduced in order to secure harmonization also in terms of evidence. To date, legal culture and traditions seem to prevail over direct efforts at procedural harmonization.
Abstract:
Foreign compounds, such as drugs, are metabolised in the body in numerous reactions. Metabolic reactions are divided into phase I (functionalisation) and phase II (conjugation) reactions. Uridine diphosphoglucuronosyltransferase enzymes (UGTs) are important catalysts of the phase II metabolic system. They catalyse the transfer of glucuronic acid to small lipophilic molecules, converting them to hydrophilic and polar glucuronides that are readily excreted from the body. The liver is the main site of drug metabolism. Many drugs are racemic mixtures of two enantiomers. Glucuronidation of a racemic compound yields a pair of diastereomeric glucuronides. Stereoisomers are interesting substrates in glucuronidation studies, since some UGTs display stereoselectivity. Diastereomeric glucuronides of O-desmethyltramadol (M1) and entacapone were selected as model compounds in this work. The investigations of the thesis deal with enzymatic glucuronidation and the development of analytical methods for drug metabolites, particularly diastereomeric glucuronides. The glucuronides were analysed from complex biological matrices, such as urine, or from in vitro incubation matrices. Various pretreatment techniques were needed to purify, concentrate and isolate the analytes of interest. Analyses were carried out by liquid chromatography (LC) with ultraviolet (UV) or mass spectrometric (MS) detection, or with capillary electromigration techniques. Commercial glucuronide standards were not available for the studies. Enzyme-assisted synthesis with rat liver microsomes was therefore used to produce M1 glucuronides as reference compounds. The glucuronides were isolated by LC/UV and ultra-performance liquid chromatography (UPLC)/MS, while tandem mass spectrometry (MS/MS) and nuclear magnetic resonance (NMR) spectroscopy were employed in structural characterisation. The glucuronides were identified as phenolic O-glucuronides of M1.
To identify the active UGT enzymes in (±)-M1 glucuronidation, recombinant human UGTs and human tissue microsomes were incubated with (±)-M1. The study revealed that several UGTs can catalyse (±)-M1 glucuronidation. Glucuronidation in human liver microsomes, as in rat liver microsomes, is stereoselective. The results of the studies showed that UGT2B7 is most probably the main UGT responsible for (±)-M1 glucuronidation in human liver. Large variation in the stereoselectivity of UGTs toward the (±)-M1 enantiomers was observed. Formation of M1 glucuronides was monitored with a fast and selective UPLC/MS method. Capillary electromigration techniques are known for their high resolution power. A method relying on capillary electrophoresis (CE) with UV detection was developed for the separation of tramadol and its free and glucuronidated metabolites. The suitability of the method for identifying tramadol metabolites in authentic urine samples was tested. Unaltered tramadol and four of its main metabolites were detected in the electropherogram. A micellar electrokinetic chromatography (MEKC)/UV method was developed for the separation of the glucuronides of entacapone in human urine. The validated method was tested in the analysis of urine samples from patients. The glucuronides of entacapone could be quantified after oral entacapone dosing.
Abstract:
This study explores labour relations between domestic workers and employers in India. It is based on interviews with both employers and workers, and on ethnographically oriented fieldwork in Jaipur carried out in 2004-2007. Combining development studies with gender studies, labour studies, and childhood studies, it asks how labour relations between domestic workers and employers are formed in Jaipur, and how female domestic workers' trajectories are created. Focusing on female part-time maids and live-in work arrangements, the study analyses children's work in the context of the overall workforce, not in isolation from it. Drawing on feminist Marxism, domestic labour relations are seen as an arena of struggle. The study takes an empirical approach, showing class through empirical material, and shows how paid domestic work is structured and stratified through intersecting hierarchies of class, caste, gender, age, ethnicity and religion. The importance of class in domestic labour relations is reiterated, but that of caste, so often downplayed by employers, is also emphasized. Domestic workers are crucial to the functioning of middle- and upper-middle-class households, but their function is not just utilitarian. Through them, working women and housewives are able to maintain purity and reproduce class distinctions, both between the poor and the middle classes and between the lower and upper middle classes. Despite the commodification of work relations, traditional elements of service relationships have been retained, particularly through maternalist practices such as gift giving, creating a peculiar blend of traditional and market practices. Whilst employers of part-time workers purchase services in a segmented market from a range of workers for specific tasks, traditional live-in workers are also hired to serve employers round the clock.
Employers and workers grudgingly acknowledged their dependence on one another, with employers adopting various strategies to manage their fear of servant crime, such as hiring children or not employing live-in workers in dual-earning households. Paid domestic work carries a heavy stigma and provides no entry to other jobs. It is transmitted from mothers to daughters, and working girls were often the main income providers in their families. The diversity of working conditions is analysed through a continuum of vulnerability: generic live-in workers, particularly children and unmarried young women with no close family in Jaipur, are the most vulnerable, and experienced part-time workers the least vulnerable. Whilst terms of employment are negotiated informally and individually, some informal standards regarding salary and days off existed for maids. However, employers maintain that working conditions are a matter of individual moral choice. Their reluctance to view their role as that of employers, and the workers as their employees, is one of the main stumbling blocks in the way of improved working conditions. Key words: paid domestic work, India, children's work, class, caste, gender, life course
Abstract:
With the emergence of service-oriented computing technology, companies are embracing new ways of carrying out business transactions electronically. Since the parties involved in an electronic business transaction (eBT) each manage a heterogeneous information-systems infrastructure within their own organizational domain, the collaboration complexity is considerable, and safeguarding an interorganizational collaboration with an eBT is difficult but highly significant. This paper describes a conceptual framework that addresses the complexities of an eBT and the differentiating characteristics that take it beyond traditional database transactions. Since the eBT is a framework comprising separate levels, pre-existing transaction concepts are explored for populating the respective levels. To show the feasibility of the described eBT framework, industry initiatives that aspire to become business-transaction standards are checked for eBT-compatible characteristics. Since realizing an eBT framework raises many difficult issues, the paper maps out important research areas that require scientific attention. Essentially, it is necessary to investigate how business semantics influence the nature of an eBT throughout its lifecycle.
Abstract:
People with coeliac disease have to maintain a gluten-free diet, which means excluding wheat, barley and rye prolamin proteins from their diet. Immunochemical methods are used to analyse the harmful proteins and to control the purity of gluten-free foods. In this thesis, the behaviour of prolamins in immunological gluten assays and with different prolamin-specific antibodies was examined. The immunoassays were also used to detect residual rye prolamins in sourdough systems after enzymatic hydrolysis, and wheat prolamins after deamidation. The aim was to characterize the ability of the gluten analysis assays to quantify different prolamins in varying matrices in order to improve the accuracy of the assays. The prolamin groups of cereals consist of a complex mixture of proteins that vary in size and amino acid sequence. Two common characteristics distinguish prolamins from other cereal proteins: firstly, they are soluble in aqueous alcohols, and secondly, most prolamins are largely formed from repetitive amino acid sequences containing high amounts of proline and glutamine. The diversity among prolamin proteins sets high requirements for their quantification. In the present study, prolamin contents were evaluated using enzyme-linked immunosorbent assays based on the ω- and R5 antibodies. In addition, assays based on the A1 and G12 antibodies were used to examine the effect of deamidation on prolamin proteins. The prolamin compositions and the cross-reactivity of the antibodies with different prolamin groups were evaluated with electrophoretic separation and Western blotting. The results of this thesis research demonstrate that the currently used gluten analysis methods are not able to accurately quantify barley prolamins, especially when hydrolysed or mixed with oats. However, more precise results can be obtained when the standard more closely matches the sample proteins, as demonstrated with barley prolamin standards. The study also revealed that all of the harmful prolamins, i.e. wheat, barley and rye prolamins, are most efficiently extracted with 40% 1-propanol containing 1% dithiothreitol at 50 °C. The extractability of barley and rye prolamins was considerably higher with 40% 1-propanol than with 60% ethanol, which is typically used for prolamin extraction. The prolamin levels of rye were lowered by 99.5% from the original levels when an enzyme-active rye-malt sourdough system was used for prolamin degradation. Such extensive degradation of rye prolamins supports the use of sourdough as a part of gluten-free baking. Deamidation increases the diversity of prolamins and improves their solubility and their ability to form structures such as emulsions and foams. Deamidation changes the protein structure, which has consequences for antibody recognition in gluten analysis. According to the results of the present work, the analysis methods were not able to quantify wheat gluten after deamidation except at very high concentrations. Consequently, deamidated gluten peptides can exist in food products and remain undetected, posing a risk for people with gluten intolerance. The results of this thesis demonstrate that current gluten analysis methods cannot accurately quantify prolamins in all food matrices. This thesis also provides new information on the prolamins of rye and barley, in addition to wheat prolamins, which is essential for improving gluten analysis methods so that they can more accurately quantify prolamins from harmful cereals.