938 results for "Literary representation of burned books"


Relevance: 100.00%

Abstract:

Local features are used in many computer vision tasks, including visual object categorization, content-based image retrieval and object recognition, to mention a few. Local features are points, blobs or regions in images that are extracted using a local feature detector. To make use of the extracted local features, the localized interest points are described using a local feature descriptor. A descriptor histogram vector is a compact representation of an image and can be used for searching and matching images in databases. In this thesis the performance of local feature detectors and descriptors is evaluated for an object-class detection task. Features are extracted from image samples belonging to several object classes, and matching features are then searched for in random image pairs of the same class. The goal of this thesis is to determine which detector and descriptor methods are best for this task in terms of detector repeatability and descriptor matching rate.
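The matching-rate measure described above can be sketched as follows. This is an illustrative example, not code from the thesis: the descriptors are random stand-ins for real detector output (the dimensionality and ratio-test threshold are assumptions), matched by nearest neighbour with Lowe's ratio test.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 128-D descriptors (SIFT-like) from two images of the
# same object class. Here we simulate 40 shared features plus clutter.
shared = rng.normal(size=(40, 128))
desc_a = np.vstack([shared + 0.05 * rng.normal(size=shared.shape),
                    rng.normal(size=(20, 128))])
desc_b = np.vstack([shared + 0.05 * rng.normal(size=shared.shape),
                    rng.normal(size=(30, 128))])

def matching_rate(da, db, ratio=0.8):
    """Nearest-neighbour matching with Lowe's ratio test; returns the
    fraction of descriptors in `da` with an accepted match in `db`."""
    d2 = ((da[:, None, :] - db[None, :, :]) ** 2).sum(-1)  # squared distances
    order = np.argsort(d2, axis=1)
    idx = np.arange(len(da))
    nearest, second = d2[idx, order[:, 0]], d2[idx, order[:, 1]]
    accepted = np.sqrt(nearest) < ratio * np.sqrt(second)
    return accepted.mean()

rate = matching_rate(desc_a, desc_b)
```

The shared features pass the ratio test easily (their true match is far closer than any clutter descriptor), while for clutter the two nearest random distances are similar, so the test rejects them.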

Relevance: 100.00%

Abstract:

The focus of this study is to examine police and immigrants’ relations, as little is known about this process in the country. The studies were approached in two different ways. Firstly, an attempt was made to examine how immigrants view their encounters with the police. Secondly, the studies explored how aware the police are of immigrants’ experiences in their various encounters and interactions at the street level. An ancillary aim of the studies is to clarify, analyse and discuss how prejudice and stereotypes can be tackled, thereby contributing to the general debate about racism and discrimination for better ethnic relations in the country. The analysis is based on data from a group of adults (n=88) out of the total of 120 Africans questioned for the entire study, together with police cadets (n=45) and serving police officers (n=6) from Turku. The present thesis is a compilation of five articles; a summary of each article’s findings follows, as the same data were used in all five studies. In the first study, a theoretical model was developed to examine the perceived knowledge of bias by immigrants resulting from race, culture and belief. This was also an attempt to explore whether this knowledge was predetermined, in classifying, discussing and analysing the factors that may influence immigrants’ allegations of unfair treatment by the police in Turku. The main finding of the first paper is that there was ignorance and naivety on the part of the police in their attitudes towards African immigrants’ prior experiences with the police, which probably resulted from stereotypes, or from a lack of experience with, and prior training about, immigrants among whom such experiences are widespread (Egharevba, 2003 and 2004a).
In exploring what leads to stereotypes, a working definition is the assumption, prevalent among some segments of the population including the police, that Finland is a homogeneous country, an assumption enacted through certain conduct and behaviour towards ethnic and immigrant groups. This, to my understanding, is stereotype. Historically this was true, but today the social topography of the country is changing and becoming ever more complex. It is true that, on linguistic grounds, the country is multilingual, as there are a few recognised national minority languages (Swedish, Sami and Russian) as well as a number of immigrant languages, including English. Clearly it is vital for the police to keep a line of communication open when addressing problems associated with immigrants in the country. The second paper moved a step further by examining African immigrants’ understanding of human rights, as well as what human rights violations mean or entail in their view as a result of their experiences with the police, both in Finland and in their countries of origin. This approach became essential during the course of the study, especially when the participants were completing the questionnaire (n=88), from whom volunteers were solicited for later in-depth interviews with the author. Many of the respondents came from countries where human rights are not well protected and seldom discussed publicly; understanding their views on the subject can therefore help to explain why some of the immigrants are sceptical about coming forward to report cases of battery and assault to the police, or even their experiences of being monitored in shopping malls in their new home, and the reason behind their low level of trust in public authorities in Finland. The study showed that knowledge of human rights is notably low among some of the participants. The study also found that female respondents were less aware of human rights than their male counterparts.
This has resulted in some of the male participants holding on to their traditional ways of thinking, not realising that they are in a new country where the sexes are equal and lack of respect on gender terms is not condoned. The third paper focussed on the respondents’ experiences with the police in Turku and tried to explore police attitudes towards African immigrant clients, in addition to the role stereotype plays in police views of different cultures and how these views have affected immigrants’ perceptions of discriminatory policing in Turku. The data are the same throughout the studies (n=88), except that a subset of thirty-five participants were interviewed for the third paper. The results showed that there is some bias in mass-media reports on immigrants’ issues, due to the selective portrayal of biases without much investigation being carried out before jumping to conclusions, especially when the issues at stake involve an immigrant (Egharevba, 2005a; Egharevba, 2004a and 2004b). In this vein, there was an allegation that the police are even biased when investigating cases of theft, especially if the stolen property is owned by an immigrant (Egharevba, 2006a; Egharevba, 2006b). One vital observation from the respondents’ various comments was that race has meaning in their encounters and interactions with the police in the country. This result led the author to conclude that the relation between the police and immigrants is still a challenge, as there is widespread fear and distrust towards the police among some segments of the participating respondents in the study. In the fourth paper the focus was on examining the respondents’ view of the police, with special emphasis on race and culture, as well as the respondents’ perspective on police behaviour in Turku. This is because race, as it was relayed to me in the study, is a significant predictor of police perception (Egharevba, 2005a; Egharevba and Hannikainen, 2005).
It is a known scientific finding that inter-group racial attitudes reflect group competition and perceived threats to power and status (group-position theory). According to Blumer (1958), a sense of group threat is an essential element for the emergence of racial prejudice. Consequently, it was essential that we explored the existing relationship between the respondents and the police in order to understand this concept. The result indicates some local and international contextual issues and assumptions that are important in tackling prejudice and discrimination as they exist within the police in the country. Moreover, we have to remember that, for years, many of these African immigrants were on the receiving end of unjust law enforcement in their various countries of origin, which has left many of them feeling inferior and distrustful of the police. While discussing the issue of cultural difference and how it affects policing, we must also keep in mind the socio-cultural background of the participants, their level of language proficiency and their educational background. The research data analysed in this study also confirmed the difficulties associated with cultural misunderstandings in interpreting issues, and how these misunderstandings have affected police and immigrant relations in Finland. Finally, the fifth paper focussed on cadets’ attitudes towards African immigrants as well as serving police officers’ interactions with African clients. Secondly, the police level of awareness of African immigrants’ distrust of their profession was unclear. For this reason, my questions in this fifth study examined the experiences and attitudes of police cadets and serving police officers, as well as those of African immigrants, in order to understand how to improve this relationship in the country.
The data were based on immigrant participants (n=88), police cadets (n=45) and six serving police officers from the Turku police department. The results suggest that there is distrust of the police in the respondents’ interactions; this tends to have heightened tensions arising from the lack of language proficiency (Egharevba and White, 2007; Egharevba and Hannikainen, 2005; Egharevba, 2006b). The results also show that the allegation that immigrants are belittled by the police stems from the misconceptions of both parties, as well as from the practice of stop and search by the police in Turku. All these factors were observed to have contributed to the alleged police evasiveness and the lack of regular contact between the respondents and the police in their dealings. In other words, the police have only had job-related contact with many of the participants in the present study. The results also demonstrated the complexities caused by the low level of education among some of the African immigrants in their understanding of Finnish culture, norms and values. Thus, the framework constructed in these studies embodies diversity in national culture, as well as the need for further research with a greater number of respondents (both from the police and from immigrant/majority groups), in order to explore the different roles cultures play in immigrant and majority citizens’ understanding of police work.

Relevance: 100.00%

Abstract:

Software systems are expanding and becoming increasingly present in everyday activities. A constantly evolving society demands that they deliver more functionality, be easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. Here the quality of a system is an essential issue, since any deficiency may lead to considerable financial loss or the endangerment of life. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied from the early design stages of system development, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional development processes. Therefore, it is important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that it is necessary to establish techniques for the evaluation of rigorous developments. Since we study various development settings and methods, a specific measurement plan and set of metrics need to be created for each setting.
Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about integrating formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as the related artefacts, e.g. models, since these have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may affect its maintainability, and thus its quality. Our goal is to improve the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that they can facilitate the process of creating software systems by, for example, controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, which are based on the structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perceptions of domain experts. It is our aspiration to promote measurements as an indispensable part of the quality control process and a strategy for quality improvement.
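As an illustration of the kind of structural metric argued for above, the sketch below computes simple counts over a toy Event-B-like specification. The model, its field names and the chosen metrics are invented for the example; they are not the thesis's actual measurement plan.

```python
# Illustrative only: a toy formal model held as plain data, with the kinds
# of structural metrics (sizes, counts, ratios) that can be collected from
# early-stage artefacts. All names here are invented for the sketch.
model = {
    "variables": ["buffer", "count", "limit"],
    "invariants": ["count >= 0", "count <= limit"],
    "events": {
        "put":  {"guards": ["count < limit"], "actions": ["count := count + 1"]},
        "get":  {"guards": ["count > 0"],     "actions": ["count := count - 1"]},
        "init": {"guards": [],                "actions": ["count := 0"]},
    },
}

def structural_metrics(m):
    """Simple size/complexity counts over the specification structure."""
    events = m["events"]
    n_guards = sum(len(e["guards"]) for e in events.values())
    return {
        "variables": len(m["variables"]),
        "invariants": len(m["invariants"]),
        "events": len(events),
        "guards_per_event": n_guards / len(events),
    }

metrics = structural_metrics(model)
```

Tracking such counts across refinement steps is one cheap way to record how complexity evolves during a rigorous development.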

Relevance: 100.00%

Abstract:

The climate variability between the growth and harvest of sugar cane is very important because it directly affects yield. The MODIS sensor has spatial and temporal resolutions that can be applied to monitoring the variability of vegetative vigour on the land surface, and hence to generating temporal profiles. Agrometeorological data from the ECMWF model are free, easy to access and represent reality well. In this study, we determined the period between sugar-cane growth and harvest in the state of São Paulo, Brazil, by selecting temporal profiles of NDVI behaviour. For each period the precipitation, evapotranspiration, global radiation, length (days) and degree-days were accumulated. The periods were presented in map format at the MODIS spatial resolution of 250 m. The results showed the spatial variability of the climate variables and their relationship to the reality presented by official data.
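The accumulation step can be sketched as follows. The daily values, the base temperature of 10 °C and the variable layout are assumptions for the example, not the paper's data (which would come from the ECMWF model for each 250 m pixel).

```python
# Minimal sketch: accumulate agrometeorological variables over the
# growth-to-harvest period identified from an NDVI temporal profile.
# Degree-days use a base temperature, assumed 10 C here for sugar cane.
daily = [  # (mean_temp_C, precip_mm, evapot_mm, global_rad_MJ_m2)
    (24.0,  0.0, 4.1, 18.2),
    (26.5, 12.3, 3.8, 16.0),
    (22.1,  4.0, 4.5, 19.5),
]
T_BASE = 10.0

length_days = len(daily)
degree_days = sum(max(t - T_BASE, 0.0) for t, _, _, _ in daily)
precip      = sum(p for _, p, _, _ in daily)
evapot      = sum(e for _, _, e, _ in daily)
radiation   = sum(r for _, _, _, r in daily)
```

In the paper these five accumulated values are computed per pixel and rendered as maps.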

Relevance: 100.00%

Abstract:

The publication of Law 10,267 of 08/28/2001 changed the paradigm of rural registration in Brazil, because this law, known as the "Law of Georeferencing", created the National Registration of Rural Property, which unifies on a common basis the different registrations held by several government agencies, such as the National Institute for Colonization and Agrarian Reform (INCRA), the Secretariat of Federal Revenue, the Brazilian Institute of Environment and Natural Resources, and the National Indian Foundation. This new registration system also has a graphical component, which had not existed until that date, in which the boundaries of rural properties are georeferenced to the Brazilian Geodetic System. The new paradigm has resulted in a standardization of the survey and representation of rural properties according to the Technical Standard for Georeferencing of Rural Properties, published by INCRA in compliance with the new legislation. Thanks to georeferencing, the creation of a public GIS with free access on the Internet became possible. Among the difficulties encountered are the vast Brazilian territory, the need for specialized professionals, and especially the certification process that INCRA must perform for each georeferenced property. It is hoped that this last difficulty will be solved by the implementation of the Land Management System, which will allow automated and online certification, making the process more transparent, agile and fast.

Relevance: 100.00%

Abstract:

Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features that are present in a complete solid/liquid separation process. Furthermore, most filtration theories require experimental work to be performed in order to obtain the critical parameters required by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is practically impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables. It also helps to perform experimental work more economically. Design of experiments is a prerequisite for creating empirical models that can describe how the measured response is related to changes in the values of the variables. A software package was developed that provides a filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with a graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are the automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter and ceramic capillary action disc filter.
It is also possible to create experimental designs for cases where the variables are totally user-defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data and to create models for each of the measured responses. Introducing the structure of the software in more detail and showing some of its practical applications forms the main part of this thesis. This approach to the study of cake filtration processes, as presented in this thesis, has been shown to have good practical value when making filtration tests.
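A minimal sketch of what such a DoE module might produce: a two-level full factorial design in coded units, and the least-squares fit of a first-order model of the kind LTRead computes from measured responses. The factor names, response and coefficients are invented; this is not the LTDoE/LTRead code itself.

```python
import itertools
import numpy as np

# Two-level full factorial design (coded -1/+1) for three hypothetical
# filtration variables; 2^3 = 8 runs.
factors = ["pressure", "slurry_conc", "cake_thickness"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Invented response generated from known coefficients with no noise, so
# the regression should recover them exactly.
true_coeffs = np.array([50.0, 4.0, -2.5, 1.0])   # intercept + 3 main effects
X = np.column_stack([np.ones(len(design)), design])
y = X @ true_coeffs

# First-order linear regression model fitted by least squares.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

A full factorial at two levels is orthogonal, so the main effects are estimated independently of one another.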


Relevance: 100.00%

Abstract:

One of the main complexities in the simulation of the nonlinear dynamics of rigid bodies consists in properly describing the finite rotations that they may undergo. It is well known that, to avoid singularities in the representation of the SO(3) rotation group, at least four parameters must be used. However, a four-parameter representation is computationally expensive since, as only three of the parameters are independent, one needs to introduce constraint equations into the model, leading to differential-algebraic equations instead of ordinary differential ones. Three-parameter representations are numerically more efficient. Therefore, the objective of this paper is to evaluate numerically the influence of the parametrization and its singularities on the simulation of the dynamics of a rigid body. This is done through the analysis of a heavy top with a fixed point, using two three-parameter systems: Euler angles and the rotation vector. Theoretical results were used to guide the numerical simulation and to ensure that all possible cases were analyzed. The two parametrizations were compared using several integrators. The results show that the Euler angles lead to faster integration than the rotation vector. A singular case of the Euler angles, in which the representation approaches a theoretical singular point, was analyzed in detail. It is shown that, contrary to what might be expected, 1) the numerical integration is very efficient, even more so than in the other cases, and 2) in spite of the uncertainty in the Euler angles themselves, the body motion is well represented.
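The rotation-vector parametrization discussed above can be made concrete with Rodrigues' formula, which maps a rotation vector (axis direction times rotation angle) to a rotation matrix. The sketch below is illustrative, not the paper's code; note the formula itself degenerates at the identity, one of the representation issues the paper studies.

```python
import numpy as np

def rotvec_to_matrix(v):
    """Rodrigues' formula: R = I + sin(t) K + (1 - cos(t)) K^2, where t is
    the norm of the rotation vector v and K is the skew matrix of v/t."""
    theta = np.linalg.norm(v)
    if theta < 1e-12:            # near the identity the axis is undefined
        return np.eye(3)
    k = v / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

# A rotation of pi/2 about the z axis should map the x axis onto the y axis.
R = rotvec_to_matrix(np.array([0.0, 0.0, np.pi / 2]))
```

The result is always a proper orthogonal matrix, i.e. an element of SO(3), regardless of the three parameters chosen.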

Relevance: 100.00%

Abstract:

The main objective of this work is to analyze the importance of the gas-solid interface transfer of the kinetic energy of turbulent motion for the accuracy of predictions of the fluid dynamics of Circulating Fluidized Bed (CFB) reactors. CFB reactors are used in a variety of industrial applications related to combustion, incineration and catalytic cracking. In this work a two-dimensional fluid-dynamic model for gas-particle flow has been used to compute the porosity, pressure and velocity fields of both phases in 2-D axisymmetric cylindrical co-ordinates. The fluid-dynamic model is based on the two-fluid model approach, in which both phases are considered to be continuous and fully interpenetrating. CFB processes are essentially turbulent. The effective stress on each phase is modelled as that of a Newtonian fluid, where the effective gas viscosity was calculated from the standard k-epsilon turbulence model and the transport coefficients of the particulate phase were calculated from the kinetic theory of granular flow (KTGF). This work shows that turbulence transfer between the phases is very important for a better representation of the fluid dynamics of CFB reactors, especially for systems with internal recirculation and high gradients of particle concentration. Two systems with different characteristics were analyzed, and the results were compared with experimental data available in the literature. The results were obtained using a computer code developed by the authors. The finite volume method with a collocated grid, the hybrid interpolation scheme, the false time-step strategy and the SIMPLEC (Semi-Implicit Method for Pressure-Linked Equations - Consistent) algorithm were used to obtain the numerical solution.
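One small, standard piece of the closure described above can be shown concretely: the k-epsilon model computes an effective (turbulent) gas viscosity from the turbulent kinetic energy k and its dissipation rate epsilon. C_mu = 0.09 is the standard model constant; the sample values of density, k and epsilon below are invented for the sketch.

```python
# Standard k-epsilon closure for the effective gas viscosity:
#   mu_t = rho * C_mu * k^2 / eps
C_MU = 0.09  # standard k-epsilon model constant

def turbulent_viscosity(rho, k, eps):
    """Turbulent (eddy) viscosity of the gas phase [Pa.s] from density
    [kg/m^3], turbulent kinetic energy k [m^2/s^2] and its dissipation
    rate eps [m^2/s^3]."""
    return rho * C_MU * k ** 2 / eps

mu_t = turbulent_viscosity(rho=1.2, k=0.5, eps=2.0)  # air-like gas phase
```

In the two-fluid model this mu_t enters the Newtonian effective-stress term of the gas phase, while the particulate-phase coefficients come from the KTGF.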

Relevance: 100.00%

Abstract:

Sustainability, in its modern meaning, has been discussed for more than forty years. However, many experts believe that humanity is still far from being sustainable. Since the 1990s, some experts have even argued that humanity should seek survivable development because it is too late for sustainable development. Evidently, some problems have prevented humanity from becoming sustainable. This thesis focuses on the agenda of sustainability discussions and seeks the essential topics missing from it. For this purpose, the research is conducted on 21 of the 33 books endorsed by the Club of Rome, all of which are titled ‘a report to the Club of Rome’. The Club of Rome is an organization that has worked constantly on the problems of humankind for the past 40 years. This thesis has three main components: first, the messages of the reports to the Club of Rome; second, academics’ perceptions of the Club; and third, Club members’ perceptions of its evolution, messages and missing topics. This thesis investigates the evolution of four aspects of the reports. The first is the agenda of the reports. The second is the basic approaches of the Club (i.e., global, long-term and holistic). The third is the way the reports treat free-market and growth ideology. The fourth is the approach of the reports to the components of the global complex system (i.e., society, economy and politics). The outline of the thesis is as follows. First, the original reports are briefly summarized. After this, the academic perceptions are discussed and structured around three concepts (i.e., futures studies, sustainability and degrowth). In the final step, the perceptions of the experts are collected and analysed using a variation of the Delphi method called ‘in-depth interviews’ and the ‘qualitative content analysis’ method. This thesis is useful for those interested in sustainability, global problems, and the Club of Rome.
This thesis concludes that the reports from 1972 up to 1980 were cohesive in discussing topics related to the problems of humankind; after this period the topics of the reports become fragmented. The basic approaches of the Club are visible in all the reports. However, after 1980, those approaches, and especially holistic thinking, are only visible in the background. Regarding free-market and growth ideology, although all the reports are against them, the early reports expressed their disagreement more explicitly. A milestone is noticeable around 1980, when such objections receded completely into the background. However, recent reports are more similar to those of the 1970s, both in adopting a holistic approach and in explicitly criticizing free-market and growth ideology. Finally, concerning the components of the global complex system, society is excluded and the focus of the reports is on politics, the economy and their relation. Concerning the topics missing from the debate, this thesis concludes that no major research has been conducted on the fundamental and underlying reasons for the problems (e.g. beliefs, values and culture). Studying the problems without considering their underlying reasons obviously leads to superficial and ineffective solutions. This might be one of the reasons why sustainability discussions have as yet led to no concrete result.

Relevance: 100.00%

Abstract:

Analysis of regional corpus callosum fiber composition reveals that callosal regions connecting primary and secondary sensory areas tend to have higher proportions of coarse-diameter, highly myelinated fibers than callosal regions connecting so-called higher-order areas. This suggests that in primary/secondary sensory areas there are strong timing constraints for interhemispheric communication, which may be related to the process of midline fusion of the two sensory hemifields across the hemispheres. We postulate that the evolutionary origin of the corpus callosum in placental mammals is related to the mechanism of midline fusion in the sensory cortices, which only in mammals receive a topographically organized representation of the sensory surfaces. The early corpus callosum may have also served as a substrate for growth of fibers connecting higher-order areas, which possibly participated in the propagation of neuronal ensembles of synchronized activity between the hemispheres. However, as brains became much larger, the increasingly longer interhemispheric distance may have worked as a constraint for efficient callosal transmission. Callosal fiber composition tends to be quite uniform across species with different brain sizes, suggesting that the delay in callosal transmission is longer in bigger brains. There is only a small subset of large-diameter callosal fibers whose size increases with increasing interhemispheric distance. These limitations in interhemispheric connectivity may have favored the development of brain lateralization in some species like humans. "...if the currently received statements are correct, the appearance of the corpus callosum in the placental mammals is the greatest and most sudden modification exhibited by the brain in the whole series of vertebrated animals..." T.H. Huxley (1).
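The scaling argument above (fixed fiber composition, longer interhemispheric distance, hence longer delay) can be made concrete with a back-of-the-envelope sketch. It assumes the classic rule of thumb that myelinated-fiber conduction velocity is roughly 6 (m/s) per micrometre of axon diameter; the distances and diameter are invented for illustration.

```python
# Back-of-the-envelope: callosal conduction delay = distance / velocity,
# with velocity proportional to fiber diameter (assumed factor ~6 (m/s)/um
# for myelinated fibers). With composition held fixed, a larger brain's
# longer interhemispheric distance directly lengthens the delay.
VELOCITY_PER_UM = 6.0  # m/s per micrometre of fiber diameter (approximate)

def callosal_delay_ms(distance_mm, diameter_um):
    velocity = VELOCITY_PER_UM * diameter_um            # m/s
    return (distance_mm / 1000.0) / velocity * 1000.0   # milliseconds

small = callosal_delay_ms(distance_mm=30, diameter_um=1.0)   # small brain
large = callosal_delay_ms(distance_mm=120, diameter_um=1.0)  # larger brain
```

With the same fiber diameter, quadrupling the distance quadruples the delay, which is the constraint proposed to favor lateralization in large brains.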

Relevance: 100.00%

Abstract:

The interpretation of oligonucleotide array experiments depends on the quality of the target cRNA used. cRNA target quality is assessed by quantitative analysis of the representation of 5' and 3' sequences of control genes using commercially available Test arrays. The Test array provides an economically priced means of determining the quality of labeled target prior to analysis on whole-genome expression arrays. This manuscript validates the use of a duplex RT-PCR assay as a faster (6 h) and less expensive alternative. Samples with a β-actin 3'/5' ratio >6 were chosen and classified as degraded cRNAs, and 31 samples with a β-actin 3'/5' ratio <6 were selected as good-quality cRNAs. Blinded samples were then used for the RT-PCR assay. After gel electrophoresis, the optical densities of the amplified 3' and 5' fragments of β-actin were measured and the 3'/5' ratio was calculated. There was a strong correlation (r² = 0.6802) between the array and RT-PCR β-actin 3'/5' ratios. Moreover, the RT-PCR 3'/5' ratio was significantly different (P < 0.0001) between undegraded (mean ± SD, 0.34 ± 0.09) and degraded (1.71 ± 0.83) samples. None of the other parameters analyzed, such as i) the starting amount of RNA, ii) RNA quality assessed using the Bioanalyzer chip technology, or iii) the concentration and OD260/OD280 ratio of the purified biotinylated cRNA, correlated with cRNA quality.
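The ratio and correlation computations described above can be sketched as follows. All optical densities and array ratios below are invented, not the paper's data; the point is only the mechanics of forming a 3'/5' ratio per sample and correlating the two assays.

```python
import numpy as np

# Invented optical densities of the amplified beta-actin fragments
# (one value per sample); degradation inflates the 3' signal.
od_3prime = np.array([0.30, 0.35, 1.90, 0.28, 2.40])
od_5prime = np.array([1.00, 0.95, 1.00, 0.90, 1.10])
rtpcr_ratio = od_3prime / od_5prime          # RT-PCR 3'/5' ratio per sample

# Invented array-derived 3'/5' ratios for the same samples (>6 = degraded).
array_ratio = np.array([2.0, 2.5, 9.0, 1.8, 11.0])

# Pearson correlation between the two quality measures, reported as r^2.
r = np.corrcoef(rtpcr_ratio, array_ratio)[0, 1]
r_squared = r ** 2
```

With real data, thresholds on each measure (array ratio >6, RT-PCR ratio near 1.71 vs 0.34) separate degraded from undegraded samples.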

Relevance: 100.00%

Abstract:

In the field of molecular biology, scientists for decades adopted a reductionist perspective in their inquiries, being predominantly concerned with the intricate mechanistic details of subcellular regulatory systems. However, integrative thinking had been applied at a smaller scale in molecular biology, to understand the underlying processes of cellular behaviour, for at least half a century. It was not until the genomic revolution at the end of the previous century that model building was required to account for the systemic properties of cellular activity. Our system-level understanding of cellular function is to this day hindered by drastic limitations in our capability to predict cellular behaviour from system dynamics and system structure. To this end, systems biology aims for a system-level understanding of functional intra- and inter-cellular activity. Modern biology brings about a high volume of data, whose comprehension we cannot even aim for in the absence of computational support. Computational modelling hence bridges modern biology and computer science, providing a number of assets that prove invaluable in the analysis of complex biological systems, such as a rigorous characterization of the system structure, simulation techniques and perturbation analysis. Computational biomodels have grown considerably in size in recent years, with major contributions made towards the simulation and analysis of large-scale models, starting with signalling pathways and culminating in whole-cell models, tissue-level models, organ models and full-scale patient models. The simulation and analysis of models of such complexity very often requires the integration of various sub-models, entwined at different levels of resolution, whose organization spans several levels of hierarchy. This thesis revolves around the concept of quantitative model refinement in relation to the process of model building in computational systems biology.
The thesis proposes a sound computational framework for the stepwise augmentation of a biomodel. One starts with an abstract, high-level representation of a biological phenomenon, which is materialised into an initial model validated against a set of existing data. The model is then refined to include more detail about its species and/or reactions. The framework is employed in the development of two models, one for the heat shock response in eukaryotes and one for the ErbB signalling pathway. The thesis spans several formalisms used in computational systems biology that are inherently quantitative: reaction-network models, rule-based models and Petri net models, as well as a recent, intrinsically qualitative formalism: reaction systems. The choice of modelling formalism is, however, determined by the nature of the question the modeller aims to answer. Quantitative model refinement turns out to be not only essential in the model development cycle, but also beneficial for the compilation of large-scale models, whose development requires the integration of several sub-models across various levels of resolution and underlying formal representations.
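As a minimal illustration of the refinement idea described above (a hypothetical toy model, not one of the thesis case studies), the sketch below refines a single decaying species into two subspecies that inherit the original rate constant; the refined model then reproduces the aggregate dynamics of the initial model, which is the fit-preservation property a quantitative refinement is expected to have:

```python
def simulate_decay(init, k, dt=0.01, steps=1000):
    """Forward-Euler integration of mass-action decay X -> (degraded),
    applied independently to every species in the model."""
    state = dict(init)
    for _ in range(steps):
        for s in state:
            state[s] -= k * state[s] * dt
    return state

# Initial, high-level model: one abstract species A.
basic = simulate_decay({"A": 1.0}, k=0.5)

# Refined model: A is split into two subspecies (e.g. a modified and an
# unmodified form), each inheriting the original rate constant.
refined = simulate_decay({"A_mod": 0.4, "A_unmod": 0.6}, k=0.5)

# The refined model reproduces the aggregate behaviour of the initial one.
total_refined = refined["A_mod"] + refined["A_unmod"]
```

Because the kinetics are linear, the summed trajectory of the subspecies matches the original species exactly; for nonlinear kinetics, preserving the fit constrains how the refined rate constants may be chosen.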

Relevância:

100.00%

Publicador:

Resumo:

The balance of T helper (Th) cell differentiation is a fundamental process that ensures the immune system functions correctly and effectively. Differentiation is a finely tuned event whose outcome is driven by activation of the T-cell in response to recognition of the specific antigen presented; co-stimulatory signals from the surrounding cytokine milieu help determine the outcome. Impaired differentiation may lead to an imbalance in immune responses and to immune-mediated pathologies: an over-representation of Th1-type cytokine-producing cells leads to tissue-specific inflammation and autoimmunity, while an excessive Th2 response is causative for atopy, asthma and allergy. The major factors in Th-cell differentiation and the related disease mechanisms have been studied extensively, but fine tuning of these processes by other factors cannot be discounted. In the work presented in this thesis, the association of the T-cell receptor co-stimulatory molecules CTLA4 and ICOS with autoimmune diabetes was studied. The underlying aim was to explore whether polymorphisms in these genes relate to the different disease rates observed in two geographically close populations. The main focus of this thesis was a GTPase of the immunity-associated protein (GIMAP) family of small GTPases. GIMAP genes and proteins are differentially regulated during human Th-cell differentiation and have been linked to immune-mediated disorders. GIMAP4 is believed to contribute to the immunological balance via its role in T-cell survival. To elucidate the function of GIMAP4 and GIMAP5 and their role in human immunity, a study combining genetic association analyses in different immunological diseases with complementary functional analyses was conducted. The study revealed interesting connections with high-susceptibility risk genes. In addition, the role of GIMAP4 during Th1-cell differentiation was investigated. 
A novel function of GIMAP4 in cytokine secretion was discovered. Further assessment of the effects of GIMAP4 and GIMAP5 on the transcriptomic profile of differentiating Th1-cells revealed new insights into their function.

Relevância:

100.00%

Publicador:

Resumo:

The cellular structure of healthy food products, with added dietary fiber and low in calories, is an important factor in the assessment of quality, and it can be quantified by image analysis of visual texture. This study compares image analysis techniques (binarization using Otsu's method and the default ImageJ algorithm, a variation of the iterative intermeans method) for quantifying differences in the crumb structure of breads made with different percentages of whole-wheat flour and fat replacer, and discusses the behavior of the parameters number of cells, mean cell area, cell density, and circularity using response surface methodology. Comparative analysis of the results obtained with the Otsu and default ImageJ algorithms showed a significant difference between the studied parameters. The Otsu method represented the crumb structure of the analyzed breads more reliably than the default ImageJ algorithm, and is thus the most suitable in terms of structural representation of the crumb texture.
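For readers unfamiliar with the binarization step compared above, the following is a minimal pure-Python sketch of Otsu's method, which picks the threshold that maximises the between-class variance of the intensity histogram. It is an illustrative implementation on a list of intensity values, not the ImageJ code used in the study:

```python
def otsu_threshold(values, bins=256):
    """Return the intensity threshold maximising between-class variance."""
    lo, hi = min(values), max(values)
    if lo == hi:
        return lo
    width = (hi - lo) / bins
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1

    total = len(values)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_bg, sum_bg = 0, 0.0          # background weight and intensity sum
    best_var, best_bin = -1.0, 0
    for t in range(bins):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_bin = var_between, t
    return lo + (best_bin + 0.5) * width

# A synthetic bimodal "image": dark crumb cells and bright crumb walls.
pixels = [10.0] * 50 + [200.0] * 50
th = otsu_threshold(pixels)
```

After thresholding, pixels above `th` form the binary mask from which parameters such as number of cells, mean cell area, and circularity are measured; the iterative-intermeans variant differs only in how the threshold is chosen, which is precisely why the two binarizations can yield different crumb statistics.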