897 results for Method of Theoretical Images
Abstract:
To achieve success in a constantly changing environment with ever-increasing competition, companies must develop their operations continuously. To do this, they must have a clear vision of what they want to be in the future. This vision can be attained through careful planning and strategising. One method of transforming a strategy and vision into an everyday tool used by employees is a balanced performance measurement system. The importance of performance measurement in the implementation of companies' visions and strategies has grown substantially in the last ten years. Measures are derived from the company's critical success factors and from many different perspectives, across three time dimensions: past, present and future. Many such performance measurement systems have been created since the 1990s. This is a case study whose main objective is to recommend how the case company could use performance measurement to support strategic management. To address this objective, the study combines literature-based research with empirical research at the case company's premises. The theoretical part of the study consists of two sections: an introduction to the Balanced Scorecard and a discussion of how it supports strategic management and change management. The empirical part determines the company's present performance measurement situation through interviews in the company. The study resulted in a recommendation that the company start developing a Balanced Scorecard system. By setting up this kind of process, the company would be able to shift its focus more towards the future, begin to implement a more process-based organisation, and get its employees to work together towards common goals.
Abstract:
This thesis is composed of three main parts. The first consists of a state of the art of the different notions that are significant for understanding the elements surrounding art authentication in general, and signatures in particular, and that the author deemed necessary to fully grasp the microcosm that makes up this particular market. Readers with a solid knowledge of the art and expertise area who are particularly interested in the present study are advised to advance directly to the fourth chapter. The expertise of the signature, its reliability, and the factors impacting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature and given in light of all of the exposed issues. These guidelines are specifically formulated for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination. The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and establishing their relevance in order to determine the separation capacity between groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and are used to attribute their authorship to given artists. An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints when the method is applied by signature and handwriting experts in forensic science. The outlines presented below summarize the aims and main themes addressed in each chapter, allowing a rapid overview of the study.
Part I - Theory. Chapter 1 sets out the legal aspects surrounding the authentication of works of art by art experts. The definition of what is legally authentic, the quality and types of experts that can express an opinion concerning the authorship of a specific painting, and standard deontological rules are addressed. The practices applied in Switzerland are specifically dealt with. Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories, so an understanding of their role in the art authentication process is vital. The added value that a signature expertise can have in comparison to other scientific techniques is also addressed. Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century on and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 are presented to show the reader the rich sources of information that can be used to describe a painting, and how the signature is one of these sources.
Chapter 4 focuses on the different hypotheses the FHE must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may have an influence on painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure for signatures in forensic science in general, and for painted signatures in particular, is set out. The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously. Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of the elements surrounding handwriting examinations, in order to better communicate results and conclusions to an audience, is also sought. Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state of the art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community, and discusses the increasing number of claims about the unscientific nature of signature authentication. The necessity of aiming for more scientific, comprehensive and transparent authentication methods is discussed. The theoretical part of this thesis is concluded by a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues it exposed, and can also be applied to the traditional examination of signatures (on paper).
Part II - Experimental part. Chapter 7 describes and defines the sampling, extraction and analysis phases of the research. The sampling of artists' signatures and their respective simulations is presented, followed by the steps that were undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study. Finally, the procedure for analysing these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented. Chapter 8 outlines the results concerning both the artist and simulation corpora after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood ratio evaluation are the main themes addressed. The discrimination power between the two corpora is illustrated through multivariate analysis.
Part III - Discussion. Chapter 9 discusses the materials, the methods, and the results obtained in the research. The opportunities, but also the constraints and limits, of the developed method are set out. Future work that can be carried out subsequent to the results of the study is also presented. Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the preceding chapters into traditional signature expertise procedures.
Thus, the strength of this expertise is discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners. In conclusion, the research highlights the interdisciplinary nature of the examination of signatures on paintings. The current state of knowledge concerning the legal qualifications of art experts, along with the scientific and historical analysis of paintings and signatures, is reviewed to give the reader a sense of the different factors that have an impact on this particular subject. The uneven acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of a better recognition of signature expertise by courts of law. This general acceptance, however, can only be achieved by producing high quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits that lie within this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as the second key contribution of this work, a procedure is proposed to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step towards providing the forensic, judicial and art communities with a solidly based reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).
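To make the likelihood-ratio evaluation mentioned above concrete, the following is a minimal sketch rather than the model developed in the thesis: it assumes, purely for illustration, that each signature is reduced to a small feature vector and that the authentic and simulated reference corpora are each modelled with a multivariate normal density.

```python
# Minimal sketch of a likelihood-ratio evaluation for painted-signature features.
# Hypothetical example only: the feature vectors, corpora and Gaussian densities
# are illustrative assumptions, not the model developed in the thesis.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Reference corpora of feature vectors (e.g. stroke length, slant, pressure proxies).
authentic = rng.normal(loc=[10.0, 0.5, 3.0], scale=[1.0, 0.1, 0.4], size=(50, 3))
simulated = rng.normal(loc=[11.5, 0.7, 2.4], scale=[1.5, 0.2, 0.6], size=(50, 3))

# Fit one density per hypothesis (H1: authentic, H2: simulated).
h1 = multivariate_normal(authentic.mean(axis=0), np.cov(authentic, rowvar=False))
h2 = multivariate_normal(simulated.mean(axis=0), np.cov(simulated, rowvar=False))

questioned = np.array([10.2, 0.55, 2.9])        # features of the questioned signature
lr = h1.pdf(questioned) / h2.pdf(questioned)    # likelihood ratio
print(f"LR = {lr:.2f}  (>1 supports H1, <1 supports H2)")
```

In a Bayesian framework the resulting likelihood ratio would be combined with prior odds by the court rather than by the examiner.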
Abstract:
Several possible methods of increasing the efficiency and power of hydro power plants by improving the flow passages are investigated in this study. The theoretical background of diffuser design and its application to the optimisation of hydraulic turbine draft tubes is presented in the first part of this study. Several draft tube modernisation projects that have been carried out recently are discussed. A method of increasing the efficiency of the draft tube by injecting a high velocity jet into the boundary layer is also presented. Methods of increasing the head of a hydro power plant by using an ejector or a jet pump are discussed in the second part of this work. The theoretical principles of various ejector and jet pump types are presented and four different methods of calculating them are examined in more detail. A self-made computer code is used to calculate the gain in head for two example power plants. Suitable ejector installations for the example plants are also discussed. The efficiency of the ejector power was found to be in the range of 6-15 % for conventional head increasers, and 30 % for the jet pump at its optimum operating point. In practice, it is impossible to install an optimised jet pump with a 30 % efficiency into the draft tube, as this would considerably reduce the efficiency of the draft tube under normal operating conditions. This demonstrates, however, the potential for improvement which lies in conventional head increaser technology. This study is based on previous publications and on published test results; no laboratory measurements were made for this study. Certain aspects of modelling the flow in the draft tube using computational fluid dynamics are discussed in the final part of this work. The draft tube inlet velocity field is a vital boundary condition for such a calculation. Several previously measured velocity fields that have successfully been utilised in such flow calculations are presented herein.
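For orientation, efficiency figures of this order of magnitude can be reproduced with a simple back-of-the-envelope estimate. The definition below (useful hydraulic power gained by the turbine flow divided by the kinetic power of the injected jet) and the numbers are illustrative assumptions; the thesis's own code and the four calculation methods it compares are not reproduced here.

```python
# Rough sketch of a head-increaser (ejector) efficiency estimate.
# The definition eta = rho*g*Q_turbine*dH_gain / (0.5*rho*Q_jet*v_jet**2)
# is an illustrative assumption, not the calculation method used in the thesis.
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def ejector_efficiency(q_turbine, dh_gain, q_jet, v_jet):
    """Useful power gained by the turbine flow / kinetic power supplied by the jet."""
    power_gained = RHO * G * q_turbine * dh_gain   # W
    jet_power = 0.5 * RHO * q_jet * v_jet**2       # W
    return power_gained / jet_power

# Hypothetical numbers: 100 m^3/s through the turbine, 0.3 m of extra head,
# 20 m^3/s injected at 15 m/s -> roughly 13 %, within the reported range.
print(f"eta = {ejector_efficiency(100.0, 0.3, 20.0, 15.0):.2%}")
```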
Abstract:
The interface of MgO/Ag(001) has been studied with density functional theory applied to slabs. We have found that regular MgO films show only a small adhesion to the silver substrate; the binding can be increased in off-stoichiometric regimes, either by the presence of O vacancies in the oxide film or by a small excess of O atoms at the interface between the ceramic and the metal. By means of theoretical methods, the scanning tunneling microscopy signatures of these films are also analyzed in some detail. For defect-free deposits containing 1 or 2 ML and at low voltages, tunnelling takes place from the Ag substrate, while at large positive voltages Mg atoms are imaged. If defects (oxygen vacancies) are present on the surface of the oxide, they introduce much easier channels for tunnelling, resulting in big protrusions and controlling the shape of the image. The extra O stored at the interface can also be detected for very thin films.
Abstract:
This paper discusses basic theoretical strategies used to deal with measurement uncertainties arising from different experimental situations. It indicates the most appropriate method for obtaining a reliable estimate of the quantity to be evaluated, depending on the characteristics of the data available. The theoretical strategies discussed are supported by experimental detail, and the conditions and results have been taken from examples in the field of radionuclide metrology. Special care regarding the correct treatment of covariances is emphasized because of the unreliability of the results obtained if these are neglected.
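As an illustration of why the treatment of covariances matters, the sketch below combines correlated measurements with the standard generalized least-squares weighted mean, x_bar = (1' V^-1 y) / (1' V^-1 1); the numerical values are hypothetical and the example is not taken from the paper.

```python
# Sketch: weighted mean of correlated measurements (generalized least squares).
# Hypothetical values; the covariance matrix V carries both the individual
# uncertainties (diagonal) and a shared systematic component (off-diagonal).
import numpy as np

y = np.array([105.2, 104.7, 105.9])      # measured values (e.g. activity in kBq)
V = np.array([[0.40, 0.25, 0.25],        # covariance matrix of the measurements
              [0.25, 0.45, 0.25],
              [0.25, 0.25, 0.50]])

ones = np.ones_like(y)
Vinv = np.linalg.inv(V)
mean = ones @ Vinv @ y / (ones @ Vinv @ ones)   # GLS weighted mean
var = 1.0 / (ones @ Vinv @ ones)                # its variance

# Ignoring the covariances (diagonal only) would understate the uncertainty:
Dinv = np.linalg.inv(np.diag(np.diag(V)))
var_naive = 1.0 / (ones @ Dinv @ ones)

print(f"estimate = {mean:.2f} +/- {np.sqrt(var):.2f}")
print(f"naive uncertainty (covariances neglected) = {np.sqrt(var_naive):.2f}")
```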
Redox dysregulation in schizophrenia: effect on myelination of cortical structures and connectivity
Abstract:
The present doctoral thesis is concerned with the role that a genetic risk factor for the development of schizophrenia, namely a deficit in glutathione synthesis, may play in the anomalies of brain connectivity found in patients. Most of the effort was devoted to a whole-brain assessment of white matter structure in the glutamate-cysteine ligase modulatory subunit knockout (Gclm KO) mouse model, using magnetic resonance imaging (MRI) techniques similar to those used in state-of-the-art clinical research. This reverse translational approach, taking brain imaging from the bedside to the bench, aimed to investigate the role that deficient redox defenses may play in the development of brain connections while excluding all influencing factors besides the genotype. After establishing the protocol, the influence of further environmental manipulations was also studied. Analysis of the MRI images acquired in vivo was one of the main challenges of the project. Our strategy consisted in creating an atlas of the mouse brain to use as a segmentation guide and then analysing the data from each region of interest separately. The quality of the method was assessed in a simulation experiment by calculating the statistical power achievable in each brain region at different sample sizes. This analysis tool enabled us to assess white matter integrity in the mouse brain along development in a longitudinal experiment using diffusion tensor imaging (DTI). We discovered anomalies in the diffusivity parameters derived from the tensor in the Anterior Commissure and Fimbria/Fornix of Gclm KO mice when compared to wild-type animals, which suggest that the structure of these tracts is compromised in the KO mice. In an electrophysiological experiment, Pascal Steullet provided evidence that these anomalies have functional consequences in the form of reduced conduction velocity in the concerned tracts, thus supporting the DTI findings. The mechanism by which redox dysregulation affects white matter structure remains unknown, as the immunohistochemical analysis of myelin constituent proteins in the concerned tracts produced inconclusive results. Our experiments also detected an enlargement of the lateral ventricles in young but not adult Gclm KO mice and confirmed neurochemical anomalies already known to affect these animals (Duarte et al. 2011), namely a reduction in Glutathione and an increase in the Glutamine/Glutamate ratio, N-acetylaspartate and Alanine.
Using the same methods, we tested the effect of an additional environmental stress on the observed phenotype: rearing in social isolation had no effect on white matter structure as assessed by DTI, but it reduced the concentration of myo-Inositol and increased the Glutamine/Glutamate ratio in the frontal cortex. We could also replicate, in this separate group of animals, the effects of genotype on the frontal neurochemical profile, ventricular size and diffusivity parameters in the Fimbria/Fornix, but not in the Anterior Commissure. Our data show that a redox dysregulation of genetic origin may disrupt white matter structure and function in specific tracts and cause ventricular enlargement, phenotypes that resemble some neuroanatomical features of schizophrenia. The responsible mechanism, however, remains unknown. We have also demonstrated that environmental stress in the form of social isolation does not affect white matter structure as assessed by DTI, even though it is known to affect oligodendrocyte maturation. Cortical neurochemistry, and specifically the Glutamine to Glutamate balance, was affected by both redox dysregulation and social isolation, and is thus a good target for further research on the interaction of redox imbalance and environmental stress in schizophrenia.
Abstract:
Objective: To evaluate, by magnetic resonance imaging, changes in the bone marrow of patients undergoing treatment for type I Gaucher's disease. Materials and Methods: Descriptive, cross-sectional study of Gaucher's disease patients submitted to 3 T magnetic resonance imaging of the femurs and lumbar spine. The images were blindly reviewed and the findings were classified according to the semiquantitative bone marrow burden (BMB) scoring system. Results: All of the seven evaluated patients (three men and four women) presented signs of bone marrow infiltration. Osteonecrosis of the femoral head was found in three patients, Erlenmeyer flask deformity in five, and no patient had vertebral body collapse. The mean BMB score was 11, ranging from 9 to 14. Conclusion: Magnetic resonance imaging is currently the method of choice for assessing bone involvement in Gaucher's disease in adults due to its high sensitivity in detecting both focal and diffuse bone marrow changes, and the BMB score is a simplified method for semiquantitative analysis that does not depend on advanced sequences or sophisticated hardware, allowing for classification of the disease extent and assisting in treatment monitoring.
Abstract:
The most suitable method for the estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables, such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches to the estimation of the pdf are compared: parametric methods, which assume that the data come from a determinate family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions have been used to generate simulated samples using parameters estimated from real samples. Nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives accurate estimates of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which shows additional advantages: the same size diversity value is obtained when using original sizes or log-transformed data, and size measurements with different dimensionality (lengths, areas, volumes or biomasses) may be immediately compared with the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation after data standardization by division by the sample geometric mean arises as the most reliable and generalizable method of size diversity evaluation.
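A minimal sketch of the recommended procedure (kernel estimation of the pdf after dividing the sizes by the sample geometric mean) might look as follows; the bandwidth rule, integration grid and simulated sample are illustrative assumptions rather than the exact settings used in the paper.

```python
# Sketch: Shannon size diversity of a sample of individual sizes, estimated with
# a Gaussian kernel after standardization by the sample geometric mean.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

def size_diversity(sizes, grid_points=2048):
    """Shannon size diversity (in nats): -integral of f(x) ln f(x) dx."""
    sizes = np.asarray(sizes, dtype=float)
    standardized = sizes / np.exp(np.mean(np.log(sizes)))  # divide by geometric mean
    kde = gaussian_kde(standardized)                       # nonparametric pdf estimate
    x = np.linspace(0.0, standardized.max() * 3.0, grid_points)
    f = kde(x)
    log_f = np.log(f, out=np.zeros_like(f), where=f > 0)   # avoid log(0)
    return trapezoid(-f * log_f, x)

# Hypothetical log-normally distributed individual sizes:
rng = np.random.default_rng(1)
sample = rng.lognormal(mean=1.0, sigma=0.6, size=200)
print(f"size diversity = {size_diversity(sample):.3f} nats")
```

Because of the geometric-mean standardization, running the same function on log-transformed sizes should give essentially the same value, which is the invariance property highlighted above.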
Abstract:
This paper proposes a calibration method that can be utilized for the analysis of SEM images. The field of application of the developed method is the calculation of the surface potential distribution of a biased silicon edgeless detector. The suggested processing of the data collected by SEM consists of several stages and takes into account different aspects affecting the SEM image. The calibration method does not claim to be precise, but it nevertheless gives the basics of the potential distribution when different biasing voltages are applied to the detector.
Abstract:
A software development process is a predetermined sequence of steps to create a piece of software. A software development process is used so that the implementing organization can gain significant benefits. The benefits for software development companies that can be attributed to software process improvement efforts are improved predictability of the development effort and higher-quality software products. The implementation, maintenance, and management of a software process, as well as software process improvement efforts, are expensive. The implementation phase in particular is expensive, with a slow return on investment even in the best case. Software processes are rare in very small software development companies because of the cost of implementation and an improbable return on investment. This study presents a new method to bring the benefits that are usually related to software process improvement to small companies at low cost. The study presents the reasons for the development of the method, a description of the method, and an implementation process for the method, as well as a theoretical case study of a method implementation. The study's focus is on describing the method. The theoretical use case is used to illustrate the theory of the method and its implementation process. The study ends with a few conclusions on the method and its implementation process. The main conclusion is that the method requires further study as well as implementation experiments to assess its value.
Abstract:
Military conscription and peacetime military service were the subjects of heated political, social and cultural controversies during the early years of national independence in Finland. Both the critics and the supporters of the existing military system described it as strongly formative of young men’s physical and moral development into adult men and male citizens. The conflicts over conscription prompted the contemporaries to express their notions about what Finnish men were like, at their best and at their worst, and what should and could be done about it. This thesis studies military conscription as an arena for the “making of manhood” in peacetime Finnish society, 1918–1939. It examines a range of public images of conscripted soldiering, asking how soldiering was depicted and given gendered meanings in parliamentary debates, war hero myths, texts concerned with the military and civic education of conscripts, as well as in works of fiction and reminiscences about military training as a personal experience. Studying conscription with a focus on masculinity, the thesis explores the different cultural images of manliness, soldiering and male citizenship on offer in Finnish society. It investigates how political parties, officers, educators, journalists, writers and “ordinary” conscripts used and developed, embraced or rejected these notions, according to their political purposes or personal needs. The period between the two world wars can be described as a fast-forward into military modernity in Finland. In the process, European middle class gender ideologies clashed with Finnish agrarian masculinities. Nationalistic agendas for the militarisation of Finnish manhood stumbled against intense class conflicts and ideological resistance. Military propaganda used images of military heroism, civic virtue and individual success to persuade the conscripts into ways of thinking and acting that were shaped by bourgeois mentality, nationalistic ideology and religious morality. These images are further analysed as expressive of the personal experiences and emotions of their middle-aged, male authors. The efforts of these military educators were, however, actively resisted on many fronts, ranging from rural working class masculinities among the conscripted young men to ideological critiques of the standing army system in parliament. In narratives about military training, masculinity was depicted as both strengthened and contradicted by the harsh and even brutal practices of interwar Finnish military training. The study represents a combination of new military history and the historical study of men and masculinities. It approaches masculinity as a contested and highly political form of social and cultural knowledge that is actively and selectively used by historic actors. Instead of trying to identify a dominant or “hegemonic” form of masculinity within a pre-determined theoretical structure, this study examines how the meanings ascribed to manhood varied according to class, age, political ideology and social situation. The interwar period in Finland can be understood as a period of contest between different notions of militarised masculinity, yet to judge by the materials studied, there was no clear winning party in that contest. A gradual movement from an atmosphere of conflict surrounding conscription towards political and cultural compromises can be discerned, yet this convergence was incomplete and many division lines remained.
Abstract:
This caring science study turns its searchlight on listening as something fundamental to the human being. The aim of the study is, on the basis of the researcher's tradition and a clearly stated theoretical core, foundation of values, ontological foundation and a hermeneutic approach, to uncover new insight into what listening is and what is caring in listening. The intention of the study is to bring ontological clarity, to clarify the place of listening within caring science theory, and to contribute images of the caring element with relevance for clinical practice. The hermeneutic way of working is a constant movement between the particular and the shared. The fundamental order of the caring science tradition provides the sight for the search for knowledge, while it is against the caritative theory of caring that new insight from historical sources is reflected, in order to illuminate and make visible what is evident and original about listening. The conceptual analysis of 'listen' as a concept provides an understanding of the concept's historical origins and its strong ties to the concepts of harking, heeding, obeying, following, observing and seeing, as well as desire, longing, inclination and joy. Reading and contemplation of the life works of the philosophers Buber and Lévinas and the author Dostoevsky uncover testimonies and still images that are interpreted in order to make visible interpretive assumptions about the essence and form of listening. I have read and contemplated the texts with openness, sensitivity and a lingering attitude, in order to let them touch me, speak to me and emerge as something new. From the individual texts, patterns of a common inner order and a common still image are uncovered, and interpretive assumptions about listening are formed. When the common inner order is allowed to converge with the study's caring science foundation, an archetypal image and theses about listening, and about what is caring in listening, become visible. The uncovering shows that listening is a fundamental attitude towards life itself. The human being listens to his or her innermost being in order to acquire an inner fundamental order and to heed his or her belonging in a more all-embracing communion with the creative, the other and the infinite. In the light of the power of love and the affirmation of human suffering, solitude and communion can form a covenant with each other in an in-between reality that bears the marks of infinity and eternity, and that lets tones of the holy and eternal resound in everything and everyone in a presence here and now. Listening is an act of being that is expressed through a caring attitude of serving, sacrificing and bearing responsibility for the other with fellow-human mercy and compassion. Ties to the other are formed through one's own vulnerability and by inviting and welcoming the other in his or her otherness, with humility, authenticity and hospitality. The caring communion creates a movement that carries a life-bringing force through which the human being attains the freedom to become, to find nourishment and reconciliation.
Abstract:
The large and growing number of digital images is making manual image search laborious. Only a fraction of images contain metadata that can be used to search for a particular type of image. Thus, the main research question of this thesis is whether it is possible to learn visual object categories directly from images. Computers process images as long lists of pixels that do not have a clear connection to the high-level semantics that could be used in image search. Various methods have been introduced in the literature to extract low-level image features, along with approaches to connect these low-level features with high-level semantics. One of these approaches, studied in this thesis, is called Bag-of-Features. In the Bag-of-Features approach, images are described using a visual codebook. The codebook is built by clustering descriptions of image patches. An image is then described by matching the descriptions of its patches against the visual codebook and counting the number of matches for each code. In this thesis, unsupervised visual object categorisation using the Bag-of-Features approach is studied. The goal is to find groups of similar images, e.g., images that contain an object from the same category. The standard Bag-of-Features approach is improved by using spatial information and visual saliency. It was found that the performance of visual object categorisation can be improved by using the spatial information of local features to verify matches. However, this process is computationally heavy, and thus the number of candidate images must be limited before spatial matching, for example, by using the Bag-of-Features method as in this study. Different approaches to saliency detection are studied and a new method based on the Hessian-Affine local feature detector is proposed. The new method achieves results comparable to the current state of the art. Visual object categorisation performance was further improved by using foreground segmentation based on saliency information, especially when the background could be considered clutter.
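As a hedged illustration of the Bag-of-Features pipeline described above (a codebook built by clustering local descriptors, then a histogram of codeword matches per image), the sketch below clusters random stand-in descriptors with k-means; the detectors, descriptors and codebook size used in the thesis are not reproduced.

```python
# Sketch of the Bag-of-Features representation: cluster local descriptors into a
# visual codebook, then describe each image as a histogram of codeword counts.
# Descriptors are random stand-ins here; in practice they would come from a
# local feature detector/descriptor such as Hessian-Affine regions with SIFT.
import numpy as np
from sklearn.cluster import MiniBatchKMeans

rng = np.random.default_rng(42)
CODEBOOK_SIZE = 64

# Pretend we extracted 128-D descriptors from a collection of training images.
training_descriptors = rng.normal(size=(5000, 128))
codebook = MiniBatchKMeans(n_clusters=CODEBOOK_SIZE,
                           random_state=0).fit(training_descriptors)

def bag_of_features(image_descriptors):
    """Histogram of nearest-codeword assignments, L1-normalised."""
    words = codebook.predict(image_descriptors)
    hist = np.bincount(words, minlength=CODEBOOK_SIZE).astype(float)
    return hist / hist.sum()

# One 'image' with 300 local descriptors:
print(bag_of_features(rng.normal(size=(300, 128)))[:8])
```

The resulting fixed-length histograms can then be compared or clustered directly, which is what makes the representation convenient for unsupervised categorisation.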
Abstract:
Most studies on measuring plant transpiration, especially in woody fruit species, rely on methods that supply heat to the trunk. This study aimed to calibrate the Thermal Dissipation Probe (TDP) method to estimate transpiration, to study the effects of natural thermal gradients, and to determine the relation between the outside diameter of the stem and the xylem area in young 'Valencia' orange plants. TDPs were installed in 40 fifteen-month-old orange plants, grown in 500 L boxes in a greenhouse. The correction of natural thermal differences (DTN) was tested for the estimate based on two unheated probes. The area of the conductive section was related to the outside diameter of the stem by means of polynomial regression. The equation for estimating sap flow was calibrated against lysimeter measurements of a representative plant as the standard. The angular coefficient of the sap flow equation was adjusted by minimizing the absolute deviation between the estimated sap flow and the daily transpiration measured by the lysimeter. Based on these results, it was concluded that the TDP method, with adjustment of the original calibration and correction of the DTN, was effective for assessing transpiration.
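A simplified sketch of this kind of calibration is given below. The Granier-type equation and constants used here (sap flux density u = alpha * K^beta with alpha ≈ 118.99e-6 m/s and beta ≈ 1.231) are generic literature values, and the probe readings, xylem area and lysimeter total are hypothetical, so this is not the calibration actually obtained in the study.

```python
# Sketch: adjusting the multiplicative (angular) coefficient of a Granier-type
# sap-flow equation against daily lysimeter transpiration. Constants and data
# are illustrative; this is not the calibration obtained in the study.
import numpy as np
from scipy.optimize import minimize_scalar

BETA = 1.231        # exponent of the original Granier calibration
ALPHA0 = 118.99e-6  # original coefficient, m/s

def daily_sap_flow(dT, dT_max, sapwood_area_m2, alpha, step_s=1800):
    """Integrate sap flux density u = alpha * K**BETA over the day (litres/day)."""
    K = (dT_max - dT) / dT                     # flow index from the probe temperature difference
    u = alpha * np.clip(K, 0.0, None) ** BETA  # sap flux density, m^3 m^-2 s^-1
    return np.sum(u) * sapwood_area_m2 * step_s * 1000.0   # litres per day

# Hypothetical half-hourly probe readings for one day and one plant:
rng = np.random.default_rng(3)
dT_max = 10.0
dT = dT_max - 4.0 * np.sin(np.linspace(0.0, np.pi, 48)) + rng.normal(0.0, 0.1, 48)
area = 2.0e-4            # xylem area from the stem-diameter regression, m^2
lysimeter_litres = 1.8   # measured daily transpiration

# Adjust alpha by minimising the absolute deviation from the lysimeter value.
res = minimize_scalar(
    lambda a: abs(daily_sap_flow(dT, dT_max, area, a) - lysimeter_litres),
    bounds=(0.1 * ALPHA0, 10 * ALPHA0), method="bounded")
print(f"adjusted coefficient: {res.x:.3e} m/s (original {ALPHA0:.3e})")
```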
Abstract:
This study aimed to propose methods to identify croplands cultivated with winter cereals in the northern region of Rio Grande do Sul State, Brazil. To this end, temporal profiles of the Normalized Difference Vegetation Index (NDVI) from the MODIS sensor, from April to December of each year from 2000 to 2008, were analyzed. Firstly, crop masks were elaborated by subtracting the minimum NDVI image (April to May) from the maximum NDVI image (June to October). Then, an unsupervised classification of the NDVI images was carried out (Isodata), considering only the crop mask areas. According to the results, the crop masks allowed the identification of pixels with the greatest variation in green biomass. This variation may or may not be associated with winter cereal areas established for grain production. The unsupervised classification generated classes whose NDVI temporal profiles were associated with water bodies, pastures, winter cereals for grain production and winter cereals for soil cover. The temporal NDVI profiles of the class of winter cereals for grain production were in agreement with crop patterns in the region (developmental stage, management standard and sowing dates). Therefore, unsupervised classification based on crop masks allows distinguishing and monitoring winter cereal crops, which are similar in terms of morphology and phenology.
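The masking and classification steps can be illustrated with a short sketch: the difference between the maximum (June to October) and minimum (April to May) NDVI composites is thresholded to flag pixels with a large variation in green biomass, and an unsupervised clustering is then run only on the masked temporal profiles. Here k-means stands in for the Isodata classifier, and the threshold, array shapes and number of classes are illustrative assumptions.

```python
# Sketch: crop mask from NDVI composites followed by unsupervised classification
# of the masked temporal profiles. k-means stands in for Isodata; the threshold
# and the synthetic NDVI stack are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n_dates, height, width = 20, 100, 100          # April-December NDVI time series
ndvi = rng.uniform(0.1, 0.9, size=(n_dates, height, width))

ndvi_min = ndvi[:4].min(axis=0)                # minimum composite (April to May)
ndvi_max = ndvi[4:16].max(axis=0)              # maximum composite (June to October)
crop_mask = (ndvi_max - ndvi_min) > 0.5        # pixels with large biomass variation

profiles = ndvi[:, crop_mask].T                # (n_masked_pixels, n_dates) temporal profiles
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)

classified = np.full((height, width), -1)      # -1 = outside the crop mask
classified[crop_mask] = labels
print("masked pixels:", crop_mask.sum(), "classes:", np.unique(labels))
```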