965 results for Espresso coffee
Abstract:
The television and the ways it has invited the audience to take part have been changing during the last decade. Today's interaction, or rather participation, comes from multiplatform formats, such as TV spectacles that combine TV and web platforms in order to create a wider TV experience. Multiplatform phenomena have spread television consumption and traditional coffee table discussions to several different devices and environments. Television has become part of a bigger puzzle of interconnected devices that operates on several platforms instead of just one. This thesis examines Finnish television (2004–2014) through the notion of audience participation and introduces the technical, thematic, and social linkages as three different phases (interactive, participatory, and social) and their most characteristic features in terms of audience participation. The study also aims to address the possible, subtler variations that have taken place under the concept of digital television. Firstly, Finnish television history has gone through numerous trials exploring the interactive potential of television formats. Finnish SMS-based iTV had its golden era around 2005, when nearly 50% of television formats were to some extent interactive. Nowadays, interactive television formats have vanished due to their negative reputation, and this important part of recent history has mainly been neglected in academic research. The dissertation also focuses on the present situation and the ways television content invites the audience to take part. “TV meets the Internet” is a global expression that characterises digital TV, and the use of the Web combined with television content is also examined, as are the linkages between television and social media. Since television can nowadays be described as multifaceted, the research approaches are also versatile. The research is based on qualitative content analysis, media observation, and Internet inquiry. The research material also varies: it consists of primary data (taped iTV formats, website material, and social media traces from both Twitter and Facebook) and secondary data (discussion forums, observations from the media, and Internet inquiry data). To sum up the results, the iTV phase represented, through its content, a new possibility for audiences to take part in a TV show (through gameful and textual features) in real time. In the participatory phase, the most characteristic feature, from the viewpoint of TV-related content, is that online platforms were used to immerse the audience in additional material and thereby extend the enjoyment of TV watching beyond the actual broadcast. During the social (media) phase, both of these features, real-timeness and extended enjoyment through additional material, are combined: Facebook and Twitter, for example, are used to immerse people in live events (in real time) via broadcast-related tweets and extra material offered on a Facebook page. This thesis fills a gap in Finnish television research by examining the rapid changes that have taken place in the field within the last ten years. The main result is that the development of Finnish digital television has been much more diverse and subtle than would be anticipated by following only the news, media, and contemporary discourses on the subject of television. The results will benefit both practitioners and academics by identifying the recent history of Finnish television.
Abstract:
A polyacrylamide gel electrophoresis (SDS-PAGE) system was adapted to detect the presence of additional whey in dairy beverages distributed in a Brazilian Government School Meals Program. Aqueous solutions of samples in 8 M urea were submitted to a polyacrylamide gel gradient (10% to 18%). Gel scans from electrophoresis patterns of previously adulterated milk samples showed that casein peak areas decreased while the peak areas of beta-lactoglobulin plus alpha-lactalbumin increased as the percentage of raw milk powder replaced by whey powder increased. The relative densitometric areas of caseins, or of beta-lactoglobulin plus alpha-lactalbumin, plotted against the percentage of whey added to the raw milk showed squared linear correlation coefficients higher than 0.97. The casein plot was used to determine the percentage of additional whey in 116 dairy beverages (chocolate or coffee flavor). Considering that the lowest relative casein concentration found in commercial milk powder samples by the present method was 72%, dairy beverages containing casein percentages equal to or higher than this value were considered free of additional whey. Based on this criterion, about 49% of the coffee-flavor dairy beverages and 29% of the chocolate-flavor beverages, among all the samples analyzed, were adulterated with whey protein to reach the total protein contents specified on their labels. The present method showed a sensitivity of 5% additional whey.
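The abstract describes a linear densitometric calibration (R² > 0.97) relating casein peak area to the percentage of whey added, which is then inverted to quantify adulteration in unknown samples. A minimal sketch of that kind of calibration, using entirely hypothetical numbers rather than the thesis data, might look like this:

# Minimal sketch (hypothetical data): calibrating relative casein peak area
# against the percentage of whey added, then inverting the fit to estimate
# added whey in an unknown sample.
import numpy as np

# Hypothetical calibration set: % of milk powder replaced by whey powder
whey_added = np.array([0, 10, 20, 30, 40, 50], dtype=float)    # %
# Hypothetical relative casein areas from densitometer scans (%)
casein_area = np.array([78, 71, 64, 57, 49, 42], dtype=float)

# Least-squares line: casein_area = slope * whey_added + intercept
slope, intercept = np.polyfit(whey_added, casein_area, 1)
r_squared = np.corrcoef(whey_added, casein_area)[0, 1] ** 2    # should exceed 0.97

def estimate_added_whey(sample_casein_area: float) -> float:
    """Invert the calibration line to estimate % added whey in a sample."""
    return (sample_casein_area - intercept) / slope

# A sample at or above the 72% casein threshold would be treated as whey-free
print(r_squared, estimate_added_whey(60.0))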
Abstract:
Torrefaction is the partial pyrolysis of wood characterised by thermal degradation of predominantly hemicellulose under an inert atmosphere. Torrefaction can be likened to coffee roasting but with wood in place of beans. This relatively new process concept makes wood more like coal. Torrefaction has attracted interest because it potentially enables higher rates of co-firing in existing pulverised-coal power plants and hence greater net CO2 emission reductions. Academic and entrepreneurial interest in torrefaction has skyrocketed in the last decade. Research output has focused on the many aspects of torrefaction – from detailed chemical changes in feedstock to globally-optimised production and supply scenarios with which to sustain EU emission-cutting directives. However, despite its seemingly simple concept, torrefaction has retained a somewhat mysterious standing. Why hasn't torrefied pellet production become fully commercialised? The question is one of feasibility. This thesis addresses this question. Herein, the feasibility of torrefaction in co-firing applications is approached from three directions. Firstly, the natural limitations imposed by the structure of wood are assessed. Secondly, the environmental impact of production and use of torrefied fuel is evaluated, and thirdly, economic feasibility is assessed based on the state of the art of pellet making. The conclusions reached in these domains are as follows. Modification of wood's chemical structure is limited by its naturally existing constituents. Consequently, key properties of wood with regard to its potential as a co-firing fuel have a finite range. The most ideal benefits gained from wood torrefaction cannot all be realised simultaneously in a single process or product. Although torrefaction at elevated pressure may enhance some properties of torrefied wood, high-energy torrefaction yields are achieved at the expense of other key properties such as heating value, grindability, equilibrium moisture content and the ability to pelletise torrefied wood. Moreover, pelletisation of even moderately torrefied fuels is challenging, and achieving a standard level of pellet durability, as required by international standards, is not trivial. Despite a reduced moisture content, brief exposure of torrefied pellets to water from rainfall or immersion results in a high level of moisture retention. Based on the above findings, torrefied pellets are an optimised product. Assessment of the energy and CO2-equivalent emission balance indicates that there is no environmental barrier to production and use of torrefied pellets in co-firing. A long product transport distance, however, is necessary in order for emission benefits to exceed those of conventional pellets. Substantial CO2 emission reductions appear possible with this fuel if laboratory milling results carry over to industrial scales for direct co-firing. From demonstrated state-of-the-art pellet properties, however, the economic feasibility of torrefied pellet production falls short of that of conventional pellets, primarily due to the larger capital investment required for production. If the capital investment for torrefied pellet production can be reduced significantly or if the pellet-making issues can be resolved, the two production processes could be economically comparable. In this scenario, however, transatlantic shipping distances and a dry fuel are likely necessary for production to be viable.
Based on demonstrated pellet properties to date, environmental aspects and production economics, it is concluded that torrefied pellets do not warrant investment at this time. However, from the presented results, the course of future research in this field is clear.
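The abstract weighs high-energy torrefaction yields against other fuel properties without reproducing the underlying relations. As a rough, hedged illustration only, the sketch below uses a standard relation from the torrefaction literature (not necessarily the thesis' exact formulation) linking mass yield and heating values to energy yield; every figure is hypothetical:

# Rough illustration (not the thesis' stated method): relating torrefaction
# mass yield and heating values to energy yield. All figures are hypothetical.
mass_yield = 0.80            # kg torrefied wood per kg dry feedstock
hhv_raw = 19.5               # MJ/kg, higher heating value of dry raw wood
hhv_torrefied = 21.5         # MJ/kg, higher heating value of torrefied wood

# Energy yield: fraction of the feedstock's chemical energy retained in the product
energy_yield = mass_yield * hhv_torrefied / hhv_raw
print(f"Energy yield ~ {energy_yield:.0%}")   # ~88% under these assumptions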
Abstract:
This thesis answers some important questions about how Fair Trade is experienced and perceived by some Northern sellers, consumers, activists, advocates, practitioners, and an importer. As it relates to sellers, I focus only on small-scale independent businesses (i.e. I do not include large corporate businesses in my interview sample). Fair Trade works to establish a dignified livelihood for many producers in the South. Some of the most important actors in the Fair Trade movement are the people who buy, sell, and/or advocate for Fair Trade in the North. Fair Trade is largely a consumer movement which relies on the purchase of Fair Trade products. Without consumers purchasing Fair Trade products, retailers providing the products for sale, and activists raising awareness of Fair Trade, the movement, as it is presently constituted, would be non-existent. This qualitative research is based on 19 in-depth interviews with nine interviewees involved with Fair Trade in Canada. I focus on benefits, challenges, and limitations of Fair Trade in the context of their involvement with it. I describe and analyze how people become involved with Fair Trade, what motivates them to do so, what they hope to achieve, and the benefits of being involved. I also describe and analyze how people understand and deal with any challenges and limitations associated with their involvement with Fair Trade. I also explore whether involvement with Fair Trade influences how people think about other products that they purchase and, if so, in what ways. I focus mainly on the commodity of coffee, but my discussion is not limited to this single commodity. Interviewees' experiences with and participation in Fair Trade vary in terms of their level of involvement and interest in the broader Fair Trade movement (as opposed to just participating in the market component). This research reveals that while Fair Trade is a small movement, sellers, consumers, and activists have had much success in the advancement of Fair Trade. While challenges have not deterred interviewees from continuing to participate in Fair Trade, analysis and explanation of such challenges provides the opportunity for Fair Trade practitioners to develop effective solutions in an effort to meet the needs of various Fair Trade actors.
Abstract:
Breast cancer (BC) is the second leading cause of cancer-related death among women in most industrialized countries. People who have BC may not have inherited cancer-causing mutations from their parents; rather, some cells acquire mutations that lead to cancer. In hereditary cancer, tumour cells generally contain mutations that are not found elsewhere in the body, but may also carry mutations that are distributed across all cells. BC arises from mutations in genes that regulate cell proliferation and DNA repair. Two genes appear particularly affected by mutations: 'Breast Cancer 1' (BRCA1) and 'Breast Cancer 2' (BRCA2), which are involved in genetic predisposition to BC. An estimated 5-10% of breast cancer cases are attributable to a genetic predisposition, and most of these are linked to an abnormality in the BRCA1 or BRCA2 gene. Several studies have been conducted among women with sporadic BC, and a few have focused on BRCA mutation carriers. Our research was therefore undertaken to test the hypothesis of an association between BC, lifestyle, and dietary habits among French-Canadian women who do not carry the 6 BRCA mutations most frequent in this population. We conducted a case-control study in this population. Some 280 women with breast cancer who were not BRCA mutation carriers were recruited as cases. Controls were recruited among family members of the cases (n=15) or from other families affected by BC (n=265). Participants were of all ages and were recruited from an ongoing cohort study led by a research team at the Centre Hospitalier Universitaire de Montréal (CHUM) Hôtel-Dieu in Montreal. Dietary intakes were collected with a validated semi-quantitative food-frequency questionnaire administered by a nutritionist, covering the period up to two years before the first BC diagnosis for cases and up to two years before the telephone interview for controls. A core questionnaire was administered to participants by the research nurse to collect sociodemographic information and data on BC risk factors. A significant positive association was detected between the age (over 50 years) at which subjects reached their highest Body Mass Index (BMI) and BC: odds ratio (OR) = 2.83; 95% confidence interval (95% CI) 2.34-2.91. Moreover, a positive association was found for a weight gain of >34 lbs, compared with ≤15 lbs, since age 20: OR = 1.68; 95% CI 1.10-2.58. A weight gain of >24 lbs compared with ≤9 lbs since age 30 also showed an increased BC risk: OR = 1.96; 95% CI 1.46-3.06. A positive association was also detected for a weight gain of >12 lbs compared with ≤1 lb since age 40: OR = 1.91; 95% CI 1.53-2.66. Regarding smoking, we observed a significant positive association for consumption of more than 9 pack-years: OR = 1.59; 95% CI 1.57-2.87.
Moderate physical activity was suggested to protect against BC: practising >24.8 metabolic equivalent (MET)-hours per week, compared with ≤10.7 MET-hours per week, lowered BC risk by 52%: OR = 0.48; 95% CI 0.31-0.74. Total physical activity (between 16.2 and 33.2 MET-hours per week) also showed a 43% reduction in BC risk: OR = 0.57; 95% CI 0.37-0.87. However, there was no association between vigorous physical activity and BC risk. Analysis of macro- and micronutrients and food groups showed that a total energy intake above 2057 kcal per day increased BC risk 2.5-fold: OR = 2.54; 95% CI 1.67-3.84. Concerning coffee consumption, participants who drank more than 8 cups of coffee per day had a 40% higher BC risk: OR = 1.40; 95% CI 1.09-2.24. Subjects consuming more than 9 g of alcohol (ethanol) per day also had a 55% higher risk: OR = 1.55; 95% CI 1.02-2.37. In addition, significant positive associations were detected between BC and consumption of more than two bottles of beer per week (OR = 1.34; 95% CI 1.28-2.11), 10 ounces of wine per week (OR = 1.16; 95% CI 1.08-2.58), or 6 ounces of spirits per week (OR = 1.09; 95% CI 1.02-2.08), respectively. In summary, the results of this research support the hypothesis that lifestyle and dietary habits play an important role in BC etiology among French-Canadian women who are not BRCA mutation carriers. The results show that weight gain and smoking are linked to elevated BC risk, whereas moderate physical activity helps reduce it. Furthermore, our results suggest that a relatively high total energy intake and high coffee and alcohol consumption may increase the risk of this cancer. This work highlights a new, previously uninvestigated research direction. Its findings could contribute new information and guidance to help the population modify its lifestyle and dietary habits in order to lower breast cancer risk.
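The abstract reports odds ratios with 95% confidence intervals; the thesis presumably estimated them with (adjusted) logistic regression, which is not reproduced here. As a minimal illustration of the arithmetic behind a crude OR and its Wald 95% CI, with made-up counts:

# Minimal sketch: crude odds ratio with a Wald 95% CI from a 2x2 table.
# The counts below are hypothetical and only illustrate the arithmetic.
import math

# Hypothetical exposure counts: a/b = exposed/unexposed cases,
# c/d = exposed/unexposed controls
a, b, c, d = 120, 160, 90, 190

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)            # SE of ln(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}-{ci_high:.2f})")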
Abstract:
Through the justice principles (equality, time, status, need, efficiency and worth) developed by Jon Elster, we show in this article how fair trade certification for producers is legitimised by stakeholders. Based on a field investigation with coffee growers in Peru, Ecuador and Bolivia and with fair trade organisations in the North (Max Havelaar/FLO, Andines and Artisans du Monde), the analysis firstly reviews just certification according to the impersonal criteria of “mechanical” justice, such as equality, time and efficiency. The second section looks at more individualised criteria such as the status, need and worth of the beneficiaries. Finally, it determines in what way fair trade is really a mixed bag, one which calls upon different principles of justice to justify what it is out to accomplish. The main result of the analysis is that fair certification granted to producer organisations is not distributed according to a unique system of justice based on a single criterion. On the contrary, fair trade is a complex and hybrid bag that uses different components from each distribution procedure.
Abstract:
This thesis analyses the structure of transnational trade in cocaine, heroin, and marijuana. Starting from the world-systems perspective, it develops the hypothesis that drug trafficking forms a system that is the inverse of legal trade. Network-analysis tools are applied to drug flows between countries. The thesis draws on two complementary data sources. The first is a unique database compiled by the United Nations Office on Drugs and Crime (UNODC) on significant seizures made worldwide between 1998 and 2007 (n = 47,629). These data are supplemented by information contained in some ten reports published by international drug-trafficking monitoring bodies. The directed exchange networks built from these data make it possible to examine the extent of trafficking between most of the world's countries and to characterize their individual involvement. Chapters 3 and 4 address the structure of the traffic itself. First, the different roles played by countries and the characteristics of the three drug markets are compared, and the quantities in circulation and interception rates are estimated for the 16 geographic regions defined by the UNODC. Second, their structural characteristics are compared with those of legal markets. Drug markets turn out to be far less dense, and peripheral countries play a more pronounced role in them. Trade inequality characterizes both economies, but their structures are inverted. Chapter 5 analyses the main source of risk for traffickers: drug seizures. The compiled data show that police drug seizures aggregated at the country level are primarily indicative of trafficking volume. Any bias related to police pressure is negligible for the quantities seized but more pronounced for the number of seizures. Control agencies would thus be better able to modulate their activities than the resulting outcomes. The results also suggest that traffickers adopt various strategies to limit seizure-related losses. Chapter 6 turns to the impact of the structure on drug prices and value. Wholesale prices vary considerably from country to country and from drug to drug. These variations are explained by the constraints traffickers face in their activities. On the one hand, the value of drugs rises faster when they are destined for countries where import risks and costs are high. On the other hand, price mark-ups are more pronounced when flows are directed toward countries at the core of the legal economy. Once again, the roles are inverted: the countries that are usually advantaged depend on the most disadvantaged, and poor countries take the opportunity to exploit the rich.
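The abstract describes directed country-to-country exchange networks built from seizure records and compared on structural measures such as density. A minimal sketch of that kind of construction, using the networkx library and made-up seizure records (country codes and quantities are hypothetical, not UNODC data), might look like this:

# Minimal sketch: building a directed, weighted country-to-country network
# from seizure-style records and computing simple structural measures.
import networkx as nx

# (origin, destination, kilograms seized) - hypothetical illustrative records
seizures = [
    ("CO", "US", 1200.0),
    ("CO", "ES", 450.0),
    ("AF", "IR", 800.0),
    ("MX", "US", 950.0),
    ("ES", "FR", 130.0),
]

g = nx.DiGraph()
for origin, destination, kg in seizures:
    if g.has_edge(origin, destination):
        g[origin][destination]["weight"] += kg   # aggregate repeated routes
    else:
        g.add_edge(origin, destination, weight=kg)

print("density:", nx.density(g))                        # how sparse the market is
print("out-degree (exporter role):", dict(g.out_degree()))
print("in-degree (importer role):", dict(g.in_degree()))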
Abstract:
Gold mining is one of the human activities that has greatly increased the release of metal contaminants into the environment. Mercury (Hg), arsenic (As), and selenium (Se) are 3 metallic pollutants of high environmental toxicity. In aquatic environments, they can undergo transformations leading to compounds capable of bioaccumulation and biomagnification. This can result in concentrations in fish and in organisms at the top of food chains up to 10⁶ times those measured in the water, posing serious threats to the health of these organisms and of their consumers, including humans. This study assessed Hg, As, and Se levels in aquatic environments in Burkina Faso, a sub-Saharan African region subject to intensive mining. The potential risk to aquatic organisms and humans was evaluated by considering the effects of the antagonistic Se/Hg and As/Se interactions. The bioaccumulation and transfer of Hg and Se in food webs are also described. Human exposure to fish Hg was also evaluated in the laboratory by measuring bioaccessibility, as a proxy for bioavailability, through simulated human digestion. In general, the aquatic environments studied were little affected by these 3 metal(loid)s, although some fish species from the deepest reservoirs showed Hg levels above the 500 ng Hg/g (fresh weight) recommended by the WHO. These levels may pose toxicological risks to the fish and to their consumers. Taking the Se/Hg antagonism into account, 99% of fish samples would be less exposed to Hg toxicity owing to the simultaneous presence of selenium in the environment and could be consumed without risk. However, the potential effects of the As/Se antagonism could reduce the beneficial effects of Se and bring this proportion down to 83%. Stable nitrogen (δ15N) and carbon (δ13C) isotope signatures of aquatic organisms were used to trace Hg and Se transfer pathways in food webs. Very short food chains (3-4 trophic levels) and mostly benthic fish were observed. The isotopic approach did not, however, detect seasonal variations in fish Hg contamination levels. Examination of fish stomach contents better explained the decrease in Hg and Se concentrations observed in some fish during the dry season, linked to changes in prey composition that the isotopic analysis did not capture. The study suggests that stomach-content analysis and the study of invertebrate community dynamics, coupled with metal analysis, could improve understanding of how the studied ecosystems function. Finally, the experimental assessment of Hg exposure indicates that pre-consumption processing methods, as well as the use of dietary items such as tea and coffee with fish meals by some human communities, affect the bioaccessibility of fish Hg. Pending validation in animal models, these results suggest that community dietary habits should be taken into account when drawing up appropriate fish-consumption advisories.
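The abstract invokes the Se/Hg antagonism without spelling out the criterion used; a common screening approach in the literature (an assumption here, not the thesis' stated method) is the Se:Hg molar ratio, with values above 1 taken to suggest a protective selenium excess. A minimal sketch with made-up concentrations:

# Minimal sketch (an assumption, not the thesis' stated method): screening fish
# samples with the Se:Hg molar ratio, a criterion often used to judge whether
# co-occurring selenium may offset mercury toxicity. Concentrations are made up.
MOLAR_MASS_HG = 200.59   # g/mol
MOLAR_MASS_SE = 78.97    # g/mol
WHO_LIMIT_NG_G = 500     # ng Hg/g fresh weight, the guideline cited in the abstract

# Hypothetical samples: (sample id, Hg ng/g fresh weight, Se ng/g fresh weight)
samples = [("tilapia_1", 320, 410), ("catfish_1", 640, 380), ("perch_1", 150, 520)]

for name, hg_ng_g, se_ng_g in samples:
    se_hg_molar = (se_ng_g / MOLAR_MASS_SE) / (hg_ng_g / MOLAR_MASS_HG)
    above_who = hg_ng_g > WHO_LIMIT_NG_G
    print(f"{name}: Se:Hg molar ratio = {se_hg_molar:.2f}, "
          f"exceeds WHO guideline: {above_who}")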
Abstract:
The study aims to investigate the hydrodynamic characteristics of a swirling fluidized bed using large particles (Geldart D-type) selected from locally available agricultural produce (coffee beans and black pepper). The important variables considered in the present study include the percentage area of opening, the angle of air injection, and the percentage useful area of the distributor. A total of seven distributors have been designed and fabricated for a bed column of 300 mm, namely single-row vane-type distributors (15° and 20° vane angle), inclined-hole-type distributors (15° and 20° vane angle), and perforated plate distributors. The useful areas of the single-row vane-type, three-row vane-type, and inclined-hole-type distributors are 64%, 91%, and 94%, respectively. The hydrodynamic parameters considered in the present study include distributor pressure drop, air velocity, minimum fluidizing velocity, bed pressure drop, bed height, and bed behaviour. It has been observed that, in general, the distributor pressure drop decreases with an increase in the percentage area of opening; further, an increase in the area of opening above 17% will not considerably reduce the distributor pressure drop. In the present study, for the distributor with an area of opening of 17%, and corresponding to the maximum measured superficial velocity of 4.33 m/s, the distributor pressure drop obtained was 55.25 mm of water. The study of the bed behaviour revealed that, in a swirling fluidized bed, once swirl motion starts, the bed pressure drop increases with superficial velocity in the outer region and decreases in the inner region. This means that, at higher superficial velocity, the air might bypass through the inner boundary of the bed (around the cone). So, depending on the process for which the bed is used, the maximum superficial velocity is to be limited to obtain optimum bed performance.
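Using only the figures quoted in the abstract, the short sketch below converts the measured distributor pressure drop from mm of water to pascals and, as an illustrative assumption not stated in the abstract, estimates the air velocity through the distributor openings from the open-area fraction by simple continuity:

# Sketch using the abstract's quoted figures; the orifice-velocity estimate is
# an illustrative continuity assumption, not a result reported in the study.
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

pressure_drop_mm_h2o = 55.25          # measured distributor pressure drop
superficial_velocity = 4.33           # m/s, maximum measured value
open_area_fraction = 0.17             # 17% distributor area of opening

pressure_drop_pa = RHO_WATER * G * (pressure_drop_mm_h2o / 1000.0)
orifice_velocity = superficial_velocity / open_area_fraction  # continuity estimate

print(f"Distributor pressure drop ~ {pressure_drop_pa:.0f} Pa")
print(f"Estimated velocity through openings ~ {orifice_velocity:.1f} m/s")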
Abstract:
The tea industry in India is going through a period of crisis. The crisis is brought about mainly by cost escalation and declining or stagnant prices. The impact of the present crisis is felt most by the owners of tea plantations in Kerala. The present study assumes significance due to the fact that the crisis which has already affected Kerala's tea industry is now threatening to extend to other tea-growing areas in south India. Today, ensuring a favourable price to the producers vis-à-vis the possibilities of reducing the cost of production through increases in the productivity of land and labour are the main considerations. The main purpose of the study is to analyse the factors behind the crisis as well as to explore immediate and long-term measures for the sustained growth of the industry.
Continuation and discontinuation of local institutions in community-based natural resource management
Abstract:
Currently the push toward frontier areas, which until twenty years ago were still largely untouched by commercial agriculture, is taking place on a massive scale. This push is being driven not least by global economic developments, such as the price increase of agricultural commodities like coffee and cocoa. In most cases the indigenous communities become trapped between the state monopoly on natural resource management and the competition for resources by external actors. In this process the indigenous communities start to lose their access to resources. Another victim in this process is the environment in which the natural resources are embedded. International and national organizations working to conserve the environment have become conscious of the important role that indigenous people could fulfil as partners in this endeavour. This partnership in struggle has produced a new discourse on the relationship between indigenous people and their environment. As a further consequence, programs were set up to develop what became known as Community Based Natural Resource Management (CBNRM), with its numerous variations. Based on a case study in a village on the eastern border of the Lore Lindu National Park in Central Sulawesi, this study questions the basic assumption behind the concept of Community Based Natural Resource Management (CBNRM): namely, the assumption that communities living at the margin of the forest are socially and culturally homogeneous, still more or less egalitarian, and basically living in harmony with their natural environment. The study was inspired by the persistent critique, although still a minority view, of the basic assumptions of CBNRM from academics and practitioners working through the Entitlement perspective. Another inspiration was the mounting critique of the participatory approach. The study further explores the usefulness of certain approaches. One approach much relied on in this study was the local history of the community studied, drawing on oral accounts and local written documents on local history, legends, and local stories. These sources proved quite capable of bringing the local history to light. Another was the actor-oriented approach, which later came to be supported by the concept of Social Pool Resources. The latter concept proved to be a useful analytical instrument for integrating social institutions and the common pool resources as a field of action for the different actors as human agencies.
Abstract:
Ubiquitous computing has been an attractive research field over the past decade and the current one. It concerns the unobtrusive support of people in their everyday tasks by computers. This support is made possible by the omnipresence of computers that spontaneously join together into distributed communication networks in order to exchange and process information. Ambient intelligence is an application of ubiquitous computing and a strategic research direction of the European Union's Information Society Technology programme. The goal of ambient intelligence is a more comfortable and safer life. Distributed communication networks for ubiquitous computing are characterized by the heterogeneity of the computers involved. These range from tiny devices embedded in everyday objects to powerful mainframes. The computers connect spontaneously over wireless networking technologies such as wireless local area networks (WLAN), Bluetooth, or UMTS. This heterogeneity complicates the development and construction of distributed communication networks. Middleware is a software technology for reducing complexity by abstracting it into a homogeneous layer. Middleware offers a uniform view of the resources, functionalities, and computers it abstracts. Distributed communication networks for ubiquitous computing are characterized by spontaneous connections between computers, whereas classical middleware assumes that computers maintain permanent communication relationships with one another. The concept of service-oriented architecture enables the development of middleware that also supports spontaneous connections between computers. Middleware functionality is then realized by services, which are independent software units. The Wireless World Research Forum describes services that future middleware should contain. These services are hosted by an execution environment; however, there is as yet no definition of how such an execution environment should be shaped or what functionality it must provide. This thesis contributes to aspects of middleware development for distributed communication networks in ubiquitous computing. The focus lies on middleware and foundational technologies. The contributions take the form of concepts and ideas for middleware development and cover service discovery, service updating, and contracts between services. They are provided within a framework optimized for middleware development. This framework, called Framework for Applications in Mobile Environments (FAME²), contains guidelines, a definition of an execution environment, and support for various access-control mechanisms to protect middleware against unauthorized use. The capabilities of the FAME² execution environment include:
• minimal resource usage, so that it can also run on resource-constrained devices such as mobile phones and tiny computers
• support for adapting the middleware by changing the services it contains while the middleware is running
• an open interface for using practically any existing service-discovery solution
• a means of updating services at runtime, enabling corrective, optimizing, and adaptive maintenance of services
An accompanying contribution is the Extensible Constraint Framework (ECF), which makes Design by Contract (DbC) usable within FAME². DbC is a technique for formulating contracts between services and thereby increasing software quality. ECF allows such contracts to be negotiated and optimized.
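To make the Design-by-Contract idea concrete, here is a minimal, generic sketch of pre- and postcondition checking between services, written as a plain Python decorator. It is not the ECF or FAME² API; the service name and endpoint used are hypothetical.

# Minimal, generic sketch of Design by Contract (NOT the ECF/FAME² API):
# a decorator that checks a precondition before and a postcondition after a call.
from functools import wraps

def contract(pre=None, post=None):
    """Wrap a function with optional precondition and postcondition checks."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            if pre is not None and not pre(*args, **kwargs):
                raise ValueError(f"precondition violated for {func.__name__}")
            result = func(*args, **kwargs)
            if post is not None and not post(result):
                raise ValueError(f"postcondition violated for {func.__name__}")
            return result
        return wrapper
    return decorator

# Hypothetical service operation: a lookup that must receive a non-empty name
# and must return a non-empty endpoint string.
@contract(pre=lambda name: bool(name), post=lambda endpoint: bool(endpoint))
def discover_service(name: str) -> str:
    return {"printer": "tcp://192.168.0.7:9100"}.get(name, "")

print(discover_service("printer"))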
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and resulting land use patterns. An essential methodology to study and quantify such interactions is provided by the adoption of land-use models. By the application of land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to also determine the relevance of driving forces. Modeling land use and land use changes has a long-standing tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, which are driven on the one hand by increasing computing power and on the other hand by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, will be introduced and discussed. Particular features of SITE are the notably extended capability to integrate models and the strict separation of application and implementation. These features enable efficient development, testing and usage of integrated land-use models. On its system side, SITE provides generic data structures (grid, grid cells, attributes etc.) and takes over the responsibility for their administration. By means of a scripting language (Python) that has been extended by language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or by usage of a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, like model calibration, model tests and analysis support of simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period of 1981 to 2002. Analogous to that, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area could mainly be characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal or at least adequate solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function typically is a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure of merit map comparison measure as objective function. The time period for the calibration ranged from 1981 to 2002. For this period, respective reference land-use maps were compiled. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model, describing the dependence of bee pollination and resulting coffee fruit set on the distance to the closest natural forest, was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
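The abstract names the figure of merit as the map-comparison objective used during calibration. The sketch below is a generic illustration of that measure for categorical land-use grids (my own implementation of the standard definition, hits over hits plus misses plus false alarms plus wrong-category change, not SITE's code), with tiny hypothetical maps:

# Generic sketch of the figure-of-merit map comparison measure (not SITE's code).
import numpy as np

def figure_of_merit(initial, reference, simulated):
    """All inputs are 2-D arrays of categorical land-use codes."""
    observed_change = reference != initial
    simulated_change = simulated != initial
    hits = np.sum(observed_change & simulated_change & (reference == simulated))
    misses = np.sum(observed_change & ~simulated_change)
    false_alarms = np.sum(~observed_change & simulated_change)
    wrong_change = np.sum(observed_change & simulated_change & (reference != simulated))
    denom = hits + misses + false_alarms + wrong_change
    return hits / denom if denom else 1.0

# Tiny hypothetical 3x3 maps with land-use codes (1 = forest, 2 = agriculture)
initial   = np.array([[1, 1, 1], [1, 1, 1], [1, 2, 2]])
reference = np.array([[1, 1, 2], [1, 2, 2], [1, 2, 2]])
simulated = np.array([[1, 1, 2], [2, 1, 2], [1, 2, 2]])
print(round(figure_of_merit(initial, reference, simulated), 3))   # 0.5 here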
Abstract:
Organic agriculture in Uganda is developing at a fast pace, yet despite this trend Uganda is still unable to produce enough fresh and dried organic fruits, mainly pineapple, to meet the exporters' demand. The current research investigated the strategies of farmers at the production level by assessing the pros and cons of fruit growing, organic agriculture and fruit drying, in order to understand the underlying causal factors for the low production of organic dried fruits in a major fruit-producing district of Uganda. The study was carried out in two separate and distinct areas: one which only produces and exports fresh organic pineapple, and the other which exports dried fruits (mainly pineapple and papaya). About 10% of the farmers in the two study areas were surveyed using questionnaires, which were followed by semi-structured interviews and participatory rural appraisal activities with various types of farmers in order to understand their different decisions and strategies. In the two study areas, 82% and 74% of farmers grew fruit because it gave better economic returns, and for 77% and 90% respectively, the reason for growing fruit was the ease of selling it compared to other crops. All the farmers relied on coffee husks for growing organic pineapples; however, 50% of the farmers wanted to grow pineapples (either organic or conventional) but could not afford to buy coffee husks. Fruit drying was mainly a strategy to utilize cheap fruits during harvesting seasons for value addition. 71% and 42% of farmers in the two study areas wanted to dry fruits, but it was beyond their economic capacity to buy the driers. Farmers' decisions on whether to grow fruits or cereals, to practise organic or conventional agriculture, and to sell the fruit fresh or dried depended mainly on the economics, knowledge and resource availability of each type of practice. It is concluded that the main barrier to an increase in the production of organic dried fruits is at the processing level, namely the limited capacity for investment in drying facilities.
Abstract:
In East Africa, Uganda is one of the major producers of organic pineapples for export. These pineapples are mainly produced in central Uganda and have to meet stringent quality standards before they can be allowed onto international markets. These quality standards may put considerable strain on farmers and may not be wholly representative of their own interpretation of quality. The aim of this paper is therefore to determine Ugandan organic pineapple farmers' quality perception, the activities they carry out in order to attain that quality, and the challenges (production, postharvest and marketing) they face in doing so. Qualitative semi-structured interviews were carried out among 28 organic pineapple farmers in Kayunga district, central Uganda. Findings suggest that the quality of organic pineapples is perceived mainly in terms of product attributes, particularly appearance, followed by food security provision. Certification plays a minor role in what farmers describe as organic quality. High production input costs (labour and coffee husks), coupled with a stagnant premium, are some of the major challenges faced by farmers in attaining organic quality. The paper argues that there are currently concealed negative food security effects embedded in these pineapple schemes. It is recommended that the National Organic Agricultural Movement of Uganda (NOGAMU) work with all relevant stakeholders to have the farmer premium price raised and an official organic policy enacted.