9 results for Appropriate Selection Processes Are Available For Choosing Hospitality Texts

at Universitätsbibliothek Kassel, Universität Kassel, Germany


Relevance: 100.00%

Abstract:

This dissertation deals with the introduction of complex software systems which, consisting of a combination of parameterized standard software paired with custom software components that secure competitive advantage, no longer constitute software engineering projects in the classical sense but instead require a strategy-oriented design of business processes and their implementation in software systems. Of particular importance here is the problem of adequately balancing a TCO-optimizing introduction against full support of the company's critical success factors. The use of integrated standard business software, with its potential for reducing TCO but also the risk of losing unique selling points in the market through tendencies toward uniformity, is an essential problem to be solved in introduction projects in order to avoid suboptimal outcomes. The use of process models that are often oriented toward classical software development projects, or that represent simplified phase models for project management, results in a lack of situational adequacy in the detailed situations of the subprojects of a complex introduction project. The generic process model developed in this thesis for the strategy-oriented and participatory introduction of complex software systems in the business application domain turns a software introduction project into a strategic element that strengthens the company's competitive position, owing to the specifically elaborated approaches to the strategy-oriented introduction and development of such systems and to the situation-adequate process strategies within the organization of subprojects.
The considerations discussed in the dissertation favor an approach that implies a close fusion of the organizational optimization project with the software implementation process. A key result of the work is a prioritization of business processes that aims, on the one hand, to avoid an organizational suboptimum for processes with high competitive priority and, on the other hand, to carry out the organizational design and system implementation process quickly and with minimal resources. In addition, excluding further processes from the introduction initially yields a productive system that covers the company's essential needs but that can be extended in later project steps into a system with comprehensive functionality. This makes it possible to meet the strategic requirements of a modern information system, which must consistently support a company's critical success factors, while at the same time conducting a project that conserves resources as far as possible by exploiting the cost-reduction potential of a standard solution. Another essential aspect is situation-adequate model instantiation, i.e. the project-specific adaptation of the process model and the situation-adequate choice of approaches in subprojects, thereby exploiting the advantages of the various process strategies in concrete project management. The need to develop a project organization for prototyping-oriented approaches is also taken into account. The need for companies to distinguish themselves in the market with strong differentiation potential on the one hand, and to pursue cost optimization in the face of constantly shrinking margins on the other, suggests that the developed model will remain successful in the future.
Added to this is the trend toward best-of-breed approaches and component-based systems in software selection, which will make a distinctly differentiated project approach increasingly necessary. The prototyping approaches integrated into the developed model address the need for user integration, which will continue to gain importance.

Relevance: 100.00%

Abstract:

During recent years, quantum information processing and the study of N-qubit quantum systems have attracted a lot of interest, both in theory and experiment. Apart from the promise of performing efficient quantum information protocols, such as quantum key distribution, teleportation or quantum computation, these investigations have also revealed a great deal of difficulties which still need to be resolved in practice. Quantum information protocols rely on the application of unitary and non-unitary quantum operations that act on a given set of quantum mechanical two-state systems (qubits) to form (entangled) states, in which the information is encoded. The overall system of qubits is often referred to as a quantum register. Today the entanglement in a quantum register is known as the key resource for many protocols of quantum computation and quantum information theory. However, despite the successful demonstration of several protocols, such as teleportation or quantum key distribution, many questions remain open about how entanglement affects the efficiency of quantum algorithms or how it can be protected against noisy environments. To facilitate the simulation of such N-qubit quantum systems and the analysis of their entanglement properties, we have developed the Feynman program. The program package provides all necessary tools to define and to deal with quantum registers, quantum gates and quantum operations. Using an interactive and easily extendible design within the framework of the computer algebra system Maple, the Feynman program is a powerful toolbox not only for teaching the basic and more advanced concepts of quantum information but also for studying their physical realization in the future. To this end, the Feynman program implements a selection of algebraic separability criteria for bipartite and multipartite mixed states as well as the most frequently used entanglement measures from the literature.
Additionally, the program supports work with quantum operations and their associated (Jamiolkowski) dual states. Based on the implementation of several popular decoherence models, we provide tools especially for the quantitative analysis of quantum operations. As an application of the developed tools, we further present two case studies in which the entanglement arising in two atomic processes is investigated. In particular, we have studied the change of the electron-ion spin entanglement in atomic photoionization and the photon-photon polarization entanglement in the two-photon decay of hydrogen. The results show that both processes are, in principle, suitable for the creation and control of entanglement. Apart from process-specific parameters like the initial atom polarization, it is mainly the process geometry which offers a simple and effective instrument to adjust the final-state entanglement. Finally, for the case of the two-photon decay of hydrogenlike systems, we study the difference between nonlocal quantum correlations, as given by the violation of the Bell inequality, and the concurrence as a true entanglement measure.
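The concurrence referred to in the two-photon decay study is Wootters' measure for two-qubit states. A minimal sketch of the computation follows, written in Python with NumPy for illustration only (the Feynman program itself is a Maple package):

```python
import numpy as np

def concurrence(rho):
    """Wootters concurrence of a two-qubit density matrix rho (4x4)."""
    sy = np.array([[0, -1j], [1j, 0]])
    yy = np.kron(sy, sy)                      # spin-flip operator sigma_y (x) sigma_y
    rho_tilde = yy @ rho.conj() @ yy          # spin-flipped state
    lam = np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde).real))
    lam = np.sort(lam)[::-1]                  # decreasing order
    return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())
print(concurrence(rho_bell))  # 1 for a maximally entangled state, 0 for product states
```

The concurrence ranges from 0 (separable) to 1 (maximally entangled), which is what makes it a convenient yardstick against the Bell-inequality violation discussed above.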

Relevance: 100.00%

Abstract:

Web services from different partners can be combined into applications that realize a more complex business goal. Such applications, built as Web service compositions, define how interactions between Web services take place in order to implement the business logic. Web service compositions not only have to provide the desired functionality but also have to comply with certain Quality of Service (QoS) levels. Maximizing the users' satisfaction, also reflected as Quality of Experience (QoE), is a primary goal to be achieved in a Service-Oriented Architecture (SOA). Unfortunately, in a dynamic environment like SOA, unforeseen situations might appear, such as services not being available or not responding within the desired time frame. In such situations, appropriate actions need to be triggered in order to avoid the violation of QoS and QoE constraints. In this thesis, solutions are developed to manage Web services and Web service compositions with regard to QoS and QoE requirements. The Business Process Rules Language (BPRules) was developed to manage Web service compositions when undesired QoS or QoE values are detected. BPRules provides a rich set of management actions that may be triggered for controlling the service composition and for improving its quality behavior. Regarding the quality properties, BPRules distinguishes between the QoS values as promised by the service providers, the QoE values assigned by end-users, the monitored QoS as measured by our BPR framework, and the predicted QoS and QoE values. BPRules facilitates the specification of certain user groups characterized by different context properties and allows a personalized, context-aware service selection tailored to the specified user groups. In a service market where a multitude of services with the same functionality but different quality values are available, the right services need to be selected for realizing the service composition.
We developed new and efficient heuristic algorithms that are applied to choose high-quality services for the composition. BPRules offers the possibility to integrate multiple service selection algorithms. The selection algorithms are also applicable to non-linear objective functions and constraints. The BPR framework includes new approaches for context-aware service selection and quality property prediction. We consider the location of users and services as a context dimension for the prediction of response time and throughput. The BPR framework combines all new features and contributions into a comprehensive management solution. Furthermore, it facilitates flexible monitoring of QoS properties without requiring modification of the service composition description. We show how the different modules of the BPR framework work together to execute the management rules. Our evaluation shows that the selection algorithms outperform a genetic algorithm from related research, and it reveals how context data can be used for a personalized prediction of response time and throughput.
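The abstract does not spell out the thesis's own heuristics, but the underlying selection problem can be sketched generically: for each abstract task, pick one candidate service so that total quality is high while an end-to-end response-time budget holds. The following greedy-with-repair sketch uses hypothetical data and is not the BPR algorithm:

```python
# Each task has candidate services given as (response_time_s, quality_score).
# Greedy sketch: start from the best-quality choice per task, then repair
# budget violations by swapping to faster candidates where the quality
# lost per second saved is smallest.
def select(candidates, rt_budget):
    choice = [max(c, key=lambda s: s[1]) for c in candidates]  # best quality first
    while sum(s[0] for s in choice) > rt_budget:
        best = None
        for i, cands in enumerate(candidates):
            for s in cands:
                saved = choice[i][0] - s[0]
                if saved > 0:                               # strictly faster alternative
                    loss = (choice[i][1] - s[1]) / saved    # quality lost per second saved
                    if best is None or loss < best[0]:
                        best = (loss, i, s)
        if best is None:
            return None                                     # budget infeasible
        choice[best[1]] = best[2]
    return choice

tasks = [[(2.0, 0.90), (1.0, 0.70)],
         [(3.0, 0.95), (1.5, 0.80)]]
print(select(tasks, rt_budget=3.0))   # [(1.0, 0.7), (1.5, 0.8)]
```

Exact optimization of such compositions is NP-hard in general, which is why heuristics like those evaluated in the thesis matter in practice.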

Relevance: 100.00%

Abstract:

This study focuses on multiple linear regression models relating six climate indices (temperature-humidity index THI, environmental stress index ESI, equivalent temperature index ETI, heat load index HLI, modified HLI (HLI new), and respiratory rate predictor RRP) to three main components of cow's milk (yield, fat, and protein) for cows in Iran. The least absolute shrinkage and selection operator (LASSO) and the Akaike information criterion (AIC) are applied to select the best model for the milk predictands with the smallest number of climate predictors. Uncertainty is estimated by bootstrapping through resampling, and cross-validation is used to avoid over-fitting. Climatic parameters are calculated from the NASA-MERRA global atmospheric reanalysis. Milk data for the months April to September, 2002 to 2010, are used. The best linear regression models are found in spring between milk yield as the predictand and THI, ESI, ETI, HLI, and RRP as predictors, with p-value < 0.001 and R2 values of 0.50 and 0.49, respectively. In summer, milk yield with the independent variables THI, ETI, and ESI shows the strongest relation (p-value < 0.001, R2 = 0.69). For fat and protein the results are only marginal. The method is suggested for studies of the impact of climate variability/change on agriculture and food science when only short time series or data with large uncertainty are available.
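The LASSO-plus-bootstrap workflow described above can be sketched with synthetic data. The five predictors stand in for climate indices such as THI and ESI (the data, coefficients, and fixed penalty below are illustrative; the study selects the penalty via cross-validation):

```python
import numpy as np

def lasso_cd(X, y, alpha, n_iter=200):
    """LASSO via cyclic coordinate descent with soft-thresholding.
    Minimizes (1/2n)||y - Xb||^2 + alpha * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]     # partial residual
            rho = X[:, j] @ r / n
            z = (X[:, j] @ X[:, j]) / n
            beta[j] = np.sign(rho) * max(abs(rho) - alpha, 0.0) / z
    return beta

rng = np.random.default_rng(0)
n = 120
X = rng.normal(size=(n, 5))                 # stand-ins for THI, ESI, ETI, HLI, RRP
y = -2.0 * X[:, 0] - 1.2 * X[:, 1] + rng.normal(scale=1.0, size=n)

beta = lasso_cd(X, y, alpha=0.1)            # uninformative predictors shrink to ~0

# Bootstrap resampling gives an uncertainty band for each coefficient
boot = np.array([lasso_cd(X[idx], y[idx], alpha=0.1)
                 for idx in (rng.integers(0, n, n) for _ in range(100))])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
```

The shrinkage is what implements variable selection here: coefficients of predictors that carry no signal are driven exactly (or nearly) to zero, leaving the smallest useful set of climate indices.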

Relevance: 100.00%

Abstract:

Agriculture plays a central role in the Earth system. It contributes to the greenhouse effect through emissions of CO2, CH4 and N2O, can cause soil degradation and eutrophication, can alter regional water cycles, and will itself be strongly affected by climate change. Since all these processes are closely linked through the underlying nutrient and water fluxes, they should be considered within a consistent modeling approach. Until recently, however, a lack of data and insufficient process understanding have prevented this on the global scale. This thesis presents the first version of such a consistent global modeling approach, with emphasis on the simulation of agricultural yields and the resulting N2O emissions. The reason for this emphasis is that correctly representing plant growth is an essential prerequisite for simulating all other processes. Furthermore, actual and potential agricultural yields are important driving forces of land-use change and will be strongly affected by climate change. The second emphasis is the estimation of agricultural N2O emissions, since no process-based N2O model had previously been applied on the global scale. The existing agroecosystem model Daycent was chosen as the basis for the global modeling. Besides creating the simulation environment, the required global datasets for soil parameters, climate and agricultural management were first compiled. Since no global database of planting dates exists so far, and since these will shift with climate change, a routine for computing planting dates was developed. The results agree well with the FAO crop calendars that are available for some crops and countries.
The Daycent model was then parameterized and calibrated for yield simulations of wheat, rice, maize, soybean, millet, pulses, potato, cassava and cotton. The simulation results show that Daycent correctly captures the most important climate, soil and management effects on yield formation. Computed country averages agree well with FAO data (R2 = 0.66 for wheat, rice and maize; R2 = 0.32 for soybean), and spatial yield patterns largely correspond to the observed distribution of crops and to subnational statistics. The modeling of agricultural N2O emissions with Daycent was preceded by a statistical analysis of N2O and NO emission measurements from natural and agricultural ecosystems. The parameters identified as significant for N2O (fertilization rate, soil carbon content, soil pH, texture, crop type, fertilizer type) and for NO (fertilization rate, soil nitrogen content, climate) largely agree with the results of an earlier analysis. For emissions from soils under natural vegetation, for which no such statistical analysis existed before, soil carbon content, soil pH, bulk density, drainage and vegetation type have a significant influence on N2O emissions, while NO emissions depend significantly on soil carbon content and vegetation type. Based on the statistical models derived from these parameters, global emissions from arable soils amount to 3.3 Tg N/yr for N2O and 1.4 Tg N/yr for NO. Such statistical models are useful for computing estimates and uncertainty ranges of N2O and NO emissions from a large number of measurements. The dynamics of soil nitrogen, influenced in particular by plant growth, climate change and land-use change, can however only be captured by applying process-oriented models.
For modeling N2O emissions with the Daycent model, its trace gas module was first extended by a more detailed calculation of nitrification and denitrification and by accounting for freeze-thaw emissions. This revised model version was then tested against N2O emission measurements under various climates and crops. Both the dynamics and the totals of the N2O emissions are reproduced satisfactorily, with model efficiencies for monthly means between 0.1 and 0.66 for most sites. Based on the revised model version, N2O emissions were computed for the previously parameterized crops. Emission rates and crop-specific differences largely agree with values from the literature. Fertilizer-induced emissions, currently estimated by the IPCC at 1.25 +/- 1% of the applied fertilizer amount, range from 0.77% (rice) to 2.76% (maize). The sum of the computed emissions from agricultural soils amounts to 2.1 Tg N2O-N/yr for the mid-1990s, in agreement with estimates from other studies.
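The fertilizer-induced emission factor quoted above (the IPCC default of 1.25% of applied N) is defined as the emission difference between fertilized and unfertilized plots relative to the amount of N applied. A worked check with illustrative numbers (not measured values from the thesis):

```python
# Fertilizer-induced emission factor: the N2O-N emitted beyond an
# unfertilized control, as a share of the fertilizer N applied.
def emission_factor(e_fertilized, e_control, n_applied):
    """All quantities in kg N/ha per year; returns a fraction."""
    return (e_fertilized - e_control) / n_applied

ef = emission_factor(e_fertilized=2.10, e_control=0.85, n_applied=100.0)
print(f"{ef:.2%}")   # 1.25%, matching the IPCC default quoted above
```

The crop-specific range reported in the thesis (0.77% for rice to 2.76% for maize) shows how far individual systems can deviate from this single default value.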

Relevance: 100.00%

Abstract:

Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use change has a long tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable the efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grid, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE.
The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal or at least adequate solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map-comparison algorithm capable of comparing a simulation result to a reference map.
Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, for which the respective reference land-use maps were compiled. It could be shown that an efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
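The figure of merit used as the calibration objective is a standard map-comparison measure in land-change modeling: the number of correctly simulated change cells divided by all cells where change was observed or simulated. A toy sketch of the usual definition (not SITE code; maps here are tiny illustrative arrays):

```python
import numpy as np

def figure_of_merit(initial, reference, simulated):
    """Figure of merit for land-use change maps:
    hits / (hits + misses + false alarms + wrong hits),
    where 'change' means any deviation of a cell from the initial map."""
    obs = reference != initial                  # observed change
    sim = simulated != initial                  # simulated change
    hits = np.sum(obs & sim & (reference == simulated))
    wrong = np.sum(obs & sim & (reference != simulated))   # change to wrong class
    misses = np.sum(obs & ~sim)
    false_alarms = np.sum(~obs & sim)
    return hits / (hits + misses + false_alarms + wrong)

initial = np.zeros((2, 2), dtype=int)
reference = np.array([[1, 0], [1, 0]])          # observed end-year map (toy)
simulated = np.array([[1, 1], [0, 0]])          # simulated end-year map (toy)
print(figure_of_merit(initial, reference, simulated))  # 1 hit, 1 miss, 1 false alarm
```

Because persistence cells are excluded from the denominator, the measure rewards getting the *changes* right, which is exactly what a land-use change calibration should optimize.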

Relevance: 100.00%

Abstract:

Increasing the organic matter (OM) content of the soil is a main goal in arable soil management. The adoption of tillage systems with reduced tillage depth and/or frequency (reduced tillage), or of no-tillage, has been found to increase the concentration of soil OM compared to conventional tillage (CT; ploughing to 20-30 cm). However, the underlying processes are not yet clear and are discussed controversially. So far, few investigations have been conducted on tillage systems with a shallow tillage depth (minimum tillage = MT; maximum tillage depth of 10 cm). A better understanding of the interactions between MT implementation and changes in OM transformation in soils is essential in order to evaluate the possible contribution of MT to a sustainable management of arable soils. The objectives of the present thesis were (i) to compare OM concentrations, microbial biomass, water-stable aggregates, and particulate OM (POM) between CT and MT soils, (ii) to estimate the temporal variability of water-stable aggregate size classes occurring in the field and the dynamics of macroaggregate (>250 µm) formation and disruption under controlled conditions, (iii) to investigate whether a lower disruption rate or a higher formation rate accounts for the higher occurrence of macroaggregates under MT compared to CT, (iv) to determine which fraction is the major agent for storing the surplus of OM found under MT compared to CT, and (v) to observe the early OM transformation after residue incorporation in the different tillage systems simulated. Two experimental sites (Garte-Süd and Hohes Feld) near Göttingen, Germany, were investigated. The soil type of both sites is a Haplic Luvisol. For about 40 years, both sites have received MT by rotary harrow (to 5-8 cm depth) and CT by plough (to 25 cm depth).
Surface soils (0-5 cm) and subsoils (10-20 cm) from two sampling dates (after fallow and directly after tillage) were investigated for concentrations of organic C (Corg) and total N (N), different water-stable aggregate size classes, different density fractions (for the sampling date after fallow only), microbial biomass, and biochemically stabilized Corg and N (by acid hydrolysis; for the sampling date after tillage only). In addition, two laboratory incubations were performed under controlled conditions: Firstly, MT and CT soils were incubated (28 days at 22°C) as bulk soil and with destroyed macroaggregates in order to estimate the importance of macroaggregates for the physical protection of the very labile OM against mineralization. Secondly, in a microcosm experiment simulating MT and CT systems with soil <250 µm and with 15N and 13C labelled maize straw incorporated to different depths, the mineralization, the formation of new macroaggregates, and the partitioning of the recently added C and N were followed (28 days at 15°C). Forty years of MT regime led to higher concentrations of microbial biomass and of Corg and N compared to CT, especially in the surface soil. After fallow and directly after tillage, a higher proportion of water-stable macroaggregates rich in OM was found in the MT (36% and 66%, respectively) than in the CT (19% and 47%, respectively) surface soils of both sites (data shown are for the site Garte-Süd only). The subsoils followed the same trend. For the sampling date after fallow, no differences in the POM fractions were found, but more OM associated with the mineral fraction was detected in the MT soils. A large temporal variability was observed for the abundance of macroaggregates. In the field and in the microcosm simulations, macroaggregates were found to have a higher formation rate after the incorporation of residues under MT than under CT.
Thus, the lower occurrence of macroaggregates in CT soils cannot be attributed to a higher disruption rate but to a lower formation rate. A higher rate of macroaggregate formation in MT soils may be due to (i) the more concentrated input of residues in the surface soil and/or (ii) a higher abundance of fungal biomass in contrast to CT soils. Overall, water-stable macroaggregates were found to play a key role as the location of storage for the surplus of OM detected under MT compared to CT. In the incubation experiment, macroaggregates were not found to protect the very labile OM against mineralization. However, the surplus of OM detected after tillage in the MT soil was biochemically degradable. MT simulations in the microcosm experiment showed a lower specific respiration and a less efficient translocation of recently added residues than the CT simulations. Differences in the early processes of OM translocation between CT and MT simulations were attributed to a higher residue-to-soil ratio and to a higher proportion of fungal biomass in the MT simulations. Overall, MT was found to have several beneficial effects on soil structure and on the storage of OM, especially in the surface soil. Furthermore, it was concluded that the high concentration of residues in the surface soil under MT may alter the processes of storage and decomposition of OM. Further investigations should emphasise analysis of the residue-soil interface and of the effects of the depth of residue incorporation. Moreover, further evidence is needed on differences in the microbial community between CT and MT soils.

Relevance: 100.00%

Abstract:

The markup language XML is used to annotate documents and has established itself as a standard data exchange format. This creates the need not only to store and transfer XML documents as plain text files, but also to persist them in a better structured form, for instance in dedicated XML databases or in relational databases. To this end, relational databases have so far relied on two fundamentally different approaches: XML documents are either stored unchanged as binary or character-string objects, or they are split up so that they can be stored, normalized, in conventional relational tables (so-called "flattening" or "shredding" of the hierarchical structure). This dissertation pursues a new approach that represents a middle way between the existing solutions and takes up the possibilities of the evolved SQL standard. SQL:2003 defines complex structured and collection types (tuples, arrays, lists, sets, multisets) that make it possible to map XML documents onto relational structures in such a way that the hierarchical structure is preserved. This offers two advantages: on the one hand, proven technologies from the field of relational databases remain fully available; on the other hand, the SQL:2003 types preserve the inherent tree structure of XML documents, so that it is not necessary to reassemble them, when needed, through expensive joins over tuples that are usually normalized and distributed across several tables. This thesis first clarifies fundamental questions about suitable, efficient mappings of XML documents onto SQL:2003-compliant data types. Building on this, a suitable, reversible transformation procedure is developed, implemented in a prototype application and analyzed.
In designing the mapping procedure, particular attention is paid to its usability in combination with an existing, mature relational database management system (DBMS). Since support for SQL:2003 in commercial DBMSs is still incomplete, it must be examined to what extent the individual systems are suitable for the mapping procedure to be implemented. It turns out that, among the products considered, the DBMS IBM Informix offers the best support for complex structured and collection types. To better assess the performance of the procedure, the thesis measures the runtime and the required working and database memory of the implementation and evaluates the results.
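The core idea, preserving the document tree instead of shredding it into flat tables, can be illustrated independently of a DBMS. The following Python sketch maps an XML element onto a nested tuple/list structure, analogous in spirit to SQL:2003 row and collection types (illustrative only; the thesis maps onto actual SQL:2003 types inside a relational system):

```python
import xml.etree.ElementTree as ET

def to_nested(elem):
    """Map an element to (tag, attributes, children-or-text), keeping the
    hierarchy intact instead of normalizing it into separate tables."""
    children = [to_nested(c) for c in elem]
    return (elem.tag, dict(elem.attrib), children if children else elem.text)

doc = ET.fromstring('<order id="7"><item qty="2">bolt</item></order>')
print(to_nested(doc))
# ('order', {'id': '7'}, [('item', {'qty': '2'}, 'bolt')])
```

Because the nesting mirrors the document, reading a subtree back requires no joins, which is exactly the advantage the SQL:2003 mapping aims for over the shredding approach.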

Relevance: 100.00%

Abstract:

With China's rapid economic development during the last decades, the national demand for livestock products has quadrupled within the last 20 years. Most of that increase in demand has been met by subsidized industrialized production systems, while millions of smallholders, who still provide the larger share of livestock products in the country, have been neglected. Fostering those systems would help China to reduce its strong urban migration streams, enhance the livelihoods of the poorer rural population and provide environmentally safe livestock products that have a good chance of satisfying customers' demand for ecological food. Despite their importance, China's smallholder livestock keepers have not yet gained appropriate attention from governmental authorities and researchers. However, a profound analysis of those systems is required so that adequate support can lead to better resource utilization and productivity in the sector. To this aim, this pilot study analyzes smallholder livestock production systems in Xishuangbanna, in southern China. The area is bordered by Laos and Myanmar and counts geographically as a tropical region. Its climate is characterized by dry, temperate winters and hot summers with monsoon rains from May to October. While the region is plain at about 500 m above sea level (asl) in the south, outliers of the Himalaya mountains reach into the north of Xishuangbanna, where the highest peak reaches 2400 m asl. Except for one larger city, Jinghong, Xishuangbanna is mainly covered by tropical rainforest, areas under agricultural cultivation, and villages. The major income is generated through inner-Chinese tourism and agricultural production. Intensive rubber plantations are distinctive for the lowland plains, while small-scale traditional farms are scattered in the montane regions.
In order to determine the current state and possible future chances of smallholder livestock production in that region, this study analyzed the current status of the smallholder livestock sector in the Naban River National Nature Reserve (NRNNR), an area which is largely representative of the whole prefecture. It covers about 50 square kilometers and reaches from 470 up to 2400 m asl. About 5500 inhabitants of different ethnic origin live in 24 villages. All data have been collected between October 2007 and May 2010. Three major objectives have been addressed in the study: 1. classifying existing pig production systems and exploring respective pathways for development; 2. quantifying the performance of pig breeding systems to identify bottlenecks for production; 3. analyzing past and current buffalo utilization to determine the chances and opportunities of buffalo keeping in the future. In order to classify the different pig production systems, a baseline survey (n=204, stratified cluster sampling) was carried out to gather data on livestock species and numbers, management practices, cultivated plant species and field sizes as well as socio-economic characteristics. Sampling included two clusters at village level (altitude, ethnic affiliation), resulting in 13 clusters, in each of which 13-17 farms were interviewed. Categorical Principal Component Analysis (CatPCA) and a two-step clustering algorithm were applied to identify determining farm characteristics and assort the recorded households into classes of livestock production types. The variables keep_sow_yes/no, TLU_pig, TLU_buffalo, size_of_corn_fields, altitude_class, size_of_tea_plantation and size_of_rubber_field were found to be the major determinants for the characterization of the recorded farms.
All farms practice extensive or semi-intensive livestock production; pigs and buffaloes are the predominant livestock species, while chickens and aquaculture are present but play subordinate roles for livelihoods. All pig raisers rely on a single local breed, known as Small Ear Pig (SMEP) in the region. Three major production systems have been identified: livestock-corn based (LB; 41%), rubber based (RB; 39%) and pig based (PB; 20%) systems. RB farms earn high income from rubber and fatten 1.9 ±1.80 pigs per household (HH), often using pig feed purchased at markets. PB farms own similarly sized rubber plantations and raise 4.7 ±2.77 pigs per HH, with fodder mainly being cultivated and collected in the forest. LB farms grow corn, rice and tea and keep 4.6 ±3.32 pigs per HH, also fed with collected and cultivated fodder. Only 29% of all pigs were marketed (LB: 20%; RB: 42%; PB: 25%); average annual mortality was 4.0 ±4.52 pigs per farm (LB: 4.6 ±3.68; RB: 1.9 ±2.14; PB: 7.1 ±10.82). Pig feed mainly consists of banana pseudo stem, corn and rice husks and is prepared in batches about two to three times per week. Such fodder may be sufficient in energy content but lacks adequate protein. Pigs therefore suffer from malnutrition, which becomes most critical in the time before the harvest season around October. Farmers reported high occurrences of gastrointestinal parasites in carcasses, and pig stables were often wet and filled with manure. Deficits in nutritional and hygienic management are the major limits to development and should be the first issues addressed to improve productivity. SME pork was found to be known and preferred by local customers in town and by richer lowland farmers. However, high prices and poor availability of SME pork at local wet markets limited purchases.
If the major management constraints are overcome, pig breeders (PB and LB farms) could increase the share of pigs marketed in town and provide fatteners to richer RB farmers. RB farmers are interested in fattening pigs for home consumption but show no motivation for commercial pig raising. To determine the productivity of input factors in pig production, reproductive performance, feed quality and quantity as well as the weight development of pigs under current management were recorded. The data collection included a progeny history survey covering 184 sows and 437 farrows, bi-weekly weighing of 114 pigs over a 16-month time span on 21 farms (10 LB and 11 PB) as well as daily recording of the quality and quantity of feed given to a defined number of pigs on the same 21 farms. Feed samples of all recorded ingredients were analyzed for their respective nutrient content. Since no literature values on the digestibility of banana pseudo stem – a major ingredient of traditional pig feed in NRNNR – were found, a cross-sectional digestibility trial with 2x4 pigs was conducted at a station in the research area. With the aid of the PRY Herd Life Model, all data were used to determine the systems' current (status quo = SQ) output and the productivity of the input factor "feed" in terms of saleable live weight per kg DM feed intake and monetary value of output per kg DM feed intake. Two improvement scenarios were simulated, assuming 1) that farmers adopt a culling management that generates the highest output per unit of input (Scenario 1; SC I) and 2) that, through improved feeding, selected parameters of reproduction improve by 30% (SC II). Daily weight gain averaged 55 ± 56 g per day between day 200 and 600. The average energy content of the traditional feed mix was 14.92 MJ ME. Age at first farrowing averaged 14.5 ± 4.34 months; the subsequent inter-farrowing interval was 11.4 ± 2.73 months.
Litter size was 5.8 piglets and weaning age was 4.3 ± 0.99 months. 18% of piglets died before weaning. Simulating pig production at its actual status showed that the monetary return on inputs (ROI) is negative (1:0.67) but improves (1:1.2) when culling management is optimized so that the highest output is gained per unit of feed input. If, in addition, better feeding, controlled mating and better resale prices at fixed dates were simulated, ROI further increased to 1:2.45, 1:2.69, 1:2.7 and 1:3.15 for the four respective grower groups. These findings show the potential of pork production if basic measures of improvement are applied. Future exploration of the environment, including climate, market season and culture, is required before implementing the recommended measures, in order to ensure the sustainable development of a more effective and resource-conserving pork production. The two studies have shown that the production of local SME pigs plays an important role on traditional farms in NRNNR but that basic constraints are limiting their productivity. However, relatively simple measures suffice for a notable improvement. There is also demand for more SME pork on local markets, and once the basic constraints are overcome, pig farmers could turn into more commercial producers and supply pork to local markets. In this way, environmentally safe meat can be offered to sensitive consumers while farmers increase their income and lower the risk of external shocks through a more diverse income-generating strategy. Buffaloes were found to be the second most important livestock species on NRNNR farms. While they have been a core resource of mixed smallholder farms in the past, the expansion of rubber tree plantations and agricultural mechanization have led to decreased swamp buffalo numbers today.
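The returns-on-input ratios quoted above (e.g. 1:0.67) relate the monetary value of saleable live weight to the monetary value of feed input. A minimal sketch of that arithmetic, with purely hypothetical prices and quantities (none of these figures are taken from the study):

```python
def roi_ratio(feed_dm_kg: float, feed_price_per_kg: float,
              live_weight_kg: float, price_per_kg_lw: float) -> float:
    """Monetary value of output per unit of monetary feed input (the x in 1:x)."""
    input_value = feed_dm_kg * feed_price_per_kg     # cost of dry-matter feed
    output_value = live_weight_kg * price_per_kg_lw  # value of saleable live weight
    return output_value / input_value

# Hypothetical illustration only: 1000 kg DM feed at 0.30 per kg,
# producing 60 kg saleable live weight sold at 3.35 per kg.
x = roi_ratio(feed_dm_kg=1000, feed_price_per_kg=0.30,
              live_weight_kg=60, price_per_kg_lw=3.35)
print(f"ROI = 1:{x:.2f}")  # 201 / 300 -> prints "ROI = 1:0.67"
```

A ratio below 1 means the feed input costs more than the live weight it produces; the simulated culling and feeding improvements raise x above 1.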
The third study seeks to predict the future utilization of buffaloes on different farm types in NRNNR by analyzing the dynamics of the buffalo population and land use changes over time, and by weighing the labor required for keeping buffaloes against the traction power that can be utilized for field preparation. The use of buffaloes for field work and the recent development of the regional buffalo population were analyzed through interviews with 184 farmers in 2007/2008 and discussions with 62 buffalo keepers in 2009. While pig based farms (PB; n=37) have abandoned buffalo keeping, 11% of the rubber based farms (RB; n=71) and 100% of the livestock-corn based farms (LB; n=76) kept buffaloes in 2008. Herd size was 2.5 ±1.80 buffaloes (n=84) in early 2008 and 2.2 ±1.69 (n=62) in 2009. Field work on the farm's own land was the main reason for keeping buffaloes (87.3%), but lending work buffaloes to neighbors (79.0%) was also important. Other purposes were the transport of goods (16.1%), buffalo trade (11.3%) and meat consumption (6.4%). Buffalo care required 6.2 ±3.00 working hours daily, while the annual working time of a buffalo was 294 ±216.6 hours. The area ploughed with buffaloes remained constant over the past 10 years despite an expansion of the land cropped per farm. Further rapid replacement of buffaloes by tractors is expected in the near future. While tractors drastically improve the work economy, buffaloes can still provide a cheap work force and serve as a buffer against economic shocks on poorer farms. Especially poor farms, which lack alternative assets that could quickly be liquidated in times of urgent need for cash, should not abandon buffalo keeping. Livestock has been found to be a major part of the small mixed farms in NRNNR. General productivity was low in both analyzed species, buffaloes and pigs.
The productivity of pigs can be improved through basic adjustments in feeding, reproductive and hygienic management, and with external support pig production could be further commercialized to provide pork and weaners to local markets and fattening farms. Buffalo production is relatively time-intensive and will in the future only be of importance to very poor farms and to farms that cultivate very small terraces on steep slopes; these should be encouraged to continue keeping buffaloes. With such measures, livestock production in NRNNR has a good chance of staying competitive in the future.