11 results for Generic Core Scales
at Universitätsbibliothek Kassel, Universität Kassel, Germany
Abstract:
In this work, we present a generic formula for the polynomial solution families of the well-known differential equation of hypergeometric type σ(x)y"n(x) + τ(x)y'n(x) - λnyn(x) = 0 and show that all three classical orthogonal polynomial families, as well as three finite orthogonal polynomial families extracted from this equation, can be identified as special cases of this derived polynomial sequence. Some general properties of this sequence are also given.
Abstract:
In a previous paper we have determined a generic formula for the polynomial solution families of the well-known differential equation of hypergeometric type σ(x)y"n(x)+τ(x)y'n(x)-λnyn(x)=0. In this paper, we give another such formula which enables us to present a generic formula for the values of monic classical orthogonal polynomials at their boundary points of definition.
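For reference, the equation both abstracts study can be written out together with the eigenvalue that admits a polynomial solution of degree n; the expression for λn follows by substituting a degree-n polynomial and comparing leading coefficients (σ has degree at most two, τ degree at most one):

```latex
\sigma(x)\,y_n''(x) + \tau(x)\,y_n'(x) - \lambda_n\,y_n(x) = 0,
\qquad
\lambda_n = n\,\tau' + \frac{n(n-1)}{2}\,\sigma''
```

The three classical families (Jacobi, Laguerre, Hermite) correspond to the choices of σ with degree 2, 1 and 0, respectively.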
Abstract:
The process of developing software that takes advantage of multiple processors is commonly referred to as parallel programming. For various reasons, this process is much harder than the sequential case. For decades, parallel programming has been a problem for a small niche only: engineers working on parallelizing mostly numerical applications in High Performance Computing. This has changed with the advent of multi-core processors in mainstream computer architectures. Nowadays, parallel programming is becoming a problem for a much larger group of developers. The main objective of this thesis was to find ways to make parallel programming easier for them. Different aims were identified in order to reach the objective: research the state of the art of parallel programming today, improve the education of software developers about the topic, and provide programmers with powerful abstractions to make their work easier. To reach these aims, several key steps were taken. To start with, a survey was conducted among parallel programmers to find out about the state of the art. More than 250 people participated, yielding results about the parallel programming systems and languages in use, as well as about common problems with these systems. Furthermore, a study was conducted in university classes on parallel programming. It resulted in a list of frequently made mistakes that were analyzed and used to create a programmers' checklist to avoid them in the future. For programmers' education, an online resource was set up to collect experiences and knowledge in the field of parallel programming, called the Parawiki. Another key step in this direction was the creation of the Thinking Parallel weblog, where more than 50,000 readers to date have read essays on the topic. For the third aim (powerful abstractions), it was decided to concentrate on one parallel programming system: OpenMP. Its ease of use and high level of abstraction were the most important reasons for this decision.
Two different research directions were pursued. The first one resulted in a parallel library called AthenaMP. It contains so-called generic components, derived from design patterns for parallel programming. These include functionality to enhance the locks provided by OpenMP, to perform operations on large amounts of data (data-parallel programming), and to enable the implementation of irregular algorithms using task pools. AthenaMP itself serves a triple role: the components are well-documented and can be used directly in programs, it enables developers to study the source code and learn from it, and it is possible for compiler writers to use it as a testing ground for their OpenMP compilers. The second research direction was targeted at changing the OpenMP specification to make the system more powerful. The main contributions here were a proposal to enable thread-cancellation and a proposal to avoid busy waiting. Both were implemented in a research compiler, shown to be useful in example applications, and proposed to the OpenMP Language Committee.
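AthenaMP itself is an OpenMP library and its actual components are not shown in the abstract, but the task-pool pattern it mentions for irregular algorithms (workers consume tasks that may spawn further tasks) can be sketched independently of that API. A minimal illustration in Python, with all names hypothetical:

```python
import queue
import threading

def run_task_pool(initial_tasks, process, num_workers=4):
    """Task-pool sketch: workers repeatedly pull a task from a shared
    queue; processing one task may generate new tasks, which go back
    into the pool (the typical shape of irregular algorithms).
    `process(task)` must return a (result, list_of_new_tasks) pair."""
    tasks = queue.Queue()
    for t in initial_tasks:
        tasks.put(t)
    results = []
    results_lock = threading.Lock()

    def worker():
        while True:
            task = tasks.get()              # blocks until a task arrives
            result, new_tasks = process(task)
            with results_lock:
                results.append(result)
            for t in new_tasks:             # enqueue spawned tasks first,
                tasks.put(t)
            tasks.task_done()               # ...then mark this one done

    for _ in range(num_workers):
        threading.Thread(target=worker, daemon=True).start()

    tasks.join()    # returns once all tasks, including spawned ones, are done
    return results

# Usage: count the integers in a range by recursively splitting the work.
def split_range(r):
    lo, hi = r
    if hi - lo <= 2:
        return (hi - lo, [])                # small chunk: count directly
    mid = (lo + hi) // 2
    return (0, [(lo, mid), (mid, hi)])      # otherwise spawn two subtasks

total = sum(run_task_pool([(0, 100)], split_range))  # 100
```

Putting spawned tasks into the queue before calling `task_done()` is what keeps `tasks.join()` from returning early; a real OpenMP implementation would manage termination with its own synchronization primitives.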
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and resulting land use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and also to determine the relevance of driving forces. Modeling land use and land use changes has a long tradition. On the regional scale in particular, a variety of models for different regions and research questions have been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are the notably extended capability to integrate models and the strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On its system side, SITE provides generic data structures (grid, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE.
The integration of sub-models can be achieved via the scripting language or via a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was placed on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. In addition, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map.
Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure of merit map comparison measure as the objective function. The time period for the calibration ranged from 1981 to 2002. For this period, respective reference land-use maps were compiled. It could be shown that an efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination and resulting coffee fruit set on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
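The figure of merit used as the calibration objective is a standard map-comparison measure for land-change models. The abstract does not say which variant SITE implements, so the sketch below uses one common definition (hits over misses + hits + wrong hits + false alarms, all counted over cells that change relative to the initial map); treat the whole function as illustrative, in the same Python that SITE embeds for its modeling applications:

```python
def figure_of_merit(initial, reference, simulated):
    """Figure-of-merit map comparison under one common definition:
    hits / (misses + hits + wrong hits + false alarms), where the
    categories describe agreement on *change* relative to the initial
    map. Maps are flat sequences of categorical land-use codes.
    Which exact variant SITE uses is not stated in the abstract."""
    misses = hits = wrong = false_alarms = 0
    for ini, ref, sim in zip(initial, reference, simulated):
        observed_change = ref != ini
        simulated_change = sim != ini
        if observed_change and not simulated_change:
            misses += 1                  # change happened, none simulated
        elif observed_change and sim == ref:
            hits += 1                    # change simulated correctly
        elif observed_change:
            wrong += 1                   # change simulated, wrong category
        elif simulated_change:
            false_alarms += 1            # change simulated, none observed
    denom = misses + hits + wrong + false_alarms
    return hits / denom if denom else 1.0

# Toy maps (one hit, one miss, one false alarm, one persistent cell):
initial_map   = [0, 0, 0, 0]
reference_map = [1, 1, 0, 0]
simulated_map = [1, 0, 1, 0]
fom = figure_of_merit(initial_map, reference_map, simulated_map)  # 1/3
```

A calibration run as described would then have the genetic algorithm maximize this value over the selected model parameters.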
Abstract:
Formal Concept Analysis makes it possible to derive conceptual hierarchies from data tables. It is applied in various domains, e.g., data analysis, information retrieval, and knowledge discovery in databases. In order to deal with increasing sizes of the data tables (and to allow more complex data structures than just binary attributes), conceptual scales have been developed. They are considered as metadata which structure the data conceptually. But in large applications, the number of conceptual scales increases as well. Techniques are needed which also support the user's navigation on this meta-level of conceptual scales. In this paper, we attack this problem by extending the set of scales with hierarchically ordered higher-level scales and by introducing a visualization technique called nested scaling. We extend the two-level architecture of Formal Concept Analysis (the data table plus one level of conceptual scales) to a many-level architecture with a cascading system of conceptual scales. The approach also allows the representation techniques of Formal Concept Analysis to be used for the visualization of thesauri and ontologies.
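The basic construction this abstract builds on, deriving formal concepts from a binary data table, can be sketched directly. The following brute-force version (exponential in the number of objects, so purely for illustration; efficient algorithms such as NextClosure exist) computes all formal concepts of a small context:

```python
from itertools import chain, combinations

def concepts(objects, attributes, incidence):
    """All formal concepts (extent, intent) of a formal context, by
    brute force: close every subset of objects under the two derivation
    operators and collect the distinct fixed pairs. `incidence` is a
    set of (object, attribute) pairs, i.e. the crosses of the table."""
    def common_attrs(objs):
        return frozenset(a for a in attributes
                         if all((o, a) in incidence for o in objs))
    def common_objs(attrs):
        return frozenset(o for o in objects
                         if all((o, a) in incidence for a in attrs))
    found = set()
    for objs in chain.from_iterable(combinations(objects, r)
                                    for r in range(len(objects) + 1)):
        intent = common_attrs(objs)        # derive shared attributes...
        found.add((common_objs(intent), intent))  # ...then close back
    return found

# Toy context: object 1 has attributes a and b, object 2 only b.
lattice = concepts([1, 2], ['a', 'b'],
                   {(1, 'a'), (1, 'b'), (2, 'b')})
# Two concepts: ({1}, {a, b}) and ({1, 2}, {b}).
```

Ordered by inclusion of extents, these pairs form the concept lattice; a conceptual scale in the sense of the abstract is essentially such a context attached to a many-valued attribute of the original data table.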
Abstract:
To study the behaviour of beam-to-column composite connections, more sophisticated finite element models are required, since the component model has some severe limitations. In this research, a generic finite element model for composite beam-to-column joints with welded connections is developed using current state-of-the-art local modelling. Applying a mechanically consistent scaling method, it can provide the constitutive relationship for a plane rectangular macro element with beam-type boundaries. This macro element, which preserves local behaviour and allows for the transfer of five independent states between local and global models, can then be implemented in high-accuracy frame analysis with the possibility of limit state checks. So that the macro element for the scaling method can be used in a practical manner, a generic geometry program, proposed in this study as a new idea, is also developed for this finite element model. With generic programming, a set of global geometric variables can be input to generate a specific instance of the connection without much effort. The proposed finite element model generated by this generic programming is validated against test results from the University of Kaiserslautern. Finally, two illustrative examples for applying this macro element approach are presented. The first example demonstrates how to obtain the constitutive relationships of the macro element. With certain assumptions for a typical composite frame, the constitutive relationships can be represented by bilinear laws for the macro bending and shear states that are then coupled by a two-dimensional surface law with yield and failure surfaces. The second example presents, as a practical application of this numerical model, a scaling concept that combines sophisticated local models with a frame analysis using the macro element approach.
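The abstract represents the macro bending and shear states by bilinear laws. The actual parameters come from the local finite element model; purely to illustrate the shape of such a law, here is a symmetric bilinear relation with hypothetical stiffness and yield values (none of the numbers are from the thesis):

```python
def bilinear_law(deformation, k_elastic, d_yield, k_hardening):
    """Symmetric bilinear constitutive law: the elastic stiffness
    k_elastic applies up to the yield deformation d_yield, and a
    reduced hardening stiffness k_hardening applies beyond it.
    All parameter values used below are illustrative placeholders."""
    d = abs(deformation)
    if d <= d_yield:
        force = k_elastic * d
    else:
        force = k_elastic * d_yield + k_hardening * (d - d_yield)
    return force if deformation >= 0 else -force

# Hypothetical moment-rotation curve for the macro bending state:
k_el, phi_y, k_h = 50_000.0, 0.01, 5_000.0    # kNm/rad, rad, kNm/rad
m_elastic = bilinear_law(0.005, k_el, phi_y, k_h)   # on the elastic branch
m_plastic = bilinear_law(0.020, k_el, phi_y, k_h)   # on the hardening branch
```

In the approach described, two such laws (bending and shear) would additionally be coupled through a two-dimensional surface law with yield and failure surfaces.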
Abstract:
The objective of this study was to develop an internet-based seminar framework applicable to landscape architecture education. This process was accompanied by various aims. The basic expectation was to keep the main characteristics of landscape architecture education also in the online format. On top of that, four further objectives were anticipated: (1) training of competences for virtual team work, (2) fostering intercultural competence, (3) creation of equal opportunities for education through internet-based open access and (4) synergy effects and learning processes across institutional boundaries. This work started with the hypothesis that these four expected advantages would compensate for the additional organisational effort caused by the online delivery of the seminars and thus lead to a sustainable integration of this new learning mode into landscape architecture curricula. This rationale was followed by a presentation of four areas of knowledge to which the seminar development was directly related: (1) landscape architecture as a subject and its pedagogy, (2) general learning theories, (3) developments in the ICT sector and (4) wider societal driving forces such as global citizenship and the increase of open educational resources. The research design took the shape of a pedagogical action research cycle. This approach was constructive: the author herself teaches international landscape architecture students, so that the model could be applied directly in practice. Seven online seminars were implemented in the period from 2008 to 2013, and this experience represents the core of this study. The seminars were conducted with varying themes while their pedagogy, organisation and technological tools remained largely identical.
The research design is further based on three levels of observation: (1) the seminar design on the basis of theory and methods from the learning sciences, in particular educational constructivism, (2) the seminar evaluation and (3) the evaluation of the seminars' long-term impact. The seminar model itself basically consists of four elements: (1) the taxonomy of learning objectives, (2) ICT tools and their application and pedagogy, (3) process models and (4) the case study framework. The seminar framework was followed by the presentation of the evaluation findings. The major findings of this study can be summed up as follows: Implementing online seminars across educational and national boundaries was possible both in terms of organisation and technology. In particular, a high level of cultural diversity among the seminar participants was definitely achieved. However, there were also obvious obstacles. These were primarily competing study commitments and incompatible schedules among the students attending from different academic programmes, partly even in different time zones. Both factors had a negative impact on the individual and working group performances. With respect to the technical framework, it can be concluded that the majority of the participants were able to use the tools either directly without any problem or after overcoming some smaller problems. The seminar wiki was also used intensively for completing the seminar assignments. However, too little truly collaborative text production was observed, which could be improved by changing the requirements for the collaborative task. Two different process models were applied for guiding the collaboration of the small groups, and both were in general successful. However, it needs to be said that even if the students were able to follow the collaborative task and to co-construct and compare case studies, most of them were not able to synthesize the knowledge they had compiled.
This means that the area of consideration often remained on the level of the case, and further reflections, generalisations and critique were largely missing. This shows that the seminar model needs to find better ways of triggering knowledge building and critical reflection. It was also suggested to have a more differentiated group building strategy in future seminars. A comparison of pre- and post-seminar concept maps showed that an increase of factual and conceptual knowledge on the individual level was clearly recognizable. The evaluation of the case studies (the major seminar output) also revealed that the students had developed in both the factual and the conceptual knowledge domain. Their self-assessment with respect to individual learning development likewise showed that the highest consensus was achieved in the field of subject-specific knowledge. The participants were much more doubtful with regard to the progress of generic competences such as analysis, communication and organisation. However, 50% of the participants confirmed that they perceived individual development in all competence areas the survey had asked for. Have the additional four targets been met? Concerning the competences for working in a virtual team, it can be concluded that the vast majority were able to use the internet-based tools and to work with them in a target-oriented way. However, there were obvious differences regarding the intensity and activity of participation, due to both external and personal factors. A very positive aspect is the achievement of a high cultural diversity supporting the participants' intercultural competence. Learning from group members was obviously a success factor for the working groups. Regarding the possibilities for better accessibility of educational opportunities, it became clear that a significant number of participants were not able to go abroad during their studies for financial or personal reasons.
They confirmed that the online seminar was to some extent a compensation for not having studied abroad. Inter-institutional learning and synergy were achieved insofar as many teachers from different countries contributed with individual lectures. However, those teachers hardly ever followed more than one session. Therefore, the learning effect remained largely within the seminar learning group. Looking back at the research design, it can be said that the pedagogical action research cycle was an appropriate and valuable approach allowing for strong interaction between theory and practice. However, some more external evaluation from peers, in particular regarding the participants' products, would have been valuable.
Abstract:
The working paper's main objective is to explore the extent to which non-compliance with international labor rights is caused by global competition. From the perspective of institutional economics, compliance with core labor rights is beneficial for sustainable development. Nonetheless, violations of these rights occur on a massive scale. The violators usually blame competitive pressures. A number of studies have come to the conclusion that non-compliance does not provide a competitive edge, thereby denying any economic rationale for non-compliance. While we sympathize with this conclusion, we find that these studies suffer from faulty assumptions in the design of their regression analyses. The assumption of perfect markets devoid of power relations is particularly unrealistic. While workers' rights promise long-term benefits, they may incur short-term production cost increases. On the supply side, the production sites with the highest number of labor rights violations are characterized by a near perfectly competitive situation. The demand side, however, is dominated by an oligopoly of brand name companies and large retailers. Facing a large pool of suppliers, these companies enjoy more bargaining power. Developing countries, the hosts of most of these suppliers, are therefore limited in their ability to raise labor standards on their own. This competitive situation, however, is the very reason why labor rights have to be negotiated internationally. Our exploration starts with an outline of the institutionalist argument for the benefits of core labor rights. Second, we briefly examine some cross-country empirical studies on the impact of trade liberalization (as a proxy for competitive pressures). Third, we develop our own argument, which differentiates the impact of trade liberalization along the axes of labor- and capital-intensive production as well as low- and medium-skill production.
Finally, we present evidence from a study on the impact of trade liberalization in Indonesia, using the garment industry as an example of a low-skill, labor-intensive industry on the one hand, and the automobile industry as an example of a medium-skill, capital-intensive industry on the other. Because the garment industry's workforce consists mainly of women, we also discuss the gender dimension of trade liberalization.
Abstract:
The main objective of this thesis was to determine the potential impact of heat stress (HS) on physiological traits of lactating cows and on the semen quality of bulls kept in a temperate climate. The thesis comprises three studies. An innovative statistical modeling aspect common to all three studies was the application of random regression methodology (RRM) to study the phenotypic and genetic trajectory of traits in dependence on a continuous temperature-humidity index (THI). In the first study, semen quality and quantity traits of 562 Holstein sires kept at an AI station in northwestern Germany were analyzed in the course of THI calculated from data obtained from the nearest weather station. Heat stress was identified based on a decline in semen quality and quantity parameters. The identified general HS threshold (THI = 60) and the thermoneutral zone (THI in the range from 50 to 60) for semen production were lower than detected in studies conducted in tropical and subtropical climates. Even though adult bulls were characterized by higher semen productivity compared to younger bulls, they responded with a stronger semen production loss in harsh environments. Heritabilities (in the low to moderate range) and additive genetic variances of semen characteristics varied with different levels of THI. Also, based on genetic correlations, genotype-by-environment interactions were detected. Taken together, these findings suggest the application of specific selection strategies for specific climate conditions. In the second study, the effect of the continuous environmental descriptor THI, as measured inside the barns, on rectal temperatures (RT), skin temperatures (ST), vaginal temperatures (VT), respiration rates (RR), and pulse rates (PR) of lactating Holstein Friesian (HF) and dual-purpose German black pied cattle (DSN) was analyzed.
Increasing HS from THI 65 (threshold) to THI 86 (maximal THI) resulted in an increase of RT by 0.6 °C (DSN) and 1 °C (HF), ST by 3.5 °C (HF) and 8 °C (DSN), VT by 0.3 °C (DSN), and RR by 47 breaths per minute (DSN), and decreased PR by 7 beats per minute (DSN). The undesired effects of rising THI on physiological traits were most pronounced for cows with high levels of milk yield and milk constituents, cows in early days in milk and later parities, and during the summer season of the year 2014. In the third study of this dissertation, the genetic components of the cows' physiological responses to HS were investigated. Heat stress was deduced from indoor THI measurements, and physiological traits were recorded on native DSN cows and their genetically upgraded crosses with Holstein Friesian sires in two experimental herds from pasture-based production systems reflecting the harsh environment of the northern part of Germany. Although heritabilities were in a low range (from 0.018 to 0.072), alterations of heritabilities, repeatabilities, and genetic components in the course of THI justify the implementation of genetic evaluations including heat stress components. However, low repeatabilities indicate the necessity of using repeated records for measuring physiological traits in German cattle. Moderate EBV correlations between different trait combinations indicate the potential of selection for one trait to simultaneously improve the other physiological attributes. In conclusion, bulls at AI centers and lactating cows suffer from HS during more extreme weather conditions even in the temperate climate of Northern Germany. Monitoring physiological traits during warm and humid conditions could provide valuable information for detecting appropriate times for the implementation of cooling systems and changes in feeding and management strategies.
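The abstract does not spell out which THI formulation the studies used, so as a purely illustrative aid, here is one widely used variant (NRC, 1971) that combines dry-bulb temperature and relative humidity; treat the choice of formula as an assumption:

```python
def thi(temp_c, rel_humidity_pct):
    """Temperature-humidity index, one widely used formulation
    (NRC, 1971): THI = (1.8*T + 32) - (0.55 - 0.0055*RH) * (1.8*T - 26),
    with T the dry-bulb temperature in deg C and RH in percent.
    The thesis may well have used a different variant."""
    t_f = 1.8 * temp_c + 32.0               # dry-bulb temperature in deg F
    return t_f - (0.55 - 0.0055 * rel_humidity_pct) * (t_f - 58.0)

# Under this formula, 20 deg C at 60% humidity lands right around the
# THI 65 threshold reported for the physiological traits:
thi_at_threshold = round(thi(20.0, 60.0), 1)   # 65.8
```

Note that different THI variants weight humidity differently, which is one reason published HS thresholds are only comparable when the underlying formula is stated.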
Subsequently, the inclusion of these physiological traits with THI-specific breeding values into overall breeding goals could contribute to improving cattle adaptability by selecting the optimal animal for extremely hot and humid conditions. Furthermore, recording meteorological data in close proximity to the cow and visualizing the body surface temperature by infrared thermography techniques might be helpful for recognizing heat tolerance and adaptability in cattle.