965 results for ADD


Relevance: 10.00%

Abstract:

Regional climate models are becoming increasingly popular for providing high-resolution climate change information for impact assessments that inform adaptation options. Many countries and provinces requiring these assessments are as small as 200,000 km2, significantly smaller than the ideal domain needed for successful application of one-way nested regional climate models. Therefore, assessments on sub-regional scales (e.g., river basins) are generally carried out using climate change simulations performed for relatively larger regions. Here we show that the seasonal mean hydrological cycle and the day-to-day precipitation variations of a sub-region within the model domain are sensitive to the domain size, even though the large-scale circulation features over the region are largely insensitive. On seasonal timescales, the relatively smaller domains intensify the hydrological cycle by increasing the net transport of moisture into the study region, thereby enhancing the precipitation and local recycling of moisture. On daily timescales, the simulations run over smaller domains produce a higher number of moderate precipitation days in the sub-region relative to the corresponding larger-domain simulations. An assessment of daily variations of water vapor and vertical velocity within the sub-region indicates that the smaller domains may favor more frequent moderate uplifting and subsequent precipitation in the region. The results remained largely insensitive to the horizontal resolution of the model, indicating the robustness of the domain size influence on the regional model solutions. These domain-size-dependent precipitation characteristics have the potential to add one more level of uncertainty to downscaled projections.

Relevance: 10.00%

Abstract:

Plants and microorganisms provide the pharmaceutical industry with some of the most important sources of components for the research of new medications. This thesis involves the study of three medicinal plants belonging to three different families: Cyperus rotundus (Cyperaceae), Stereospermum colais (Bignoniaceae), and the well-known medicinal plant Zingiber officinale (Zingiberaceae). The first chapter gives an overview of biologically active natural products, with special reference to antioxidant, antidiabetic, anti-inflammatory and antimicrobial molecules from terrestrial sources. Chapter 2 deals with the isolation of the phytochemical constituents of Cyperus rotundus and its antioxidant and radical-scavenging potential. Chapter 3 describes studies on the roots of Stereospermum colais, a Bignoniaceae plant belonging to the genus Stereospermum which is used extensively in Ayurveda. Chapter 4 describes the biological potential of the rhizomes of Zingiber officinale. The ethyl acetate extract of ginger (EAG) possessed antioxidant activity, as is evident from the results of various in vitro assays, compared to the other extracts. In conclusion, the medicinal plants Cyperus rotundus and Stereospermum colais have been analysed for their phytochemical constituents, and the positive results obtained from biological activity studies (antioxidant, anti-inflammatory and antimicrobial) on the isolated compounds/extracts add to the medicinal properties of these plants. In addition, the ethyl acetate extract of Zingiber officinale (ginger) rhizomes has been shown to have very good biological potential, including a glucose-lowering and adipocyte differentiation inhibitory effect.

Relevance: 10.00%

Abstract:

The aim of this study is to investigate the role of operational flexibility in effective project management in the construction industry. The specific objectives are to: a) identify the determinants of operational flexibility potential in construction project management; b) investigate the contribution of each determinant to operational flexibility potential in the construction industry; c) investigate the moderating factors of operational flexibility potential in a construction project environment; d) investigate whether moderated operational flexibility potential mediates the path between predictors and effective construction project management; and e) develop and test a conceptual model of achieving operational flexibility for effective project management. The purpose of this study is to find out ways to utilize flexibility in order to manage an uncertain project environment and ultimately achieve effective project management, and to establish in what configuration these operational flexibility determinants are demanded by the construction project environment in order to achieve project success. This research was conducted in three phases: (i) an exploratory phase; (ii) a questionnaire development phase; and (iii) a data collection and analysis phase. The study requires firm-level analysis, and therefore real estate developers who are members of CREDAI, Kerala Chapter, were considered. This study provides a framework on the functioning of operational flexibility, offering guidance to researchers and practitioners for discovering means to gain operational flexibility in construction firms. The findings provide an empirical understanding of the kinds of resources and capabilities a construction firm must accumulate to respond flexibly to a changing project environment, offering practitioners insights into practices that build a firm's operational flexibility potential.
Firms are dealing with complex, continuously changing and uncertain environments due to trends of globalization, technical changes and innovations, and changes in customers' needs and expectations. To cope with this increasingly uncertain and quickly changing environment, firms strive for flexibility. To achieve the level of flexibility that adds value for customers, firms should look at flexibility from a day-to-day operational perspective. Each dimension of operational flexibility is derived from competences and capabilities. In this thesis, only the influence on customer satisfaction and learning exploitation of those flexibility dimensions which directly add value in the customers' eyes is studied, to answer the following research questions: "What is the impact of operational flexibility on customer satisfaction?" and "What are the predictors of operational flexibility in the construction industry?" These questions can only be answered after answering questions like "Why do firms need operational flexibility?" and "How can firms achieve operational flexibility?" in the context of the construction industry. The need for construction firms to be flexible, via the effective utilization of organizational resources and capabilities for improved responsiveness, is important because of the increasing rate of change in the business environment within which they operate. Achieving operational flexibility is also important because it has a significant correlation with project effectiveness and hence a firm's turnover. It is essential for academics and practitioners to recognize that the attainment of operational flexibility involves different types, namely (i) modification, (ii) new product development and (iii) demand management, each requiring a different configuration of predictors (i.e., resources, capabilities and strategies).
Construction firms should consider these relationships and implement appropriate management practices for developing and configuring the right kind of resources, capabilities and strategies towards achieving different operational flexibility types.

Relevance: 10.00%

Abstract:

Agriculture plays a central role in the Earth system. It contributes to the greenhouse effect through emissions of CO2, CH4 and N2O, can cause soil degradation and eutrophication, can alter regional water cycles, and will itself be strongly affected by climate change. Since all of these processes are closely linked through the underlying nutrient and water fluxes, they should be considered within a consistent modelling framework. Until recently, however, a lack of data and insufficient process understanding have prevented this at the global scale. This thesis presents the first version of such a consistent global modelling framework, with emphasis on the simulation of agricultural yields and the resulting N2O emissions. This emphasis was chosen because correctly representing plant growth is an essential prerequisite for simulating all other processes. Furthermore, current and potential agricultural yields are important driving forces of land-use change and will be strongly affected by climate change. The second emphasis is the estimation of agricultural N2O emissions, since no process-based N2O model had previously been applied at the global scale. The existing agro-ecosystem model Daycent was chosen as the basis for the global modelling. In addition to building the simulation environment, the required global datasets of soil parameters, climate and agricultural management were first compiled. Since no global database of planting dates is available so far, and since planting dates will shift with climate change, a routine for computing planting dates was developed. The results show good agreement with the FAO crop calendars that are available for some crops and countries.
The Daycent model was then parameterised and calibrated for yield simulations of wheat, rice, maize, soybean, millet, pulses, potato, cassava and cotton. The simulation results show that Daycent correctly captures the most important effects of climate, soil and management on yield formation. Computed national averages agree well with FAO data (R2 = 0.66 for wheat, rice and maize; R2 = 0.32 for soybean), and the spatial yield patterns largely correspond to the observed distribution of crops and to sub-national statistics. The modelling of agricultural N2O emissions with Daycent was preceded by a statistical analysis of N2O and NO emission measurements from natural and agricultural ecosystems. The parameters identified as significant for N2O (fertilizer amount, soil carbon content, soil pH, texture, crop type, fertilizer type) and for NO (fertilizer amount, soil nitrogen content, climate) largely agree with the results of an earlier analysis. For emissions from soils under natural vegetation, for which no such statistical analysis previously existed, soil carbon content, soil pH, bulk density, drainage and vegetation type have a significant influence on N2O emissions, while NO emissions depend significantly on soil carbon content and vegetation type. Based on the statistical models derived from these parameters, global emissions from arable soils amount to 3.3 Tg N/yr for N2O and 1.4 Tg N/yr for NO. Such statistical models are useful for computing estimates and uncertainty ranges of N2O and NO emissions from a large number of measurements. The dynamics of soil nitrogen, influenced in particular by plant growth, climate change and land-use change, can, however, only be accounted for by applying process-oriented models.
For modelling N2O emissions with Daycent, its trace gas module was first extended by a more detailed computation of nitrification and denitrification and by accounting for freeze-thaw emissions. This revised model version was then tested against N2O emission measurements under various climates and crops. Both the dynamics and the totals of the N2O emissions are reproduced satisfactorily, with model efficiencies for monthly means between 0.1 and 0.66 for most sites. Based on the revised model version, N2O emissions were computed for the previously parameterised crops. Emission rates and crop-specific differences largely agree with values reported in the literature. Fertilizer-induced emissions, currently estimated by the IPCC at 1.25 +/- 1% of the applied fertilizer amount, range from 0.77% (rice) to 2.76% (maize). The sum of the computed emissions from agricultural soils amounts to 2.1 Tg N2O-N/yr for the mid-1990s, which agrees with estimates from other studies.
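The fertilizer-induced emission factor used in the abstract can be illustrated with a short sketch. The function mirrors the standard IPCC-style definition (emissions beyond the unfertilised control, as a fraction of applied N); all numbers below are made up for illustration and are not taken from the thesis.

```python
def fertilizer_induced_ef(e_fert_kg_n, e_control_kg_n, n_applied_kg):
    """Fertilizer-induced emission factor, as a percentage of applied N."""
    return 100.0 * (e_fert_kg_n - e_control_kg_n) / n_applied_kg

# e.g. 2.5 kg N2O-N/ha from a fertilised plot, 1.0 kg from the
# unfertilised control, 120 kg N/ha applied:
ef = fertilizer_induced_ef(2.5, 1.0, 120.0)
print(round(ef, 2))  # 1.25 (% of applied N), matching the IPCC default
```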

Relevance: 10.00%

Abstract:

Abstract: Quality Management is an essential part of successful organisations, but its effect is mostly not directly visible; the effects are more indirect, and appear with a time lag. In today's challenging times, all activities of an organisation have to prove their ability to add value. While Value Based Management is focussed more on financial value, other concepts and models such as the EFQM Excellence Model and Kaplan and Norton's Balanced Scorecard also point to values that are the basis and driver of financial success. Quality Management has to prove its effects on company values; therefore the underlying mechanisms have to be identified, and a procedure to manage the process of Value-Adding Quality Management has to be developed.

Relevance: 10.00%

Abstract:

Fujaba is an Open Source UML CASE tool project started at the software engineering group of Paderborn University in 1997. In 2002, Fujaba was redesigned and became the Fujaba Tool Suite, with a plug-in architecture allowing developers to add functionality easily while retaining full control over their contributions. Multiple application domains: Fujaba has followed the model-driven development philosophy right from its beginning in 1997. In the early days, Fujaba had a special focus on code generation from UML diagrams, resulting in a visual programming language with a special emphasis on object-structure-manipulating rules. Today, at least six rather independent tool versions are under development in Paderborn, Kassel, and Darmstadt, supporting (1) reengineering, (2) embedded real-time systems, (3) education, (4) specification of distributed control systems, (5) integration with the ECLIPSE platform, and (6) MOF-based integration of system (re-)engineering tools. International community: to our knowledge, quite a number of research groups have also chosen Fujaba as a platform for UML- and MDA-related research activities. In addition, many Fujaba users send requests for more functionality and extensions. Therefore, the 8th International Fujaba Days aimed at bringing together Fujaba developers and Fujaba users from all over the world to present their ideas and projects and to discuss them with each other and with the Fujaba core development team.

Relevance: 10.00%

Abstract:

This thesis aims at empowering software customers with a tool to build software tests themselves, based on a gradual refinement of natural language scenarios into executable visual test models. The process is divided into five steps: 1. First, a natural language parser is used to extract a graph of grammatical relations from the textual scenario descriptions. 2. The resulting graph is transformed into an informal story pattern by interpreting structurization rules based on Fujaba Story Diagrams. 3. While the informal story pattern can already be used by humans, the diagram still lacks technical details, especially type information; to add them, a recommender-based framework uses web sites and other resources to generate formalization rules. 4. In preparation for code generation, the classes derived for formal story patterns are aligned across all story steps, substituting for a class diagram. 5. Finally, a headless version of Fujaba is used to generate an executable JUnit test. The graph transformations used in the browser application are specified in a textual domain-specific language and visualized as story patterns. Last but not least, only the heavyweight parsing (step 1) and code generation (step 5) are executed on the server side; all graph transformation steps (2, 3 and 4) are executed in the browser by an interpreter written in JavaScript/GWT. This result paves the way for online collaboration between global teams of software customers, IT business analysts and software developers.

Relevance: 10.00%

Abstract:

This analysis was stimulated by the real data analysis problem of household expenditure data. The full dataset contains expenditure data for a sample of 1224 households. The expenditure is broken down at 2 hierarchical levels: 9 major levels (e.g. housing, food, utilities etc.) and 92 minor levels. There are also 5 factors and 5 covariates at the household level. Not surprisingly, there are a small number of zeros at the major level, but many zeros at the minor level. The question is how best to model the zeros. Clearly, models that try to add a small amount to the zero terms are not appropriate in general as at least some of the zeros are clearly structural, e.g. alcohol/tobacco for households that are teetotal. The key question then is how to build suitable conditional models. For example, is the sub-composition of spending excluding alcohol/tobacco similar for teetotal and non-teetotal households? In other words, we are looking for sub-compositional independence. Also, what determines whether a household is teetotal? Can we assume that it is independent of the composition? In general, whether teetotal will clearly depend on the household level variables, so we need to be able to model this dependence. The other tricky question is that with zeros on more than one component, we need to be able to model dependence and independence of zeros on the different components. Lastly, while some zeros are structural, others may not be, for example, for expenditure on durables, it may be chance as to whether a particular household spends money on durables within the sample period. This would clearly be distinguishable if we had longitudinal data, but may still be distinguishable by looking at the distribution, on the assumption that random zeros will usually be for situations where any non-zero expenditure is not small. 
While this analysis is based around economic data, the ideas carry over to many other situations, including geological data, where minerals may be missing for structural reasons (similar to alcohol), or missing because they occur only in random regions which may be absent from a sample (similar to the durables).
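As a hedged sketch of the modelling problem described above, the following Python snippet simulates structural zeros (teetotal households) driven by a household covariate and then checks sub-compositional independence of the remaining spending categories via a simple log-ratio comparison. The covariate, the logistic link and the Dirichlet parameters are all illustrative assumptions, not part of the original analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical household covariate (e.g. household size) driving teetotalism
# through an assumed logistic link.
size = rng.integers(1, 6, n)
p_teetotal = 1 / (1 + np.exp(-(1.0 - 0.5 * size)))
teetotal = rng.random(n) < p_teetotal

# Expenditure shares on (food, housing, alcohol); alcohol is a structural
# zero for teetotal households.
spend = rng.dirichlet([4, 5, 1], n)
spend[teetotal, 2] = 0.0

# Sub-composition excluding alcohol, renormalised to sum to one.
sub = spend[:, :2] / spend[:, :2].sum(axis=1, keepdims=True)

# Sub-compositional independence check: compare the mean log-ratio
# log(food/housing) between teetotal and non-teetotal households.
lr = np.log(sub[:, 0] / sub[:, 1])
diff = lr[teetotal].mean() - lr[~teetotal].mean()
print(round(abs(diff), 3))  # a small value is consistent with independence
```

In this simulation the sub-composition is generated independently of teetotal status, so the group difference in mean log-ratios should be near zero; a real analysis would replace this comparison with a formal test.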

Relevance: 10.00%

Abstract:

“What is value in product development?” is the key question of this paper. The answer is critical to the creation of lean in product development. By knowing how much value is added by product development (PD) activities, decisions can be more rationally made about how to allocate resources, such as time and money. In order to apply the principles of Lean Thinking and remove waste from the product development system, value must be precisely defined. Unfortunately, value is a complex entity that is composed of many dimensions and has thus far eluded definition on a local level. For this reason, research has been initiated on “Measuring Value in Product Development.” This paper serves as an introduction to this research. It presents the current understanding of value in PD, the critical questions involved, and a specific research design to guide the development of a methodology for measuring value. Work in PD value currently focuses on either high-level perspectives on value, or detailed looks at the attributes that value might have locally in the PD process. Models that attempt to capture value in PD are reviewed. These methods, however, do not capture the depth necessary to allow for application. A methodology is needed to evaluate activities on a local level to determine the amount of value they add and their sensitivity with respect to performance, cost, time, and risk. Two conceptual tools are proposed. The first is a conceptual framework for value creation in PD, referred to here as the Value Creation Model. The second tool is the Value-Activity Map, which shows the relationships between specific activities and value attributes. These maps will allow a better understanding of the development of value in PD, will facilitate comparison of value development between separate projects, and will provide the information necessary to adapt process analysis tools (such as DSM) to consider value. 
The key questions that this research entails are:
· What are the primary attributes of lifecycle value within PD?
· How can one model the creation of value in a specific PD process?
· Can a useful methodology be developed to quantify value in PD processes?
· What are the tools necessary for application?
· What PD metrics will be integrated with the necessary tools?
The research milestones are:
· Collection of value attributes and activities (September, 2000)
· Development of a methodology of value-activity association (October, 2000)
· Testing and refinement of the methodology (January, 2001)
· Tool development (March, 2001)
· Present findings at the July INCOSE conference (April, 2001)
· Deliver a thesis that captures a formalized methodology for defining value in PD (including LEM data sheets) (June, 2001)
The research design aims for the development of two primary deliverables: a methodology to guide the incorporation of value, and a product development tool that will allow direct application.

Relevance: 10.00%

Abstract:

"compositions" is a new R package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry). It provides means for computation, plotting and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and the object-oriented programming paradigm of R. In this way, the functions called automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedrons, various compositional plots (boxplots, barplots, pie charts) and extensive graphical tools for principal components. Afterwards, proportion lines, straight lines and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all the necessary tools to add their own analysis routines. A complete example is included in the appendix.
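The "principle of working in coordinates" the package is built on can be sketched outside R as well. The following Python snippet (an illustration, not part of the package) implements the centred log-ratio (clr) transform and its inverse, the basic coordinate map of the Aitchison geometry:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform: maps a composition to coordinates."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.log(x).mean(axis=-1, keepdims=True))  # geometric mean
    return np.log(x / g)

def clr_inv(z):
    """Inverse clr: back to a composition summing to one (closure)."""
    y = np.exp(z)
    return y / y.sum(axis=-1, keepdims=True)

comp = np.array([0.2, 0.3, 0.5])
z = clr(comp)
assert abs(z.sum()) < 1e-12       # clr coordinates always sum to zero
back = clr_inv(z)
print(np.allclose(back, comp))    # True: the transform is invertible
```

Ordinary Euclidean statistics applied to the clr coordinates then correspond to Aitchison-geometry statistics on the original compositions, which is exactly how "working in coordinates" lets one geometry-aware package reuse standard multivariate machinery.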

Relevance: 10.00%

Abstract:

This paper sets out to identify the initial positions of the different decision makers who intervene in a group decision-making process with a reduced number of actors, and to establish possible consensus paths between these actors. As methodological support, it employs one of the most widely known multicriteria decision techniques, namely the Analytic Hierarchy Process (AHP). Assuming that the judgements elicited by the decision makers follow the so-called multiplicative model (Crawford and Williams, 1985; Altuzarra et al., 1997; Laininen and Hämäläinen, 2003) with log-normal errors and unknown variance, a Bayesian approach is used in the estimation of the relative priorities of the alternatives being compared. These priorities, estimated by way of the median of the posterior distribution and normalised in a distributive manner (the priorities add up to one), are a clear example of compositional data that will be used in the search for consensus between the actors involved in the resolution of the problem, through the use of Multidimensional Scaling tools.
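As a hedged illustration of the multiplicative model (not the paper's Bayesian estimator), the classical row-geometric-mean estimate of AHP priorities, distributively normalised to sum to one, can be sketched as follows; the pairwise comparison matrix is hypothetical:

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix from one decision maker:
# A[i, j] says how strongly alternative i is preferred to alternative j,
# with the reciprocal property A[j, i] = 1 / A[i, j].
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

# Under the multiplicative model with log-normal errors, the row
# geometric mean is the classical point estimate of the priorities.
w = np.exp(np.log(A).mean(axis=1))
w /= w.sum()              # distributive normalisation: priorities sum to one

print(np.round(w, 3))
```

The resulting priority vector is a 3-part composition, which is why compositional techniques (and, in the paper, Multidimensional Scaling on such vectors) apply to it.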

Relevance: 10.00%

Abstract:

A novel test of spatial independence of the distribution of crystals or phases in rocks, based on compositional statistics, is introduced. It improves and generalizes the common joins-count statistics known from map analysis in geographic information systems. Assigning phases independently to objects in R^D is modelled by a single-trial multinomial random function Z(x), where the probabilities of the phases add to one and are explicitly modelled as compositions in the K-part simplex S^K. Thus, apparent inconsistencies of the tests based on the conventional joins-count statistics, and their possibly contradictory interpretations, are avoided. In practical applications we assume that the probabilities of the phases do not depend on the location but are identical everywhere in the domain of definition. The model then involves the sum of r independent, identically distributed single-trial multinomial random variables, which is an r-trial multinomially distributed random variable. The probabilities of the distribution of the r counts can be considered as a composition in the Q-part simplex S^Q. They span the so-called Hardy-Weinberg manifold H, which is proved to be a (K-1)-affine subspace of S^Q. This is a generalisation of the well-known Hardy-Weinberg law of genetics. If the assignment of phases accounts for some kind of spatial dependence, then the r-trial probabilities do not remain on H. This suggests using the Aitchison distance from the observed probabilities to H to test for dependence. Moreover, when there is a spatial fluctuation of the multinomial probabilities, the observed r-trial probabilities move on H; this shift can be used to check for these fluctuations. A practical procedure and an algorithm to perform the test have been developed, and some cases applied to simulated and real data are presented.
Key words: Spatial distribution of crystals in rocks, spatial distribution of phases, joins-count statistics, multinomial distribution, Hardy-Weinberg law, Hardy-Weinberg manifold, Aitchison geometry
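A minimal numerical sketch of the test idea above, assuming K = 2 phases and r = 2 trials: the Hardy-Weinberg manifold is then the curve (p^2, 2pq, q^2) in the 3-part simplex, and the Aitchison distance from an observed count composition to that curve (approximated here by a crude grid search, not the paper's procedure) indicates departure from spatial independence. All numbers are illustrative.

```python
import numpy as np

def aitchison_dist(x, y):
    """Aitchison distance: Euclidean distance between clr coordinates."""
    lx, ly = np.log(x), np.log(y)
    d = (lx - lx.mean()) - (ly - ly.mean())
    return np.sqrt((d ** 2).sum())

# Hypothetical observed 2-trial pair frequencies for K = 2 phases,
# as a composition in the 3-part simplex.
obs = np.array([0.30, 0.60, 0.10])

# Hardy-Weinberg manifold H for K = 2, r = 2: compositions (p^2, 2pq, q^2).
ps = np.linspace(1e-3, 1 - 1e-3, 10_000)
H = np.stack([ps**2, 2 * ps * (1 - ps), (1 - ps)**2], axis=1)

# Grid approximation of the distance from the observation to H;
# a large value suggests spatial dependence between the phases.
dists = [aitchison_dist(obs, h) for h in H]
print(round(min(dists), 3))
```

An observation generated by independent assignment lies (up to sampling noise) on H, so its distance is near zero; the excess of mixed pairs in `obs` pushes it off the manifold.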

Relevance: 10.00%

Abstract:

Flood risk maps should show floods in relation to the potential impacts they may have on people, property and activities. It is therefore necessary to add the concept of vulnerability to the mere study of the physical phenomenon. Flood damage risk maps are thus the true risk maps, since they are built, on the one hand, from cartography that locates and characterises the physical phenomenon of flooding and, on the other, from cartography that locates and characterises the exposed elements. The use of so-called "new technologies", such as GIS, remote sensing, hydrological sensors and the Internet, offers great potential for the development of flood risk maps, which remains, for the time being, a field open to research.

Relevance: 10.00%

Abstract:

The problem of collective public transport (TPC) in Bogotá is complex, characterised by a persistent oversupply that is reflected, first of all, in a public transport fare inflated above its real cost. In addition, institutional weakness is considerable, generating an accumulation of power in favour of the transport operators, for whom the oversupply is lucrative. If we add the incentives promoted by the District Administration itself, with its inefficient system for operating and awarding routes, the situation becomes even more critical. To reduce the oversupply, the government of Bogotá has enacted a series of policies that have not proved effective. The most structured and optimistic were introduced under the Mockus administration, which issued Decrees 112 to 116 of 2003 restructuring public transport, seeking to reduce the oversupply and improve the declining quality of service. This paper analyses how these decrees may have been merely a legal strategy and a very limited public policy, for at least three specific reasons: the first concerns deficiencies in the planning and formulation of the public policy; the second analyses the problem of ignoring the existence and decisive influence of the transport operators as a pressure group with economic and political power; and the third describes how the high degree of impunity in detecting, judging and sanctioning offenders has created even more difficulties in the implementation of these rules.

Relevance: 10.00%

Abstract:

In this paper, a method for enhancing current QoS routing methods by means of QoS protection is presented. In an MPLS network, the segments (links) to be protected are predefined, and an LSP request involves, apart from establishing a working path, creating a specific type of backup path (local, reverse or global). Different QoS parameters, such as network load balancing, resource optimization and minimization of LSP request rejection, should be considered. QoS protection is defined as a function of QoS parameters such as packet loss, restoration time and resource optimization. A framework to add QoS protection to many of the current QoS routing algorithms is introduced, a backup decision module to select the most suitable protection method is formulated, and different case studies are analyzed.
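As a hedged sketch of how such a backup decision module might rank the three protection types, the following toy scoring combines the named QoS parameters with illustrative weights and figures; none of the numbers, weights or normalisation constants come from the paper.

```python
# Hypothetical per-method QoS figures for the three backup-path types
# named in the abstract: (packet_loss, restoration_ms, resource_overhead).
methods = {
    "local":   (0.01, 50.0, 0.9),
    "reverse": (0.02, 80.0, 0.6),
    "global":  (0.05, 200.0, 0.3),
}

def protection_score(loss, rest_ms, overhead, w=(0.5, 0.3, 0.2)):
    """Lower is better: weighted sum of normalised QoS penalties.

    The normalisation constants (worst-case loss 0.05, worst-case
    restoration 200 ms) are illustrative assumptions.
    """
    return w[0] * loss / 0.05 + w[1] * rest_ms / 200.0 + w[2] * overhead

best = min(methods, key=lambda m: protection_score(*methods[m]))
print(best)  # local: fast restoration dominates under these weights
```

Shifting weight toward resource optimization instead of restoration time would favour the global backup, which is the kind of trade-off the paper's decision module is meant to capture.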