848 results for Scenario methodology
Abstract:
Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onward, and primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics. Its history can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period are the works of Tinbergen and Frisch, and later of Haavelmo from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace and Gauss, the discipline later witnessed the applied work of Edgeworth and Mitchell. A very significant milestone in its evolution was the development of multiple regression and correlation analysis by Tinbergen, Frisch and Haavelmo, who used these techniques to test economic theories against time series data. Even though some predictions based on econometric methodology have gone wrong, the scientifically sound nature of the discipline cannot be ignored; this is reflected in the economic rationale underlying any econometric model and in the statistical and mathematical reasoning behind the inferences drawn. The relevance of econometrics as an academic discipline therefore assumes high significance. Because of its interdisciplinary nature (a unification of economics, statistics and mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that it is most often offered to economics students alone, since students of other disciplines may lack the economics background needed to follow it. In fact, econometrics is also quite relevant to technical courses (such as engineering), business management courses (such as the MBA) and professional accountancy courses, and even more so to research students in the social sciences, commerce and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to econometrics in higher education across the social science, commerce, management and professional accountancy streams. In this way students' analytical ability can be sharpened, their capacity to examine socio-economic problems with a mathematical approach can be improved, and they can be enabled to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in computer-based econometric packages, especially at the postgraduate and research levels, must be pointed out here: merely learning the econometric methodology or the underlying theory would have little practical utility for students in their future careers, whether in academics, industry or practice. This paper seeks to trace the historical development of econometrics and to study its current status as an academic discipline in higher education. Besides, the paper looks into the problems faced by teachers in teaching econometrics and by students in learning the subject, including the effective application of the methodology in real-life situations. Accordingly, the paper offers some suggestions for the effective teaching of econometrics in higher education.
Abstract:
The Cochin estuarine system is among the most productive aquatic environments along the southwest coast of India; it exhibits unique ecological features and has great socioeconomic relevance. Serious investigations carried out during the past decades on hydro-biogeochemical variables have pointed out variations in the health and ecological functioning of this ecosystem. Characterisation of organic matter in the estuary has been attempted in many investigations, but detailed studies of the degradation state of organic matter using a molecular-level approach have not. The thesis entitled Provenance, Isolation and Characterisation of Organic Matter in the Cochin Estuarine Sediment - "A Diagenetic Amino Acid Marker Scenario" is an integrated approach to evaluate the source, quantity, quality and degradation state of the organic matter in the surface sediments of the Cochin estuarine system through the combined application of bulk and molecular-level tools. Sediment and water samples from nine stations situated in the Cochin estuary were collected in five seasonal sampling campaigns for the biogeochemical assessment of sedimentary organic matter and its distribution pattern. The sampling seasons are abbreviated as follows: April 2009 (pre-monsoon: PRM09), August 2009 (monsoon: MON09), January 2010 (post-monsoon: POM09), April 2010 (pre-monsoon: PRM10) and September 2012 (monsoon: MON12). In order to evaluate the general environmental conditions of the estuary, water samples were analysed for water quality parameters, chlorophyll pigments and nutrients by standard methods. The investigations suggested that the hydrographical variables and nutrients in the Cochin estuary support diverse species of flora and fauna. Moreover, sedimentary variables such as pH, Eh, texture, TOC and the fractions of nitrogen and phosphorus were determined to assess the general geochemical setting as well as the redox status. The periodically fluctuating oxic/anoxic conditions and the texture are the most significant variables controlling the other variables of the aquatic environment. The organic matter in the estuary comprises a complex mixture of autochthonous as well as allochthonous material. Autochthonous input is limited or enhanced by nutrient elements such as N and P (in their various fractions), which were used as a tool to evaluate their bioavailability. A bulk parameter approach, including biochemical composition, stoichiometric elemental ratios and the stable carbon isotope ratio, was also employed to assess the quality and quantity of sedimentary organic matter in the study area. Molecular-level characterisation of free sugars and amino acids was carried out by liquid chromatographic techniques. Carbohydrates are products of primary production, and their occurrence in sediments as free sugars can provide information on estuarine productivity. Amino acid biogeochemistry provided implications on system productivity, the nature of the organic matter, and the degradation status of the sedimentary organic matter in the study area. The predominance of carbohydrates over protein indicated faster mineralisation of proteinaceous organic matter in the sediments, and the estuary behaves as a detrital trap for the accumulation of aged organic matter. The higher lipid content and LPD/CHO ratio pointed towards better food quality supporting the benthic fauna and better accumulation of lipid compounds in the sedimentary environment.
Allochthonous addition of carbohydrates via terrestrial runoff was responsible for the lower PRT/CHO ratio estimated in the sediments, and the lower ratios also denote a detrital heterotrophic environment. Biopolymeric carbon (BPC) and the algal contribution to BPC provided important information for a better understanding of the trophic state of the estuarine system, and the higher values of the chlorophyll-a to phaeophytin ratio indicated rapid deposition of phytoplankton to the sediment. The estimated TOC/TN ratios implied the combined input of both terrestrial and autochthonous organic matter to the sediments. Among the free sugars, depleted levels of glucose in sediments at most of the stations and an abundance of mannose at station S5 were observed during the present investigation. Among the aldohexoses, the concentration of galactose was found to be higher at most of the stations. The relative abundance of amino acids (AAs) in the estuarine sediments by season followed the trend: PRM09 - Leucine > Phenylalanine > Arginine > Lysine; MON09 - Lysine > Aspartic acid > Histidine > Tyrosine > Phenylalanine; POM09 - Lysine > Histidine > Phenylalanine > Leucine > Methionine > Serine > Proline > Aspartic acid; PRM10 - Valine > Aspartic acid > Histidine > Phenylalanine > Serine > Proline; MON12 - Lysine > Phenylalanine > Aspartic acid > Histidine > Valine > Tyrosine > Methionine. The classification of the study area into three zones based on salinity was employed in the present study for the sake of simplicity and generalised interpretation. The distribution of AAs in the three zones followed the trend: Fresh water zone (S1, S2): Phenylalanine > Lysine > Aspartic acid > Methionine > Valine ≈ Leucine > Proline > Histidine > Glycine > Serine > Glutamic acid > Tyrosine > Arginine > Alanine > Threonine > Cysteine > Isoleucine. Estuarine zone (S3, S4, S5, S6): Lysine > Aspartic acid > Phenylalanine > Leucine > Valine > Histidine > Methionine > Tyrosine > Serine > Glutamic acid > Proline > Glycine > Arginine > Alanine > Isoleucine > Cysteine > Threonine. Riverine/industrial zone (S7, S8, S9): Phenylalanine > Lysine > Aspartic acid > Histidine > Serine > Arginine > Tyrosine > Leucine > Methionine > Glutamic acid > Alanine > Glycine > Cysteine > Proline > Isoleucine > Threonine > Valine. The abundance of AAs such as glutamic acid, aspartic acid, isoleucine, valine, tyrosine and phenylalanine in the sediments of the study area indicated freshly derived organic matter.
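As a minimal illustration of the bulk-parameter diagnostics mentioned above, the sketch below computes the PRT/CHO, LPD/CHO and TOC/TN ratios from hypothetical station measurements; the variable names, values and interpretive comments are assumptions for illustration, not data or thresholds from the thesis.

```python
# Illustrative only: hypothetical concentrations (mg/g dry sediment) for one station.
protein, carbohydrate, lipid = 1.8, 4.2, 2.6   # PRT, CHO, LPD
toc, tn = 21.0, 1.6                            # total organic carbon, total nitrogen (mg/g)

prt_cho = protein / carbohydrate   # ratios < 1 are commonly read as detrital/aged organic matter
lpd_cho = lipid / carbohydrate     # higher values suggest better food quality for benthic fauna
toc_tn = toc / tn                  # intermediate values suggest mixed terrestrial/autochthonous input

print(f"PRT/CHO = {prt_cho:.2f}, LPD/CHO = {lpd_cho:.2f}, TOC/TN = {toc_tn:.2f}")
```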
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. Through the application of land-use models it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of the driving forces. Modeling land use and land-use change has a long tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other hand by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grid, grid cells, attributes, etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be used and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was placed on expandability, maintainability and usability. Along with the modeling framework, a land-use model for analyzing the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. In addition, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, and respective reference land-use maps were compiled for it. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
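To make the calibration idea concrete, the following is a minimal sketch of a genetic-algorithm loop that tunes two hypothetical model parameters against a figure-of-merit map comparison. The toy land-use model, the parameter names and the synthetic maps are assumptions for illustration only, not the actual SITE/STORMA implementation.

```python
import random

def figure_of_merit(simulated, reference, baseline):
    """Figure of merit: correctly simulated change divided by observed-or-simulated change."""
    hits = misses = false_alarms = 0
    for s, r, b in zip(simulated, reference, baseline):
        if r != b and s == r:
            hits += 1
        elif r != b and s != r:
            misses += 1
        elif r == b and s != b:
            false_alarms += 1
    total = hits + misses + false_alarms
    return hits / total if total else 1.0

def run_model(params, baseline):
    """Toy stand-in for a land-use model: converts cells whose 'suitability' exceeds a threshold."""
    threshold, bias = params
    return [1 if (i % 7) / 7.0 + bias > threshold else b for i, b in enumerate(baseline)]

def calibrate(reference, baseline, generations=30, pop_size=20):
    pop = [(random.random(), random.random() * 0.5) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: -figure_of_merit(run_model(p, baseline), reference, baseline))
        parents = scored[: pop_size // 2]
        # Recombine and mutate the best half to form the next generation.
        pop = parents + [
            (random.choice(parents)[0] + random.gauss(0, 0.05),
             random.choice(parents)[1] + random.gauss(0, 0.05))
            for _ in range(pop_size - len(parents))
        ]
    return max(pop, key=lambda p: figure_of_merit(run_model(p, baseline), reference, baseline))

# Usage: recover the parameters that generated a synthetic reference map.
baseline = [0] * 50
reference = run_model((0.6, 0.1), baseline)
print(calibrate(reference, baseline))
```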
Abstract:
The 21st century has brought new challenges for forest management at a time when globalization in world trade is increasing and global climate change is becoming increasingly apparent. In addition to various goods and services such as food, feed, timber and biofuels provided to humans, forest ecosystems are a large store of terrestrial carbon and account for a major part of the carbon exchange between the atmosphere and the land surface. Depending on the stage of the ecosystems and/or the management regime, forests can be either sinks or sources of carbon. At the global scale, rapid economic development and a growing world population have raised much concern over the use of natural resources, especially forest resources. The challenging question is how the global demands for forest commodities can be satisfied in an increasingly globalised economy, and where they could potentially be produced. For this purpose, wood demand estimates need to be integrated into a framework that is able to adequately handle the competition for land between major land-use options such as residential or agricultural land. This thesis is organised in accordance with the requirements for integrating the simulation of forest changes based on wood extraction into an existing framework for global land-use modelling called LandSHIFT. Accordingly, the following focal points for research have been identified: (1) a review of existing global-scale economic forest sector models, (2) simulation of global wood production under selected scenarios, (3) simulation of global vegetation carbon yields, and (4) the implementation of a land-use allocation procedure to simulate the impact of wood extraction on forest land cover. Modelling the spatial dynamics of forests on the global scale requires two important inputs: (1) simulated long-term wood demand data to determine future roundwood harvests in each country, and (2) the changes in the spatial distribution of woody biomass stocks to determine how much of the resource is available to satisfy the simulated wood demands. First, three global timber market models are reviewed and compared in order to select a suitable economic model to generate wood demand scenario data for the forest sector in LandSHIFT. The comparison indicates that the Global Forest Products Model (GFPM) is most suitable for obtaining projections of future roundwood harvests for further study with the LandSHIFT forest sector. Accordingly, the GFPM is adapted and applied to simulate wood demands for the global forestry sector until 2050, conditional on selected scenarios from the Millennium Ecosystem Assessment and the Global Environment Outlook. Secondly, the Lund-Potsdam-Jena (LPJ) dynamic global vegetation model is used to simulate the change in potential vegetation carbon stocks for the forested locations in LandSHIFT. The LPJ data are used in combination with spatially explicit forest inventory data on aboveground biomass to allocate the demands for raw forest products and identify locations of deforestation. Using the previous results as input, a methodology to simulate the spatial dynamics of forests based on wood extraction is developed within the LandSHIFT framework. The land-use allocation procedure specified in the module translates the country-level demands for forest products into woody biomass requirements for forest areas and allocates these on a five arc-minute grid.
In a first version, the model assumes present-day conditions throughout the entire study period and does not explicitly address forest age structure. Although the module is at a very preliminary stage of development, it already captures the effects of important drivers of land-use change such as cropland and urban expansion. As a first plausibility test, the module's performance is tested under three forest management scenarios; the module responds to changing inputs in an expected and consistent manner. The entire methodology is applied in an exemplary scenario analysis for India. Several future research priorities need to be addressed, in particular the incorporation of plantation establishment, the issue of age-structure dynamics, and the implementation of a new technology-change factor in the GFPM that would allow the substitution of raw wood products (especially fuelwood) by non-wood products to be specified.
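As an illustration of what a land-use allocation procedure of the kind described above might do, the sketch below greedily distributes a country's roundwood demand over grid cells ranked by harvestable biomass. The field names, cell data and greedy rule are assumptions for illustration, not the LandSHIFT module itself.

```python
def allocate_wood_demand(cells, country_demand_m3):
    """Greedy sketch: satisfy a country's roundwood demand from the cells with the
    highest harvestable biomass first, recording how much is extracted per cell.
    `cells` is a list of dicts with 'id' and 'harvestable_m3' (hypothetical fields)."""
    remaining = country_demand_m3
    extraction = {}
    for cell in sorted(cells, key=lambda c: c["harvestable_m3"], reverse=True):
        if remaining <= 0:
            break
        take = min(cell["harvestable_m3"], remaining)
        extraction[cell["id"]] = take
        remaining -= take
    return extraction, remaining  # remaining > 0 means the demand could not be met

# Example: three 5-arc-minute cells and a national demand of 1200 m3.
cells = [{"id": "A", "harvestable_m3": 800},
         {"id": "B", "harvestable_m3": 500},
         {"id": "C", "harvestable_m3": 300}]
print(allocate_wood_demand(cells, 1200))
```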
Abstract:
DIADEM, created by THOMSON-CSF, is a methodology for specifying and developing user interfaces. It improves the productivity of the interface development process as well as the quality of the interface. The method supports user interface development in three respects: (1) DIADEM defines the roles of the people involved and their tasks, and organises the sequence of activities; (2) it provides graphical formalisms supporting information exchange between people; and (3) it offers a basic set of rules for optimum human-machine interfaces. The use of DIADEM in three areas (process control, sales support, and multimedia presentation) was observed and evaluated by our laboratory within the European project DIAMANTA (ESPRIT P20507). The method provides an open procedure that leaves room for adaptation to a specific application and environment. This paper gives an overview of DIADEM and shows how its formalisms can be extended for developing multimedia interfaces.
Optimal Methodology for Synchronized Scheduling of Parallel Station Assembly with Air Transportation
Abstract:
We present an optimal methodology for the synchronized scheduling of production assembly with air transportation to achieve accurate delivery at minimized cost in a consumer electronics supply chain (CESC). The problem was motivated by a major PC manufacturer in the consumer electronics industry that must schedule deliveries to meet customer needs in different parts of South East Asia. The overall problem is decomposed into two sub-problems: an air transportation allocation problem and an assembly scheduling problem. The air transportation allocation problem is formulated as a linear programming problem with earliness and tardiness penalties for job orders. The assembly scheduling problem requires sequencing the job orders on the assembly stations so as to minimize their waiting times before they are shipped by flight to their destinations; hence this second sub-problem is modelled as a scheduling problem with earliness penalties, which are assumed to be independent of the job orders.
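To show what an earliness-tardiness objective looks like as a linear program, here is a minimal sketch using scipy.optimize.linprog. The job data and penalty weights are hypothetical, and the formulation omits the assembly-sequencing and flight-capacity constraints of the actual model; it is not the paper's formulation.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical jobs: earliest possible completion time and flight due time, in hours.
p = np.array([3.0, 5.0, 2.0])   # earliest completion per job
d = np.array([4.0, 4.5, 6.0])   # due times (departure of the assigned flight)
alpha, beta = 1.0, 4.0          # earliness / tardiness penalty weights (tardiness penalised harder)
n = len(p)

# Decision vector per job: [completion c_j, earliness E_j, tardiness T_j].
cost = np.tile([0.0, alpha, beta], n)
A_ub, b_ub = [], []
for j in range(n):
    row_e = np.zeros(3 * n); row_e[3 * j] = -1; row_e[3 * j + 1] = -1   # -c_j - E_j <= -d_j
    row_t = np.zeros(3 * n); row_t[3 * j] = 1;  row_t[3 * j + 2] = -1   #  c_j - T_j <=  d_j
    A_ub += [row_e, row_t]
    b_ub += [-d[j], d[j]]
bounds = [(p[j], None) if k == 0 else (0, None) for j in range(n) for k in range(3)]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x.reshape(n, 3))   # per job: completion time, earliness, tardiness
```

Without the sequencing constraints, each job simply completes at its due time when feasible; the sketch only illustrates how earliness and tardiness enter the objective linearly.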
Abstract:
Performance and manufacturability are two important issues that must be taken into account during MEMS design. Existing MEMS design models or systems follow a process-driven design paradigm, that is, design starts from the specification of a process sequence or the customization of a foundry-ready process template. There has been essentially no methodology or model that supports generic, high-level design synthesis for MEMS conceptual design; as a result, a basis for specifying the initial process sequences is lacking. To address this problem, this paper proposes a performance-driven, microfabrication-oriented methodology for MEMS conceptual design. A unified behaviour representation method is proposed which incorporates information on both physical interactions and chemical/biological/other reactions. Based on this method, a behavioural-process-based design synthesis model is proposed which exploits multidisciplinary phenomena for design solutions, including both the structural components and their configuration for the MEMS device, as well as the substances necessary for the chemical/biological/other reactions. The model supports both forward and backward synthetic search for suitable phenomena. To ensure manufacturability, a strategy of using microfabrication-oriented phenomena as design knowledge is proposed, where the phenomena are derived from existing MEMS devices that have associated MEMS-specific microfabrication processes or foundry-ready process templates. To test the applicability of the proposed methodology, the paper also studies microfluidic device design and uses a micro-pump design as a case study.
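To give a feel for what a backward synthetic search over phenomena can look like, the following is a minimal sketch over a hypothetical phenomena knowledge base. The phenomenon entries, field names and chaining rule are invented for illustration and are not the paper's actual behaviour representation.

```python
# Hypothetical phenomena library: each entry maps an input behaviour to an output behaviour.
PHENOMENA = [
    {"name": "electrostatic actuation", "input": "voltage", "output": "membrane deflection"},
    {"name": "membrane displacement pumping", "input": "membrane deflection", "output": "fluid flow"},
    {"name": "piezoelectric actuation", "input": "voltage", "output": "membrane deflection"},
]

def backward_search(desired_output, available_inputs, chain=None):
    """Backward chaining: find phenomenon chains whose final output is the desired behaviour
    and whose first input is something the designer can supply."""
    chain = chain or []
    results = []
    for ph in PHENOMENA:
        if ph["output"] == desired_output and ph not in chain:
            new_chain = [ph] + chain
            if ph["input"] in available_inputs:
                results.append(new_chain)
            else:
                results.extend(backward_search(ph["input"], available_inputs, new_chain))
    return results

# Example: concepts for a micro-pump (desired behaviour: fluid flow) driven by a voltage supply.
for chain in backward_search("fluid flow", {"voltage"}):
    print(" -> ".join(ph["name"] for ph in chain))
```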
Abstract:
A study of tin deposits from Priamurye (Russia) is performed to analyse the differences between them based on their origin and also on commercial criteria. A particular analysis based on vertical zonality is also given for samples from the Solnechnoe deposit. All the statistical analyses are carried out on the subcomposition formed by seven trace elements in cassiterite (In, Sc, Be, W, Nb, Ti and V), using Aitchison's methodology for the analysis of compositional data.
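As a pointer to what Aitchison's compositional approach involves, the sketch below applies a centred log-ratio (clr) transform to a hypothetical seven-element subcomposition before ordinary multivariate analysis; the concentrations are made-up values, not data from the study.

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform: log of each part divided by the geometric mean,
    so that standard multivariate statistics can be applied to closed (compositional) data."""
    x = np.asarray(composition, dtype=float)
    x = x / x.sum()                     # close the subcomposition to unit sum
    g = np.exp(np.log(x).mean())        # geometric mean of the parts
    return np.log(x / g)

# Hypothetical trace-element concentrations (ppm) in one cassiterite sample: In, Sc, Be, W, Nb, Ti, V.
sample = [120, 35, 8, 950, 210, 1800, 60]
print(clr(sample))
```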
Experience in introduction of English terminology in engineering lessons: methodology and evaluation
Abstract:
This communication describes an experience in introducing English terminology in a technical degree in higher education. We present the methodology and assessment procedures used to evaluate how students perceived the introduction of English terminology in two subjects from the 3rd and 5th year of a Computer Science degree in which English was not the vehicular language. We propose a strategy based on two main pillars: 1) the design of materials, explanations and exams, paying particular attention to the way in which the specific terminology was presented to the students, and 2) the assessment of the impact on the students through the analysis of feedback gathered in a set of surveys. Our experience showed that the students responded very positively to the introduction of English terminology and gave affirmative feedback about the impact that improving their linguistic abilities would have on their future work. Further, we present statistics regarding the use of English as the vehicular language for technical reports, which the students regard as very useful. Finally, we propose a set of questions for further debate, centred on the role that English terminology should play in technical degrees and on the way in which universities should deploy English-language resources across the different syllabi.
Abstract:
This paper proposes a behavior-based scheme for the high-level control of autonomous underwater vehicles (AUVs). Two main characteristics of the control scheme can be highlighted. Behavior coordination is done through a hybrid methodology that takes advantage of the robustness and modularity of competitive approaches as well as of optimized trajectories.
Abstract:
In this paper a novel methodology is introduced, aimed at minimizing the probability of network failure and the failure impact (in terms of QoS degradation) while optimizing resource consumption. A detailed study of MPLS recovery techniques and their GMPLS extensions is also presented. In this scenario, some features for reducing the failure impact while at the same time offering minimum failure probabilities are analyzed. Novel two-step routing algorithms using this methodology are proposed. Results show that these methods offer high protection levels with optimal resource consumption.
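One common way to cast the "minimum failure probability" part of such routing as a shortest-path problem is to use -log(1 - p) link weights, as sketched below. The topology, failure probabilities and the networkx-based formulation are illustrative assumptions, not the two-step algorithms proposed in the paper.

```python
import math
import networkx as nx

# Hypothetical topology: each link carries an independent failure probability p.
links = [("A", "B", 0.01), ("B", "D", 0.05), ("A", "C", 0.02), ("C", "D", 0.01), ("B", "C", 0.03)]

G = nx.Graph()
for u, v, p in links:
    # With independent link failures, maximizing prod(1 - p) along a path is equivalent
    # to minimizing sum(-log(1 - p)), which Dijkstra can handle as an additive weight.
    G.add_edge(u, v, weight=-math.log(1.0 - p), fail_prob=p)

path = nx.dijkstra_path(G, "A", "D")
availability = math.prod(1.0 - G[u][v]["fail_prob"] for u, v in zip(path, path[1:]))
print(path, f"path availability ~ {availability:.4f}")
```

A second step, such as choosing among candidate paths by residual bandwidth or failure impact, would then operate on the paths produced by this kind of metric.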
Abstract:
When underwater vehicles navigate close to the ocean floor, computer vision techniques can be applied to obtain motion estimates. A complete system to create visual mosaics of the seabed is described in this paper. Unfortunately, the accuracy of the constructed mosaic is difficult to evaluate, so the use of a laboratory setup to obtain an accurate error measurement is proposed. The system consists of a robot arm carrying a downward-looking camera. A pattern formed by a white background and a matrix of black dots uniformly distributed over the surveyed scene is used to find the exact image registration parameters. When the robot executes a trajectory (simulating the motion of a submersible), an image sequence is acquired by the camera. The motion estimated from the robot's encoders is refined by detecting, to subpixel accuracy, the black dots in the image sequence and computing the 2D projective transform which relates two consecutive images. The pattern is then substituted by a poster of the sea floor and the trajectory is executed again, acquiring the image sequence used to test the accuracy of the mosaicking system.
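For reference, the 2D projective transform (homography) between consecutive frames can be estimated from matched dot centroids as sketched below with OpenCV; the point arrays are placeholders, not the paper's calibration data or registration pipeline.

```python
import numpy as np
import cv2

# Placeholder correspondences: centroids of the same black dots detected (to subpixel
# accuracy) in two consecutive images, as (x, y) pixel coordinates.
pts_prev = np.array([[102.3, 55.1], [310.8, 60.4], [108.9, 241.7], [315.2, 247.0]], dtype=np.float32)
pts_curr = np.array([[110.1, 58.9], [318.4, 66.2], [115.7, 245.3], [322.6, 252.8]], dtype=np.float32)

# Estimate the 3x3 homography relating the two views; RANSAC discards mismatched dots.
H, inliers = cv2.findHomography(pts_prev, pts_curr, cv2.RANSAC, 1.0)
print(H)

# Warping the previous frame with H registers it onto the current one when building the mosaic:
# registered = cv2.warpPerspective(prev_image, H, (width, height))
```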
Abstract:
It has been shown that the accuracy of mammographic abnormality detection methods is strongly dependent on the breast tissue characteristics, where a dense breast drastically reduces detection sensitivity. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) the segmentation of the breast area into fatty versus dense mammographic tissue; 2) the extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation ( and 0.67 for the two data sets) between automatic and expert-based Breast Imaging Reporting and Data System mammographic density assessment
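As a generic illustration of the third step, a Bayesian combination of several classifiers' posterior probabilities can look like the sketch below, which assumes conditional independence between classifiers. The class labels, priors and probability values are invented, and this is not the specific combination rule used in the paper.

```python
import numpy as np

def bayesian_combination(posteriors, priors):
    """Combine per-classifier posteriors assuming conditional independence:
    p(class | all outputs) is proportional to prod_i p_i(class) / prior(class)^(k-1),
    then renormalised to sum to one."""
    posteriors = np.asarray(posteriors, dtype=float)   # shape (n_classifiers, n_classes)
    k = posteriors.shape[0]
    combined = np.prod(posteriors, axis=0) / np.power(priors, k - 1)
    return combined / combined.sum()

# Hypothetical four density classes and three classifiers' posteriors for one mammogram.
priors = np.array([0.25, 0.25, 0.25, 0.25])
outputs = [
    [0.10, 0.55, 0.25, 0.10],
    [0.05, 0.60, 0.30, 0.05],
    [0.15, 0.45, 0.30, 0.10],
]
print(bayesian_combination(outputs, priors))   # the second class gets the highest combined probability
```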
Abstract:
Long-term survival (perdurabilidad) and strategy are issues that significantly influence the productive sectors. Research groups, as the main axis of science, technology and innovation, are no strangers to this: they need to strengthen their structure, achieve superior performance and understand themselves as enterprises. This research seeks to design, for the factors of human talent and productivity, a desired scenario and the management strategies that favour the long-term survival of private health research groups. The hypothesis from which the analysis was carried out is that private research groups face greater threats to their survival because they lack their own funded human-talent structure or the structural process support needed to compete for the available research initiatives. This hypothesis is addressed for the private research groups belonging to the Programa Nacional de Ciencia y Tecnología de la Salud (health sciences knowledge area) through the question: what are the possible scenarios for achieving the long-term survival of private health research groups, based on strategies related to the factors of human talent and productivity? The work was developed in two phases: a pilot study, in which the strategic sector was explored using the prospective tools MICMAC and SMIC, and an intervention phase using a combined methodology of AESE application, a DEA analysis and a simulation of the scenarios found and desired.