942 results for Projects Analysis
Abstract:
The research work reported in this Thesis was carried out along two main lines of research. The first and main line concerns the synthesis of heteroaromatic compounds with increasing steric hindrance, with the aim of preparing stable atropisomers. The main tools used for the study of these dynamic systems, as described in the Introduction, are DNMR coupled with line-shape simulation, and DFT calculations aimed at conformational analysis, i.e., the prediction of the geometries of, and energy barriers to, the transition states. These techniques have been applied to research projects on:
• atropisomers of arylmaleimides;
• atropisomers of 4-arylpyrazolo[3,4-b]pyridines;
• the study of the intramolecular NO2/CO interaction in solution;
• the study of 2-arylpyridines.
In parallel with the main project, and in collaboration with other groups, a second line of research on the determination of absolute configuration was pursued. The products, deriving from organocatalytic reactions, in many cases could not be analyzed by X-ray diffraction, making it necessary to develop a protocol based on spectroscopic methodologies: NMR, circular dichroism, and computational tools (DFT, TD-DFT) were combined for this purpose. This Thesis reports the determination of the absolute configuration of:
• substituted 1,2,3,4-tetrahydroquinolines;
• compounds from the enantioselective Friedel-Crafts alkylation-acetalization cascade of naphthols with α,β-unsaturated cyclic ketones;
• substituted 3,4-annulated indoles.
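To illustrate the kind of quantity these DNMR studies yield, the rate constant obtained from line-shape simulation is commonly converted into a free-energy barrier with the Eyring equation. The sketch below is a generic illustration with made-up example numbers, not code or data from the Thesis.

```python
# Hedged illustration: converting a first-order rate constant from DNMR
# line-shape simulation into a free-energy barrier via the Eyring
# equation, DeltaG = R*T*ln(kB*T / (k*h)). Example numbers are invented.
import math

R  = 8.314462618      # gas constant, J mol^-1 K^-1
KB = 1.380649e-23     # Boltzmann constant, J K^-1
H  = 6.62607015e-34   # Planck constant, J s

def barrier_kj_per_mol(rate_constant: float, temperature: float) -> float:
    """Free-energy barrier (kJ/mol) for a rate constant (s^-1) at T (K)."""
    return R * temperature * math.log(KB * temperature / (rate_constant * H)) / 1000.0

# e.g. an exchange rate of 100 s^-1 at 298 K gives a barrier of ~61.6 kJ/mol
print(f"{barrier_kj_per_mol(100.0, 298.0):.1f} kJ/mol")
```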
Abstract:
The purpose of this dissertation is to reflect on the meaning of feminism in order to decide whether the movement needs a rebranding. The first part will focus on the history of feminism to clarify its beliefs and goals. The following part will highlight how feminism gained a bad reputation over time and will show Elle’s attempt in November 2013 to launch a project to rebrand the movement. The last part will explain what present-day feminism consists of by listing some of the latest projects and the most important issues the movement has to tackle. My analysis will finally show that the F-movement needs a rebranding and, in order to be effective, men should join women’s fight for equality. “We should all be feminists”.
Abstract:
Changes in marine net primary productivity (PP) and export of particulate organic carbon (EP) are projected over the 21st century with four global coupled carbon cycle-climate models. These include representations of marine ecosystems and the carbon cycle of different structure and complexity. All four models show a decrease in global mean PP and EP of between 2 and 20% by 2100 relative to preindustrial conditions for the SRES A2 emission scenario. Two different regimes for productivity changes are consistently identified in all models. The first chain of mechanisms is dominant in the low- and mid-latitude ocean and in the North Atlantic: reduced input of macro-nutrients into the euphotic zone, related to enhanced stratification, reduced mixed-layer depth, and slowed circulation, causes a decrease in macro-nutrient concentrations and in PP and EP. The second regime is projected for parts of the Southern Ocean: an alleviation of light and/or temperature limitation leads to an increase in PP and EP as productivity is fueled by a sustained nutrient input. A region of disagreement among the models is the Arctic, where three models project an increase in PP while one model projects a decrease. Projected changes in seasonal and interannual variability are modest in most regions. Regional model skill metrics are proposed to generate multi-model mean fields that show improved skill in representing observation-based estimates compared to a simple multi-model average. Model results are also compared to recent productivity projections obtained with three different algorithms that are usually applied to infer net primary production from satellite observations.
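The paper's regional skill metrics are not reproduced here, but the combination step they feed into can be sketched generically: each model's field is weighted by a skill score before averaging. The inverse-RMSE weighting below is an assumption chosen for illustration, not the metric defined in the study.

```python
# Hedged sketch of a skill-weighted multi-model mean. The skill score
# (inverse RMSE against an observational field) is an assumed stand-in
# for the study's actual regional skill metrics.
import numpy as np

def skill_weighted_mean(model_fields, observations):
    """model_fields: list of 2-D arrays, one per model; observations: 2-D array."""
    weights = np.array([
        1.0 / np.sqrt(np.nanmean((field - observations) ** 2))  # higher skill -> larger weight
        for field in model_fields
    ])
    weights /= weights.sum()
    return sum(w * f for w, f in zip(weights, model_fields))
```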
Abstract:
‘where the land is greener’ looks at soil and water conservation from a global perspective. In total, 42 soil and water conservation technologies and 28 approaches are described – each fully illustrated with photographs, graphs and line drawings – as applied in case studies in more than 20 countries around the world. This unique presentation of case studies draws on WOCAT’s extensive database, gathered in over 12 years of field experience. The book is intended as a prototype for national and regional compilations of sustainable land management practices – a practical instrument for making field knowledge available to decision makers. Various land use categories are covered, from crop farming to grazing and forestry. The technologies presented range from terrace-building to agroforestry systems; from rehabilitation of common pastures to conservation agriculture; from vermiculture to water harvesting. Several of these technologies are already well-established successes – others are innovative, relatively unknown, but full of promise. Descriptions of the various technologies are complemented by studies of the ‘approaches’ that have underpinned their development and dissemination. Some of these approaches were developed specifically for individual projects; others developed and spread spontaneously in fascinating processes that offer a new perspective for development policy. In addition to the case studies, the book includes two analytical sections on the technologies and approaches under study. By identifying common elements of success, these analyses offer hope for productive conservation efforts at the local level with simultaneous global environmental benefits. Policy pointers for decision makers and donors offer a new impetus for further investment – to make the land greener.
Abstract:
This thesis is composed of three life-cycle analysis (LCA) studies of manufacturing to determine cumulative energy demand (CED) and greenhouse gas emissions (GHG). The methods proposed could reduce the environmental impact by reducing the CED in three manufacturing processes. First, industrial symbiosis is proposed, and an LCA is performed on both conventional 1 GW-scaled hydrogenated amorphous silicon (a-Si:H)-based single junction and a-Si:H/microcrystalline-Si:H tandem cell solar PV manufacturing plants and such plants coupled to silane recycling plants. Using a recycling process that reduces silane loss from 85 to only 17 percent results in CED savings of 81,700 GJ and 290,000 GJ per year for single and tandem junction plants, respectively. This recycling process reduces the cost of raw silane by 68 percent, or approximately $22.6 and $79 million per year for a single and tandem 1 GW PV production facility, respectively. The results show the environmental benefits of silane recycling centered around a-Si:H-based PV manufacturing plants. Second, an open-source self-replicating rapid prototyper or 3-D printer, the RepRap, has the potential to reduce the environmental impact of manufacturing of polymer-based products using a distributed manufacturing paradigm, an impact that is further minimized by the use of PV and improvements in PV manufacturing. Using 3-D printers for manufacturing provides the ability to ultra-customize products and to change fill composition, which increases material efficiency. An LCA was performed on three polymer-based products to determine the CED and GHG from conventional large-scale production, and the results were compared to experimental measurements on a RepRap producing identical products with ABS and PLA. The results of this LCA study indicate that the CED of manufacturing polymer products can possibly be reduced using distributed manufacturing with existing 3-D printers at under 89% fill, and reduced even further with a solar photovoltaic system. The results indicate that the ability of RepRaps to vary fill has the potential to diminish the environmental impact of many products. Third, one additional way to improve the environmental performance of this distributed manufacturing system is to create the polymer filament feedstock for 3-D printers using post-consumer plastic bottles. An LCA was performed on the recycling of high density polyethylene (HDPE) using the RecycleBot. The results of the LCA showed that distributed recycling has a lower CED than the best-case scenario used for centralized recycling. If this process were applied to the HDPE currently recycled in the U.S., more than 100 million MJ of energy could be conserved per annum, along with significant reductions in GHG. This presents a novel path to a future of distributed manufacturing suited for both the developed and developing world, with reduced environmental impact. From improving manufacturing in the photovoltaic industry with the use of recycling, to recycling and manufacturing plastic products within our own homes, each step reduces the impact on the environment. The three coupled projects presented here show a clear potential to reduce the environmental impact of manufacturing and other processes by implementing complementary systems, which have environmental benefits of their own, in order to achieve a compounding effect of reduced CED and GHG.
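As a purely illustrative sketch of the silane arithmetic in the first study: reducing silane loss from 85 to 17 percent avoids the embodied energy of the silane no longer wasted. The annual demand and embodied-energy figures below are hypothetical placeholders, not the thesis's inventory data.

```python
# Hedged illustration of the savings arithmetic. Both input constants are
# hypothetical placeholders; the thesis's actual LCA inventory is not used.
SILANE_DEMAND_KG = 1.0e6       # hypothetical silane throughput, kg per year
EMBODIED_MJ_PER_KG = 1000.0    # hypothetical cumulative energy demand of silane

def ced_savings_gj(loss_without=0.85, loss_with=0.17):
    avoided_kg = SILANE_DEMAND_KG * (loss_without - loss_with)
    return avoided_kg * EMBODIED_MJ_PER_KG / 1000.0   # MJ -> GJ

print(f"{ced_savings_gj():,.0f} GJ/year saved")  # 680,000 GJ with these inputs
```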
Resource-allocation capabilities of commercial project management software. An experimental analysis
Abstract:
When project managers determine schedules for resource-constrained projects, they commonly use commercial project management software packages. Which resource-allocation methods are implemented in these packages is proprietary information. The resource-allocation problem is in general computationally difficult to solve to optimality. Hence, the question arises whether and how various project management software packages differ in quality with respect to their resource-allocation capabilities. None of the few existing papers on this subject uses a sizeable data set and recent versions of common software packages. We experimentally analyze the resource-allocation capabilities of Acos Plus.1, AdeptTracker Professional, CS Project Professional, Microsoft Office Project 2007, Primavera P6, Sciforma PS8, and Turbo Project Professional. Our analysis is based on 1560 instances of the precedence- and resource-constrained project scheduling problem (RCPSP). The experiment shows that using the resource-allocation feature of these packages may lead to a project duration increase of almost 115% above the best known feasible schedule. The increase grows with increasing resource scarcity and with an increasing number of activities. We investigate the impact of different complexity scenarios and priority rules on the project duration obtained by the software packages. We provide a decision table to support managers in selecting a software package and a priority rule.
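The allocation heuristics inside these packages are proprietary, but the class of methods they are benchmarked against is well documented. A minimal sketch of a serial schedule-generation scheme with a shortest-processing-time priority rule, assuming a single renewable resource and an acyclic precedence graph, is shown below; it produces a feasible, not necessarily optimal, schedule.

```python
# Hedged sketch of a serial schedule-generation scheme for the RCPSP with
# an SPT priority rule. Single renewable resource; `horizon` must be large
# enough to hold the resulting schedule.
from typing import Dict, List

def serial_sgs(durations: Dict[int, int],
               predecessors: Dict[int, List[int]],
               demands: Dict[int, int],
               capacity: int,
               horizon: int) -> Dict[int, int]:
    """Return a feasible start time for every activity."""
    usage = [0] * horizon                 # resource units in use per period
    start: Dict[int, int] = {}
    unscheduled = set(durations)
    while unscheduled:
        # an activity is eligible once all its predecessors are scheduled
        eligible = [a for a in unscheduled
                    if all(p in start for p in predecessors.get(a, []))]
        a = min(eligible, key=lambda x: durations[x])   # SPT priority rule
        earliest = max((start[p] + durations[p]
                        for p in predecessors.get(a, [])), default=0)
        t = earliest                      # shift right until capacity suffices
        while any(usage[u] + demands[a] > capacity
                  for u in range(t, t + durations[a])):
            t += 1
        for u in range(t, t + durations[a]):
            usage[u] += demands[a]
        start[a] = t
        unscheduled.remove(a)
    return start

# tiny example: three activities, activity 3 must follow activity 1
print(serial_sgs({1: 2, 2: 3, 3: 2}, {3: [1]}, {1: 1, 2: 2, 3: 1},
                 capacity=2, horizon=20))
```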
Abstract:
Software metrics offer us the promise of distilling useful information from vast amounts of software in order to track development progress, to gain insights into the nature of the software, and to identify potential problems. Unfortunately, however, many software metrics exhibit highly skewed, non-Gaussian distributions. As a consequence, usual ways of interpreting these metrics --- for example, in terms of "average" values --- can be highly misleading. Many metrics, it turns out, are distributed like wealth --- with high concentrations of values in selected locations. We propose to analyze software metrics using the Gini coefficient, a higher-order statistic widely used in economics to study the distribution of wealth. Our approach allows us not only to observe changes in software systems efficiently, but also to assess project risks and monitor the development process itself. We apply the Gini coefficient to numerous metrics over a range of software projects, and we show that many metrics not only display remarkably high Gini values, but that these values are remarkably consistent as a project evolves over time.
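A minimal sketch of the proposed measurement, using the standard sorted-values formula for the Gini coefficient (the example metric values are hypothetical):

```python
# Gini coefficient of a metric distribution, as proposed above. Assumes
# non-negative values, which holds for typical size and coupling metrics.
def gini(values):
    xs = sorted(values)
    n, total = len(xs), sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n, for ascending x_i
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

# e.g. lines of code per class in a hypothetical system: one class
# dominates, so the coefficient is high (about 0.68 here)
print(gini([10, 12, 15, 40, 400]))
```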
Abstract:
Gaining economic benefits from substantially lower labor costs has been reported as a major reason for offshoring labor-intensive information systems services to low-wage countries. However, if wage differences are so high, why is there such a high level of variation in the economic success between offshored IS projects? This study argues that offshore outsourcing involves a number of extra costs for the client organization that account for the economic failure of offshore projects. The objective is to disaggregate these extra costs into their constituent parts and to explain why they differ between offshored software projects. The focus is on software development and maintenance projects that are offshored to Indian vendors. A theoretical framework is developed a priori based on transaction cost economics (TCE) and the knowledge-based view of the firm, complemented by factors that acknowledge the specific offshore context. The framework is empirically explored using a multiple case study design including six offshored software projects in a large German financial service institution. The results of our analysis indicate that the client incurs post-contractual extra costs for four types of activities: (1) requirements specification and design, (2) knowledge transfer, (3) control, and (4) coordination. In projects that require a high level of client-specific knowledge about idiosyncratic business processes and software systems, these extra costs were found to be substantially higher than in projects where more general knowledge was needed. Notably, these costs most often arose independently from the threat of opportunistic behavior, challenging the predominant TCE logic of market failure. Rather, the client extra costs were particularly high in client-specific projects because the effort for managing the consequences of the knowledge asymmetries between client and vendor was particularly high in these projects. Prior experiences of the vendor with related client projects were found to reduce the level of extra costs but could not fully offset the increase in extra costs in highly client-specific projects. Moreover, cultural and geographic distance between client and vendor as well as personnel turnover were found to increase client extra costs. Slight evidence was found, however, that the cost-increasing impact of these factors was also leveraged in projects with a high level of required client-specific knowledge (moderator effect).
Abstract:
We present the results of an investigation into the nature of information needs of software developers who work in projects that are part of larger ecosystems. This work is based on a quantitative survey of 75 professional software developers. We corroborate the results identified in the survey with needs and motivations proposed in a previous survey and discover that tool support for developers working in an ecosystem context is even more meager than we thought: mailing lists and internet search are the most popular tools developers use to satisfy their ecosystem-related information needs.
Abstract:
Software developers are often unsure of the exact name of the method they need to use to invoke the desired behavior in a given context. This results in a process of searching for the correct method name in documentation, which can be lengthy and distracting to the developer. We can decrease the method search time by enhancing the documentation of a class with its most frequently used methods. Usage frequency data for methods are gathered by analyzing other projects from the same ecosystem, written in the same language and sharing dependencies. We implemented a proof of concept of the approach for Pharo Smalltalk and Java. In Pharo Smalltalk, methods are commonly searched for using a code browser tool called "Nautilus", and in Java using a web browser displaying HTML-based documentation, Javadoc. We developed plugins for both browsers and gathered method usage data from open source projects, in order to increase developer productivity by reducing method search time. A small initial evaluation has been conducted, showing promising results in improving developer productivity.
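A rough sketch of the frequency-gathering step is shown below, using Python's ast module as a stand-in corpus analyzer; the actual plugins targeted Pharo Smalltalk and Java sources instead.

```python
# Hedged sketch: count how often each method name is invoked across the
# source files of a project corpus. Python stands in for the ecosystem's
# language; only call sites of the form receiver.method(...) are counted.
import ast
from collections import Counter
from pathlib import Path

def method_call_frequencies(project_root: str) -> Counter:
    counts: Counter = Counter()
    for path in Path(project_root).rglob("*.py"):
        try:
            tree = ast.parse(path.read_text(encoding="utf-8"))
        except (SyntaxError, UnicodeDecodeError):
            continue                      # skip files that do not parse
        for node in ast.walk(tree):
            if isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute):
                counts[node.func.attr] += 1
    return counts

# the most frequently used methods could then be surfaced first in the
# class documentation, e.g.:
# method_call_frequencies("path/to/ecosystem/projects").most_common(10)
```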
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker states to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g., the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements, and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
Abstract:
Project justification is regarded as one of the major methodological deficits in Data Warehousing practice. As reasons for applying inappropriate methods, performing incomplete evaluations, or even entirely omitting justifications, the special nature of Data Warehousing benefits and the large portion of infrastructure-related activities are stated. In this paper, the economic justification of Data Warehousing projects is analyzed, and first results from a large academia-industry collaboration project in the field of non-technical issues of Data Warehousing are presented. As conceptual foundations, the role of the Data Warehouse system in corporate application architectures is analyzed, and the specific properties of Data Warehousing projects are discussed. Based on an applicability analysis of traditional approaches to economic IT project justification, basic steps and responsibilities for the justification of Data Warehousing projects are derived.
Abstract:
Study objective. This was a secondary data analysis of a study designed and executed in two phases in order to investigate several questions: Why aren't more investigators conducting successful cross-border research on human health issues? What are the barriers to conducting this research? What interventions might facilitate cross-border research? Methods. Key informant interviews and focus groups were used in Phase One, and structured questionnaires in Phase Two. A multi-question survey was created based on the findings of the focus groups and distributed to a wider circle of researchers and academics for completion. The data were entered and analyzed using SPSS software. Setting. El Paso, TX, located on the U.S.-Mexico border. Participants. Individuals from local academic institutions and the State Department of Health. Results. From the transcribed data of the focus groups, eight major themes emerged: Political Barriers, Language/Cultural Barriers, Differing Goals, Geographic Issues, Legal Barriers, Technology/Material Issues, Financial Barriers, and Trust Issues. Using these themes, the questionnaire was created. The response rate for the questionnaires was 47%. The largest obstacles revealed by this study were identifying a funding source for the project (47% agreeing or strongly agreeing), difficulties paying a foreign counterpart (33% agreeing or strongly agreeing), and administrative changes in Mexico (31% agreeing or strongly agreeing). Conclusions. Many U.S. investigators interested in cross-border research have been discouraged in their efforts by varying barriers. The majority of respondents in the survey felt financial issues and changes in Mexican governments were the most significant obstacles. While some of these barriers can be overcome simply by collaboration among motivated groups, other barriers may be more difficult to remove. Although more evaluation of this research question is warranted, the information obtained through this study is sufficient to support creation of a Cross-Border Research Resource Manual to be used by individuals interested in conducting research with Mexico.
Abstract:
The Greenland Ice Sheet Project 2 (GISP2) core can enhance our understanding of the relationship between parameters measured in the ice in central Greenland and variability in the ocean, atmosphere, and cryosphere of the North Atlantic Ocean and adjacent land masses. Seasonal (summer, winter) to annual responses of δD and deuterium excess (d) isotopic signals in the GISP2 core to the seesaw in winter temperatures between West Greenland and northern Europe from A.D. 1840 to 1970 are investigated. This seesaw represents extreme modes of the North Atlantic Oscillation (NAO), which also influences sea surface temperatures (SSTs), atmospheric pressures, geostrophic wind strength, and sea ice extents beyond the winter season. Temperature excursions inferred from the δD record during seesaw/extreme NAO mode years move in the same direction as the West Greenland side of the seesaw. Symmetry with the West Greenland side of the seesaw suggests a possible mechanism for damping in the ice core record of the lowest decadal temperatures experienced in Europe from A.D. 1500 to 1700. Seasonal and annual deuterium excess excursions during seesaw years show negative correlation with δD. This suggests an isotopic response to a SST/land temperature seesaw. The isotopic record from GISP2 may therefore give information on both ice sheet and sea surface temperature variability. Cross-plots of δD and d show a tendency for data to be grouped according to the prevailing mode of the seesaw, but do not provide unambiguous identification of individual seesaw years. A combination of ice core and tree ring data sets may allow more confident identification of GA and GB (extreme NAO mode) years prior to 1840.
Abstract:
The data collection "Deep Drilling of Glaciers: Soviet-Russian projects in Arctic, 1975-1995" was compiled according to the following basic considerations:
- compilation of deep (>100 m) drilling projects on Arctic glaciers, using data from (a) publications; (b) archives of IGRAN; (c) personal communication with project participants;
- documentation of parameters and references. The accuracy of the data and of the techniques applied to determine different parameters is not evaluated. The accuracy of some geochemical parameters (up to 1984, and heavy metals) is uncertain. Most reconstructions of ice core age and of annual layer thickness are discussed;
- digitizing of published diagrams (in cases where the original numerical data were lost) and subsequent conversion of the data to equal-range series and adjustment to common units. The equal-range series were therefore calculated from original data or converted from digitized chart values, as indicated in the metadata. For methodological purposes, the equal-range series obtained from original and reconstructed data were compared repeatedly; the systematic difference was less than 5-7%. Special attention should be given to the fact that the data for individual ice core parameters vary, because some parameters were originally measured or registered. Parameters were converted into equal-range series using 2 m steps;
- if two or more parameter values were determined for a depth interval, the mean-weighted (i.e., accounting for sample length) value is assigned to the entire interval;
- if one parameter value was determined, measured or registered independently from the parameter values in the depth intervals that over- and underlie it, the value is assigned to the entire interval;
- if one parameter value was determined, measured or registered for two adjoining depth intervals, the value is assigned to the depth interval that represents >75% of the sample length; if each of the adjoining depth intervals represents <75% of the sample length, the value is assigned to both depth intervals.
This collection of ice core data (version 2000) was made available through the EU-funded QUEEN project by S.M. Arkhipov, Moscow.
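A minimal sketch of the equal-range conversion rules described above, assuming samples are given as (top, bottom, value) depth intervals in metres; the >75% rule is applied in the two-interval case named in the text, and the length-weighted mean otherwise.

```python
# Hedged sketch of the 2 m equal-range resampling described in the data
# set documentation. Interval handling is simplified to the cases named
# in the text; sample tuples are (top_depth, bottom_depth, value).
import math

STEP = 2.0  # metres

def to_equal_range(samples, max_depth):
    n_bins = math.ceil(max_depth / STEP)
    sums, weights = [0.0] * n_bins, [0.0] * n_bins

    def add(b, value, weight):
        sums[b] += value * weight
        weights[b] += weight

    for top, bottom, value in samples:
        length = bottom - top
        first = int(top // STEP)
        last = min(int((bottom - 1e-9) // STEP), n_bins - 1)
        if last == first + 1:
            in_first = (first + 1) * STEP - top
            if in_first / length > 0.75:          # >75% rule: assign wholly
                add(first, value, length)
                continue
            if (length - in_first) / length > 0.75:
                add(last, value, length)
                continue
        for b in range(first, last + 1):          # length-weighted split
            overlap = min(bottom, (b + 1) * STEP) - max(top, b * STEP)
            add(b, value, overlap)

    # mean-weighted value per 2 m interval; None where no sample overlaps
    return [s / w if w > 0 else None for s, w in zip(sums, weights)]
```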