Abstract:
Measurements of 87Sr/86Sr ratios of interstitial waters from Leg 25, Site 245 and Leg 38, Site 336 of the Deep Sea Drilling Project show that the enrichment of Sr²⁺ with depth is caused both by the alteration of volcanic material and by the introduction of strontium derived from calcium carbonate. 87Sr/86Sr ratios range from 0.70913 to 0.70794 at Site 245 and from 0.70916 to 0.70694 at Site 336. The low ratios compared with contemporaneous seawater reflect the release of Sr from a volcanic source having, according to material-balance calculations, a 87Sr/86Sr ratio of about 0.7034 at Site 336. At this site the source appears to be volcanic ash and not basaltic basement, which acts as a sink for Sr²⁺ during in situ low-temperature weathering. The volcanic contribution to the strontium enrichment in the basal interstitial waters varies from <10% at Site 245 to >50% at Site 336. The remaining Sr²⁺ is derived from Sr-rich biogenic carbonate during diagenetic recrystallization to form Sr-poor calcite.
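As a rough illustration of the material-balance reasoning invoked above, the sketch below solves a simple two-end-member mixing equation for the fraction of pore-water Sr contributed by a volcanic source. It is not the calculation used in the study; all numerical inputs are hypothetical placeholders.

    # Two-end-member Sr isotope mass balance (illustrative sketch, hypothetical values).
    # r_mix:       measured pore-water 87Sr/86Sr
    # r_volcanic:  assumed volcanic end-member ratio
    # r_carbonate: assumed carbonate/seawater end-member ratio
    def volcanic_fraction(r_mix, r_volcanic, r_carbonate):
        """Fraction of Sr from the volcanic end member in a two-component mixture."""
        return (r_carbonate - r_mix) / (r_carbonate - r_volcanic)

    print(volcanic_fraction(r_mix=0.7070, r_volcanic=0.7034, r_carbonate=0.7090))
    # -> ~0.36, i.e. about a third of the Sr from the volcanic source (illustrative only)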
Abstract:
In this paper we present a guidance and support strategy for young people from vulnerable urban and rural populations who have completed high school and are involved in an extension program called "Equity and guidance: the challenge of a proposal in state educational institutions from the La Plata, Berisso and Ensenada districts". This program gives them a space for reflection in which to analyze their personal, social and community situation as a self-managed process towards a life project. For some time we have supported young people with the purpose of helping them build a life project. In doing so, we have repeatedly noticed the need to find strategies that provide reasons to carry on with the project of completing high school, to seek new knowledge, to request scholarships, and to think of alternatives suited to their circumstances. In this sense, the project "Equity and Guidance: the challenge of following up young graduates" is a revealing experience, since it shows the urgent need to provide real means that enable the communities we work with to make the most of opportunities. For young people who belong to a vulnerable community, a life project does not mean just thinking about the choice of careers, training or jobs; it also means finding how to make it possible in their daily life. Everything we do, think and manage in this project has the same purpose: to help them build a life project in keeping with their own subjectivity.
Abstract:
One hundred and twenty point counts of Oligocene to Recent sands and sandstones from DSDP sites in the Japan and Mariana intraoceanic forearc and backarc basins demonstrate that there is a clear compositional difference between the continentally influenced Japan forearc and backarc sediments and the totally oceanic Mariana forearc and backarc sediments. Japan forearc sediments average 10 QFL%Q, 0.82 P/F, 2 Framework%Mica, 74 LmLvLst%Lv, and 19 LmLvLst%Lst. In contrast, the Mariana forearc and backarc sediments average 0 QFL%Q, 1.00 P/F, 0 Framework%Mica, 98 LmLvLst%Lv, and 1 LmLvLst%Lst. Sediment compositions in the Japan region are variable. The Honshu forearc sediments average 5 QFL%Q, 0.94 P/F, 1 Framework%Mica, 82 LmLvLst%Lv, and 15 LmLvLst%Lst. The Yamato Basin sediments (DSDP Site 299) average 13 QFL%Q, 0.70 P/F, 3 Framework%Mica, 78 LmLvLst%Lv, and 14 LmLvLst%Lst. The Japan Basin sediments (DSDP Site 301) average 24 QFL%Q, 0.54 P/F, 9 Framework%Mica, 58 LmLvLst%Lv, and 21 LmLvLst%Lst. P/F and Framework%Mica are higher in the Yamato Basin sediments than in the forearc sediments due to an increase in the modal potassium content of volcanic rocks from east to west across the island of Honshu. Site 301 has higher QFL%Q and LmLvLst%Lst, and lower LmLvLst%Lv, than Site 299 because it receives sediment from the Asian mainland as well as from the island of Honshu. DSDP Site 293 sediments, in the Mariana region, average 0.97 P/F, 1 Framework%Mica, 13 LmLvLst%Lm and 83 LmLvLst%Lv, owing to their proximity to the island of Luzon. The remaining Mariana forearc and backarc sediments show a uniform composition.
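The parameters reported above are normalized modal percentages and ratios derived from raw point counts. The sketch below shows how such parameters can be computed from a single point-count tally; the grain categories and counts are hypothetical, not data from the cited DSDP sites.

    # Hypothetical point-count tally for one sample (grain category -> number of points).
    counts = {"Qm": 8, "Qp": 2, "P": 45, "K": 5, "Lv": 120, "Lm": 4, "Lst": 12, "Mica": 4}

    Q = counts["Qm"] + counts["Qp"]                    # total quartzose grains
    F = counts["P"] + counts["K"]                      # total feldspar
    L = counts["Lv"] + counts["Lm"] + counts["Lst"]    # total lithic fragments

    QFL_Q = 100.0 * Q / (Q + F + L)                                  # QFL%Q
    P_over_F = counts["P"] / F                                       # P/F
    framework_mica = 100.0 * counts["Mica"] / sum(counts.values())   # Framework%Mica
    LmLvLst_Lv = 100.0 * counts["Lv"] / L                            # LmLvLst%Lv

    print(round(QFL_Q), round(P_over_F, 2), round(framework_mica), round(LmLvLst_Lv))
    # -> 5 0.9 2 88 (illustrative values only)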
Abstract:
This proposal corresponds to a university extension project accredited and funded by the Secretaría de Extensión Universitaria of the Universidad Nacional de La Plata. This inter-institutional and interdisciplinary work plan formed part of the transfer activities of our latest research project, entitled La Orientación Vocacional Ocupacional en escuelas denominadas de alta vulnerabilidad psicosocial (Vocational and Occupational Guidance in schools designated as highly psychosocially vulnerable).
Abstract:
North Atlantic sediment records (MD95-2042) and Greenland (Greenland Ice Core Project (GRIP)) and Antarctic (Byrd and Vostok) ice core climate records have been synchronized over marine isotopic stage 3 (MIS 3) (64 to 24 kyr B.P.) (Shackleton et al., 2000). The resulting common timescale suggested that the MD95-2042 benthic δ18O fluctuations were synchronous with temperature changes in Antarctica (δD or δ18O ice records). In order to assess the persistence of this result we use here the recent Greenland NorthGRIP ice core, which covers the last glacial inception. We transfer the Antarctic Vostok GT4 timescale to the NorthGRIP δ18O and MD95-2042 planktonic δ18O records and precisely quantify all the relative timing uncertainties. During the rapid warming of Dansgaard-Oeschger 24, the MD95-2042 benthic δ18O decrease is in phase with the planktonic δ18O decrease, and therefore with the NorthGRIP temperature increase, but it takes place 1700 ± 1100 years after the Antarctic warming. The present study thus reveals that the results obtained previously for MIS 3 cannot be generalized, and demonstrates the need to improve common chronologies for marine and polar archives.
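A phase lag such as 1700 ± 1100 years reflects the combination of several independent relative-dating uncertainties. A minimal sketch of the usual quadrature combination is shown below; the individual uncertainty terms are invented placeholders, not the values used in the study.

    import math

    # Hypothetical independent 1-sigma uncertainty terms (in years) entering a
    # marine-to-ice-core phase estimate: timescale transfer, sampling resolution,
    # sediment age-model smoothing, and synchronization tie points.
    terms = {"timescale_transfer": 800, "resolution": 400, "age_model": 500, "tie_points": 450}

    combined = math.sqrt(sum(v ** 2 for v in terms.values()))
    print(f"combined 1-sigma uncertainty ~ {combined:.0f} years")  # ~1119 years (illustrative)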
Abstract:
Recent efforts to link the isotopic composition of snow in Greenland with meteorological and climatic parameters have indicated that relatively local information, such as observed annual temperatures from coastal Greenland sites, as well as more synoptic-scale features, such as the North Atlantic Oscillation (NAO) and the temperature seesaw between Jakobshavn, Greenland, and Oslo, Norway, are significantly correlated with δ18O and δD values from the past few hundred years measured in ice cores. In this study we review those efforts and then use a new record of isotope values from the Greenland Ice Sheet Project 2 and Greenland Ice Core Project sites at Summit, Greenland, to compare with meteorological and climatic parameters. This new record consists of six individual annually resolved isotopic records which have been averaged to produce a Summit stacked isotope record. The stacked record is significantly correlated with local Greenland temperatures over the past century (r=0.471), as well as with a number of other records, including temperatures and pressures from specific locations and temperature and pressure patterns such as the temperature seesaw and the North Atlantic Oscillation. A multiple linear regression of the stacked isotope record against a number of meteorological and climatic parameters in the North Atlantic region reveals that five variables contribute significantly to the variance in the isotope record: winter NAO, solar irradiance (as recorded by sunspot numbers), average Greenland coastal temperature, sea surface temperature in the moisture source region for Summit (30°-20°N), and the annual temperature seesaw between Jakobshavn and Oslo. Combined, these variables yield a correlation coefficient of r=0.71, explaining half of the variance in the stacked isotope record.
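The stacking, correlation and multiple-regression steps described above can be sketched as follows. The arrays are random placeholders rather than the Summit data; the point is only to show the workflow of averaging annually resolved series, correlating the stack with a climate series, and fitting a multivariate linear model.

    import numpy as np

    rng = np.random.default_rng(0)
    n_years = 100

    # Six hypothetical annually resolved isotope series (stand-ins for the six cores).
    cores = rng.normal(size=(6, n_years))
    stack = cores.mean(axis=0)                       # the "stacked" record

    # Correlation of the stack with a hypothetical local temperature series.
    temperature = rng.normal(size=n_years)
    r = np.corrcoef(stack, temperature)[0, 1]

    # Multiple linear regression of the stack on five hypothetical predictors
    # (winter NAO, sunspot number, coastal temperature, source-region SST, seesaw).
    X = np.column_stack([np.ones(n_years)] + [rng.normal(size=n_years) for _ in range(5)])
    coef, *_ = np.linalg.lstsq(X, stack, rcond=None)
    r_multiple = np.corrcoef(X @ coef, stack)[0, 1]  # multiple correlation coefficient
    print(round(r, 3), round(r_multiple, 3))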
Abstract:
This paper describes the ways and means of assembling and quality controlling the Irminger Sea and Iceland Sea time-series biogeochemical data which are included in the CARINA data set. The Irminger Sea and the Iceland Sea are hydrographically different regions where measurements of seawater carbon and nutrient chemistry were started in 1983. The sampling is seasonal, four times a year. The carbon chemistry is studied with measurements of the partial pressure of carbon dioxide in seawater, pCO2, and total dissolved inorganic carbon, TCO2. The carbon chemistry data are for surface waters only until 1991, when water column sampling was initiated. Other measured parameters are salinity, dissolved oxygen and the inorganic nutrients nitrate, phosphate and silicate. Because the CARINA criteria for secondary quality control require data from depths >1500 m, the IRM-TS could not be included in the routine QC and the IS-TS only in a limited way. However, with the information provided here, the quality of the data can be assessed, e.g. on the basis of the results obtained with the use of reference materials.
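As an illustration of the screening implied by the secondary quality-control criterion (comparisons restricted to samples deeper than 1500 m), the sketch below filters a hypothetical station profile; the column layout and values are assumptions, not the CARINA data format.

    # Hypothetical water-column samples for one station: (depth_m, TCO2_umol_kg).
    samples = [(10, 2090.0), (250, 2115.5), (800, 2140.2), (1600, 2158.7), (2400, 2162.1)]

    # Secondary QC (crossover analysis) uses deep samples only, here depth > 1500 m,
    # where water properties are stable enough for station-to-station comparison.
    deep = [(z, tco2) for z, tco2 in samples if z > 1500]
    mean_deep_tco2 = sum(tco2 for _, tco2 in deep) / len(deep)
    print(len(deep), round(mean_deep_tco2, 1))   # -> 2 2160.4 (illustrative)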
Abstract:
Hierarchical clustering. Taxonomic assignment of reads was performed using a preexisting database of SSU rDNA sequences including XXX reference sequences generated by Sanger sequencing. Experimental amplicons (reads), sorted by abundance, were then concatenated with the extracted reference sequences, sorted by decreasing length. All sequences, experimental and reference, were then clustered at 85% identity using the global alignment clustering option of the uclust module of the usearch v4.0 software (Edgar, 2010). Each 85% cluster was then reclustered at a higher stringency level (86%), and so on (87%, 88%, ...) in a hierarchical manner up to 100% similarity. Each experimental sequence was then identified by the list of clusters to which it belonged at the 85% to 100% levels. This information can be viewed as a matrix in which the rows correspond to the different sequences and the columns correspond to the cluster membership at each clustering level. Taxonomic assignment for a given read was performed by first checking whether any reference sequences clustered with the experimental sequence at the 100% clustering level. If this was the case, the last common taxonomic name of the reference sequence(s) within the cluster was used to assign the environmental read. If not, the same procedure was applied to the clusters from 99% down to 85% similarity, if necessary, until a cluster was found containing both the experimental read and reference sequence(s), in which case sequences were taxonomically assigned as described above.
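The assignment procedure amounts to walking down a cluster-membership matrix from the 100% level towards the 85% level and stopping at the first level at which the read shares a cluster with one or more reference sequences. A compact sketch of that lookup logic follows; the cluster identifiers, the toy taxonomy table and the consensus step are simplified placeholders, not the uclust/usearch output format.

    # membership[seq_id][level] -> cluster id at that identity level (85..100), precomputed.
    # Hypothetical memberships for one read and two reference sequences.
    membership = {
        "read_1": {100: "c9", 99: "c7", 98: "c5", 97: "c3"},
        "ref_A":  {100: "c8", 99: "c7", 98: "c5", 97: "c3"},
        "ref_B":  {100: "c6", 99: "c4", 98: "c5", 97: "c3"},
    }
    taxonomy = {"ref_A": ["Eukaryota", "Alveolata", "Dinophyceae"],
                "ref_B": ["Eukaryota", "Alveolata", "Ciliophora"]}

    def assign(read):
        for level in range(100, 84, -1):            # 100%, 99%, ... down to 85%
            cluster = membership[read].get(level)
            if cluster is None:
                continue
            refs = [r for r in taxonomy if membership[r].get(level) == cluster]
            if refs:
                # Keep the deepest taxonomic rank shared by all co-clustered references.
                common = []
                for ranks in zip(*(taxonomy[r] for r in refs)):
                    if len(set(ranks)) != 1:
                        break
                    common.append(ranks[0])
                return level, (common[-1] if common else "unassigned")
        return None, "unassigned"

    print(assign("read_1"))   # -> (99, 'Dinophyceae'): shares cluster c7 with ref_A at 99%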
Abstract:
Rare earth element (REE), major, and trace element abundances and relative fractionations in forty nodular cherts sampled by the Deep Sea Drilling Project (DSDP) and Ocean Drilling Program (ODP) indicate that the REE composition of chert records the interplay between terrigenous sources and scavenging from the local seawater. Major and (non-REE) trace element ratios indicate that the aluminosilicate fraction within the chert is similar to NASC (North American Shale Composite), with average Pacific chert including ~7% NASC-like particles, Indian chert ~11% NASC, Atlantic chert ~17% NASC, and southern high latitude (SHL) chert ~53% NASC. Using La as a proxy for ΣREE, approximations of excess La (the amount of La in excess of that supplied by the detrital aluminosilicate fraction) indicate that Pacific chert contains the greatest excess La (85% of total La) and SHL chert the least (38% of total La). As shown by interelement associations, this excess La is most likely a component adsorbed onto aluminosilicate and phosphatic phases. Accordingly, chert from the large Pacific Ocean, where deposition occurs relatively removed from significant terrigenous input, records a depositional REE signal dominated by adsorption of dissolved REEs from seawater. Pacific chert has Ce/Ce* <<1 and normative La/Yb ~0.8-1, resulting from adsorption from local Ce-depleted seawater and preferential adsorption of LREEs from seawater (e.g., normative La/Yb ~0.4), which increases the normative La/Yb ratio recorded in chert. Chert from the Atlantic basin, a moderately sized ocean basin lined by passive margins and with more terrigenous input than the Pacific, records a mix of adsorptive and terrigenous REE signals, with moderately negative Ce anomalies and normative La/Yb ratios intermediate between those of the Pacific and those of terrigenous input. Chert from the SHL region is dominated by the large terrigenous input from the Antarctic passive margin, with inherited Ce/Ce* ~1 and inherited normative La/Yb values of ~1.2-1.4. Ce/Ce* does not vary with age, either throughout the entire database or within a particular basin. Overall, Ce/Ce* does not correlate with P2O5 concentrations, even though phosphatic phases may be important REE carriers.
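The "excess La" estimate described above is a simple detrital correction: subtract from total La the amount attributable to the NASC-like aluminosilicate fraction. A minimal sketch follows; the concentrations and the shale reference value are hypothetical placeholders chosen only to mimic the quoted Pacific proportions, not data from the paper.

    import math

    # Hypothetical inputs: bulk chert La, an assumed NASC La value, and the fraction of
    # NASC-like detritus estimated from major/trace element ratios (~7% for Pacific chert).
    la_total_ppm = 14.5
    la_nasc_ppm = 31.0          # assumed shale reference value
    detrital_fraction = 0.07

    la_detrital = detrital_fraction * la_nasc_ppm
    la_excess = la_total_ppm - la_detrital
    print(f"excess La = {la_excess:.1f} ppm ({100 * la_excess / la_total_ppm:.0f}% of total La)")

    # One common convention for the Ce anomaly, on shale-normalized (subscript N) values:
    ce_n, la_n, pr_n = 0.35, 1.0, 0.95   # hypothetical normalized concentrations
    ce_anomaly = ce_n / math.sqrt(la_n * pr_n)
    print(f"Ce/Ce* = {ce_anomaly:.2f}")  # << 1, i.e. a negative Ce anomaly (illustrative)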
Abstract:
The acquisition of technical, contextual and behavioral competences is a prerequisite for the sustainable development and strengthening of rural communities. A territorial display of the status of these competences helps to design the necessary learning, so its inclusion in planning processes is useful for decision making. The article discusses the application of a visual representation of competences in a rural development project with Aymara women's communities in Peru. The results show an improvement in transparency and dialogue, resulting in more successful project management and a strengthening of social organization.
Abstract:
Purpose: The purpose of this paper is to present what kinds of elements and evaluation methods should be included in a framework for evaluating the achievements and impacts of transport projects supported in EC Framework Programmes (FP). Further, the paper discusses the possibilities of such an evaluation framework for producing recommendations regarding future transport research and policy objectives, as well as mutual learning as a basis for strategic long-term planning. Methods: The paper describes the two-dimensional evaluation methodology developed in the course of the FP7 METRONOME project. The dimensions are: (1) achievement of project objectives and targets at different levels and (2) research project impacts according to four impact groups. The methodology uses four complementary approaches in evaluation, namely evaluation matrices, coordinator questionnaires, lead-user interviews and workshops. Results: Based on the methodology testing, with a sample of FP5 and FP6 projects, the main results relating to the rationale, implementation and achievements of FP projects are presented. In general, achievement of objectives in both FPs was good. The strongest impacts were identified within the impact group of management and co-ordination. The scientific and end-user impacts of the projects were also adequate, but wider societal impacts were quite modest. The paper concludes with a discussion of both the theoretical and practical implications of the proposed methodology and by presenting some relevant future research needs.
Abstract:
Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects concerning the efficiency, the accuracy and the reliability of the analysis results still need to be investigated further. This thesis tackles this need from the following four perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses that keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first extension is to consider array accesses (in addition to object fields), while the second extension focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to re-analyze fragments of code that are not affected by the changes. During software development programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of the cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools themselves. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed tool cooperation can be used to automatically produce verified resource guarantees. (4) Distribution and concurrency are today mainstream. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units, and they communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, keeps the cost of the diverse distributed components separate.
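The incremental idea in point (2) can be illustrated independently of the actual COSTA machinery: keep a per-method cost summary that refers symbolically to callee costs, and after a change re-evaluate only the methods whose bounds (transitively) depend on the changed one. The sketch below is a toy model with invented method names and costs, not the thesis' algorithm or data structures.

    # Toy cost summaries: each method has a local cost plus counted calls to other methods.
    # summaries[m] = (local_cost, {callee: number_of_calls})
    summaries = {
        "main":  (5, {"parse": 1, "solve": 2}),
        "parse": (10, {}),
        "solve": (3, {"parse": 1}),
    }
    cache = {}

    def cost(method):
        """Close the summaries into concrete upper bounds, memoizing intermediate results."""
        if method not in cache:
            local, calls = summaries[method]
            cache[method] = local + sum(n * cost(callee) for callee, n in calls.items())
        return cache[method]

    def update(method, new_summary):
        """Incremental step: install a new summary and invalidate only the cached bounds
        of methods that (transitively) call the changed method."""
        summaries[method] = new_summary
        affected, changed = {method}, True
        while changed:
            changed = False
            for m, (_, calls) in summaries.items():
                if m not in affected and affected & set(calls):
                    affected.add(m)
                    changed = True
        for m in affected:
            cache.pop(m, None)

    print(cost("main"))           # 5 + 1*10 + 2*(3 + 10) = 41
    update("parse", (20, {}))     # only parse, solve and main are invalidated and recomputed
    print(cost("main"))           # 5 + 1*20 + 2*(3 + 20) = 71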
Abstract:
This document presents an innovative formal educational initiative aimed at enhancing the development of engineering students' specific competences while studying the Project Management (PM) subject. The framework of the experience combines (1) theoretical concepts, (2) the development of a real-case project carried out by multidisciplinary groups from three different universities, (3) the use of web 2.0 software tools and (4) group and individual assignments for students who play different roles (project managers and team members). In this scenario, the study focuses on monitoring the communication competence in the ever-growing virtual PM environment. Factors such as body language, technical means, stage, and PM-specific vocabulary, among others, have been considered in order to assess the students' performance on this issue. As its main contribution, the paper introduces an ad hoc rubric that, based on previous investigations, has been adapted to and tested for the first time in this new and specific context. Additionally, the research conducted has provided some interesting findings that suggest further actions to improve and better define future rubrics, oriented to communication or even other competences. Regarding the PM subject specifically, it has been found that students playing the role of Project Managers strengthen their competences more than those who play the role of Team Members. It has also been found that students have more difficulty assimilating concepts related to risk and quality management, whereas concepts related to the scope, time and cost knowledge areas have been better assimilated.
Abstract:
This paper presents a new hazard-consistent ground motion characterization of the Itoiz dam site, located in northern Spain. Firstly, we propose a methodology with different levels of approximation to the expected ground motion at the dam site. Secondly, we apply this methodology taking into account the particular characteristics of the site and of the dam. Hazard calculations were performed following the Probabilistic Seismic Hazard Assessment method using a logic tree, which accounts for different seismic source zonings and different ground-motion attenuation relationships. The study was done in terms of peak ground acceleration and several spectral accelerations at periods coinciding with the fundamental vibration periods of the dam. In order to estimate these ground motions we consider two different dam conditions: when the dam is empty (T = 0.1 s) and when it is filled with water to its maximum capacity (T = 0.22 s). Additionally, the seismic hazard analysis is done for two return periods: 975 years, related to the project earthquake, and 4,975 years, identified with an extreme event. Soil conditions at the dam site were also taken into account. Through the proposed methodology we address different ways of characterizing ground motion at the study site. In a first step, we obtain the uniform hazard response spectra for the two return periods. In a second step, a disaggregation analysis is done in order to obtain the controlling earthquakes that can affect the dam. Subsequently, we characterize the ground motion at the dam site in terms of specific response spectra for target motions defined by the expected values SA(T) for T = 0.1 and 0.22 s and the return periods of 975 and 4,975 years, respectively. Finally, synthetic acceleration time histories for earthquake events matching the controlling parameters are generated using the discrete wave-number method and subsequently analyzed. Because of the short relative distances between the controlling earthquakes and the dam site, we considered finite sources in these computations. We conclude that directivity effects should be taken into account as an important variable in this kind of study of ground motion characteristics.
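The two return periods quoted above map onto exceedance probabilities over a chosen exposure time through the usual Poisson assumption, P = 1 - exp(-t/T). A minimal sketch of that conversion is given below; the 100-year exposure time is an illustrative choice, not a value taken from the paper.

    import math

    def prob_exceedance(return_period_years, exposure_years):
        """Poisson model: probability of at least one exceedance during the exposure time."""
        return 1.0 - math.exp(-exposure_years / return_period_years)

    for T in (975, 4975):
        p = prob_exceedance(T, 100.0)
        print(f"T = {T} yr -> {100 * p:.0f}% probability of exceedance in 100 years")
    # -> ~10% and ~2% respectively (illustrative exposure time)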
Abstract:
This dissertation discusses how different practitioners define project success and success factors for software projects and products. The motivation for this work is to identify the ways in which software practitioners value and define project success. This can have implications for both practitioner motivation and software development productivity. Accordingly, in this work we are interested in the various perceptions of the term "success" held by different software practitioners and researchers. To get this information we performed a systematic mapping of recent years' software development literature, trying to identify stakeholders' perceptions about the success of a project as well as possible differences among the views of the various stakeholders of a project. Some common terms related to project success (success project; software project success factors) were considered in formulating the search strings. The results were limited to twenty-two selected peer-reviewed conference papers and journal articles published between 2003 and 2012.