966 results for Project method


Relevance:

30.00%

Publisher:

Abstract:

The sampling area was extended to the south-western Black Sea coastal area, from Cape Kaliakra toward the Bosphorus. Samples were collected along four transects. The whole dataset is composed of 17 samples (from 10 stations) with data on mesozooplankton species composition, abundance and biomass. Sampling for zooplankton was performed from the bottom up to the surface, at depths depending on water column stratification and thermocline depth. These data are organised within the project "Control of eutrophication, hazardous substances and related measures for rehabilitating the Black Sea ecosystem: Phase 2: Leg I: PIMS 3065". The data report is not published. Zooplankton samples were collected with a vertical closing Juday net (diameter 36 cm, mesh size 150 µm). Tows were performed from the surface down to near-bottom depths in discrete layers. Samples were preserved in a 4% buffered formaldehyde-seawater solution. Sampling volume was estimated by multiplying the net mouth area by the wire length. Mesozooplankton abundance: the collected material was analysed using the method of Dimov (1959). Samples were brought to a volume of 25-30 ml, depending on zooplankton density, and mixed intensively until all organisms were distributed randomly in the sample volume. A 5 ml aliquot was then taken and poured into the counting chamber, a rectangular chamber for taxonomic identification and counting. Large (> 1 mm body length) and less abundant species were counted in the whole sample. Counting and measuring of organisms were done in the Dimov chamber under a stereomicroscope, to the lowest taxon possible. Taxonomic identification was done at the Institute of Oceanology by Kremena Stefanova using the relevant taxonomic literature (Mordukhay-Boltovskoy, F.D. (Ed.), 1968, 1969, 1972). Taxon-specific abundance: the collected material was analysed in the same way; copepods and cladocerans were identified and enumerated to species level, while the other mesozooplankters were identified and enumerated at a higher taxonomic level (commonly named mesozooplankton groups).
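As a worked illustration of the abundance arithmetic described above (filtered volume from net mouth area and wire length, subsample counts scaled to the whole sample), here is a minimal sketch; the net diameter, subsample volume and sample volume come from the abstract, while the tow depth and count are hypothetical:

```python
import math

# Illustrative numbers; the abstract gives the net diameter (36 cm),
# the subsample volume (5 ml) and the sample volume (25-30 ml).
net_diameter_m = 0.36
wire_length_m = 50.0              # assumed tow depth
mouth_area_m2 = math.pi * (net_diameter_m / 2) ** 2

# Filtered volume = mouth area x wire length (as in the abstract).
filtered_volume_m3 = mouth_area_m2 * wire_length_m

sample_volume_ml = 25.0           # sample brought to 25-30 ml
subsample_volume_ml = 5.0         # 5 ml counted in the Dimov chamber
count_in_subsample = 120          # hypothetical count of one taxon

# Scale the subsample count to the whole sample, then to 1 m^3.
count_in_sample = count_in_subsample * sample_volume_ml / subsample_volume_ml
abundance_ind_m3 = count_in_sample / filtered_volume_m3
print(f"{abundance_ind_m3:.1f} ind/m^3")
```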

Relevance:

30.00%

Publisher:

Abstract:

A quick new method is described for the quantification of absolute nannofossil abundances in deep-sea sediments. This method (SMS) combines Spiking a sample with Microbeads and Spraying it on a cover slide. It is suitable for both scanning electron microscope (SEM) and light microscope (LM) analyses. Repeated preparation and counting of the same sample (30 times) revealed a standard deviation of 10.5%. The application of tracer microbeads with different diameters and densities revealed no statistically significant differences between counts. The SMS method yielded coccolith numbers that are not statistically significantly different from values obtained with the filtration method. However, coccolith counts obtained by the random settling method are three times higher than the values obtained by the SMS and filtration methods.
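The bead-spiking arithmetic behind such a method reduces to a count ratio; the sketch below uses invented counts and masses purely to show the computation:

```python
# Hypothetical sketch of the microbead-spiking arithmetic: the absolute
# nannofossil concentration follows from the fossil/bead count ratio.
beads_added = 5.0e6          # known number of microbeads spiked in
sediment_mass_g = 0.05       # dry sediment mass in the spiked sample

coccoliths_counted = 300     # counted in the same fields of view
beads_counted = 150

# Counted ratio x total beads = total coccoliths; normalise per gram.
coccoliths_total = coccoliths_counted / beads_counted * beads_added
conc_per_g = coccoliths_total / sediment_mass_g
print(f"{conc_per_g:.3e} coccoliths per g sediment")
```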

Relevance:

30.00%

Publisher:

Abstract:

The adaptation to the European Higher Education Area (EHEA) is becoming a great challenge for the university community, especially for its teaching and research staff, which is actively involved in the teaching-learning process. It is also inducing a paradigm change for lecturers and students. Among the methodologies used in teaching innovation processes, systems thinking plays an important role, working mainly with mind maps, and focuses on highlighting the essence of the knowledge while allowing its visual representation. In this paper, a method for using these mind maps to organise a particular subject is explained. This organisation is completed with the definition of durations, precedence relationships and resources for each of the subject's activities, as well as with their corresponding monitoring. Mind maps are generated with the MINDMANAGER package, whilst Ms-PROJECT is used for establishing task relationships, durations, resources and monitoring. Summarising, this paper describes a procedure, and the necessary set of applications, for self-organising and managing timed, scheduled teaching tasks.
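The paper relies on MindManager and Ms-PROJECT rather than code; purely as an illustration of what the duration/precedence definition computes, here is a minimal forward-pass scheduling sketch over hypothetical teaching activities:

```python
# Minimal forward-pass scheduling sketch (hypothetical activities):
# the earliest start of a task is the latest earliest finish among its
# predecessors, which is the computation a scheduler like MS Project
# performs from durations and precedence relationships.
durations = {"syllabus": 2, "mind_map": 3, "lectures": 10, "assessment": 2}
predecessors = {"syllabus": [], "mind_map": ["syllabus"],
                "lectures": ["mind_map"], "assessment": ["lectures"]}

earliest_finish = {}
for task in durations:   # insertion order is already topological here
    start = max((earliest_finish[p] for p in predecessors[task]), default=0)
    earliest_finish[task] = start + durations[task]
    print(f"{task}: start day {start}, finish day {earliest_finish[task]}")
```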

Relevance:

30.00%

Publisher:

Abstract:

The estimation of power losses due to wind turbine wakes is crucial to understanding overall wind farm economics. This is especially true for large offshore wind farms, where wakes represent the primary source of losses in available power: the regular arrangement of rotors, their generally larger diameter and the lower ambient turbulence level all conspire to dramatically affect wake expansion and, consequently, the power deficit. Simulation of wake effects in offshore wind farms (in reasonable computational time) is currently feasible using CFD tools. An elliptic CFD model based on the actuator disk method and various RANS turbulence closure schemes is tested and validated using power ratios extracted from the Horns Rev and Nysted wind farms, collected as part of the EU-funded UPWIND project. The primary focus of the present work is on turbulence modelling, as turbulent mixing is the main mechanism for flow recovery inside wind farms. A higher-order approach, based on the anisotropic RSM model, is tested to better account for the imbalance between the length scales inside and outside of the wake, which is not well reproduced by current two-equation closure schemes.
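The elliptic RANS model itself cannot be reproduced in a snippet; as a baseline for the wake power-deficit idea (explicitly not the paper's method), here is the classic Jensen top-hat wake model, whose crude linear-expansion assumption is what CFD approaches like the one above are meant to improve upon:

```python
import math

def jensen_deficit(u_inf, ct, rotor_d, x, k=0.04):
    """Velocity deficit a distance x downstream of a turbine
    (Jensen top-hat model; k ~ 0.04 offshore is an assumed value)."""
    a = 0.5 * (1.0 - math.sqrt(1.0 - ct))    # induction from thrust coeff.
    dw = rotor_d + 2.0 * k * x               # linearly expanding wake
    return u_inf * 2.0 * a * (rotor_d / dw) ** 2

u_wake = 8.0 - jensen_deficit(u_inf=8.0, ct=0.8, rotor_d=80.0, x=560.0)
# Power scales with the cube of wind speed, hence the strong deficit.
print(f"waked speed: {u_wake:.2f} m/s, power ratio: {(u_wake / 8.0) ** 3:.2f}")
```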

Relevance:

30.00%

Publisher:

Abstract:

Fission product yields are fundamental parameters for several nuclear engineering calculations and in particular for burn-up/activation problems. The impact of their uncertainties was widely studied in the past and evaluations were released, although still incomplete. Recently, the nuclear community expressed the need for full fission yield covariance matrices to produce inventory calculation results that take into account the complete uncertainty data. In this work, we studied and applied a Bayesian/generalised least-squares method for covariance generation, and compared the generated uncertainties to the original data stored in the JEFF-3.1.2 library. Then, we focused on the effect of fission yield covariance information on fission pulse decay heat results for thermal fission of 235U. Calculations were carried out using different codes (ACAB and ALEPH-2) after introducing the new covariance values. Results were compared with those obtained with the uncertainty data currently provided by the library. The uncertainty quantification was performed with the Monte Carlo sampling technique. Indeed, correlations between fission yields strongly affect the statistics of decay heat.

Introduction: Nowadays, any engineering calculation performed in the nuclear field should be accompanied by an uncertainty analysis. In such an analysis, different sources of uncertainties are taken into account. Works such as those performed under the UAM project (Ivanov, et al., 2013) treat nuclear data as a source of uncertainty, in particular cross-section data, for which uncertainties given in the form of covariance matrices are already provided in the major nuclear data libraries. Meanwhile, fission yield uncertainties were often neglected or treated shallowly, because their effects were considered of second order compared to cross-sections (Garcia-Herranz, et al., 2010). However, the Working Party on International Nuclear Data Evaluation Co-operation (WPEC)
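As a hedged sketch of the Monte Carlo sampling step mentioned in the abstract, correlated fission-yield samples can be drawn from a covariance matrix as below; the yields and covariances are toy numbers, not JEFF-3.1.2 data:

```python
import numpy as np

# Toy illustration of Monte Carlo sampling of correlated fission yields.
# Values are invented; real covariances would come from the Bayesian/GLS
# update of the JEFF-3.1.2 data described in the abstract.
mean_yields = np.array([0.0600, 0.0300])
cov = np.array([[1.0e-6, -4.0e-7],        # negative correlation: the
                [-4.0e-7, 4.0e-7]])       # yields must sum consistently

rng = np.random.default_rng(42)
samples = rng.multivariate_normal(mean_yields, cov, size=10000)

# Each sampled yield set would feed one inventory/decay-heat calculation;
# the spread of those results gives the decay-heat uncertainty.
print("sampled std devs:", samples.std(axis=0))
print("sampled correlation:", np.corrcoef(samples.T)[0, 1])
```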

Relevance:

30.00%

Publisher:

Abstract:

The comparison of the different bids submitted in the tender for a project, under the traditional contract system of open measurement and closed unit rates, requires analysis tools capable of discriminating between proposals that, while having a similar overall amount, may have a very different economic impact during the execution of the works. One situation not easily detected by traditional methods is the behaviour of the actual cost in response to changes in the quantities actually executed on site with respect to those estimated in the project. This paper proposes to address this situation through a quantitative risk analysis technique, the Monte Carlo method. This procedure, as is well known, allows the input data defining the problem to vary within defined probability functions, generates a large number of test cases, and treats the results statistically to obtain the most probable final values, together with the parameters needed to measure the reliability of the estimate. We present a model for the comparison of bids, designed so that it can be applied to real cases by imposing on the known data variation conditions that are easy to set up by the professionals who perform these tasks.
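A minimal sketch of the proposed comparison, under assumed figures: two bids with identical totals at the estimated quantities diverge once the executed quantities are allowed to vary:

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000

# Hypothetical tender: estimated quantities and two bids' unit rates.
est_qty = np.array([1000.0, 500.0, 200.0])     # measured work items
bid_a = np.array([10.0, 20.0, 50.0])           # both bids total 30000
bid_b = np.array([12.0, 18.0, 45.0])           # at the estimated quantities

# Let executed quantities vary +-20% around the project estimate.
qty = est_qty * rng.uniform(0.8, 1.2, size=(n_trials, 3))
cost_a, cost_b = qty @ bid_a, qty @ bid_b

print(f"bid A: mean {cost_a.mean():.0f}, std {cost_a.std():.0f}")
print(f"bid B: mean {cost_b.mean():.0f}, std {cost_b.std():.0f}")
print(f"P(A cheaper than B) = {(cost_a < cost_b).mean():.2f}")
```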

Relevance:

30.00%

Publisher:

Abstract:

The main goal of the LIFE project New4Old (LIFE10 ENV/ES/439), co-funded by the European Commission, is to define the most appropriate method and the best available practice in social housing rehabilitation under energy and environmental sustainability criteria, as well as to apply innovative technologies in the fight against climate change through an efficient use of resources and energy. The institutions involved in the project are the Technological Centre AITEMIN, the Madrid Polytechnic University (UPM), the Portugal Technological Centre for Ceramics and Glass (CTCV) and the Zaragoza City Housing Society (SMZV). The demonstrator project consists of the energy rehabilitation of a rental social housing building located in Zaragoza's historic quarter, according to the conclusions and strategies developed for the LIFE project. In actions taken in households of this nature, passive design strategies are essential due to the limited income of the owners, who often cannot afford energy bills. Therefore, the proposed actions will help improve the building's passive performance and reach a higher level of thermal comfort without increasing the economic cost linked to energy consumption.

Relevance:

30.00%

Publisher:

Abstract:

Concentrating Photovoltaics (CPV) is an alternative to flat-plate module photovoltaic (PV) technology. The bankability of CPV projects is an important issue to pave the way toward a swift and sustained growth in this technology. The bankability of a PV plant is generally addressed through the modeling of its energy yield under a baseline loss scenario, followed by an on-site measurement campaign aimed at verifying its energy performance. This paper proposes a procedure for assessing the performance of a CPV project, articulated around four main successive steps: Solar Resource Assessment, Yield Assessment, Certificate of Provisional Acceptance, and Certificate of Final Acceptance. This methodology allows the long-term energy production of a CPV project to be estimated with an associated uncertainty of ≈5%. To our knowledge, no such method has been proposed to the CPV industry yet, and this critical situation has hindered or made impossible the completion of several important CPV projects undertaken in the world. The main motive for this proposed method is to bring a practical solution to this urgent problem. This procedure can be operated under a wide range of climatic conditions, and makes it possible to assess the bankability of a CPV plant whose design uses any of the technologies currently available on the market. The method is also compliant with both international standards and local regulations. In consequence, its applicability is both general and international.
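As a hedged illustration of the yield-assessment step, independent uncertainty sources are commonly combined in quadrature and converted into exceedance values; the individual percentages below are assumptions, and only the ≈5% combined order of magnitude comes from the abstract:

```python
import math

# Illustrative, assumed uncertainty sources (one sigma, % of yield);
# only the ~5% combined figure is stated in the abstract.
sources = {"solar resource": 3.5, "plant model": 2.5, "availability": 2.0}

# Independent sources combine in quadrature.
sigma = math.sqrt(sum(s ** 2 for s in sources.values()))
p50_gwh = 100.0                                  # hypothetical P50 yield

# P90 = value exceeded with 90% probability (z = 1.282, normal model).
p90_gwh = p50_gwh * (1.0 - 1.282 * sigma / 100.0)
print(f"combined sigma: {sigma:.1f}%  ->  P90: {p90_gwh:.1f} GWh")
```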

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose a novel filter for feature selection. This filter relies on the estimation of the mutual information between features and classes. We bypass the estimation of the probability density function with the aid of the entropic-graphs approximation of the Rényi entropy, and the subsequent approximation of the Shannon entropy. The complexity of this bypassing process does not depend on the number of dimensions but on the number of patterns/samples, and thus the curse of dimensionality is circumvented. We show that it is then possible to outperform a greedy algorithm based on the maximal relevance and minimal redundancy criterion. We successfully test our method in the contexts of both image classification and microarray data classification.
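A hedged sketch of the entropic-graph idea: the length of the Euclidean minimal spanning tree over the samples estimates the Rényi entropy of order alpha = (d - gamma)/d without any density estimate. The constant beta below is dimension-dependent and set to 1 as a placeholder, so only relative comparisons are meaningful here:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def renyi_entropy_mst(x, gamma=1.0, beta=1.0):
    """Entropic-graph estimate of the Renyi entropy of order
    alpha = (d - gamma)/d from the Euclidean MST length.
    beta is a dimension-dependent constant (assumed 1 here)."""
    n, d = x.shape
    alpha = (d - gamma) / d
    w = squareform(pdist(x)) ** gamma          # pairwise edge weights
    mst_len = minimum_spanning_tree(w).sum()   # total MST length
    return (d / gamma) * (np.log(mst_len / n ** alpha) - np.log(beta))

rng = np.random.default_rng(1)
print(renyi_entropy_mst(rng.normal(size=(500, 2))))        # wider cloud ->
print(renyi_entropy_mst(0.1 * rng.normal(size=(500, 2))))  # lower entropy
```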

Relevance:

30.00%

Publisher:

Abstract:

Several recent works deal with 3D data in mobile robotic problems, e.g., mapping. The data come from many kinds of sensors (time-of-flight cameras, the Kinect or 3D lasers) that provide a huge amount of unorganised 3D points. In this paper we detail an efficient approach to building complete 3D models using a soft computing method, the Growing Neural Gas (GNG). As neural models deal easily with noise, imprecision, uncertainty and partial data, the GNG provides better results than other approaches. The resulting GNG is then applied to a sequence of scenes. We present a comprehensive study of the GNG parameters to ensure the best result at the lowest time cost. From this GNG structure we propose to calculate planar patches, thus obtaining a fast method to compute the movement performed by a mobile robot by means of a 3D model registration algorithm. Final results of 3D mapping are also shown.
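For readers unfamiliar with the GNG, the sketch below compresses its core loop (winner adaptation, edge aging, periodic node insertion at the highest-error unit). The parameter values and the simplified pruning are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

# Compressed Growing Neural Gas sketch (Fritzke-style); node/edge
# handling is simplified and parameters are assumed, not the paper's.
rng = np.random.default_rng(0)
data = rng.random((2000, 3))                 # stand-in for 3D sensor points

nodes = [data[0].copy(), data[1].copy()]
errors = [0.0, 0.0]
edges = {}                                   # (i, j) -> age, with i < j

eb, en, lam, max_age = 0.05, 0.006, 100, 50
for step, x in enumerate(data, 1):
    d = [np.linalg.norm(x - w) for w in nodes]
    s1, s2 = np.argsort(d)[:2]               # two nearest units
    errors[s1] += d[s1] ** 2
    nodes[s1] += eb * (x - nodes[s1])        # move winner toward input
    for (i, j) in list(edges):
        if s1 in (i, j):
            k = j if i == s1 else i
            nodes[k] += en * (x - nodes[k])  # drag winner's neighbours
            edges[(i, j)] += 1               # age the winner's edges
            if edges[(i, j)] > max_age:
                del edges[(i, j)]
    edges[tuple(sorted((int(s1), int(s2))))] = 0   # refresh winner-runner edge
    if step % lam == 0:                      # grow where error is largest
        q = int(np.argmax(errors))
        nbrs = [j if i == q else i for (i, j) in edges if q in (i, j)]
        f = max(nbrs, key=lambda k: errors[k]) if nbrs else (q + 1) % len(nodes)
        nodes.append(0.5 * (nodes[q] + nodes[f]))
        errors[q] *= 0.5; errors[f] *= 0.5; errors.append(errors[q])
        edges[tuple(sorted((q, len(nodes) - 1)))] = 0

print(f"{len(nodes)} nodes, {len(edges)} edges approximate the point cloud")
```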

Relevance:

30.00%

Publisher:

Abstract:

A parallel algorithm for image noise removal is proposed. The algorithm is based on the peer group concept and uses a fuzzy metric. An optimisation study on the use of the CUDA platform to remove impulsive noise with this algorithm is presented. Moreover, an implementation of the algorithm on multi-core platforms using OpenMP is presented. Performance is evaluated in terms of execution time, and the multi-core implementation, the GPU implementation and the combination of both are compared. A performance analysis with large images is conducted in order to identify the proportion of pixels to allocate to the CPU and to the GPU. The observed times show that both devices should share the work, with the larger part assigned to the GPU. Results show that parallel implementations of denoising filters on GPUs and multi-cores are very advisable, and they open the door to using such algorithms for real-time processing.
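Below is a hedged, sequential reference version of the peer-group idea, using one common product-form fuzzy metric (the paper's exact metric and its CUDA/OpenMP kernels are not reproduced); the per-pixel independence visible in the loop is what makes the GPU mapping natural:

```python
import numpy as np

def fuzzy_metric(a, b, k=1024.0):
    """A common fuzzy metric between RGB pixels in [0, 255]:
    product form; values near 1 mean the pixels are close."""
    a, b = a.astype(float), b.astype(float)
    return np.prod((np.minimum(a, b) + k) / (np.maximum(a, b) + k))

def peer_group_filter(img, t=0.9, min_peers=3):
    """Mark a pixel impulsive if fewer than min_peers neighbours are
    fuzzy-close to it, then replace it with the window median.
    Sequential reference version; each pixel is independent."""
    out = img.copy()
    h, w, _ = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            win = img[y - 1:y + 2, x - 1:x + 2].reshape(9, 3)
            peers = sum(fuzzy_metric(win[4], p) >= t
                        for i, p in enumerate(win) if i != 4)
            if peers < min_peers:
                out[y, x] = np.median(win, axis=0)
    return out

noisy = np.random.default_rng(2).integers(0, 256, (64, 64, 3), dtype=np.uint8)
clean = peer_group_filter(noisy)
```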

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, an increasing number of robotic applications need to act in real three-dimensional (3D) scenarios. In this paper we present a new mobile-robotics-oriented 3D registration method that improves previous Iterative Closest Point (ICP) based solutions in both speed and accuracy. As an initial step, we apply a computationally low-cost method to obtain descriptions of the planar surfaces in the 3D scenes. Then, from these descriptions, we apply a force system in order to compute a six-degrees-of-freedom egomotion estimate accurately and efficiently. We describe the basis of our approach and demonstrate its validity with several experiments using different kinds of 3D sensors and different real 3D environments.
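The force-system step is specific to the paper, but the initial planar-surface description can be illustrated with a standard PCA plane fit: the eigenvector of the patch covariance with the smallest eigenvalue is the normal, and the smallest-to-middle eigenvalue ratio measures planarity. A hedged sketch:

```python
import numpy as np

def planar_patch(points):
    """Fit a plane to a 3D point patch by PCA: the eigenvector with the
    smallest eigenvalue of the covariance is the normal; a small ratio
    of smallest to middle eigenvalue indicates a genuinely planar patch."""
    centroid = points.mean(axis=0)
    cov = np.cov((points - centroid).T)
    eigval, eigvec = np.linalg.eigh(cov)     # ascending eigenvalues
    normal = eigvec[:, 0]
    planarity = eigval[0] / max(eigval[1], 1e-12)
    return centroid, normal, planarity

# Hypothetical noisy patch sampled from the plane z = 0.
rng = np.random.default_rng(3)
pts = np.column_stack([rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200),
                       0.01 * rng.normal(size=200)])
c, n, p = planar_patch(pts)
print(f"normal ~ {np.round(n, 2)}, planarity {p:.4f} (near 0 = planar)")
```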

Relevance:

30.00%

Publisher:

Abstract:

In the present work, a three-dimensional (3D) formulation based on the method of fundamental solutions (MFS) is applied to the study of acoustic horns. The implemented model follows and extends previous works that only considered two-dimensional and axisymmetric horn configurations. The more realistic case of 3D acoustic horns with symmetry regarding two orthogonal planes is addressed. The use of the domain decomposition technique with two interconnected sub-regions along a continuity boundary is proposed, allowing for the computation of the sound pressure generated by an acoustic horn installed on a rigid screen. In order to reduce the model discretization requirements for these cases, Green’s functions derived with the image source methodology are adopted, automatically accounting for the presence of symmetry conditions. A strategy for the calculation of an optimal position of the virtual sources used by the MFS to define the solution is also used, leading to improved reliability and flexibility of the proposed method. The responses obtained by the developed model are compared to reference solutions, computed by well-established models based on the boundary element method. Additionally, numerically calculated acoustic parameters, such as directivity and beamwidth, are compared with those evaluated experimentally.
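A hedged numerical sketch of the image-source device mentioned above: for a rigid screen at z = 0, the field of a point source is the free-field 3D Green's function plus that of its mirror image, which enforces zero normal velocity on the plane (amplitudes and positions below are illustrative):

```python
import numpy as np

def greens_3d(x, y, k):
    """Free-field 3D Helmholtz Green's function e^{ikr} / (4 pi r)."""
    r = np.linalg.norm(np.asarray(x) - np.asarray(y))
    return np.exp(1j * k * r) / (4.0 * np.pi * r)

def greens_rigid_plane(x, src, k):
    """Green's function for a rigid plane at z = 0: the source plus its
    mirror image; the symmetry cancels normal velocity on the plane."""
    image = np.array([src[0], src[1], -src[2]])
    return greens_3d(x, src, k) + greens_3d(x, image, k)

k = 2 * np.pi * 1000.0 / 343.0        # wavenumber at 1 kHz, c = 343 m/s
p = greens_rigid_plane(x=[1.0, 0.5, 0.3], src=np.array([0.0, 0.0, 0.2]), k=k)
print(f"|p| (arbitrary units): {abs(p):.4e}")
```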

Relevance:

30.00%

Publisher:

Abstract:

Measurement of concrete strain through non-invasive methods is of great importance in civil engineering and structural analysis. Traditional methods use laser speckle and high-quality cameras that may prove too expensive for many applications. Here we present a method for measuring concrete deformations with a standard reflex camera and image processing that tracks objects on the concrete's surface. Two different approaches are presented. In the first, purpose-made markers are drawn on the surface; in the second, we track small surface defects left by air bubbles during the hardening process. The method has been tested on a concrete sample under several loading/unloading cycles. A stop-motion sequence of the process has been captured and analysed. Results have been successfully compared with the values given by a strain gauge. The accuracy of our methods in tracking objects is below 8 μm, on the order of more expensive commercial devices.
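The strain arithmetic behind such tracking is simple once two marker centroids are located in the reference and deformed frames (e.g., by template matching); the sketch below assumes an illustrative image scale in mm per pixel and invented centroid coordinates:

```python
import numpy as np

def strain_from_markers(p1_ref, p2_ref, p1_def, p2_def, mm_per_px=0.02):
    """Engineering strain from two tracked marker centroids (pixels):
    strain = (L - L0) / L0, with an assumed image scale in mm/px."""
    l0 = np.linalg.norm(np.subtract(p2_ref, p1_ref)) * mm_per_px
    l = np.linalg.norm(np.subtract(p2_def, p1_def)) * mm_per_px
    return (l - l0) / l0

# Hypothetical centroids from template matching on two frames.
eps = strain_from_markers((100.0, 400.0), (100.0, 1400.0),
                          (100.0, 400.2), (100.0, 1399.0))
print(f"strain: {eps:.2e}")   # negative = compression under load
```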

Relevance:

30.00%

Publisher:

Abstract:

Building Information Modelling (BIM) provides a shared source of information about a built asset, which creates a collaborative virtual environment for project teams. The literature suggests that, to collaborate efficiently, relationships within the project team must be based on sympathy, obligation, trust and rapport. Communication increases in importance when working collaboratively, but effective communication can only be achieved when the stakeholders are willing to act, react, listen and share information. Case study research and interviews with Architecture, Engineering and Construction (AEC) industry experts suggest that synchronous face-to-face communication is project teams' preferred method, allowing teams to socialise and build rapport, accelerating the creation of trust between the stakeholders. However, virtual unified communication platforms are a close second-preferred option for communication between teams. Effective methods for virtual communication in professional practice, such as collaborative virtual environments (CVEs), that build trust and achieve spontaneous responses similar to face-to-face communication are necessary to face global challenges, and can be achieved with the right people, processes and technology. This research paper investigates current industry methods for virtual communication within BIM projects and explores the suitability of avatar interaction in a collaborative virtual environment as an alternative to face-to-face communication, to enhance collaboration between design teams in professional practice on a project. Hence, this paper compares the effectiveness of these communication methods within construction design teams, together with the results of further experiments conducted to test recommendations for more efficient virtual communication methods that add value in the workplace for design teams.