990 results for Visualisation du code source


Relevance: 30.00%

Abstract:

As part of a continuing program of organic-geochemistry studies of sediments recovered by the Deep Sea Drilling Project, we have analyzed the types, amounts, and thermal-alteration indices of organic matter in samples collected from the landward wall of the Japan Trench on Legs 56 and 57. The samples were canned aboard ship, enabling us also to measure their gas contents. In addition, we analyzed the heavy C15+ hydrocarbons, NSO compounds, and asphaltenes extracted from selected samples. Our samples form a transect down the trench wall, from Holes 438 and 438A (water depth 1558 m), through Holes 435 and 435A (water depth 3401 m) and Hole 440 (water depth 4507 m), to Holes 434 and 434B (water depth 5986 m). The trench wall is the continental slope of Japan. Its sediments are Cenozoic hemipelagic diatomaceous muds that were deposited where they are found or have slumped from farther up the slope. Their terrigenous components probably were deposited from near-bottom nepheloid layers transported by bottom currents or in low-density flows (Arthur et al., 1978). Our objective was to determine what types of organic matter exist in the sediment and to estimate their potential for hydrocarbon generation.

Relevance: 30.00%

Abstract:

The HERMES cold-water coral database is a combination of historical and published scleractinian cold-water coral occurrences (mainly Lophelia pertusa) and new records from the HERMES project along the European margin. The database will be updated as new findings are reported; new or historical data can be sent to Ben De Mol (bendemol@ub.edu). Besides the geocodes, a second category indicates the coral species and whether they were sampled alive or dead. If absolute dating of the corals is available, it is provided together with the dating method. Only the framework-building cold-water corals are selected: Lophelia pertusa, Madrepora oculata, and common cold-water corals often associated with the framework builders, such as Desmophyllum sp. and Dendrophyllia sp.; other observed corals are indicated in the comments. Another field indicates whether the corals are part of a large build-up or solitary. A third category of parameters refers to the quality of the reported data and includes the following: source of reference, source type (such as fishermen's location, scientific paper, or cruise report), sample code and/or name, and sample type (e.g. rock dredge, grab, video line). These parameters allow an assessment of the quality of the described data.
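
As a minimal sketch of how one such record could be represented, purely to illustrate the three field categories described above; the field names and types are assumptions for this example, not the actual HERMES database schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CoralRecord:
    # Category 1: geocodes
    latitude: float
    longitude: float
    water_depth_m: Optional[float]
    # Category 2: coral observation
    species: str                   # e.g. "Lophelia pertusa"
    sampled_alive: bool            # sampled alive or dead
    absolute_age: Optional[float]  # only if dating is available
    dating_method: Optional[str]   # given together with the age
    large_buildup: bool            # part of a large build-up, or solitary
    comments: str                  # other corals observed at the site
    # Category 3: data quality / provenance
    source_of_reference: str
    source_type: str               # e.g. "scientific paper", "cruise report"
    sample_code: Optional[str]
    sample_type: str               # e.g. "rock dredge", "grab", "video line"
```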

Relevance: 30.00%

Abstract:

After the earthquake that struck Haiti on 12 January 2010, with a magnitude of Mw 7.0, a depth of 13 km, and an epicentre located not far from the capital, Port-au-Prince (25 km to the south-east), the country found itself in a catastrophic situation of extreme poverty, with severe deficiencies in health, nutrition, education, and housing. The effects of the earthquake were devastating for the population: more than 300,000 dead, almost as many injured, and 1.3 million homeless people housed in "provisional" camps. As for the material consequences, the earthquake completely destroyed nearly 100,000 houses and damaged nearly 200,000 more (source: USGS). This earthquake was the strongest recorded in the area since 1770; it was also felt in neighbouring countries such as Cuba, Jamaica, and the Dominican Republic, where it caused alarm and preventive evacuations. The reconstruction of the country remains a priority issue for international cooperation.

The present project, SISMO-HAITÍ, was developed with the aim of providing the knowledge and information needed to facilitate preventive measures against the existing seismic risk, so that a possible future earthquake does not trigger a new catastrophe. In the case of Haiti, no institution was responsible for seismic monitoring, but direct contact was established with the Observatoire National de l'Environnement et de la Vulnérabilité (ONEV) in Haiti through its director, Dwinel Belizaire, Ing. M.Sc., who is precisely the person who requested the assistance that motivated this proposal. The ultimate goal of the project is the study of measures to mitigate the high remaining risk, thereby contributing to the sustainable development of the region. With this in mind, the seismic hazard in Haiti has been assessed, forming the basis on which seismic design criteria for the reconstruction of the country are to be drawn up, which may be included in the first version of the national seismic code, together with the seismic risk in Port-au-Prince, whose results will serve as the basis for emergency plans for this natural hazard. The specific objectives achieved are:

• Assessment of the seismic hazard in Haiti, yielding maps of different ground-motion parameters for different probabilities of exceedance (which implies knowing the probability associated with the motions produced by future earthquakes).
• Assessment of the local site effects in Port-au-Prince and production of a microzonation map of the city.
• Study of the local seismic vulnerability in Port-au-Prince.
• Estimation of the seismic risk in Port-au-Prince.
• Risk mitigation and earthquake-resistant design measures.

This report summarizes the activities carried out and the results obtained during 2011 in the implementation of the project. The working group is a multidisciplinary team of researchers from different academic and research institutions (Technical University of Madrid (UPM), Spanish National Research Council (CSIC), Complutense University of Madrid (UCM), University of Alicante (UA), University of Almería (UAL), Autonomous University of Santo Domingo (UASD), and University of Puerto Rico at Mayagüez (UPRM)) with expertise in the various fields involved in the project: geology, seismology, earthquake engineering, architecture, and geographic information management. All the members of this team worked together throughout the year, holding meetings, working sessions, and videoconferences, in addition to a visit to Port-au-Prince in July 2011 to carry out the first data collection.

Relevance: 30.00%

Abstract:

In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the building code in our country (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geográfico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by a committee of national experts from the public institutions involved in seismic hazard. The PSHA (Probabilistic Seismic Hazard Assessment) method was followed, quantifying the epistemic uncertainties through a logic tree and the aleatory ones, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, essentially: 1) an update of the project catalogue and its homogenization to moment magnitude Mw; 2) a proposal of zoning models and source characterization; 3) calibration of ground motion prediction equations (GMPEs) with actual data and development of a local model with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on the hazard results was carried out in order to establish criteria for defining the branches of the logic tree and their weights. Finally, the hazard estimation was done with the logic tree shown in Figure 1, which includes nodes quantifying the uncertainties corresponding to: 1) the method of hazard estimation (zoning and zoneless); 2) the zoning models; 3) the GMPE combinations used; and 4) the regression method for estimation of source parameters. In addition, the aleatory uncertainties corresponding to the magnitude of the events, the recurrence parameters, and the maximum magnitude of each zone were also considered by means of probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results obtained in terms of PGA and other spectral accelerations SA(T) for return periods of 475, 975, and 2475 years. The map of the coefficient of variation (COV) is also presented to give an idea of the zones where the dispersion among results is highest and the zones where the results are robust.
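
As an illustration of how epistemic branch weights and Monte Carlo sampling of aleatory parameters can be combined, here is a minimal sketch, not the project's actual code: the two ground-motion models, their weights, the Gutenberg-Richter parameters, and the distance range below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical logic-tree branches: each pairs a toy ground-motion model
# with an epistemic weight (weights sum to 1).
def gmpe_a(m, r):  # toy model A: PGA (g) as a function of magnitude and distance (km)
    return np.exp(-3.5 + 0.9 * m - 1.1 * np.log(r + 10.0))

def gmpe_b(m, r):  # toy model B with slightly different coefficients
    return np.exp(-3.2 + 0.8 * m - 1.0 * np.log(r + 10.0))

branches = [(gmpe_a, 0.6), (gmpe_b, 0.4)]

# Aleatory variability: sample event magnitudes from a truncated
# Gutenberg-Richter distribution and distances uniformly at random.
n_sim = 10_000
b_value, m_min, m_max = 1.0, 4.0, 7.0
u = rng.random(n_sim)
mags = m_min - np.log10(1 - u * (1 - 10 ** (-b_value * (m_max - m_min)))) / b_value
dists = rng.uniform(5.0, 100.0, n_sim)  # km

pga_level = 0.2  # g, ground-motion level whose exceedance fraction we estimate
per_branch = []
for gmpe, weight in branches:
    pga = gmpe(mags, dists)
    per_branch.append(np.mean(pga > pga_level))  # fraction of simulated events exceeding

weights = np.array([w for _, w in branches])
probs = np.array(per_branch)
mean_prob = np.sum(weights * probs)  # weighted mean over the logic-tree branches
cov = np.sqrt(np.sum(weights * (probs - mean_prob) ** 2)) / mean_prob  # dispersion among branches

print(f"Exceedance fraction per branch: {probs}")
print(f"Weighted mean: {mean_prob:.4f}, COV across branches: {cov:.3f}")
```

The same weighted-combination and dispersion (COV) logic extends to full hazard curves and maps; only the per-branch computation is simplified here.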

Relevance: 30.00%

Abstract:

Background: Gray-scale images make up the bulk of data in biomedical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy through high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks; in particular, working memory is managed automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive and also not appropriate for a researcher with little or no knowledge of software development. Another alternative is to use command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only few tools exist that provide this kind of processing interface; they are usually quite task-specific, and they do not offer a clear path when one wants to shape a new command line tool from a prototype shell script.

Results: The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype using the corresponding shell scripting language. Since the hard disk serves as temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command line tools makes it easy to extend MIA, usually without the need to touch or recompile existing code.

Conclusion: In this article we describe the general design of MIA, a general-purpose framework for gray-scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data that arises in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms using shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.
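
The general prototyping pattern described above, single-task tools chained through intermediate files on disk, can be sketched as follows; the tool names and options used here are placeholders invented for illustration, not actual MIA commands.

```python
# Minimal sketch of a disk-backed processing pipeline driven from a script.
# Each step is an external single-task tool; intermediate results live on disk,
# so memory use is bounded by one step at a time.
import subprocess
from pathlib import Path

def run(cmd):
    """Run one pipeline step, echo it, and fail loudly if it does not succeed."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

work = Path("work")
work.mkdir(exist_ok=True)

# Step 1: denoise the input image and write the intermediate result to disk.
run(["denoise-tool", "--in", "input.png", "--out", str(work / "denoised.png")])

# Step 2: segment the denoised image, again reading from and writing to disk.
run(["segment-tool", "--in", str(work / "denoised.png"),
     "--out", str(work / "mask.png")])
```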

Relevance: 30.00%

Abstract:

Reactive power is critical to the operation of power networks, in terms of both safety and economics. An unreasonable distribution of reactive power severely affects the power quality of the network and increases the transmission loss. Currently, the most economical and practical approach to minimizing the real power loss remains the reactive power dispatch method. The reactive power dispatch problem is nonlinear and has both equality and inequality constraints. In this thesis, the PSO algorithm and the MATPOWER 5.1 toolbox are applied to solve the reactive power dispatch problem. PSO (particle swarm optimization) is a global optimization technique with excellent searching capability; its biggest advantage is that its efficiency is relatively insensitive to the complexity of the objective function. MATPOWER 5.1 is an open-source MATLAB toolbox focused on solving power flow problems; its benefit is that its code can be easily used and modified. The proposed method minimizes the real power loss in a practical power system and determines the optimal placement of a newly installed distributed generator (DG). The IEEE 14-bus system is used to evaluate the performance, and test results show the effectiveness of the proposed method.
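
As an illustration of the PSO idea referenced above, here is a minimal, generic sketch rather than the thesis implementation: the real objective would be the network's real power loss evaluated through a power-flow solver such as MATPOWER, which is replaced here by a simple placeholder test function.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` over the box `bounds` with a basic particle swarm."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)

    pos = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    vel = np.zeros_like(pos)                              # particle velocities
    pbest = pos.copy()                                    # personal best positions
    pbest_val = np.apply_along_axis(objective, 1, pos)
    gbest = pbest[np.argmin(pbest_val)].copy()            # global best position
    gbest_val = pbest_val.min()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity update: inertia + cognitive pull + social pull.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)                  # keep particles inside the box
        vals = np.apply_along_axis(objective, 1, pos)

        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        if pbest_val.min() < gbest_val:
            gbest_val = pbest_val.min()
            gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, gbest_val

# Placeholder objective standing in for the real power loss from a power-flow run.
sphere = lambda x: float(np.sum(x ** 2))
bounds = np.array([[-5.0, 5.0]] * 4)   # e.g. 4 control variables (voltages, tap settings)
best_x, best_f = pso(sphere, bounds)
print("best solution:", best_x, "objective:", best_f)
```

In the dispatch setting, the objective function would run a power flow for each candidate set of control variables and return the resulting real power loss, with constraint violations handled, for example, through penalty terms.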