Abstract:
Society poses a variety of demands on education depending on its particular situation and circumstances. Distance education is a worldwide reality in constant quantitative and qualitative growth, recently strengthened by new communication media.
Abstract:
Three paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation -- a nonlinear, structured-grid partial differential equation boundary value problem -- using the same algorithm on the same hardware. All of the paradigms -- parallel languages represented by the Portland Group's HPF, (semi-)automated serial-to-parallel source-to-source translation represented by CAPTools from the University of Greenwich, and parallel libraries represented by Argonne's PETSc -- are found to be easy to use for this problem class, and all are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required of the application programmer under any paradigm includes specification of the data partitioning, corresponding to a geometrically simple decomposition of the domain of the PDE. Programming in SPMD style for the PETSc library requires writing only the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm as a starting point, introduction of concurrency through subdomain blocking (a task similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Programming with CAPTools involves feeding the same sequential implementation to the CAPTools interactive parallelization system, and guiding the source-to-source code transformation by responding to various queries about quantities knowable only at runtime. Results representative of "the state of the practice" for a scaled sequence of structured-grid problems are given on three of the most important contemporary high-performance platforms: the IBM SP, the SGI Origin 2000, and the CRAY T3E.
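The index bookkeeping mentioned in this abstract (affine global-to-local mappings over a geometrically simple decomposition of the PDE domain) can be illustrated with a minimal sketch. The following plain Python assumes a 1D block decomposition of an N-point structured grid with one ghost point on each interior subdomain boundary; it is an illustration of the idea only, not PETSc, HPF, or CAPTools code, and all names are invented for the example.

# Minimal sketch: affine global-to-local index mapping for a 1D block
# decomposition with one ghost point per interior subdomain boundary.

def block_range(N, nprocs, rank):
    """Owned global index range [lo, hi) for this rank (contiguous blocks)."""
    base, rem = divmod(N, nprocs)
    lo = rank * base + min(rank, rem)
    hi = lo + base + (1 if rank < rem else 0)
    return lo, hi

def global_to_local(g, lo, ghost_lo):
    """Affine map: local index = global index - start of the ghosted region."""
    return g - (lo - ghost_lo)

def local_to_global(l, lo, ghost_lo):
    return l + (lo - ghost_lo)

if __name__ == "__main__":
    N, nprocs = 100, 4
    for rank in range(nprocs):
        lo, hi = block_range(N, nprocs, rank)
        ghost_lo = 1 if rank > 0 else 0           # ghost point on the left
        ghost_hi = 1 if rank < nprocs - 1 else 0  # ghost point on the right
        nlocal = (hi - lo) + ghost_lo + ghost_hi
        print(f"rank {rank}: owns [{lo},{hi}), local length {nlocal}, "
              f"global {lo} -> local {global_to_local(lo, lo, ghost_lo)}")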
Abstract:
We present a dynamic distributed load balancing algorithm for parallel, adaptive Finite Element simulations in which we use preconditioned Conjugate Gradient solvers based on domain decomposition. The load balancing is designed to maintain good partition aspect ratio, and we show that cut size is not always the appropriate measure in load balancing. Furthermore, we attempt to answer the question of why the aspect ratio of partitions plays an important role for certain solvers. We define and rate different kinds of aspect ratio and present a new center-based partitioning method for calculating the initial distribution which implicitly optimizes this measure. During the adaptive simulation, the load balancer calculates a balancing flow using different versions of the diffusion algorithm and a variant of breadth-first search. Elements to be migrated are chosen according to a cost function aiming at the optimization of subdomain shapes. Experimental results for Bramble's preconditioner and comparisons with state-of-the-art load balancers show the benefits of the construction.
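As a rough illustration of the balancing-flow step described above, the sketch below implements a generic first-order diffusion iteration on a processor graph in plain Python. The graph, the parameter alpha and the iteration count are illustrative assumptions, and the shape-aware selection of elements to migrate is not shown; this is not the authors' implementation.

# Minimal sketch: first-order diffusion balancing flow on a processor graph.

def diffusion_flow(adj, load, alpha=0.25, iters=200):
    """adj: dict node -> set of neighbours; load: dict node -> float.
    Returns (flow, load) where flow[(i, j)] > 0 means 'send from i to j'."""
    flow = {}
    load = dict(load)
    for _ in range(iters):
        delta = {}
        for i in adj:
            for j in adj[i]:
                if i < j:  # handle each undirected edge once
                    d = alpha * (load[i] - load[j])
                    delta[i] = delta.get(i, 0.0) - d
                    delta[j] = delta.get(j, 0.0) + d
                    flow[(i, j)] = flow.get((i, j), 0.0) + d
        for i, d in delta.items():
            load[i] += d
    return flow, load

if __name__ == "__main__":
    # four processors in a ring with unbalanced element counts (toy data)
    adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}
    load = {0: 100.0, 1: 20.0, 2: 60.0, 3: 20.0}
    flow, balanced = diffusion_flow(adj, load)
    print("balancing flow:", {e: round(f, 1) for e, f in flow.items()})
    print("resulting load:", {i: round(l, 1) for i, l in balanced.items()})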
Abstract:
The paper deals with the determination of an optimal schedule for the so-called mixed shop problem when the makespan is to be minimized. In such a problem, some jobs have fixed machine orders (as in the job shop), while the operations of the other jobs may be processed in arbitrary order (as in the open shop). We prove binary NP-hardness of the preemptive problem with three machines and three jobs (two jobs have fixed machine orders and one may have an arbitrary machine order). We answer all other remaining open questions on the complexity status of mixed shop problems with the makespan criterion by presenting different polynomial and pseudopolynomial algorithms.
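To make the mixed shop model concrete, the sketch below brute-forces a tiny non-preemptive instance in Python: two jobs with fixed machine orders and one open-shop job whose operations may be processed in any order. The instance, names and approach (enumeration of operation permutations with semi-active scheduling) are illustrative only and are not the polynomial or pseudopolynomial algorithms of the paper.

# Minimal sketch: brute-force makespan minimization for a toy mixed shop.
from itertools import permutations

# Each job: list of (machine, processing_time); 'fixed' says whether the
# listed order is mandatory (job-shop job) or free (open-shop job).
jobs = {
    "J1": {"ops": [(0, 3), (1, 2)], "fixed": True},
    "J2": {"ops": [(1, 4), (0, 3)], "fixed": True},
    "J3": {"ops": [(0, 2), (1, 2)], "fixed": False},  # open-shop job
}

ops = [(j, k) for j, d in jobs.items() for k in range(len(d["ops"]))]

def makespan(order):
    """Semi-active schedule: each op starts as early as job and machine allow."""
    job_free, mach_free = {}, {}
    for j, k in order:
        m, p = jobs[j]["ops"][k]
        start = max(job_free.get(j, 0), mach_free.get(m, 0))
        job_free[j] = mach_free[m] = start + p
    return max(job_free.values())

best = None
for perm in permutations(ops):
    # a fixed-order job must appear with increasing operation index
    seen, ok = {}, True
    for j, k in perm:
        if jobs[j]["fixed"] and k != seen.get(j, 0):
            ok = False
            break
        seen[j] = seen.get(j, 0) + 1
    if ok:
        c = makespan(perm)
        if best is None or c < best[0]:
            best = (c, perm)

print("optimal makespan:", best[0])
print("operation order :", best[1])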
Abstract:
We survey recent results on the computational complexity of mixed shop scheduling problems. In a mixed shop, some jobs have fixed machine orders (as in the job shop), while the operations of the other jobs may be processed in arbitrary order (as in the open shop). The main attention is devoted to establishing the boundary between polynomially solvable and NP-hard problems. When the number of operations per job is unlimited, we focus on problems with a fixed number of jobs.
Abstract:
Prediction of tandem mass spectrometric (MS/MS) fragmentation for non-peptidic molecules based on structure is of immense interest to the mass spectrometrist. If a reliable approach to MS/MS prediction could be achieved, its impact within the pharmaceutical industry could be substantial. Many publications have stressed that the fragmentation of a molecular ion or protonated molecule is a complex process that depends on many parameters, making prediction difficult. Commercial prediction software relies on a collection of general heuristic rules of fragmentation, which involve cleaving every bond in the structure to produce a list of 'expected' masses that can be compared with the experimental data. These approaches do not take into account the thermodynamic or molecular orbital effects acting on the molecule at the point of protonation, which could influence the potential sites of bond cleavage based on the structural motif. A series of compounds has been studied by examining the experimentally derived high-resolution MS/MS data and comparing them with in silico modelling of the neutral and protonated structures. The effect that protonation at specific sites can have on bond lengths has also been determined. We have calculated the thermodynamically most stable protonated species and have observed how that information can help predict the cleavage site for that ion. The data have shown that this use of in silico techniques could be a possible way to predict MS/MS spectra. Copyright (C) 2009 John Wiley & Sons, Ltd.
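For contrast with the in silico protonation modelling studied here, the heuristic "cleave every bond" approach attributed above to commercial software can be sketched in a few lines of Python. The molecule (ethanol), the monoisotopic masses, the hypothetical experimental peaks and the 0.01 Da tolerance are illustrative assumptions; charge retention and hydrogen rearrangements are ignored, so only neutral fragment masses are produced.

# Minimal sketch: enumerate single-bond cleavages and compare fragment masses
# to (hypothetical) experimental peaks.

MONOISOTOPIC = {"C": 12.0, "H": 1.00783, "O": 15.99491, "N": 14.00307}

# Ethanol, CH3-CH2-OH, as a simple atom/bond graph (indices are arbitrary).
atoms = ["C", "C", "O", "H", "H", "H", "H", "H", "H"]
bonds = [(0, 1), (1, 2), (0, 3), (0, 4), (0, 5), (1, 6), (1, 7), (2, 8)]

def fragments_after_cleaving(bond):
    """Cleave one bond and return the connected fragments as atom index sets."""
    remaining = [b for b in bonds if b != bond]
    comp = {i: {i} for i in range(len(atoms))}
    for a, b in remaining:          # naive component merging, fine for toy sizes
        merged = comp[a] | comp[b]
        for i in merged:
            comp[i] = merged
    return {frozenset(s) for s in comp.values()}

def fragment_mass(indices):
    return sum(MONOISOTOPIC[atoms[i]] for i in indices)

expected = sorted({round(fragment_mass(f), 4)
                   for bond in bonds
                   for f in fragments_after_cleaving(bond)})
print("expected fragment masses:", expected)

# Compare against hypothetical experimental MS/MS peaks within 0.01 Da.
experimental_peaks = [29.039, 31.018, 45.034]
for peak in experimental_peaks:
    matches = [m for m in expected if abs(m - peak) < 0.01]
    print(f"peak {peak}: matched {matches}")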
Abstract:
Air-stable benzodiazepine-containing palladacycles were synthesized by a C-H activation reaction and studied by mass spectrometry and X-ray crystallography. Catalytic C-H functionalizations of 1-methyl-5-phenyl-1H-1,4-benzodiazepin-2(3H)-one with diphenyliodonium hexafluorophosphate led to a mixture that included the starting material and the expected product 1-methyl-5-(2'-biphenyl)-1H-1,4-benzodiazepin-2(3H)-one. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Overfishing of large-bodied benthic fishes and their subsequent population collapses on the Scotian Shelf of Canada’s east coast [1, 2] and elsewhere [3, 4] resulted in restructuring of entire food webs, now dominated by planktivorous forage fish species and macroinvertebrates. Despite the imposition of strict management measures in force since the early 1990s, the Scotian Shelf ecosystem has not reverted to its former structure. Here we provide evidence of the transient nature of this ecosystem and its current return path towards domination by benthic fish species. The prolonged duration of the altered food web, and its current recovery, was and is being governed by the oscillatory, runaway consumption dynamics of the forage fish complex. These erupting forage species, which reached biomass levels 900% greater than those prevalent during the pre-collapse years of large benthic predators, are now in decline, having outstripped their zooplankton food supply. This dampening, and the associated reduction in the intensity of predation, was accompanied by lagged increases in species abundances at both lower and higher trophic levels, first witnessed in zooplankton and then in large-bodied predators, all consistent with a return towards the earlier ecosystem structure. We conclude that recovery of perturbed ecosystems can occur and that this bodes well for other collapsed fisheries.
Abstract:
Climate change and variability may have an impact on the occurrence of food safety hazards at various stages of the food chain, from primary production through to consumption. There are multiple pathways through which climate-related factors may affect food safety, including changes in temperature and precipitation patterns, increased frequency and intensity of extreme weather events, ocean warming and acidification, and changes in the transport pathways of contaminants, among others. Climate change may also affect socio-economic aspects of food systems, such as agriculture, animal production, global trade, demographics and human behaviour, all of which influence food safety. This paper reviews the potential impacts of predicted changes in climate on food contamination and food safety at various stages of the food chain, and identifies adaptation strategies and research priorities to address the food safety implications of climate change. The paper concludes that intersectoral and international cooperation is needed to better understand the changing food safety situation and to develop and implement adaptation strategies that address emerging risks associated with climate change.
Abstract:
The GEOTRACES Intermediate Data Product 2014 (IDP2014) is the first publicly available data product of the international GEOTRACES programme, and contains data measured and quality controlled before the end of 2013. It consists of two parts: (1) a compilation of digital data for more than 200 trace elements and isotopes (TEIs) as well as classical hydrographic parameters, and (2) the eGEOTRACES Electronic Atlas, a strongly inter-linked online atlas including more than 300 section plots and 90 animated 3D scenes. The IDP2014 covers the Atlantic, Arctic, and Indian oceans, with the highest data density in the Atlantic. The TEI data in the IDP2014 are quality controlled by careful assessment of intercalibration results and multi-laboratory data comparisons at cross-over stations. The digital data are provided in several formats, including ASCII spreadsheet, Excel spreadsheet, netCDF, and Ocean Data View collection. In addition to the actual data values, the IDP2014 also contains data quality flags and 1-sigma data error values where available. Quality flags and error values are useful for data filtering. Metadata about data originators, analytical methods and original publications related to the data are linked to the data in an easily accessible way. The eGEOTRACES Electronic Atlas is the visual representation of the IDP2014 data, providing section plots and a new kind of animated 3D scene. The basin-wide 3D scenes allow viewing of data from many cruises at the same time, thereby providing quick overviews of large-scale tracer distributions. In addition, the 3D scenes provide geographical and bathymetric context that is crucial for the interpretation and assessment of observed tracer plumes, as well as for making inferences about controlling processes.
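As a small illustration of the data filtering that the quality flags and 1-sigma errors support, the sketch below reads a hypothetical ASCII-spreadsheet export into pandas and keeps only "good" samples with acceptable uncertainty. The file name, column names and flag convention are assumptions for the example, not the actual IDP2014 variable names or the GEOTRACES flag scheme.

# Minimal sketch: quality-flag and uncertainty filtering of an exported table.
import pandas as pd

df = pd.read_csv("idp_export.csv")  # hypothetical ASCII spreadsheet export

value_col = "dFe_nmol_kg"           # hypothetical trace-element column
flag_col = value_col + "_flag"      # hypothetical matching quality flag
err_col = value_col + "_1sigma"     # hypothetical 1-sigma error column

# Keep only samples whose flag marks them as good ("good" == 1 assumed here),
# and drop values whose reported uncertainty exceeds 20% of the value.
good = df[(df[flag_col] == 1) & (df[err_col] < 0.2 * df[value_col])]

print(f"kept {len(good)} of {len(df)} samples")
print(good[["Latitude", "Longitude", "Depth_m", value_col, err_col]].head())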