917 results for Numerical Algorithms and Problems


Relevance:

100.00%

Publisher:

Abstract:

Algorithmic DNA tile systems are fascinating. From a theoretical perspective, they can result in simple systems that assemble themselves into beautiful, complex structures through fundamental interactions and logical rules. As an experimental technique, they provide a promising method for programmably assembling complex, precise crystals that can grow to considerable size while retaining nanoscale resolution. In the journey from theoretical abstractions to experimental demonstrations, however, lie numerous challenges and complications.

In this thesis, to examine these challenges, we consider the physical principles behind DNA tile self-assembly. We survey recent progress in experimental algorithmic self-assembly, and explain the simple physical models behind this progress. Using direct observation of individual tile attachments and detachments with an atomic force microscope, we test some of the fundamental assumptions of the widely-used kinetic Tile Assembly Model, obtaining results that fit the model to within error. We then depart from the simplest form of that model, examining the effects of DNA sticky end sequence energetics on tile system behavior. We develop theoretical models, sequence assignment algorithms, and a software package, StickyDesign, for sticky end sequence design.
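For orientation, the kinetic Tile Assembly Model referenced above assigns simple exponential forms to attachment and detachment rates. The sketch below is a minimal Python illustration of that rate structure, with illustrative parameter values rather than the ones measured in the thesis:

```python
import math

# Illustrative kTAM parameters (not values from the thesis).
K_F = 1e6          # forward rate constant, /M/s (a typical assumed value)
G_MC = 17.0        # free energy of tile concentration, in units of kT
G_SE = 9.2         # free energy per sticky-end bond, in units of kT

def attachment_rate(k_f=K_F, g_mc=G_MC):
    """Per-site attachment rate: r_f = k_f * e^(-G_mc), tile conc. in M."""
    return k_f * math.exp(-g_mc)

def detachment_rate(b, k_f=K_F, g_se=G_SE):
    """Detachment rate of a tile held by b sticky ends: r_b = k_f * e^(-b*G_se)."""
    return k_f * math.exp(-b * g_se)

if __name__ == "__main__":
    r_f = attachment_rate()
    for b in (1, 2, 3):
        r_b = detachment_rate(b)
        # Growth is favorable when attachments outpace detachments, i.e.
        # b*G_se > G_mc; near G_mc ~ 2*G_se errors are rare but growth is slow.
        print(f"b={b}: r_f={r_f:.3e}/s, r_b={r_b:.3e}/s, net growth: {r_f > r_b}")
```

With these numbers, a tile held by a single bond detaches far faster than new tiles arrive, while a tile held by two bonds tends to stay, which is the error-suppression mechanism the model relies on.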

As a demonstration of a specific tile system, we design a binary counting ribbon that can accurately count from a programmable starting value and stop growing after overflowing, resulting in a single system that can construct ribbons of precise and programmable length. In the process of designing the system, we explain numerous considerations that provide insight into more general tile system design, particularly with regard to tile concentrations, facet nucleation, the construction of finite assemblies, and design beyond the abstract Tile Assembly Model.
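To make the counting logic concrete, here is a hedged sketch of the row-to-row rule a binary counting ribbon effectively implements (copy bits, propagate a carry); this illustrates the abstract rule only, not the thesis's actual tile set:

```python
def next_row(bits):
    """Increment a row of bits (least-significant bit first), as a row of
    counter tiles would: copy each bit and propagate the carry from the LSB."""
    out, carry = [], 1  # an incoming carry of 1 increments the counter
    for b in bits:
        out.append(b ^ carry)
        carry = b & carry
    return out, carry  # carry == 1 after the last bit signals overflow

row = [1, 1, 1, 0, 0]  # the value 7 in 5 bits, LSB first
for _ in range(3):
    row, overflow = next_row(row)
    print(row, "overflow" if overflow else "")
```

A carry surviving past the most significant bit is exactly the overflow condition the capping system described below is meant to detect.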

Finally, we present our crystals that count: experimental results with our binary counting system that represent a significant improvement in the accuracy of experimental algorithmic self-assembly, including crystals that count perfectly with 5 bits from 0 to 31. We show some preliminary experimental results on the construction of our capping system to stop growth after counters overflow, and offer some speculation on potential future directions of the field.

Relevance:

100.00%

Publisher:

Abstract:

The Madden-Julian Oscillation (MJO) is a pattern of intense rainfall and associated planetary-scale circulations in the tropical atmosphere, with a recurrence interval of 30-90 days. Although the MJO was first discovered 40 years ago, it remains a challenge to simulate in general circulation models (GCMs), and even among simple models there is little agreement on the basic mechanisms. This deficiency is mainly due to our poor understanding of moist convection (deep cumulus clouds and thunderstorms), which occurs at scales smaller than the resolution elements of GCMs. Moist convection is the most important mechanism for transporting energy from the ocean to the atmosphere. Success in simulating the MJO will improve our understanding of moist convection and thereby improve weather and climate forecasting.

We address this fundamental subject by analyzing observational datasets, constructing a hierarchy of numerical models, and developing theories. Parameters of the models are taken from observations, and the simulated MJO fits the data without further adjustment. The major findings include: 1) the MJO may be an ensemble of convection events linked together by small-scale, high-frequency inertia-gravity waves; 2) the eastward propagation of the MJO is determined by the difference between the eastward and westward phase speeds of the waves; 3) the planetary scale of the MJO is the length over which temperature anomalies can be effectively smoothed by gravity waves; 4) the strength of the MJO increases with the typical strength of convection, which increases in a warming climate; 5) the horizontal scale of the MJO increases with the spatial frequency of convection; and 6) triggered convection, in which potential energy accumulates until a threshold is reached, is important in simulating the MJO. Our findings challenge previous paradigms, which treat the MJO as a large-scale mode, and point to ways of improving climate models.
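As a rough, illustrative reading of findings 2 and 3 (not a calculation from the thesis): shallow-water theory gives tropical gravity waves a phase speed c = sqrt(g*H_e) for an equivalent depth H_e, and temperature anomalies can be smoothed over a length of roughly c times a damping timescale. The numbers below are assumptions chosen only to show the orders of magnitude:

```python
import math

g = 9.81             # gravity, m/s^2
H_e = 25.0           # assumed equivalent depth of a gravity wave mode, m
tau = 5 * 86400.0    # assumed smoothing/damping timescale: 5 days, in s

c = math.sqrt(g * H_e)   # shallow-water gravity wave phase speed
L = c * tau              # distance over which waves can smooth anomalies

print(f"phase speed c = {c:.1f} m/s")          # ~16 m/s
print(f"smoothing length L = {L/1e6:.1f} Mm")  # several thousand km: planetary
```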

Relevance:

100.00%

Publisher:

Abstract:

Energy and sustainability are among the most critical issues of our generation. While the abundant potential of renewable sources such as solar and wind provides a real opportunity for sustainability, their intermittency and uncertainty present a daunting operating challenge. This thesis aims to develop analytical models, deployable algorithms, and real systems that enable efficient integration of renewable energy into complex distributed systems with limited information.

The first thrust of the thesis is to make IT systems more sustainable by facilitating the integration of renewable energy into them. IT is among the fastest-growing sectors in energy usage and greenhouse gas pollution. Over the last decade there have been dramatic improvements in the energy efficiency of IT systems, but these improvements have not necessarily reduced energy consumption because more servers are demanded. Further, little effort has gone into making IT more sustainable, and most of the improvements come from better "engineering" rather than better "algorithms". In contrast, my work focuses on developing algorithms, with rigorous theoretical analysis, that improve the sustainability of IT. In particular, this thesis seeks to exploit the flexibilities of cloud workloads both (i) in time, by scheduling delay-tolerant workloads, and (ii) in space, by routing requests to geographically diverse data centers. These opportunities allow data centers to respond adaptively to renewable availability, varying cooling efficiency, and fluctuating energy prices, while still meeting performance requirements. The design of the enabling algorithms is challenging, however, because of limited information, non-smooth objective functions, and the need for distributed control. Novel distributed algorithms with theoretically provable guarantees are developed to enable "follow the renewables" routing. Moving from theory to practice, I helped HP design and implement the industry's first Net-zero Energy Data Center.
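A minimal sketch of the "follow the renewables" idea under simple assumptions: route load greedily toward data centers with the most renewable supply, subject to capacity. The data center names and numbers are hypothetical, and this greedy pass stands in for, rather than reproduces, the thesis's distributed algorithms with provable guarantees:

```python
def follow_the_renewables(demand, capacity, renewable):
    """Greedy geographical load balancing: serve the total `demand` by
    filling data centers in order of available renewable supply, then spill
    the remainder anywhere with spare capacity (drawing brown energy)."""
    alloc = {dc: 0.0 for dc in capacity}
    remaining = demand
    # First pass: soak up renewable energy, greenest data centers first.
    for dc in sorted(capacity, key=lambda d: renewable[d], reverse=True):
        take = min(remaining, capacity[dc], renewable[dc])
        alloc[dc] += take
        remaining -= take
    # Second pass: place leftover load wherever capacity remains.
    for dc in capacity:
        take = min(remaining, capacity[dc] - alloc[dc])
        alloc[dc] += take
        remaining -= take
    return alloc, remaining  # remaining > 0 means demand exceeded capacity

# Hypothetical snapshot: three data centers serving 120 units of load.
alloc, unserved = follow_the_renewables(
    demand=120,
    capacity={"west": 60, "central": 50, "east": 40},
    renewable={"west": 45, "central": 10, "east": 30},
)
print(alloc, unserved)
```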

The second thrust of this thesis is to use IT systems to improve the sustainability and efficiency of our energy infrastructure through data center demand response. The main challenges in integrating more renewable sources into the existing power grid come from the fluctuation and unpredictability of renewable generation. Although energy storage and reserves can potentially solve these issues, they are very costly. One promising alternative is to make cloud data centers demand-responsive; the potential of such an approach is huge.

To realize this potential, we need adaptive and distributed control of cloud data centers and new electricity market designs for distributed electricity resources. My work progresses in both directions. In particular, I have designed online algorithms with theoretically guaranteed performance that let data center operators deal with uncertainties under popular demand response programs. Building on customers' local control rules, I have further designed new pricing schemes for demand response that align the interests of customers, utility companies, and society to improve social welfare.

Relevance:

100.00%

Publisher:

Abstract:

The high computational cost of correlated wavefunction theory (WFT) calculations has motivated the development of numerous methods to partition the description of large chemical systems into smaller subsystem calculations. For example, WFT-in-DFT embedding methods facilitate the partitioning of a system into two subsystems: a subsystem A that is treated using an accurate WFT method, and a subsystem B that is treated using a more efficient Kohn-Sham density functional theory (KS-DFT) method. Representation of the interactions between subsystems is non-trivial, and often requires the use of approximate kinetic energy functionals or computationally challenging optimized effective potential calculations; however, it has recently been shown that these challenges can be eliminated through the use of a projection operator. This dissertation describes the development and application of embedding methods that enable accurate and efficient calculation of the properties of large chemical systems.

Chapter 1 introduces a method for efficiently performing projection-based WFT-in-DFT embedding calculations on large systems. This is accomplished by using a truncated basis set representation of the subsystem A wavefunction. We show that naive truncation of the basis set associated with subsystem A can lead to large numerical artifacts, and present an approach for systematically controlling these artifacts.
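For readers unfamiliar with the projection operator mentioned above, the following schematic numpy sketch shows the level-shift form it commonly takes: subsystem A's Fock matrix gains a term mu*S*D_B*S that pushes A's orbitals out of subsystem B's occupied space. Matrix names are placeholders, and this is the operator structure only, not a working quantum chemistry code:

```python
import numpy as np

def embedded_fock(h_core, g_total, g_subsys_a, s, d_b, mu=1e6):
    """Assemble a level-shift projected Fock matrix for subsystem A.

    h_core     -- one-electron (core) Hamiltonian matrix
    g_total    -- two-electron potential of the full system's density
    g_subsys_a -- two-electron potential of subsystem A's density alone
    s          -- AO overlap matrix
    d_b        -- subsystem B's density matrix
    mu         -- level-shift parameter; a large mu projects A's orbitals
                  out of B's occupied space, enforcing orthogonality
    """
    # Embedding potential felt by A: the difference between the full
    # system's mean field and A's own mean field.
    v_emb = g_total - g_subsys_a
    # Projection operator in the AO basis: P_B = S D_B S.
    p_b = s @ d_b @ s
    return h_core + g_subsys_a + v_emb + mu * p_b
```

The appeal of this construction, as the abstract notes, is that it avoids approximate kinetic energy functionals and optimized effective potential calculations entirely.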

Chapter 2 describes the application of the projection-based embedding method to investigate the oxidative stability of lithium-ion batteries. We study the oxidation potentials of mixtures of ethylene carbonate (EC) and dimethyl carbonate (DMC) by using the projection-based embedding method to calculate the vertical ionization energy (IE) of individual molecules at the CCSD(T) level of theory, while explicitly accounting for the solvent using DFT. Interestingly, we reveal that large contributions to the solvation properties of DMC originate from quadrupolar interactions, resulting in a much larger solvent reorganization energy than that predicted using simple dielectric continuum models. Demonstration that the solvation properties of EC and DMC are governed by fundamentally different intermolecular interactions provides insight into key aspects of lithium-ion batteries, with relevance to electrolyte decomposition processes, solid-electrolyte interphase formation, and the local solvation environment of lithium cations.

Relevance:

100.00%

Publisher:

Abstract:

We study the behavior of granular materials at three length scales. At the smallest length scale, the grain-scale, we study inter-particle forces and "force chains". Inter-particle forces are the natural building blocks of constitutive laws for granular materials. Force chains are a key signature of the heterogeneity of granular systems. Despite their fundamental importance for calibrating grain-scale numerical models and elucidating constitutive laws, inter-particle forces have not been fully quantified in natural granular materials. We present a numerical force inference technique for determining inter-particle forces from experimental data and apply the technique to two-dimensional and three-dimensional systems under quasi-static and dynamic load. These experiments validate the technique and provide insight into the quasi-static and dynamic behavior of granular materials.

At a larger length scale, the mesoscale, we study the emergent frictional behavior of a collection of grains. Properties of granular materials at this intermediate scale are crucial inputs for macro-scale continuum models. We derive friction laws for granular materials at the mesoscale by applying averaging techniques to grain-scale quantities. These laws portray the nature of steady-state frictional strength as a competition between steady-state dilation and grain-scale dissipation rates. The laws also directly link the rate of dilation to the non-steady-state frictional strength.

At the macro-scale, we investigate continuum modeling techniques capable of simulating the distinct solid-like, liquid-like, and gas-like behaviors exhibited by granular materials in a single computational domain. We propose a Smoothed Particle Hydrodynamics (SPH) approach for granular materials with a viscoplastic constitutive law. The constitutive law uses a rate-dependent and dilation-dependent friction law. We provide a theoretical basis for a dilation-dependent friction law using similar analysis to that performed at the mesoscale. We provide several qualitative and quantitative validations of the technique and discuss ongoing work aiming to couple the granular flow with gas and fluid flows.
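As one concrete instance of a rate-dependent friction law of the general kind used here, the widely used mu(I) form interpolates the friction coefficient between a static and a limiting dynamic value as a function of the inertial number I. The sketch below uses typical literature-style parameters, not the thesis's calibration, and its dilation term is schematic:

```python
import math

# Typical mu(I)-rheology parameters (illustrative literature-style values).
MU_S = 0.38   # static friction coefficient
MU_2 = 0.64   # limiting dynamic friction coefficient
I_0 = 0.28    # dimensionless reference inertial number

def inertial_number(shear_rate, grain_diameter, pressure, grain_density):
    """I = gamma_dot * d / sqrt(P / rho): the ratio of the microscopic
    rearrangement time to the macroscopic deformation time."""
    return shear_rate * grain_diameter / math.sqrt(pressure / grain_density)

def friction(I, beta=0.0, dilation_rate=0.0):
    """Rate-dependent friction mu(I), plus a schematic dilation term
    (beta * dilation_rate) standing in for the non-steady-state strength
    contribution that the mesoscale analysis links to dilatancy."""
    mu_ss = MU_S + (MU_2 - MU_S) / (1.0 + I_0 / I)
    return mu_ss + beta * dilation_rate

I = inertial_number(shear_rate=10.0, grain_diameter=1e-3,
                    pressure=1e4, grain_density=2500.0)
print(f"I = {I:.3e}, mu = {friction(I):.3f}")
```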

Relevance:

100.00%

Publisher:

Abstract:

This work presents the application of Vincenty's formulas to the computation of terrain corrections and the indirect effect, both of which play an important role in the construction of geoid charts. A processing program was implemented that performs numerical integration over the digital terrain model, discretized into Delaunay triangular cells. The system was developed in the FORTRAN programming language for running intensive numerical algorithms with free, robust compilers. For the computation of the indirect effect, the gravimetric reduction is performed using Helmert's second condensation method, given the small indirect effect it yields in geoid computation, arising from the change it produces in the gravity potential due to the displacement of topographic mass. The SIRGAS 2000 geodetic system is adopted as the reference system for computing the corrections. To simplify the presentation of the results, the processing and development of the work are divided into stages: the choice of geodetic tools for maximum precision of the results, the development of subroutines, and the comparison of results with previous computations. The results obtained were sound and satisfactory and can readily be employed in geoid computation in any area of the globe.
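To illustrate the geodesic machinery involved, here is a compact Python sketch of Vincenty's inverse formula on the GRS80 ellipsoid adopted by SIRGAS 2000. The program described above is written in FORTRAN and computes terrain corrections; this sketch stands in only for the ellipsoidal distance computation at its core:

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# GRS80 ellipsoid parameters, as adopted by SIRGAS 2000.
A = 6378137.0            # semi-major axis, m
F = 1 / 298.257222101    # flattening
B = A * (1 - F)          # semi-minor axis, m

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance (m) between two points by Vincenty's inverse method."""
    L = radians(lon2 - lon1)
    U1 = atan((1 - F) * tan(radians(lat1)))
    U2 = atan((1 - F) * tan(radians(lat2)))
    sU1, cU1, sU2, cU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(max_iter):
        sl, cl = sin(lam), cos(lam)
        sin_sig = sqrt((cU2 * sl) ** 2 + (cU1 * sU2 - sU1 * cU2 * cl) ** 2)
        if sin_sig == 0.0:
            return 0.0  # coincident points
        cos_sig = sU1 * sU2 + cU1 * cU2 * cl
        sig = atan2(sin_sig, cos_sig)
        sin_alpha = cU1 * cU2 * sl / sin_sig
        cos2_alpha = 1.0 - sin_alpha ** 2
        cos_2sm = cos_sig - 2 * sU1 * sU2 / cos2_alpha if cos2_alpha else 0.0
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sig + C * sin_sig * (cos_2sm + C * cos_sig * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break  # iteration for the auxiliary longitude has converged
    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    kA = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    kB = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sig = kB * sin_sig * (cos_2sm + kB / 4 * (
        cos_sig * (2 * cos_2sm ** 2 - 1)
        - kB / 6 * cos_2sm * (4 * sin_sig ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return B * kA * (sig - d_sig)

# Example: Rio de Janeiro to Brasilia, roughly.
print(f"{vincenty_inverse(-22.91, -43.17, -15.79, -47.88) / 1000:.1f} km")
```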

Relevance:

100.00%

Publisher:

Abstract:

The objective of this work was to develop a non-invasive methodology for image acquisition, processing, and nonlinear trajectory analysis of the collective fish response to a stochastic event. Object detection and motion estimation were performed with an optical flow algorithm in order to detect moving fish while simultaneously eliminating background, noise, and artifacts. The entropy and the fractal dimension (FD) of the trajectory followed by the centroids of the groups of fish were calculated using Shannon and permutation entropy and the Katz, Higuchi, and Katz-Castiglioni FD algorithms, respectively. The methodology was tested on three case groups of European sea bass (Dicentrarchus labrax), two of which were similar (C1, control; C2, tagged fish) and very different from the third (C3, tagged fish submerged in methylmercury-contaminated water). The results indicate that Shannon entropy and Katz-Castiglioni were the most sensitive algorithms and are promising tools for the non-invasive identification and quantification of differences in fish responses. In conclusion, we believe this methodology has the potential to be embedded in an online/real-time architecture for contaminant monitoring programs in the aquaculture industry.
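For reference, the Katz fractal dimension named above has the closed form FD = log10(n) / (log10(n) + log10(d/L)), where L is the total path length, d the maximum distance from the starting point, and n the number of steps. Below is a minimal sketch for a 2D centroid trajectory (the Katz-Castiglioni variant found most sensitive in this study modifies the normalization and is not shown):

```python
import numpy as np

def katz_fd(xy):
    """Katz fractal dimension of a planar trajectory.

    xy -- array of shape (n_points, 2): centroid positions over time.
    """
    xy = np.asarray(xy, dtype=float)
    steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    L = steps.sum()                               # total path length
    d = np.linalg.norm(xy - xy[0], axis=1).max()  # planar extent of the path
    n = len(steps)                                # number of steps
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

# A straight path has FD near 1; a jittery path scores higher.
t = np.linspace(0, 10, 500)
straight = np.column_stack([t, t])
rng = np.random.default_rng(0)
noisy = straight + rng.normal(scale=0.3, size=straight.shape)
print(katz_fd(straight), katz_fd(noisy))
```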

Relevance:

100.00%

Publisher:

Abstract:

A realistic quantum many-body system, characterized by a generic microscopic Hamiltonian, is accessible only through approximation methods. Mean-field theories, the simplest of these approximation methods, serve as a powerful tool but often violate the symmetry of the Hamiltonian. The conventional BCS theory, an excellent mean-field approach, violates particle-number conservation and completely erases the quantumness, characterized by concurrence and quantum discord, between different modes. We restore the symmetry by using the projected BCS theory and the exact numerical solution and find that the lost quantumness is synchronously reestablished. We show that while entanglement remains unchanged with particle number, quantum discord behaves as an extensive quantity with respect to system size. Surprisingly, discord is hardly dependent on the interaction strength. This new feature of discord offers promising applications in modern quantum technologies.

Relevance:

100.00%

Publisher:

Abstract:

Although geographically the River Wyre lies between two rivers carrying major migrations of adult salmon and sea trout, its rod-and-line fisheries have for a number of years produced exceptionally low catches. To determine the causes of this, the Wyre Salmon and Sea Trout Restoration Group (WSSRG) was formed in 1994 as a partnership between the then National Rivers Authority (now Environment Agency), local landowners, angling clubs, and interested parties. Studies in 1994 and 1995 found a shortage of usable spawning gravels on the river. This is compounded by Abbeystead Reservoir acting as a gravel trap, the siltation of gravels on several side becks, and problems of access to the available gravels for returning adults. There was also a perceived need for accurate fishery data from the river, encompassing redd counts, catch data, and surveys of fry populations. The 1995 report suggested a number of management proposals that might be adopted to improve and create spawning habitat for migratory salmonids. Funding was made available to create three spawning gravels on each of two side becks (Grizedale Beck and Joshua's Beck) and to add gravels to a site on the main river below Abbeystead Reservoir. Modifications were also made to the fish pass at Abbeystead to allow easier passage of fish. These improvements were made in the autumn of 1995. Salmonid spawning redd counting was undertaken across the whole Wyre catchment in 1995/1996, and specific electric-fishing surveys were carried out on the gravel enhancement sites in the summer of 1996. This report details the current state of the improvement works and presents the results of the electric-fishing surveys of September 1996. A number of lessons have been learnt which will be of great benefit to the Fisheries Function in other parts of the Wyre catchment and the Central Area in general.

Relevance:

100.00%

Publisher:

Abstract:

The emergence of new telecommunications services has caused an enormous increase in data traffic on transmission networks. To meet this growing demand, new technologies have been developed and deployed over the years, and one of the main advances is in optical transmission, owing to the large information-carrying capacity of optical fiber. The technology that currently best exploits the capacity of this transmission medium is Wavelength Division Multiplexing (WDM), which allows several signals to be transmitted over a single optical fiber. WDM optical networks have become very complex, with enormous transmission capacity (terabits per second), to meet the explosive demand for bandwidth. In this context, it is extremely important that the resources of these networks be used intelligently and optimally. One of the greatest challenges in an optical network is choosing a route and selecting an available wavelength to serve a connection request using as few resources as possible. This problem is quite complex and became known as the Routing and Wavelength Assignment (RWA) problem. Many studies have sought an efficient solution to this problem, but it is not always possible to combine good performance with low execution time, a fundamental requirement in telecommunications networks. Genetic algorithms (GAs) have been used to find solutions to optimization problems such as the RWA problem, and they have achieved better results than traditional heuristic solutions found in the literature. This dissertation briefly presents the concepts of optical networks and genetic algorithms, and describes a formulation of the RWA problem suited to solution by a genetic algorithm.
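To make the GA formulation concrete, here is a toy sketch of one common chromosome encoding for RWA: each gene selects one of the precomputed candidate routes for a connection request, and fitness counts the wavelengths that a first-fit assignment needs along the chosen routes. The topology, requests, and GA parameters are hypothetical, and this is a generic illustration rather than the dissertation's formulation:

```python
import random

# Hypothetical candidate routes (as link tuples) per connection request.
CANDIDATE_ROUTES = [
    [[("A", "B"), ("B", "C")], [("A", "D"), ("D", "C")]],   # request 0: A->C
    [[("B", "C")], [("B", "A"), ("A", "D"), ("D", "C")]],   # request 1: B->C
    [[("A", "D")], [("A", "B"), ("B", "C"), ("C", "D")]],   # request 2: A->D
]

def fitness(chromosome):
    """Wavelengths needed by a first-fit assignment (lower is better):
    gene i picks a candidate route for request i; each route then takes the
    lowest-indexed wavelength that is free on all of its links."""
    used = {}  # link -> set of wavelength indices in use
    highest = 0
    for req, gene in enumerate(chromosome):
        route = CANDIDATE_ROUTES[req][gene]
        w = 0
        while any(w in used.get(link, set()) for link in route):
            w += 1
        for link in route:
            used.setdefault(link, set()).add(w)
        highest = max(highest, w + 1)
    return highest

def evolve(pop_size=30, generations=50, p_mut=0.1):
    k = [len(r) for r in CANDIDATE_ROUTES]
    pop = [[random.randrange(k[i]) for i in range(len(k))]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, len(k))
            child = a[:cut] + b[cut:]         # one-point crossover
            if random.random() < p_mut:       # mutate one route choice
                i = random.randrange(len(k))
                child[i] = random.randrange(k[i])
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
print(best, "wavelengths needed:", fitness(best))
```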

Relevance:

100.00%

Publisher:

Abstract:

Size distribution within reported landings is an important aspect of northern Gulf of Mexico penaeid shrimp stock assessments. It reflects shrimp population characteristics such as numerical abundance of various sizes, age structure, and vital rates (e.g. recruitment, growth, and mortality), as well as effects of fishing, fishing power, fishing practices, sampling, size-grading, etc. The usual measure of shrimp size in archived landings data is count (C), the number of shrimp tails (abdomen or edible portion) per pound (0.4536 kg). Shrimp are marketed, and landings reported, in pounds within tail count categories. Statistically, these count categories are count class intervals or bins with upper and lower limits expressed in C. Count categories vary in width, overlap, and frequency of occurrence within the landings. The upper and lower limits of most count class intervals can be transformed to lower and upper limits (respectively) of class intervals expressed in pounds per shrimp tail, w, the reciprocal of C (i.e. w = 1/C). Age-based stock assessments have relied on various algorithms to estimate numbers of shrimp from pounds landed within count categories. These algorithms required underlying explicit or implicit assumptions about the distribution of C or w. However, no attempts were made to assess the actual distribution of C or w; therefore, the validity of the algorithms and assumptions could not be determined. When different algorithms were applied to landings within the same size categories, they produced different estimates of numbers of shrimp. This paper demonstrates a method of simulating the distribution of w in reported biological-year landings of shrimp. We used, as examples, landings of brown shrimp, Farfantepenaeus aztecus, from the northern Gulf of Mexico fishery in biological years 1986–2006. Brown shrimp biological year, Ti, is defined as beginning on 1 May of the same calendar year as Ti and ending on 30 April of the next calendar year, where the subscript i is the place marker for biological year. Biological-year landings encompass most if not all of the brown shrimp life cycle and life span. Simulated distributions of w reflect all factors influencing sizes of brown shrimp in the landings within a given biological year. Our method does not require a priori assumptions about the parent distributions of w or C, and it takes into account the variability in width, overlap, and frequency of occurrence of count categories within the landings. Simulated biological-year distributions of w can be transformed to equivalent distributions of C. Our method may be useful in future testing of previously applied algorithms and in the development of new estimators based on statistical estimation theory and the underlying distribution of w or C. We also examine some applications of biological-year distributions of w and additional variables derived from them.
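One hedged reading of how such a simulation can be realized, with made-up numbers: treat each count category as a class interval in w, convert the pounds landed in the category into an approximate number of tails, and draw that many w values from within the interval; pooling the draws across overlapping categories yields a simulated biological-year distribution of w. The uniform draw within each bin is this sketch's simplification, not the paper's assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical landings: (lower count C, upper count C, pounds landed).
# A category of 21-25 tails/lb maps to w in (1/25, 1/21) lb per tail.
categories = [(15, 20, 5_000.0), (21, 25, 12_000.0), (26, 30, 8_000.0)]

draws = []
for c_lo, c_hi, pounds in categories:
    w_lo, w_hi = 1.0 / c_hi, 1.0 / c_lo   # count limits invert to w limits
    mid_w = 0.5 * (w_lo + w_hi)           # rough mean tail weight, lb
    n_tails = int(pounds / mid_w)         # approximate number of shrimp
    # Draw w within the class interval (uniform is this sketch's choice).
    draws.append(rng.uniform(w_lo, w_hi, size=n_tails))

w = np.concatenate(draws)                 # simulated distribution of w
print(f"simulated tails: {len(w)}, mean w = {w.mean():.4f} lb, "
      f"mean C = {(1 / w).mean():.1f}")
```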

Relevance:

100.00%

Publisher:

Abstract:

The main objective of this work was to implement an empirical algorithm for monitoring the eutrophication process of Guanabara Bay (BG), Rio de Janeiro (RJ), using chlorophyll-a data collected in situ and satellite images acquired by the MERIS sensor aboard the ENVISAT satellite of the European Space Agency (ESA). To build the algorithm, a historical chlorophyll-a series (Oct/2002 to Jan/2012) provided by the Marine Biology Laboratory of UFRJ was used; coupled with radiometric data collected by the MERIS sensor on dates coincident with the in situ chlorophyll-a sampling, it allowed the determination of the regression curves that gave rise to the algorithms. Several band combinations were tested, with emphasis on the green, red, and near-infrared wavelengths. The chosen algorithm (R = 0.66 and MRE = 77.5%) used wavelengths between the green and the red (665, 680, 560, and 620 nm) and gave satisfactory results, despite limitations due to the complexity of the study area and problems with the atmospheric correction algorithm. Typical Case I water algorithms (OC3 and OC4) were also tested, as were the FLH and MCI algorithms recommended for waters with high Chl-a concentrations, all with unsatisfactory results. As observed in previous studies, Guanabara Bay exhibits high spatial and temporal variability in chlorophyll-a concentrations, with the highest concentrations in the wet season (months 01, 02, 03, 10, 11, 12) and in the marginal portions (~100 mg m⁻³), particularly along the western edge of the bay, and lower concentrations in the dry season and in the main circulation channel (~20 mg m⁻³). This work is the first to build and apply bio-optical algorithms for the BG region using MERIS images. Despite the good results, the present algorithm should not be considered definitive, and testing different atmospheric correction models for the MERIS images is recommended for future work.
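A schematic sketch of how an empirical band-ratio algorithm of this kind can be fitted: regress log10(chlorophyll-a) from in situ matchups against the logarithm of a red/green reflectance ratio built from MERIS bands. The matchup data and resulting coefficients below are placeholders; the study's chosen algorithm combined the 665, 680, 560, and 620 nm bands:

```python
import numpy as np

# Hypothetical matchups: MERIS reflectances at two bands and in situ chl-a.
rrs_665 = np.array([0.010, 0.014, 0.020, 0.028, 0.035])  # red band, 1/sr
rrs_560 = np.array([0.016, 0.018, 0.021, 0.024, 0.026])  # green band, 1/sr
chl_insitu = np.array([8.0, 15.0, 35.0, 70.0, 110.0])    # mg/m^3

# Fit log10(chl) = a + b * log10(Rrs(665) / Rrs(560)).
x = np.log10(rrs_665 / rrs_560)
a, b = np.polynomial.polynomial.polyfit(x, np.log10(chl_insitu), deg=1)

def chl_estimate(r665, r560):
    """Apply the fitted empirical band-ratio algorithm to new pixels."""
    return 10 ** (a + b * np.log10(r665 / r560))

pred = chl_estimate(rrs_665, rrs_560)
mre = np.mean(np.abs(pred - chl_insitu) / chl_insitu) * 100  # mean rel. error
print(f"a={a:.2f}, b={b:.2f}, MRE={mre:.1f}%")
```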

Relevance:

100.00%

Publisher:

Abstract:

In the late 1980s and early 1990s, significant changes occurred in the fisheries of Hawaii. Expansion and diversification of pelagic fisheries, and growth (including industrialization) of fisheries that in at least some cases had been largely recreational or artisanal, led to fears of overfishing and problems of allocation among fishery sectors. Combined with the establishment of limited entry programs in Hawaii fisheries (bottomfish, longline, and lobster), this led to anticipation that similar growth might occur in Guam, the Northern Marianas, and American Samoa.

Relevance:

100.00%

Publisher:

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): Variations in temperature that occurred in the North Pacific thermocline (250 to 400 meters) during the 1970s and 1980s are described in both a numerical simulation and XBT observations.

Relevance:

100.00%

Publisher:

Abstract:

The distribution of zooplankton along two transects at Karwar and Ratnagiri, on the west coast of India, was studied. The standing stock of zooplankton was relatively high in the neritic zone, with the highest value (358 ml/100 m³) in the area off Ratnagiri, due to the aggregation of fish larvae and hydromedusae. Maximum zooplankton production in these areas coincided with low temperature and low dissolved oxygen during the postmonsoon season. At Karwar the highest biomass (188 ml/100 m³) was observed at the nearshore station, due to swarms of the cladoceran Penilia avirostris and the pteropod Creseis acicula when salinity was low. The fluctuations in numerical abundance and percentage composition of all the major planktonic groups are discussed. The fishery of these areas is compared with the zooplankton standing stock.