961 results for idea-cache model


Relevance:

30.00%

Publisher:

Abstract:

We have carried out symmetrized density-matrix renormalization-group calculations to study the nature of excited states of long polyacene oligomers within a Pariser-Parr-Pople Hamiltonian. We have used the C_2 symmetry, the electron-hole symmetry, and the spin parity of the system in our calculations. We find that there is a crossover between the lowest dipole-forbidden two-photon state and the lowest dipole-allowed excited state with the size of the oligomer. In the long-system limit, the two-photon state lies below the lowest dipole-allowed excited state. The triplet state lies well below the two-photon state and energetically does not correspond to its description as being made up of two triplets. These results are in agreement with the general trends in linear conjugated polymers. However, unlike in linear polyenes, wherein the two-photon state is a localized excitation, we find that in polyacenes the two-photon excitation is spread out over the system. We have doped the systems with a hole and an electron and have calculated the charge excitation gap. Using the charge gap and the optical gap, we estimate the binding energy of the 1^1B^- exciton to be 2.09 eV. We have also studied doubly doped polyacenes and find the bipolaron in these systems to be composed of two separated polarons, as indicated by the calculated charge-density profile and charge-charge correlation function. We have studied bond orders in various states in order to get an idea of the excited-state geometry of the system. We find that the ground state, the triplet state, the dipole-allowed state, and the polaron excitations correspond to a lengthening of the rung bonds in the interior of the oligomer, while the two-photon excitation corresponds to rung bond lengths having two maxima along the system.

Relevance:

30.00%

Publisher:

Abstract:

We build dynamic models of community assembly by starting with one species in our model ecosystem and adding colonists. We find that the number of species present first increases, then fluctuates about some level. We ask: how large are these fluctuations, and how can we characterize them statistically? As in Robert May's work, communities with weaker interspecific interactions permit a greater number of species to coexist on average. We find, however, that as this average increases, the relative variation in the number of species and the return times to mean community levels decrease. In addition, the relative frequency of large extinction events to small extinction events decreases as mean community size increases. While the model reproduces several of May's results, it also provides theoretical support for Charles Elton's idea that diverse communities, such as those found in the tropics, should be less variable than depauperate communities, such as those found in arctic or agricultural settings.
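
As a minimal sketch of this kind of assembly dynamics (not the authors' model), the following simulates a generalized Lotka-Volterra community: one species to start, one random colonist per step, species dropped when their abundance falls below an extinction threshold, and the species count recorded over time. The interaction strength, thresholds, and crude integration scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def assemble(steps=500, strength=0.2, dt=0.01, substeps=100, extinct=1e-3):
    """Toy assembly run: start with one species, add one random colonist per step,
    integrate Lotka-Volterra dynamics, and drop species whose abundance collapses."""
    r = np.array([1.0])        # intrinsic growth rates
    A = np.array([[-1.0]])     # interaction matrix, self-limitation on the diagonal
    x = np.array([0.1])        # abundances
    richness = []

    for _ in range(steps):
        # colonist arrives with weak random interactions with the residents
        n = len(x)
        col = rng.normal(0.0, strength, size=(n, 1))
        row = rng.normal(0.0, strength, size=(1, n + 1))
        A = np.vstack([np.hstack([A, col]), row])
        A[-1, -1] = -1.0
        r = np.append(r, 1.0)
        x = np.append(x, 0.01)

        # crude Euler integration of dx/dt = x * (r + A x)
        for _ in range(substeps):
            x = np.clip(x + dt * x * (r + A @ x), 0.0, None)

        # extinction: remove species that fell below the threshold
        alive = x > extinct
        x, r, A = x[alive], r[alive], A[np.ix_(alive, alive)]
        richness.append(len(x))

    return np.array(richness)

counts = assemble()
print("mean species number:", counts.mean(), " std:", counts.std())
```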

Relevance:

30.00%

Publisher:

Abstract:

Cache analysis plays a very important role in obtaining precise Worst Case Execution Time (WCET) estimates of programs for real-time systems. While Abstract Interpretation (AI) based approaches are almost universally used for cache analysis, they fail to exploit a requirement unique to WCET estimation: it is not necessary to find the cache behavior that is guaranteed to hold across all executions of a program; we only need the cache behavior along one particular program path, the path with the maximum execution time. In this work, we introduce the concept of cache miss paths, which allows us to use worst-case path information to improve the precision of AI-based cache analysis. We use Abstract Interpretation to determine the cache miss paths and then integrate them into the IPET (Implicit Path Enumeration Technique) formulation. An added advantage is that this further allows us to use infeasible-path information for cache analysis. Experimentally, our approach gives more precise WCETs than AI-based cache analysis, and we also provide techniques to trade off analysis time against precision for scalability.
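
As a toy illustration of why worst-case-path information tightens the bound, the sketch below enumerates the two paths of a diamond-shaped control-flow graph by brute force instead of the ILP-based IPET formulation used in practice. The block costs, miss penalty, and the per-path classification of the single access in block B are all made up for illustration; this is not the paper's analysis.

```python
# Toy diamond CFG: entry -> (A1 | A2) -> B -> exit.
# B re-references a cache line that only A1 loads, so B hits after A1 and misses after A2.
# All numbers (block costs, miss penalty) are invented for illustration.

MISS_PENALTY = 100
BLOCK_COST = {"entry": 10, "A1": 80, "A2": 20, "B": 10, "exit": 5}
PATHS = [["entry", "A1", "B", "exit"], ["entry", "A2", "B", "exit"]]

def b_is_miss_on(path):
    """Per-path cache behaviour of the access in B (hypothetical classification)."""
    return "A1" not in path   # A1 brings the line into the cache

def wcet(per_path_cache):
    bounds = []
    for path in PATHS:
        cost = sum(BLOCK_COST[b] for b in path)
        if per_path_cache:
            miss = b_is_miss_on(path)                   # use the cache miss path information
        else:
            miss = any(b_is_miss_on(p) for p in PATHS)  # path-insensitive: assume the worst
        bounds.append(cost + (MISS_PENALTY if miss else 0))
    return max(bounds)

print("WCET, path-insensitive classification:", wcet(False))  # 205
print("WCET, using cache miss paths:         ", wcet(True))   # 145 (tighter bound)
```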

Relevance:

30.00%

Publisher:

Abstract:

A micropolar cohesive damage model for delamination of composites is proposed. The main idea is to embed micropolarity, which brings an additional layer of kinematics through micro-rotation degrees of freedom, within a continuum model in order to account for micro-structural effects during delamination. The resulting cohesive model, describing the modified traction-separation law, includes micro-rotational jumps in addition to displacement jumps across the interface. The incorporation of micro-rotation requires the model to be supplemented with physically relevant material length-scale parameters, whose effects during mode I and mode II delamination are brought out using numerical simulations appropriately supported by experimental evidence.

Relevance:

30.00%

Publisher:

Abstract:

In this paper I examine philosophical aspects of transhumanism, in particular with respect to its philosophical anthropology and the philosophy of technology implied in it. I focus on genetic engineering and "directed evolution", a central narrative in the promotion of a "posthuman" future which, according to transhumanism, will bring large-scale benefits to humanity. I argue that transhumanism advances a deliberative theory of values in a context that favors market logic for the commercialization and distribution of the goods promised by reprogenetics. This deliberative theory, in turn, rests on an anthropological vision with strong humanist roots. I argue that this transhumanist strategy leads to deep contradictions, since the individualist-market logic does not logically entail a global benefit where human "nature" is concerned.

Relevance:

30.00%

Publisher:

Abstract:

In the present paper, the piston model of the coronal transient (see Hu, 1983a, b) is discussed in detail, and the quantitative results of unsteady gas dynamics are applied to coronal transient processes. The piston model explains the major features of the transient observations, such as the density profile, the geometric configuration, the kinetic process, and the classification of coronal transients. In the piston model, the bright and dark features of the transient are the gas-dynamical response of dense plasma ejected into the corona, associated with the compressed and rarefied flows, respectively. The quantitative results show that the density increment in the compressed region and the density decrement in the rarefied region are, respectively, about an order of magnitude larger and smaller than the density in the quiet corona. This agrees quantitatively with the observations and explains the bright and dark features at the same time.

Relevance:

30.00%

Publisher:

Abstract:

Transcription factor binding sites (TFBS) play key roles in gene expression and regulation. They are short sequence segments with definite structure and can be correctly recognized by the corresponding transcription factors. From the viewpoint of statistics, candidate TFBSs should be quite different from segments formed by randomly combined nucleotides. This paper proposes a combined statistical model for finding over-represented short sequence segments in different kinds of data sets. When an over-represented short sequence segment is described by a position weight matrix, the nucleotide distribution at most sites of the segment should be far from the background nucleotide distribution. The central idea of this approach is to search for such signals. The algorithm is tested on three data sets: the binding-site data set of the cyclic AMP receptor protein in E. coli, PlantProm DB (a non-redundant collection of proximal promoter sequences from different species), and the intergenic sequences of the whole E. coli genome. Even though the complexity of these three data sets is quite different, the results show that this model is rather general and sensible.
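
As a minimal sketch of the position-weight-matrix scoring described above (not the paper's combined statistical model), the following builds a log-odds matrix from a few aligned sites and scores candidate segments against a uniform background; the toy sites, pseudocount, and background frequencies are illustrative assumptions.

```python
import math
from collections import Counter

BASES = "ACGT"
BACKGROUND = {b: 0.25 for b in BASES}   # uniform background (assumption)

def build_pwm(sites, pseudocount=0.5):
    """Column-wise log-odds position weight matrix from aligned binding sites."""
    length = len(sites[0])
    pwm = []
    for i in range(length):
        counts = Counter(s[i] for s in sites)
        total = len(sites) + 4 * pseudocount
        pwm.append({b: math.log2((counts[b] + pseudocount) / total / BACKGROUND[b])
                    for b in BASES})
    return pwm

def score(pwm, segment):
    """Log-odds score of a candidate segment; high scores are far from background."""
    return sum(col[base] for col, base in zip(pwm, segment))

# Toy aligned sites (hypothetical); real input would be the data sets named above.
sites = ["TGTGA", "TGTGA", "TTTGA", "TGCGA"]
pwm = build_pwm(sites)
print(round(score(pwm, "TGTGA"), 2), round(score(pwm, "ACACT"), 2))
```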

Relevance:

30.00%

Publisher:

Abstract:

STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models using this software was difficult. SteelConverter was created as a means to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, and fixity into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. Both the productivity of the researcher and the level of confidence in the model being analyzed are greatly increased.

It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL, it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferable. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users could log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the use of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles where defaults associated with their most commonly run models are saved, and allows them to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also created a means for engineers and researchers with no access to powerful computer clusters to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.

In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was made between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free-vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two software packages on every aspect of each analysis. However, they also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in a software package more capable of conducting highly nonlinear analysis, Perform. These analyses again showed very strong agreement between the two packages in every aspect of each analysis through instability. However, due to some limitations in Perform, free-vibration analyses for the three-story one-bay chevron brace frame, the two-bay chevron brace frame, and the twenty-story moment frame could not be conducted. With the current trend towards ultimate-capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.

Following this, a final study was done on Hall's U20 structure [1], which was analyzed in all three software packages and the results compared. The pushover curves from each package were compared and the differences caused by variations in software implementation were explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, following the onset of inelastic behavior the analysis tool failed to converge. However, for the small number of time steps over which the ETABS analysis did converge, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not exactly match that of STEEL, particularly post-collapse. However, such problems could be alleviated by choosing a simpler material model.

Relevance:

30.00%

Publisher:

Abstract:

Many types of mazes have been used in cognitive brain research and data obtained from those experiments, especially those from rodents' studies, support the idea that the hippocampus is related to spatial learning and memory. But the results from non-huma

Relevance:

30.00%

Publisher:

Abstract:

Discrete element modeling is being used increasingly to simulate flow in fluidized beds. These models require complex measurement techniques to provide validation for the approximations inherent in the model. This paper introduces the idea of modeling the experiment to ensure that the validation is accurate. Specifically, a 3D cylindrical gas-fluidized bed was simulated using a discrete element model (DEM) for particle motion coupled with computational fluid dynamics (CFD) to describe the flow of gas. The results for time-averaged axial velocity during bubbling fluidization were compared with those from magnetic resonance (MR) experiments made on the bed. The DEM-CFD data were post-processed with various methods to produce time-averaged velocity maps for comparison with the MR results, including a method which closely matched the pulse sequence and data-processing procedure used in the MR experiments. The DEM-CFD results processed with the MR-type time-averaging closely matched the experimental MR results, validating the DEM-CFD model. Analysis of different averaging procedures confirmed that MR time-averages of dynamic systems correspond to particle-weighted averaging, rather than frame-weighted averaging, and also demonstrated that the use of Gaussian slices in MR imaging of dynamic systems is valid.
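
As a minimal sketch of the two averaging conventions compared above (not the paper's post-processing), the code below contrasts a frame-weighted average, which gives every snapshot equal weight, with a particle-weighted average, which weights each snapshot by the number of particles it contains, as a signal built from the particles themselves effectively does. The synthetic per-frame counts and velocities are made up so that the two averages visibly differ.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "snapshots" of one measurement voxel: per frame, a particle count
# and the mean axial velocity of the particles in that frame (made-up data).
n_frames = 1000
counts = rng.poisson(5, n_frames)                    # particles present per frame
frame_velocity = rng.normal(0.1, 0.05, n_frames)     # mean axial velocity per frame
# Emptier frames here happen to be faster, so the two averages differ.
frame_velocity += 0.02 * (counts.mean() - counts)

# Frame-weighted: every snapshot counts equally, even nearly empty ones.
frame_weighted = frame_velocity.mean()

# Particle-weighted: snapshots contribute in proportion to the particles they hold.
particle_weighted = np.average(frame_velocity, weights=counts)

print(f"frame-weighted   : {frame_weighted:.4f}")
print(f"particle-weighted: {particle_weighted:.4f}")
```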

Relevance:

30.00%

Publisher:

Abstract:

The basic idea of a defect model of photoconversion by an oxygen impurity in semi-insulating GaAs, proposed in an earlier paper, is described in a systematic way. All experiments related to this defect, including high-resolution spectroscopic measurements, piezospectroscopic study, and recent measurements on electronic energy levels, are explained on the basis of this defect model. The predictions of the model are in good agreement with the experiments. A special negative-U mechanism in this defect is discussed in detail with an emphasis on the stability of the charge states. The theoretical basis of using a self-consistent bond-orbital model in the calculation is also given.

Relevance:

30.00%

Publisher:

Abstract:

A new theoretical model of pattern recognition principles is proposed, based on "matter cognition" instead of the "matter classification" of traditional statistical pattern recognition. This new model is closer to the way human beings recognize objects than traditional statistical pattern recognition, which takes "optimal separation" as its main principle, so it is called Biomimetic Pattern Recognition (BPR). Its mathematical basis is the topological analysis of the sample set in the high-dimensional feature space, and it is therefore also called Topological Pattern Recognition (TPR). The fundamental idea of this model is the continuity, in the feature space, of samples belonging to any given class. We experimented with BPR using artificial neural networks, which act by covering the high-dimensional geometrical distribution of the sample set in the feature space. Omnidirectional cognition tests were done on various kinds of animal and vehicle models of rather similar shapes. Over a total of 8800 tests, the correct recognition rate is 99.87% and the rejection rate is 0.13%; under the condition of zero error rate, the correct rate of BPR was much better than that of RBF-SVM.
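
The coverage idea can be illustrated with a toy recognizer that represents each class as a union of fixed-radius balls around its training samples and rejects queries outside every class's cover; this is only a sketch of the cognition-by-coverage principle, not the neural-network covering units or the experiments reported above, and the radius and data are made up.

```python
import numpy as np

class BallCover:
    """Toy 'coverage' recognizer: a class is the union of fixed-radius balls
    around its training samples; queries outside every class's cover are rejected."""
    def __init__(self, radius=0.6):
        self.radius = radius
        self.classes = {}                       # label -> (n_samples, dim) array

    def fit(self, X, y):
        for label in np.unique(y):
            self.classes[label] = X[y == label]
        return self

    def predict(self, x):
        for label, samples in self.classes.items():
            if np.min(np.linalg.norm(samples - x, axis=1)) <= self.radius:
                return label
        return None                             # rejection: x is outside all covers

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
model = BallCover().fit(X, y)
print(model.predict(np.array([0.1, -0.2])))   # 0
print(model.predict(np.array([10.0, 10.0])))  # None (rejected, not misclassified)
```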

Relevance:

30.00%

Publisher:

Abstract:

With web caching and cache-related services like CDNs and edge services playing an increasingly significant role in the modern internet, the problem of the weak consistency and coherence provisions in current web protocols is becoming increasingly significant and is drawing the attention of the standards community [LCD01]. Toward this end, we present definitions of consistency and coherence for web-like environments, that is, distributed client-server information systems where the semantics of interactions with resources are more general than the read/write operations found in memory hierarchies and distributed file systems. We then present a brief review of proposed mechanisms which strengthen the consistency of caches in the web, focusing upon their conceptual contributions and their weaknesses in real-world practice. These insights motivate a new mechanism, which we call "Basis Token Consistency" or BTC; when implemented at the server, this mechanism allows any client (independent of the presence and conformity of any intermediaries) to maintain a self-consistent view of the server's state. This is accomplished by annotating responses with additional per-resource application information which allows client caches to recognize the obsolescence of currently cached entities and to identify responses from other caches which are already stale in light of what has already been seen. The mechanism requires no deviation from the existing client-server communication model, and does not require servers to maintain any additional per-client state. We discuss how our mechanism could be integrated into a fragment-assembling Content Management System (CMS), and present a simulation-driven performance comparison between the BTC algorithm and the use of the Time-To-Live (TTL) heuristic.
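
The following is a minimal sketch of the annotation idea as summarized above, not the actual BTC protocol, headers, or wire format: the server tags each response with (basis, version) tokens for the application data it depends on, and the client cache marks any cached entry as obsolete once it has seen a newer version of one of that entry's bases in any other response. The resource names, bases, and dependency map are invented for illustration.

```python
class Server:
    def __init__(self):
        self.versions = {"prices": 1, "inventory": 1}    # per-basis version counters
        self.depends = {"/catalog": ["prices", "inventory"], "/cart": ["prices"]}

    def update(self, basis):
        self.versions[basis] += 1

    def get(self, url):
        tokens = {b: self.versions[b] for b in self.depends[url]}
        return {"url": url, "body": f"<{url} page>", "tokens": tokens}

class ClientCache:
    def __init__(self):
        self.entries = {}      # url -> cached response
        self.seen = {}         # basis -> newest version observed in any response

    def store(self, resp):
        self.entries[resp["url"]] = resp
        for basis, ver in resp["tokens"].items():
            self.seen[basis] = max(self.seen.get(basis, 0), ver)

    def is_stale(self, url):
        resp = self.entries[url]
        return any(ver < self.seen.get(basis, 0)
                   for basis, ver in resp["tokens"].items())

server, cache = Server(), ClientCache()
cache.store(server.get("/catalog"))
server.update("prices")                 # server-side change to the 'prices' basis
cache.store(server.get("/cart"))        # newer 'prices' token arrives via another resource
print(cache.is_stale("/catalog"))       # True: cached /catalog is now known to be obsolete
print(cache.is_stale("/cart"))          # False
```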

Relevance:

30.00%

Publisher:

Abstract:

In many networked applications, independent caching agents cooperate by servicing each other's miss streams, without revealing the operational details of the caching mechanisms they employ. Inference of such details could be instrumental for many other processes. For example, it could be used for optimized forwarding (or routing) of one's own miss stream (or content) to available proxy caches, or for making cache-aware resource management decisions. In this paper, we introduce the Cache Inference Problem (CIP) as that of inferring the characteristics of a caching agent, given the miss stream of that agent. While CIP is unsolvable in its most general form, there are special cases of practical importance in which it is solvable, including when the request stream follows an Independent Reference Model (IRM) with a generalized power-law (GPL) demand distribution. To that end, we design two basic "litmus" tests that are able to detect LFU and LRU replacement policies, the effective size of the cache and of the object universe, and the skewness of the GPL demand for objects. Using extensive experiments under synthetic as well as real traces, we show that our methods infer such characteristics accurately and quite efficiently, and that they remain robust even when the IRM/GPL assumptions do not hold, and even when the underlying replacement policies are not "pure" LFU or LRU. We exemplify the value of our inference framework by considering example applications.
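
In the spirit of the litmus tests described above (though not the paper's actual procedure), the sketch below probes a black-box cache with a crafted request stream and reads the answer off the resulting hit/miss: an object made hot by many early requests survives a flood of cold objects under LFU but is evicted under LRU. The caches here are simulated stand-ins, and the cache size and stream lengths are illustrative.

```python
from collections import OrderedDict, Counter

class LRUCache:
    def __init__(self, size):
        self.size, self.store = size, OrderedDict()
    def request(self, key):
        hit = key in self.store
        if hit:
            self.store.move_to_end(key)          # refresh recency
        else:
            if len(self.store) >= self.size:
                self.store.popitem(last=False)   # evict least recently used
            self.store[key] = True
        return hit

class LFUCache:
    def __init__(self, size):
        self.size, self.store, self.freq = size, set(), Counter()
    def request(self, key):
        self.freq[key] += 1
        hit = key in self.store
        if not hit:
            if len(self.store) >= self.size:
                self.store.remove(min(self.store, key=lambda k: self.freq[k]))
            self.store.add(key)
        return hit

def probe(cache, size):
    """Litmus-style probe: make one object hot, flood with cold objects, re-request it."""
    for _ in range(50):
        cache.request("hot")
    for i in range(2 * size):
        cache.request(f"cold-{i}")
    return "LFU-like" if cache.request("hot") else "LRU-like"

SIZE = 8
print(probe(LRUCache(SIZE), SIZE))   # LRU-like: the hot object was evicted by recency
print(probe(LFUCache(SIZE), SIZE))   # LFU-like: the hot object survived on frequency
```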

Relevance:

30.00%

Publisher:

Abstract:

The interpretations people attach to line drawings reflect shape-related processes in human vision. Their divergences from expectations embodied in related machine vision traditions are summarized, and used to suggest how human vision decomposes the task of interpretation. A model called IO implements this idea. It first identifies geometrically regular, local fragments. Initial decisions fix edge orientations, and this information constrains decisions about other properties. Relations between fragments are explored, beginning with weak consistency checks and moving to fuller ones. IO's output captures multiple distinctive characteristics of human performance, and it suggests steady progress towards understanding shape-related visual processes is possible.