908 results for CLEOPATRA (Computer program language)


Relevance:

100.00%

Publisher:

Abstract:

This work addresses the effect of asymmetric surcharge loading on piles through a case study of the abutments of the Capivari River Bridge, part of the Arco Metropolitano works. In this particular case the piles were built before the reinforced-earth embankments were placed, and the soil was reinforced with stone columns to minimize the Tschebotarioff effect. Building on the literature review presented, a Finite Element Method analysis was carried out with the program Plaxis, focused on obtaining the displacements and bending moments in the piles for the two cases analysed: without stone columns, and with an equivalent homogeneous soil (with stone columns). It was possible to verify the influence of the stone columns in reducing the bending moments and horizontal displacements of the abutment piles. A further simulation, in which the piles would be built only after the reinforced-earth embankments were completed, showed that the effect of asymmetric surcharges would then be mitigated. Finally, the displacements derived from field instrumentation data (inclinometers) were compared with those obtained from the numerical analyses; the two were mutually consistent, demonstrating that the methodology adopted to simulate the stone columns in the Finite Element Method was adequate.
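
The "equivalent homogeneous soil" mentioned above is commonly obtained by weighting soil and stone-column stiffness by the area replacement ratio; a minimal sketch of that homogenization follows, with purely illustrative parameter values (the study's actual scheme and parameters are not given in the abstract).

```python
# Minimal sketch of one common homogenization scheme for stone-column-
# reinforced ground: an area-replacement-ratio weighted average of the
# constituent stiffnesses. Values below are illustrative only.

def equivalent_soil(E_soil, E_column, area_replacement_ratio):
    """Weighted-average Young's modulus of the composite ground (kPa)."""
    a = area_replacement_ratio            # A_columns / A_total
    return a * E_column + (1.0 - a) * E_soil

# Illustrative numbers: soft clay (3 MPa), stone column (60 MPa), 20% coverage
E_eq = equivalent_soil(E_soil=3.0e3, E_column=60.0e3, area_replacement_ratio=0.20)
print(f"Equivalent modulus: {E_eq:.0f} kPa")   # 14400 kPa
```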

Relevance:

100.00%

Publisher:

Abstract:

Population parameters of Lepturacanthus savala from trawl catches in the north-eastern part of the Bay of Bengal, Bangladesh, were investigated based on length-frequency data using the complete ELEFAN computer program. The asymptotic length (L∞) and growth constant (K) were estimated to be 106.50 cm (total length) and 0.80/year, respectively. Based on these growth parameters, the total mortality (Z) was estimated to be 1.89. The estimated values for natural mortality (M) and fishing mortality (F) were 1.08 and 0.81, respectively. The exploitation rate (E) estimated using the length-converted catch curve was 0.43. The recruitment pattern showed two peaks per year. The estimated sizes of L. savala at 25, 50 and 75% probabilities of capture were 57.49, 60.39 and 63.28 cm, respectively. The estimated length-weight relationship for combined sexes was W = 0.00093 TL^2.97.
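
The quoted estimates are tied together by the standard fishery relations Z = M + F and E = F/Z, and by the length-weight law W = a·TL^b; the short check below uses only values taken from the abstract (the age-3 length is an added illustration of the von Bertalanffy growth curve, not a figure from the paper).

```python
import math

# Quick consistency check of the reported population parameters, using
# only values quoted in the abstract and the standard fishery relations.

L_inf, K = 106.50, 0.80      # asymptotic length (cm) and growth constant (1/yr)
Z, M = 1.89, 1.08            # total and natural mortality (1/yr)

F = Z - M                    # fishing mortality, from Z = M + F  -> 0.81
E = F / Z                    # exploitation rate, from E = F / Z  -> ~0.43

L_age3 = L_inf * (1 - math.exp(-K * 3))   # von Bertalanffy length at age 3 (cm)

a, b = 0.00093, 2.97         # length-weight relationship W = a * TL^b
W = a * 60.39 ** b           # predicted weight (g) at the 50%-capture length

print(round(F, 2), round(E, 2), round(L_age3, 1), round(W, 1))
```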

Relevance:

100.00%

Publisher:

Abstract:

This article investigates how to use the UK probabilistic climate-change projections (UKCP09) in rigorous building energy analysis. Two office buildings (deep plan and shallow plan) are used as case studies to demonstrate the application of UKCP09. Three different methods for reducing the computational demands are explored: statistical reduction (Finkelstein-Schafer [F-S] statistics), simplification using degree-day theory, and the use of metamodels. The first method, which is based on an established technique, can be used as a reference because it provides the most accurate information; however, the weather files must be chosen automatically, based on the F-S statistic, by a computer program, because thousands of weather files created by the UKCP09 weather generator need to be processed. A combination of the second method (degree-day theory) and the third (metamodels) requires only a relatively small number of simulation runs, but still provides valuable information for further uncertainty and sensitivity analyses. The article also demonstrates how grid computing can be used to speed up the calculation for many independent EnergyPlus models by harnessing the processing power of idle desktop computers. © 2011 International Building Performance Simulation Association (IBPSA).
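
For readers unfamiliar with the first method: the Finkelstein-Schafer statistic compares the empirical cumulative distribution of one candidate month against the long-term distribution for that calendar month, and the candidate with the smallest value is the most representative. A minimal sketch (with made-up data, not UKCP09 output) follows.

```python
import numpy as np

# Sketch of the Finkelstein-Schafer (F-S) statistic: the mean absolute
# difference between the empirical CDF of a candidate month and the
# long-term CDF of the same calendar month. Data are illustrative.

def empirical_cdf(sample, grid):
    sample = np.sort(sample)
    return np.searchsorted(sample, grid, side="right") / len(sample)

def fs_statistic(candidate_month, long_term):
    grid = np.sort(np.concatenate([candidate_month, long_term]))
    return np.mean(np.abs(empirical_cdf(candidate_month, grid)
                          - empirical_cdf(long_term, grid)))

# The candidate month (e.g. from one generated weather file) with the
# smallest F-S value best represents the long-term record.
rng = np.random.default_rng(0)
print(fs_statistic(rng.normal(16, 3, 30), rng.normal(15, 3, 900)))
```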

Relevance:

100.00%

Publisher:

Abstract:

The creep effects on sequentially built bridges are analysed using the theory of thermal creep. Two types of analysis are used: time-dependent and steady-state. The traditional uniform creep analysis is also introduced briefly. Both simplified and parabolic normalising creep-temperature functions are used in the analysis for comparison. Numerical examples are presented, calculated by a computer program based on the theory of thermal creep and using the displacement method. It is concluded that different assumptions within thermal creep can lead to very different results when compared with uniform creep analysis. The steady-state analysis of monolithically built structures can serve as a limit for evaluating total creep effects for both monolithically and sequentially built structures. The importance of correctly selecting the normalising creep-temperature function is demonstrated.
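
The abstract does not reproduce the underlying equations; as a hedged reminder of the analogy such an analysis typically rests on, creep strain under constant stress has the same form as a thermal strain, which is what lets a displacement-method thermal-stress program compute creep effects:

```latex
% Sketch of the assumed creep-temperature analogy (standard form only; the
% paper's simplified and parabolic normalising functions are not given in
% the abstract). Creep strain under constant stress,
% \varepsilon_c(t) = (\sigma/E)\,\varphi(t,t_0), is formally a thermal
% strain \varepsilon_T = \alpha\,\Delta T, giving the identification
\[
  \alpha \, \Delta T \;\equiv\; \frac{\sigma}{E}\,\varphi(t, t_0),
\]
% so a normalising creep-temperature function maps the creep coefficient
% \varphi onto an equivalent temperature change \Delta T.
```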

Relevance:

100.00%

Publisher:

Abstract:

The paper describes an experimental and theoretical study of the deposition of small spherical particles from a turbulent air flow in a curved duct. The objective was to investigate the interaction between the streamline curvature of the primary flow and the turbulent deposition mechanisms of diffusion and turbophoresis. The experiments were conducted with particles of uranine (used as a fluorescent tracer) produced by an aerosol generator. The particles were entrained in an air flow which passed vertically downwards through a long straight channel of rectangular cross-section leading to a 90° bend. The inside surfaces of the channel and bend were covered with tape to collect the deposited particles. Following a test run the tape was removed in sections, the uranine was dissolved in sodium hydroxide solution and the deposition rates established by measuring the uranine concentration with a luminescence spectrometer. The experimental results were compared with calculations of particle deposition in a curved duct using a computer program that solved the ensemble-averaged particle mass and momentum conservation equations. A particle density-weighted averaging procedure was used and the equations were expressed in terms of the particle convective, rather than total, velocity. This approach provided a simpler formulation of the particle turbulence correlations generated by the averaging process. The computer program was used to investigate the distance required to achieve a fully-developed particle flow in the straight entry channel as well as the variation of the deposition rate around the bend. The simulations showed good agreement with the experimental results. © 2012 Elsevier Ltd.
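
One of the turbulent deposition mechanisms named above, turbophoresis, is a particle drift down gradients of the wall-normal fluid velocity variance, V_t = -τ_p d⟨v′²⟩/dy. The sketch below evaluates that term on an assumed variance profile; the numbers and the profile shape are illustrative, not the paper's data or its full ensemble-averaged equations.

```python
import numpy as np

# Illustrative turbophoretic drift: V_t(y) = -tau_p * d<v'v'>/dy, where
# tau_p is the particle relaxation time and y the distance from the wall.
# The rms-velocity profile below is a made-up smooth fit.

tau_p = 5.0e-4                         # particle relaxation time (s), assumed
y = np.linspace(1e-4, 1e-2, 200)       # wall-normal distance (m)
v_rms = 0.05 * y / (y + 1e-3)          # assumed rms wall-normal velocity (m/s)

variance = v_rms**2
V_turb = -tau_p * np.gradient(variance, y)   # turbophoretic velocity (m/s)

# The variance grows away from the wall, so V_turb is negative, i.e.
# directed toward the wall: turbophoresis enhances deposition.
print(V_turb[:3])
```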

Relevance:

100.00%

Publisher:

Abstract:

One of the most important issues facing the helicopter industry today is helicopter noise, in particular transonic rotor noise. It is the main factor limiting cruise speeds, and there is real demand for efficient and reliable prediction methods that can be used in the rotor design process. This paper considers the Ffowcs Williams-Hawkings equation applied to a permeable control surface. The surface is chosen to be as small as possible while enclosing both the blade and any transonic flow regions. This allows the problematic quadrupole term always to be neglected, and requires only near-field CFD input data. It is therefore less computationally intensive than existing prediction methods and, moreover, retains the physical interpretation of the sources in terms of thickness, loading and shock-associated noise. A computer program has been developed which implements the permeable-surface form of the retarded-time formulation. The program has been validated and subsequently used to validate an acoustic 2-D CFD code. It is fast and reliable for subsonic motion, but it is demonstrated that it cannot be used at high subsonic or supersonic speeds. A second computer program implementing a more general formulation has also been developed and is presently being validated. This general formulation can be applied at high subsonic and supersonic speeds, except under one specific condition. © 2002 by the author(s). Published by the American Institute of Aeronautics and Astronautics, Inc.
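
For reference, the permeable-surface Ffowcs Williams-Hawkings equation that such formulations start from, in its standard form (the notation here is the usual one and is an assumption, not quoted from the paper):

```latex
\[
  \Box^2 p'(\mathbf{x},t) =
      \frac{\partial}{\partial t}\!\left[\rho_0 U_n\,\delta(f)\right]
    - \frac{\partial}{\partial x_i}\!\left[L_i\,\delta(f)\right]
    + \frac{\partial^2}{\partial x_i\,\partial x_j}\!\left[T_{ij}\,H(f)\right]
\]
% f = 0 defines the control surface, U_n and L_i are the mass- and
% momentum-flux source terms on the surface, T_{ij} is the Lighthill
% stress tensor, and H(f), \delta(f) are the Heaviside and Dirac
% functions. If the surface encloses the blade and every transonic flow
% region, only weak quadrupole sources remain outside it, so the volume
% term T_{ij} H(f) can be neglected, as the abstract describes.
```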

Relevance:

100.00%

Publisher:

Abstract:

The discipline of Artificial Intelligence (AI) was born in the summer of 1956 at Dartmouth College in Hanover, New Hampshire. Half of a century has passed, and AI has turned into an important field whose influence on our daily lives can hardly be overestimated. The original view of intelligence as a computer program - a set of algorithms to process symbols - has led to many useful applications now found in internet search engines, voice recognition software, cars, home appliances, and consumer electronics, but it has not yet contributed significantly to our understanding of natural forms of intelligence. Since the 1980s, AI has expanded into a broader study of the interaction between the body, brain, and environment, and how intelligence emerges from such interaction. This advent of embodiment has provided an entirely new way of thinking that goes well beyond artificial intelligence proper, to include the study of intelligent action in agents other than organisms or robots. For example, it supplies powerful metaphors for viewing corporations, groups of agents, and networked embedded devices as intelligent and adaptive systems acting in highly uncertain and unpredictable environments. In addition to giving us a novel outlook on information technology in general, this broader view of AI also offers unexpected perspectives into how to think about ourselves and the world around us. In this chapter, we briefly review the turbulent history of AI research, point to some of its current trends, and to challenges that the AI of the 21st century will have to face. © Springer-Verlag Berlin Heidelberg 2007.

Relevance:

100.00%

Publisher:

Abstract:

An automated and semi-intelligent voltammetric system for trace metal analysis is described. The system consists of a voltammeter interfaced with a personal computer, a sample changer, two peristaltic pumps, a motor burette and a hanging mercury drop electrode. Fully automatically, the system carries out approximately five metal determinations per hour (including at least three repetitive scans and calibration by standard addition) at the trace levels encountered in clean sea water. The computer program decides what level of standard addition to use and evaluates the data before switching to the next sample. Alternatively, the system can be used to carry out complexing-ligand titrations with copper while recording the labile copper concentration; in this mode up to eight full titrations are carried out per day. Depth profiles for chromium speciation in the Mediterranean Sea and a profile of copper-complexing-ligand concentrations in the North Atlantic Ocean, measured on board ship with the system, are presented. The chromium speciation was determined using a new method to differentiate between Cr(III) and Cr(VI), utilizing adsorption of Cr(III) on silica particles.
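
The standard-addition calibration that the program automates reduces to a linear fit of peak current against added standard concentration, with the sample concentration read off the x-intercept. A minimal sketch with illustrative numbers (not data from the paper):

```python
import numpy as np

# Standard-addition evaluation: fit signal vs. added concentration and
# recover the unknown as |x-intercept| = intercept / slope.

added = np.array([0.0, 2.0, 4.0, 6.0])        # added standard (nM)
current = np.array([1.00, 1.52, 2.01, 2.53])  # measured peak current (nA)

slope, intercept = np.polyfit(added, current, 1)
c_sample = intercept / slope                  # original concentration (nM)
print(f"Sample concentration ~ {c_sample:.2f} nM")
```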

Relevance:

100.00%

Publisher:

Abstract:

The identification of near-native protein-protein complexes among a set of decoys remains highly challenging. A strategy for improving the success rate of near-native detection is to enrich near-native docking decoys within a small number of top-ranked decoys. Recently, we found that a combination of three scoring functions (energy, conservation, and interface propensity) can predict the location of binding-interface regions with reasonable accuracy. Here, these three scoring functions are modified and combined into a consensus scoring function called ENDES for enriching near-native docking decoys. We found that all individual scores result in enrichment for the majority of the 28 targets in the ZDOCK2.3 decoy set and the 22 targets in Benchmark 2.0. Among the three scores, the interface-propensity score yields the highest enrichment in both sets of protein complexes. When these scores are combined into the ENDES consensus score, a significant increase in the enrichment of near-native structures is found. For example, when 2000 docking decoys are reduced to 200 decoys by ENDES, the fraction of near-native structures among the decoys increases by a factor of about six on average. ENDES has been implemented in a computer program that is available for download at http://sparks.informatics.iupui.edu.
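
The abstract does not say how the three scores are combined; one simple consensus scheme consistent with the description, shown here purely as an assumption, is to z-normalize each score over the decoy set, sum them, and keep the top-ranked decoys.

```python
import numpy as np

# Hypothetical consensus scoring in the spirit of ENDES: z-normalize each
# individual score across the decoy set and sum. The equal weighting and
# the random scores below are illustrative assumptions.

def consensus_rank(energy, conservation, propensity, keep=200):
    scores = np.vstack([energy, conservation, propensity])
    z = (scores - scores.mean(axis=1, keepdims=True)) / scores.std(axis=1, keepdims=True)
    combined = z.sum(axis=0)
    return np.argsort(combined)[::-1][:keep]   # indices of top `keep` decoys

rng = np.random.default_rng(1)
n = 2000
top = consensus_rank(rng.normal(size=n), rng.normal(size=n), rng.normal(size=n))
print(len(top))   # 200 retained decoys, mirroring the 2000 -> 200 reduction
```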

Relevance:

100.00%

Publisher:

Abstract:

Random amplified polymorphic DNA (RAPD) analysis was applied to germplasm characterization in 33 different selected Laminaria male and female gametophytes. Positional-homology analysis of the RAPD bands using the sequence-characterized amplified region (SCAR) method was successfully conducted. A total of 233 polymorphic loci were obtained from 18 primers selected after screening, of which 27 stable and clear bands were chosen to construct a fingerprint map for discriminating each gametophyte. Seven RAPD markers from five primers were finally determined by a computer program to construct the fingerprint map. Three specific markers closely related to the gametophytes were obtained and converted into gametophytic SCAR markers, the first report of SCAR markers for Laminaria germplasm applicable to cultivar identification. These results demonstrate the feasibility of applying RAPD markers to germplasm characterization in selected Laminaria gametophytes and can provide a molecular basis for breeding new Laminaria strains. (C) 2004 Elsevier B.V. All rights reserved.
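
The abstract does not describe the selection algorithm, but picking a handful of markers whose presence/absence pattern is unique for every gametophyte is naturally a set-cover problem over pairs of individuals; a hypothetical greedy sketch with made-up binary profiles:

```python
from itertools import combinations

# Greedy selection of a small marker subset that distinguishes every pair
# of gametophytes. Profiles below are invented 0/1 band scores; this is an
# illustration of the task, not the paper's actual program.

def greedy_marker_subset(profiles):
    """profiles: list of tuples of 0/1 marker scores, one per gametophyte."""
    n_markers = len(profiles[0])
    pairs = set(combinations(range(len(profiles)), 2))
    chosen = []
    while pairs:
        # marker separating the most still-indistinguishable pairs
        best = max(range(n_markers),
                   key=lambda m: sum(profiles[i][m] != profiles[j][m]
                                     for i, j in pairs))
        separated = {(i, j) for i, j in pairs
                     if profiles[i][best] != profiles[j][best]}
        if not separated:
            break                      # remaining pairs cannot be separated
        chosen.append(best)
        pairs -= separated
    return chosen

print(greedy_marker_subset([(1,0,1,0), (1,1,0,0), (0,1,1,1), (0,0,0,1)]))
```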

Relevance:

100.00%

Publisher:

Abstract:

From the standpoint of information control, this paper defines a robot language as a computer programming language capable of handling certain specific "external devices". The components of a robot language are divided into two major parts: the robot core language and the robot-specific language. The paper then reviews the state of progress of robot-specific languages and robot core languages in turn.

Relevance:

100.00%

Publisher:

Abstract:

The Kineticist's Workbench is a computer program, currently under development, whose purpose is to help chemists understand, analyze, and simplify complex chemical reaction mechanisms. This paper discusses one module of the program that numerically simulates mechanisms and constructs qualitative descriptions of the simulation results. These descriptions are given in terms that are meaningful to the working chemist (steady states, stable oscillations, and so on), and the descriptions (as well as the data structures used to construct them) are accessible as input to other programs.
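
A toy illustration of the qualitative-description step (the thresholds and logic below are assumptions, not the Workbench's actual criteria): classify the post-transient tail of a simulated concentration trace as a steady state or a stable oscillation.

```python
import numpy as np

# Label the tail of a trace: "steady state" if its spread is negligible,
# "stable oscillation" if it repeatedly crosses its mean. Tolerances are
# illustrative assumptions.

def qualitative_label(trace, tol=1e-3):
    tail = np.asarray(trace)[-500:]            # ignore the transient
    spread = tail.max() - tail.min()
    if spread < tol * max(1.0, abs(tail.mean())):
        return "steady state"
    crossings = np.sum(np.diff(np.sign(tail - tail.mean())) != 0)
    return "stable oscillation" if crossings >= 4 else "other"

t = np.linspace(0, 100, 2000)
print(qualitative_label(1.0 - np.exp(-t)))        # -> steady state
print(qualitative_label(1.0 + 0.1 * np.sin(t)))   # -> stable oscillation
```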

Relevance:

100.00%

Publisher:

Abstract:

The Bifurcation Interpreter is a computer program that autonomously explores the steady-state orbits of one-parameter families of periodically driven oscillators. To report its findings, the Interpreter generates schematic diagrams and English text descriptions similar to those appearing in the science and engineering research literature. Given a system of equations as input, the Interpreter uses symbolic algebra to automatically generate numerical procedures that simulate the system. The Interpreter incorporates knowledge about dynamical systems theory, which it uses to guide the simulations, to interpret the results, and to minimize the effects of numerical error.
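
A hedged sketch of the core loop such a tool automates: integrate a periodically driven oscillator (a Duffing equation is used here purely as an example, not necessarily one studied by the Interpreter), sample the state once per drive period to obtain the stroboscopic map, and test the settled points for a low-period orbit.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Example system: a damped, periodically forced Duffing oscillator.
def duffing(t, s, gamma=0.3, omega=1.2):
    x, v = s
    return [v, -0.2 * v + x - x**3 + gamma * np.cos(omega * t)]

T = 2 * np.pi / 1.2                          # drive period
t_samples = np.arange(0, 400) * T            # one sample per period
sol = solve_ivp(duffing, (0, t_samples[-1]), [0.1, 0.0],
                t_eval=t_samples, rtol=1e-8, atol=1e-10)
points = sol.y[:, -50:]                      # stroboscopic points, post-transient

for p in (1, 2, 4):                          # test for a period-p orbit
    if np.allclose(points[:, p:], points[:, :-p], atol=1e-3):
        print(f"period-{p} orbit")
        break
else:
    print("higher period or chaotic")
```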

Relevance:

100.00%

Publisher:

Abstract:

Humans rapidly and reliably learn many kinds of regularities and generalizations. We propose a novel model of fast learning that exploits the properties of sparse representations and the constraints imposed by a plausible hardware mechanism. To demonstrate our approach, we describe a computational model of acquisition in the domain of morphophonology. We encapsulate phonological information as bidirectional boolean constraint relations operating on the classical linguistic representations of speech sounds in terms of distinctive features. The performance model is described as a hardware mechanism that incrementally enforces the constraints. Phonological behavior arises from the action of this mechanism. Constraints are induced from a corpus of common English nouns and verbs. The induction algorithm compiles the corpus into increasingly sophisticated constraints. The algorithm yields one-shot learning from a few examples. Our model has been implemented as a computer program. The program exhibits phonological behavior similar to that of young children. As a bonus, the constraints that are acquired can be interpreted as classical linguistic rules.
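
As a toy analogue of the induction step (the real model's bidirectional constraints are richer than this), one can mine exceptionless implications between distinctive-feature values across a small corpus:

```python
from itertools import product

# Keep every implication "feature A = x  =>  feature B = y" that holds
# without exception in a (made-up) corpus of feature assignments. The
# corpus, feature names, and rule format are illustrative assumptions.

corpus = [
    {"stem_final_voiced": 1, "suffix_voiced": 1},   # dogs -> /z/
    {"stem_final_voiced": 0, "suffix_voiced": 0},   # cats -> /s/
    {"stem_final_voiced": 1, "suffix_voiced": 1},   # ribs -> /z/
]

features = sorted(corpus[0])
rules = []
for a, b in product(features, repeat=2):
    if a == b:
        continue
    for x, y in product((0, 1), repeat=2):
        support = [f for f in corpus if f[a] == x]
        if support and all(f[b] == y for f in support):
            rules.append(f"{a}={x} -> {b}={y}")

print(rules)   # includes voicing agreement in both directions
```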

Relevance:

100.00%

Publisher:

Abstract:

This report describes MM, a computer program that can model a variety of mechanical and fluid systems. Given a system's structure and qualitative behavior, MM searches for models using an energy-based modeling framework. MM uses general facts about physical systems to relate behavioral and model properties. These facts enable a more focused search for models than would be obtained by mere comparison of desired and predicted behaviors. When these facts do not apply, MM uses behavior-constrained qualitative simulation to verify candidate models efficiently. MM can also design experiments to distinguish among multiple candidate models.