965 results for ORACLE (Computer Programs)


Relevance:

80.00%

Publisher:

Abstract:

Through a current and practical approach, this work aims to demonstrate how a speed reducer would behave in a real situation, but within a digital environment. First, each component of the gear-driven reducer was modelled. With the components modelled, it was possible to assemble them and thus characterize the unit as a speed reducer; with the reducer properly sized and dimensioned, its operation could finally be demonstrated within the Autodesk Inventor™ 2014 environment.
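
The abstract does not give the reducer's actual gearing, but the kinematics demonstrated in Inventor can be sketched numerically. The tooth counts below are hypothetical, purely for illustration:

```python
def stage_ratio(driving_teeth, driven_teeth):
    """Speed ratio of one gear mesh: output speed = input speed / ratio."""
    return driven_teeth / driving_teeth

def reducer_output_speed(input_rpm, stages):
    """Overall output speed of a multi-stage gear reducer.

    `stages` is a list of (driving_teeth, driven_teeth) pairs;
    these tooth counts are illustrative, not taken from the thesis.
    """
    speed = input_rpm
    for driving, driven in stages:
        speed /= stage_ratio(driving, driven)
    return speed

# Hypothetical two-stage reducer: 18:54 then 20:60 gives a 9:1 reduction
print(reducer_output_speed(1800, [(18, 54), (20, 60)]))  # -> 200.0
```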

Relevance:

80.00%

Publisher:

Abstract:

Two new species of inseminating freshwater fishes of the genus Monotocheirodon, family Characidae, are described from Peru. Males and females of both new species have an external, visually obvious urogenital papilla that was not detected in the females in previous studies; the papilla is longer in males, which use it as an inseminating organ. A third inseminating species from Bolivia, Monotocheirodon pearsoni, unstudied in any detail since its original description in 1924, is redescribed. This latter species lacks an inseminating organ. Monotocheirodon is redescribed, its phylogenetic relationships are briefly discussed, and it is suggested that it is possibly related to the stevardiin genera Ceratobranchia, Othonocheirodus, and Odontostoechus.

Relevance:

80.00%

Publisher:

Abstract:

While imperfect-information games are an excellent model of real-world problems and tasks, they are often difficult for computer programs to play at a high level of proficiency, especially when they involve major uncertainty and a very large state space. Kriegspiel, a chess variant that makes the game similar to a wargame, is a perfect example: although the game has been studied for decades from a game-theoretical viewpoint, the first practical algorithms for playing it appeared only very recently. This thesis presents, documents and tests a multi-sided effort towards building a strong Kriegspiel player, using heuristic search, retrograde analysis and Monte Carlo tree search algorithms to achieve increasingly higher levels of play. The resulting program is currently the strongest computer player in the world and plays at an above-average human level.
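
The thesis's engine itself is not reproduced in the abstract; as a sketch of the Monte Carlo tree search ingredient it names, the UCB1 rule below is the standard child-selection heuristic used during the MCTS tree descent (the constant and the statistics are illustrative):

```python
import math

def ucb1(wins, visits, parent_visits, c=1.41):
    """UCB1 score for selecting a child during the MCTS tree descent.
    Unvisited children score +inf so every move is tried at least once."""
    if visits == 0:
        return math.inf
    return wins / visits + c * math.sqrt(math.log(parent_visits) / visits)

def select_child(children, parent_visits):
    """children: dict move -> (wins, visits). Returns the move maximizing UCB1."""
    return max(children, key=lambda m: ucb1(*children[m], parent_visits))

# After 100 simulations: 'a' has a 75% win rate over many samples,
# 'b' is under-explored; here exploitation narrowly beats exploration.
children = {'a': (60, 80), 'b': (8, 20)}
print(select_child(children, 100))  # -> 'a'
```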

Relevance:

80.00%

Publisher:

Abstract:

In this dissertation, based on the parallel/orthogonal space method, a new method was developed for computing general massive two-loop three-point tensor integrals of planar and crossed reduced-planar topology. A tensor reduction for integrals that may carry a general tensor structure in Minkowski space was worked out and implemented. An algorithm for the semi-analytical evaluation of the most difficult integrals remaining after tensor reduction was developed and implemented. (For the other basis integrals, well-known methods can be used.) The implementation is complete for the UV-finite parts of the master integrals that still possess the aforementioned topologies after tensor reduction, and the numerical integrations have proven stable. For the remaining parts of the project, well-known methods can be used; in large part, only interfaces to existing programs still need to be written. For the few remaining special topologies yet to be covered, (well-known) methods are to be implemented. The computer programs created within this project will also feed into the xloops project for more general processes; they were therefore developed and implemented for general processes as far as possible. The algorithm mentioned above was developed in particular for the evaluation of the fermionic NNLO corrections to the leptonic weak mixing angle and to similar processes. Within this dissertation, a large part of the work required for the fermionic NNLO corrections to the effective coupling constants of the Z decay (and thus for the weak mixing angle) was carried out.

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to develop a model capable of capturing the different contributions that characterize the nonlinear behaviour of reinforced concrete structures. In particular, especially for non-slender structures, the contribution to the nonlinear deformation due to bending may not be sufficient to determine the structural response. Two different models based on a fibre beam-column element are proposed here. These models can reproduce flexure-shear interaction in the nonlinear range, with the purpose of improving the analysis of shear-critical structures. The first element discussed is based on a flexibility formulation combined with the Modified Compression Field Theory (MCFT) as the material constitutive law. The other model described in this thesis is based on a three-field variational formulation combined with a 3D generalized plastic-damage model as the constitutive relationship. The first model was developed by combining a fibre beam-column element based on the flexibility formulation with the MCFT as the constitutive relationship; the flexibility formulation, in fact, appears particularly effective for analysis in the nonlinear field. It is precisely the coupling between the fibre element, which models the structure, and the shear panel, which models the individual fibres, that makes it possible to describe the nonlinear response associated with flexure and shear, and especially their interaction in the nonlinear field. The model was implemented in an original MATLAB® computer code for describing the response of generic structures. The simulations carried out allowed the model's range of applicability to be verified. Comparisons with available experimental results on reinforced concrete shear walls were performed in order to validate the model; these results have the peculiarity of distinguishing the separate contributions due to flexure and shear.
The presented simulations were carried out, in particular, for monotonic loading. The model was also tested through numerical comparisons with other computer programs. Finally, it was applied in a numerical study of the influence of the nonlinear shear response in non-slender reinforced concrete (RC) members. Another approach to the problem was studied during a period of research at the University of California, Berkeley. The beam formulation follows the assumptions of the Timoshenko shear beam theory for the displacement field and uses a three-field variational formulation in the derivation of the element response. A generalized plasticity model is implemented for structural steel, and a 3D plastic-damage model is used for the simulation of concrete. The transverse normal stress is used to satisfy the transverse equilibrium equations at each control section; this criterion is also used for the condensation of degrees of freedom from the 3D constitutive material model to the beam element. This thesis presents the beam formulation and the constitutive relationships; various analyses and comparisons between the two proposed models are still being carried out.
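
As a sketch of what a fibre beam-column element does at section level, plane-section kinematics turn fibre strains into section forces. A linear-elastic law stands in here for the MCFT and plastic-damage models actually used in the thesis; the section dimensions are illustrative:

```python
def make_rect_fibres(b, h, n):
    """Discretize a b x h rectangular section into n horizontal strips (area, y)."""
    dy = h / n
    return [(b * dy, -h / 2 + (i + 0.5) * dy) for i in range(n)]

def section_forces(fibres, eps0, kappa, E=30e9):
    """Axial force N and moment M via plane sections: eps_i = eps0 + kappa * y_i.
    Linear elasticity is a placeholder for the MCFT / plastic-damage laws."""
    N = sum(E * (eps0 + kappa * y) * a for a, y in fibres)
    M = sum(E * (eps0 + kappa * y) * a * y for a, y in fibres)
    return N, M

# Pure curvature on a 0.3 m x 0.5 m section: N ~ 0 and M ~ E*I*kappa = 9375 N.m
fibres = make_rect_fibres(0.3, 0.5, 50)
N, M = section_forces(fibres, 0.0, 1e-4)
```

With a nonlinear constitutive law the same loop yields the section's moment-curvature response, which the element integrates along its length.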

Relevance:

80.00%

Publisher:

Abstract:

In this thesis a mathematical model is derived that describes charge and energy transport in semiconductor devices such as transistors, and numerical simulations of these physical processes are performed. To accomplish this, methods of theoretical physics, functional analysis, numerical mathematics and computer programming are applied. After an introduction to the status quo of semiconductor device simulation methods and a brief review of the historical development, attention shifts to the construction of a model which serves as the basis of the subsequent derivations. The starting point is an important equation from the theory of dilute gases, from which the model equations are derived and specified by means of a series-expansion method. This is done in a multi-stage derivation process, which is mainly taken from a scientific paper and does not constitute the focus of this thesis. In the following phase we specify the mathematical setting and make the model assumptions precise, using methods of functional analysis. Since the equations we deal with are coupled, we are concerned with a non-standard problem; by contrast, the theory of scalar elliptic equations is by now well established. Subsequently, we turn to the numerical discretization of the equations. A special finite-element method is used for the discretization; this special approach is necessary to make the numerical results suitable for practical application. Through a series of transformations of the discrete model we derive a system of algebraic equations amenable to numerical evaluation. Using purpose-built computer programs we solve the equations to obtain approximate solutions. These programs are based on new, specialized iteration procedures that were developed and thoroughly tested within the frame of this research work.
Because of their importance and their novelty, these procedures are explained and demonstrated in detail. We compare the new iterations with a standard method, complemented by a feature to fit the current context. A further innovation is the computation of solutions on three-dimensional domains, which are still rare. Special attention is paid to the applicability of the 3D simulation tools, and the programs are designed to have justifiable computational complexity. Simulation results for some models of contemporary semiconductor devices are shown, with detailed comments on the results. Finally, we give an outlook on future developments and enhancements of the models and algorithms used.
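
The thesis's specialized iteration procedures are not published in the abstract; the sketch below shows the generic damped-Newton pattern such solvers for coupled nonlinear systems build on, applied to a toy two-equation system rather than the semiconductor equations:

```python
import numpy as np

def damped_newton(F, J, x0, tol=1e-10, max_iter=50):
    """Damped Newton iteration for F(x) = 0 (a generic stand-in for the
    thesis's specialized solvers). The step is halved until the residual
    norm decreases - a simple line-search damping strategy."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            return x
        dx = np.linalg.solve(J(x), -r)   # full Newton step
        t = 1.0
        while t > 1e-6 and np.linalg.norm(F(x + t * dx)) >= np.linalg.norm(r):
            t *= 0.5                     # damp until the residual shrinks
        x = x + t * dx
    return x

# Toy coupled system (not the semiconductor equations):
#   x^2 + y^2 = 4,   exp(x) + y = 1
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4, np.exp(v[0]) + v[1] - 1])
J = lambda v: np.array([[2 * v[0], 2 * v[1]], [np.exp(v[0]), 1.0]])
root = damped_newton(F, J, [1.0, -1.0])
```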

Relevance:

80.00%

Publisher:

Abstract:

This thesis is concerned with the calculation of virtual Compton scattering (VCS) in manifestly Lorentz-invariant baryon chiral perturbation theory to fourth order in the momentum and quark-mass expansion. In the one-photon-exchange approximation, the VCS process is experimentally accessible in photon electro-production and has been measured at the MAMI facility in Mainz, at MIT-Bates, and at Jefferson Lab. Through VCS one gains new information on the nucleon structure beyond its static properties, such as charge, magnetic moments, or form factors. The nucleon response to an incident electromagnetic field is parameterized in terms of two spin-independent (scalar) and four spin-dependent (vector) generalized polarizabilities (GPs). In analogy to classical electrodynamics, the two scalar GPs represent the induced electric and magnetic dipole polarizability of a medium; for the vector GPs, a classical interpretation is less straightforward. They are derived from a multipole expansion of the VCS amplitude. This thesis describes the first calculation of all GPs within the framework of manifestly Lorentz-invariant baryon chiral perturbation theory. Because of the comparatively large number of diagrams (100 one-loop diagrams need to be calculated), several computer programs were developed dealing with different aspects of Feynman-diagram calculations. One can distinguish two areas of development, the first concerning the algebraic manipulation of large expressions, and the second dealing with numerical instabilities in the calculation of one-loop integrals. We describe our approach using Mathematica and FORM for the algebraic tasks and C for the numerical evaluations. We use our results for real Compton scattering to fix the two unknown low-energy constants emerging at fourth order. Furthermore, we present results for the differential cross sections and the generalized polarizabilities of VCS off the proton.

Relevance:

80.00%

Publisher:

Abstract:

To what extent is “software engineering” really “engineering” as this term is commonly understood? A hallmark of the products of the traditional engineering disciplines is trustworthiness based on dependability. But in his keynote presentation at ICSE 2006, Barry Boehm pointed out that the dependency of individuals, systems, and people on software is becoming increasingly critical, yet dependability is generally not the top priority for software-intensive system producers. Continuing in an uncharacteristically pessimistic vein, Professor Boehm said that this situation will likely continue until a major software-induced system catastrophe, similar in impact to the 9/11 World Trade Center catastrophe, stimulates action toward establishing accountability for software dependability. He predicts it highly likely that such a software-induced catastrophe will occur between now and 2025. It is widely understood that software, i.e., computer programs, is intrinsically different from traditionally engineered products, but in one aspect they are identical: the extent to which the well-being of individuals, organizations, and society in general increasingly depends on software. As wardens of the future through our mentoring of the next generation of software developers, we believe it is our responsibility at least to address Professor Boehm's predicted catastrophe. Traditional engineering has addressed, and continually addresses, its social responsibility through the evolution of the education, practice, and professional certification/licensing of professional engineers. To be included in the fraternity of professional engineers, software engineering must do the same. To get a rough idea of where software engineering currently stands on some of these issues, we conducted two surveys. Our main survey was sent to software engineering academics in the U.S., Canada, and Australia; among other items, it sought detailed information on their software engineering programs.
Our auxiliary survey was sent to U.S. engineering institutions to gauge how software engineering programs compare with those in the established engineering disciplines of Civil, Electrical, and Mechanical Engineering. Summaries of our findings can be found in the last two sections of our paper.

Relevance:

80.00%

Publisher:

Abstract:

Design rights represent an interesting example of how the EU legislature has successfully regulated an otherwise heterogeneous field of law. Yet this type of protection is not for all. The tools created by EU intervention have been drafted paying much more attention to the industry sector than to designers themselves. In particular, modern, digitally based, individual or small-sized, 3D printing, open designers and their needs are largely neglected by such legislation. There is obviously nothing wrong with drafting legal tools around the needs of an industrial sector with an important role in the EU economy; on the contrary, this is a legitimate and good decision of industrial policy. However, good legislation should be fair, balanced, and (technologically) neutral in order to offer suitable solutions to all the players in the market, and all the citizens in the society, without discriminating against the smallest or the newest: the cost would be to stifle innovation. The use of printing machinery to manufacture physical objects created digitally with computer programs such as Computer-Aided Design (CAD) software has been in place for quite a few years, and it is actually the standard in many industrial fields, from aeronautics to home furniture. The change in recent years with the potential to be a paradigm-shifting factor is the combination of the popularization of such technologies (price, size, usability, quality) and the diffusion of a culture based on access to and reuse of knowledge. We will call this blend Open Design. It is probably still too early, however, to say whether “3D printing” will in the future refer to a major event in human history, or will instead be relegated to a lonely Wikipedia entry similarly to “Betamax” (copyright scholars are familiar with it for other reasons).
It is not too early, however, to develop a legal analysis that will hopefully contribute to clarifying the major issues found in the current EU design law structure, why many modern open designers will probably find better protection in copyright, and whether they can successfully rely on open licenses to achieve their goals. With regard to the latter point, we will use Creative Commons (CC) licenses to test our hypothesis, due to their unique characteristic of being modular, i.e. of having different license elements (clauses) that licensors can choose in order to adapt the license to their own needs.

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this article is to provide a doctrinal summary of the concept, rules and policy of exhaustion, first on the international and EU level, and then under the law of the United States. Based on this introduction, the paper turns to an analysis of the doctrine in the pioneering court decisions handed down in UsedSoft, ReDigi and the German e-book/audiobook cases, and in the pending Dutch Tom Kabinet case. Questions related to the licence-versus-sale dichotomy, the so-called umbrella solution, the “new copy theory”, the migration of digital copies via the internet, forward-and-delete technology, the issue of lex specialis and the theory of functional equivalence are covered later on. The author stresses that the answers given by the respective judges in the cited cases are not the final word in the discussion. The UsedSoft preliminary ruling and the subsequent German domestic decisions highlight a special treatment for computer programs. On the other hand, the refusal of digital exhaustion in ReDigi and the audiobook/e-book cases might be in accordance with the present wording of copyright law; however, it does not necessarily reflect the proper trends of our age. The paper takes the position that the need for digital exhaustion is constantly growing in society and amongst businesses. Indeed, there are reasonable arguments in favour of treating equally the resale of works sold in tangible and intangible format. Consequently, the paper urges reconsideration of the norms on exhaustion at the international and EU level.

Relevance:

80.00%

Publisher:

Abstract:

Individuals with intellectual disabilities (ID) often struggle with learning how to read. Reading difficulties seem to be the most common secondary condition of ID: only one in five children with mild or moderate ID achieves even minimal literacy skills. However, literacy education for children and adolescents with ID has been largely overlooked by researchers and educators. While there is little research on reading in children with ID, many training studies have been conducted with other populations with reading difficulties. The most common approach to acquiring literacy skills consists of sophisticated programs that train phonological skills and auditory perception. Only a few studies have investigated the influence of implicit learning on literacy skills. Implicit learning processes seem to be largely independent of age and IQ. Children are sensitive to the statistics of their learning environment: by frequent word reading they acquire implicit knowledge about the frequency of single letters and letter patterns in written words. Additionally, semantic connections not only improve word understanding but also facilitate the storage of words in memory. Advances in communication technology have introduced new possibilities for remediating literacy skills. Computers can provide training material in attractive ways, for example through animations and immediate feedback. These opportunities can scaffold and support attention processes central to learning. Thus, the aim of this intervention study was to develop and implement a computer-based word-picture training, based on statistical and semantic learning, and to examine the training effects on reading, spelling and attention in children and adolescents (9-16 years) diagnosed with mental retardation (general IQ  74).
Fifty children participated in four to five weekly training sessions of 15-20 minutes over 4 weeks, and completed assessments of attention, reading, spelling, short-term memory and fluid intelligence before and after training. After a first assessment (T1), the entire sample was divided into a training group (group A) and a waiting control group (group B). After 4 weeks of training with group A, a second assessment (T2) was administered to both groups. Afterwards, group B was trained for 4 weeks before a last assessment (T3) was carried out in both groups. Overall, the results showed that the word-picture training led to substantial gains in word decoding and attention for both training groups, and these effects were preserved six weeks later (group A). There was also a clear tendency towards improvement in spelling after training in both groups, although the effect did not reach significance. These findings highlight that implicit statistical learning training, delivered in a playful way by motivating computer programs, can promote not only reading development but also attention in children with intellectual disabilities.
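
The distributional statistic that implicit learning is assumed to exploit, the frequency of letters and letter patterns, can be illustrated in a few lines. The word list is illustrative, not the study's training material:

```python
from collections import Counter

def bigram_frequencies(words):
    """Relative frequencies of adjacent letter pairs in a word list - the
    kind of statistic readers are thought to pick up implicitly through
    frequent word reading."""
    counts = Counter()
    for w in words:
        counts.update(w[i:i + 2] for i in range(len(w) - 1))
    total = sum(counts.values())
    return {bg: c / total for bg, c in counts.items()}

freqs = bigram_frequencies(["reading", "read", "ready", "bread"])
# 're', 'ea' and 'ad' tie as the most frequent patterns (4/17 each)
print(max(freqs, key=freqs.get))
```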

Relevance:

80.00%

Publisher:

Abstract:

A phase I clinical trial is mainly designed to determine the maximum tolerated dose (MTD) of a new drug. Optimization of phase I trial design is crucial to minimize the number of enrolled patients exposed to unsafe dose levels and to provide reliable information to the later phases of clinical trials. Although it has been criticized for inefficient MTD estimation, the traditional 3+3 method remains dominant in practice today due to its simplicity and conservative estimation. Many newer designs have been shown to generate more credible MTD estimates, such as the Continual Reassessment Method (CRM). Despite its accepted better performance, the CRM design is still not widely used in real trials. Several factors contribute to the difficulty of adopting CRM in practice. First, CRM is not widely accepted by regulatory agencies such as the FDA in terms of safety; it is considered less conservative and tends to expose more patients above the MTD level than the traditional design. Second, CRM is relatively complex and not intuitive for clinicians to fully understand. Third, the CRM method takes much more time and requires statistical experts and computer programs throughout the trial. The current situation is that clinicians still tend to follow the trial process they are comfortable with, and this is not likely to change in the near future. Given this situation, we were motivated to improve the accuracy of MTD selection while following the procedure of the traditional design to maintain simplicity. We observed that in the 3+3 method the dose transitions and the MTD determination are relatively independent, so we proposed to separate the two stages. The dose-transition rule remains the same as in the 3+3 method; after obtaining the toxicity information from the dose-transition stage, an isotonic transformation is applied to enforce a monotonically increasing dose-toxicity order before selecting the optimal MTD.
To compare operating characteristics, we carried out 10,000 simulated trials under different dose-setting scenarios, comparing the isotonic modified method with the standard 3+3 method, CRM, the biased coin design (BC) and the k-in-a-row design (KIAW). The isotonic modified method improved the MTD estimation of the standard 3+3 in 39 out of 40 scenarios, with much greater improvement when the target toxicity level is 0.3 rather than 0.25. The modified design is also competitive with the other selected methods: a CRM method performed better in general but was not as stable as the isotonic method across the different dose settings. The results demonstrate that our proposed isotonic modified method is not only easily conducted, using the same procedure as 3+3, but also outperforms the conventional 3+3 design. It can also be applied to determine the MTD for any given target toxicity level (TTL). These features make the isotonic modified method of practical value in phase I clinical trials.
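
The isotonic transformation the authors apply is standard isotonic regression, commonly computed with the pool-adjacent-violators algorithm (PAVA). A minimal sketch, with illustrative (not the paper's) per-dose toxicity rates:

```python
def pava(rates, weights=None):
    """Pool-adjacent-violators: isotonic (non-decreasing) regression.

    Applied to observed per-dose toxicity rates, adjacent doses that
    violate monotonicity are pooled into weighted averages, producing
    the monotone estimates from which the MTD is then selected.
    """
    w = list(weights) if weights else [1.0] * len(rates)
    blocks = []  # each block: [pooled value, total weight, member count]
    for yi, wi in zip(rates, w):
        blocks.append([yi, wi, 1])
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2, c2 = blocks.pop()
            v1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(v1 * w1 + v2 * w2) / wt, wt, c1 + c2])
    out = []
    for v, _, c in blocks:
        out.extend([v] * c)
    return out

# Rates violate monotonicity at dose 3; PAVA pools doses 2 and 3.
print(pava([0.0, 0.33, 0.17, 0.5]))  # -> [0.0, 0.25, 0.25, 0.5]
```

The MTD is then the highest dose whose isotonic estimate does not exceed the target toxicity level.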

Relevance:

80.00%

Publisher:

Abstract:

SIMBAA is a spatially explicit, individual-based simulation model. It was developed to analyse the response of populations of Antarctic benthic species, and of their diversity, to iceberg scouring, a disturbance that causes high local mortality and provides potential space for new colonisation. Traits can be attributed to model species, e.g. in terms of reproduction, dispersal, and life span. Physical disturbances can be designed in space and time, e.g. in terms of size, shape, and frequency. Environmental heterogeneity can be considered through cell-specific capacities to host a certain number of individuals. When a grid cell becomes empty (after a disturbance event or due to the natural mortality of an individual), a lottery decides which individual, from which species, stored in a pool of candidates for this cell will recruit in that cell. After a defined period the individuals become mature and their offspring are dispersed and stored in the pool of candidates. The biological parameters and disturbance regimes determine how long an individual lives. The temporal development of single species populations as well as Shannon diversity are depicted graphically in the main window, and the primary values are listed. Example simulations can be loaded and saved as sgf-files. The results are also shown in an additional window in a dimensionless area of 50 x 50 cells containing single individuals depicted as circles; their colour indicates the assignment to the self-designed model species and their size represents their age. Dominant species per cell and disturbed areas can also be depicted. The output of simulation runs can be saved as images, which can be assembled into video clips by standard computer programs (see the GIF examples, of which "Demo 1" represents the response of the Antarctic benthos to iceberg scouring and "Demo 2" a simulation of a deep-sea benthic habitat).
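
SIMBAA's source is not given here, but the lottery recruitment it describes can be sketched as a weighted draw from a cell's candidate pool. The interface below (a mapping from species id to number of stored candidate offspring) is assumed for illustration, not SIMBAA's actual data structure:

```python
import random

def lottery_recruit(pool, rng=random):
    """Lottery for an empty grid cell: one candidate offspring from the
    cell's pool wins the spot. `pool` maps species id -> number of
    candidate offspring stored for this cell."""
    if not pool:
        return None  # the cell stays empty until new offspring arrive
    species = list(pool)
    weights = [pool[s] for s in species]
    return rng.choices(species, weights=weights, k=1)[0]

# A species with more candidate offspring in the pool wins more often.
rng = random.Random(42)
pool = {"sp_A": 8, "sp_B": 2}
wins = sum(lottery_recruit(pool, rng) == "sp_A" for _ in range(1000))
```

Over 1000 draws, `sp_A` should win roughly 80% of the lotteries, matching its share of the candidate pool.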

Relevance:

80.00%

Publisher:

Abstract:

Earthquakes constitute one of the most important sources of dynamic loads acting on structures and their foundations. When an earthquake occurs, the energy released generates seismic waves that can give rise to structural vibrations, settlements of building foundations, pressures on retaining walls, and possible sliding, uplifting or even overturning of structures; the soil can also liquefy, losing its supporting capacity. The study of the effects of earthquakes on structures involves, through its soil-structure interaction character, diverse disciplines such as Structural Analysis, Soil Mechanics and Earthquake Engineering. Aspects that have received limited research attention in the analysis of structures subjected to earthquakes are the effects of nonlinear soil behaviour and of geometric nonlinearities such as sliding and uplifting of foundations.
This Thesis starts with the study of the seismic pressures and potential displacements of retaining walls, comparing the predictions of two types of formulations and assessing their range of applicability and limitations: pseudo-static methods as proposed by Mononobe-Okabe (1929), with the contribution of Whitman-Liao (1985), and analytical formulations such as the one developed by Veletsos and Younan (1994) for rigid walls. The Thesis deals next with the effects of nonlinear soil behaviour on the dynamic stiffness of circular mat foundations, like the chimney of a Thermal Power Station or the reactor building of a Nuclear Power Plant, as a function of frequency and load level.
Finally, the seismic response of these two structures, accounting for the potential sliding and uplifting of the foundation under a given earthquake, is studied, following an approach suggested by Wolf (1988). To carry out these studies a number of special-purpose computer programs were developed (MUROSIS, VELETSOS, INTESES and SEPARSE), whose listings and details are included in the Appendices. The conclusions derived from these studies and recommendations for future work are presented in Chapter 6.
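
The Mononobe-Okabe pseudo-static coefficient that the Thesis compares against is a textbook formula and can be evaluated directly. This is a generic sketch of the standard formula, not the MUROSIS code, and the input values are illustrative:

```python
import math

def mononobe_okabe_kae(phi, delta, beta, i, kh, kv):
    """Mononobe-Okabe seismic active earth-pressure coefficient K_AE.

    Angles in radians: phi = soil friction angle, delta = wall-soil
    friction angle, beta = wall inclination from vertical, i = backfill
    slope; kh, kv = horizontal/vertical seismic coefficients.
    """
    psi = math.atan2(kh, 1.0 - kv)  # seismic inertia angle
    num = math.cos(phi - psi - beta) ** 2
    den = (math.cos(psi) * math.cos(beta) ** 2 * math.cos(delta + beta + psi)
           * (1 + math.sqrt(math.sin(phi + delta) * math.sin(phi - psi - i)
                            / (math.cos(delta + beta + psi) * math.cos(i - beta)))) ** 2)
    return num / den

# With kh = kv = 0 the expression reduces to the static Coulomb coefficient:
# phi = 30 deg, smooth vertical wall, level backfill -> K_A = 1/3
ka = mononobe_okabe_kae(math.radians(30), 0.0, 0.0, 0.0, 0.0, 0.0)
```

A nonzero horizontal seismic coefficient increases the coefficient above its static value, which is the seismic increment the pseudo-static methods quantify.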