195 results for Bernoulli
Abstract:
Statistics has penetrated almost all branches of science and all areas of human endeavor. At the same time, statistics is not only misunderstood, misused and abused to a frightening extent, but it is also often much disliked by students in colleges and universities. This lecture discusses the historical development of statistics, aiming to identify the most important turning points that led to the present state of the discipline and to answer the questions “What went wrong with statistics?” and “What to do next?”. ACM Computing Classification System (1998): A.0, A.m, G.3, K.3.2.
Abstract:
2000 Mathematics Subject Classification: Primary 60G51, secondary 60G70, 60F17.
Abstract:
We present indefinite integration algorithms for rational functions over subfields of the complex numbers, through an algebraic approach. We study the local algorithm of Bernoulli and rational algorithms for the class of functions concerned, namely the algorithms of Hermite, Horowitz-Ostrogradsky, Rothstein-Trager and Lazard-Rioboo-Trager. We also study Rioboo's algorithm for converting logarithms involving complex extensions into real arctangent functions, when these logarithms arise from the integration of rational functions with real coefficients. We conclude by presenting pseudocode and code for implementation in the software Maxima for the algorithms studied in this work, as well as for auxiliary algorithms for polynomial gcd computation, partial fraction decomposition, squarefree factorization and subresultant computation, among other side algorithms. We also present the algorithm of Zeilberger-Almkvist for integration of hyperexponential functions, together with its pseudocode and Maxima code. As an alternative to the algorithms of Rothstein-Trager and Lazard-Rioboo-Trager, we further present code for Bernoulli's algorithm for squarefree denominators, and code for Czichowski's algorithm, although the latter is not studied in detail in the present work, since the theoretical background needed to understand it is beyond this work's scope. Several examples are provided to illustrate the working of the integration algorithms in this text.
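As a small illustration of one of the auxiliary routines the abstract lists, here is a hedged Python sketch of polynomial gcd over Q via the Euclidean algorithm, plus the gcd(p, p') squarefree test it supports; the dense coefficient-list representation (highest degree first) is our own convention, not the thesis's Maxima code:

```python
from fractions import Fraction

def normalize(p):
    """Strip leading zero coefficients (highest degree first)."""
    i = 0
    while i < len(p) - 1 and p[i] == 0:
        i += 1
    return p[i:]

def polydiv(a, b):
    """Quotient and remainder of a / b over Q; requires deg(a) >= deg(b)."""
    r = [Fraction(c) for c in a]
    b = [Fraction(c) for c in b]
    q = [Fraction(0)] * (len(r) - len(b) + 1)
    for i in range(len(q)):
        coef = r[i] / b[0]
        q[i] = coef
        for j in range(len(b)):
            r[i + j] -= coef * b[j]
    return normalize(q), normalize(r)

def polygcd(a, b):
    """Monic gcd via the Euclidean algorithm."""
    a = normalize([Fraction(c) for c in a])
    b = normalize([Fraction(c) for c in b])
    if len(a) < len(b):
        a, b = b, a
    while any(c != 0 for c in b):
        _, r = polydiv(a, b)
        a, b = b, r
    return [c / a[0] for c in a]   # make monic

def derivative(p):
    n = len(p) - 1
    return normalize([p[i] * (n - i) for i in range(n)]) or [Fraction(0)]

# x^3 - 1 and x^2 - 1 share the factor x - 1:
g = polygcd([1, 0, 0, -1], [1, 0, -1])
# A polynomial p is squarefree iff gcd(p, p') is constant;
# (x^2 - 1)^2 = x^4 - 2x^2 + 1 is not:
p = [1, 0, -2, 0, 1]
not_squarefree = len(polygcd(p, derivative(p))) > 1
```

The same Euclidean skeleton, with subresultant bookkeeping added, underlies the subresultant computation the abstract mentions.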
Abstract:
In the oil industry, oil and gas pipelines are commonly used to transport production fluids over long distances. Pipeline maintenance relies on the analysis of several tools, of which the most widely used are pipeline inspection devices, popularly known as PIGs. Among the variants on the market, the instrumented PIG has significant relevance: through its numerous onboard sensors, it can detect faults or potential failures along the inspected line. Despite its versatility, the instrumented PIG suffers from speed variations that impair the readings of its embedded sensors. Since the PIG moves with the production fluid, one way to control its speed is to control the fluid flow through pressure control, either by reducing the flow rate of the produced fluid in the pipeline itself, at the cost of reduced overall production, or by using a restrictive element (valve) installed on the PIG. The flow-rate/pressure-drop characteristic of restrictive elements such as the orifice plate is usually deduced from the ideal energy equation (Bernoulli's equation), with the losses later corrected through experimental tests. Thus, with the objective of controlling the flow of fluid passing through the PIG, a valve shutter actuated by a solenoid has been developed. This configuration allows easy control and stabilization of the flow adjustment, with a consequent response in the pressure drop between upstream and downstream of the restriction. A test bench was assembled for better definition of the flow coefficients, composed of a duct with an internal diameter of four inches, a set of shutters arranged in a plate, and pressure gauges for measuring the pressure drop during the tests.
The line was pressurized and, based on the pressure drop, it was possible to draw a curve characterizing the flow coefficient of the control valve prototype and to simulate its operation in a mockup, resulting in a PIG speed reduction of approximately 68%.
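The flow-rate/pressure-drop relation described above can be sketched with the standard orifice-plate form of Bernoulli's equation, where a discharge coefficient corrects the ideal result for real losses. All numbers below are illustrative assumptions (a typical sharp-edged-orifice coefficient of 0.61), not the bench values obtained in the work:

```python
from math import pi, sqrt

def orifice_flow(dp_pa, rho, d_orifice, d_pipe, cd=0.61):
    """Volumetric flow (m^3/s) through an orifice from the measured
    pressure drop: the ideal Bernoulli result sqrt(2*dp/rho), scaled by
    the orifice area, a discharge coefficient cd, and the
    velocity-of-approach factor 1/sqrt(1 - beta^4), beta = d_orifice/d_pipe."""
    beta = d_orifice / d_pipe
    area = pi * d_orifice ** 2 / 4
    return cd * area / sqrt(1 - beta ** 4) * sqrt(2 * dp_pa / rho)

# Assumed numbers: 4-inch duct, 2-inch opening, water, 50 kPa drop.
q = orifice_flow(dp_pa=50e3, rho=1000.0, d_orifice=0.0508, d_pipe=0.1016)
```

In practice the bench measurements go the other way: the measured pressure drop at a known flow is used to fit cd for each shutter configuration.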
Abstract:
This thesis examines a method for modifying the resonance frequency of piezoelectric transducers by applying external electrical loads. The work begins with a presentation of the crystals used in the thesis, focusing on the fabrication process of a bimorph cantilever employed as an electromechanical energy converter, whose resonance frequency is modelled analytically by means of Newton's law and the Euler-Bernoulli model. Measurements are carried out on this structure with an electrodynamic shaker and an impedance analyzer in order to justify the analytical model presented. With the aim of synchronizing the resonance frequency of the cantilever with the ambient vibration so as to maximize the available power, an MPPT algorithm following the Perturb and Observe (P&O) approach is proposed, whose input is the RMS voltage of a layer of piezoelectric material. Evaluating its voltage response has practical limitations, which led to the consideration of a completely different approach, based on the phase shift between the voltage of a piezoelectric transducer and the acceleration signal used as excitation. Experimental measurements were conducted with the aim of validating the effectiveness of this latter approach when the resonance frequency of the piezos is to be synchronized with real vibration signals.
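The Perturb and Observe approach mentioned above is, at its core, a hill-climbing loop: perturb the control variable, observe whether the tracked quantity improved, and reverse direction if it did not. The following is a generic textbook sketch under assumed names and a toy power curve, not the thesis's implementation:

```python
def perturb_and_observe(measure, x0, step, n_iter=100):
    """Perturb & Observe hill climbing: keep stepping the control
    variable in the same direction while the measured quantity rises;
    reverse direction as soon as it falls."""
    x = x0
    direction = 1.0
    last = measure(x)
    for _ in range(n_iter):
        x += direction * step
        now = measure(x)
        if now < last:          # overshot the peak: reverse
            direction = -direction
        last = now
    return x

# Toy harvested-power curve with a resonance peak at 50 Hz (assumed shape):
power = lambda f: 1.0 / (1.0 + (f - 50.0) ** 2)
f_tracked = perturb_and_observe(power, x0=40.0, step=0.5)
```

The steady-state oscillation around the peak (within one step size) is the known limitation of P&O that motivates the phase-shift approach described in the abstract.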
Abstract:
Peer reviewed
Abstract:
The effect of unevenness in a bridge deck for the purpose of Structural Health Monitoring (SHM) under operational conditions is studied in this paper. The moving vehicle is modelled as a single-degree-of-freedom system traversing the damaged beam at a constant speed. The bridge is modelled as an Euler-Bernoulli beam with a breathing crack, simply supported at both ends. The breathing crack is treated as a nonlinear system with bilinear stiffness characteristics related to the opening and closing of the crack. The unevenness of the bridge deck is modelled using road classification according to ISO 8608:1995(E). Numerical simulations are conducted considering the effects of changing road surface classes from class A (very good) to class E (very poor). Cumulant-based statistical parameters, computed with a new algorithm on the stochastic responses of the damaged beam due to passages of the load, are used to calibrate the damage. Possibilities of damage detection and calibration under benchmarked and non-benchmarked cases are considered. The findings of this paper are important for establishing the expectations from different types of road roughness on a bridge for damage detection purposes using bridge-vehicle interaction, where the bridge does not need to be closed for monitoring.
Abstract:
The effects of vehicle speed on Structural Health Monitoring (SHM) of bridges under operational conditions are studied in this paper. The moving vehicle is modelled as a single-degree-of-freedom oscillator traversing a damaged beam at a constant speed. The bridge is modelled as a simply supported Euler-Bernoulli beam with a breathing crack. The breathing crack is treated as a nonlinear system with bilinear stiffness characteristics related to the opening and closing of the crack. The unevenness of the bridge deck is modelled using road classification according to ISO 8608:1995(E). The stochastic description of the unevenness of the road surface is used as an aid to monitor the health of the structure in its operational condition. Numerical simulations are conducted considering the effects of changing vehicle speed on cumulant-based statistical damage detection parameters. The detection and calibration of damage at different levels is based on an algorithm dependent on the responses of the damaged beam due to passages of the load. Possibilities of damage detection and calibration under benchmarked and non-benchmarked cases are considered. The sensitivity of the calibration values is studied. The findings of this paper are important for establishing the expectations from different vehicle speeds on a bridge for damage detection purposes using bridge-vehicle interaction, where the bridge does not need to be closed for monitoring. The identification of bunching of these speed ranges provides guidelines for using the methodology developed in the paper.
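For reference, the undamaged simply supported Euler-Bernoulli beam underlying both papers has closed-form natural frequencies, omega_n = (n*pi/L)^2 * sqrt(EI/(rho*A)). The sketch below uses assumed, purely illustrative bridge-deck parameters, not values from the papers:

```python
from math import pi, sqrt

def natural_frequencies(E, I, rho, A, L, n_modes=3):
    """Natural frequencies (Hz) of a simply supported Euler-Bernoulli beam:
    omega_n = (n*pi/L)**2 * sqrt(E*I / (rho*A)),  f_n = omega_n / (2*pi)."""
    c = sqrt(E * I / (rho * A))
    return [(n * pi / L) ** 2 * c / (2 * pi) for n in range(1, n_modes + 1)]

# Illustrative (assumed) concrete deck: E = 30 GPa, I = 0.5 m^4,
# rho = 2500 kg/m^3, A = 2 m^2, span L = 25 m.
freqs = natural_frequencies(E=30e9, I=0.5, rho=2500.0, A=2.0, L=25.0)
# Frequencies scale as n^2: f_2 = 4*f_1, f_3 = 9*f_1.
```

The breathing crack perturbs these frequencies bilinearly (one stiffness with the crack open, another with it closed), which is what the cumulant-based parameters pick up from the stochastic response.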
Abstract:
The Jurassic (hemi)pelagic continental margin deposits drilled at Hole 547B, off the Moroccan coast, reveal striking Tethyan affinity. Analogies concern not only types and gross vertical evolution of facies, but also composition and textures of the fine sediment and the pattern of diagenetic alteration. In this context, the occurrence of the nanno-organism Schizosphaerella Deflandre and Dangeard (sometimes as a conspicuous portion of the fine-grained carbonate fraction) is of particular interest. Schizosphaerella, an incertae sedis taxon, has been widely recorded as a sediment contributor from Tethyan Jurassic deeper-water carbonate facies exposed on land. Because of its extremely long range (Hettangian to early Kimmeridgian), the genus Schizosphaerella (two species currently described, S. punctulata Deflandre and Dangeard and S. astrea Moshkovitz) is obviously not of great biostratigraphic interest. However, it is of interest in sedimentology and petrology. Specifically, Schizosphaerella was often the only component of the initial fine-grained fraction of a sediment that was able to resist diagenetic obliteration. However, alteration of the original skeletal structure did occur to various degrees. Crystal habit and mineralogy of the fundamental skeletal elements, as well as their mode of mutual arrangement in the test wall with the implied high initial porosity of the skeleton (60-70%), appear to be responsible for this outstanding resistance. Moreover, the ability to concentrate within and, in the case of the species S. punctulata, around the skeleton, large amounts of diagenetic calcite also contributed to the resistance. 
In both species of Schizosphaerella, occlusion of the original skeletal void space during diagenesis appears to have proceeded in an analogous manner, with an initial slight uniform syntaxial enlargement of the basic lamellar skeletal crystallites followed, upon mutual impingement, by uneven accretion of overgrowth cement in the remaining skeletal voids. However, distinctive fabrics are evident according to the different primary test wall architecture. In S. punctulata, intraskeletal cementation is usually followed by the growth of a radially structured crust of bladed to fibrous calcite around the valves. These crusts are interpreted as a product of aggrading neomorphism, associated with mineralogic stabilization of the original, presumably polyphase, sediment. Data from Hole 547B, along with inferences, drawn from the fabric relationships, suggest that the crusts formed and (inferentially) mineralogic stabilization occurred at a relatively early time in the diagenetic history in the shallow burial realm. An enhanced rate of lithification at relatively shallow burial depths and thus the chance for neomorphism to significantly influence the textural evolution of the buried sediment may be related to a lower Mg/Ca concentration ratio in the oceanic system and, hence, in marine pore waters in pre-Late Jurassic times.
Abstract:
The type locality of the Cutri Formation, exposed in the NW of Mallorca, Spain, has previously been described by Álvaro et al. (1989) and further interpreted in Abbots' (1989) unpublished PhD thesis as a base-of-slope carbonate apron. Incorporating new field and laboratory analysis, this paper enhances that interpretation. From this analysis it can be shown beyond reasonable doubt that the Cutri Formation was deposited in a carbonate base-of-slope environment on the palaeowindward side of a Mid-Jurassic Tethyan platform. Key evidence, such as laterally extensive exposures and abundant deposits of calciturbidites and debris flows among hemipelagic deposits, strongly supports this interpretation.
Abstract:
We present a summary of the series representations of the remainders in the expansions in ascending powers of t of 2/(e^t + 1), sech t and coth t, and establish simple bounds for these remainders when t > 0. Several applications of these expansions are given which enable us to deduce some inequalities and completely monotonic functions associated with the ratio of two gamma functions. In addition, we derive a (presumably new) quadratic recurrence relation for the Bernoulli numbers B_n.
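For context, the Bernoulli numbers B_n appearing above are commonly generated from the classical linear recurrence sum_{k=0}^{n} C(n+1, k) B_k = 0 for n >= 1, with B_0 = 1 (this is the textbook recurrence, not the new quadratic one derived in the paper). A minimal exact-arithmetic sketch:

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n_max):
    """B_0..B_{n_max} from the classical recurrence
    sum_{k=0}^{n} C(n+1, k) * B_k = 0 for n >= 1, with B_0 = 1
    (convention B_1 = -1/2)."""
    B = [Fraction(1)]
    for n in range(1, n_max + 1):
        s = sum(comb(n + 1, k) * B[k] for k in range(n))
        B.append(-s / (n + 1))
    return B

B = bernoulli_numbers(6)
# B_0..B_6 = 1, -1/2, 1/6, 0, -1/30, 0, 1/42
```

Exact rationals matter here: the B_n grow quickly and floating-point evaluation of the recurrence loses accuracy within a few dozen terms.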
Abstract:
This dissertation investigates the connection between spectral analysis and frame theory. When considering the spectral properties of a frame, we present a few novel results relating to the spectral decomposition. We first show that scalable frames have the property that the inner product of the scaling coefficients and the eigenvectors must equal the inverse eigenvalues. From this, we prove a similar result when an approximate scaling is obtained. We then focus on the optimization problems inherent to scalable frames, first showing that there is an equivalence between scaling a frame and optimization problems with a non-restrictive objective function. Various objective functions are considered, and an analysis of the solution type is presented. For linear objectives we can encourage sparse scalings, and with barrier objective functions we force dense solutions. We further consider frames in high dimensions and derive various solution techniques. From here, we restrict ourselves to various frame classes to add more specificity to the results. Using frames generated from distributions allows for the placement of probabilistic bounds on scalability. For discrete distributions (Bernoulli and Rademacher), we bound the probability of encountering an ONB, and for continuous symmetric distributions (uniform and Gaussian), we show that symmetry is retained in the transformed domain. We also prove several hyperplane-separation results. With the theory developed, we discuss graph applications of the scalability framework. We make a connection with graph conditioning and show the infeasibility of the problem in the general case. After a modification, we show that any complete graph can be conditioned. We then present a modification of standard PCA (robust PCA) developed by Candès, and give some background on Electron Energy-Loss Spectroscopy (EELS).
We design a novel scheme for the processing of EELS through robust PCA and least-squares regression, and test this scheme on biological samples. Finally, we take the idea of robust PCA and apply the technique of kernel PCA to perform robust manifold learning. We derive the problem and present an algorithm for its solution. There is also discussion of the differences with RPCA that make theoretical guarantees difficult.
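To make the frame-theoretic setting concrete: a frame is scalable when its vectors can be rescaled so that the frame operator S = sum_i f_i f_i^T becomes the identity (a tight frame). The sketch below checks the standard Mercedes-Benz example (three unit vectors 120 degrees apart in R^2), whose frame operator is already (3/2)I; this textbook example is ours, not the dissertation's:

```python
from math import cos, sin, pi

def frame_operator(vectors):
    """Frame operator S = sum_i f_i f_i^T for vectors f_i in R^2."""
    S = [[0.0, 0.0], [0.0, 0.0]]
    for x, y in vectors:
        S[0][0] += x * x
        S[0][1] += x * y
        S[1][0] += y * x
        S[1][1] += y * y
    return S

# Mercedes-Benz frame: three unit vectors 120 degrees apart.
mb = [(cos(2 * pi * k / 3 + pi / 2), sin(2 * pi * k / 3 + pi / 2))
      for k in range(3)]
S = frame_operator(mb)
# S equals (3/2) * identity, so the frame is tight:
# it is scalable with unit weights.
```

For a general frame, finding nonnegative weights w_i with sum_i w_i^2 f_i f_i^T = I is exactly the (possibly infeasible) scaling problem the dissertation casts as an optimization.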
Abstract:
The history will show a general chronological panorama of the beginnings of combinatorics, organized by centuries so as to exhibit decisive moments in the formation of the ideas developed in the study of discrete structures and the relations arising through their operations. The interrelations among the elements of a set give rise to changes in its structure, and it is thus that the lineage, that is, the roots or nature of combinations, begins to come into view. It is in the East, in Chinese culture in legendary times, that the earliest evidence of primitive elements of combinatorics is found, with the first magic square. The principal rules for computing permutations, variations and combinations are due to Hindu and Jewish mathematicians. But the path towards the recognition of combinatorics as a field worthy of formal study was prepared by the mathematicians Fermat and Pascal through their correspondence seeking the solution to the problem of points. The culminating point comes with the works of Gottfried Wilhelm Leibniz and Jacob Bernoulli. The former is the author of Dissertatio de arte combinatoria, where he introduces the term "combinatorics" as it is known today. In addition, Leibniz carried out the systematic construction of the combinatorial knowledge that had been obtained up to that time. On the other hand, it is Jacob Bernoulli's magnum opus Ars Conjectandi (The Art of Conjecturing) where combinatorics becomes the basis for the resolution of some of the probability problems of that era.
Abstract:
A new type of space debris was recently discovered by Schildknecht in near-geosynchronous orbit (GEO). These objects were later identified as exhibiting properties associated with High Area-to-Mass Ratio (HAMR) objects. According to their brightness magnitudes (light curves), high rotation rates and composition properties (albedo, amount of specular and diffuse reflection, colour, etc.), it is thought that these objects are multilayer insulation (MLI). Observations have shown that this debris type is very sensitive to environmental disturbances, particularly solar radiation pressure, because their shapes are easily deformed, leading to changes in the area-to-mass ratio (AMR) over time. This thesis proposes a simple, effective flexible model of the thin, deformable membrane, using two different methods. First, the debris is modelled with Finite Element Analysis (FEA) using Euler-Bernoulli beam theory, called the "Bernoulli model". The Bernoulli model is constructed with beam elements consisting of two nodes, each node having six degrees of freedom (DoF); the mass of the membrane is distributed over the beam elements. Second, the "Multibody model", based on multibody dynamics theory, models the debris as a series of lumped masses connected through flexible joints, representing the flexibility of the membrane itself. The mass of the membrane, albeit low, is taken into account through the lumped masses at the joints. The dynamic equations for the masses, including the constraints defined by the connecting rigid rods, are derived using fundamental Newtonian mechanics. The physical properties required by both flexible models (membrane density, reflectivity, composition, etc.) are assumed to be those of multilayer insulation.
Both flexible membrane models are then propagated, together with classical orbital and attitude equations of motion, in the near-GEO region to predict the orbital evolution under the perturbations of solar radiation pressure, the Earth's gravity field, luni-solar gravitational fields and the self-shadowing effect. The results are then compared with two rigid-body models (a cannonball and a flat rigid plate). In this investigation, when compared with a rigid model, the evolution of the orbital elements of the flexible models shows differences in the inclination and secular eccentricity evolutions, rapid irregular attitude motion, and an unstable cross-sectional area due to deformation over time. Monte Carlo simulations varying the initial attitude dynamics and deformation angle are then investigated and compared with the rigid models over 100 days. The simulations show that different initial conditions produce distinct orbital motions, significantly different from the orbital motions of both rigid models. Furthermore, this thesis presents a methodology to determine the dynamic material properties of thin membranes and validates the deformation of the multibody model with real MLI materials. Experiments are performed in a high-vacuum chamber (10^-4 mbar) replicating the space environment. A thin membrane is hinged at one end but free at the other. The free-motion experiment, the first experiment, is a free-vibration test to determine the damping coefficient and natural frequency of the thin membrane. In this test, the membrane is allowed to fall freely in the chamber, with the motion tracked and captured through high-speed video frames. A Kalman filter technique is implemented in the tracking algorithm to reduce noise and increase the tracking accuracy of the oscillating motion. The forced-motion experiment, the last test, is performed to determine the deformation characteristics of the object.
A high-power spotlight (500-2000 W) is used to illuminate the MLI, and the displacements are measured by means of a high-resolution laser sensor. Finite Element Analysis (FEA) and multibody dynamics models of the experimental setups are used to validate the flexible model by comparison with the experimental results for displacements and natural frequencies.
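The Kalman-filtered tracking step mentioned above can be illustrated with a generic 1-D constant-velocity filter; this is a textbook sketch with assumed noise parameters, not the thesis's tracking algorithm:

```python
def kalman_1d(zs, dt=0.01, q=1e-4, r=1e-2):
    """Constant-velocity Kalman filter for a 1-D position track.
    State [position, velocity]; q = process noise, r = measurement noise.
    Returns the filtered position estimates."""
    x = [zs[0], 0.0]                       # initial state
    P = [[1.0, 0.0], [0.0, 1.0]]           # state covariance
    out = []
    for z in zs:
        # Predict: x = F x, P = F P F^T + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update with a position measurement z (H = [1, 0])
        S = P[0][0] + r
        K = [P[0][0] / S, P[1][0] / S]      # Kalman gain
        y = z - x[0]                        # innovation
        x = [x[0] + K[0] * y, x[1] + K[1] * y]
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
        out.append(x[0])
    return out

# Sanity check: a noiseless constant position is tracked exactly.
est = kalman_1d([1.0] * 50)
```

In the actual experiment the measurements are pixel positions extracted from the video frames, and the smoothed track is what the damping coefficient and natural frequency are fitted to.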
Abstract:
Most scholars of Verney's work highlight, in the domain of natural philosophy, his defence of experimentalism and his adoption, within philosophical eclecticism, of a position of adherence to Newtonianism. In the lines that follow we propose to examine in greater depth the Newtonian matrix of Verney's thought at the level of the sources (authors and books) mentioned in the Verdadeiro Método de Estudar, in particular in his Letter X. We analyse his references to Newton's Principia as well as to the integral and differential calculus, with citations of Leibniz, the Bernoulli brothers, the Marquis de l'Hôpital and other contemporary mathematicians. We note the absence of any allusion to Newton's second great work, the Opticks. Finally, we discuss the systematic inclusion of Italian authors and works, less well known to the cultured Europe of the time, which seem to have been important in the "modern" formation of the young Verney, then settled in Rome.