923 results for Complex non-linear paradigm, Non-linearity
Abstract:
In this paper, we demonstrate the possibility of opening a new telecommunications transmission window around the 2 μm wavelength, in order to exploit the potentially low loss of hollow-core photonic bandgap fibers, with the benefits of significantly lower non-linearity and latency. We present recent efforts to develop a dense wavelength division multiplexing testbed in this waveband, in which wavelength channels on a 100 GHz grid and a total capacity of 105 Gbit/s have been achieved.
Abstract:
The intention of this article is to provide a structural and operational analysis of policing beyond the police in Northern Ireland. While the polity enjoys low levels of ‘officially’ recorded crime as part of its post-conflict status, little empirical analysis exists as to the epistemological roots of security production outside that of the Police Service of Northern Ireland. The empirical evidence presented seeks to establish that beyond more prominent analyses related to paramilitary ‘policing’, the country is in fact replete with a substantial reservoir of legitimate civil society policing – the collective mass of which contributes to policing, community safety and quality of life issues. While such non-state policing at the level of locale was recognised by the Independent Commission for Policing, structured understandings have rarely permeated governmental or academic discourse beyond anecdotal contentions. Thus, the present argument provides an empirical assessment of the complex, non-state policing landscape beyond the formal state apparatus; examines definitions and structures of such community-based policing activities; and explores issues related to co-opting this non-state security ‘otherness’ into more formal relations with the state.
Abstract:
Empirical evidence has demonstrated the benefits of using simulation games to enhance learning, especially in terms of cognitive gains. This is to be expected, as the dynamism and non-linearity of simulation games are more cognitively demanding. However, the other effects of simulation games, specifically on learners' emotions, have received little attention and remain under-investigated. This study aims to demonstrate that simulation games stimulate positive emotions in learners that help to enhance learning. The study finds that the affect-based constructs of interest, engagement and appreciation are positively correlated with learning. A stepwise multiple regression analysis shows that a model involving interest and engagement is significantly associated with learning. The emotions of learners should therefore be considered in curriculum development and in the delivery of learning and teaching, as positive emotions enhance learning.
Abstract:
Reinforced concrete creep is a phenomenon of great importance. Despite being identified as the main cause of several pathologies, its effects are still considered in a simplified way by structural designers. In addition to studying the phenomenon in reinforced concrete structures and how it is currently accounted for in structural analysis, this paper compares creep strains in simply supported reinforced concrete beams, obtained analytically and experimentally, with finite element method (FEM) simulation results. The strains and deflections obtained analytically were calculated following the recommendations of the Brazilian code NBR 6118 (2014) and the simplified method of CEB-FIP 90, while the experimental results were extracted from tests available in the literature. Finite element simulations are performed with the ANSYS Workbench software, using its 3D SOLID186 elements and exploiting the symmetry of the structure. Convergence analyses using 2D PLANE183 elements are carried out as well. It is concluded that FEM analyses are quantitatively and qualitatively efficient for estimating this non-linearity, and that the method used to obtain the creep coefficient values is sufficiently accurate.
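For a constant stress applied at loading age t₀, both code approaches cited above reduce to the usual linear-creep relation (written here in CEB-FIP 90 notation):

\[
\varepsilon_{cc}(t, t_0) \;=\; \varphi(t, t_0)\,\frac{\sigma_c(t_0)}{E_{ci}},
\]

where φ(t, t₀) is the creep coefficient and E_ci the tangent modulus of elasticity at 28 days; the codes differ mainly in how φ(t, t₀) is assembled from relative humidity, notional member size, and age at loading.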
Abstract:
Analog In-Memory Computing (AIMC) has been proposed in the context of beyond-von-Neumann architectures as a valid strategy to reduce the energy consumption and latency of internal data transfers and to improve compute efficiency. The aim of AIMC is to perform computations within the memory unit, typically by leveraging the physical features of memory devices. Among resistive Non-Volatile Memories (NVMs), Phase-Change Memory (PCM) has become a promising technology due to its intrinsic capability to store multilevel data. Hence, PCM technology is currently investigated to broaden the possibilities and applications of AIMC. This thesis aims at exploring the potential of new PCM-based architectures as in-memory computational accelerators. In a first step, a preliminary experimental characterization of PCM devices was carried out from an AIMC perspective. PCM cell non-idealities, such as time drift, noise, and non-linearity, were studied to develop a dedicated multilevel programming algorithm. Measurement-based simulations were then employed to evaluate the feasibility of PCM-based operations in the fields of Deep Neural Networks (DNNs) and Structural Health Monitoring (SHM). Moreover, a first testchip was designed and tested to evaluate the hardware implementation of Multiply-and-Accumulate (MAC) operations employing PCM cells. This prototype experimentally demonstrates the possibility of reaching 95% MAC accuracy with circuit-level compensation of the cells' time drift and non-linearity. Finally, empirical circuit behavior models were included in simulations to assess the use of this technology in specific DNN applications and to enhance the potential of this innovative computation approach.
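To make the MAC-with-non-idealities idea concrete, here is a minimal Python sketch, not the thesis's characterization data or programming algorithm: weights become conductances through a toy compressive non-linearity with programming noise, conductances decay with the standard empirical drift power law G(t) = G(t₀)(t/t₀)^(−ν), and the MAC is a current sum. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def program(weights, g_max=25e-6, gamma=0.9, noise_sd=0.5e-6):
    """Map weights in [0, 1] to conductances (siemens) with a toy
    compressive non-linearity and Gaussian programming noise."""
    g = g_max * weights**gamma
    return g + rng.normal(0.0, noise_sd, size=g.shape)

def drifted(g, t, t0=1.0, nu=0.05):
    """Empirical PCM drift law: conductance decays as (t/t0)**(-nu)."""
    return g * (t / t0) ** (-nu)

def analog_mac(g, v):
    """MAC as Ohm's law plus Kirchhoff current summation: i = sum_j g_j * v_j."""
    return g @ v

w = rng.random(64)                  # one column of weights
v = rng.random(64)                  # input voltages
g = drifted(program(w), t=1e4)      # read 10^4 s after programming
# A simple circuit-level drift compensation: rescale by the expected drift factor.
i_corrected = analog_mac(g, v) * (1e4 / 1.0) ** 0.05
```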
Abstract:
We present a new quantum description for the Oppenheimer-Snyder model of gravitational collapse of a ball of dust. Starting from the geodesic equation for dust in spherical symmetry, we introduce a time-independent Schrödinger equation for the radius of the ball. The resulting spectrum is similar to that of the Hydrogen atom and Newtonian gravity. However, the non-linearity of General Relativity implies that the ground state is characterised by a principal quantum number proportional to the square of the ADM mass of the dust. For a ball with ADM mass much larger than the Planck scale, the collapse is therefore expected to end in a macroscopically large core and the singularity predicted by General Relativity is avoided. Mathematical properties of the spectrum are investigated and the ground state is found to have support essentially inside the gravitational radius, which makes it a quantum model for the matter core of Black Holes. In fact, the scaling of the ADM mass with the principal quantum number agrees with the Bekenstein area law and the corpuscular model of Black Holes. Finally, the uncertainty on the size of the ground state is interpreted within the framework of an Uncertainty Principle.
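A rough consistency check of the quoted scaling, ours rather than the paper's derivation: if the ground state of a ball of ADM mass M carries a principal quantum number N ∝ M², then counting one quantum per Planck area on the horizon reproduces the Bekenstein area law:

\[
A_H = \frac{16\pi\, G^2 M^2}{c^4} \;\sim\; N\,\ell_p^2
\quad\Longrightarrow\quad
N \;\sim\; \Big(\frac{M}{m_p}\Big)^{2},
\qquad \ell_p^2 = \frac{G\hbar}{c^3},\quad m_p^2 = \frac{\hbar c}{G}.
\]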
Abstract:
The main aim of this thesis is to prove the local Lipschitz regularity of weak solutions to a class of parabolic PDEs modeled on the parabolic p-Laplacian. This result is well known in the Euclidean case and has recently been extended to the Heisenberg group, while higher regularity results are not known in the subriemannian parabolic setting. In this thesis we consider vector fields more general than those of the Heisenberg setting, which introduces some technical difficulties. To obtain the main result we use a Moser-like iteration. Due to the non-linearity of the equation, we replace the usual parabolic cylinders with new ones whose dimensions also depend on the L^p norm of the solution. In addition, we considerably simplify the iterative procedure by using the standard Sobolev inequality instead of the parabolic one.
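For orientation, the Euclidean model equation behind this class is the parabolic p-Laplacian; the subriemannian version replaces the gradient by the horizontal gradient along a family of vector fields X = (X_1, ..., X_m):

\[
\partial_t u \;=\; \operatorname{div}\!\left(|\nabla u|^{p-2}\,\nabla u\right)
\qquad\rightsquigarrow\qquad
\partial_t u \;=\; \sum_{i=1}^{m} X_i\!\left(|Xu|^{p-2}\,X_i u\right),
\]

where |Xu|² = Σᵢ (Xᵢu)². For p ≠ 2 the equation degenerates or becomes singular where Xu vanishes, which is why the parabolic cylinders must be adapted to the solution itself.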
Abstract:
This work focused on the development and validation of an RP-HPLC-UV method for the quantification of beta-lactam antibiotics in three pharmaceutical samples. The active principles analyzed were amoxicillin and ampicillin, in three veterinary drugs. The mobile phase comprised a 5 mmol L⁻¹ phosphoric acid solution at pH 2.00 and acetonitrile in gradient elution mode, with detection at 220 nm. The method was validated according to the Brazilian National Health Surveillance regulation, with linear range and linearity, selectivity, precision, accuracy and ruggedness being evaluated. Inter-day precision and accuracy for pharmaceutical samples 1, 2 and 3 were 1.43 and 1.43%, 4.71 and 3.74%, and 2.72 and 1.72%, respectively, while the regression coefficients of the analytical curves exceeded 0.99. The method showed acceptable figures of merit, indicating reliable quantification. The analyzed samples had active principle concentrations varying from −12 to +21% relative to the manufacturers' label claims, rendering these medicines unsafe for administration to animals.
Abstract:
It is shown that the tight-binding approximation of the nonlinear Schrödinger equation with a periodic linear potential and a spatially periodic nonlinearity coefficient gives rise to a number of nonlinear lattices with complex neighbor interactions, both linear and nonlinear. The obtained lattices exhibit nonstandard possibilities, among which we mention a quasilinear regime, where the pulse dynamics obeys an essentially linear Schrödinger equation. We analyze the properties of such models both with respect to their modulational stability and with regard to the existence and stability of their localized solitary wave solutions.
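Schematically, in notation assumed here rather than taken from the paper, the starting model is

\[
i\,\partial_t \psi = -\partial_x^2 \psi + V(x)\,\psi + g(x)\,|\psi|^2\psi,
\qquad V(x+L)=V(x),\quad g(x+L)=g(x),
\]

and expanding \(\psi(x,t) = \sum_n c_n(t)\, w_n(x)\) over localized (Wannier-type) modes of the periodic linear problem yields lattice equations of the type

\[
i\,\dot c_n = \omega\, c_n + \kappa\,(c_{n+1}+c_{n-1})
+ \chi_0\,|c_n|^2 c_n
+ \chi_1\!\left[\,|c_n|^2(c_{n+1}+c_{n-1}) + c_n^2(\bar c_{n+1}+\bar c_{n-1})\right] + \dots,
\]

where the coefficients are overlap integrals of the modes against V and g; the quasilinear regime corresponds, roughly, to parameter choices for which the on-site cubic coefficient χ₀ effectively vanishes.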
Abstract:
Graduate Program in Linguistics and Portuguese Language - FCLAR
Abstract:
The complex event processing (CEP) paradigm has become the solution for real-time analysis of high volumes of data demanding scalability, high throughput, and low latency; examples of applications are financial processing and road traffic monitoring. A distributed system is used to achieve these performance requirements, and the same requirements force the system not to store the events but to process them on the fly as they are received. Such distributed systems are complex and require considerable time to learn and use, and most of them lack a declarative query language in which to express the computation to perform over incoming events. In this work, an SQL-like declarative query language and a compiler have been developed; the compiler translates this new language into the native language of the distributed event processing system. Because of its similarity with SQL, with which a great number of developers are already familiar, learning this language requires little effort. The language thus reduces execution failures of queries deployed on the distributed system, while abstracting the programmer from the details of the underlying system.
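The central constraint described above (events must be processed on the fly, never stored in bulk) is the classic windowed-aggregation pattern of CEP systems. A minimal, self-contained Python sketch of that pattern follows; it illustrates the idea only, not the thesis's language or compiler, and the SQL-like comment shows purely hypothetical syntax.

```python
from collections import deque

def sliding_window_avg(events, window_seconds=60.0):
    """Emit a running average over a time window, keeping only the events
    still inside the window rather than the whole stream.

    In an SQL-like CEP language this is roughly (hypothetical syntax):
        SELECT AVG(value) FROM stream WINDOW 60 SECONDS
    """
    window = deque()
    total = 0.0
    for ts, value in events:          # events: iterable of (timestamp, value)
        window.append((ts, value))
        total += value
        # Evict events that fell out of the time window.
        while window and window[0][0] < ts - window_seconds:
            _, old_value = window.popleft()
            total -= old_value
        yield ts, total / len(window)

# Example: three events spread over two minutes.
for ts, avg in sliding_window_avg([(0.0, 10.0), (30.0, 20.0), (90.0, 30.0)]):
    print(ts, avg)
```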
Abstract:
The term vaccinology has become more visible in the 21st century, defining a scientific field that absorbed aspects of different scientific domains until it finally acquired an identity of its own. As a result, vaccinology brings together a long tradition of researchers who operated within a linear paradigm and incorporates new generations of scientists who have forged an exciting and diverse network of knowledge within this field. The term, which initially appeared in isolation at the time of Jenner and once again with the emergence of the Pasteurian model, acquired further prominence thanks to the efforts of the vaccinologists who chronicled the production of vaccines in the last third of the 20th century. It has since become truly consolidated, with the appearance of new qualifying adjectives during this century. This study provides a historical perspective on the frequency of use and evolution of this increasingly widespread term.
Abstract:
New arguments are presented proving that successive (repeated) measurements have a memory and actually remember each other. The recognition of this peculiarity can essentially change the existing paradigm associated with conventional observation of the behavior of different complex systems, and it leads towards the application of an intermediate model (IM). This IM can provide a very accurate fit of the measured data in terms of the Prony decomposition. This decomposition, in turn, contains a small set of fitting parameters relative to the number of initial data points and allows comparison of measured data in cases where a "best fit" model based on specific physical principles is absent. As an example, we consider two X-ray diffractometers (defined in the paper as A ("cheap") and B ("expensive")) that are used, after proper calibration, for measuring the same substance (corundum, α-Al2O3). The amplitude-frequency response (AFR) obtained in the frame of the Prony decomposition can be used to compare the spectra recorded by the A and B X-ray diffractometers (XRDs) for calibration and other practical purposes. We also prove that the Fourier decomposition corresponds to an "ideal" experiment without memory, while the Prony decomposition corresponds to a real measurement and can in this case be fitted in the frame of the IM. New statistical parameters describing the properties of experimental equipment (irrespective of their internal "filling") are found. The suggested approach is rather general and can be used for the calibration and comparison of different complex dynamical systems for practical purposes.
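A minimal sketch of the Prony decomposition itself, assuming uniformly sampled data (the classical linear-prediction variant; function and parameter names are ours, not the paper's):

```python
import numpy as np

def prony_fit(y, p):
    """Fit y[k] ~ sum_j a_j * z_j**k with p complex (damped) exponentials."""
    y = np.asarray(y, dtype=float)
    N = len(y)
    # 1) Linear prediction: y[k] = c_1 y[k-1] + ... + c_p y[k-p], least squares.
    A = np.column_stack([y[p - 1 - j : N - 1 - j] for j in range(p)])
    c, *_ = np.linalg.lstsq(A, y[p:], rcond=None)
    # 2) Roots of the characteristic polynomial give the exponentials z_j.
    z = np.roots(np.r_[1.0, -c])
    # 3) Complex amplitudes from the Vandermonde system, least squares again.
    V = z[np.newaxis, :] ** np.arange(N)[:, np.newaxis]
    a, *_ = np.linalg.lstsq(V, y.astype(complex), rcond=None)
    return a, z

# Example: recover two damped oscillations from 200 samples.
k = np.arange(200)
y = 1.5 * 0.99**k * np.cos(0.3 * k) + 0.8 * 0.97**k * np.cos(0.8 * k + 1.0)
a, z = prony_fit(y, p=4)      # 4 exponentials = 2 conjugate pairs
y_hat = (z[np.newaxis, :] ** k[:, np.newaxis] @ a).real
```

The AFR comparison described in the abstract then amounts to comparing the fitted (a_j, z_j) sets obtained from the two instruments.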
Abstract:
Swarm colonies reproduce social behaviors: working together in a group to reach a predefined goal is a social behaviour occurring in nature. Optimization problems have been approached by different techniques based on natural models. In particular, Particle Swarm Optimization is a meta-heuristic search technique that has proven to be effective when dealing with complex optimization problems. This paper presents and develops a new method based on different penalty strategies to solve complex problems. It focuses on the training process of neural networks, the constraints, and the selection of the parameters needed to ensure successful results and to avoid the most common obstacles when searching for optimal solutions.
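As a concrete reference point, here is a minimal PSO with a static penalty for constraint handling; it illustrates the general technique only, with a toy objective and constraint of our own choosing rather than the paper's neural network training setup or its specific penalty strategies.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    """Toy stand-in for a training loss."""
    return float(np.sum(x**2))

def violation(x):
    """Toy constraint sum(x) >= 1, returned as its degree of violation."""
    return max(0.0, 1.0 - float(np.sum(x)))

def penalized(x, lam=100.0):
    """Static penalty strategy: add the squared violation to the fitness."""
    return objective(x) + lam * violation(x)**2

def pso(f, dim=5, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, f(gbest)

best, best_f = pso(penalized)
```

The penalty weight lam trades feasibility against objective quality; the "different penalty strategies" of the abstract correspond, broadly, to different choices of this term (static, dynamic, adaptive).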
Abstract:
This paper proposes a physically non-linear formulation to deal with steel fiber reinforced concrete by the finite element method. The proposed formulation allows the consideration of short or long fibers placed arbitrarily inside a continuum domain (matrix). The most important feature of the formulation is that no additional degrees of freedom are introduced into the pre-existing finite element system to account for any distribution or quantity of fiber inclusions. In other words, the system of equations used to solve a non-reinforced medium has the same size as the one used to solve its reinforced counterpart. Another important characteristic of the formulation is the reduced effort required from the user to introduce reinforcements, avoiding "rebar" elements, node-by-node geometrical definitions, or complex mesh generation. Bonded connection between long fibers and the continuum is considered; for short fibers, a simplified approach is proposed to account for splitting. Non-associative plasticity is adopted for the continuum and one-dimensional plasticity is adopted to model the fibers. Examples are presented in order to show the capabilities of the formulation.
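The "no additional degrees of freedom" idea can be sketched as an embedded-reinforcement condensation: the fiber's bar stiffness is mapped onto the host element's DOFs through the element shape functions. The linear-elastic, perfectly bonded Python sketch below is our illustration of that mechanism, not the paper's non-linear, bond-slip formulation; all names and values are assumptions.

```python
import numpy as np

def q4_shape(xi, eta):
    """Bilinear Q4 shape functions at natural coordinates (xi, eta)."""
    return 0.25 * np.array([(1 - xi) * (1 - eta), (1 + xi) * (1 - eta),
                            (1 + xi) * (1 + eta), (1 - xi) * (1 + eta)])

def bar_stiffness(p1, p2, E, A):
    """Global 4x4 stiffness of a 2-node bar (the fiber segment)."""
    L = np.linalg.norm(p2 - p1)
    c, s = (p2 - p1) / L
    T = np.array([[c, s, 0, 0], [0, 0, c, s]])
    k_local = (E * A / L) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    return T.T @ k_local @ T

def embed_fiber(K_elem, fiber_nat, fiber_xy, E_f, A_f):
    """Condense the fiber stiffness onto the Q4 element's 8 DOFs.
    fiber_nat: fiber endpoints in natural coords; fiber_xy: same points in
    global coords. No new unknowns enter the global system."""
    k_f = bar_stiffness(fiber_xy[0], fiber_xy[1], E_f, A_f)
    # Mapping matrix: fiber DOFs expressed through the element shape functions.
    H = np.zeros((4, 8))
    for i, (xi, eta) in enumerate(fiber_nat):
        N = q4_shape(xi, eta)
        H[2 * i,     0::2] = N   # u at fiber node i
        H[2 * i + 1, 1::2] = N   # v at fiber node i
    return K_elem + H.T @ k_f @ H

# Example: a unit-square Q4 element with one diagonal fiber segment.
K = embed_fiber(np.zeros((8, 8)),
                fiber_nat=[(-0.5, -0.5), (0.5, 0.5)],
                fiber_xy=np.array([[0.25, 0.25], [0.75, 0.75]]),
                E_f=200e9, A_f=5e-5)
```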