941 results for motion picture industry
Abstract:
Do we view the world differently if it is described to us in figurative rather than literal terms? An answer to this question would reveal something about both the conceptual representation of figurative language and the scope of top-down influences on scene perception. Previous work has shown that participants will look longer at a path region of a picture when it is described with a type of figurative language called fictive motion (The road goes through the desert) rather than without (The road is in the desert). The current experiment provided evidence that such fictive motion descriptions affect eye movements by evoking mental representations of motion. If participants heard contextual information that would hinder actual motion, it influenced how they viewed a picture when it was described with fictive motion. Inspection times and eye movements scanning along the path increased during fictive motion descriptions when the terrain was first described as difficult (The desert is hilly) as compared to easy (The desert is flat); there were no such effects for descriptions without fictive motion. It is argued that fictive motion evokes a mental simulation of motion that is immediately integrated with visual processing, and hence figurative language can have a distinct effect on perception. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
This paper presents a parallel Linear Hashtable Motion Estimation Algorithm (LHMEA). Most parallel video compression algorithms focus on the Group of Pictures (GOP). Based on the LHMEA we proposed earlier [1][2], we developed a parallel motion estimation algorithm that works within a single frame. We divide each reference frame into equally sized regions, which are processed in parallel to increase the encoding speed significantly. The theoretical and measured speed-ups of the parallel LHMEA as a function of the number of PCs in the cluster are compared and discussed. Motion Vectors (MV) are generated from the first-pass LHMEA and used as predictors for the second-pass Hexagonal Search (HEXBS) motion estimation, which only searches a small number of Macroblocks (MBs). We evaluated a distributed parallel implementation of the LHMEA of the TPA for real-time video compression.
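To make the per-region parallelization concrete, the following is a minimal sketch under stated assumptions: a frame is split into equally sized row bands, each handled by a worker process; a plain SAD full search stands in for the hashtable-based first pass, and all names (estimate_region, MB, radius) are illustrative, not the authors' implementation.

# Minimal sketch of per-region parallel motion estimation (not the paper's code).
import numpy as np
from multiprocessing import Pool

MB = 16  # macroblock size in pixels

def sad(block_a, block_b):
    """Sum of absolute differences between two blocks."""
    return np.abs(block_a.astype(np.int32) - block_b.astype(np.int32)).sum()

def estimate_region(args):
    """Brute-force search in a small window for every macroblock of one region (illustrative only)."""
    cur, ref, row_start, row_end, radius = args
    vectors = {}
    h, w = cur.shape
    for by in range(row_start, row_end, MB):
        for bx in range(0, w, MB):
            block = cur[by:by + MB, bx:bx + MB]
            best = (0, 0)
            best_cost = None
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y and y + MB <= h and 0 <= x and x + MB <= w:
                        cost = sad(block, ref[y:y + MB, x:x + MB])
                        if best_cost is None or cost < best_cost:
                            best, best_cost = (dy, dx), cost
            vectors[(by, bx)] = best
    return vectors

if __name__ == "__main__":
    cur = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    ref = np.roll(cur, (2, -1), axis=(0, 1))   # toy "previous" frame
    n_regions = 4                               # one region per worker process
    rows = np.linspace(0, cur.shape[0], n_regions + 1, dtype=int)
    jobs = [(cur, ref, rows[i], rows[i + 1], 4) for i in range(n_regions)]
    with Pool(n_regions) as pool:
        mvs = {}
        for part in pool.map(estimate_region, jobs):
            mvs.update(part)
    print(len(mvs), "motion vectors")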
Abstract:
This paper presents a novel two-pass algorithm constituted by the Linear Hashtable Motion Estimation Algorithm (LHMEA) and Hexagonal Search (HEXBS) for block-based motion compensation. On the basis of research into previous algorithms, especially an on-the-edge motion estimation algorithm called hexagonal search (HEXBS), we propose the LHMEA and the Two-Pass Algorithm (TPA). We introduce hashtables into video compression. In this paper we employ the LHMEA for the first-pass search over all the Macroblocks (MB) in the picture. Motion Vectors (MV) are then generated from the first pass and are used as predictors for the second-pass HEXBS motion estimation, which only searches a small number of MBs. The evaluation of the algorithm considers three important metrics: time, compression rate and PSNR. The performance of the algorithm is evaluated using standard video sequences and the results are compared to current algorithms. Experimental results show that the proposed algorithm can offer the same compression rate as the Full Search. The LHMEA with the TPA significantly improves on HEXBS and shows a direction for improving other fast motion estimation algorithms, for example Diamond Search.
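The two-pass idea can be sketched as follows, under the assumption that a coarse per-macroblock predictor from the first pass is refined by a small local search in the second pass; the simple square refinement below stands in for HEXBS, and the zero-vector "predictors" stand in for the hashtable output, so this is an illustration of the structure rather than the authors' algorithm.

# Illustrative sketch of the two-pass structure (assumed form, not the authors' code):
# pass 1 yields a motion-vector predictor per macroblock, pass 2 refines it locally.
import numpy as np

MB = 16

def sad(a, b):
    return np.abs(a.astype(np.int32) - b.astype(np.int32)).sum()

def refine(cur, ref, by, bx, pred, radius=2):
    """Second pass: search a small neighbourhood around the first-pass predictor."""
    h, w = cur.shape
    block = cur[by:by + MB, bx:bx + MB]
    best, best_cost = pred, None
    for dy in range(pred[0] - radius, pred[0] + radius + 1):
        for dx in range(pred[1] - radius, pred[1] + radius + 1):
            y, x = by + dy, bx + dx
            if 0 <= y and y + MB <= h and 0 <= x and x + MB <= w:
                cost = sad(block, ref[y:y + MB, x:x + MB])
                if best_cost is None or cost < best_cost:
                    best, best_cost = (dy, dx), cost
    return best

cur = np.random.randint(0, 256, (48, 48), dtype=np.uint8)
ref = np.roll(cur, (1, 2), axis=(0, 1))
predictors = {(by, bx): (0, 0)                 # stand-in for the hashtable first pass
              for by in range(0, 48, MB) for bx in range(0, 48, MB)}
final_mvs = {pos: refine(cur, ref, pos[0], pos[1], pred) for pos, pred in predictors.items()}
print(final_mvs)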
Abstract:
In this paper, we construct a dynamic portrait of the inner asteroid belt. We use information about the distribution of test particles, initially placed on a perfectly rectangular grid of initial conditions, after 4.2 Myr of gravitational interactions with the Sun and five planets, from Mars to Neptune. Using the spectral analysis method introduced by Michtchenko et al., the asteroidal behaviour is illustrated in detail on dynamical, averaged and frequency maps. On the averaged and frequency maps, we superpose information on the proper elements and proper frequencies of real objects, extracted from the AstDyS database constructed by Milani and Knezevic. A comparison of the maps with the distribution of real objects allows us to detect possible dynamical mechanisms acting in the domain under study; these mechanisms are related to mean-motion and secular resonances. We note that the two- and three-body mean-motion resonances and the secular resonances (strong linear and weaker non-linear) play an important role in the diffusive transport of the objects. Their long-lasting action, overlaid with the Yarkovsky effect, may explain many observed features of the density, size and taxonomic distributions of the asteroids.
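For readers unfamiliar with frequency maps, the core spectral-analysis step amounts to extracting the dominant (proper) frequency from the time series of an orbital element for each grid point. The sketch below is a generic FFT-based illustration on a synthetic signal; it is not the specific method of Michtchenko et al., and the sampling step and frequency value are invented for the example.

# Generic illustration of a spectral-analysis step behind a frequency map (synthetic data).
import numpy as np

dt_yr = 100.0                      # assumed sampling step of the filtered orbital element, in years
t = np.arange(0, 4.2e6, dt_yr)     # 4.2 Myr of output, matching the integration span above
g_true = 28.0 * (np.pi / 180.0) / 3600.0            # toy secular frequency: 28 arcsec/yr in rad/yr
signal = 0.1 * np.cos(g_true * t) + 0.01 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(t.size, d=dt_yr)            # cycles per year
g_est = 2 * np.pi * freqs[np.argmax(spectrum)]      # dominant angular frequency, rad/yr
print("recovered frequency:", np.degrees(g_est) * 3600, "arcsec/yr")  # ~28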
Abstract:
This paper presents the second part of our study of the global structure of the planar phase space of the planetary three-body problem when both planets lie in the vicinity of a 2/1 mean-motion resonance. While Paper I was devoted to cases where the outer planet is the more massive body, the present work is devoted to cases where the more massive body is the inner planet. As before, outside the well-known Apsidal Corotation Resonances (ACR), the phase space shows a complex picture marked by the presence of several distinct regimes of resonant and non-resonant motion, crossed by families of periodic orbits and separated by chaotic zones. When the chosen values of the integrals of motion lead to symmetric ACR, the global dynamics are generally similar to the structure presented in Paper I. For asymmetric ACR, however, the resonant phase space is strikingly different and shows a wealth of distinct dynamical states. This structure is shown with the help of dynamical maps constructed on two different representative planes, one centred on the unstable symmetric ACR and the other on the stable asymmetric equilibrium solution. Although the study described in this work may be applied to any mass ratio, we present a detailed analysis for mass values similar to the Jupiter-Saturn case. The results give a global view of the different dynamical states available to resonant planets with these characteristics. Some of these dynamical paths could have marked the evolution of the giant planets of our Solar system, assuming they underwent a temporary capture in the 2/1 resonance during the late stages of the formation of the Solar system.
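As background for the terms used above, the critical angles of the 2/1 mean-motion resonance are conventionally defined as (standard notation, not an equation quoted from the paper):

\[
\sigma_1 = 2\lambda_2 - \lambda_1 - \varpi_1, \qquad
\sigma_2 = 2\lambda_2 - \lambda_1 - \varpi_2, \qquad
\Delta\varpi = \varpi_1 - \varpi_2,
\]

where the $\lambda_i$ are mean longitudes and the $\varpi_i$ longitudes of pericentre (subscript 1 for the inner planet, 2 for the outer). In an Apsidal Corotation Resonance both $\sigma_1$ and $\Delta\varpi$ librate; symmetric ACR librate about 0 or $\pi$, whereas asymmetric ACR librate about other values.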
Abstract:
The auditing role in the contemporary business environment, together with the increasing interest in and demand for governance and transparency, has become even more important to society as a whole, as a basis for the development of businesses and the generation of wealth through technical knowledge, independence, transparency, credibility, and ethics. Nevertheless, the external financial audit industry worldwide, and particularly in Brazil, has faced several challenges that threaten its success and evolution. Since the external audit industry in Brazil has been immersed in a deep crisis, whose features are explored in this study, an analogy can be drawn: the external financial audit industry is like a person with a chronic disease that has not yet been diagnosed, who has been treating only the isolated symptoms. This person, the external audit industry, has struggled with the disease for many years, and it is getting worse. It is fundamental to highlight that the challenges faced by the external audit industry in Brazil ultimately harm not only the industry itself; they also materialize as chronic issues for corporate governance and the capital markets, since they harm every interested party. In my view, the hardest hit are the investors and shareholders whose interests the independent auditor's work seeks to preserve. The purpose of this study is therefore to build a picture of the challenges faced by the external audit industry in Brazil and to understand those challenges as a prerequisite for analysing potential alternatives to solve them or, by analogy, to diagnose the disease. The research aims to map and identify the challenges faced by the external audit industry in Brazil based on the understanding of professionals seasoned in the area. Those challenges are mapped and understood through a methodological approach: a questionnaire answered by auditors with experience in the Brazilian auditing market. The challenges were preliminarily listed based on the author's more than 16 years of experience in auditing and financial and accounting services, on discussions and interviews about the topic with seasoned professionals, and on analyses of news items, publications and academic studies. The questionnaire was used to validate the challenges, observations, perspectives, and perceptions gathered through those sources. Although the topic is highly relevant, my research did not find other analyses of it with an approach similar to the one intended here. The external audit industry in Brazil seems to have moved through this new age dealing with problems on a daily basis, and the real challenges of the industry may be concealed by the economic conditions in Brazil and other explanations. As in any problematic scenario requiring critical analysis, an accurate picture and understanding of the challenges is a crucial step towards exploring alternatives to address them.
Resumo:
Content marketing refers to marketing format that involves the creation and sharing of media and publishing content in order to acquire customers. It is focused not on selling, but on communicating with customers and prospects. In today world´s, a trend has been seen in brands becoming publishers in order to keep up with their competition and more importantly to keep their base of fans and followers. Content Marketing is making companies to engage consumers by publishing engaging and value-filled content. This study aims to investigate if there is a link between brand engagement and Facebook Content Marketing practices in the e-commerce industry in Brazil. Based on the literature review, this study defines brand engagement on Facebook as the numbers of "likes" "comments" and "shares" that a company receives from its fans. These actions reflect the popularity of the brand post and leads to engagement. The author defines a scale where levels of Content Marketing practices are developed in order to analyze brand posts on Facebook of an ecommerce company in Brazil. The findings reveal that the most important criterion for the company is the one regarding the picture of the post, where it examines whether the photo content is appealing to the audience. Moreover, it was perceived that the higher the level of these criterion in a post, the greater the number of likes, comments and shares the post receives. The time when a post is published does not present a significant role in determining customer engagement and the most important factor within a publication is to reach the maximum level in the Content Marketing Scale.
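As a purely hypothetical illustration of the kind of analysis described (the data, scale levels and names below are invented, not the study's dataset): each post receives a Content Marketing Scale level and an engagement count (likes + comments + shares), and the two are correlated.

# Hypothetical sketch: relate scale level to engagement per post (synthetic data).
import numpy as np

posts = [  # (scale_level, likes, comments, shares) -- invented values
    (1, 40, 2, 1), (2, 90, 5, 3), (3, 150, 12, 9),
    (1, 35, 1, 0), (3, 180, 20, 15), (2, 110, 7, 4),
]
levels = np.array([p[0] for p in posts])
engagement = np.array([p[1] + p[2] + p[3] for p in posts])
r = np.corrcoef(levels, engagement)[0, 1]
print(f"correlation between scale level and engagement: {r:.2f}")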
Abstract:
We evaluate the vacuum polarization tensor for three-dimensional quantum electrodynamics (QED3) via the Heisenberg equations of motion, in order to clarify the problem arising from the use of different regularization prescriptions in the interaction picture. We conclude that the photon does acquire a physical mass of topological origin when this contribution is taken into account in the photon propagator.
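For orientation only, the vacuum polarization in QED3 is conventionally split into parity-even and parity-odd parts (standard background notation, not an expression quoted from the paper):

\[
\Pi_{\mu\nu}(p) \;=\; \Big(g_{\mu\nu} - \frac{p_\mu p_\nu}{p^{2}}\Big)\,\Pi_{e}(p^{2}) \;+\; i\,\epsilon_{\mu\nu\alpha}\,p^{\alpha}\,\Pi_{o}(p^{2}),
\]

and the topological photon mass is set by the parity-odd form factor at zero momentum, $m_\gamma = \Pi_{o}(0)$. Its one-loop value is proportional to $e^{2}$, with a coefficient that depends on the regularization prescription; this ambiguity is precisely the issue the paper addresses.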
Abstract:
We study a charged Brownian gas with a non-uniform bath temperature and present a thermohydrodynamical picture. An expansion in the collision time probes the validity of the local equilibrium approach and the relevant thermodynamical variables. For the linear regime we present several applications, some of them novel.
Abstract:
We consider a charged Brownian gas under the influence of external and non-uniform electric, magnetic and mechanical fields, immersed in a non-uniform bath temperature. With the collision time as an expansion parameter, we study the solution of the associated Kramers equation, including a linear reactive term. To first order we obtain the asymptotic (overdamped) regime, governed by transport equations, namely: for the particle density, a Smoluchowski-like reactive equation; for the particle momentum density, a generalized Ohm's-law-like equation; and for the particle energy density, a Maxwell-Cattaneo-like equation. Defining a nonequilibrium temperature as the mean kinetic energy density, and introducing Boltzmann's entropy density via the one-particle distribution function, we present a complete thermohydrodynamical picture for a charged Brownian gas. We probe the validity of the local equilibrium approximation, the Onsager relations and the variational principles associated with entropy production, and apply our results to carrier transport in semiconductors, hot carriers and Brownian motors. Finally, we outline a method to incorporate non-linear reactive kinetics and a mean-field approach to interacting Brownian particles. © 2011 Elsevier B.V. All rights reserved.
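As a rough guide to the three equation types named above, schematic textbook forms are given below; coefficients, field couplings and the precise structure of the paper's derivation are omitted, so these are not the paper's results:

\[
\partial_t n \;=\; \nabla\!\cdot\!\big(D\,\nabla n \;-\; \mu\, n\,\mathbf{F}\big) \;-\; k\,n
\qquad \text{(Smoluchowski-like, with a linear reactive term)},
\]
\[
\mathbf{J} \;=\; \sigma\big(\mathbf{E} + \mathbf{v}\times\mathbf{B}\big) \;-\; D\,\nabla\rho
\qquad \text{(generalized Ohm's-law-like)},
\]
\[
\tau\,\partial_t \mathbf{q} \;+\; \mathbf{q} \;=\; -\,\kappa\,\nabla T
\qquad \text{(Maxwell-Cattaneo-like)}.
\]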
Abstract:
We consider a charged Brownian gas under the influence of external, static and uniform electric and magnetic fields, immersed in a uniform bath temperature. We obtain the solution of the associated Langevin equation, and from it the evolution of the nonequilibrium temperature towards a nonequilibrium (hot) steady state. We apply our results to a simple yet relevant Brownian model for carrier transport in GaAs, obtain a negative differential conductivity regime (Gunn effect), and compare our results with experimental data. © 2013.
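For reference, the generic Langevin equation for a charged Brownian particle in static, uniform fields reads (the notation is generic, not necessarily the paper's):

\[
m\,\dot{\mathbf{v}} \;=\; q\big(\mathbf{E} + \mathbf{v}\times\mathbf{B}\big) \;-\; \gamma\,\mathbf{v} \;+\; \boldsymbol{\xi}(t),
\qquad
\langle \xi_i(t)\,\xi_j(t') \rangle \;=\; 2\,\gamma\,k_B T\,\delta_{ij}\,\delta(t-t'),
\]

with $\gamma$ the friction coefficient and $\boldsymbol{\xi}$ a Gaussian white noise; the stationary mean kinetic energy then defines the hot-carrier (nonequilibrium) temperature discussed above.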
Abstract:
The aim of this thesis was to describe the development of motion analysis protocols for applications on the upper and lower limbs, using inertial sensor-based systems. Inertial sensor-based systems are relatively recent, and knowledge of methods and algorithms for their use for clinical purposes is therefore limited compared with stereophotogrammetry. However, their advantages in terms of low cost, portability and small size are a valid reason to follow this direction. When developing motion analysis protocols based on inertial sensors, attention must be paid to several aspects, such as the accuracy and reliability of inertial sensor-based systems. The need to develop specific algorithms, methods and software for using these systems in specific applications is as important as the development of the motion analysis protocols based on them. For this reason, the goal of the three-year research project described in this thesis was pursued first of all by correctly designing the protocols based on inertial sensors, exploring and developing the features suitable for each specific application. The use of optoelectronic systems was necessary because they provide a gold-standard, accurate measurement, which was used as a reference for the validation of the protocols based on inertial sensors. The protocols described in this thesis can be particularly helpful for rehabilitation centres in which the high cost of instrumentation or the limited working area does not allow the use of stereophotogrammetry. Moreover, many applications requiring upper and lower limb motion analysis to be performed outside the laboratory will benefit from these protocols, for example gait analysis along corridors. Outdoors, steady-state walking or the behaviour of prosthetic devices when encountering slopes or obstacles during walking can also be assessed. The application of inertial sensors to lower-limb amputees presents conditions that are challenging for magnetometer-based systems, due to the ferromagnetic materials commonly adopted in the construction of hydraulic components or motors. The INAIL Prostheses Centre stimulated and, together with Xsens Technologies B.V., supported the development of additional methods for improving the accuracy of the MTx in measuring the 3D kinematics of lower-limb prostheses, with the results provided in this thesis. In the author's opinion, this thesis and the motion analysis protocols based on inertial sensors described here demonstrate how a close collaboration between industry, clinical centres and research laboratories can improve knowledge, exchange know-how, and pursue the common goal of developing new application-oriented systems.
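A typical validation step of the kind described above compares a joint-angle time series estimated from inertial sensors with the stereophotogrammetric reference. The sketch below is hypothetical: the signals are synthetic and the sampling rate, noise level and metric choices (RMSE, correlation) are assumptions, not the thesis protocols.

# Hypothetical validation step: compare an IMU-derived angle with an optoelectronic reference.
import numpy as np

fs = 100.0                                  # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)                # a 10 s trial
reference = 30 * np.sin(2 * np.pi * 1.0 * t) + 10                 # "gold standard" angle, deg
imu_estimate = reference + np.random.normal(0, 1.5, t.size) + 0.5 # noise plus a small offset

rmse = np.sqrt(np.mean((imu_estimate - reference) ** 2))
corr = np.corrcoef(imu_estimate, reference)[0, 1]
print(f"RMSE = {rmse:.2f} deg, r = {corr:.3f}")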
Abstract:
In many areas of industrial manufacturing, such as the automotive industry, digital mock-ups are used so that the development of complex machines can be supported as well as possible by computer systems. Motion planning algorithms play an important role here in guaranteeing that these digital prototypes can be assembled without collisions. In recent decades, sampling-based methods have proven particularly successful. They generate a large number of random poses for the object to be installed or removed and use a collision detection mechanism to check each pose for validity. Collision detection therefore plays an essential role in the design of efficient motion planning algorithms. One difficulty for this class of planners is posed by so-called "narrow passages", which occur wherever the freedom of movement of the objects being planned for is strongly restricted. In such places it can be hard to find a sufficient number of collision-free samples, and more sophisticated techniques may be needed to achieve good performance.
This thesis is divided into two parts. In the first part we investigate parallel collision detection algorithms. Since we target their use in sampling-based motion planners, we consider a setting in which the same two objects are tested for collision in a large number of different relative poses. We implement and compare several methods that use bounding volume hierarchies (BVHs) and hierarchical grids as acceleration structures. All of the described methods were parallelized across multiple CPU cores. In addition, we compare different CUDA kernels for performing BVH-based collision tests on the GPU; besides different distributions of the work across the parallel GPU threads, we investigate the effect of different memory access patterns on the performance of the resulting algorithms. We further present a set of approximate collision tests based on the described methods; when a lower test accuracy is tolerable, a further performance improvement can be achieved.
In the second part of the thesis we describe a parallel, sampling-based motion planner that we designed for handling highly complex problems with several narrow passages. The method works in two phases. The basic idea is to conceptually allow small errors in the first planning phase in order to increase planning efficiency, and then to repair the resulting path in a second phase. The planner used in phase I is based on so-called Expansive Space Trees. In addition, we equipped the planner with a push-out operation that allows small collisions to be resolved, increasing efficiency in regions with restricted freedom of movement. Optionally, our implementation allows the use of approximate collision tests; this further lowers the accuracy of the first planning phase but also leads to a further performance gain.
The motion paths resulting from phase I may therefore not be entirely collision-free. To repair these paths, we designed a novel planning algorithm that plans a new, collision-free motion path locally, restricted to a small neighbourhood of the existing path. We tested the described algorithm on a class of new, difficult metal puzzles, some of which contain several narrow passages. To the best of our knowledge, a collection of comparably complex benchmarks is not publicly available, nor did we find a description of comparably complex benchmarks in the motion planning literature.
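The batch setting described in the first part can be sketched as follows, under stated assumptions: the same two "objects" (here simple point clouds with a collision tolerance) are tested in many random relative poses, and the checks are distributed over worker processes. The brute-force distance test and the names (in_collision, RADIUS) are illustrative stand-ins for the BVH/grid-accelerated CPU and GPU tests of the thesis.

# Minimal sketch of batch collision checking for sampling-based planning (illustrative only).
import numpy as np
from multiprocessing import Pool

RADIUS = 0.05  # two points closer than this are considered colliding (assumed tolerance)

rng = np.random.default_rng(0)
moving = rng.uniform(-0.5, 0.5, size=(200, 3))    # points of the moving part
obstacle = rng.uniform(-0.5, 0.5, size=(200, 3))  # points of the static environment

def in_collision(pose):
    """Apply a rigid pose (yaw angle + translation) to the moving part and test distances."""
    theta, tx, ty, tz = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    placed = moving @ R.T + np.array([tx, ty, tz])
    # brute-force pairwise distances (O(n*m)); BVHs or hierarchical grids would prune most pairs
    d2 = ((placed[:, None, :] - obstacle[None, :, :]) ** 2).sum(axis=-1)
    return bool((d2 < RADIUS ** 2).any())

if __name__ == "__main__":
    poses = rng.uniform(-1.0, 1.0, size=(1000, 4))      # sampled candidate poses
    with Pool(4) as pool:
        results = pool.map(in_collision, list(map(tuple, poses)))
    print(sum(results), "of", len(results), "sampled poses collide")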