932 results for Simulation and Modeling


Relevance:

90.00%

Publisher:

Abstract:

Impactive contact between a vibrating string and a barrier is a strongly nonlinear phenomenon that presents several challenges in the design of numerical models for simulation and sound synthesis of musical string instruments. These are addressed here by applying Hamiltonian methods to incorporate distributed contact forces into a modal framework for discrete-time simulation of the dynamics of a stiff, damped string. The resulting algorithms have spectral accuracy, are unconditionally stable, and require solving a multivariate nonlinear equation that is guaranteed to have a unique solution. Exemplifying results are presented and discussed in terms of accuracy, convergence, and spurious high-frequency oscillations.
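
To make the kind of per-step computation described above concrete, here is a minimal, hypothetical Python sketch of one time step of a modal string model with a single-point barrier, where the contact nonlinearity is resolved by Newton iteration. The penalty contact law, parameter values, and the simplified per-mode update are illustrative assumptions, not the paper's scheme; the paper's algorithm is spectrally accurate and unconditionally stable, and its guaranteed-unique solve is multivariate over a distributed contact region, whereas the scalar solve below only illustrates the structure.

```python
# Hypothetical sketch: one step of a damped stiff-string modal scheme with a
# single-point barrier, contact force found by Newton iteration. The contact
# law and simplified per-mode update are placeholders, not the paper's method.
import numpy as np

N_MODES, L, c = 40, 1.0, 200.0          # modes, length (m), wave speed (m/s)
k = 1.0 / 44100.0                       # time step (s)
x_c, y_b = 0.3, -1e-4                   # contact location, barrier height (m)
K, alpha = 1e10, 1.3                    # penalty stiffness and exponent

n = np.arange(1, N_MODES + 1)
omega = n * np.pi * c / L               # ideal-string modal frequencies
sigma = 0.5 + 1e-4 * omega**1.5         # ad hoc frequency-dependent damping
phi = np.sqrt(2.0 / L) * np.sin(n * np.pi * x_c / L)   # mode shapes at x_c

def f_contact(eta):
    """One-sided penalty force K * [y_b - eta]_+^alpha."""
    return K * np.maximum(y_b - eta, 0.0) ** alpha

def step(a, a_prev):
    """Advance the modal coordinates a by one sample."""
    # Free (contact-less) update of each decoupled, damped mode.
    a_free = (2 * np.cos(omega * k) * np.exp(-sigma * k)) * a \
             - np.exp(-2 * sigma * k) * a_prev
    # Newton iteration for the displacement eta at the contact point:
    # solve eta = phi . (a_free + k^2 * f(eta) * phi), scalar in this sketch.
    eta = float(phi @ a_free)
    s = float(phi @ phi)
    for _ in range(30):
        g = eta - phi @ a_free - k**2 * s * f_contact(eta)
        dg = 1.0 + k**2 * s * alpha * K * np.maximum(y_b - eta, 0.0) ** (alpha - 1)
        eta_next = eta - g / dg          # dg >= 1, so the iteration is well posed
        if abs(eta_next - eta) < 1e-15:
            break
        eta = eta_next
    return a_free + k**2 * f_contact(eta) * phi, a
```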

Relevance:

90.00%

Publisher:

Abstract:

According to self-determination theory, autonomy is a basic universal need which, when supported, allows individuals to function better and experience greater psychological well-being (e.g., Deci & Ryan, 2008). The parenting style of parents who support their child's autonomy is characterized by support for the child's self-determined functioning. Its traditional definition includes practices such as offering explanations and choices when making requests, communicating empathy, and encouraging initiative-taking while minimizing the use of controlling language (e.g., Soenens et al., 2007). The benefits of a parenting style that supports a child's autonomy have been well documented (e.g., Grolnick, Deci, & Ryan, 1997); however, few studies have been conducted with toddlers. This thesis therefore aimed to enrich the parenting literature by exploring the supportive practices used by parents of toddlers in a socialization context (Study 1) and by examining the factors that can hinder their use (Study 2). The first study examined a large number of socialization practices that parents who more strongly endorse autonomy support (AS) might use more frequently when making requests of their toddlers. This study allowed us to explore how parents express AS and whether AS in this type of context is associated with a higher level of internalization of rules. Parents (N = 182) of toddlers (mean age = 27.08 months) were invited to report how frequently they use 26 potentially supportive practices when asking their toddlers to complete important but uninteresting tasks, and to report how much they value AS. Eight practices were identified as supportive: four ways of communicating empathy, giving short explanations, explaining why the task is important, describing the problem in an informative and neutral way, and modeling the desired behavior oneself. Moreover, the set of eight practices correlated positively with the toddlers' level of internalization, further suggesting that these practices are good representations of the AS concept. Future studies could attempt to replicate these results in potentially more charged or unsettling contexts (e.g., responding to misbehavior, or with children with developmental delays). The second study continued exploring the concept of parental AS by examining the factors that influence how frequently supportive strategies are used in socialization contexts. Since the literature suggests that parental stress and difficult toddler temperament (i.e., higher negative affectivity, lower effortful control/self-regulation, lower surgency) are potential risk factors, we explored how these variables were associated with the frequency of use of supportive strategies. The goals of the study were: (1) to examine how toddler temperament and parental stress influence parental AS, and (2) to test whether parental stress mediates the possible relationship between toddler temperament and parental AS. The same sample of parents was used.
Parents were invited to answer questions about their child's temperament and their own stress level. The results showed that higher negative affectivity was associated with greater parental stress, which in turn predicted less parental AS. In addition, parental stress mediated the positive relationship between toddler self-regulation and parental AS. Future research could evaluate interventions aimed at helping parents preserve a supportive attitude during more difficult socialization contexts, despite demanding temperamental characteristics of their toddlers and the stress they may experience day to day.

Relevance:

90.00%

Publisher:

Abstract:

In The Creative Unconscious and Pictorial Sign I explore the dialogue between social language and personal expression to understand how creativity is mediated. I consider how the involuntary inventiveness of artistic creativity and the structuring function of language negotiate what artists can experience and represent. My doctoral practice questions the influence of orthodox postmodernist views and seeks to locate sensual and direct experiences within improvisatory and spontaneous approaches to image making. I ask whether a humanistic and psychological interpretation of creativity can move beyond the copy and quotation advanced by some postmodern theories of simulation and the hyperreal, while retaining the communicative function of visual expression and the model of a social form of signification, rather than naïvely promoting unintelligible and purely personal languages.

Relevance:

90.00%

Publisher:

Abstract:

Software protection is an essential aspect of information security, serving to withstand malicious activities against software and to preserve software assets. However, software developers still lack a methodology for assessing deployed protections. To address this, we present a novel attack-simulation-based software protection assessment method for assessing and comparing protection solutions. Our solution relies on Petri nets to specify and visualize attack models, and we developed a Monte Carlo-based approach to simulate attack processes and deal with uncertainty. Then, based on this simulation and estimation, a novel protection comparison model is proposed to compare different protection solutions. Lastly, the complete attack-simulation-based software protection assessment method is presented. We illustrate our method by means of a software protection assessment process to demonstrate that our approach can provide a suitable software protection assessment for developers and software companies.
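
As a rough illustration of the attack-simulation idea (simplified far below the Petri-net formalism the abstract describes), the following Python sketch Monte Carlo samples an invented chain of attack steps with uncertain success and effort, yielding the success rate and expected effort that a protection comparison model could consume. All step names, probabilities, and distributions are fabricated.

```python
# Minimal Monte Carlo sketch of attack-process simulation under uncertainty.
# A real model would be a Petri net with concurrent paths; this toy uses a
# linear chain of steps, each with an invented success probability and an
# exponentially distributed effort.
import random

ATTACK_STEPS = [
    # (name, success probability, mean effort in hours if attempted)
    ("locate asset",       0.9, 2.0),
    ("defeat obfuscation", 0.4, 30.0),
    ("bypass anti-debug",  0.6, 10.0),
    ("extract secret",     0.7, 5.0),
]

def simulate_attack(rng):
    """Return total effort if the attack chain succeeds, else None."""
    total = 0.0
    for _, p, mean_effort in ATTACK_STEPS:
        total += rng.expovariate(1.0 / mean_effort)  # effort spent on attempt
        if rng.random() > p:
            return None                              # step failed; chain aborts
    return total

def estimate(n_runs=100_000, seed=1):
    rng = random.Random(seed)
    outcomes = [simulate_attack(rng) for _ in range(n_runs)]
    successes = [t for t in outcomes if t is not None]
    return len(successes) / n_runs, sum(successes) / max(len(successes), 1)

p_success, mean_effort = estimate()
print(f"attack success rate ~ {p_success:.3f}, mean effort ~ {mean_effort:.1f} h")
```

Comparing two protection solutions would then amount to running the same simulation with the step parameters each protection induces and comparing the resulting success rates and effort distributions.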

Relevance:

90.00%

Publisher:

Abstract:

The SD card (Secure Digital Memory Card) is a widely used portable storage medium. Recent research on SD cards has focused mainly on SD card controllers based on FPGAs (Field Programmable Gate Arrays). Most rely on an API (Application Programming Interface), the AHB bus (Advanced High-performance Bus), or similar interfaces, and are dedicated to achieving ultra-high-speed communication between the SD card and host systems. SD card controllers play a vital role in high-speed cameras and other specialized fields. The FPGA-based file system and SD2.0 IP (Intellectual Property core) presented here not only achieves a good transmission rate but also provides systematic file management, while remaining highly portable and practical. The design and implementation of the file system on an SD card covers three main points of IP innovation. First, combining and integrating the file system with the SD card controller makes the overall system highly integrated and practical. The popular SD2.0 protocol is implemented for the communication channel, and a pure digital logic design in VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) integrates the SD card controller at the hardware layer with the FAT32 file system for the entire system. Second, the file management mechanism makes document processing more convenient, especially for batch processing of small files: it relieves the host system from frequently accessing and processing them, thereby improving overall system efficiency. Third, the digital design ensures good performance. For transmission integrity, a CRC (Cyclic Redundancy Check) algorithm protects data transfers. Each module is built from platform-independent macro cells, preserving portability, and custom instructions and interfaces make the IP easy to use. Finally, the design was tested on multiple platforms (Xilinx and Altera FPGA development boards), with timing simulation and debugging of each module. Test results show that the FPGA-based file system IP supports SD cards, TF cards, and Micro SD cards under the 2.0 protocol in SD bus mode, implements systematic management of stored files, and achieves read and write rates of approximately 24.27 MB/s and 16.94 MB/s on a Kingston Class 10 card.
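
For context on the CRC protection mentioned above: the SD physical-layer specification protects command frames with a 7-bit CRC (polynomial x^7 + x^3 + 1) and data blocks with CRC-16/CCITT. Below is a small Python reference model of the command CRC, of the kind one might use to check an HDL module against known vectors; the thesis's VHDL implementation is not reproduced here.

```python
# Bit-serial CRC7 for SD command frames, polynomial x^7 + x^3 + 1.
def crc7(data: bytes) -> int:
    crc = 0
    for byte in data:
        for bit in range(7, -1, -1):
            crc <<= 1
            # Feedback = next data bit XOR bit shifted out of the register.
            if ((byte >> bit) & 1) ^ ((crc >> 7) & 1):
                crc ^= 0x09          # taps from x^3 + 1 (0b0001001)
            crc &= 0x7F              # keep the register at 7 bits
    return crc

# CMD0 (GO_IDLE_STATE), zero argument: 0x40 followed by four zero bytes.
frame = bytes([0x40, 0x00, 0x00, 0x00, 0x00])
assert crc7(frame) == 0x4A           # final frame byte is (0x4A << 1) | 1 = 0x95
```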

Relevance:

90.00%

Publisher:

Abstract:

A recent focus on contemporary evolution and the connections between communities has sought to more closely integrate the fields of ecology and evolutionary biology. Studies of coevolutionary dynamics, life history evolution, and rapid local adaptation demonstrate that ecological circumstances can dictate evolutionary trajectories. Thus, variation in species identity, trait distributions, and genetic composition may be maintained among ecologically divergent habitats. New theories and hypotheses (e.g., metacommunity theory and the Monopolization hypothesis) have been developed to better understand the processes occurring in spatially structured environments and how the movement of individuals among habitats contributes to ecology and evolution at broader scales. As few empirical tests of these theories exist, this work seeks to test these concepts further. Spatial and temporal dispersal are the mechanisms that connect habitats to one another. Both processes allow organisms to leave suboptimal or unfavorable conditions, and both enable colonization and invasion, species range expansion, and gene flow among populations. Freshwater zooplankton are aquatic crustaceans that typically develop resting stages as part of their life cycle. These dormant propagules allow organisms to disperse both temporally and among habitats. Additionally, because a number of species are cyclically parthenogenetic, they make excellent model organisms for studying evolutionary questions in a controlled environment. Here, I use freshwater zooplankton communities as model systems to explore the mechanisms and consequences of dispersal and to test these nascent theories on the influence of spatial structure in natural systems. In Chapter one, I use field experiments and mathematical models to determine the range of adult zooplankton dispersal over land and which vectors move zooplankton. Chapter two focuses on prolonged dormancy in one aquatic zooplankter, Daphnia pulex. Using statistical models with field and mesocosm experiments, I show that variation in Daphnia dormant egg hatching is substantial among populations in nature, and that some of that variation can be attributed to genetic differences among the populations. Chapters three and four explore the consequences of dispersal at multiple levels of biological organization. Chapter three seeks to understand the population-level consequences of dispersal over evolutionary time for current patterns of population genetic differentiation. Nearby populations of D. pulex often exhibit high population genetic differentiation characteristic of very low dispersal; I explore two alternative hypotheses that seek to explain this pattern. Finally, Chapter four is a case study of how dispersal has influenced patterns of variation at the community, trait, and genetic levels of biodiversity in a lake metacommunity.

Relevance:

90.00%

Publisher:

Abstract:

Terrestrial planets produce crusts as they differentiate. The Earth's bi-modal crust, with a high-standing granitic continental crust and a low-standing basaltic oceanic crust, is unique in our solar system and links the evolution of the interior and exterior of this planet. Here I present geochemical observations that constrain processes accompanying crustal formation and evolution. My approach includes geochemical analyses, quantitative modeling, and experimental studies. The Archean crustal evolution project represents my perspective on when Earth's continental crust began forming. In this project, I utilized critical element ratios in sedimentary records to track the evolution of the MgO content of the upper continental crust as a function of time. The early Archean subaerial crust had >11 wt. % MgO, whereas by the end of the Archean its composition had evolved to about 4 wt. % MgO, indicating a transition of the upper crust from a basalt-like to a more granite-like bulk composition. Driving this fundamental change in upper crustal composition is the widespread operation of subduction processes, suggesting the onset of global plate tectonics at ~3 Ga. Three of the chapters in this dissertation leverage Eu anomalies to track the recycling of crustal materials back into the mantle, where the Eu anomaly is a sensitive measure of the element's behavior relative to the neighboring lanthanoids (Sm and Gd) during crustal differentiation. My compilation of Sm-Eu-Gd data for the continental crust shows that the average crust has a net negative Eu anomaly. This result requires recycling of Eu-enriched lower continental crust to the mantle. Mass balance calculations require that about three times the mass of the modern continental crust was returned to the mantle over Earth history, possibly via density-driven recycling. High-precision measurements of Eu/Eu* in selected primitive glasses of mid-ocean ridge basalt (MORB) from global MORs, combined with numerical modeling, suggest that the recycled lower crustal materials are not found within the MORB source and may have at least partially sunk into the lower mantle, where they can be sampled by hot spot volcanoes. The Lesser Antilles Li isotope project provides insights into the Li systematics of this young island arc, a representative section of proto-continental crust. Martinique Island lavas represent, to my knowledge, the only clear case in which crustal Li is recycled back into its mantle source, as documented by the isotopically light Li in the Lesser Antilles sediments that feed into the fore-arc subduction trench. By corollary, the mantle-like Li signal in global arc lavas is likely the result of broadly similar Li isotopic compositions between the upper mantle and bulk subducting sediments in most arcs. My PhD project on the Li diffusion mechanism in zircon is being carried out in extensive collaboration with multiple institutes and employs analytical, experimental, and modeling studies. This ongoing project finds that REE and Y play an important role in controlling Li diffusion in natural zircons, with Li partially coupling to REE and Y to maintain charge balance. Access to state-of-the-art instrumentation presented critical opportunities to identify the mechanisms that cause elemental fractionation during laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) analysis. My work here elucidates the elemental fractionation associated with plasma plume condensation during laser ablation and particle-ion conversion in the ICP.
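
For reference, the Eu anomaly invoked above is conventionally quantified against the chondrite-normalized (subscript N) abundances of the neighboring lanthanoids, commonly using their geometric mean:

```latex
% Conventional Eu anomaly: measured Eu relative to the value Eu* interpolated
% from its neighbours Sm and Gd (chondrite-normalized). Eu/Eu* < 1 is a
% negative anomaly, as reported here for the average continental crust.
\frac{\mathrm{Eu}}{\mathrm{Eu}^{*}} \;=\; \frac{\mathrm{Eu}_{N}}{\sqrt{\mathrm{Sm}_{N}\,\mathrm{Gd}_{N}}}
```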

Relevance:

90.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Master in Electrical Engineering, Branch of Automation and Industrial Electronics.

Relevance:

90.00%

Publisher:

Abstract:

The purpose of this report is to present the Crossdock Door Assignment Problem, which involves assigning destinations to the outbound dock doors of crossdock centres so that the travel distance of material handling equipment is minimized. We propose a twofold solution: simulation, plus optimization of the simulation model (simulation optimization). The novel aspect of our approach is that we intend to use simulation to derive a more realistic objective function and memetic algorithms to find an optimal solution. The main advantage of memetic algorithms is that they combine local search with genetic algorithms. The Crossdock Door Assignment Problem is a new application domain for memetic algorithms, and it is not yet known how they will perform.
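
To illustrate the memetic idea (a genetic algorithm whose offspring are refined by local search), here is a toy Python sketch for a door-assignment permutation. The flow and distance data are fabricated, and the closed-form cost merely stands in for the simulation-derived objective the report proposes; this is a sketch of the algorithm class, not the report's implementation.

```python
# Toy memetic algorithm for door assignment: permutation p maps destination d
# to outbound door p[d]; cost is a stand-in for a simulation-derived objective.
import random

N = 10                                            # destinations == doors
rng = random.Random(0)
flow = [rng.randint(1, 20) for _ in range(N)]     # pallets per destination
dist = [rng.uniform(5.0, 50.0) for _ in range(N)] # travel distance per door

def cost(p):
    return sum(flow[d] * dist[p[d]] for d in range(N))

def local_search(p):
    """First-improvement pairwise swaps until no swap helps (the 'meme')."""
    improved = True
    while improved:
        improved = False
        for i in range(N):
            for j in range(i + 1, N):
                q = p[:]
                q[i], q[j] = q[j], q[i]
                if cost(q) < cost(p):
                    p, improved = q, True
    return p

def order_crossover(a, b):
    """OX: copy a slice of parent a, fill remaining genes in parent-b order."""
    i, j = sorted(rng.sample(range(N), 2))
    child = [None] * N
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in a[i:j]]
    for k in range(N):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

pop = [local_search(rng.sample(range(N), N)) for _ in range(20)]
for _ in range(30):
    p1 = min(rng.sample(pop, 3), key=cost)        # tournament selection
    p2 = min(rng.sample(pop, 3), key=cost)
    child = local_search(order_crossover(p1, p2)) # the memetic step
    worst = max(range(len(pop)), key=lambda idx: cost(pop[idx]))
    if cost(child) < cost(pop[worst]):
        pop[worst] = child                        # steady-state replacement
best = min(pop, key=cost)
print("best door assignment:", best, "cost:", round(cost(best), 1))
```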

Relevance:

90.00%

Publisher:

Abstract:

Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of critical services such as e-commerce, remote login, and anonymity networks, together with the increasing feasibility of attacks on these services, represents a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is the use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic, further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process, including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable to, and in some cases more efficient than, generic constant-rate defenses. We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate the parameters of an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, this model lets us better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that match the cover traffic to the real traffic while simultaneously bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic-analysis and defense-detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide a 3-5x improvement in bandwidth for bulk transfers and a 2.5-9.5x speedup for Web browsing relative to tunneling without biasing.
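
As a heavily simplified sketch of model-driven cover traffic generation (not TrafficMimic's actual models or biasing machinery), the Python fragment below draws packet sizes and inter-arrival times from an invented empirical protocol model and embeds real payload opportunistically. The key property it illustrates: the wire-visible size and timing depend only on the model, never on the embedded payload. Requires Python 3.9+ for random.Random.randbytes.

```python
# Toy cover-traffic generator: sizes and timing come from a protocol model;
# real data rides inside fixed-size packets. Distributions are invented.
import random, time

rng = random.Random(42)
SIZE_MODEL = [(96, 0.3), (576, 0.2), (1460, 0.5)]  # (bytes, probability)
MEAN_GAP_S = 0.02                                  # mean inter-arrival time

def next_cover_packet(real_queue: bytearray) -> bytes:
    size = rng.choices([s for s, _ in SIZE_MODEL],
                       weights=[w for _, w in SIZE_MODEL])[0]
    payload = bytes(real_queue[:size])     # embed real traffic when available
    del real_queue[:size]
    filler = rng.randbytes(size - len(payload))    # ciphertext-like padding
    # In a real tunnel the whole packet would then be encrypted, so payload
    # and filler are indistinguishable on the wire.
    return payload + filler                # always exactly `size` bytes sent

queue = bytearray(b"real application data to tunnel" * 50)
for _ in range(5):
    pkt = next_cover_packet(queue)
    time.sleep(rng.expovariate(1.0 / MEAN_GAP_S))  # model-driven timing
    print(len(pkt), "bytes sent")
```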

Relevance:

90.00%

Publisher:

Abstract:

A quasigeostrophic model is developed to diagnose the three-dimensional circulation, including the vertical velocity, in the upper ocean from high-resolution observations of sea surface height and buoyancy. The formulation of the adiabatic component departs from the classical surface quasigeostrophic framework considered previously, since it takes into account the stratification within the surface mixed layer, which is usually much weaker than that in the ocean interior. To achieve this, the model approximates the ocean with two constant-stratification layers: a finite-thickness surface layer (the mixed layer) and an infinitely deep interior layer. It is shown that the leading-order adiabatic circulation is entirely determined if both the surface streamfunction and buoyancy anomalies are considered. The surface layer further includes a diabatic dynamical contribution. The parameterization of diabatic vertical velocities is based on their role in restoring the thermal wind balance that is perturbed by turbulent vertical mixing of momentum and buoyancy. The model's skill in reproducing the three-dimensional circulation in the upper ocean from surface data is checked against the output of a high-resolution primitive equation numerical simulation.
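
For orientation, in the classical surface quasigeostrophic limit that this model generalizes (a semi-infinite interior with a single constant stratification N and zero interior potential vorticity), surface buoyancy alone determines the flow at depth. In horizontal wavenumber space, with the surface condition b = f ∂ψ/∂z at z = 0:

```latex
% Classical SQG inversion: zero interior PV plus the surface buoyancy
% condition give exponential decay of the streamfunction with depth.
% The two-layer model of the abstract modifies this vertical structure to
% account for the weakly stratified mixed layer and additionally uses the
% surface streamfunction.
\hat{\psi}(\mathbf{k}, z) \;=\; \frac{\hat{b}_s(\mathbf{k})}{N\,|\mathbf{k}|}\,
  \exp\!\left(\frac{N\,|\mathbf{k}|\,z}{f}\right), \qquad z \le 0 .
```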

Relevance:

90.00%

Publisher:

Abstract:

Dissertation (master's)—Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Civil e Ambiental, 2015.

Relevance:

90.00%

Publisher:

Abstract:

Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and more chemically complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. Despite its effectiveness, however, TEM can only provide 2-dimensional projection (shadow) images of the 3D structure, leaving the 3-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool for exploring the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the ET tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations.

A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of experimental conditions on the quality of the reconstructed ET tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed ET tomogram, which motivates the development of an improved tomographic reconstruction process.

This thesis therefore proposes a novel ET method, named dictionary learning electron tomography (DLET). DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) adaptively and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, sparsity is applied to overlapping image patches, favouring local structures; furthermore, the dictionary is adapted to the specific tomogram instance, thereby favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm is an alternating procedure that learns the sparsifying dictionary and employs it to remove artifacts and noise in one step, and then restores the tomogram data in the other step. Simulated and real ET experiments on several morphologies were performed with a variety of setups. The reconstruction results validate the method's efficiency in both noiseless and noisy cases and show that it yields improved reconstruction quality with fast convergence. The proposed method recovers high-fidelity information without the need to choose a sparsifying transform in advance or to ensure that the images strictly satisfy the preconditions of a particular transform (e.g., strictly piecewise-constant images for Total Variation minimisation). It likewise avoids the artifacts that specific sparsifying transforms can introduce (e.g., the staircase artifacts that may result from Total Variation minimisation).
Moreover, this thesis shows how reliable, elementally sensitive tomography using electron energy loss spectroscopy (EELS) is possible through the combination of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss EELS from nanoparticles of an industrially important material. Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
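
To indicate the overall shape of such an alternating scheme (not the thesis's DLET algorithm, parameters, or patch/dictionary sizes), here is a toy 2D Python sketch using scikit-image's Radon transform and scikit-learn's dictionary learning: one step pushes the tilt-series residual back into the reconstruction, the other re-synthesises the image from sparse codes over a dictionary learned on its own patches. API details assume recent scikit-image (filter_name) and scikit-learn (max_iter) versions.

```python
# Schematic alternating reconstruction in the spirit of dictionary-learning
# tomography, on a 2D phantom with a heavily undersampled set of projections.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, resize
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (
    extract_patches_2d, reconstruct_from_patches_2d)

img = resize(shepp_logan_phantom(), (64, 64))
angles = np.linspace(0.0, 180.0, 15, endpoint=False)  # undersampled tilt series
sino = radon(img, theta=angles)                       # "measured" projections

x = iradon(sino, theta=angles)                        # FBP initialisation
for it in range(5):
    # (1) Data-consistency step: backproject the sinogram residual.
    resid = sino - radon(x, theta=angles)
    x = x + 0.1 * iradon(resid, theta=angles, filter_name=None)
    # (2) Sparsifying step: learn a dictionary on patches, re-synthesise from
    # sparse codes (OMP), and reassemble the image by averaging overlaps.
    patches = extract_patches_2d(x, (8, 8))
    flat = patches.reshape(len(patches), -1)
    mean = flat.mean(axis=1, keepdims=True)
    dico = MiniBatchDictionaryLearning(
        n_components=32, transform_algorithm="omp",
        transform_n_nonzero_coefs=4, max_iter=10, random_state=0,
    ).fit(flat - mean)
    codes = dico.transform(flat - mean)
    denoised = codes @ dico.components_ + mean
    x = reconstruct_from_patches_2d(denoised.reshape(patches.shape), x.shape)

print("relative error:", np.linalg.norm(x - img) / np.linalg.norm(img))
```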