838 results for generative and performative modeling


Relevance: 100.00%

Abstract:

While news stories are an important traditional medium to broadcast and consume news, microblogging has recently emerged as a place where people can discuss, disseminate, collect, or report information about news. However, the massive volume of information in the microblogosphere makes it hard for readers to keep up with these real-time updates. This is especially a problem when it comes to breaking news, where people are more eager to know "what is happening". Therefore, this dissertation is intended as an exploratory effort to investigate computational methods to augment human effort when monitoring the development of breaking news on a given topic from a microblog stream by extractively summarizing the updates in a timely manner. More specifically, given an interest in a topic, either entered as a query or presented as an initial news report, a microblog temporal summarization system is proposed to filter microblog posts from a stream with three primary concerns: topical relevance, novelty, and salience. Considering the relatively high arrival rate of microblog streams, a cascade framework consisting of three stages is proposed to progressively reduce the quantity of posts. For each step in the cascade, this dissertation studies methods that improve over current baselines. In the relevance filtering stage, query and document expansion techniques are applied to mitigate sparsity and vocabulary mismatch issues. The use of word embeddings as a basis for filtering is also explored, using unsupervised and supervised modeling to characterize lexical and semantic similarity. In the novelty filtering stage, several statistical ways of characterizing novelty are investigated and ensemble learning techniques are used to integrate results from these diverse techniques. These results are compared with a baseline clustering approach using both standard and delay-discounted measures. In the salience filtering stage, because of the real-time prediction requirement, a method of learning verb phrase usage from past relevant news reports is used in conjunction with some standard measures for characterizing writing quality. Following a Cranfield-like evaluation paradigm, this dissertation includes a series of experiments to evaluate the proposed methods for each step, and for the end-to-end system. New microblog novelty and salience judgments are created, building on existing relevance judgments from the TREC Microblog track. The results point to future research directions at the intersection of social media, computational journalism, information retrieval, automatic summarization, and machine learning.
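
A minimal sketch of the three-stage cascade described above, treating each stage as a boolean filter so that cheap relevance filtering shields the costlier novelty and salience stages; the keyword and length checks are toy stand-ins for the learned models:

```python
from typing import Callable, Iterable, Iterator

def cascade(posts: Iterable[str],
            relevant: Callable[[str], bool],
            novel: Callable[[str], bool],
            salient: Callable[[str], bool]) -> Iterator[str]:
    """Apply the three filters in order; short-circuiting means later
    (costlier) stages only see posts that survived the earlier ones."""
    for post in posts:
        if relevant(post) and novel(post) and salient(post):
            yield post

seen = set()

def novel_check(post: str) -> bool:
    """Crude novelty: reject posts whose token set was already emitted."""
    key = frozenset(post.lower().split())
    if key in seen:
        return False
    seen.add(key)
    return True

stream = ["quake hits city", "quake hits city", "officials confirm quake damage"]
updates = cascade(stream, lambda p: "quake" in p, novel_check, lambda p: len(p) > 10)
print(list(updates))   # ['quake hits city', 'officials confirm quake damage']
```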

Relevance: 100.00%

Abstract:

In this dissertation, I demonstrate how improvisations within the structures of performance during Montserrat’s annual festivals produce “rhythms of change” that contribute to the formation of cultural identities. Montserrat is a small island of 39.5 square miles in the Caribbean’s Leeward Islands, and a volcanic disaster in the 1990s led to the loss of villages, homes, and material possessions. The crisis resulted in mass displacement and emigration, and today’s remaining population of 5,000 is now in a stage of post-volcano redevelopment. The reliability of written archives for establishing cultural knowledge is tenuous, and the community is faced with re-energizing cherished cultural traditions. This ethnographic research traces my embodied search for Montserrat’s history through an archive that is itself intangible and performative. Festivals produce some of the island’s most visible and culturally political events, and music and dance performances prompt on- and off-stage discussions about the island’s multifaceted heritage. The festival cycle provides the structure for ongoing renegotiations of what it means to be “Montserratian.” I focus especially on the island’s often-discussed and debated “triangular” heritage of Irishness, Africanness, and Montserratianness as it is performed during the festivals. Through my meanderings along the winding hilly roads of Montserrat, I explored reconfigurations of cultural memory through the island’s masquerade dance tradition and other festival celebrations. In this work, I introduce a “Cast of Characters,” each of whose scholarly, artistic, and public service work on Montserrat contributes to the shape and transformation of the island’s post-volcano cultural identities today. This dissertation is about the kinesthetic transmission of shared (and sometimes unshared) cultural knowledge, the substance of which echoes in the rhythms of Montserrat’s music and dance practices today.

Relevance: 100.00%

Abstract:

This dissertation focuses on design challenges caused by secondary impacts to printed wiring assemblies (PWAs) within hand-held electronics due to accidental drop or impact loading. The continuing increase of functionality, miniaturization, and affordability has resulted in a decrease in the size and weight of handheld electronic products. As a result, PWAs have become thinner and the clearances between surrounding structures have decreased. The resulting increase in the flexibility of PWAs, in combination with the reduced clearances, requires new design rules to minimize, and survive, possible internal collisions between PWAs and surrounding structures. Such collisions are termed ‘secondary impact’ in this study. The effect of secondary impact on the board-level drop reliability of printed wiring boards (PWBs) assembled with MEMS microphone components is investigated using a combination of testing, response and stress analysis, and damage modeling. The response analysis is conducted using a combination of numerical finite element modeling and simplified analytic models for additional parametric sensitivity studies.
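
For intuition about the simplified analytic side, a minimal single-degree-of-freedom sketch (an assumption of this note, not necessarily the dissertation's model): treat the PWA as responding in its fundamental mode with natural frequency ω_n, so a drop from height h gives an impact velocity and peak response of

```latex
\[
v = \sqrt{2gh}, \qquad
\delta_{\max} = \frac{v}{\omega_n}, \qquad
a_{\max} = \omega_n v .
\]
```

Secondary impact occurs when δ_max exceeds the PWA-to-structure clearance, which is why thinner, more flexible boards with smaller gaps demand new design rules.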

Relevance: 100.00%

Abstract:

To understand the evolution of bipedalism among the hominoids in an ecological context we need to be able to estimate the energetic cost of locomotion in fossil forms. Ideally such an estimate would be based entirely on morphology since, except for the rare instances where footprints are preserved, this is the only primary source of evidence available. In this paper we use evolutionary robotics techniques (genetic algorithms, pattern generators and mechanical modeling) to produce a biomimetic simulation of bipedalism based on human body dimensions. The mechanical simulation is a seven-segment, two-dimensional model with motive force provided by tension generators representing the major muscle groups acting around the lower-limb joints. Metabolic energy costs are calculated from the muscle model, and bipedal gait is generated using a finite-state pattern generator whose parameters are produced using a genetic algorithm with locomotor economy (maximum distance for a fixed energy cost) as the fitness criterion. The model is validated by comparing the values it generates with those for modern humans. The result (maximum efficiency of 200 J m⁻¹) is within 15% of the experimentally derived value, which is very encouraging and suggests that this is a useful analytic technique for investigating the locomotor behaviour of fossil forms. Initial work suggests that in the future this technique could be used to estimate other locomotor parameters such as top speed. In addition, the animations produced by this technique are qualitatively very convincing, which suggests that this may also be a useful technique for visualizing bipedal locomotion.
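
A minimal sketch of the optimization loop described above, with one-point crossover and Gaussian mutation; simulate() is a hypothetical surrogate standing in for the seven-segment mechanical model's distance-per-fixed-energy score:

```python
import random

def simulate(params):
    """Stand-in fitness: distance travelled on a fixed energy budget.
    Peaks at an arbitrary 'ideal' parameter vector, for illustration only."""
    return -sum((p - 0.6) ** 2 for p in params)

def evolve(n_params=8, pop_size=40, generations=100,
           mutation_rate=0.1, sigma=0.05):
    pop = [[random.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=simulate, reverse=True)      # rank by locomotor economy
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params)   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g + random.gauss(0.0, sigma) if random.random() < mutation_rate
                     else g for g in child]       # Gaussian mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=simulate)

best = evolve()
print(simulate(best))   # approaches 0 as the gait parameters converge
```

In the paper's setup the chromosome would encode the finite-state pattern generator's parameters, and fitness would come from the full mechanical simulation rather than a surrogate.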

Relevance: 100.00%

Abstract:

International audience

Relevance: 100.00%

Abstract:

Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of such critical services as e-commerce, remote login, and anonymity networks, and the increasing feasibility of attacks on these services, represent a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is our use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic, further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process, including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable to, and in some cases more efficient than, generic constant-rate defenses. We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, we use this model to better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that we can use to match the cover traffic to the real traffic while simultaneously bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide a 3-5x improvement in bandwidth for bulk transfers and a 2.5-9.5x speedup for Web browsing over tunneling without biasing.
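
A minimal sketch of the embedding idea, assuming packet sizes and inter-arrival gaps are drawn from a fitted protocol behavior model (the SIZES and GAPS tables below are invented stand-ins); this is illustrative only, not the actual TrafficMimic design:

```python
import random
import time
from collections import deque

SIZES = [96, 576, 1460]    # hypothetical modeled packet sizes (bytes)
GAPS = [0.01, 0.05, 0.2]   # hypothetical modeled inter-arrival gaps (seconds)

def tunnel(real_queue: deque, n_packets: int):
    """Emit packets on the cover schedule; real bytes ride inside them and
    padding fills the rest, so the observable schedule never changes."""
    for _ in range(n_packets):
        size = random.choice(SIZES)
        time.sleep(random.choice(GAPS))          # follow the cover timing model
        payload = bytearray()
        while real_queue and len(payload) < size:
            chunk = real_queue.popleft()
            take = size - len(payload)
            payload += chunk[:take]
            if len(chunk) > take:
                real_queue.appendleft(chunk[take:])  # requeue the remainder
        payload += b"\x00" * (size - len(payload))   # pad to the cover size
        yield bytes(payload)                     # would be encrypted on the wire

q = deque([b"GET /index.html HTTP/1.1\r\n\r\n"])
for pkt in tunnel(q, 3):
    print(len(pkt))
```

Because the schedule is fixed by the cover model regardless of queued real data, an observer sees the same packet trace whether or not real traffic is flowing.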

Relevance: 100.00%

Abstract:

The opportunity to produce microalgal biomass has attracted interest because of the many uses that biomass can have, whether in bioenergy production, as a food source, or as a product of carbon dioxide biofixation. In general, large-scale production of cyanobacteria and microalgae is monitored through offline physicochemical analyses. In this context, the objective of this work was to monitor cell concentration in a raceway photobioreactor for microalgal biomass production using digital data acquisition and process control techniques, through inline acquisition of illuminance, biomass concentration, temperature, and pH data. To that end, it was necessary to build a software-based sensor capable of determining microalgal biomass concentration from optical measurements of the intensity of scattered monochromatic radiation, and to develop a mathematical model of microalgal biomass production on a microcontroller, using a natural computing algorithm to fit the model. An autonomous system for recording cultivation data was designed, built, and tested during outdoor pilot-scale cultivation of Spirulina sp. LEB 18. A biomass concentration sensor based on measuring transmitted radiation was tested. In a second stage, an optical sensor for the biomass concentration of Spirulina sp. LEB 18 was conceived, built, and tested, based on measuring the intensity of radiation scattered by the cyanobacterial suspension, in a laboratory experiment under controlled conditions of illumination, temperature, and biomass suspension flow. From the light-scattering measurements, a neuro-fuzzy inference system was built to serve as a software sensor of biomass concentration during cultivation. Finally, from the biomass concentrations measured over time, the use of the Arduino platform for empirical modeling of growth kinetics with the Verhulst equation was explored. Measurements from the optical sensor based on the intensity of monochromatic radiation transmitted through the suspension, used under outdoor conditions, showed low correlation between biomass concentration and radiation, even at concentrations below 0.6 g/L. When optical scattering by the culture suspension was investigated, monochromatic radiation at 530 nm scattered at 45° and 90° increased linearly with concentration, with a coefficient of determination of 0.95 in both cases. It was possible to build a software-based biomass concentration sensor using the combined intensities of radiation scattered at 45° and 135°, with a coefficient of determination of 0.99. It is feasible to simultaneously perform inline determination of Spirulina cultivation process variables and empirical kinetic modeling of microorganism growth via the Verhulst equation on an Arduino microcontroller.
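
For reference, the Verhulst equation named above is the logistic growth law dX/dt = μX(1 − X/Xmax), whose closed-form solution can be fit to the sensor's concentration series. A minimal sketch with synthetic data (the actual cultivation data are not reproduced here, and the fit on the Arduino would use the natural-computing optimizer rather than SciPy):

```python
import numpy as np
from scipy.optimize import curve_fit

def verhulst(t, x0, xmax, mu):
    """Closed-form solution of dX/dt = mu * X * (1 - X / xmax)."""
    return xmax * x0 * np.exp(mu * t) / (xmax - x0 + x0 * np.exp(mu * t))

# Hypothetical biomass time series (g/L) as a software sensor might report it.
t = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8], dtype=float)      # days
x = np.array([0.15, 0.21, 0.30, 0.41, 0.52, 0.61, 0.68, 0.72, 0.74])

params, _ = curve_fit(verhulst, t, x, p0=(0.15, 0.8, 0.5))
x0, xmax, mu = params
print(f"X0 = {x0:.3f} g/L, Xmax = {xmax:.3f} g/L, mu = {mu:.3f} 1/day")
```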

Relevance: 100.00%

Abstract:

Reliability and dependability modeling can be employed during many stages of analysis of a computing system to gain insights into its critical behaviors. To provide useful results, realistic models of systems are often necessarily large and complex. Numerical analysis of these models presents a formidable challenge because the sizes of their state-space descriptions grow exponentially in proportion to the sizes of the models. On the other hand, simulation of the models requires analysis of many trajectories in order to compute statistically correct solutions. This dissertation presents a novel framework for performing both numerical analysis and simulation. The new numerical approach computes bounds on the solutions of transient measures in large continuous-time Markov chains (CTMCs). It extends existing path-based and uniformization-based methods by identifying sets of paths that are equivalent with respect to a reward measure and related to one another via a simple structural relationship. This relationship makes it possible for the approach to explore multiple paths at the same time, thus significantly increasing the number of paths that can be explored in a given amount of time. Furthermore, the use of a structured representation for the state space and the direct computation of the desired reward measure (without ever storing the solution vector) allow it to analyze very large models using a very small amount of storage. Often, path-based techniques must compute many paths to obtain tight bounds. In addition to presenting the basic path-based approach, we also present algorithms for computing more paths and tighter bounds quickly. One resulting approach is based on the concept of path composition whereby precomputed subpaths are composed to compute the whole paths efficiently. Another approach is based on selecting important paths (among a set of many paths) for evaluation. Many path-based techniques suffer from having to evaluate many (unimportant) paths. Evaluating the important ones helps to compute tight bounds efficiently and quickly.
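
For context, the classical uniformization construction that the path-based approach builds on (standard material, not the dissertation's extended algorithm): given a CTMC with generator Q, choose a rate Λ ≥ max_i |q_ii| and form the uniformized DTMC, so the transient distribution becomes a Poisson-weighted sum over path lengths,

```latex
\[
P = I + \frac{Q}{\Lambda}, \qquad
\pi(t) = \sum_{k=0}^{\infty} e^{-\Lambda t}\, \frac{(\Lambda t)^k}{k!}\; \pi(0)\, P^k .
\]
```

Truncating the sum and exploring only some paths through P yields computable lower and upper bounds on transient reward measures, which the framework then tightens by exploring sets of equivalent paths jointly.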

Relevance: 100.00%

Abstract:

The purpose of this research is to study sedimentation mechanisms, by mathematical modeling, in access channels that are affected by tidal currents. The most important factor for understanding the sedimentation process in any body of water is the flow pattern of that environment, which is in turn shaped by the geometry and shape of the environment as well as the forcings acting on the area. The area under study in this thesis is Bushehr Gulf and its access channels (inner and outer). The study uses hydrodynamic modeling on unstructured, non-overlapping triangular grids with the finite volume method, at two scales, large (200 m to 7.5 km) and small (50 m to 7.5 km), over two time spans of 15 days and 3.5 days, to obtain the flow patterns. The 2D governing equations used in the model are the depth-averaged shallow water equations. Turbulence modeling is required to calculate the eddy viscosity coefficient, using the Smagorinsky model with a coefficient of 0.3. In addition to the flow modeling at the two scales, the 3.5-day tidal current modeling data were used to study sediment equilibrium in the area and the channels. The model is capable of mapping the areas being settled and eroded and of identifying the effects of tidal currents on these processes. The required input data, such as current and sediment measurements, were obtained in Bushehr Gulf and the access channels as part of a PSO (Port and Shipping Organization) project titled "The Sedimentation Modeling in Bushehr Port" in 1379. Hydrographic data were obtained from Admiralty charts (2003) and the Cartography Organization (1378, 1379). The results of the modeling include cross-shore currents along the northern and northwestern coasts of Bushehr Gulf during the neap tide, and similar currents along the northern and northeastern coasts of the Gulf during the spring tide. These currents wash fine particles (silt, clay, and mud) from the coastal bed, which is generally made of mud and clay with some silt, and carry them away. In this regard, the role of sediments around the islands of this area, including islands built from deposits of dredged sediments, should not be ignored. The 3.5-day modeling shows that cross-channel currents create settlement zones in the inner and outer channels over the tidal period. During neap tide the current enters from the upstream bend of the two channels and the outer channel, then crosses the outer channel obliquely in some places. Oblique, or even nearly perpendicular, currents coming down the upper slope of the inner channel between buoys No. 15 and No. 18 interact with the currents running parallel to the channel, creating secondary oblique currents that exit as down-slope currents; these deposit the sediments they carry and allow suspended sediments to settle. In addition, in the outer channel, the speed of the channel-parallel currents increases in the bend of the channel, which is naturally deeper, leading to erosion and suspension of sediments there. The suspended sediments carried by this current, which runs parallel to the channel axis, slow down when they pass through the shallower part of the channel, between buoys No. 7 and 8 and buoys No. 5 and 6.
There the suspended sediment settles, and because of this process these places become even shallower. Furthermore, the passage of oblique upstream currents leads to settlement of sediments on the up-slope side, further decreasing the depth of these locations. On the contrary, on the down-slope side of the channel, the current and sediment modeling results indicate that the current speed increases, and the currents suspend and carry away particles from the down-slope of the channel. Thus, sediments have settled over a vast area downstream of both channels. At the end of the neap tide, this process, together with circulations in the area, produces eddies that cause sedimentation there. During spring tide, some of the sediment from these active deposition sites re-enters both channels in the reverse process. The above processes, and the locations of sedimentation and erosion in the inner and outer channels, are validated by the sediment equilibrium modeling. The model is able to estimate the suspended load, bed load, and boundary layer thickness at each point of both channels and across the modeled area.
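
For reference, a generic statement of the governing equations named above (continuity and x-momentum shown; the y-momentum equation is analogous), with h the total depth, (u, v) the depth-averaged velocities, η the surface elevation, and τ_bx the bed shear stress; the thesis's exact formulation may include further terms:

```latex
\[
\frac{\partial h}{\partial t}
 + \frac{\partial (hu)}{\partial x}
 + \frac{\partial (hv)}{\partial y} = 0,
\]
\[
\frac{\partial (hu)}{\partial t}
 + \frac{\partial (hu^2)}{\partial x}
 + \frac{\partial (huv)}{\partial y}
 = -\,g h \frac{\partial \eta}{\partial x}
 - \frac{\tau_{bx}}{\rho}
 + \frac{\partial}{\partial x}\!\left(h \nu_t \frac{\partial u}{\partial x}\right)
 + \frac{\partial}{\partial y}\!\left(h \nu_t \frac{\partial u}{\partial y}\right),
\]
\[
\nu_t = (C_s \Delta)^2 \sqrt{2\, S_{ij} S_{ij}}, \qquad C_s = 0.3,
\]
```

where ν_t is the Smagorinsky eddy viscosity, Δ the local grid scale, and S_ij the depth-averaged strain-rate tensor.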

Relevance: 100.00%

Abstract:

Master's dissertation—Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Mecânica, 2015.

Relevance: 100.00%

Abstract:

The rapid growth of urban centers, emerging technologies, and the population's demand for new services call for directing efforts toward the development of smart cities. This concept has gained strength across the political, economic, social, academic, environmental, and civil sectors; in parallel, initiatives have been launched that lead toward the integration of infrastructure, technology, and services for citizens. In this context, one of the problems with the greatest impact on society is road safety. Mechanisms are needed that reduce accident rates, improve incident response, optimize urban mobility and municipal planning, help reduce fuel consumption and greenhouse gas emissions, and offer dynamic and effective information to travelers. This article describes two (2) approaches that contribute efficiently to this problem: video games as serious games, and intelligent transportation systems. Both approaches aim to prevent collisions, and their design and implementation require highly technological components (e.g. telematics and computing systems, artificial intelligence, image processing, and 3D modeling).

Relevance: 100.00%

Abstract:

Today’s snowmobile industry faces great challenges in the field of noise & vibration. The area of main concern is the pass-by noise restriction defined by the Society of Automotive Engineers (SAE) test standard J192, with a maximum sound pressure level of 78 dB(A) being required by many states and national parks. To continue to meet or beat this requirement without affecting machine performance, a deeper understanding of the sound transfer paths is required. This thesis examines the transfer paths created by the tunnel, rear suspension, drive shaft, and rubber composite track, with the primary source being suspension input through the ground. Using a combination of field experiments and analytical modeling, perspective was gained on which suspension and drive elements create the primary transfer paths. With further understanding of these paths, industry can tailor and fine-tune the approaches taken to control overall noise output.
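
For reference, the transfer-path decomposition such a study typically rests on (stated generically here; the thesis's own formulation may differ): the pass-by sound pressure is the sum over paths of an operational input filtered by that path's noise transfer function,

```latex
\[
p(\omega) = \sum_{i} \mathrm{NTF}_i(\omega)\, F_i(\omega),
\]
```

so ranking the magnitudes of the individual contributions identifies which suspension and drive elements dominate the 78 dB(A) budget.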

Relevance: 100.00%

Abstract:

We organized an international campaign to observe the blazar 0716+714 in the optical band. The observations took place from February 24, 2009 to February 26, 2009. The global campaign was carried out by observers from more than sixteen countries and resulted in an extended light curve nearly seventy-eight hours long. The analysis and the modeling of this light curve form the main work of this dissertation project. In the first part of this work, we present the time series and noise analyses of the data. The time series analysis utilizes discrete Fourier transform and wavelet analysis routines to search for periods in the light curve. We then present results of the noise analysis, which is based on the idea that each microvariability curve is a realization of the same underlying stochastic noise processes in the blazar jet. Neither recurring periods nor random noise can successfully explain the observed optical fluctuations. Hence, in the second part, we propose and develop a new model to account for the microvariability we see in blazar 0716+714. We propose that the microvariability is due to the emission from turbulent regions in the jet that are energized by the passage of relativistic shocks. Emission from each turbulent cell forms a pulse of emission, and when convolved with other pulses, yields the observed light curve. We use the model to obtain estimates of the physical parameters of the emission regions in the jet.
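
A toy sketch of the discrete-Fourier-transform period search, on synthetic, evenly sampled data (the campaign light curve itself is not reproduced here, and real monitoring data are unevenly sampled, which needs more careful treatment):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 78.0, 0.05)                    # hours; ~78 h campaign span
flux = (1.0 + 0.05 * np.sin(2 * np.pi * t / 6.0)
            + 0.02 * rng.standard_normal(t.size))  # injected 6 h signal + noise

# Power spectrum of the mean-subtracted light curve.
power = np.abs(np.fft.rfft(flux - flux.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])    # cycles per hour

peak = freqs[np.argmax(power[1:]) + 1]            # skip the zero-frequency bin
print(f"dominant period ~ {1.0 / peak:.2f} h")
```

A peak that persists across subintervals would argue for a real period; the dissertation finds that neither recurring periods nor pure noise fit, motivating the turbulence model.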

Relevance: 100.00%

Abstract:

Recreational abuse of the drugs cocaine, methamphetamine, and morphine continues to be prevalent in the United States of America and around the world. While numerous methods of detection exist for each drug, they are generally limited by the lifetime of the parent drug and its metabolites in the body. However, the covalent modification of endogenous proteins by these drugs of abuse may act as biomarkers of exposure and allow for extension of detection windows for these drugs beyond the lifetime of parent molecules or metabolites in the free fraction. Additionally, existence of covalently bound molecules arising from drug ingestion can offer insight into downstream toxicities associated with each of these drugs. This research investigated the metabolism of cocaine, methamphetamine, and morphine in common in vitro assay systems, specifically focusing on the generation of reactive intermediates and metabolites that have the potential to form covalent protein adducts. Results demonstrated the formation of covalent adduction products between biological cysteine thiols and reactive moieties on cocaine and morphine metabolites. Rigorous mass spectrometric analysis in conjunction with in vitro metabolic activation, pharmacogenetic reaction phenotyping, and computational modeling were utilized to characterize structures and mechanisms of formation for each resultant thiol adduction product. For cocaine, data collected demonstrated the formation of adduction products from a reactive arene epoxide intermediate, designating a novel metabolic pathway for cocaine. In the case of morphine, data expanded on known adduct-forming pathways using sensitive and selective analysis techniques, following the known reactive metabolite, morphinone, and a proposed novel metabolite, morphine quinone methide. Data collected in this study describe novel metabolic events for multiple important drugs of abuse, culminating in detection methods and mechanistic descriptors useful to both medical and forensic investigators when examining the toxicology associated with cocaine, methamphetamine, and morphine.

Relevance: 100.00%

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated capacity requirements of client virtual machines (VMs) while renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they are actually experiencing; on the other hand, administrators will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Network and Support Vector Machine, to accurately model the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these modeling tools. Third, we presented an approach to optimal VM sizing by employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
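
A minimal sketch of the second learner named above, Support Vector Machine regression mapping a VM's resource allocation to its expected performance; all features and numbers are hypothetical stand-ins for the thesis's benchmark data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical training rows: (CPU cap %, memory MB, I/O share) -> requests/s.
X = np.array([[25, 512, 10], [50, 1024, 20], [75, 2048, 40],
              [100, 4096, 80], [50, 2048, 40], [75, 1024, 20]], dtype=float)
y = np.array([120.0, 310.0, 560.0, 830.0, 400.0, 480.0])

# Scale features, then fit an RBF-kernel SVR performance model.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=5.0))
model.fit(X, y)

# Size a candidate VM by predicting its performance before renting it.
print(model.predict([[60.0, 1536.0, 30.0]]))
```

Such a model is what lets clients pick the smallest allocation that meets their SLA, and what a revenue-driven allocator can query when dividing a server among hosted VMs.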