815 results for Haptic rendering
Abstract:
With the exponentially increasing demands and uses of GIS data visualization systems, in areas such as urban planning, environment and climate change monitoring, weather simulation, and hydrographic gauging, research on and applications of geospatial vector and raster data visualization have become prevalent. However, we observe that current web GIS techniques are only suitable for static vector and raster data with no dynamic overlay layers. While it is desirable to enable visual exploration of large-scale dynamic vector and raster geospatial data in a web environment, improving the performance between backend datasets and the vector and raster applications remains a challenging technical issue. This dissertation addresses these challenging and unimplemented areas: how to provide a large-scale dynamic vector and raster data visualization service with dynamic overlay layers, accessible from various client devices through a standard web browser, and how to make that dynamic service as fast as the static one. To accomplish this, a large-scale dynamic vector and raster data visualization geographic information system based on parallel map tiling, together with a comprehensive performance-improvement solution, is proposed, designed, and implemented. The components include: quadtree-based indexing and parallel map tiling, the Legend String, vector data visualization with dynamic layer overlaying, vector data time-series visualization, an algorithm for vector data rendering, an algorithm for raster data re-projection, an algorithm for eliminating superfluous levels of detail, an algorithm for vector data gridding and re-grouping, and server-side cluster caching of vector and raster data.
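The abstract names quadtree-based indexing as the backbone of its parallel map tiling. As a rough illustration of how such tile addressing typically works, here is a minimal sketch of the standard Web Mercator quadtree scheme; it is not the dissertation's own code, and the function names are illustrative:

```python
import math

def latlon_to_tile(lat_deg: float, lon_deg: float, zoom: int):
    """Map a WGS84 coordinate to (x, y) tile indices at a zoom level
    under the common Web Mercator quadtree tiling scheme."""
    n = 2 ** zoom                                   # tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

def tile_to_quadkey(x: int, y: int, zoom: int) -> str:
    """Encode tile indices as a quadkey: one base-4 digit per zoom level,
    so a key prefix identifies the parent quadtree node."""
    digits = []
    for z in range(zoom, 0, -1):
        mask = 1 << (z - 1)
        digits.append(str((1 if x & mask else 0) + (2 if y & mask else 0)))
    return "".join(digits)

x, y = latlon_to_tile(40.7, -74.0, 12)
print(x, y, tile_to_quadkey(x, y, 12))
```

Because every tile is addressed independently, tiles can be rendered in parallel and cached by key, which is the kind of property parallel map tiling and server-side caching schemes exploit.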
Abstract:
A trial judge serves as gatekeeper in the courtroom to ensure that only reliable expert witness testimony is presented to the jury. Nevertheless, research shows that while judges take their gatekeeper status seriously, legal professionals in general are unable to identify well-conducted research and unable to define falsifiability, error rates, peer review status, and scientific validity (Gatowski et al., 2001; Kovera & McAuliff, 2000). However, the ability to identify quality scientific research and define scientific concepts is critical to preventing "junk" science from entering courtrooms. Research thus far has neglected to address that before selecting expert witnesses, judges and attorneys must first evaluate experts' CVs, rather than their scientific testimony, to determine whether legal standards of admissibility have been met. The quality of expert testimony therefore largely depends on the ability to properly evaluate experts' credentials. Theoretical models of decision making suggest that ability/knowledge and motivation are required to process information systematically. Legal professionals (judges and attorneys) were expected to process CVs heuristically when rendering expert witness decisions due to a lack of training in areas of psychology expertise. Legal professionals' (N = 150) and undergraduate students' (N = 468) expert witness decisions were examined and compared. Participants were presented with one of two versions of a criminal case calling for the testimony of either a clinical psychology expert or an experimental legal psychology expert. Participants then read one of eight curricula vitae that varied in area of expertise (clinical vs. legal psychology), previous expert witness experience (previous experience vs. no previous experience), and scholarly publication record (30 publications vs. no publications) before deciding whether the expert was qualified to testify in the case. Follow-up measures assessed participants' decision-making processes. Legal professionals were no better than college students at rendering quality psychology expert witness admissibility decisions, yet they were significantly more confident in their decisions. Legal professionals rated themselves significantly higher than students in ability, knowledge, and motivation to choose an appropriate psychology expert, although their expert witness decisions were equally inadequate. Findings suggest that participants relied on heuristics, such as previous expert witness experience, to render decisions.
Abstract:
This research sought to understand the role that differentially assessed lands (lands in the United States given tax breaks in return for a guarantee that they remain in agriculture) play in influencing urban growth. Our method was to calibrate the SLEUTH urban growth model under two different conditions. The first used an excluded layer that ignored such lands, effectively rendering them available for development. The second treated those lands as totally excluded from development. Our hypothesis was that excluding those lands would yield better metrics of fit with past data. Our results validate this hypothesis: two different goodness-of-fit metrics both yielded higher values when differentially assessed lands were treated as excluded. This suggests that, at least in our study area, differential assessment, which protects farm and ranch lands for tenuous periods of time, has indeed allowed farmland to resist urban development. Including differentially assessed lands also yielded very different calibrated coefficients of growth as the model tried to account for the same growth patterns over two very different excluded areas. Excluded-layer design can greatly affect model behavior. Since differentially assessed lands are quite common throughout the United States and are often ignored in urban growth modeling, the findings of this research can assist other urban growth modelers in designing excluded layers that result in more accurate model calibration and thus forecasting.
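For readers unfamiliar with SLEUTH inputs: the excluded layer is a raster in which higher pixel values mean a lower probability of development (conventionally 0 = fully available, 100 = fully excluded). A minimal numpy sketch of the two scenarios described above, with hypothetical masks standing in for the study's actual land-cover data:

```python
import numpy as np

shape = (512, 512)  # hypothetical study-area grid
rng = np.random.default_rng(0)

# Hypothetical boolean rasters; real inputs would come from land-cover data.
water_mask = rng.random(shape) < 0.05          # excluded in both scenarios
parks_mask = rng.random(shape) < 0.03          # excluded in both scenarios
diff_assessed_mask = rng.random(shape) < 0.10  # differentially assessed lands

def build_excluded_layer(*masks):
    """SLEUTH-style excluded layer: 0 = available, 100 = fully excluded."""
    layer = np.zeros(shape, dtype=np.uint8)
    for m in masks:
        layer[m] = 100
    return layer

# Scenario 1: differentially assessed lands ignored (treated as developable).
excluded_v1 = build_excluded_layer(water_mask, parks_mask)
# Scenario 2: those lands treated as totally excluded from development.
excluded_v2 = build_excluded_layer(water_mask, parks_mask, diff_assessed_mask)
```

Calibrating the model once with each layer and comparing goodness-of-fit metrics is then exactly the experiment the abstract describes.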
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they actually experience, while administrators will be able to maximize their total revenue by exploiting application performance models and SLAs. This thesis made the following contributions. First, we identified resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications. Moreover, we suggested and evaluated modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue of a data center.
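As a sketch of the second contribution, a Support Vector Machine regressor can be fit to observed (resource allocation, performance) pairs; the synthetic data and hyperparameters below are purely illustrative, not the thesis's actual setup:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical training data: each row is one allocation tried on a VM
# (CPU cap %, memory MB, I/O bandwidth share); y is measured performance.
rng = np.random.default_rng(1)
X = rng.uniform([10, 256, 0.1], [100, 4096, 1.0], size=(200, 3))
y = 0.5 * X[:, 0] + 0.01 * X[:, 1] + 40 * X[:, 2] + rng.normal(0, 2, 200)

# SVR with an RBF kernel, one of the two learners named above; feature
# scaling matters because the inputs span very different ranges.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X, y)

# Predicted performance of a candidate VM size feeds SLA-aware sizing:
print(model.predict([[50, 2048, 0.5]]))
```

An optimal-sizing routine can then search the allocation space for the cheapest configuration whose predicted performance meets the SLA.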
Abstract:
Mutualistic symbioses between scleractinian corals and endosymbiotic dinoflagellates (Symbiodinium spp.) are the foundation of coral reef ecosystems. For many coral-algal symbioses, prolonged episodes of thermal stress damage the symbiont's photosynthetic capability, resulting in its expulsion from the host. Despite the link between photosynthetic competency and symbiont expulsion, little is known about the effect of thermal stress on the expression of photosystem genes in Symbiodinium. This study used real-time PCR to monitor the transcript abundance of two important photosynthetic reaction center genes, psbA (encoding the D1 protein of photosystem II) and psaA (encoding the P700 protein of photosystem I), in four cultured isolates (representing ITS2-types A13, A20, B1, and F2) and two in hospite Symbiodinium spp. within the coral Pocillopora spp. (ITS2-types C1b-c and D1). Both cultured and in hospite Symbiodinium samples were exposed to elevated temperatures (32°C) over a 7-day period and examined for changes in photochemistry and transcript abundance. Symbiodinium A13 and C1b-c (both thermally sensitive) demonstrated significant declines in both psbA and psaA during the thermal stress treatment, whereas the transcript levels of the other Symbiodinium types remained stable. The downregulation of both core photosystem genes could be the result of several different physiological mechanisms, but may ultimately limit repair rates of photosynthetic proteins, rendering some Symbiodinium spp. especially susceptible to thermal stress.
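The abstract does not state how transcript abundance was quantified, but a common choice for real-time PCR data is the Livak 2^-ΔΔCt method, which normalizes the target gene to a reference gene and compares the stressed sample against the control. A minimal sketch under that assumption (all Ct values hypothetical):

```python
def fold_change_ddct(ct_target_treat, ct_ref_treat, ct_target_ctrl, ct_ref_ctrl):
    """Relative transcript abundance by the Livak 2^-ddCt method."""
    d_ct_treat = ct_target_treat - ct_ref_treat   # normalize treated sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl      # normalize control sample
    dd_ct = d_ct_treat - d_ct_ctrl
    return 2.0 ** (-dd_ct)

# Hypothetical Ct values for psbA at 32°C versus control:
print(fold_change_ddct(26.1, 18.0, 24.3, 18.1))   # < 1 indicates downregulation
```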
Abstract:
While it may be argued that aggression against women is part of a culture of violence deeply rooted in Spanish society, the gender-related violence that exists in today's Spain is more specifically a legacy of Franco's dictatorship (1939-1975). Franco's Spain endorsed unequal gender relations, championed patriarchal dominance and power over women, and imposed models of hegemonic and authoritarian masculinity that internalized violence by rendering it a feature inseparable from manhood and virility. This dissertation provides a comprehensive analysis of masculinity and gender violence in Franco's Spain by analyzing the novel as the primary cultural vehicle of social criticism and political dissent against the new regime during a period (1939-1962) dominated by silence and censorship. The first part of this work defines and elucidates the concepts of masculinity and gender violence and the relationship between them. It also compares the significant social and cultural achievements of Spanish women during the Second Republic (1931-1939) with the reactionary curbing of those achievements under Francoism. The second part presents a multidisciplinary analysis of masculinity and gender violence in three novels: Nada (1944) by Carmen Laforet, Juegos de manos (1954) by Juan Goytisolo, and Tiempo de silencio (1962) by Luis Martín-Santos. Through the literary representation of different models of masculinity and of the psychological and social parameters that encourage and incite gender violence, these authors conceptualize and express their political ideology, as well as their symbolic interpretation of Francoist Spain.
Abstract:
This dissertation studies the manipulation of particles using acoustic stimulation for applications in microfluidics and the templating of devices. The term particle is used here to denote any solid, liquid or gaseous material that has properties distinct from the fluid in which it is suspended. Manipulation means taking over the movement of the particles and positioning them in specified locations. Using devices microfabricated out of silicon, the behavior of particles under acoustic stimulation was studied with the main purpose of aligning the particles at either low-pressure zones, known as nodes, or high-pressure zones, known as antinodes. By aligning particles at the nodes in a flow system, the particles can be focused at the center or walls of a microchannel in order to ultimately separate them. Such separations are of high scientific importance, especially in the biomedical domain, since acoustophoresis provides a unique approach to separation based on density and compressibility, unparalleled by other techniques. Control and alignment of the particles in various geometries and configurations was successfully achieved by controlling the acoustic waves. Apart from their use in flow systems, a stationary suspended-particle device was developed to provide controllable light transmittance based on acoustic stimuli. Using a glass compartment and a carbon-particle suspension in an organic solvent, the device responded to acoustic stimulation by aligning the particles. The alignment of light-absorbing carbon particles afforded an increase in visible light transmittance as high as 84.5%, controlled by adjusting the frequency and amplitude of the acoustic wave. The device also demonstrated alignment memory, rendering it energy-efficient. A similar device for particles suspended in a monomer enabled the development of electrically conductive films based on networks of conductive particles. Elastomers doped with conductive metal particles were rendered surface-conductive at particle loadings as low as 1% by weight using acoustic focusing. The resulting films were flexible, had transparencies exceeding 80% in the visible spectrum (400-800 nm), and had electrical bulk conductivities exceeding 50 S/cm.
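As background to the alignment principle, here is a textbook half-wave resonance sketch, not taken from the dissertation (the channel width is hypothetical): driving a fluid-filled channel at f = c/(2w) sets up a standing pressure wave with a single node at the channel center, where particles with positive acoustic contrast collect.

```python
c_water = 1480.0   # speed of sound in water, m/s
width = 375e-6     # microchannel width, m (hypothetical)

# Fundamental half-wave resonance: one pressure node at the channel center.
f_fundamental = c_water / (2 * width)
print(f"drive frequency: {f_fundamental / 1e6:.2f} MHz")   # ~1.97 MHz

def pressure_nodes(n: int, w: float):
    """Node positions across the channel for the n-th harmonic,
    assuming pressure antinodes at the rigid walls."""
    return [(2 * k + 1) * w / (2 * n) for k in range(n)]

print(pressure_nodes(1, width))   # single node at w/2
print(pressure_nodes(2, width))   # nodes at w/4 and 3w/4, focusing two streams
```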
Abstract:
Data visualization is widely used to facilitate the comprehension of information and find relationships between data. One of the most widely used techniques for multivariate data (4 or more variables) visualization is the 2D scatterplot. This technique associates each data item with a visual mark in the following way: two variables are mapped to Cartesian coordinates so that a visual mark can be placed on the Cartesian plane; the other variables are mapped gradually to visual properties of the mark, such as size, color, and shape. As the number of variables to be visualized increases, the number of visual properties associated with the mark increases as well, and so does the complexity of the final visualization. However, increasing the complexity of the visualization does not necessarily imply a better visualization; sometimes the opposite occurs, producing a visually polluted and confusing result. This problem is called visual-property overload. This work investigates whether it is possible to work around the overload of the visual channel and improve insight into multivariate data through a modification of the 2D scatterplot technique. In this modification, we map the variables of data items to multisensory marks, composed not only of visual properties but also of haptic properties such as vibration, viscosity and elastic resistance. We believed this approach could ease the insight process by transposing properties from the visual channel to the haptic channel. The hypothesis was tested through experiments in which we analyzed (a) the accuracy of the answers, (b) response time, and (c) the degree of personal satisfaction with the proposed approach. However, the hypothesis was not validated. The results suggest an equivalence between the investigated visual and haptic properties in all analyzed aspects, though in strictly numeric terms the multisensory visualization achieved better results in response time and personal satisfaction.
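For concreteness, the classic scatterplot mapping described above looks like the following matplotlib sketch (synthetic data; the haptic variant would route variables 3 and 4 to vibration or viscosity instead of size and color):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
data = rng.random((100, 4))   # hypothetical 4-variable dataset, one row per item

plt.scatter(
    data[:, 0], data[:, 1],        # variables 1-2 -> Cartesian position
    s=20 + 180 * data[:, 2],       # variable 3   -> mark size
    c=data[:, 3],                  # variable 4   -> mark color
    cmap="viridis",
)
plt.colorbar(label="variable 4")
plt.xlabel("variable 1")
plt.ylabel("variable 2")
plt.show()
```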
Abstract:
Shadows and illumination play an important role in generating a realistic scene in computer graphics. Most Augmented Reality (AR) systems track markers placed in a real scene and retrieve their position and orientation to serve as a frame of reference for added computer-generated content, thereby producing an augmented scene. Realistic depiction of augmented content with coherent visual cues is a desired goal in many AR applications. However, rendering an augmented scene with realistic illumination is a complex task. Many existing approaches rely on a non-automated pre-processing phase to retrieve illumination parameters from the scene. Other techniques rely on specific markers containing light probes to estimate environment lighting. This study aims to design a method for creating AR applications with coherent illumination and shadows, using a textured cuboid marker, that does not require a training phase to provide lighting information. Such markers are easily found in common environments: most product packaging satisfies these characteristics. Thus, we propose a way to estimate a directional light configuration using multiple texture tracking to render AR scenes in a realistic fashion. We also propose a novel feature descriptor used to perform multiple texture tracking. Our descriptor, named the discrete descriptor, is an extension of the binary descriptor and outperforms current state-of-the-art methods in speed while maintaining their accuracy.
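Binary descriptors, which the proposed discrete descriptor extends, are matched by Hamming distance: the number of differing bits between two packed bit strings. A minimal matcher under that general assumption (this is not the paper's actual descriptor):

```python
import numpy as np

def hamming(d1: np.ndarray, d2: np.ndarray) -> int:
    """Hamming distance between descriptors stored as packed uint8 arrays."""
    return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())

def best_match(query: np.ndarray, database: np.ndarray) -> int:
    """Index of the database descriptor closest to the query."""
    return int(np.argmin([hamming(query, d) for d in database]))

rng = np.random.default_rng(3)
db = rng.integers(0, 256, size=(500, 32), dtype=np.uint8)  # 256-bit descriptors
query = db[42] ^ np.uint8(1)   # descriptor 42 with the low bit of each byte flipped
print(best_match(query, db))   # -> 42
```

The speed of binary-family descriptors in tracking loops comes from this XOR-and-popcount core, which is far cheaper than floating-point distance computations.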
Abstract:
The birth of the ecological movement in the 1960s motivated the conception of a new branch of Translation Studies known as Ecotranslation. This scarcely known theoretical framework proceeds from two main notions: firstly, the representation of nature in literature, and secondly, the importance of the different roles and interpretations with which nature can be endowed in literary works. On these bases, the goal of our pilot study was to apply this nature-centered approach to the translations of H. G. Wells' short story The Country of the Blind, as rendered into Spanish by Íñigo Jáuregui (2014) and Alfonso Hernández Catá (1919). The acknowledgement that Ecotranslation derives from a general awareness of nature, considering it an intrinsic feature of humankind that simultaneously influences and is affected by human behavior, motivated an analysis of the role Wells attributed to nature in the story, which evinced a strong correspondence between environment and society in the original text, where nature serves as an essential instrument to figuratively reflect social concerns. Setting off from that critical analysis, we compared how two chronologically separate translators rendered the natural elements of the original story into Spanish. In general terms, the data confirmed that Jáuregui's translation, published in 2014, takes a much more literal approach to the source text, rendering Wells' original terminology into the closest equivalent expressions in Spanish, while Hernández Catá seems to have focused his work on the idea of human control over nature, even when this decision meant altering the precise way in which Wells articulated his ideas.
Abstract:
A landfill represents a complex and dynamically evolving structure that can be stochastically perturbed by exogenous factors. Both thermodynamic (equilibrium) and time-varying (non-steady-state) properties of a landfill are affected by spatially heterogeneous and nonlinear subprocesses that combine with constraining initial and boundary conditions arising from the associated surroundings. While multiple attempts have been made to model landfill statistics by incorporating spatially dependent parameters on the one hand (the data-based approach) and continuum dynamical mass-balance equations on the other (equation-based modelling), practically no attempt has been made to amalgamate these two approaches while also incorporating the inherent stochastically induced fluctuations affecting the process overall. In this article, we implement a minimalist scheme for modelling the time evolution of a realistic three-dimensional landfill through a reaction-diffusion based approach, focusing on the coupled interactions of four key variables (solid, hydrolysed, acetogenic, and methanogenic mass densities), which are themselves stochastically affected by fluctuations and coupled with diffusive relaxation of the individual densities in the ambient surroundings. Our results indicate that close to the linearly stable limit, the large-time steady-state properties, arising out of a series of complex coupled interactions between the stochastically driven variables, are scarcely affected by the biochemical growth-decay statistics. Our results clearly show that an equilibrium landfill structure is primarily determined by the solid and hydrolysed mass densities alone, rendering the other variables statistically "irrelevant" in this (large-time) asymptotic limit. The other major implication of incorporating stochasticity in the landfill evolution dynamics is the hugely reduced production times of the plants, now approximately 20-30 years instead of the 50 years and above predicted by previous deterministic models. The predictions of this stochastic model are in conformity with available experimental observations.
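A minimal numerical sketch of such a scheme's structure, reduced to 1D for brevity (the article treats a 3D landfill; the rates, diffusivities and noise strength below are illustrative, not the article's values), couples the four densities through reaction terms, diffusive relaxation, and additive noise via an Euler-Maruyama step:

```python
import numpy as np

n, dx, dt, steps = 100, 1.0, 0.01, 5000
k1, k2, k3, noise = 0.05, 0.04, 0.03, 0.01     # illustrative rate constants
D = np.array([0.1, 0.2, 0.2, 0.2])             # diffusion coefficients

# Rows: solid, hydrolysed, acetogenic, methanogenic mass densities.
rho = np.ones((4, n))
rng = np.random.default_rng(4)

def laplacian(u):
    """Second-order finite-difference Laplacian with periodic boundaries."""
    return (np.roll(u, 1, axis=-1) - 2 * u + np.roll(u, -1, axis=-1)) / dx**2

for _ in range(steps):
    s, h, a, m = rho
    reaction = np.array([
        -k1 * s,            # solid mass is hydrolysed
        k1 * s - k2 * h,    # hydrolysed mass consumed by acetogens
        k2 * h - k3 * a,    # acetogenic mass feeds methanogens
        k3 * a,             # methanogenic mass accumulates
    ])
    rho += dt * (reaction + D[:, None] * laplacian(rho))          # drift
    rho += noise * np.sqrt(dt) * rng.standard_normal(rho.shape)   # fluctuations
    rho = np.clip(rho, 0.0, None)   # densities stay non-negative
```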
Abstract:
When referring to cinema and its emancipatory potential, realism, like Plato's pharmakon, has signified both illness and cure, poison and medicine. On the one hand, realism is regarded as the main feature of so-called classical cinema, inherently conservative and thoroughly ideological, its main raison d'être being to reify a particular version of the status quo, make it believable, and pass it off as 'reality' (Burch, 1990; MacCabe, 1974). On the other, realism has also been interpreted as a quest for truth and social justice, as in the positivist ethos that informs documentary (Zavattini, 1953). Even in the latter sense, however, the extent to which realism has served colonizing ends when used to investigate the 'truth' of the Other has also been noted, rendering the form profoundly suspect (Chow, 2007, p. 150). For realism has been a Western form of representation, one that can be traced back to the invention of perspective in painting and that peaked with the secular worldview brought about by the Enlightenment. And like realism, the nation state too is a product of the Enlightenment, nationalism being, as it were, a secular replacement for the religious - that is, enchanted or fantastic - worldview. In this way, realism, cinema and nation are inextricably linked, and equally strained under the current decline of the Enlightenment paradigm. This chapter looks at Alfonso Cuarón's Y tu Mamá También (2001), a highly successful road movie with documentary features, to explore the ways in which realism, cinema and nation interact under the present conditions of 'globalization' as experienced in Mexico. The chapter compares and contrasts various interpretations of the role of realism in this film put forward by critics and scholars, and other discourses about it circulating in the media, with actual ways in which audiences engage with it.
Abstract:
As complex radiotherapy techniques become more widely practiced, comprehensive 3D dosimetry is a growing necessity for advanced quality assurance. However, clinical implementation has been impeded by a wide variety of factors, including the expense of dedicated optical dosimeter readout tools, high operational costs, and overall difficulty of use. To address these issues, a novel dry-tank optical CT scanner was designed for PRESAGE 3D dosimeter readout, relying on 3D printed components and omitting costly parts of preceding optical scanners. This work details the design, prototyping, and basic commissioning of the Duke Integrated-lens Optical Scanner (DIOS).
The convex scanning geometry was designed in ScanSim, an in-house Monte Carlo optical ray-tracing simulation. ScanSim parameters were used to build a 3D rendering of a convex 'solid tank' for optical CT, which is capable of collimating a point light source into telecentric geometry without significant quantities of refractive-index-matched fluid. The model was 3D printed, processed, and converted into a negative mold via rubber casting to produce a transparent polyurethane scanning tank. The DIOS was assembled with the solid tank, a 3 W red LED light source, a computer-controlled rotation stage, and a 12-bit CCD camera. Initial optical phantom studies show negligible spatial inaccuracies in 2D projection images and 3D tomographic reconstructions. A PRESAGE 3D dose measurement for a 4-field box treatment plan from Eclipse shows 95% of voxels passing gamma analysis at 3%/3 mm criteria. Gamma analysis between tomographic images of the same dosimeter in the DIOS and DLOS systems shows 93.1% agreement at 5%/1 mm criteria. From this initial study, the DIOS has demonstrated promise as an economically viable optical CT scanner. However, further improvements will be necessary to develop this system fully into an accurate and reliable tool for advanced QA.
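For reference, the gamma analysis quoted above compares two dose distributions by combining a dose-difference (DD) and a distance-to-agreement (DTA) criterion. A simplified 2D global implementation follows (the study itself evaluates 3D data, and the array contents here are synthetic):

```python
import numpy as np

def gamma_pass_rate(ref, meas, dd=0.03, dta_mm=3.0, pix_mm=1.0, search=3):
    """Simplified global gamma: for each reference pixel, take the minimum
    over nearby measured pixels of
    sqrt((dose diff / DD)^2 + (distance / DTA)^2); a pixel passes when
    gamma <= 1."""
    dd_abs = dd * ref.max()                    # global dose-difference criterion
    gamma = np.full(ref.shape, np.inf)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            dist = np.hypot(dy * pix_mm, dx * pix_mm)
            if dist > dta_mm + pix_mm:
                continue                       # outside the DTA search radius
            shifted = np.roll(np.roll(meas, dy, axis=0), dx, axis=1)
            cand = np.sqrt(((shifted - ref) / dd_abs) ** 2 + (dist / dta_mm) ** 2)
            gamma = np.minimum(gamma, cand)
    return 100.0 * np.mean(gamma <= 1.0)

# Synthetic stand-ins for a planned and a measured dose plane:
rng = np.random.default_rng(5)
plan = rng.random((64, 64))
measured = plan + rng.normal(0, 0.01, plan.shape)
print(f"pass rate at 3%/3mm: {gamma_pass_rate(plan, measured):.1f}%")
```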
Pre-clinical animal studies are a conventional means of translational research, a midpoint between in vitro cell studies and clinical implementation. However, modern small animal radiotherapy platforms are primitive in comparison with conventional linear accelerators. This work therefore also investigates a series of 3D printed tools to expand the treatment capabilities of the X-RAD 225Cx orthovoltage irradiator, and applies them to a feasibility study of hippocampal avoidance in rodent whole-brain radiotherapy.
As an alternative to lead, a novel 3D-printable tungsten-composite ABS plastic, GMASS, was tested for creating precisely shaped blocks. Film studies show that virtually all primary radiation at 225 kVp can be attenuated by GMASS blocks of 0.5 cm thickness. The state-of-the-art BlockGen software was used to create custom hippocampus-shaped blocks from medical image data for any possible axial treatment field arrangement. A custom 3D printed bite block was developed to immobilize and position a supine rat for optimal hippocampal conformity. A CT of an immobilized rat with digitally inserted blocks was imported into the SmART-Plan Monte Carlo simulation software to determine the optimal beam arrangement. Protocols with 4 and 7 equally spaced fields were identified as viable treatment options, featuring improved hippocampal conformity and whole-brain coverage compared with prior lateral-opposed protocols. Custom rodent-morphic PRESAGE dosimeters were developed to accurately reflect these treatment scenarios, and a 3D dosimetry study was performed to confirm the SmART-Plan simulations. Measured doses indicate significant hippocampal sparing and moderate whole-brain coverage.
Abstract:
Paleotopographic models of the West Antarctic margin, which are essential for robust simulations of paleoclimate scenarios, lack information on sediment thickness and geodynamic conditions, resulting in large uncertainties. A new total sediment thickness grid spanning the Ross Sea-Amundsen Sea-Bellingshausen Sea basins is presented, based on all available seismic reflection, borehole, and gravity modeling data offshore West Antarctica. This grid was combined with NGDC's global 5-arc-minute grid of ocean sediment thickness (Whittaker et al., 2013, doi:10.1002/ggge.20181) and extends the NGDC grid further to the south. Sediment thickness along the West Antarctic margin tends to be 3-4 km greater than previously assumed. The sediment volumes in the Bellingshausen, Amundsen, and Ross Sea basins amount to 3.61, 3.58, and 2.78 million km³, respectively. The residual basement topography of the South Pacific has been revised, and the new data show an asymmetric trend across the Pacific-Antarctic Ridge. Values are anomalously high south of the spreading ridge and in the Ross Sea area, where the topography seems to be affected by persistent mantle processes. In contrast, the basement topography offshore Marie Byrd Land cannot be attributed to dynamic topography, but rather to crustal thickening due to intraplate volcanism. Present-day dynamic topography models disagree with the revised basement topography of the South Pacific presented here, rendering paleotopographic reconstructions based on such a limited dataset still fairly uncertain.
Abstract:
As human populations and resource consumption increase, it is increasingly important to monitor the quality of our environment. While laboratory instruments offer useful information, portable, easy-to-use sensors would allow environmental analysis to occur on-site, at lower cost, and with minimal operator training. We explore the synthesis, modification, and applications of modified polysiloxanes in environmental sensing. Multiple methods of producing modified siloxanes were investigated. Oligomers were formed from functionalized monomers, producing siloxane materials containing silicon hydride, methyl, and phenyl side chains. Silicon hydride-functionalized oligomers were further modified by hydrosilylation to incorporate methyl ester and naphthyl side chains. Modifications to the siloxane materials were also carried out using post-curing treatments. Methyl ester-functionalized siloxane was incorporated into the surface of a cured poly(dimethylsiloxane) film by siloxane equilibration. The materials containing methyl esters were hydrolyzed to reveal carboxylic acids, which could later be used for covalent protein immobilization. Finally, the siloxane surfaces were modified to incorporate antibodies by covalent, affinity, and adsorption-based attachment. These modifications were characterized by a variety of methods, including contact angle, attenuated total reflectance Fourier transform infrared spectroscopy, dye labels, and 1H nuclear magnetic resonance spectroscopy. The modified siloxane materials were employed in a variety of sensing schemes. Volatile organic compounds were detected using methyl-, phenyl-, and naphthyl-functionalized materials on a Fabry-Perot interferometer and a refractometer. The Fabry-Perot interferometer was found to detect the analytes, upon their extraction into the siloxane, through deformation of the Bragg reflectors. The refractometer was used to determine that naphthyl-functionalized siloxanes had elevated refractive indices, rendering these materials more sensitive to some analytes. Antibody-modified siloxanes were used to detect biological analytes through a solid-phase microextraction-mediated enzyme-linked immunosorbent assay (SPME ELISA). The SPME ELISA was found to have higher analyte sensitivity than a conventional ELISA system and was used to detect Escherichia coli at 8500 CFU/mL. These results demonstrate the variety of methods that can be used to modify siloxanes, and the wide range of applications of the modified materials has been shown through chemical and biological sensing schemes.