280 results for Eccentric Connectivity Polynomial
Abstract:
Ben Light puts forward an alternative way of thinking about how we engage with social networking sites, going beyond the emphasis upon connectivity that has been associated with research in the area to date. Analysing our engagements and disengagements with social networking sites in public (in cafes and at bus stops), at work (at desks, photocopiers and whilst cleaning), in our personal lives (where we cull friends and gossip on backchannels) and as related to our health and wellbeing (where we restrict our updates), he emphasises the importance of disconnection instead of connection. The book produces a theory of disconnective practice. This theory requires our attention to geographies of disconnection that include relations with a site, within a site, between sites and between sites and a physical world. Attention to disconnectors, both human and non-human, is required, and the modes by which disconnection can occur can then be revealed. Light argues that diversity in the exercise of power is key to understanding disconnective practice where social networking sites are concerned, and he suggests that the ethics of disconnection may also require interrogation.
Abstract:
Traditionally the notion of drawing in-situ has suggested the physical presence of the artist in the environment under scrutiny. The assumption here of enhanced connectivity, however, is hasty in light of the idea that situation implies a relative spatial value determined by the interplay of subject and location, where the possibility of not being “in-situ” is problematic. The fact that traditional drawing in-situ, such as the rendering of landscape, requires a framing of the world “out there” suggests a distance between the perceived object of representation and the drawing surface. Rather than suggesting that some drawing is situated and other sorts of drawing are not, however, I argue that situation or site is variously extended and intensified depending on the nature of mediation between surface and environment. The suggestion here is that site is not so much a precondition as a performative function, developed in the act of drawing and always implicating the drawing surface. In my discussion I focus on specific works by Toba Khedoori and Cameron Robbins. As well, in using my own recent drawing practice as a case study, I argue that the geography of site is delimited neither by horizon nor the boundaries of the paper. Rather, I propose that site and drawing surface coincide in variously intensive and extensive ways.
Abstract:
Peggy Shaw’s RUFF (USA 2013) and Queensland Theatre Company’s collaboration with Queensland University of Technology, Total Dik! (Australia 2013), overtly and evocatively draw on an aestheticized use of the cinematic techniques and technologies of Chroma Key to reveal the tensions in their production and add layers to their performances. In doing so, they offer invaluable insight where the filmic and theatrical approaches overlap. This paper draws on Eckersall, Grehan and Scheer’s New Media Dramaturgy (2014) to reposition the frame as a contribution to intermedial theatre and performance practices in light of increasing convergence between seemingly disparate discourses. In RUFF, the scenic environment replicates a chroma-key ‘studio’ which facilitates the reconstruction of memory displaced after a stroke. RUFF uses the screen and projections to recall crooners, lounge singers, movie stars, rock and roll bands, and an eclectic line of eccentric family members living inside Shaw. While the show pays tribute to those who have kept her company across decades of theatrical performance, the use of non-composited chroma-key technique as a theatrical device and the work’s taciturn revelation of the production process during performance play a central role in its exploration of the juxtaposition between its reconstructed form and content. In contrast, Total Dik! uses real-time green screen compositing during performance as a scenic device. Actors manipulate scale models, refocus cameras and generate scenes within scenes in the construction of the work’s examination of an isolated Dictator. The ‘studio’ is again replicated as a site for (re)construction, only in this case Total Dik! actively seeks to reveal the process of production as the performance plays out. Building on RUFF, and other works such as By the Way, Meet Vera Stark (2012) and Hotel Modern’s God’s Beard (2012), this work blends a convergence of mobile technologies, models, and green screen capture to explore aspects of transmedia storytelling in a theatrical environment (Jenkins, 2009, 2013). When a green screen is placed on stage, it reads at once as metaphor and challenge to the language of theatre. It becomes, or rather acts, as a ‘sign’ that alludes to the nature of the reconstructed, recomposited, manipulated and controlled. In RUFF and in Total Dik!, it is also a place where, as a mode of production and subsequent reveal, it adds weight to the performance. These works are informed by Auslander (1999) and Giesenkam (2007) and speak to and echo Lehmann’s Postdramatic Theatre (2006). This paper’s consideration of the integration of studio technique and live performance as a dynamic approach to multi-layered theatrical production develops our understanding of their combinatory use in a live performance environment.
Abstract:
Despite moral prohibitions on hurting other humans, some social contexts allow for harmful actions such as the killing of others. One example is warfare, where killing enemy soldiers is seen as morally justified. Yet, the neural underpinnings distinguishing between justified and unjustified killing are largely unknown. To improve understanding of the neural processes involved in justified and unjustified killing, participants had to imagine being the perpetrator whilst watching “first-person perspective” animated videos where they shot enemy soldiers (‘justified violence’) and innocent civilians (‘unjustified violence’). When participants imagined themselves shooting civilians compared to soldiers, greater activation was found in the lateral orbitofrontal cortex (OFC). Regression analysis revealed that the more guilt participants felt about shooting civilians, the greater the response in the lateral OFC. Effective connectivity analyses further revealed increased coupling between the lateral OFC and the temporoparietal junction (TPJ) when shooting civilians. The results show that the neural mechanisms typically implicated in harming others, such as the OFC, become less active when the violence against a particular group is seen as justified. This study therefore provides unique insight into how normal individuals can become aggressors in specific situations.
Abstract:
PURPOSE To assess the performance of the 2Win eccentric videorefractor in relation to subjective refraction and table-mounted autorefraction. METHODS Eighty-six eyes of 86 adults (46 male and 40 female subjects) aged between 20 and 25 years were examined. Subjective refraction and autorefraction using the table-mounted Topcon KR8800 and the handheld 2Win videorefractor were carried out in a randomized fashion by three different masked examiners. Measurements were repeated about 1 week later to assess instrument reproducibility, and the intertest variability was compared between techniques. Agreement of the 2Win videorefractor with subjective refraction and autorefraction was assessed for sphere and for cylindrical vectors at 0 degrees (J0) and 45 degrees (J45). RESULTS Reproducibility coefficients for sphere values measured by subjective refraction, Topcon KR8800, and 2Win (±0.42, ±0.70, and ±1.18, respectively) were better than their corresponding J0 (±1.0, ±0.85, and ±1.66) and J45 (±1.01, ±0.87, and ±1.31) vector components. The Topcon KR8800 showed the most reproducible values for mean spherical equivalent refraction and the J0 and J45 vector components, whereas reproducibility of the spherical component was best for subjective refraction. The 2Win videorefractor measurements were the least reproducible for all measures. None of the refractive components measured by the 2Win videorefractor differed significantly from those of subjective refraction in either session (p > 0.05). The Topcon KR8800 autorefractometer and the 2Win videorefractor measured significantly more positive spheres and mean spherical equivalent refraction (p < 0.0001), but the J0 and J45 vector components were similar (p > 0.05), in both sessions. CONCLUSIONS The 2Win videorefractor compares well, on average, with subjective refraction. The reproducibility values for the 2Win videorefractor were considerably worse than those of either subjective refraction or autorefraction. The wide limits of reproducibility of the 2Win videorefractor probably limit its usefulness as a primary screening device.
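The J0 and J45 components referred to above are the standard power-vector transform of a sphere/cylinder/axis refraction. As a minimal illustration (not the study's analysis code), the conversion can be sketched as follows:

```python
import math

def power_vectors(sphere, cylinder, axis_deg):
    """Convert a sphere/cylinder/axis refraction into power-vector components:
    M (mean spherical equivalent), J0 and J45 (Jackson cross-cylinder terms)."""
    axis = math.radians(axis_deg)
    m = sphere + cylinder / 2.0                       # mean spherical equivalent
    j0 = -(cylinder / 2.0) * math.cos(2 * axis)       # cross-cylinder at 0/90 degrees
    j45 = -(cylinder / 2.0) * math.sin(2 * axis)      # cross-cylinder at 45/135 degrees
    return m, j0, j45

# Example refraction: -2.00 / -1.50 x 180
print(power_vectors(-2.00, -1.50, 180))
```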
Abstract:
Nth-Dimensional Truncated Polynomial Ring (NTRU) is a lattice-based public-key cryptosystem that offers encryption and digital signature solutions. It was designed by Silverman, Hoffstein and Pipher. The NTRU cryptosystem was patented by NTRU Cryptosystems Inc. (which was later acquired by Security Innovations) and is available as the IEEE 1363.1 and X9.98 standards. NTRU is resistant to attacks based on quantum computing, to which the standard RSA and ECC public-key cryptosystems are vulnerable. In addition, NTRU offers performance advantages over these cryptosystems. Given the importance of NTRU, it is highly recommended to adopt NTRU as part of a cipher suite along with widely used cryptosystems for internet security protocols and applications. In this paper, we present our analytical study of the implementation of the NTRU encryption scheme, which serves as a guideline for security practitioners who are new to lattice-based cryptography or even to cryptography in general. In particular, we show some non-trivial issues that should be considered towards a secure and efficient NTRU implementation.
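As a purely illustrative sketch of the arithmetic at the heart of NTRU (not the standardized IEEE 1363.1/X9.98 scheme and in no way a secure implementation), the encryption step reduces to cyclic polynomial convolution in Z_q[x]/(x^N - 1). The toy parameter values and the convention that the small modulus p is folded into the public key are assumptions made for brevity:

```python
import numpy as np

def cyclic_convolve(a, b, N, q):
    """Multiply two length-N coefficient arrays in the ring Z_q[x]/(x^N - 1)."""
    result = np.zeros(N, dtype=np.int64)
    for i in range(N):
        for j in range(N):
            result[(i + j) % N] = (result[(i + j) % N] + int(a[i]) * int(b[j])) % q
    return result

# Toy illustration of the NTRU encryption step e = r*h + m (mod q), assuming the
# small modulus p is already folded into the public key h, as in common presentations.
N, q = 11, 32                           # toy parameters, far too small for real security
h = np.random.randint(0, q, N)          # placeholder public key (not derived from a real keypair)
r = np.random.randint(-1, 2, N)         # random blinding polynomial with small coefficients
m = np.random.randint(-1, 2, N)         # message polynomial with small coefficients
e = (cyclic_convolve(r, h, N, q) + m) % q
print(e)
```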
Abstract:
Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high frequency (1 Hz or higher) positioning data from GPS. When such map-matching algorithms are applied to low frequency data (such as data from a fleet of private cars, buses, light duty vehicles or smartphones), their performance drops to around 70% in terms of correct link identification, especially in urban and suburban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services such as estimating link travel time and speed from low frequency GPS data. Therefore, this paper develops a new weight-based shortest path and vehicle trajectory aided map-matching (stMM) algorithm that enhances the map-matching of low frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and vehicle trajectory are considered: one shortest path-based weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s, 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for the 30 s GPS data. When the information from the shortest path and vehicle trajectory is omitted, the accuracy of the algorithm drops to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
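The two additional weights can be pictured with a short sketch. The functional forms, the weighting scheme and the use of networkx's A* routine below are illustrative assumptions, not the calibrated weights of the stMM algorithm itself:

```python
import math
import networkx as nx

def shortest_path_weight(graph, prev_node, cand_node, trajectory_dist):
    """Score how closely the network shortest-path distance between the previously
    matched node and a candidate node agrees with the distance actually travelled
    by the vehicle between the two GPS fixes (A* with link lengths as edge weights)."""
    sp_dist = nx.astar_path_length(graph, prev_node, cand_node, weight="length")
    return 1.0 / (1.0 + abs(sp_dist - trajectory_dist))   # near 1 when the distances agree

def heading_weight(vehicle_heading_deg, link_bearing_deg):
    """Score the agreement between the vehicle trajectory heading and the bearing
    of a candidate link; links pointing away from the travel direction score zero."""
    diff = abs(vehicle_heading_deg - link_bearing_deg) % 360
    diff = min(diff, 360 - diff)
    return math.cos(math.radians(diff)) if diff < 90 else 0.0

# Tiny illustrative road graph: three nodes joined by links with a "length" attribute (metres).
G = nx.Graph()
G.add_edge("A", "B", length=120.0)
G.add_edge("B", "C", length=80.0)
print(shortest_path_weight(G, "A", "C", trajectory_dist=210.0))
print(heading_weight(vehicle_heading_deg=85.0, link_bearing_deg=90.0))
```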
Abstract:
This thesis describes, for the first time, the forces involved in the Nordic hamstring exercise, its reliability and the biomechanical effects of extra loading during the movement. The results provide practitioners with valuable information to enhance hamstring injury prevention and rehabilitation programs.
Abstract:
The future of civic engagement is characterised by both technological innovation as well as new technological user practices that are fuelled by trends towards mobile, personal devices; broadband connectivity; open data; urban interfaces; and, cloud computing. These technology trends are progressing at a rapid pace, and have led global technology vendors to package and sell the ‘Smart City’ as a centralized service delivery platform predicted to optimize and enhance cities’ key performance indicators – and generate a profitable market. The top-down deployment of these large and proprietary technology platforms has helped sectors such as energy, transport, and healthcare to increase efficiencies. However, an increasing number of scholars and commentators warn of another ‘IT bubble’ emerging. Along with some city leaders, they argue that the top-down approach does not fit the governance dynamics and values of a liberal democracy when applied across sectors. A thorough understanding is required of the socio-cultural nuances of how people work, live, and play across different environments, and how they employ social media and mobile devices to interact with, engage in, and constitute public realms. Although the term ‘slacktivism’ is sometimes used to denote a watered down version of civic engagement and activism that is reduced to clicking a ‘Like’ button and signing online petitions, we believe that we are far from witnessing another Biedermeier period that saw people focus on the domestic and the non-political. There is plenty of evidence to the contrary, such as post-election violence in Kenya in 2008, the Occupy movements in New York, Hong Kong and elsewhere, the Arab Spring, Stuttgart 21, Fukushima, the Taksim Gezi Park in Istanbul, and the Vinegar Movement in Brazil in 2013. These examples of civic action shape the dynamics of governments, and in turn, call for new processes to be incorporated into governance structures. Participatory research into these new processes across the triad of people, place and technology is a significant and timely investment to foster productive, sustainable, and livable human habitats. With this chapter, we want to reframe the current debates in academia and priorities in industry and government to allow citizens and civic actors to take their rightful place at the center of civic movements. This calls for new participatory approaches for co-inquiry and co-design. It is an evolving process with an explicit agenda to facilitate change, and we propose participatory action research (PAR) as an indispensable component in the journey to develop new governance infrastructures and practices for civic engagement. This chapter proposes participatory action research as a useful and fitting research paradigm to guide methodological considerations surrounding the study, design, development, and evaluation of civic technologies. We do not limit our definition of civic technologies to tools specifically designed to simply enhance government and governance, such as renewing your car registration online or casting your vote electronically on election day. Rather, we are interested in civic media and technologies that foster citizen engagement in the widest sense, and particularly the participatory design of such civic technologies that strive to involve citizens in political debate and action as well as question conventional approaches to political issues (DiSalvo, 2012; Dourish, 2010; Foth et al., 2013).
Following an outline of some underlying principles and assumptions behind participatory action research, especially as it applies to cities, we will critically review case studies to illustrate the application of this approach with a view to engender robust, inclusive, and dynamic societies built on the principles of engaged liberal democracy. The rationale for this approach is to offer an alternative to smart cities in a ‘perpetual tomorrow’ (cf. e.g. Dourish & Bell, 2011), based on the many weak and strong signals of civic action revolving around technology seen today. It seeks to emphasize and direct attention to active citizenry over passive consumerism, human actors over human factors, culture over infrastructure, and prosperity over efficiency. First, we will look at some fundamental issues arising from applying simplistic smart city visions to the kind of problem a city is (cf. Jacobs, 1961). We focus on the touch points between “the city” and its civic body, the citizens. In order to provide for meaningful civic engagement, the city must provide appropriate interfaces.
Abstract:
To analyse and compare standing thoracolumbar curves in normal weight participants and participants with obesity, using an electromagnetic device, and to analyse the measurement reliability. Material and Methods. A cross-sectional study was carried out. Thirty-six individuals were divided into two groups (normal-weight and participants with obesity) according to their waist circumference. The reference points (T1–T8–L1–L5 and both posterior superior iliac spines) were used to perform a description of thoracolumbar curvature in the sagittal and coronal planes. A transformation from the global coordinate system was performed and the thoracolumbar curves were fitted with fifth-order polynomial equations. The tangents of the first and fifth lumbar vertebrae and the first thoracic vertebra were determined from their derivatives. The reliability of the measurement was assessed according to the internal consistency of the measure, and the thoracolumbar curvature angles were compared between groups. Results. Cronbach’s alpha values ranged between 0.824 (95% CI: 0.776–0.847) and 0.918 (95% CI: 0.903–0.949). In the coronal plane, no significant differences were found between groups; however, in the sagittal plane, significant differences were observed for thoracic kyphosis. Conclusion. There were significant differences in thoracic kyphosis in the sagittal plane between the two groups of young adults grouped according to their waist circumference.
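A minimal sketch of the curve-fitting step described above (fifth-order polynomial fit and tangent angles read off the derivative), with entirely hypothetical marker coordinates and vertebra positions, might look like this:

```python
import numpy as np

# Minimal sketch (not the study's code): fit a fifth-order polynomial to marker
# coordinates along the spine and read tangent angles off the first derivative.
# z = normalised cranio-caudal position, x = sagittal-plane displacement of each marker.
z = np.array([0.00, 0.12, 0.25, 0.38, 0.52, 0.66, 0.80, 1.00])    # hypothetical values
x = np.array([0.00, 0.02, 0.05, 0.04, 0.01, -0.02, -0.03, 0.00])  # hypothetical values

coeffs = np.polyfit(z, x, 5)       # fifth-order polynomial, as in the study
poly = np.poly1d(coeffs)
dpoly = poly.deriv()               # first derivative gives the tangent slope

for label, zi in [("T1", 0.05), ("L1", 0.60), ("L5", 0.95)]:       # assumed vertebra positions
    angle = np.degrees(np.arctan(dpoly(zi)))
    print(f"{label}: tangent angle {angle:.1f} deg")
```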
Abstract:
This paper presents an uncertainty quantification study of the performance analysis of the high pressure ratio single stage radial-inflow turbine used in the Sundstrand Power Systems T-100 Multi-purpose Small Power Unit. A deterministic 3D volume-averaged Computational Fluid Dynamics (CFD) solver is coupled with a non-statistical generalized Polynomial Chaos (gPC) representation based on a pseudo-spectral projection method. One of the advantages of this approach is that it does not require any modification of the CFD code for the propagation of random disturbances in the aerodynamic and geometric fields. The stochastic results highlight the importance of the blade thickness and trailing edge tip radius on the total-to-static efficiency of the turbine compared to the angular velocity and trailing edge tip length. From a theoretical point of view, the use of the gPC representation on an arbitrary grid also allows the investigation of the sensitivity of the turbine efficiency to the blade thickness profiles. The gPC approach is also applied to coupled random parameters. The results show that the most influential coupled random variables are the trailing edge tip radius and the angular velocity.
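For readers unfamiliar with non-intrusive gPC, a one-dimensional pseudo-spectral projection can be sketched as follows. The toy model function, the Legendre basis (uniform input) and the quadrature order are assumptions for illustration only, not the study's multi-dimensional setup:

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

# One-dimensional non-intrusive (pseudo-spectral) gPC: the solver is treated as a
# black box evaluated at quadrature nodes; a toy function stands in for the CFD model.
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2      # placeholder for the deterministic solver

order = 4                                       # polynomial order of the expansion
nodes, weights = leggauss(order + 1)            # Gauss-Legendre quadrature on [-1, 1]
samples = model(nodes)                          # one solver run per quadrature node

coeffs = []
for k in range(order + 1):
    pk = legval(nodes, [0.0] * k + [1.0])       # Legendre polynomial P_k at the nodes
    ck = (2 * k + 1) / 2.0 * np.sum(weights * samples * pk)
    coeffs.append(ck)

mean = coeffs[0]                                                  # gPC mean
variance = sum(c**2 / (2 * k + 1) for k, c in enumerate(coeffs) if k > 0)  # gPC variance
print(mean, variance)
```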
Abstract:
We developed an analysis pipeline enabling population studies of HARDI data, and applied it to map genetic influences on fiber architecture in 90 twin subjects. We applied tensor-driven 3D fluid registration to HARDI, resampling the spherical fiber orientation distribution functions (ODFs) in appropriate Riemannian manifolds, after ODF regularization and sharpening. Fitting structural equation models (SEM) from quantitative genetics, we evaluated genetic influences on the Jensen-Shannon divergence (JSD), a novel measure of fiber spatial coherence, and on the generalized fiber anisotropy (GFA), a measure of fiber integrity. With random-effects regression, we mapped regions where diffusion profiles were highly correlated with subjects' intelligence quotient (IQ). Fiber complexity was predominantly under genetic control, and higher in more highly anisotropic regions; the proportion of genetic versus environmental control varied spatially. Our methods show promise for discovering genes affecting fiber connectivity in the brain.
Abstract:
We report the first 3D maps of genetic effects on brain fiber complexity. We analyzed HARDI brain imaging data from 90 young adult twins using an information-theoretic measure, the Jensen-Shannon divergence (JSD), to gauge the regional complexity of the white matter fiber orientation distribution functions (ODF). HARDI data were fluidly registered using Karcher means and ODF square-roots for interpolation; each subject's JSD map was computed from the spatial coherence of the ODFs in each voxel's neighborhood. We evaluated the genetic influences on generalized fiber anisotropy (GFA) and complexity (JSD) using structural equation models (SEM). At each voxel, genetic and environmental components of data variation were estimated, and their goodness of fit tested by permutation. Color-coded maps revealed that the optimal models varied for different brain regions. Fiber complexity was predominantly under genetic control, and was higher in more highly anisotropic regions. These methods show promise for discovering factors affecting fiber connectivity in the brain.
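A minimal sketch of the JSD computation on discretized ODFs, with an assumed neighbourhood-averaging step standing in for the spatial-coherence measure described above (not the paper's exact neighbourhood definition), could look like this:

```python
import numpy as np

def jensen_shannon(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two discretized ODFs, treated as
    probability distributions over sampled orientations (in nats)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))     # Kullback-Leibler divergence
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def voxel_jsd(center_odf, neighbour_odfs):
    """Per-voxel complexity score: mean JSD between the centre voxel's ODF and
    its neighbours (an assumed stand-in for the spatial-coherence measure)."""
    return np.mean([jensen_shannon(center_odf, n) for n in neighbour_odfs])

# Toy example with random 64-direction ODFs for one voxel and six neighbours.
rng = np.random.default_rng(0)
odfs = rng.random((7, 64))
print(voxel_jsd(odfs[0], odfs[1:]))
```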
Abstract:
A major challenge in neuroscience is finding which genes affect brain integrity, connectivity, and intellectual function. Discovering influential genes holds vast promise for neuroscience, but typical genome-wide searches assess approximately one million genetic variants one-by-one, leading to intractable false positive rates, even with vast samples of subjects. Even more intractable is the question of which genes interact and how they work together to affect brain connectivity. Here, we report a novel approach that discovers which genes contribute to brain wiring and fiber integrity at all pairs of points in a brain scan. We studied genetic correlations between thousands of points in human brain images from 472 twins and their nontwin siblings (mean age: 23.7 ± 2.1 SD years; 193 male/279 female). We combined clustering with genome-wide scanning to find brain systems with common genetic determination. We then filtered the image in a new way to boost power to find causal genes. Using network analysis, we found a network of genes that affect brain wiring in healthy young adults. Our new strategy makes it computationally more tractable to discover genes that affect brain integrity. The gene network showed small-world and scale-free topologies, suggesting efficiency in genetic interactions and resilience to network disruption. Genetic variants at hubs of the network influence intellectual performance by modulating associations between performance intelligence quotient and the integrity of major white matter tracts, such as the callosal genu and splenium, cingulum, optic radiations, and the superior longitudinal fasciculus.
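As an illustration of how the small-world character of such a gene network can be checked (using a synthetic graph in place of the actual gene network, which is not reproduced here), one might compare clustering and path length against a size-matched random graph:

```python
import networkx as nx

# Illustrative small-world check for a gene-gene interaction network.
# A synthetic Watts-Strogatz graph stands in for the actual gene network reported above.
G = nx.connected_watts_strogatz_graph(n=200, k=6, p=0.1, seed=0)

# Reference: a random graph with the same numbers of nodes and edges,
# restricted to its largest connected component so path lengths are defined.
R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=0)
R = R.subgraph(max(nx.connected_components(R), key=len)).copy()

# Small-world signature: clustering much higher than random, path length comparable.
sigma = ((nx.average_clustering(G) / nx.average_clustering(R)) /
         (nx.average_shortest_path_length(G) / nx.average_shortest_path_length(R)))
print("small-world coefficient (sigma > 1 suggests small-world topology):", round(sigma, 2))
```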
Abstract:
The study is the first to analyze genetic and environmental factors that affect brain fiber architecture and its genetic linkage with cognitive function. We assessed white matter integrity voxelwise using diffusion tensor imaging at high magnetic field (4 Tesla), in 92 identical and fraternal twins. White matter integrity, quantified using fractional anisotropy (FA), was used to fit structural equation models (SEM) at each point in the brain, generating three-dimensional maps of heritability. We visualized the anatomical profile of correlations between white matter integrity and full-scale, verbal, and performance intelligence quotients (FIQ, VIQ, and PIQ). White matter integrity (FA) was under strong genetic control and was highly heritable in bilateral frontal (a² = 0.55, p = 0.04, left; a² = 0.74, p = 0.006, right), bilateral parietal (a² = 0.85, p < 0.001, left; a² = 0.84, p < 0.001, right), and left occipital (a² = 0.76, p = 0.003) lobes, and was correlated with FIQ and PIQ in the cingulum, optic radiations, superior fronto-occipital fasciculus, internal capsule, callosal isthmus, and the corona radiata (p = 0.04 for FIQ and p = 0.01 for PIQ, corrected for multiple comparisons). In a cross-trait mapping approach, common genetic factors mediated the correlation between IQ and white matter integrity, suggesting a common physiological mechanism for both, and common genetic determination. These genetic brain maps reveal heritable aspects of white matter integrity and should expedite the discovery of single-nucleotide polymorphisms affecting fiber connectivity and cognition.
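The heritability estimates (a²) above come from voxelwise SEM fits; as a rough, simplified illustration of the same idea, a Falconer-style decomposition from monozygotic and dizygotic twin correlations on hypothetical FA data might look like this:

```python
import numpy as np

def falconer_ace(fa_mz_pairs, fa_dz_pairs):
    """Rough Falconer decomposition of variance in a voxel's FA values from
    monozygotic (MZ) and dizygotic (DZ) twin pairs. This is a simpler
    approximation than the structural equation models (SEM) fitted in the study."""
    r_mz = np.corrcoef(fa_mz_pairs[:, 0], fa_mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(fa_dz_pairs[:, 0], fa_dz_pairs[:, 1])[0, 1]
    a2 = 2 * (r_mz - r_dz)    # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz      # shared environment
    e2 = 1 - r_mz             # unique environment and measurement error
    return a2, c2, e2

# Hypothetical FA values for 30 MZ and 30 DZ pairs at one voxel.
rng = np.random.default_rng(0)
mz = rng.multivariate_normal([0.5, 0.5], [[0.01, 0.008], [0.008, 0.01]], size=30)
dz = rng.multivariate_normal([0.5, 0.5], [[0.01, 0.004], [0.004, 0.01]], size=30)
print(falconer_ace(mz, dz))
```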