808 results for Motion pictures in science.


Relevance: 100.00%

Abstract:

In spite of over two decades of intense research, illumination and pose invariance remain prohibitively challenging aspects of face recognition for most practical applications. The objective of this work is to recognize faces using video sequences both for training and recognition input, in a realistic, unconstrained setup in which lighting, pose and user motion pattern have a wide variability and face images are of low resolution. In particular there are three areas of novelty: (i) we show how a photometric model of image formation can be combined with a statistical model of generic face appearance variation, learnt offline, to generalize in the presence of extreme illumination changes; (ii) we use the smoothness of geodesically local appearance manifold structure and a robust same-identity likelihood to achieve invariance to unseen head poses; and (iii) we introduce an accurate video sequence "reillumination" algorithm to achieve robustness to face motion patterns in video. We describe a fully automatic recognition system based on the proposed method and an extensive evaluation on 171 individuals and over 1300 video sequences with extreme illumination, pose and head motion variation. On this challenging data set our system consistently demonstrated a nearly perfect recognition rate (over 99.7%), significantly outperforming state-of-the-art commercial software and methods from the literature. © Springer-Verlag Berlin Heidelberg 2006.
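
The set-to-set matching idea behind video-based recognition can be illustrated with a far simpler stand-in than the authors' manifold method: treat each video sequence as a set of appearance vectors and score identity by the minimum inter-set distance. This is a hedged sketch only, not the paper's algorithm; the data and names are synthetic.

```python
import numpy as np

def sequence_distance(seq_a, seq_b):
    """Minimum pairwise Euclidean distance between two sequences of
    appearance vectors (frames x features). A crude stand-in for a
    manifold-based same-identity likelihood."""
    # pairwise distance matrix of shape (n_a, n_b) via broadcasting
    d = np.linalg.norm(seq_a[:, None, :] - seq_b[None, :, :], axis=2)
    return d.min()

def identify(probe, gallery):
    """Return the gallery identity whose sequence is closest to the probe."""
    return min(gallery, key=lambda k: sequence_distance(probe, gallery[k]))

# synthetic example: two "identities" as clusters of feature vectors
rng = np.random.default_rng(0)
gallery = {"person_a": rng.normal(0.0, 0.1, (5, 4)),
           "person_b": rng.normal(5.0, 0.1, (5, 4))}
probe = rng.normal(0.0, 0.1, (3, 4))
```

Real systems replace the raw minimum distance with a likelihood that exploits the local smoothness of the appearance manifold, which is what gives robustness to unseen poses.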

Relevance: 100.00%

Abstract:

In conventional metals, there is plenty of space for dislocations-line defects whose motion results in permanent material deformation-to multiply, so that the metal strengths are controlled by dislocation interactions with grain boundaries(1,2) and other obstacles(3,4). For nano-structured materials, in contrast, dislocation multiplication is severely confined by the nanometre-scale geometries so that continued plasticity can be expected to be source-controlled. Nano-grained polycrystalline materials were found to be strong but brittle(5-9), because both nucleation and motion of dislocations are effectively suppressed by the nanoscale crystallites. Here we report a dislocation-nucleation-controlled mechanism in nano-twinned metals(10,11) in which there are plenty of dislocation nucleation sites but dislocation motion is not confined. We show that dislocation nucleation governs the strength of such materials, resulting in their softening below a critical twin thickness. Large-scale molecular dynamics simulations and a kinetic theory of dislocation nucleation in nano-twinned metals show that there exists a transition in deformation mechanism, occurring at a critical twin-boundary spacing for which strength is maximized. At this point, the classical Hall-Petch type of strengthening due to dislocation pile-up and cutting through twin planes switches to a dislocation-nucleation-controlled softening mechanism with twin-boundary migration resulting from nucleation and motion of partial dislocations parallel to the twin planes. Most previous studies(12,13) did not consider a sufficient range of twin thickness and therefore missed this strength-softening regime. The simulations indicate that the critical twin-boundary spacing for the onset of softening in nano-twinned copper and the maximum strength depend on the grain size: the smaller the grain size, the smaller the critical twin-boundary spacing, and the higher the maximum strength of the material.
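
The competition described above can be caricatured numerically: if deformation proceeds by whichever mechanism requires the lower stress, strength rises with decreasing twin spacing under Hall-Petch pile-up strengthening, then falls once nucleation-controlled softening takes over, peaking at the crossover. The constants below are arbitrary illustration, not values from the paper.

```python
import numpy as np

def strength(lam, k=100.0, c=10.0):
    """Toy strength vs twin-boundary spacing lam (hypothetical units).
    Hall-Petch pile-up stress ~ k/sqrt(lam) competes with a
    nucleation-controlled stress ~ c*lam; deformation takes the
    easier (lower-stress) mechanism, so realized strength is the min."""
    return np.minimum(k / np.sqrt(lam), c * lam)

lams = np.linspace(0.5, 50.0, 1000)   # twin-boundary spacing grid
s = strength(lams)
lam_crit = lams[np.argmax(s)]         # spacing of maximum strength
```

Analytically the crossover sits at `lam = (k/c)**(2/3)`, mirroring the paper's observation that the critical spacing, and hence the maximum strength, shifts with the material constants (which in turn depend on grain size).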

Relevance: 100.00%

Abstract:

C-13 and H-1 relaxation times were measured as a function of temperature in two magnetic fields for dilute solutions of phenolphthalein poly(ether sulfone) (PES-C) in deuterated chloroform. The spin-lattice relaxation times were interpreted in terms of segmental motion characterized by the sharp cutoff model of Jones and Stockmayer (J. S. model), which treats phenyl group rotation as stochastic diffusion. The restricted butterfly motion of the phenyl group attached to the cardo ring in PES-C is mentioned but not discussed in detail in this work. Correlation times for the segmental motion are in the picosecond range, which indicates the high flexibility of PES-C chains. The correlation time for the phenyl group internal rotation is similar to that of the segmental motion. The temperature dependence of these motions is weak: the apparent activation energy of the motions considered is less than 10 kJ/mol. The simulation results for PES are also reasonable, considering the differences in structure compared with PES-C. The correlation times and the apparent activation energy obtained using the J. S. model for the main-chain motion of PES-C are the same as those obtained using the damped orientational diffusion model and the conformational jump model.
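
The quoted apparent activation energy (under 10 kJ/mol) implies only a weak temperature dependence of the correlation times, which an Arrhenius form makes concrete. The picosecond-scale prefactor below is an assumed value for illustration, not a fitted one.

```python
import math

R = 8.314  # gas constant, J / (mol K)

def tau(T, tau0=1e-12, Ea=10e3):
    """Arrhenius correlation time in seconds.
    tau0 (assumed picosecond prefactor) and Ea (the <=10 kJ/mol
    barrier quoted above) are illustrative values only."""
    return tau0 * math.exp(Ea / (R * T))

# over a 70 K span the correlation time changes by less than a
# factor of ~3, i.e. a weak temperature dependence
ratio = tau(250.0) / tau(320.0)
```

For comparison, a barrier of ~40 kJ/mol would change the correlation time by more than an order of magnitude over the same span, which is why sub-10 kJ/mol barriers read as "weak" dependence.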

Relevance: 100.00%

Abstract:

Three high-performance polyimides, 1 (poly(ketone-imide), PKI), 2 (poly(ether-imide), PEI) and 3 (poly(oxy-imide), POI), were studied using nuclear magnetic resonance (NMR). The NMR spectra of the polyimides were assigned by comprehensively considering the substitution effects of the different substituent groups, together with distortionless enhancement by polarization transfer (DEPT), no nuclear Overhauser effect (NNE), relaxation-time analysis, and two-dimensional correlated spectroscopy (COSY) techniques. The structural units of these three polyimides were determined. Carbon-13 and proton relaxation times for PEI and PKI were interpreted in terms of segmental motion characterized by the sharp cutoff model of Jones and Stockmayer (JS model) and anisotropic group rotation such as phenyl and methyl group rotation. Correlation times for the main-chain motion are in the range of tens of picoseconds, which indicates the high flexibility of polyimide chains. Correlation times for phenyl group and methyl group rotations are, respectively, more than one order of magnitude lower and approximately one order of magnitude higher than that of the main chain.

Relevance: 100.00%

Abstract:

This paper analyses the asymptotic properties of nonlinear least squares estimators of the long run parameters in a bivariate unbalanced cointegration framework. Unbalanced cointegration refers to the situation where the integration orders of the observables are different, but their corresponding balanced versions (with equal integration orders after filtering) are cointegrated in the usual sense. Within this setting, the long run linkage between the observables is driven by both the cointegrating parameter and the difference between the integration orders of the observables, which we consider to be unknown. Our results reveal three noticeable features. First, superconsistent (faster than √n-consistent) estimators of the difference between memory parameters are achievable. Next, the joint limiting distribution of the estimators of both parameters is singular, and, finally, a modified version of the "Type II" fractional Brownian motion arises in the limiting theory. A Monte Carlo experiment and the discussion of an economic example are included.
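
The "Type II" fractional process appearing in the limit theory is defined from a truncated fractional filter, which also makes it straightforward to simulate for Monte Carlo work of the kind mentioned. A minimal sketch, with d the memory parameter (the weight recursion for (1-L)^(-d) is standard; everything else here is illustrative):

```python
import numpy as np

def type2_fractional(n, d, rng):
    """Simulate a 'Type II' fractionally integrated process at integer
    times: a truncated fractional filter (1-L)^(-d) applied to i.i.d.
    N(0,1) shocks, with the filter cut off at the sample start."""
    eps = rng.standard_normal(n)
    psi = np.empty(n)
    psi[0] = 1.0
    for j in range(1, n):
        # recursion for the MA(inf) weights of (1-L)^(-d)
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    # X_t = sum_{k <= t} psi_{t-k} * eps_k, i.e. a truncated convolution
    return np.convolve(eps, psi)[:n]

rng = np.random.default_rng(1)
x = type2_fractional(200, 1.0, rng)   # d = 1 recovers an ordinary random walk
```

Sanity check on the construction: at d = 1 all filter weights equal one, so the process reduces to the cumulative sum of the shocks, the discrete analogue of standard Brownian motion.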

Relevance: 100.00%

Abstract:

A neural network is introduced which provides a solution of the classical motor equivalence problem, whereby many different joint configurations of a redundant manipulator can all be used to realize a desired trajectory in 3-D space. To do this, the network self-organizes a mapping from motion directions in 3-D space to velocity commands in joint space. Computer simulations demonstrate that, without any additional learning, the network can generate accurate movement commands that compensate for variable tool lengths, clamping of joints, distortions of visual input by a prism, and unexpected limb perturbations. Blind reaches have also been simulated.
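
A classical, non-learned counterpart of the direction-to-joint-velocity mapping described above is the Jacobian pseudoinverse, which likewise resolves redundancy by selecting one of the many admissible joint solutions (here, the minimum-norm one). This planar sketch is illustrative only and is not the paper's self-organizing network; all numbers are arbitrary.

```python
import numpy as np

def planar_jacobian(thetas, lengths):
    """Jacobian (2 x n) of a planar n-joint arm's end-effector
    position with respect to the joint angles."""
    n = len(thetas)
    J = np.zeros((2, n))
    cum = np.cumsum(thetas)          # absolute link angles
    for i in range(n):
        # joint i moves every link from i onwards
        J[0, i] = -np.sum(lengths[i:] * np.sin(cum[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(cum[i:]))
    return J

def joint_velocities(thetas, lengths, xdot):
    """Map a desired end-effector velocity to joint velocities with the
    Moore-Penrose pseudoinverse: the minimum-norm solution among the
    infinitely many that realize xdot on a redundant arm."""
    return np.linalg.pinv(planar_jacobian(thetas, lengths)) @ xdot

# 3-joint arm tracking a 2-D velocity: one joint more than task dimensions
thetas = np.array([0.3, -0.2, 0.5])
lengths = np.array([1.0, 0.8, 0.6])
xdot = np.array([0.1, -0.05])
qdot = joint_velocities(thetas, lengths, xdot)
```

The paper's point is that such a mapping can be self-organized from experience rather than derived from known link geometry, which is what lets it compensate for tool-length changes and clamped joints without relearning.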

Relevance: 100.00%

Abstract:

This PhD thesis investigates the potential use of science communication models to engage a broader swathe of actors in decision-making on scientific and technological innovation, in order to address possible democratic deficits in science and technology policy-making. A four-pronged research approach has been employed to examine different representations of the public(s) and different modes of engagement. The first case study investigates whether patient groups could represent an alternative, needs-driven approach to biomedical and health sciences R&D. This is followed by an enquiry into the potential for Science Shops to represent a bottom-up approach to promoting research and development of local relevance. The barriers and opportunities for the involvement of scientific researchers in science communication are next investigated via a national survey, comparable to a similar survey conducted in the UK. The final case study investigates to what extent opposition or support regarding nanotechnology (as an emerging technology) is reflected amongst the YouTube user community; the findings are considered in the context of how support for or opposition to new or emerging technologies can be addressed using conflict-resolution-based approaches to manage potential conflict trajectories. The research indicates that the majority of communication exercises of relevance to science policy and planning take the form of a one-way flow of information with little or no facility for public feedback. This thesis proposes that a more bottom-up approach to research and technology would help broaden acceptability of, and accountability for, decisions made about new or existing technological trajectories. Such an approach could be better integrated with, and complementary to, the activities of government, institutions (e.g. universities) and research funding agencies, and would help ensure that public needs and issues are addressed directly by the research community. Such approaches could also facilitate the empowerment of societal stakeholders in scientific literacy and agenda-setting. One-way information relays could be adapted to facilitate feedback from representative groups, e.g. non-governmental organisations or civil society organisations (such as patient groups), in order to enhance the functioning and socio-economic relevance of knowledge-based societies to the betterment of human livelihoods.

Relevance: 100.00%

Abstract:

This thesis traces a genealogy of the discourse of mathematics education reform in Ireland at the beginning of the twenty-first century, at a time when the hegemonic political discourse is that of neoliberalism. It draws on the work of Michel Foucault to identify the network of power relations involved in the development of a single case of curriculum reform – in this case Project Maths. It identifies the construction of an apparatus within the fields of politics, economics and education, the elements of which include institutions like the OECD and the Government, the bureaucracy, expert groups and special interest groups, the media, the school, the State, state assessment and international assessment. Five major themes in educational reform emerge from the analysis: the arrival of neoliberal governance in Ireland; the triumph of human capital theory as the hegemonic educational philosophy here; the dominant role of OECD/PISA and its values in the mathematics education discourse in Ireland; the fetishisation of western scientific knowledge and of knowledge as commodity; and the formation of a new kind of subjectivity, namely the subjectivity of the young person as a form of human-capital-to-be. In particular, it provides a critical analysis of the influence of OECD/PISA on the development of mathematics education policy here – especially on Project Maths curriculum, assessment and pedagogy. It unpacks the arguments in favour of curriculum change and lays bare their ideological foundations. This discourse contextualises educational change as occurring within a rapidly changing economic environment in which the State's economic aspirations and developments in science, technology and communications are reshaping both the focus of business and the demands being put on education.
Within this discourse, education is to be repurposed and its consequences measured against the paradigm of the Knowledge Economy – usually characterised as the inevitable or necessary future of a carefully defined present.

Relevance: 100.00%

Abstract:

This article applies the domain decomposition method to electronic packaging simulation. The objective is to address the entire simulation process chain, alleviating heavy user interaction through mechanization and a component-based approach, so as to streamline the model simulation process.

Relevance: 100.00%

Abstract:

The generation and near-field radiation of aerodynamic sound from a low-speed unsteady flow over a two-dimensional automobile door cavity is simulated by using a source-extraction-based coupling method. In the coupling procedure, the unsteady cavity flow field is first computed by solving the Reynolds-averaged Navier–Stokes (RANS) equations. The radiated sound is then calculated by using a set of acoustic perturbation equations with acoustic source terms extracted from the time-dependent solutions of the unsteady flow. The aerodynamic field and its resulting acoustic field are computed for a Reynolds number of 53,266, based on the base length of the cavity. The free-stream flow velocity is taken to be 50.9 m/s. As a first stage of the numerical investigation of flow-induced cavity noise, laminar flow is assumed. The CFD solver is based on a cell-centered finite volume method. A dispersion-relation-preserving (DRP), optimized, fourth-order finite difference scheme with fully staggered-grid implementation is used in the acoustic solver.
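
The appeal of a dispersion-relation-preserving scheme can be seen by checking how a plain (non-optimized) central difference degrades at high wavenumbers, which matters when propagating acoustic waves over many cells. The stencil below uses the standard fourth-order coefficients, not the optimized DRP weights used in the paper's solver.

```python
import numpy as np

def d1_fourth_order(u, dx):
    """Standard 4th-order central first derivative on a periodic grid:
    (-u[i+2] + 8 u[i+1] - 8 u[i-1] + u[i-2]) / (12 dx).
    A DRP scheme instead tunes wider-stencil weights to minimize the
    dispersion (phase-speed) error; plain coefficients are used here."""
    return (-np.roll(u, -2) + 8 * np.roll(u, -1)
            - 8 * np.roll(u, 1) + np.roll(u, 2)) / (12 * dx)

n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
# error for a well-resolved wave (k=2) vs a marginally resolved one (k=12)
err_low = np.max(np.abs(d1_fourth_order(np.sin(2 * x), dx) - 2 * np.cos(2 * x)))
err_high = np.max(np.abs(d1_fourth_order(np.sin(12 * x), dx) - 12 * np.cos(12 * x)))
```

The error grows by several orders of magnitude between the two wavenumbers; the dispersion error this reflects is exactly what DRP-optimized coefficients trade formal order of accuracy to suppress.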

Relevance: 100.00%

Abstract:

In The Eye of Power, Foucault delineated the key concerns surrounding hospital architecture in the latter half of the eighteenth century as being the ‘visibility of bodies, individuals and things'. As such, the ‘new form of hospital' that came to be developed ‘was at once the effect and support of a new type of gaze'. This was a gaze that was not simply concerned with ways of minimising overcrowding or cross-contamination. Rather, this was a surveillance intended to produce knowledge about the pathological bodies contained within the hospital walls, which would then allow for their appropriate classification. Foucault went on to describe how these principles came to be applied to the architecture of prisons. This was exemplified for him in the distinct shape of Bentham's panopticon. This circular design, which has subsequently become an often misused synonym for a contemporary culture of surveillance, was premised on a binary of the seen and the not-seen. An individual observer could stand at the central point of the circle and observe the cells (and their occupants) on the perimeter whilst themselves remaining unseen. The panopticon in its purest form was never constructed, yet it conveys the significance of the production of knowledge through observation that became central to institutional design at this time and to modern thought more broadly. What is curious, though, is that whilst the aim of those late eighteenth-century buildings was to produce well-ventilated spaces suffused with light, this provoked an interest in its opposite. The gothic movement in literature that was developing in parallel conversely took a ‘fantasy world of stone walls, darkness, hideouts and dungeons…' as its landscape (Vidler, 1992: 162). Curiously, despite these modern developments in prison design, the façade took on these characteristics. The gothic imagination came to describe that unseen world that lay behind the outer wall. This is what Evans refers to as an architectural ‘hoax'.
The façade was taken to represent the world within the prison walls, and it was the façade that came to inform the popular imagination about what occurred behind it. The rational, modern principles ordering the prison became conflated with the meanings projected by and onto the façade. This confusion of meanings has then been repeated and reinforced in subsequent representations of the prison. This is of paramount importance since it is the cinematic and televisual representation of the prison, as I argue here and elsewhere, that maintains this erroneous set of meanings, this ‘hoax'.

Relevance: 100.00%

Abstract:

Our understanding of how the visual system processes motion transparency, the phenomenon by which multiple directions of motion are perceived to co-exist in the same spatial region, has grown considerably in the past decade. There is compelling evidence that the process is driven by global-motion mechanisms. Consequently, although transparently moving surfaces are readily segmented over an extended space, the visual system cannot separate two motion signals that co-exist in the same local region. A related issue is whether the visual system can detect transparently moving surfaces simultaneously, or whether the component signals encounter a serial ‘bottleneck' during their processing. Our initial results show that, at sufficiently short stimulus durations, observers cannot accurately detect two superimposed directions; yet they have no difficulty in detecting one pattern direction in noise, supporting the serial-bottleneck scenario. However, in a second experiment, the difference in performance between the two tasks disappears when the component patterns are segregated. This discrepancy between the processing of transparent and non-overlapping patterns may be a consequence of suppressed activity of global-motion mechanisms when the transparent surfaces are presented in the same depth plane. To test this explanation, we repeated our initial experiment while separating the motion components in depth. The marked improvement in performance leads us to conclude that transparent motion signals are represented simultaneously.
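
A transparent-motion stimulus of the kind used in such experiments is easy to sketch: two superimposed random-dot populations, each drifting coherently in its own direction, so that both signals co-exist in every local region. The parameters below are arbitrary and are not the authors' experimental values.

```python
import numpy as np

def transparent_dots(n_dots, dirs, speed, rng):
    """Two superimposed random-dot populations, each drifting in one of
    the directions in `dirs` (radians). Returns dot positions at frame t,
    positions at frame t+1, and each dot's population assignment.
    Illustrative stimulus generator only."""
    pos = rng.uniform(0.0, 1.0, (n_dots, 2))          # unit-square field
    which = rng.integers(0, len(dirs), n_dots)        # population label
    unit = np.stack([np.cos(dirs), np.sin(dirs)])     # (2, n_dirs)
    vel = speed * unit[:, which].T                    # (n_dots, 2)
    return pos, pos + vel, which

rng = np.random.default_rng(2)
dirs = np.array([0.0, np.pi / 2])   # rightward and upward surfaces
p0, p1, which = transparent_dots(100, dirs, 0.01, rng)
```

Globally, pooling the frame-to-frame displacements recovers both directions; locally, any small patch contains dots from both populations, which is exactly the property that makes transparency a probe of global-motion mechanisms.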