956 results for convergence


Relevance: 10.00%

Abstract:

Currently, GNSS computing modes fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, either in the RINEX file format or as real-time data streams in the RTCM format; very little computation is carried out by the reference station itself. The existing network-based processing modes, whether executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include the precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for the estimated parameters may also optionally be provided. In such a mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models; the distinction lies in how the user receiver software deals with corrections from the reference station solutions and with ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually from single reference stations. With station-based solutions from three reference stations at distances of 22–103 km, the user receiver positioning results under various schemes show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to standard float PPP solutions without station augmentation or ambiguity resolution. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to existing network-based RTK or regionally augmented PPP systems.
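The abstract leaves the user-side combination scheme open. As a loose illustration only, the Python sketch below blends atmospheric corrections from several station-based solutions with a simple inverse-distance weighting; the field names, weighting rule and numbers are assumptions for illustration, not the paper's method.

```python
# Illustrative sketch (not the paper's software): inverse-distance-weighted
# combination of station-based atmospheric corrections for a nearby PPP/RTK
# user. Field names and the weighting scheme are assumptions.
from dataclasses import dataclass

@dataclass
class StationSolution:
    distance_km: float          # baseline length to the user receiver
    zenith_tropo_delay: float   # metres
    iono_delay: float           # metres, for one line-of-sight satellite

def combine_corrections(solutions):
    """Inverse-distance-weighted average of reference-station corrections."""
    weights = [1.0 / max(s.distance_km, 1e-6) for s in solutions]
    total = sum(weights)
    return {
        "zenith_tropo_delay": sum(w * s.zenith_tropo_delay
                                  for w, s in zip(weights, solutions)) / total,
        "iono_delay": sum(w * s.iono_delay
                          for w, s in zip(weights, solutions)) / total,
    }

# Three stations at 22-103 km, matching the distances quoted in the abstract.
stations = [StationSolution(22.0, 2.41, 3.10),
            StationSolution(57.0, 2.38, 3.25),
            StationSolution(103.0, 2.45, 3.02)]
print(combine_corrections(stations))
```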

Relevance: 10.00%

Abstract:

In the companion paper, a fourth-order element in an updated Lagrangian formulation was presented to handle geometric non-linearities. The present paper extends this to include material non-linearity by proposing a refined plastic hinge approach for analysing large steel framed structures with many members, for which contemporary algorithms based on the plastic zone approach can be computationally problematic. The concept advances conventional plastic hinge approaches in that the refined plastic hinge technique allows for gradual yielding (recognised as distributed plasticity across the element section) and a condition of full plasticity, and also includes strain hardening. It is founded on interaction yield surfaces specified analytically in terms of force resultants, and achieves accurate and rapid convergence for large frames in which geometric and material non-linearity are significant. The solutions are shown to strike an effective balance between accuracy and computational expediency. Beyond its numerical efficiency, the approach is versatile enough to capture different kinds of material and geometric non-linearity in general applications of steel structures, and thereby offers an efficacious and accurate means of assessing the non-linear behaviour of such structures in engineering practice.
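The core of a refined plastic hinge model is a hinge spring whose stiffness degrades gradually between an initial-yield surface and a full-plastification surface written in force resultants. The sketch below illustrates that idea only; the linear interaction surface, the onset parameter and the parabolic degradation law are illustrative assumptions, not the paper's calibrated expressions.

```python
# Hedged sketch of gradual yielding in a refined plastic hinge: a scalar
# stiffness factor interpolates between elastic (1.0) and fully plastic (0.0)
# as the force-state parameter moves between assumed yield surfaces.

def force_state(P, M, Py, Mp):
    """Force-state parameter alpha on a simple linear interaction surface."""
    return abs(P) / Py + abs(M) / Mp

def hinge_stiffness_scale(P, M, Py, Mp, alpha_y=0.5):
    """1.0 = fully elastic hinge spring, 0.0 = fully plastic (free rotation).

    alpha_y marks the assumed onset of yielding; alpha = 1 is the assumed
    full-plastification surface.
    """
    alpha = force_state(P, M, Py, Mp)
    if alpha <= alpha_y:
        return 1.0                       # elastic: rigid hinge spring
    if alpha >= 1.0:
        return 0.0                       # fully plastic hinge
    # Parabolic transition gives gradual yielding between the two surfaces.
    t = (alpha - alpha_y) / (1.0 - alpha_y)
    return 1.0 - t * t

# alpha = 0.3 + 0.4 = 0.7: partially yielded, stiffness scaled to 0.84.
print(hinge_stiffness_scale(P=600e3, M=120e3, Py=2000e3, Mp=300e3))
```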

Relevance: 10.00%

Abstract:

Finite element frame analysis programs targeted at design office applications require algorithms that deliver reliable numerical convergence in a practical timeframe with comparable degrees of accuracy; a highly desirable attribute is the use of a single element per member, which reduces computational storage as well as the effort of data preparation and interpretation of the results. To this end, this paper presents a higher-order finite element method, including geometric non-linearity, for the analysis of elastic frames in which a single element models each member. The geometric non-linearity in the structure is handled with an updated Lagrangian formulation, which accounts for the large translations and rotations occurring at the joints by accumulating their nodal coordinates. Rigid body movements are eliminated from the local member load-displacement relationship, for which the total secant stiffness is formulated to evaluate the large member deformations of an element. The influence of axial force on member stiffness and the change in member chord length are taken into account with a modified bowing function, formulated within the total secant stiffness relationship, that includes the coupling of axial strain and flexural bowing. The accuracy and efficiency of the technique are verified by comparisons with a number of plane and spatial structures whose structural response has been reported in independent studies.
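The elimination of rigid body movements can be illustrated with a planar simplification: subtract the chord's rigid rotation and stretch so that only the natural deformations remain. The sketch below shows this idea only; the paper's spatial formulation and modified bowing function are not reproduced.

```python
# Minimal 2-D sketch of removing rigid body movements from a member's
# load-displacement relationship: the chord rotation is subtracted so only
# the axial stretch and end rotations relative to the chord remain.
import math

def natural_deformations(x0, y0, x1, y1,   # initial nodal coordinates
                         X0, Y0, X1, Y1,   # current (accumulated) coordinates
                         theta0, theta1):  # accumulated end rotations
    L0 = math.hypot(x1 - x0, y1 - y0)          # initial chord length
    Ln = math.hypot(X1 - X0, Y1 - Y0)          # current chord length
    beta0 = math.atan2(y1 - y0, x1 - x0)       # initial chord angle
    beta = math.atan2(Y1 - Y0, X1 - X0)        # current chord angle
    rigid = beta - beta0                       # rigid body rotation
    u = Ln - L0                                # axial deformation
    return u, theta0 - rigid, theta1 - rigid   # deformations w.r.t. chord

print(natural_deformations(0, 0, 5, 0, 0, 0, 4.9, 0.7, 0.20, 0.05))
```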

Relevance: 10.00%

Abstract:

This paper addresses material nonlinear effects on composite beams using the plastic hinge method. Three features complicate the problem. First, numerous combinations of steel and concrete sections form arbitrary composite sections. Second, the material properties of a composite beam vary markedly across its section, from ductile steel to brittle concrete. Third, concrete is weak in tension, so the effective composite section changes with the load distribution. The plastic zone approach is convenient for the inelastic analysis of composite sections because it can evaluate member resistance, including material nonlinearities, by routine numerical integration over every fiber across the composite section; as a result, many researchers adopt the plastic zone approach for numerical inelastic analyses of composite structures. The plastic hinge method, on the other hand, describes the nonlinear material behaviour of the overall composite section integrally, so appropriate section properties are required for the plastic hinge spring stiffness to represent the material behaviour across the arbitrary whole composite section. In terms of numerical efficiency and convergence, the plastic hinge method is superior to the plastic zone method. How to incorporate the material nonlinearities of an arbitrary composite section into the plastic hinge stiffness formulation therefore becomes the prime objective of the present paper. Partial shear connection is handled via the effective flexural rigidity of AISC 1993 [American Institute of Steel Construction (AISC). Load and resistance factor design specifications. 2nd ed., Chicago; 1993]. The nonlinear behaviour of several kinds of composite beam is investigated, including two simply supported composite beams, a cantilever and a two-span continuous composite beam.
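The AISC LRFD treatment of partial shear connection interpolates an effective moment of inertia between the bare steel section and the fully composite transformed section by the square root of the shear connection ratio. The sketch below shows that interpolation under assumed variable names and values; consult the cited specification for the exact provisions.

```python
# Sketch of the partial shear connection idea the paper cites from AISC LRFD
# (1993): I_eff = I_s + sqrt(sum(Qn)/Cf) * (I_tr - I_s). Values illustrative.
import math

def effective_inertia(I_steel, I_transformed, sum_Qn, C_f):
    """sum_Qn: total shear connector strength provided;
    C_f: force required for full composite action."""
    ratio = min(sum_Qn / C_f, 1.0)             # cap at full composite action
    return I_steel + math.sqrt(ratio) * (I_transformed - I_steel)

# 50% shear connection: effective inertia lies between bare steel and fully
# composite, closer to the composite value because of the square root.
print(effective_inertia(I_steel=2.1e8, I_transformed=5.6e8,
                        sum_Qn=1.0e6, C_f=2.0e6))
```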

Relevance: 10.00%

Abstract:

The use of Mahalanobis squared distance–based novelty detection in statistical damage identification has become increasingly popular in recent years. The merit of the Mahalanobis squared distance–based method is that it is simple and requires low computational effort, which enables the use of a higher-dimensional damage-sensitive feature that is generally more sensitive to structural changes. Mahalanobis squared distance–based damage identification is also believed to be one of the most suitable methods for modern sensing systems such as wireless sensors. Despite these advantages, the method places strict requirements on its input: it assumes the training data to be multivariate normal, which is not always available, particularly at an early monitoring stage. As a consequence, it may produce an ill-conditioned training model with erroneous novelty detection and damage identification outcomes. To date, there appears to be no study on how to systematically cope with such practical issues, especially in the context of a statistical damage identification problem. To address this need, this article proposes a controlled data generation scheme based upon the Monte Carlo simulation methodology, with the addition of several controlling and evaluation tools to assess the condition of the output data. By evaluating the convergence of the data condition indices, the proposed scheme is able to determine the optimal setup for the data generation process and subsequently avoid unnecessarily excessive data. The efficacy of the scheme is demonstrated via application to data from a benchmark structure in the field.
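As a minimal illustration of the Mahalanobis squared distance novelty detector described here: the mean and covariance are learnt from (assumed healthy) training features, and a new observation is flagged when its distance exceeds a threshold. The 4-dimensional feature, the data and the 99% chi-squared threshold below are assumptions for illustration, not the article's setup.

```python
# Minimal sketch of Mahalanobis squared distance (MSD) novelty detection.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
train = rng.normal(size=(500, 4))              # healthy-state training features
mu = train.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(train, rowvar=False))

def msd(x):
    d = x - mu
    return float(d @ cov_inv @ d)

# Under multivariate normality, MSD of a healthy observation is ~chi-squared.
threshold = chi2.ppf(0.99, df=4)               # 99% confidence, 4-D feature
test = rng.normal(loc=2.0, size=4)             # shifted feature: possible damage
score = msd(test)
print(score, "novel" if score > threshold else "normal")
```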

Relevance: 10.00%

Abstract:

This creative work is the production of the live and animated performance of The Empty City. With a significant period of creative development and script work behind it, the team engaged in a range of innovative performance-making practices in order to realise the work onstage as a non-verbal live and animated theatre work. This intermedial process was often led by music, and involved the creation and convergence of non-verbal action, virtual performers, performing objects and two simultaneous projections of animated images. The production opened at the Brisbane Powerhouse on June 27 2013, with a subsequent tour to Perth’s Awesome Festival in October 2013. Its technical achievements were noted in the critical responses. "The story is told on a striking set of two huge screens, the front one transparent, upon which still and moving images are projected, and between which Oliver performs and occasional “real” objects are placed. The effect is startling, and creates a cartoon three dimensionality like those old Viewmaster slide shows. The live action… and soundscape sync perfectly with the projected imagery to complete a dense, intricately devised and technically brilliant whole." (The West Australian 14.10.13)

Relevance: 10.00%

Abstract:

Cloud computing is a currently developing revolution in information technology that is disrupting the way individuals and corporate entities operate, while enabling new distributed services that did not exist before. At the foundation of cloud computing is the broader concept of converged infrastructure and shared services. Security is often cited as a major concern of users considering migration to cloud computing. This article examines some of these security concerns and surveys recent research efforts in cryptography that provide new technical mechanisms suitable for the new scenarios of cloud computing. We consider techniques such as homomorphic encryption, searchable encryption, proofs of storage, and proofs of location. These techniques allow cloud computing users to benefit from cloud server processing capabilities while keeping their data encrypted, and to independently check the integrity and location of their data. Overall, we are interested in how users may maintain and verify their own security without having to rely on the trust of the cloud provider.
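As an illustration of the homomorphic encryption technique surveyed here, the toy sketch below uses the additively homomorphic Paillier scheme: the server multiplies two ciphertexts and thereby adds the underlying plaintexts without decrypting. The primes are deliberately tiny; any real deployment would use a vetted library and proper key sizes.

```python
# Toy Paillier sketch: additive homomorphism on encrypted data.
import math, secrets

p, q = 293, 433                      # toy primes (never use in practice)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # valid because g = n + 1 below

def encrypt(m):
    while True:                      # pick r coprime to n
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

a, b = 17, 25
c = (encrypt(a) * encrypt(b)) % n2   # server-side: multiply ciphertexts
assert decrypt(c) == a + b           # client-side: recovers the sum
print(decrypt(c))
```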

Relevance: 10.00%

Abstract:

Addressing possibilities for authentic combinations of diverse media within an installation setting, this research tested hybrid blends of the physical, digital and temporal to explore liminal space and image. The practice-led research reflected on the creation of artworks from three perspectives – material, immaterial and hybrid – and in doing so developed a new methodological structure that extends conventional forms of triangulation. The study explored how physical and digital elements each sought hierarchical presence yet simultaneously coexisted, thereby extending the visual and conceptual potential of the work. The outcomes demonstrated how utilising and recording the transitional processes of hybrid imagery achieved a convergence of diverse, experiential forms. "Hybrid authority" – an authentic convergence of disparate elements – was articulated in the creation and public sharing of processual works and in the creation of an innovative framework for hybrid art practice.

Relevance: 10.00%

Abstract:

The wide applicability of correlation analysis motivated the development of this paper, in which a new correlated modified particle swarm optimization (COM-PSO) is developed. A Correlation Adjustment algorithm is proposed to recover the correlation between the considered variables of all particles at each iteration. It is shown that the best solution, the mean and standard deviation of the solutions over multiple runs, and the convergence speed all improve as the correlation between the variables is increased, although contrary results are obtained for some rotated benchmark functions. Moreover, the best solution and the mean and standard deviation of the solutions improve as the number of correlated variables of the benchmark functions is increased. The results of the simulations and the convergence performance are compared with those of the original PSO, and the improvement of the results, the convergence speed, and the ability of the proposed COM-PSO to simulate correlated phenomena are discussed in light of the experimental results.
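The abstract does not specify the Correlation Adjustment algorithm. One standard way to impose a target correlation on the swarm, sketched below as a plausible reading rather than the paper's exact method, is to whiten the particle positions with the Cholesky factor of their current correlation and re-colour them with the Cholesky factor of the target correlation.

```python
# Hedged sketch: Cholesky re-colouring to enforce a target correlation
# between the variables of all particles at an iteration.
import numpy as np

def adjust_correlation(positions, target_corr):
    """positions: (n_particles, n_dims); target_corr: (n_dims, n_dims)."""
    mu = positions.mean(axis=0)
    sd = positions.std(axis=0)
    z = (positions - mu) / sd                   # standardise each variable
    current = np.corrcoef(z, rowvar=False)
    L_cur = np.linalg.cholesky(current)
    L_tgt = np.linalg.cholesky(target_corr)
    # Whiten with the current factor, then colour with the target factor.
    z_adj = z @ np.linalg.inv(L_cur).T @ L_tgt.T
    return z_adj * sd + mu

rng = np.random.default_rng(1)
swarm = rng.normal(size=(50, 2))                # uncorrelated particles
target = np.array([[1.0, 0.8], [0.8, 1.0]])
adjusted = adjust_correlation(swarm, target)
print(np.corrcoef(adjusted, rowvar=False))      # off-diagonals near 0.8
```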

Relevance: 10.00%

Abstract:

This paper presents a novel algorithm based on particle swarm optimization (PSO) to estimate the states of electric distribution networks. To improve the performance, accuracy and convergence speed of the original PSO and to eliminate its stagnation effect, a secondary PSO loop, a mutation algorithm and a stretching function are proposed. To account for load uncertainties in distribution networks, pseudo-measurements are modeled as loads with realistic errors. Simulation results on 6-bus radial and 34-bus IEEE test distribution networks show that distribution state estimation based on the proposed DLM-PSO yields lower estimation error and standard deviation than algorithms such as WLS, GA, HBMO, and the original PSO.
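The stretching function named here is usually the function-stretching transformation of Parsopoulos and Vrahatis: once a local minimum is detected, the objective is deformed so that points worse than it are lifted while better regions are untouched, letting the swarm escape stagnation. The sketch below uses that standard form with illustrative tuning constants; it is not necessarily the paper's parameterisation.

```python
# Sketch of two-stage function stretching around a detected local minimum.
import numpy as np

def stretched(f, x_loc, gamma1=1e3, gamma2=1.0, mu=1e-10):
    f_loc = f(x_loc)
    def G(x):
        s = np.sign(f(x) - f_loc) + 1.0       # 0 for better points, 2 for worse
        return f(x) + gamma1 * np.linalg.norm(x - x_loc) * s / 2.0
    def H(x):                                  # evaluate away from x_loc itself
        s = np.sign(f(x) - f_loc) + 1.0
        return G(x) + gamma2 * s / (2.0 * np.tanh(mu * (G(x) - G(x_loc))))
    return H

f = lambda x: np.cos(x[0]) + 0.01 * x[0] ** 2  # many local minima
H = stretched(f, x_loc=np.array([9.4]))        # a shallow local minimum
print(H(np.array([3.1])), H(np.array([12.0]))) # better point kept, worse lifted
```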

Relevance: 10.00%

Abstract:

This is the fourth edition of New Media: An Introduction, with the previous editions published by Oxford University Press in 2002, 2005 and 2008. As the first edition of the book published in the 2010s, every chapter has been comprehensively revised, and there are new chapters on:

• Online News and the Future of Journalism (Chapter 7)
• New Media and the Transformation of Higher Education (Chapter 10)
• Online Activism and Networked Politics (Chapter 12)

It has retained popular features of the third edition, including the twenty key concepts in new media (Chapter 2) and illustrative case studies to assist with teaching new media. The case studies in the book cover: the global internet; Wikipedia; transmedia storytelling; Media Studies 2.0; the games industry and exploitation; video games and violence; WikiLeaks; the innovator’s dilemma; massive open online courses (MOOCs); Creative Commons; the Barack Obama Presidential campaigns; and the Arab Spring.

Several major changes in the media environment since the publication of the third edition stand out. Of particular importance has been the rise of social media platforms such as Facebook, Twitter and YouTube, which draw out even more strongly the features of the internet as networked and participatory media, with a range of implications across the economy, society and culture. In addition, the political implications of new media have become more apparent through a range of social media-based political campaigns, from Barack Obama’s successful Presidential election campaigns to the Occupy movements and the Arab Spring. At the same time, the subsequent development of politics in these and other cases has drawn attention to the limitations of thinking about politics or the public sphere in technologically determinist ways.

When the first edition of New Media was published in 2002, the concept of new media was seen as being largely about the internet as accessed from personal computers. The subsequent decade has seen a proliferation of platforms and devices: we now access media in all forms from our phones and other mobile platforms; television and the internet are increasingly converging; and digital media content is increasingly uncoupled from delivery platforms. While this has a range of implications for media law and policy, from convergent media policy to copyright reform, governments and policy-makers are struggling to adapt to such seismic shifts from mass communications media to convergent social media.

The internet is no longer primarily a Western-based medium. Two-thirds of the world’s internet users are now outside Europe and North America; three-quarters of internet users use languages other than English; and three-quarters of the world’s mobile cellular phone subscriptions are in developing nations. It is also apparent that discussions about how to develop new media technologies can no longer be separated from discussions about their cultural and creative content. Debates over broadband strategies and the knowledge economy increasingly need to be joined with those concerning the creative industries and the creative economy.

Relevance: 10.00%

Abstract:

Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed when a mission must be aborted due to mechanical or other failure. This article presents a pulse-coupled neural network (PCNN) to assist with vegetation classification in a vision-based landing site detection system for an unmanned aircraft. We propose a heterogeneous computing architecture and an OpenCL implementation of a PCNN feature generator, and compare its performance across OpenCL kernels designed for CPU, GPU, and FPGA platforms. The comparison examines the compute times required for network convergence under a variety of images to determine the feasibility of real-time feature detection.
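For readers unfamiliar with PCNNs, the NumPy sketch below shows the standard iteration behind such a feature generator: feeding and linking fields decay and receive neighbour pulses, the internal activity is their modulated product, and a dynamic threshold rises after each firing. This is a plain reference version with illustrative parameters, not the article's OpenCL kernels.

```python
# Minimal PCNN iteration producing a per-step firing-count feature signature.
import numpy as np
from scipy.signal import convolve2d

def pcnn_features(S, steps=20, beta=0.2, aF=0.1, aL=1.0, aT=0.3,
                  VF=0.5, VL=0.2, VT=20.0):
    """S: normalised input image; returns firing counts per iteration."""
    W = np.array([[0.5, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.5]])
    F = np.zeros_like(S); L = np.zeros_like(S)
    Y = np.zeros_like(S); T = np.ones_like(S)
    counts = []
    for _ in range(steps):
        conv = convolve2d(Y, W, mode="same")   # neighbour pulse input
        F = np.exp(-aF) * F + VF * conv + S    # feeding field
        L = np.exp(-aL) * L + VL * conv        # linking field
        U = F * (1.0 + beta * L)               # internal activity
        Y = (U > T).astype(float)              # pulse output
        T = np.exp(-aT) * T + VT * Y           # dynamic threshold
        counts.append(int(Y.sum()))            # simple time-series feature
    return counts

img = np.random.default_rng(2).random((32, 32))
print(pcnn_features(img))
```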

Relevance: 10.00%

Abstract:

This paper critically evaluates the series of inquiries that the Australian Labor government undertook during 2011-2013 into reform of Australian media, communications and copyright laws. One important driver of policy reform was the government’s commitment to building a National Broadband Network (NBN) and the implications this had for existing broadcasting and telecommunications policy, as the NBN would constitute a major driver of convergence of media and communications access devices and content platforms. The inquiries included: the Convergence Review of media and communications legislation; the Australian Law Reform Commission (ALRC) review of the National Classification Scheme; and the Independent Media Inquiry (Finkelstein Review) into Media and Media Regulation. One unusual feature of the review process was the degree to which academics were involved, not simply as providers of expert opinion but as review chairs seconded from their universities. This paper considers the role played by activist groups in all of these inquiries and their relationship to the various participants, as well as the implications of academics being engaged in such inquiries, not simply as activist-scholars but as those primarily responsible for delivering policy review outcomes. The paper draws upon the concept of "policy windows" to better understand the context in which the inquiries took place and their relative lack of legislative impact.

Relevance: 10.00%

Abstract:

Bundle adjustment is one of the essential components of the computer vision toolbox. This paper revisits the resection-intersection approach, which has previously been shown to have inferior convergence properties. Modifications are proposed that greatly improve the performance of this method, resulting in a fast and accurate approach. Firstly, a linear triangulation step is added to the intersection stage, yielding higher accuracy and improved convergence rate. Secondly, the effect of parameter updates is tracked in order to reduce wasteful computation; only variables coupled to significantly changing variables are updated. This leads to significant improvements in computation time, at the cost of a small, controllable increase in error. Loop closures are handled effectively without the need for additional network modelling. The proposed approach is shown experimentally to yield comparable accuracy to a full sparse bundle adjustment (20% error increase) while computation time scales much better with the number of variables. Experiments on a progressive reconstruction system show the proposed method to be more efficient by a factor of 65 to 177, and 4.5 times more accurate (increasing over time) than a localised sparse bundle adjustment approach.
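A linear triangulation step of the kind added to the intersection stage is commonly implemented as a direct linear transform (DLT): each image observation contributes two rows to a homogeneous system whose least-squares solution is the 3-D point. The sketch below uses illustrative camera matrices, not values from the paper's experiments.

```python
# DLT triangulation sketch: recover a 3-D point from 2-D observations.
import numpy as np

def triangulate(points2d, cameras):
    """points2d: list of (x, y); cameras: list of 3x4 projection matrices."""
    rows = []
    for (x, y), P in zip(points2d, cameras):
        rows.append(x * P[2] - P[0])            # x * (p3 . X) - p1 . X = 0
        rows.append(y * P[2] - P[1])            # y * (p3 . X) - p2 . X = 0
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    X = Vt[-1]                                  # null vector = homogeneous point
    return X[:3] / X[3]                         # de-homogenise

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])               # camera at origin
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])   # translated camera
X_true = np.array([0.3, -0.2, 4.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate([x1, x2], [P1, P2]))          # ~ [0.3, -0.2, 4.0]
```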