955 results for Specification Animation
Abstract:
During the past decade, a significant amount of research has been conducted internationally with the aim of developing, implementing, and verifying "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures. Application of these methods permits comprehensive assessment of the actual failure modes and ultimate strengths of structural systems in practical design situations, without resort to simplified elastic methods of analysis and semi-empirical specification equations. Advanced analysis has the potential to extend the creativity of structural engineers and simplify the design process, while ensuring greater economy and more uniform safety with respect to the ultimate limit state. The application of advanced analysis methods has previously been restricted to steel frames comprising only members with compact cross-sections that are not subject to the effects of local buckling. This precluded the use of advanced analysis in the design of steel frames comprising a significant proportion of the most commonly used Australian sections, which are non-compact and subject to the effects of local buckling. This thesis contains a detailed description of research conducted over the past three years in an attempt to extend the scope of advanced analysis by developing methods that include the effects of local buckling in a non-linear analysis formulation, suitable for practical design of steel frames comprising non-compact sections. Two alternative concentrated plasticity formulations are presented in this thesis: the refined plastic hinge method and the pseudo plastic zone method. Both methods implicitly account for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling.
The accuracy and precision of the methods for the analysis of steel frames comprising non-compact sections have been established by comparison with a comprehensive range of analytical benchmark frame solutions. Both the refined plastic hinge and pseudo plastic zone methods are more accurate and precise than the conventional individual member design methods based on elastic analysis and specification equations. For example, the pseudo plastic zone method predicts the ultimate strength of the analytical benchmark frames with an average conservative error of less than one percent, and has an acceptable maximum unconservative error of less than five percent. The pseudo plastic zone model can allow the design capacity to be increased by up to 30 percent for simple frames, mainly due to the consideration of inelastic redistribution. The benefits may be even more significant for complex frames with significant redundancy, which provides greater scope for inelastic redistribution. The analytical benchmark frame solutions were obtained using a distributed plasticity shell finite element model. A detailed description of this model and the results of all 120 benchmark analyses are provided. The model explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. Its accuracy was verified by comparison with a variety of analytical solutions and the results of three large-scale experimental tests of steel frames comprising non-compact sections. A description of the experimental method and test results is also provided.
Abstract:
Business process modeling is widely regarded as one of the most popular forms of conceptual modeling. However, little is known about the capabilities and deficiencies of process modeling grammars and how existing deficiencies impact actual process modeling practice. This paper is a first contribution towards a theory-driven, exploratory empirical investigation of the ontological deficiencies of process modeling with the industry standard Business Process Modeling Notation (BPMN). We perform an analysis of BPMN using a theory of ontological expressiveness. Through a series of semi-structured interviews with BPMN adopters we explore empirically the actual use of this grammar. Nine ontological deficiencies related to the practice of modeling with BPMN are identified, for example, the capture of business rules and the specification of process decompositions. We also uncover five contextual factors that impact on the use of process modeling grammars, such as tool support and modeling conventions. We discuss implications for research and practice, highlighting the need for consideration of representational issues and contextual factors in decisions relating to BPMN adoption in organizations.
Abstract:
Component software has many benefits, most notably increased software re-use; however, the component software process places heavy burdens on programming language technology, which modern object-oriented programming languages do not address. In particular, software components require specifications that are both sufficiently expressive and sufficiently abstract, and, where possible, these specifications should be checked formally by the programming language. This dissertation presents a programming language called Mentok that provides two novel programming language features enabling improved specification of stateful component roles. Negotiable interfaces are interface types extended with protocols, and allow specification of changing method availability, including some patterns of out-calls and re-entrance. Type layers are extensions to module signatures that allow specification of abstract control flow constraints through the interfaces of a component-based application. Development of Mentok's unique language features included creation of MentokC, the Mentok compiler, and formalization of key properties of Mentok in mini-languages called MentokP and MentokL.
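Mentok's negotiable interfaces are checked statically by the MentokC compiler; as a loose illustration of the underlying typestate idea only, the following Python sketch enforces a hypothetical open → read* → close protocol at runtime. The `FileRole` class, its states, and its methods are invented for illustration and are not Mentok syntax:

```python
class ProtocolError(Exception):
    """Raised when a method is called outside its allowed protocol state."""

class FileRole:
    """Hypothetical component role with protocol: open -> read* -> close."""
    def __init__(self):
        self._state = "CLOSED"

    def open(self):
        if self._state != "CLOSED":
            raise ProtocolError("open is only available when CLOSED")
        self._state = "OPEN"

    def read(self):
        if self._state != "OPEN":
            raise ProtocolError("read is only available when OPEN")
        return "data"

    def close(self):
        if self._state != "OPEN":
            raise ProtocolError("close is only available when OPEN")
        self._state = "CLOSED"
```

The runtime check stands in for what a negotiable interface would rule out at compile time: calling `read` before `open` raises an error instead of being rejected by the type system.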
Abstract:
Advances in safety research—trying to improve the collective understanding of motor vehicle crash causation—rest upon the pursuit of numerous lines of inquiry. The research community has focused on analytical methods development (negative binomial specifications, simultaneous equations, etc.), on better experimental designs (before-after studies, comparison sites, etc.), on improving exposure measures, and on model specification improvements (additive terms, non-linear relations, etc.). One might think of different lines of inquiry in terms of ‘low lying fruit’—areas of inquiry that might provide significant improvements in understanding crash causation. It is the contention of this research that omitted variable bias caused by the exclusion of important variables is an important line of inquiry in safety research. In particular, spatially related variables are often difficult to collect and omitted from crash models—but offer significant ability to better understand contributing factors to crashes. This study—believed to represent a unique contribution to the safety literature—develops and examines the role of a sizeable set of spatial variables in intersection crash occurrence. In addition to commonly considered traffic and geometric variables, examined spatial factors include local influences of weather, sun glare, proximity to drinking establishments, and proximity to schools. The results indicate that inclusion of these factors results in significant improvement in model explanatory power, and the results also generally agree with expectation. The research illuminates the importance of spatial variables in safety research and also the negative consequences of their omission.
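The omitted-variable concern can be illustrated with a toy simulation (not the study's data): fitting a Poisson crash model with and without a simulated sun-glare indicator shows the loss of fit when the spatial covariate is excluded. All variable names and coefficient values below are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 600
flow = rng.normal(0.0, 1.0, n)        # standardised traffic exposure (assumed)
glare = rng.binomial(1, 0.3, n)       # hypothetical sun-glare indicator
# simulated crash counts: glare genuinely contributes to crash occurrence
y = rng.poisson(np.exp(0.5 + 0.6 * flow + 0.8 * glare))

def poisson_negll(beta, X):
    mu = np.exp(X @ beta)
    return (mu - y * np.log(mu)).sum()   # Poisson NLL up to a constant

X_full = np.column_stack([np.ones(n), flow, glare])
X_omit = X_full[:, :2]                   # the spatial covariate omitted
fit_full = minimize(poisson_negll, np.zeros(3), args=(X_full,), method="BFGS")
fit_omit = minimize(poisson_negll, np.zeros(2), args=(X_omit,), method="BFGS")
```

The full model attains a lower negative log-likelihood, and the omitted-covariate model pushes the unexplained glare effect into the error structure, which is the mechanism the study exploits with its spatial variables.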
Abstract:
Statistical modeling of traffic crashes has been of interest to researchers for decades. Over the most recent decade many crash models have accounted for extra-variation in crash counts—variation over and above that accounted for by the Poisson density. The extra-variation – or dispersion – is theorized to capture unaccounted for variation in crashes across sites. The majority of studies have assumed fixed dispersion parameters in over-dispersed crash models—tantamount to assuming that unaccounted for variation is proportional to the expected crash count. Miaou and Lord [Miaou, S.P., Lord, D., 2003. Modeling traffic crash-flow relationships for intersections: dispersion parameter, functional form, and Bayes versus empirical Bayes methods. Transport. Res. Rec. 1840, 31–40] challenged the fixed dispersion parameter assumption, and examined various dispersion parameter relationships when modeling urban signalized intersection accidents in Toronto. They suggested that further work is needed to determine the appropriateness of the findings for rural as well as other intersection types, to corroborate their findings, and to explore alternative dispersion functions. This study builds upon the work of Miaou and Lord, with exploration of additional dispersion functions, the use of an independent data set, and presents an opportunity to corroborate their findings. Data from Georgia are used in this study. A Bayesian modeling approach with non-informative priors is adopted, using sampling-based estimation via Markov Chain Monte Carlo (MCMC) and the Gibbs sampler. A total of eight model specifications were developed; four of them employed traffic flows as explanatory factors in mean structure while the remainder of them included geometric factors in addition to major and minor road traffic flows. The models were compared and contrasted using the significance of coefficients, standard deviance, chi-square goodness-of-fit, and deviance information criteria (DIC) statistics. 
The findings indicate that the modeling of the dispersion parameter, which essentially explains the extra-variance structure, depends greatly on how the mean structure is modeled. In the presence of a well-defined mean function, the extra-variance structure generally becomes insignificant, i.e. the variance structure is a simple function of the mean. It appears that extra-variation is a function of covariates when the mean structure (expected crash count) is poorly specified and suffers from omitted variables. In contrast, when sufficient explanatory variables are used to model the mean (expected crash count), extra-Poisson variation is not significantly related to these variables. If these results are generalizable, they suggest that model specification may be improved by testing extra-variation functions for significance. They also suggest that known influences of expected crash counts are likely to be different from factors that might help to explain unaccounted for variation in crashes across sites.
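A covariate-dependent dispersion parameter can be sketched minimally as follows, using simulated data rather than the Georgia data, and maximum likelihood rather than the paper's MCMC/Gibbs approach: both the mean and the dispersion of a negative binomial crash count are modelled as log-linear functions of a flow covariate. All coefficient values here are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)
n = 500
flow = rng.uniform(1.0, 3.0, n)          # stand-in for log traffic flow
mu_true = np.exp(-1.0 + 1.2 * flow)      # mean crash count
alpha_true = np.exp(-0.5 + 0.4 * flow)   # dispersion varies with flow
# negative binomial counts via the gamma-Poisson mixture
y = rng.poisson(rng.gamma(shape=1.0 / alpha_true, scale=mu_true * alpha_true))

def nb_loglik(y, mu, alpha):
    """NB log-likelihood with mean mu and variance mu + alpha * mu**2."""
    r = 1.0 / alpha
    return (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
            + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))

def negll(theta):
    b0, b1, g0, g1 = theta
    mu = np.exp(np.clip(b0 + b1 * flow, -20, 20))     # mean structure
    alpha = np.exp(np.clip(g0 + g1 * flow, -20, 20))  # dispersion structure
    return -nb_loglik(y, mu, alpha).sum()

fit = minimize(negll, x0=np.zeros(4), method="BFGS")
```

Testing whether `g1` differs from zero is the frequentist analogue of the extra-variation significance test the abstract recommends; with a well-specified mean it should tend toward insignificance.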
Abstract:
Statisticians along with other scientists have made significant computational advances that enable the estimation of formerly complex statistical models. The Bayesian inference framework combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler enable the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of the potentially limiting assumptions of MNL such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thus relaxing the usual limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach with many transportation applications. This paper then concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
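As a sketch of the Bayesian MNL machinery described above (using random-walk Metropolis rather than the paper's Gibbs sampler, a single travel-time coefficient, and simulated route-choice data), the posterior of the coefficient under a vague normal prior can be sampled as follows; every number here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_alt = 400, 3                          # travellers choosing among 3 routes
tt = rng.uniform(10.0, 30.0, (n_obs, n_alt))   # travel times, the only attribute
beta_true = -0.15
utility = beta_true * tt + rng.gumbel(size=(n_obs, n_alt))
choice = utility.argmax(axis=1)                # observed route choices

def loglik(beta):
    v = beta * tt
    v = v - v.max(axis=1, keepdims=True)       # stabilise the softmax
    p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)
    return np.log(p[np.arange(n_obs), choice]).sum()

def log_post(beta):
    return loglik(beta) - 0.5 * beta ** 2 / 100.0  # N(0, 10^2) prior

beta, lp, draws = 0.0, log_post(0.0), []
for it in range(4000):
    prop = beta + 0.02 * rng.normal()          # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        beta, lp = prop, lp_prop
    if it >= 1000:                             # discard burn-in
        draws.append(beta)
post_mean = float(np.mean(draws))
```

The posterior mean recovers the travel-time coefficient, and the same skeleton extends to the prior-information, nonlinear-effect, and random-parameter specifications the paper discusses.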
Abstract:
The term “cloud computing” has emerged as a major ICT trend and has been acknowledged by respected industry survey organizations as a key technology and market development theme for the industry and ICT users in 2010. However, one of the major challenges that faces the cloud computing concept and its global acceptance is how to secure and protect the data and processes that are the property of the user. The security of the cloud computing environment is a new research area requiring further development by both the academic and industrial research communities. Today, there are many diverse and uncoordinated efforts underway to address security issues in cloud computing and, especially, the identity management issues. This paper introduces an architecture for a new approach to necessary “mutual protection” in the cloud computing environment, based upon a concept of mutual trust and the specification of definable profiles in vector matrix form. The architecture aims to achieve better, more generic and flexible authentication, authorization and control, based on a concept of mutuality, within that cloud computing environment.
Abstract:
Groundwater is increasingly recognised as an important yet vulnerable natural resource, and a key consideration in water cycle management. However, communication of sub-surface water system behaviour, as an important part of encouraging better water management, is visually difficult. Modern 3D visualisation techniques can be used to effectively communicate these complex behaviours to engage and inform community stakeholders. Most software developed for this purpose is expensive and requires specialist skills. The Groundwater Visualisation System (GVS) developed by QUT integrates a wide range of surface and sub-surface data, to produce a 3D visualisation of the behaviour, structure and connectivity of groundwater/surface water systems. Surface data (elevation, surface water, land use, vegetation and geology) and data collected from boreholes (bore locations and subsurface geology) are combined to visualise the nature, structure and connectivity of groundwater/surface water systems. Time-series data (water levels, groundwater quality, rainfall, stream flow and groundwater abstraction) is displayed as an animation within the 3D framework, or graphically, to show water system condition changes over time. GVS delivers an interactive, stand-alone 3D Visualisation product that can be used in a standard PC environment. No specialised training or modelling skills are required. The software has been used extensively in the SEQ region to inform and engage both water managers and the community alike. Examples will be given of GVS visualisations developed in areas where there have been community concerns around groundwater over-use and contamination.
Abstract:
China has made great progress in constructing comprehensive legislative and judicial infrastructures to protect intellectual property rights. But levels of enforcement remain low. Estimates suggest that 90% of film and music products consumed in China are ‘pirated’ and in 2009 81% of the infringing goods seized at the US border originated from China. Despite heavy criticism over its failure to enforce IPRs, key areas of China’s creative industries, including film, mobile-music, fashion and animation, are developing rapidly. This paper explores how the rapid expansion of China’s creative economy might be reconciled with conceptual approaches that view the CIs in terms of creativity inputs and IP outputs. It argues that an evolutionary understanding of copyright’s role in creative innovation might better explain China’s experiences and provide more general insights into the nature of the creative industries and the policies most likely to promote growth in this sector of the economy.
Abstract:
In this paper we discuss an advanced, 3D groundwater visualisation and animation system that allows scientists, government agencies and community groups to better understand the groundwater processes that affect community planning and decision-making. The system is unique in that it has been designed to optimise community engagement. Although it incorporates a powerful visualisation engine, this open-source system can be freely distributed and boasts a simple user interface allowing individuals to run and investigate the models on their own PCs and gain intimate knowledge of the groundwater systems. The initial version of the Groundwater Visualisation System (GVS v1.0) was developed for a coastal delta setting (Bundaberg, QLD), and then applied to a basalt catchment area (Obi Obi Creek, Maleny, QLD). Several major enhancements have been developed to produce higher quality visualisations, including display of more types of data, support for larger models and improved user interaction. The graphics and animation capabilities have also been enhanced, notably the display of boreholes, depth logs and time-series water level surfaces. The GVS software remains under continual development and improvement.
Abstract:
This paper reports on the development of specifications for an on-board mass monitoring (OBM) application for regulatory requirements in Australia. An earlier paper reported on the feasibility study and pilot testing program conducted prior to the specification development [1]. Learnings from the pilot were used to refine the testing process, and a full scale testing program was conducted from July to October 2008. The results from the full scale test and their evidentiary implications are presented in this report. The draft specification for an evidentiary on-board mass monitoring application is currently under development.
Abstract:
This paper proposes a novel peak load management scheme for rural areas. The scheme transfers certain customers onto local nonembedded generators during peak load periods to alleviate network under voltage problems. This paper develops and presents this system by way of a case study in Central Queensland, Australia. A methodology is presented for determining the best location for the nonembedded generators as well as the number of generators required to alleviate network problems. A control algorithm to transfer and reconnect customers is developed to ensure that the network voltage profile remains within specification under all plausible load conditions. Finally, simulations are presented to show the performance of the system over a typical maximum daily load profile with large stochastic load variations.
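The transfer/reconnect logic might be sketched as a simple hysteresis rule. This is an assumption for illustration only; the paper's actual control algorithm and the voltage limits used are not reproduced here:

```python
V_MIN = 0.94          # assumed per-unit lower voltage limit
HYSTERESIS = 0.02     # assumed band to avoid rapid switching

def control_step(voltage_pu, on_generator):
    """Decide whether a customer is supplied by the local non-embedded
    generator for the next interval (simplified hysteresis sketch)."""
    if not on_generator and voltage_pu < V_MIN:
        return True                      # transfer to local generator
    if on_generator and voltage_pu >= V_MIN + HYSTERESIS:
        return False                     # reconnect to the network
    return on_generator                  # otherwise hold current state
```

The hysteresis band means a customer transferred during an undervoltage event is not reconnected until the network voltage has recovered with margin, which keeps the profile within limits under stochastic load swings.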
Abstract:
As various contributors to this volume suggest, the term soft power is multifaceted. In 2002 Joseph Nye, the political scientist who coined the term more than a decade previously, noted that the soft power of a country rests on three resources: a country’s culture, its political values, and its foreign policies (Nye 2002). However, several factors can be drawn together to explain China’s adoption of this concept. First, China’s economic influence has precipitated a groundswell of nationalism, which reached its apex at the Opening Ceremony of the 2008 Beijing Olympics. This global media event provided an international platform to demonstrate China’s new-found self-confidence. Second, cultural diplomacy and foreign aid, particularly through Third World channels, is seen by the Chinese Communist Party leadership as an appropriate way to extend Chinese influence globally (Kurlantzick 2007). Third, education in Chinese culture through globally dispersed Confucius Institutes is charged with improving international understanding of Chinese culture and values, and in the process renovating negative images of China. Fourth, the influence of Japanese and Korean popular culture on China’s youth cultures in recent years has caused acute discomfort to cultural nationalists. Many contend it is time to stem the tide. Fifth, the past few years have witnessed a series of lively debates about the importance of industries such as design, advertising, animation and fashion, resulting in the construction of hundreds of creative clusters, animation centres, film backlots, cultural precincts, design centres and artist lofts.
Abstract:
This full day workshop invites participants to consider the nexus where the interests of game design, the expectations of play and HCI meet: the game interface. Game interfaces seem different from the interfaces of other software, and a number of observations have been made about this. Shneiderman famously noticed that while most software designers are intent on following the tenets of the “invisible computer” and making access easy for the user, game interfaces are made for players: they embed challenge. Schell discusses a “strange” relationship between the player and the game enabled by the interface, and user interface designers frequently opine that much can be learned from the design of game interfaces. So where does the game interface actually sit? Even more interesting is the question as to whether the history of the relationship and subsequent expectations are now limiting the potential of game design as an expressive form. Recent innovations in I/O design such as Nintendo’s Wii, Sony’s Move and Microsoft's Kinect seem to usher in an age of physical player-enabled interaction, experience and embodied, engaged design. This workshop intends to cast light on this often mentioned and sporadically examined area and to establish a platform for new and innovative design in the field.