984 results for Complex Geometry
Abstract:
This paper discusses the vibration characteristics of concrete-steel composite multi-panel floor structures, whose use is becoming increasingly common. These structures have many desirable properties but are prone to excessive and complex vibration, which is not yet well understood. Existing design codes and practice guides provide only generic advice or simple techniques that cannot address the complex vibration of these low-frequency structures. The results of this study show the potential for an adverse dynamic response from higher-mode and multi-modal excitation influenced by human-induced pattern loading, structural geometry, and activity frequency. Higher harmonics of the load frequency can excite higher modes of the composite floor structure in addition to its fundamental mode. The analytical techniques used in this paper can supplement the limited provisions of current codes and practice guides for mitigating the impact of human-induced vibrations in these floor structures.
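The resonance mechanism described in this abstract, where higher harmonics of a pacing frequency coincide with higher structural modes, can be illustrated with a small check. This is a hypothetical sketch, not from the paper; the function name, frequencies, and tolerance are all made-up illustrations.

```python
def resonant_harmonics(step_hz, modal_hz, n_harmonics=4, tol=0.1):
    """Return (harmonic, modal frequency) pairs where a load harmonic
    falls within a relative tolerance `tol` of a natural frequency.

    Illustrative only: real assessments also weight each harmonic by
    its dynamic load factor and consider damping.
    """
    hits = []
    for h in range(1, n_harmonics + 1):
        f_h = h * step_hz           # frequency of the h-th load harmonic
        for fm in modal_hz:
            if abs(f_h - fm) / fm <= tol:
                hits.append((h, fm))
    return hits
```

For a 2.0 Hz walking frequency and hypothetical modes at 4.0 Hz and 8.5 Hz, the second and fourth harmonics land on or near a mode, which is the multi-modal excitation pattern the abstract describes.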
Abstract:
With the advent of live cell imaging microscopy, new types of mathematical analyses and measurements are possible. Many real-time movies of cellular processes are visually compelling, but elementary analysis of changes over time in quantities such as surface area and volume often shows that there is more to the data than meets the eye. This unit outlines a geometric modeling methodology and applies it to the tubulation of vesicles during endocytosis. Using these principles, it has been possible to build better qualitative and quantitative understandings of the systems observed, and to make predictions about quantities such as ligand or solute concentration, vesicle pH, and the amount of membrane trafficked. The purpose is to outline a methodology for analyzing real-time movies that has led to a greater appreciation of the changes occurring during the time frame of real-time video microscopy, and of how additional quantitative measurements allow further hypotheses to be generated and tested.
Abstract:
Almost all metapopulation modelling assumes that connectivity between patches is only a function of distance, and is therefore symmetric. However, connectivity will not depend only on the distance between the patches, as some paths are easy to traverse while others are difficult. When colonising organisms interact with the heterogeneous landscape between patches, connectivity patterns will invariably be asymmetric. There have been few attempts to theoretically assess the effects of asymmetric connectivity patterns on the dynamics of metapopulations. In this paper, we use the framework of complex networks to investigate whether metapopulation dynamics can be determined by directly analysing the asymmetric connectivity patterns that link the patches. Our analyses focus on “patch occupancy” metapopulation models, which only consider whether a patch is occupied or not. We propose three easily calculated network metrics: the “asymmetry” and “average path strength” of the connectivity pattern, and the “centrality” of each patch. Together, these metrics can be used to predict the length of time a metapopulation is expected to persist, and the relative contribution of each patch to a metapopulation’s viability. Our results clearly demonstrate the negative effect that asymmetry has on metapopulation persistence. Complex network analyses represent a useful new tool for understanding the dynamics of species in fragmented landscapes, particularly those in large metapopulations.
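The metrics named in this abstract can be sketched on a connectivity matrix, where entry `c[i][j]` is the colonisation strength from patch `i` to patch `j`. These are simplified stand-in definitions for illustration; the paper's actual formulas for "asymmetry" and "centrality" may differ, and "average path strength" is omitted here.

```python
def asymmetry(c):
    """Mean normalised difference between forward and reverse
    connectivity over all patch pairs: 0 for a symmetric pattern,
    1 when every link is strictly one-way (assumed definition)."""
    n = len(c)
    diffs = []
    for i in range(n):
        for j in range(i + 1, n):
            total = c[i][j] + c[j][i]
            if total > 0:
                diffs.append(abs(c[i][j] - c[j][i]) / total)
    return sum(diffs) / len(diffs) if diffs else 0.0

def centrality(c):
    """Per-patch total of in- and out-connectivity, a crude proxy
    for a patch's contribution to metapopulation viability."""
    n = len(c)
    return [sum(c[i][j] + c[j][i] for j in range(n)) for i in range(n)]
```

Under these stand-in definitions, a symmetric two-patch system scores an asymmetry of 0, while a purely one-way link scores 1, matching the qualitative distinction the abstract draws.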
Abstract:
Objective: This paper describes the first phase of a larger project that utilizes participatory action research to examine complex mental health needs across an extensive group of stakeholders in the community. Method: Within an objective qualitative analysis of focus group discussions, the social ecological model is utilized to explore how integrative activities can be informed, planned and implemented across multiple elements and levels of a system. Seventy-one primary care workers, managers, policy-makers, consumers and carers from across the southern metropolitan and Gippsland regions of Victoria, Australia took part in seven focus groups. All groups responded to an identical set of focusing questions. Results: Participants produced an explanatory model describing the service system, as it relates to people with complex needs, across the levels of social ecological analysis. Qualitative thematic analysis identified four priority areas to be addressed in order to improve the system's capacity for working with complexity: (i) system fragmentation; (ii) integrative case management practices; (iii) community attitudes; and (iv) money and resources. Conclusions: The emergent themes provide clues as to how complexity is constructed and interpreted across the system of involved agencies and interest groups. The implications these findings have for the development and evaluation of this community capacity-building project were examined from the perspective of constructing interventions that address both top-down and bottom-up processes.
Abstract:
Mismanagement of large-scale, complex projects has resulted in spectacular failures, cost overruns, time blowouts, and stakeholder dissatisfaction. We focus our discussion on the interaction of key management and leadership attributes that facilitate leaders’ adaptive behaviors. These behaviors should in turn influence adaptive team member behavior, stakeholder engagement and successful project outcomes, outputs and impacts. An understanding of this type of management will benefit from a perspective based in managerial and organizational cognition. The research question we explore is whether successful leaders of large-scale complex projects have an internal process leading to a display of administrative, adaptive, and enabling behaviors that foster adaptive processes and enabling behaviors within their teams and with external stakeholders. At the core of the model we propose interactions of key attributes, namely cognitive flexibility, affect, and emotional intelligence. The result of these cognitive-affective attribute interactions is leadership that enhances the likelihood of complex project success.
Abstract:
Queensland's new State Planning Policy for Coastal Protection, released in March and approved in April 2011 as part of the Queensland Coastal Plan, stipulates that local governments prepare and implement adaptation strategies for built-up areas projected to be subject to coastal hazards between the present day and 2100. Urban localities within the delineated coastal high hazard zone (as determined by models incorporating a 0.8 metre rise in sea level and a 10% increase in maximum cyclone activity) will be required to re-evaluate their plans to accommodate growth, revising land use plans to minimise the impacts of anticipated erosion and flooding on developed areas and infrastructure. While implementation of such strategies would aid in the avoidance or minimisation of risk exposure, communities are likely to face significant challenges in implementing them, especially as development in Queensland is so intensely focussed upon its coasts, with these new policies directing development away from highly desirable waterfront land. This paper examines models of planning theory to understand how we plan when faced with technically complex problems, towards formulating a framework for evaluating and improving practice.
Abstract:
This paper presents a “research frame” which we have found useful in analyzing complex socio-technical situations. The research frame is based on aspects of actor-network theory: “interessement”, “enrollment”, “points of passage” and the “trial of strength”. Each of these aspects is described in turn, making clear its purpose in the overall research frame. Having established the research frame, it is used to analyse two examples. First, the use of speech recognition technology is examined in two different contexts, showing how to apply the frame to compare and contrast current situations. Next, a current medical consultation context is described and the research frame is used to consider how it could change with innovative technology. In both examples, the research frame shows that the use of an artefact or technology must be considered together with the context in which it is used.
Abstract:
This paper presents an experiment designed to investigate whether redundancy in an interface has any impact on the use of complex interfaces by older people and people with little prior experience with technology. The important findings of this study were that older people (65+ years) completed the tasks on the words-only interface faster than on the redundant (text and symbols) interface, while the rest of the participants completed tasks significantly faster on the redundant interface. From a cognitive processing perspective, sustained attention (one of the functions of the Central Executive) emerged as one of the important factors in completing tasks on complex interfaces faster and with fewer errors.
Practical improvements to simultaneous computation of multi-view geometry and radial lens distortion
Abstract:
This paper discusses practical issues related to the use of the division model for lens distortion in multi-view geometry computation. A data normalisation strategy is presented, which has been absent from previous discussions on the topic. The convergence properties of the Rectangular Quadric Eigenvalue Problem solution for computing division model distortion are examined. It is shown that the existing method can require more than 1000 iterations when dealing with severe distortion. A method is presented for accelerating convergence to less than 10 iterations for any amount of distortion. The new method is shown to produce equivalent or better results than the existing method with up to two orders of magnitude reduction in iterations. Through detailed simulation it is found that the number of data points used to compute geometry and lens distortion has a strong influence on convergence speed and solution accuracy. It is recommended that more than the minimal number of data points be used when computing geometry using a robust estimator such as RANSAC. Adding two to four extra samples improves the convergence rate and accuracy sufficiently to compensate for the increased number of samples required by the RANSAC process.
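The one-parameter division model this abstract refers to maps a distorted image point to its undistorted position by scaling the radius with a factor 1/(1 + λr²). A minimal sketch, assuming a single distortion parameter and a known distortion centre (the methods in the paper estimate distortion jointly with multi-view geometry, which this does not attempt):

```python
def undistort(x, y, lam, cx=0.0, cy=0.0):
    """Map a distorted point (x, y) to its undistorted position under
    the one-parameter division model with parameter `lam` and
    distortion centre (cx, cy). Barrel distortion has lam < 0 in this
    sign convention (conventions vary between papers)."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy          # squared radius from the centre
    s = 1.0 / (1.0 + lam * r2)      # division-model radial scale
    return cx + dx * s, cy + dy * s
```

With λ = 0 the mapping is the identity, which is why normalising the data (as the abstract recommends) matters: the magnitude of λ that a solver must reach depends directly on the coordinate scale.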
Abstract:
Smut fungi are important pathogens of grasses, including the cultivated crops maize, sorghum and sugarcane. Typically, smut fungi infect the inflorescence of their host plants. Three genera of smut fungi (Ustilago, Sporisorium and Macalpinomyces) form a complex with overlapping morphological characters, making species placement problematic. For example, the newly described Macalpinomyces mackinlayi possesses a combination of morphological characters such that it cannot be unambiguously accommodated in any of the three genera. Previous attempts to define Ustilago, Sporisorium and Macalpinomyces using morphology and molecular phylogenetics have highlighted the polyphyletic nature of the genera, but have failed to produce a satisfactory taxonomic resolution. A detailed systematic study of 137 smut species in the Ustilago-Sporisorium-Macalpinomyces complex was completed in the current work. Morphological and DNA sequence data from five loci were assessed with maximum likelihood and Bayesian inference to reconstruct a phylogeny of the complex. The phylogenetic hypotheses generated were used to identify morphological synapomorphies, some of which had previously been dismissed as useful characters for delimiting genera within the complex. These synapomorphic characters are the basis for a revised taxonomic classification of the Ustilago-Sporisorium-Macalpinomyces complex, which takes into account their morphological diversity and coevolution with their grass hosts. The new classification is based on a redescription of the type genus Sporisorium, and the establishment of four genera, described from newly recognised monophyletic groups, to accommodate species expelled from Sporisorium. Over 150 taxonomic combinations have been proposed as an outcome of this investigation, which makes a rigorous and objective contribution to the fungal systematics of these important plant pathogens.
Abstract:
Modelling an environmental process involves creating a model structure and parameterising the model with appropriate values to accurately represent the process. Determining accurate parameter values for environmental systems can be challenging. Existing methods for parameter estimation typically make assumptions regarding the form of the likelihood, and will often ignore any uncertainty around estimated values. This can be problematic, particularly in complex problems where likelihoods may be intractable. In this paper we demonstrate an Approximate Bayesian Computation (ABC) method for the estimation of parameters of a stochastic cellular automaton (CA). As an example, we use a CA constructed to simulate a range expansion such as might occur after a biological invasion, making parameter estimates using only count data such as could be gathered from field observations. We demonstrate that ABC is a highly useful method for parameter estimation, yielding accurate estimates of parameters that are important for the management of invasive species, such as the intrinsic rate of increase and the point in a landscape where a species has invaded. We also show that the method is capable of estimating the probability of long distance dispersal, a characteristic of biological invasions that is very influential in determining spread rates but has until now proved difficult to estimate accurately.
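The core idea of ABC as used in this abstract, namely estimating a parameter without a tractable likelihood by comparing simulated counts to observed counts, can be shown with a rejection sampler. This toy example is entirely made up (a Bernoulli occupancy simulator standing in for the paper's spread-model CA): draw a parameter from the prior, simulate, and keep the draw if the simulated count is close to the observed count.

```python
import random

random.seed(1)

def simulate(p, n=100):
    """Toy stochastic simulator: occupied-cell count out of n cells,
    standing in for a run of the spread-model CA."""
    return sum(random.random() < p for _ in range(n))

observed = 52                       # hypothetical field count
tolerance = 2                       # ABC acceptance threshold

accepted = []
for _ in range(5000):
    p = random.random()             # draw from a Uniform(0, 1) prior
    if abs(simulate(p) - observed) <= tolerance:
        accepted.append(p)          # keep draws that reproduce the data

# The accepted draws approximate the posterior; their mean is a
# point estimate of the occupancy parameter.
estimate = sum(accepted) / len(accepted)
```

Tightening `tolerance` sharpens the posterior approximation at the cost of fewer accepted draws, which is the central trade-off in rejection-based ABC.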
Abstract:
The research objectives of this thesis were to contribute to Bayesian statistical methodology by contributing to risk assessment statistical methodology, and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas, and use these applications as a springboard for developing new statistical methods as well as undertaking analyses which might give answers to particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems, namely risk assessment analyses for wastewater, and secondly, in a four-dimensional dataset, assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and again to use Bayesian hierarchical models to explore the necessarily complex modelling of four-dimensional agricultural data. The specific objectives of the research were to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure to incorporate all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day’s data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days’ data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations.
This work comprises five papers, two of which have been published, two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of that data as needing to be modelled as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for the incorporation of experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models are used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and non-structured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to see the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and estimation of the contrast of interest together with its credible intervals.
These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that with large datasets, the use of WinBUGS becomes more problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that moisture in the soil for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with increasing variances with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, this approach does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage being analysed as a set of time series. We see this spatio-temporal interaction model as being a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
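The Gibbs-sampling machinery this abstract builds on alternates draws from each parameter's full conditional distribution. A generic illustration on a standard bivariate normal with correlation ρ, not the CAR layered model itself, shows the mechanism and why term-by-term updating mixes slowly when parameters are highly correlated (the motivation given for block updating):

```python
import random

random.seed(0)
rho = 0.9                           # assumed correlation for illustration
sd = (1 - rho * rho) ** 0.5         # conditional standard deviation
x = y = 0.0
xs, ys = [], []

for _ in range(20000):
    # Full conditionals of a standard bivariate normal:
    x = random.gauss(rho * y, sd)   # x | y  ~  N(rho * y, 1 - rho^2)
    y = random.gauss(rho * x, sd)   # y | x  ~  N(rho * x, 1 - rho^2)
    xs.append(x)
    ys.append(y)

# Sample moments should recover the target distribution.
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / len(xs)
vx = sum((a - mx) ** 2 for a in xs) / len(xs)
vy = sum((b - my) ** 2 for b in ys) / len(ys)
corr = cov / (vx * vy) ** 0.5
```

The larger ρ is, the smaller each conditional step relative to the target, so successive draws are highly autocorrelated; updating correlated parameters jointly in blocks, as the fifth paper does, sidesteps exactly this.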
Abstract:
Emerging from the challenge to reduce energy consumption in buildings is the need for energy simulation to be used more effectively to support integrated decision making in early design. As a critical response to a Green Star case study, we present DEEPA, a parametric modeling framework that enables architects and engineers to work at the same semantic level to generate shared models for energy simulation. A cloud-based toolkit provides web and data services for parametric design software that automate the process of simulating and tracking design alternatives, by linking building geometry more directly to analysis inputs. Data, semantics, models and simulation results can be shared on the fly. This allows the complex relationships between architecture, building services and energy consumption to be explored in an integrated manner, and decisions to be made collaboratively.