941 results for image noise modeling
Abstract:
This paper presents two novel nonlinear models of U-shaped anti-roll tanks for ships, together with their linearizations. In addition, a third, simplified nonlinear model is presented. The models are derived using Lagrangian mechanics. This formulation not only simplifies the modeling process but also yields models that satisfy energy-related physical properties. The proposed nonlinear models and their linearizations are validated using model-scale experimental data. Unlike other models in the literature, the nonlinear models in this paper are valid for large roll amplitudes. Even at moderate roll angles, the mean square error of the nonlinear models relative to experimental data is three orders of magnitude lower than that of the linear models.
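To make the Lagrangian formulation above concrete, here is a generic sketch of the standard technique (our illustration, not the paper's actual derivation): with generalized coordinates collecting, say, the roll angle and a tank-fluid coordinate, the equations of motion follow from the Euler–Lagrange equations.

```latex
% Generic Euler–Lagrange equations. L = T - V is the Lagrangian
% (kinetic minus potential energy); q_i might collect, e.g., the roll
% angle and a tank-fluid coordinate, and Q_i the non-conservative
% generalized forces (damping, wave excitation). These coordinate and
% force choices are assumptions for illustration only.
\frac{\mathrm{d}}{\mathrm{d}t}\!\left(\frac{\partial L}{\partial \dot{q}_i}\right)
  - \frac{\partial L}{\partial q_i} = Q_i , \qquad i = 1, \dots, n
```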
Abstract:
Process models are usually depicted as directed graphs, with nodes representing activities and directed edges representing control flow. While structured processes with pre-defined control flow have been studied in detail, flexible processes including ad-hoc activities need further investigation. This paper presents the flexible process graph, a novel approach to modeling processes in the context of a dynamic environment and adaptive process participants' behavior. The approach allows execution constraints to be defined that are more restrictive than traditional ad-hoc processes and less restrictive than traditional control flow, thereby balancing structured control flow with unstructured ad-hoc activities. The flexible process graph focuses on what can be done to perform a process; process participants' routing decisions are based on the current process state. As a formal grounding, the approach uses hypergraphs, in which each edge can associate any number of nodes. Hypergraphs are used to define the execution semantics of processes formally. We provide a process scenario to motivate and illustrate the approach.
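For readers unfamiliar with hypergraphs, the sketch below illustrates the basic structure the paper builds on: edges that associate any number of activity nodes, with routing decided from the current process state. The class, method names, and the toy enabling rule are our own assumptions, not the paper's formal semantics.

```python
# Minimal hypergraph sketch: each hyperedge associates any number of nodes.
# Illustrates the structure only; the paper's execution constraints and
# routing semantics are not reproduced here.
from dataclasses import dataclass, field

@dataclass
class Hypergraph:
    nodes: set = field(default_factory=set)    # activities
    edges: list = field(default_factory=list)  # each edge: a frozenset of nodes

    def add_edge(self, *activities):
        edge = frozenset(activities)
        self.nodes |= edge
        self.edges.append(edge)

    def enabled(self, state):
        # Toy routing rule: an edge is 'enabled' if any of its activities
        # has already been performed in the current process state.
        return [e for e in self.edges if e & state]

g = Hypergraph()
g.add_edge("draft", "review")
g.add_edge("review", "approve", "archive")
print(g.enabled({"review"}))  # both edges touch 'review'
```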
Abstract:
Light of Extinction presents a diverse series of views into the complex antics of a semi-autonomous gaggle of robotic actants. Audiences initially enter the 'backend' of the experience, to be rudely confronted with the raw, messy operations of a horde of object-manipulating robotic forms. Seen through viewing apertures, these 'things' deny any opportunity to grasp their imagined order. Audiences then flow on into the 'front end' of the work, where, seen through another aperture, the very same forms seemingly coordinate a stunning deep-field choreography, floating lusciously within inky landscapes of media, noise and embodied sound. As one series of conceptions slips into extinction, others flow on in. The idea of the 'extinction of human experience' expresses a projected fear for that which will disappear when biodiverse worlds have descended into an era of permanent darkness. 'Light of Extinction' repositions this anthropocentric lament to suggest a more rounded acknowledgement of what might still remain, pointing to the previously unacknowledged power and place of autonomous, synthetic creation. Momentary disbelief gives way to a relieving celebration of the imagined birth of 'things', without need for staples such as conventional light or the harmonious lullabies of long-extinguished sounds.
Abstract:
This work is a new form of media installation combining image, multi-channel sound and internally lit objects into a mysterious, deep image plane. Staged on the very edge of spectrum blackout, and moving into the deep of night, Version 1 (Night Rage) for ISEA 2013 examined the many shades of 'nocturnal', threats to night biodiversity, and the myriad myths and stories that have shaped our cultural understandings of life after light. Barely recognisable images float within landscapes of media, noise and sound as the work asserts a profound resistance to today's all-consuming media mesh. Version 2 (Night Fall) for the Queensland State Museum examined contemporary ideas around the 'night' and the 'nocturnal'. Beginning with the dark myths and stories that have long shaped our cultural understandings of life after light, NIGHT FALL considers how fearful ideas have often underpinned actions that continue to reduce Australia's extraordinary night biodiversity. Today's growing hostility towards Australia's ancient, iconic flying foxes, who have been quietly pollinating our forests for millennia, hints at just how far we have yet to travel in our thinking. Enter the darkened tunnel to experience mysterious, edge-of-perception 3D forms, enhanced by a range of cinematic, illusionary and animatronic techniques, and become immersed in a strangely familiar soundtrack based upon seasonal field recordings made after dark, sourced from across the eastern coast of Queensland.
Abstract:
Lean construction and building information modeling (BIM) are quite different initiatives, but both are having profound impacts on the construction industry. A rigorous analysis of the myriad specific interactions between them indicates that a synergy exists which, if properly understood in theoretical terms, can be exploited to improve construction processes beyond the degree to which they might be improved by applying either paradigm independently. Using a matrix that juxtaposes BIM functionalities with prescriptive lean construction principles, 56 interactions have been identified, all but four of which represent constructive interaction. Although evidence for the majority of these has been found, the matrix is not considered complete but rather a framework for research to explore the degree of validity of the interactions. Construction executives, managers, designers, and developers of information technology systems for construction can also benefit from the framework as an aid to recognizing the potential synergies when planning their lean and BIM adoption strategies.
Abstract:
Unstable density-driven flow can lead to enhanced solute transport in groundwater. Only recently has the complex fingering pattern associated with free convection been documented in field settings. Electrical resistivity (ER) tomography has been used to capture a snapshot of convective instabilities at a single point in time, but a thorough transient analysis is still lacking in the literature. We present the results of a 2-year experimental study at a shallow aquifer in the United Arab Emirates that was designed specifically to explore the transient nature of free convection. ER tomography data documented the presence of convective fingers following a significant rainfall event. We demonstrate that the complex fingering pattern had completely disappeared a year after the rainfall event. This observation is supported by an analysis of the aquifer halite budget and hydrodynamic modeling of the transient character of the fingering instabilities. Modeling results show that the transient dynamics of the gravitational instabilities (their initial development, infiltration into the underlying lower-density groundwater, and subsequent decay) are in agreement with the timing observed in the time-lapse ER measurements. All experimental observations and modeling results are consistent with the hypothesis that a dense brine that infiltrated into the aquifer from a surficial source was the cause of free convection at this site, and that the finite nature of the dense brine source and dispersive mixing led to the decay of the instabilities with time. This study highlights the importance of the transience of free convection phenomena and suggests that these processes are more rapid than was previously understood.
Abstract:
Brisbane City Hall (BCH) is arguably one of Brisbane's most notable and iconic buildings. Having served as the city's central civic and municipal building since 1930, this heritage-listed building is of unquestionable cultural significance and importance to local identity. This is reflected within the local government: a simplified image of the hall's main portico entrance supplies Brisbane City Council with its insignia and trademark signifier. Despite these qualities, the building has been neglected in a number of ways, primarily physically, through the deterioration of built materials, but also, just as importantly, through inaccurate and undocumented works. Numerous restoration and renovation works have been undertaken throughout BCH's lifetime; however, records of these amendments are few and far between. Between 2010 and 2013, BCH underwent major restoration works, the largest project undertaken on the building since its initial construction. Just prior to this conservation process, the full extent of the building's deterioration was identified, much of which had little to no original documentation. This led to a number of discrepancies between what investigators expected to find within the building and what was actually uncovered, resulting directly from this lack of data. This absence of record keeping is the key factor that contributed to the decay and unknown deficiencies that had amassed within BCH. Accordingly, this raises a debate about methods of record keeping and the need for a more advanced process that can be integrated within architectural and engineering programs while still maintaining the ability to act as a standalone database. The immediate objective of this research is to investigate the restoration process of BCH, with a focus on the auditorium, to evaluate possible strategies to record and manage data connected to building pathology, so that a framework can be developed for a digital heritage management system. The framework produced for this digital tool will enable dynamic uses of a centralised database and aims to reduce significant data loss. Following an in-depth analysis of this framework, it can be concluded that implementing the suggested digital tool would directly benefit BCH and could ultimately be applied to a number of other heritage-related built forms.
Abstract:
Designing systems for multiple stakeholders requires frequent collaboration with all of them from the start. In many cases, at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design, where non-formal techniques supported strong collaboration, resulting in a deep understanding of requirements and of the feasibility of solutions.
Abstract:
Motivation: Task analysis for designing modern collaborative work needs a finer-grained approach, especially in a complex task domain, such as collaborative scientific authoring, where a single overall goal can be accomplished only by collaboration between multiple roles, each requiring its own expertise. We analyzed and reconsidered roles, activities, and objects when designing for complex collaboration contexts. Our main focus is on a generic approach to designing for multiple roles and subtasks in a domain with a shared overall goal, which requires a detailed approach. Collaborative authoring is our current example. This research is incremental: an existing task analysis approach (GTA) is reconsidered by applying it to a case of complex collaboration. Our analysis shows that designing for collaboration indeed requires a refined approach to task modeling: GTA will, in future, need to consider tasks at the lowest level that can be delegated or mandated. These tasks need to be analyzed and redesigned in more detail, along with the relevant task objects.
Abstract:
Process choreographies describe interactions between different business partners and the dependencies between these interactions. While different proposals have been made for capturing choreographies at an implementation level, it remains unclear how choreographies should be described at a conceptual level. While the Business Process Modeling Notation (BPMN) is already in use for describing choreographies in terms of interconnected interface behavior models, this paper introduces interaction modeling using BPMN. Such interaction models do not suffer from incompatibility issues and are better suited for human modelers. BPMN extensions are proposed, and a mapping from interaction models to interface behavior models is presented.
Abstract:
This paper presents a novel framework to further advance the recent trend of using query decomposition and high-order term relationships in query language modeling, which takes into account terms implicitly associated with different subsets of query terms. Existing approaches, most notably the language model based on the Information Flow method, are unable to capture multiple levels of association and also suffer from high computational overhead. In this paper, we propose to compute association rules from pseudo-feedback documents that are segmented into variable-length chunks via multiple sliding windows of different sizes. Extensive experiments have been conducted on various TREC collections, and our approach significantly outperforms a baseline Query Likelihood language model, the Relevance Model, and the Information Flow model.
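As a rough illustration of the chunking idea (the window sizes, the simple co-occurrence counting used as a stand-in for full association-rule mining, and the support threshold are all our assumptions, not the paper's method), one could segment feedback documents as follows:

```python
# Hedged sketch: segment pseudo-feedback documents into variable-length
# chunks with multiple sliding windows, then count term co-occurrences
# as a simplified proxy for association-rule mining.
from collections import Counter
from itertools import combinations

def chunks(tokens, window_sizes=(4, 8, 16)):
    # Slide windows of several sizes over the token stream.
    for w in window_sizes:
        for i in range(max(1, len(tokens) - w + 1)):
            yield tokens[i:i + w]

def cooccurrence_rules(docs, min_support=2):
    counts = Counter()
    for doc in docs:
        for chunk in chunks(doc.split()):
            for pair in combinations(sorted(set(chunk)), 2):
                counts[pair] += 1
    # Keep only term pairs seen at least min_support times.
    return {pair: c for pair, c in counts.items() if c >= min_support}

docs = ["query terms often appear near related terms",
        "related terms expand the original query terms"]
print(cooccurrence_rules(docs))
```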
Abstract:
Recent advances suggest that encoding images through Symmetric Positive Definite (SPD) matrices and then interpreting such matrices as points on Riemannian manifolds can lead to increased classification performance. Taking into account manifold geometry is typically done via (1) embedding the manifolds in tangent spaces, or (2) embedding into Reproducing Kernel Hilbert Spaces (RKHS). While embedding into tangent spaces allows the use of existing Euclidean-based learning algorithms, manifold shape is only approximated, which can cause loss of discriminatory information. The RKHS approach retains more of the manifold structure, but may require non-trivial effort to kernelise Euclidean-based learning algorithms. In contrast to the above approaches, in this paper we offer a novel solution that allows SPD matrices to be used with unmodified Euclidean-based learning algorithms, with the true manifold shape well-preserved. Specifically, we propose to project SPD matrices using a set of random projection hyperplanes over RKHS into a random projection space, which leads to representing each matrix as a vector of projection coefficients. Experiments on face recognition, person re-identification and texture classification show that the proposed approach outperforms several recent methods, such as Tensor Sparse Coding, Histogram Plus Epitome, Riemannian Locality Preserving Projection and Relational Divergence Classification.
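A minimal sketch of the random-projection idea, under our own assumptions (a log-Euclidean RBF kernel between SPD matrices, and hyperplanes formed as random combinations of anchor matrices; the paper's exact kernel and sampling scheme may differ):

```python
# Hedged sketch: project SPD matrices via kernel evaluations against
# random RKHS hyperplanes, yielding a vector of projection coefficients.
import numpy as np
from scipy.linalg import logm

def log_euclidean_dist(A, B):
    # Frobenius distance between matrix logarithms of SPD matrices.
    return np.linalg.norm(logm(A) - logm(B), 'fro')

def rbf(A, B, sigma=1.0):
    d = log_euclidean_dist(A, B)
    return np.exp(-d**2 / (2 * sigma**2))

def project(X, anchors, weights):
    # Each hyperplane is a random combination of anchor SPD matrices;
    # the projection coefficient is a weighted sum of kernel values.
    return np.array([sum(w_j * rbf(A_j, X) for w_j, A_j in zip(w, anchors))
                     for w in weights])

rng = np.random.default_rng(0)
def random_spd(n=3):
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)  # well-conditioned SPD matrix

anchors = [random_spd() for _ in range(5)]
weights = rng.standard_normal((10, 5))  # 10 random hyperplanes
x = random_spd()
print(project(x, anchors, weights))     # 10-dim coefficient vector
```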
Abstract:
Existing multi-model approaches for image set classification extract local models by clustering each image set individually only once, with fixed clusters used for matching with other image sets. However, this may result in the two closest clusters representing different characteristics of an object, due to differing undesirable environmental conditions (such as variations in illumination and pose). To address this problem, we propose to constrain the clustering of each query image set by forcing the clusters to resemble the clusters in the gallery image sets. We first define a Frobenius norm distance between subspaces over Grassmann manifolds based on reconstruction error. We then extract local linear subspaces from a gallery image set via sparse representation. For each local linear subspace, we adaptively construct the corresponding closest subspace from the samples of a probe image set by joint sparse representation. We show that by minimising the sparse representation reconstruction error, we approach the nearest point on a Grassmann manifold. Experiments on Honda, ETH-80 and Cambridge-Gesture datasets show that the proposed method consistently outperforms several other recent techniques, such as Affine Hull based Image Set Distance (AHISD), Sparse Approximated Nearest Points (SANP) and Manifold Discriminant Analysis (MDA).
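To illustrate the kind of subspace comparison involved (this is a generic Grassmann-distance sketch, not the paper's joint sparse representation method), an image set can be reduced to an orthonormal basis and compared via the projection Frobenius norm, which reflects reconstruction error:

```python
# Hedged sketch: represent each image set by a local linear subspace
# (orthonormal basis from an SVD) and compare subspaces on the
# Grassmann manifold via the projection Frobenius norm.
import numpy as np

def subspace_basis(X, dim=3):
    # Columns of X are vectorised images; the SVD gives an orthonormal basis.
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :dim]

def projection_frobenius_dist(U1, U2):
    # d(U1, U2) = || U1 U1^T - U2 U2^T ||_F
    return np.linalg.norm(U1 @ U1.T - U2 @ U2.T, 'fro')

rng = np.random.default_rng(0)
gallery = rng.standard_normal((100, 20))                # 20 samples, 100-dim
probe = gallery + 0.1 * rng.standard_normal((100, 20))  # perturbed set
print(projection_frobenius_dist(subspace_basis(gallery),
                                subspace_basis(probe)))
```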
Abstract:
Mammographic density (MD) adjusted for age and body mass index (BMI) is a strong heritable breast cancer risk factor; however, its biological basis remains elusive. Previous studies assessed MD-associated histology using random sampling approaches, despite evidence that high and low MD areas exist within a breast and are negatively correlated with respect to one another. We have used an image-guided approach to sample high and low MD tissues from within individual breasts to examine the relationship between histology and degree of MD. Image-guided sampling was performed using two different methodologies on mastectomy tissues (n = 12): (1) sampling of high and low MD regions within a slice, guided by bright (high MD) and dark (low MD) areas in a slice X-ray film; (2) sampling of high and low MD regions within a whole breast using a stereotactically guided vacuum-assisted core biopsy technique. Pairwise analysis accounting for potential confounders (e.g. age, BMI and menopausal status) provides appropriate power for analysis despite the small sample size. High MD tissues had higher stromal (P = 0.002) and lower fat (P = 0.002) compositions, but no evidence of a difference in glandular areas (P = 0.084), compared to low MD tissues from the same breast. High MD regions had higher relative gland counts (P = 0.023), and a preponderance of Type I lobules in high MD compared to low MD regions was observed in 58% of subjects (n = 7), although this did not achieve significance. These findings clarify the histologic nature of high MD tissue and support hypotheses regarding the biophysical impact of dense connective tissue on mammary malignancy. They also provide important terms of reference for ongoing analyses of the underlying genetics of MD.
Abstract:
This thesis investigates the fusion of 3D visual information with 2D image cues to provide 3D semantic maps of the large-scale environments that a robot traverses in robotic applications. A major theme of this thesis is to exploit the availability of 3D information acquired from robot sensors to improve upon 2D object classification alone. The proposed methods have been evaluated on several indoor and outdoor datasets collected from mobile robotic platforms, including a quadcopter and a ground vehicle, covering several kilometres of urban roads.