235 results for analogy calculation
Abstract:
Power system dynamic analysis and security assessment are becoming more significant today due to increases in size and complexity arising from restructuring, emerging uncertainties, and the integration of renewable energy sources, distributed generation, and microgrids. Precise modelling of all contributing elements/devices, understanding their interactions in detail, and observing hidden dynamics using existing analysis tools/theorems are difficult, and sometimes impossible. In this chapter, the power system is considered as a continuum, and the electromechanical waves propagated by faults and other random events are studied to provide a new scheme for the stability investigation of a large-dimensional system. For this purpose, the electrical indices (such as rotor angle and bus voltage) measured at different points across the network following a fault are used, and the behaviour of the propagated waves through the lines, nodes, and buses is analyzed. The impact of weak transmission links on a progressive electromechanical wave is addressed using the energy function concept. It is also emphasized that determining the severity of a disturbance/contingency without considering the related electromechanical waves, hidden dynamics, and their properties is not sufficiently reliable. Accounting for these phenomena requires heavy, time-consuming calculation, which is not suitable for online stability assessment problems; using a continuum model for the power system, however, reduces the burden of these complex calculations.
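As a rough illustration of the continuum idea (not the chapter's actual model), rotor-angle disturbances in a spatially continuous power system are often idealized as a one-dimensional wave equation; the sketch below propagates a localized fault perturbation with a leapfrog finite-difference scheme, with all parameter values assumed purely for illustration.

```python
import numpy as np

# Illustrative continuum sketch: delta(x, t) obeys the 1D wave equation
# d2(delta)/dt2 = v**2 * d2(delta)/dx2, with a localized "fault" bump
# as the initial condition and periodic boundaries for simplicity.
L, N, v = 1000.0, 200, 500.0      # line length (km), grid points, wave speed (km/s); assumed
dx = L / N
dt = 0.5 * dx / v                 # CFL-stable time step (Courant number 0.5)
x = np.linspace(0.0, L, N, endpoint=False)
delta = np.exp(-(((x - L / 2) / 20.0) ** 2))  # fault: localized angle perturbation
prev = delta.copy()                           # zero initial angular velocity

for _ in range(300):              # leapfrog time stepping
    lap = np.roll(delta, -1) - 2 * delta + np.roll(delta, 1)
    prev, delta = delta, 2 * delta - prev + (v * dt / dx) ** 2 * lap

print(f"peak rotor-angle deviation after propagation: {delta.max():.3f}")
```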
Abstract:
Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, very few attempts have been made to explore structural damage using noise-polluted data, an unavoidable effect in the real world. Measurement data are contaminated by noise from the test environment as well as from electronic devices, and this noise tends to produce erroneous results with structural damage identification methods. It is therefore important to investigate a method that can perform better with noise-polluted data. This paper introduces a new damage index using principal component analysis (PCA) for damage detection of building structures, able to accept noise-polluted frequency response functions (FRFs) as input. The FRF data are obtained from the function datagen of the MATLAB program available on the web site of the IASC-ASCE (International Association for Structural Control – American Society of Civil Engineers) Structural Health Monitoring (SHM) Task Group. The proposed method involves a five-stage process: calculation of FRFs, calculation of damage index values using the proposed algorithm, development of the artificial neural networks, introduction of the damage indices as input parameters, and damage detection of the structure. This paper briefly describes the methodology and the results obtained in detecting damage in all six cases of the benchmark study at different noise levels. The proposed method is applied to a benchmark problem sponsored by the IASC-ASCE Task Group on Structural Health Monitoring, which was developed in order to facilitate the comparison of various damage identification methods. The results show that the PCA-based algorithm is effective for structural health monitoring with noise-polluted FRFs, a common occurrence when dealing with industrial structures.
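The paper's exact damage index is not reproduced in the abstract; as a minimal sketch of the general idea (comparing the principal subspaces of baseline and measured FRF matrices), the following Python example uses an assumed, illustrative index based on principal angles, not the paper's algorithm.

```python
import numpy as np

def pca_damage_index(frf_baseline, frf_measured, n_components=5):
    """Toy PCA-based damage index: compare the principal subspaces of
    baseline and measured FRF matrices (rows = frequency points,
    columns = measurement channels). The index definition here is an
    illustrative assumption, not the paper's exact algorithm."""
    def principal_subspace(X, k):
        Xc = X - X.mean(axis=0)                   # centre each channel
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Vt[:k].T                           # (channels, k) orthonormal basis

    U = principal_subspace(frf_baseline, n_components)
    V = principal_subspace(frf_measured, n_components)
    # Cosines of principal angles between subspaces: identical subspaces
    # give singular values of 1, so the index is 0 for an undamaged state.
    s = np.linalg.svd(U.T @ V, compute_uv=False)
    return float(np.sum(1.0 - np.clip(s, 0.0, 1.0)))

# Example with synthetic noise-polluted FRF magnitudes
rng = np.random.default_rng(0)
baseline = rng.standard_normal((400, 16))
damaged = baseline.copy()
damaged[:, 3] *= 1.5                                   # simulate a local stiffness change
damaged += 0.05 * rng.standard_normal(damaged.shape)   # measurement noise
print(pca_damage_index(baseline, damaged))
```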
Abstract:
This two-storey office building and upper-floor interior fit-out, completed for the 25th anniversary of the Adelaide-based construction firm Badge Constructions, is a signature building for the client and its recently established Brisbane-based operations, and a showpiece for their commercial and industrial construction prowess and dynamic, collaborative and transparent work ethic. Situated in the industrial precinct of Bulimba's Oxford Street, the building is a continuation of the street's nearby commercial heart, whilst its architectural language references the adjacent industrial structures. The building's shed-like skillion roof and western wall have been considered as a folded plane, allowing space to be conceived as the inhabitation of the inner surface of this plane. The analogy of a lined garment, tailored to suit its wearer, clarifies the relationship between the western façade plane's unadorned, monochromatic outer surface and the coloured and patterned inner surface, celebrating inhabitation. Construction materials typically used externally are re-positioned as an integral part of the building's interior fit-out, alluding to Badge's construction repertoire and weakening traditional barriers between interior and exterior commercial space. In reference to its Queensland context, the external glazed line of the building is pulled back from the street, providing an eastern verandah edge and a northern court as part of the public realm. The upper-floor office incorporates a cantilevered outdoor mezzanine within the northern court, whilst the adjacent reception area and stairwell utilise clear glazing in order to visually connect to the street. The building is designed to take advantage of natural light to the east whilst shading habitable spaces from the north, a strategy that reduces solar heat gain and energy consumption. Placement of the building's amenities core to the west provides substantial bracing and allows maximum activation of the north and east street edges. A collaborative design process, involving an informed client and a consultant team of allied disciplines, has resulted in an affordable commercial building with a high level of design resolution and a strong relationship to its Brisbane context, while also challenging the traditional relationships between exterior and interior commercial space.
Abstract:
Purpose – DB clients play a vital role in the delivery of the DB system, and clients' competences are critical to the success of DB projects. Most DB clients, however, remain inexperienced with the DB system. This study therefore aims to identify the key competences that DB clients should possess to ensure the success of DB projects in the construction market of China. Design/Methodology/Approach – Five semi-structured face-to-face interviews and a two-round Delphi questionnaire survey were conducted in the construction market of China to identify the key competences of DB clients. Rankings were assigned to these key competences on the basis of their relative importance. Findings – Six ranked key competences of DB clients have been identified, namely: (1) the ability to clearly define project scope and objectives; (2) financial capacity for the projects; (3) capacity in contract management; (4) adequate staff or consulting team; (5) effective coordination with DB contractors; and (6) experience with similar design-build projects. Calculation of Kendall's Coefficient of Concordance (W) indicates a statistically significant consensus of panel experts on these top six key competences. Practical implications – Clients should clearly understand the competence requirements in DB projects and should assess their DB capability before opting for DB. Originality/Value – The examination of DB clients' key competences will help clients deepen their understanding of the DB system. DB clients can also use the research findings as guidelines for improving their DB competence.
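For readers unfamiliar with the consensus statistic reported above, Kendall's W for an m-raters-by-n-items rank matrix is W = 12S / (m²(n³ − n)), where S is the spread of the column rank sums. A minimal Python sketch with hypothetical expert rankings (the survey's actual data are not shown in the abstract):

```python
import numpy as np

def kendalls_w(ranks):
    """Kendall's Coefficient of Concordance W for a (raters x items)
    matrix of ranks, without tie correction. W = 1 means perfect
    agreement among raters; W = 0 means no agreement."""
    ranks = np.asarray(ranks, dtype=float)
    m, n = ranks.shape                 # m raters, n items
    R = ranks.sum(axis=0)              # column rank sums
    S = np.sum((R - R.mean()) ** 2)    # spread of the rank sums
    return 12.0 * S / (m ** 2 * (n ** 3 - n))

# Hypothetical example: 5 experts ranking 6 client competences
ranks = np.array([
    [1, 2, 3, 4, 5, 6],
    [1, 3, 2, 4, 5, 6],
    [2, 1, 3, 4, 6, 5],
    [1, 2, 3, 5, 4, 6],
    [1, 2, 4, 3, 5, 6],
])
print(round(kendalls_w(ranks), 3))     # close to 1 -> strong consensus
```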
Abstract:
The growth of solid tumours beyond a critical size is dependent upon angiogenesis, the formation of new blood vessels from an existing vasculature. Tumours may remain dormant at microscopic sizes for some years before switching to a mode in which growth of a supportive vasculature is initiated. The new blood vessels supply nutrients, oxygen, and access to routes by which tumour cells may travel to other sites within the host (metastasize). In recent decades an abundance of biological research has focused on tumour-induced angiogenesis in the hope that treatments targeted at the vasculature may result in a stabilisation or regression of the disease: a tantalizing prospect. The complex and fascinating process of angiogenesis has also attracted the interest of researchers in the field of mathematical biology, a discipline that is, for mathematics, relatively new. The challenge in mathematical biology is to produce a model that captures the essential elements and critical dependencies of a biological system. Such a model may ultimately be used as a predictive tool. In this thesis we examine a number of aspects of tumour-induced angiogenesis, focusing on growth of the neovasculature external to the tumour. Firstly, we present a one-dimensional continuum model of tumour-induced angiogenesis in which elements of the immune system or other tumour-cytotoxins are delivered via the newly formed vessels. This model, based on observations from experiments by Judah Folkman et al., is able to show regression of the tumour for some parameter regimes. The modelling highlights a number of interesting aspects of the process that may be characterised further in the laboratory. The next model examines the initiation positions of blood vessel sprouts on an existing vessel, in a two-dimensional domain. This model hypothesises that a simple feedback inhibition mechanism may describe the spacing of these sprouts, with the inhibitor being produced by breakdown of the existing vessel's basement membrane. Finally, we have developed a stochastic model of blood vessel growth and anastomosis in three dimensions. The model has been implemented in C++, includes an OpenGL interface, and uses a novel algorithm for calculating the proximity of the line segments representing a growing vessel. This choice of programming language and graphics interface allows for near-simultaneous calculation and visualisation of blood vessel networks on a contemporary personal computer. In addition, the visualised results may be transformed interactively, and drop-down menus facilitate changes to the parameter values. Visualisation of results is of vital importance in the communication of mathematical information to a wide audience, and we aim to incorporate this philosophy in the thesis. As biological research further uncovers the intriguing processes involved in tumour-induced angiogenesis, we conclude with a comment from mathematical biologist Jim Murray: mathematical biology is "... the most exciting modern application of mathematics".
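The thesis's novel proximity algorithm is not detailed in the abstract; for orientation, a standard computation of the minimum distance between two 3D line segments (the textbook clamped closest-point method, written here in Python rather than the thesis's C++, and assuming non-degenerate segments) looks like this:

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    """Minimum distance between 3D segments [p1, q1] and [p2, q2],
    via the standard clamped closest-point computation. Assumes both
    segments have nonzero length."""
    d1, d2 = q1 - p1, q2 - p2
    r = p1 - p2
    a, e = d1 @ d1, d2 @ d2            # squared segment lengths
    b, c, f = d1 @ d2, d1 @ r, d2 @ r
    denom = a * e - b * b              # always >= 0; 0 when parallel
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > 1e-12 else 0.0
    t = (b * s + f) / e                # closest t on segment 2 for this s
    if t < 0.0:                        # re-clamp t, then recompute s
        t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
    elif t > 1.0:
        t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    return float(np.linalg.norm((p1 + s * d1) - (p2 + t * d2)))

# Two parallel unit segments offset by (0, 1, 1): distance is sqrt(2)
p1, q1 = np.array([0., 0., 0.]), np.array([1., 0., 0.])
p2, q2 = np.array([0., 1., 1.]), np.array([1., 1., 1.])
print(segment_distance(p1, q1, p2, q2))   # ~1.414
```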
Abstract:
Since the availability of 3D full-body scanners and the associated software systems for operations with large point clouds, 3D anthropometry has been marketed as a breakthrough and milestone in ergonomic design. The assumptions made by the representatives of the 3D paradigm need to be critically reviewed, though. 3D anthropometry has advantages as well as shortfalls, which need to be carefully considered. While it is apparent that the measurement of a full-body point cloud allows for easier storage of raw data and improves quality control, the difficulties in calculating standardized measurements from the point cloud are widely underestimated. Early studies that used 3D point clouds to derive anthropometric dimensions have shown unacceptable deviations from the standardized results measured manually. While 3D human point clouds provide a valuable tool for replicating specific individuals for further virtual studies, or for personalizing garments, their use in ergonomic design must be critically assessed. Ergonomic, volumetric problems are defined by their two-dimensional boundaries or one-dimensional sections; a 1D/2D approach is therefore sufficient to solve an ergonomic design problem. As a consequence, all modern 3D human manikins are defined by the underlying anthropometric girths (2D) and lengths/widths (1D), which can be measured efficiently using manual techniques. Traditionally, ergonomists have taken a statistical approach, designing for generalized percentiles of the population rather than for a single user. The underlying method is based on the distribution functions of meaningful one- and two-dimensional anthropometric variables. Compared to these variables, the distribution of human volume has no ergonomic relevance. On the other hand, if volume is to be seen as a two-dimensional integral or distribution function of length and girth, the calculation of combined percentiles, a common ergonomic requirement, is undefined. Consequently, we suggest critically reviewing the cost and use of 3D anthropometry, and we recommend making proper use of the widely available one- and two-dimensional anthropometric data in ergonomic design.
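The combined-percentile problem mentioned above is easy to demonstrate numerically: with two correlated anthropometric variables, the share of the population below the 5th percentile on both variables is far smaller than 5%, so no single "P5 person" exists across dimensions. A minimal sketch with hypothetical, normally distributed stature and chest girth (all distribution parameters assumed):

```python
import numpy as np

# Illustration of the combined-percentile problem: the fraction of
# people below the 5th percentile on BOTH of two correlated variables
# is well under 5%, so percentiles do not combine across dimensions.
rng = np.random.default_rng(1)
n = 100_000
stature = rng.normal(1750, 70, n)                        # mm; assumed
girth = 0.5 * (stature - 1750) + rng.normal(950, 60, n)  # mm; correlated, assumed
p5_s, p5_g = np.percentile(stature, 5), np.percentile(girth, 5)
both = np.mean((stature < p5_s) & (girth < p5_g))
print(f"below P5 on both variables: {both:.1%}")         # well under 5%
```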
Abstract:
This paper examines the essential qualities of a good classroom teacher. The analogy of a caring gardener who understands the individual growing needs of a vast variety of plants is used. The paper argues for the freedom to have differing non-sectarian curricula and for trust to be placed in the teaching profession.
Abstract:
Designing Well: Vegetarianism, Sustainability and Interaction Design focuses on the field of interaction design and explores how design can be reconsidered by employing a different critical lens – that of vegetarianism. By extending the eating analogy to design, other aspects of practice can be reframed and reviewed. This is done through a survey of the different ways designers and artists have approached the problems of electricity use. The survey begins by looking at a number of functional products currently on the market, and then turns to a range of alternative approaches taken in research, art and critical design. The second half of the paper can be considered a form of contextual review: a survey of the different approaches artists and designers employ to address a specific issue in and through practice, ranging from pragmatic design to critical and radical interventions.
Abstract:
This is a professional practice paper for psychology practitioners to reflect on their skills and therapeutic practices. A master-practitioner model, or artisan-apprentice analogy, is used to understand the development of a practising psychologist from his or her "salad days" (when we are green [Shakespeare, Antony and Cleopatra]) to his or her autumn years in the profession.
Abstract:
Consider the concept combination 'pet human'. In word association experiments, human subjects produce the associate 'slave' in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of 'pet', or 'human', in isolation. In other words, the associate 'slave' seems to be emergent. Such emergent associations sometimes have a creative character, and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article explores concept combinations and argues that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations, allowing such combinations to be formalized as interacting quantum systems. Free association norm data are used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis are presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory, and the other on comparing a joint (true theoretic) probability distribution with another distribution based on a separability assumption, using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
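As a rough illustration of the second method (the article's actual data are not given in the abstract), the sketch below compares hypothetical joint interpretation counts for a bi-ambiguous combination against the expected counts under a separability, i.e. independence, assumption, using a chi-square goodness-of-fit test:

```python
import numpy as np
from scipy.stats import chisquare

# Hypothetical joint counts of interpretations for a bi-ambiguous
# combination (rows: sense of word A; columns: sense of word B).
observed = np.array([[40, 10],
                     [ 5, 45]], dtype=float)
n = observed.sum()

# Expected counts under the separability (independence) assumption:
# the product of the two marginal distributions.
row_m = observed.sum(axis=1) / n
col_m = observed.sum(axis=0) / n
expected = np.outer(row_m, col_m) * n

# Goodness of fit of the observed joint distribution to the separable
# model; a small p-value suggests non-separability. For a 2x2 table,
# df = 1, hence ddof = 2 on the 4 flattened cells.
stat, p = chisquare(observed.ravel(), expected.ravel(), ddof=2)
print(f"chi2 = {stat:.2f}, p = {p:.4g}")
```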
Abstract:
Increasing global competitiveness has forced manufacturing organizations to produce high-quality products more quickly and at a competitive cost, which demands continuous improvement techniques. In this paper, we propose a fuzzy-based performance evaluation method for the lean supply chain. To understand the overall performance of a cost-competitive supply chain, we investigate the alignment of market strategy and the position of the supply chain. Competitive strategies can be accommodated by using different weight calculations for different supply chain situations. By identifying optimal performance metrics and applying the performance evaluation method, managers can predict overall supply chain performance under a lean strategy.
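The abstract does not specify the fuzzy model; a common pattern for this kind of evaluation is to score each metric as a triangular fuzzy number, weight the scores with strategy-dependent weights, and defuzzify the aggregate. A minimal sketch with hypothetical metrics and weights:

```python
import numpy as np

# Toy fuzzy evaluation: each metric gets a triangular fuzzy score
# (low, mode, high) and a strategy-dependent crisp weight; the
# weighted fuzzy sum is defuzzified by the centroid of the triangle.
scores = {                       # hypothetical lean-supply-chain metrics
    "cost":     (0.5, 0.7, 0.9),
    "quality":  (0.6, 0.8, 0.9),
    "delivery": (0.4, 0.6, 0.8),
}
weights = {"cost": 0.5, "quality": 0.3, "delivery": 0.2}  # assumed lean weighting

agg = np.zeros(3)
for metric, tri in scores.items():
    agg += weights[metric] * np.array(tri)   # weighted triangular sum
low, mode, high = agg
performance = (low + mode + high) / 3.0      # centroid defuzzification
print(f"overall lean performance: {performance:.3f}")
```

Changing the weight dictionary is what the abstract's "different weight calculation for different supply chain situations" amounts to in this sketch: the same fuzzy scores yield different overall scores under different strategies.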
Abstract:
Volume measurements are useful in many branches of science and medicine. They are usually accomplished by acquiring a sequence of cross-sectional images through the object using an appropriate scanning modality, for example x-ray computed tomography (CT), magnetic resonance (MR) or ultrasound (US). In the cases of CT and MR, a dividing cubes algorithm can be used to describe the surface as a triangle mesh. However, such algorithms are not suitable for US data, especially when the image sequence is multiplanar (as it usually is). This problem may be overcome by manually tracing regions of interest (ROIs) on the registered multiplanar images and connecting the points into a triangular mesh. In this paper we describe and evaluate a new discrete form of Gauss' theorem which enables the calculation of the volume of any enclosed surface described by a triangular mesh. The volume is calculated by summing the product of the centroid, area and normal of each surface triangle. The algorithm was tested on computer-generated objects, US-scanned balloons, livers and kidneys, and CT-scanned clay rocks. The results, expressed as the mean percentage difference ± one standard deviation, were 1.2 ± 2.3, 5.5 ± 4.7, 3.0 ± 3.2 and −1.2 ± 3.2% for balloons, livers, kidneys and rocks respectively. The results compare favourably with other volume estimation methods such as planimetry and tetrahedral decomposition.
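The discrete divergence-theorem calculation described above fits in a few lines: taking F = r/3 (so that div F = 1), the enclosed volume is V = (1/3) Σᵢ cᵢ · (Aᵢnᵢ) over the surface triangles, where cᵢ is the centroid and Aᵢnᵢ the area-weighted outward normal. A minimal Python sketch, assuming consistently outward-oriented triangles, verified on a unit cube:

```python
import numpy as np

def mesh_volume(vertices, triangles):
    """Volume enclosed by a triangular mesh via a discrete form of
    Gauss' (divergence) theorem: V = (1/3) * sum(centroid . area*normal).
    Triangles must be ordered counter-clockwise when viewed from the
    outside, so the cross product gives outward-facing normals."""
    V = 0.0
    for i, j, k in triangles:
        a, b, c = vertices[i], vertices[j], vertices[k]
        centroid = (a + b + c) / 3.0
        area_normal = 0.5 * np.cross(b - a, c - a)  # |area_normal| = triangle area
        V += centroid @ area_normal / 3.0
    return V

# Unit cube as 12 outward-oriented triangles; volume should be 1.0
v = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
t = [(0, 1, 3), (0, 3, 2), (4, 7, 5), (4, 6, 7),   # x = 0 and x = 1 faces
     (0, 4, 5), (0, 5, 1), (2, 3, 7), (2, 7, 6),   # y = 0 and y = 1 faces
     (0, 2, 6), (0, 6, 4), (1, 5, 7), (1, 7, 3)]   # z = 0 and z = 1 faces
print(mesh_volume(v, t))   # -> 1.0
```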
Abstract:
In this video, an abstract swirling colour-field animation is accompanied by a female voice-over that describes facts and analogies about the earth, the universe and our place in it. This work engages with scientific language and the signifying processes of analogy. It questions the capacity of language to describe epic ideas like the qualities and quantities of the universe and our place in it. By emphasizing the absurdity of describing the scale and formation of the universe through analogies with the ‘everyday’, this work draws attention to the limits of verbal language and its assumed relationship to rational thought.
Abstract:
The positions of housing demand and supply are not consistent. The Australian situation counters the experience demonstrated in many other parts of the world in the aftermath of the Global Financial Crisis, with residential housing prices proving particularly resilient. A seemingly inexorable housing demand remains a critical issue affecting the socio-economic landscape. Underpinned by high levels of population growth fuelled by immigration, and further buoyed by sustained historically low interest rates, increasing income levels, and increased government assistance for first home buyers, this strong housing demand ensures that problems related to housing affordability continue almost unabated. A significant but less visible factor impacting housing affordability relates to holding costs. Although only one contributor in the housing affordability matrix, the nature and extent of holding costs' impact requires elucidation: for example, the computation and methodology behind the calculation of holding costs vary widely, and in some instances holding costs are completely ignored. In addition, ambiguity exists in terms of the inclusion of the various elements that comprise holding costs, thereby affecting the assessment of their relative contribution. Such anomalies may be explained by considering that assessment is conducted over time in an ever-changing environment. A strong relationship with opportunity cost, in turn dependent inter alia upon prevailing inflation and/or interest rates, adds further complexity. By extending research in the general area of housing affordability, this thesis provides a detailed investigation of those elements related to holding costs, specifically in the context of mid-sized (i.e. between 15 and 200 lots) greenfield residential property developments in South East Queensland. With the dimensions of holding costs and their influence over housing affordability determined, the null hypothesis H0, that holding costs are not passed on, can be addressed. Arriving at these conclusions involves the development of robust economic and econometric models which seek to clarify the component impacts of holding cost elements. An explanatory sequential design research methodology has been adopted, whereby the compilation and analysis of quantitative data and the development of an economic model are informed by the subsequent collection and analysis of primarily qualitative data derived from surveying development-related organisations. Ultimately, there are significant policy implications in relation to the framework used in Australian jurisdictions to promote, retain, or otherwise maximise the opportunities for affordable housing.
Abstract:
Objectives: Despite many years of research, there is currently no treatment available that results in major neurological or functional recovery after traumatic spinal cord injury (tSCI). In particular, no conclusive data related to the role of the timing of decompressive surgery, and the impact of injury severity on its benefit, have been published to date. This paper presents a protocol that was designed to examine the hypothesized association between the timing of surgical decompression and the extent of neurological recovery in tSCI patients. Study design: The SCI-POEM study is a Prospective, Observational European Multicenter comparative cohort study. It compares acute (<12 h) versus non-acute (>12 h, <2 weeks) decompressive surgery in patients with a traumatic spinal column injury and concomitant spinal cord injury. The sample size calculation was based on a representative European cohort of 492 tSCI patients. During a 4-year period, 300 patients will need to be enrolled from 10 trauma centers across Europe. The primary endpoint is the lower-extremity motor score as assessed according to the 'International standards for neurological classification of SCI' at 12 months after injury. Secondary endpoints include motor, sensory, imaging and functional outcomes at 3, 6 and 12 months after injury. Conclusion: In order to minimize bias and reduce the impact of confounders, special attention is paid to key methodological principles in this study protocol. A significant difference in safety and/or efficacy endpoints will provide meaningful information to clinicians, as this would confirm the hypothesis that rapid referral to and treatment in specialized centers result in important improvements in tSCI patients. Spinal Cord advance online publication, 17 April 2012; doi:10.1038/sc.2012.34.
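The protocol's actual sample size assumptions are not given in the abstract; for orientation only, a generic two-sample calculation of the kind referenced there (the effect size, alpha and power below are purely illustrative, chosen so the total lands near the protocol's 300) can be run with statsmodels:

```python
# Generic two-sample size calculation; NOT the study's actual
# computation, whose assumptions are not stated in the abstract.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.33,   # assumed standardized difference in motor score
    alpha=0.05,         # assumed two-sided significance level
    power=0.80,         # assumed target power
)
print(f"required per group: {n_per_group:.0f} (total ~{2 * n_per_group:.0f})")
```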