878 results for Observational techniques and algorithms
Abstract:
Damage assessment (damage detection, localization and quantification) in structures, together with appropriate retrofitting, enables structures to function safely and efficiently. In this context, many Vibration Based Damage Identification Techniques (VBDIT) have emerged with the potential for accurate damage assessment. VBDITs have attracted significant research interest in recent years, mainly due to their non-destructive nature and their ability to assess inaccessible and invisible damage locations. Damage Index (DI) methods are also vibration based, but they do not rely on a structural model. DI methods are fast and inexpensive compared with model-based methods and lend themselves to automation of the damage detection process. A DI method analyses the change in the vibration response of the structure between two states so that damage can be identified. Extensive research has been carried out on applying DI methods to assess damage in steel structures. By comparison, there has been very little research on using DI methods to assess damage in Reinforced Concrete (RC) structures, owing to the complexity of simulating the predominant damage type, the flexural crack. Flexural cracks in RC beams distribute non-linearly and propagate in all directions. Secondary cracks extend more rapidly along the longitudinal and transverse directions of an RC structure than existing cracks propagate in the depth direction, due to the stress distribution caused by the tensile reinforcement. Simplified damage simulation techniques (such as reductions in modulus or section depth, or the use of rotational spring elements) that have been used extensively in research on steel structures cannot be applied to simulate flexural cracks in RC elements. This highlights a significant gap in knowledge, and as a consequence VBDITs have not been successfully applied to damage assessment in RC structures. This research addresses that gap by developing and applying a modal strain energy based DI method to assess damage in RC flexural members. First, different damage simulation techniques were evaluated and an appropriate technique was recommended for simulating the post-cracking behaviour of RC structures. The ABAQUS finite element package was used throughout the study with properly validated material models. The damaged plasticity model was recommended as the method that can correctly simulate the post-cracking behaviour of RC structures and was used in the remainder of the study. Four different forms of Modal Strain Energy based Damage Indices (MSEDIs) were proposed to improve damage assessment capability by minimising the number and intensity of false alarms. The developed MSEDIs were then used to automate the damage detection process through programmable algorithms. The developed algorithms can identify common issues associated with vibration properties, such as mode shifting and phase change. To minimise the effect of noise on the DI calculation process, this research proposes a sequential curve-fitting technique. Finally, a statistics-based damage assessment scheme was proposed to enhance the reliability of the damage assessment results. The proposed techniques were applied to locate damage in RC beams and in a slab-on-girder bridge model to demonstrate their accuracy and efficiency. The outcomes of this research make a significant contribution to the technical knowledge of VBDITs and enhance the accuracy of damage assessment in RC structures.
The application of the research findings to RC flexural members will enable their safe and efficient performance.
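To make the modal strain energy based DI idea above concrete, the sketch below computes a classical Stubbs-type damage index from mode-shape curvatures measured in the undamaged and damaged states. It is a minimal illustration under assumed inputs (mode shapes sampled along the beam axis), with hypothetical function and variable names; it is not the four MSEDI formulations, noise treatment or statistical scheme developed in the thesis.

```python
import numpy as np

def stubbs_damage_index(phi_healthy, phi_damaged, x, n_elements=20):
    """Stubbs-type modal strain energy damage index for a beam.

    phi_healthy, phi_damaged: arrays of shape (n_modes, n_points) holding
    mode shapes sampled at coordinates x in the undamaged/damaged states.
    Returns one normalised index per element; large positive values suggest
    candidate damage locations.
    """
    n_modes, _ = phi_healthy.shape
    edges = np.linspace(x[0], x[-1], n_elements + 1)
    beta = np.zeros((n_modes, n_elements))

    for i in range(n_modes):
        # Mode-shape curvature (second spatial derivative), squared.
        ku = np.gradient(np.gradient(phi_healthy[i], x), x) ** 2
        kd = np.gradient(np.gradient(phi_damaged[i], x), x) ** 2
        Ku, Kd = np.trapz(ku, x), np.trapz(kd, x)  # totals over the whole span
        for j in range(n_elements):
            m = (x >= edges[j]) & (x <= edges[j + 1])
            num = (np.trapz(kd[m], x[m]) + Kd) * Ku
            den = (np.trapz(ku[m], x[m]) + Ku) * Kd
            beta[i, j] = num / den

    b = beta.mean(axis=0)                      # combine the first few modes
    return (b - b.mean()) / b.std()            # normalised index Z_j
```

Elements whose normalised index exceeds a chosen statistical threshold (commonly around 2) would be flagged as probable damage locations; the statistics-based assessment scheme proposed in the thesis refines this final classification step.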
Abstract:
Ian Hunter's early work on the history of literature education and the emergence of English as a school subject issued a bold challenge to traditional accounts that have in the main focused on English either as knowledge of a particular field or as ideology. The alternative proposal put forward by Hunter, and supported by detailed historical analysis, is that English exists as a series of historically contingent techniques and practices for shaping the self-managing capacities of children. The challenge for the field is to advance this historical work and to examine possible implications for English teaching.
Abstract:
Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study to its basic ingredients: from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building techniques up from theories, and systems up from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, into the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, just as theories need factual support, so we need theories in order to know what facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship: accuracy and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimise the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence that we have located to date. Knowledge of any work done but omitted here would be most welcome. The second part of the paper presents an analysis of some recently acquired data pertaining to this growing subject.
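As a concrete illustration of the accuracy measures discussed above, the snippet below computes the two summary statistics most often quoted in the estimating-accuracy literature, bias and consistency of the percentage error between pre-tender estimates and accepted (low-bid) prices. The figures and function name are illustrative assumptions only; they are not data or definitions taken from the paper.

```python
import numpy as np

def estimating_accuracy(estimates, accepted_bids):
    """Bias (mean percentage error) and consistency (standard deviation of
    the percentage error) of cost estimates against accepted bid prices."""
    estimates = np.asarray(estimates, dtype=float)
    accepted_bids = np.asarray(accepted_bids, dtype=float)
    pct_error = 100.0 * (estimates - accepted_bids) / accepted_bids
    return {"bias_%": pct_error.mean(), "consistency_%": pct_error.std(ddof=1)}

# Hypothetical project figures, for illustration only.
print(estimating_accuracy([1.02e6, 0.95e6, 1.10e6], [1.00e6, 0.98e6, 1.00e6]))
```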
Abstract:
This paper explores what we are calling “Guerrilla Research Tactics” (GRT): research methods that exploit emerging mobile and cloud-based digital technologies. We examine some case studies in the use of this technology to generate research data directly from the physical fabric and the people of the city. We argue that GRT is a novel way of engaging public participation in urban, place-based research because it facilitates the co-creation of knowledge, with city inhabitants, ‘on the fly’. This paper discusses the potential of these new research techniques and what they have to offer researchers operating in the creative disciplines and beyond. This work builds on and extends Gauntlett’s “new creative methods” (2007) and contributes to the existing body of literature addressing creative and interactive approaches to data collection.
Abstract:
Texture enhancement is an important component of image processing, with extensive application in science and engineering. The quality of medical images, quantified using the texture of the images, plays a significant role in the routine diagnosis performed by medical practitioners. Previously, image texture enhancement was performed using classical integral-order differential mask operators. Recently, first-order fractional differential operators were implemented to enhance images. Experiments show that the use of the fractional differential not only maintains the low-frequency contour features in the smooth areas of the image, but also nonlinearly enhances edges and textures corresponding to high-frequency image components. However, whilst these methods perform well in particular cases, they are not routinely useful across all applications. To this end, we applied the second-order Riesz fractional differential operator to improve upon existing approaches to texture enhancement. Compared with the classical integral-order differential mask operators and other fractional differential operators, our new algorithms provide higher signal-to-noise values, which leads to superior image quality.
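As a rough illustration of how a fractional differential mask differs from a classical integer-order one, the sketch below builds a one-dimensional Grünwald-Letnikov mask of fractional order v and applies it along image rows and columns. This is a simplified stand-in, not the second-order Riesz operator proposed in the paper, and the additive combination at the end is an assumption made purely for demonstration.

```python
import numpy as np
from scipy.ndimage import convolve1d

def gl_fractional_mask(v, n_terms=5):
    """Grünwald-Letnikov coefficients (-1)^k * C(v, k), truncated to n_terms.
    For 0 < v < 1 the leading coefficients are 1, -v, v(v-1)/2, ..."""
    c = [1.0]
    for k in range(1, n_terms):
        c.append(-c[-1] * (v - k + 1) / k)
    return np.array(c)

def enhance_texture(image, v=0.5, n_terms=5):
    """Apply the 1-D fractional mask along rows and columns and add the
    response magnitudes back to the image: edges and textures (high-frequency
    content) are boosted while smooth regions change little."""
    mask = gl_fractional_mask(v, n_terms)
    img = image.astype(float)
    gx = convolve1d(img, mask, axis=1, mode="reflect")
    gy = convolve1d(img, mask, axis=0, mode="reflect")
    return img + np.abs(gx) + np.abs(gy)
```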
Abstract:
Transport processes within heterogeneous media may exhibit non-classical diffusion or dispersion, that is, behaviour not adequately described by the classical theory of Brownian motion and Fick's law. We consider a space-fractional advection-dispersion equation based on a fractional Fick's law. The equation involves the Riemann-Liouville fractional derivative, which arises from assuming that particles may make large jumps. Finite difference methods for solving this equation have been proposed by Meerschaert and Tadjeran. In the variable-coefficient case, the product rule is first applied, and then the Riemann-Liouville fractional derivatives are discretised using standard and shifted Grunwald formulas, depending on the fractional order. In this work, we consider a finite volume method that deals directly with the equation in conservative form. Fractionally-shifted Grunwald formulas are used to discretise the fractional derivatives at control volume faces. We compare the two methods for several case studies from the literature, highlighting the convenience of the finite volume approach.
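For readers unfamiliar with the Grunwald discretisation mentioned above, the sketch below computes a shifted Grunwald approximation of the left Riemann-Liouville fractional derivative on a uniform grid. It illustrates the building block shared by the finite difference scheme of Meerschaert and Tadjeran and the finite volume method; the fractionally-shifted weights evaluated at control volume faces in the paper differ in detail, and the function names here are hypothetical.

```python
import numpy as np

def grunwald_weights(alpha, n):
    """Weights g_k = (-1)^k * C(alpha, k) of the Grunwald-Letnikov series."""
    g = np.zeros(n + 1)
    g[0] = 1.0
    for k in range(1, n + 1):
        g[k] = g[k - 1] * (k - 1 - alpha) / k
    return g

def shifted_grunwald_derivative(u, h, alpha, shift=1):
    """Shifted Grunwald approximation of the left Riemann-Liouville fractional
    derivative of order alpha (1 < alpha <= 2) on a uniform grid of spacing h:

        D^alpha u(x_i) ~= h**(-alpha) * sum_k g_k * u(x_{i-k+shift}).

    The unit shift is what keeps implicit schemes stable for 1 < alpha <= 2.
    """
    n = len(u)
    g = grunwald_weights(alpha, n + shift)
    d = np.zeros(n)
    for i in range(n):
        # Only terms whose sample index i - k + shift falls inside the grid.
        for k in range(max(0, i + shift - (n - 1)), i + shift + 1):
            d[i] += g[k] * u[i - k + shift]
    return d / h ** alpha
```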
Abstract:
Quantitative imaging methods to analyze cell migration assays are not standardized. Here we present a suite of two-dimensional barrier assays describing the collective spreading of an initially-confined population of 3T3 fibroblast cells. To quantify the motility rate we apply two different automatic image detection methods to locate the position of the leading edge of the spreading population after 24, 48 and 72 hours. These results are compared with a manual edge detection method where we systematically vary the detection threshold. Our results indicate that the observed spreading rates are very sensitive to the choice of image analysis tools and we show that a standard measure of cell migration can vary by as much as 25% for the same experimental images depending on the details of the image analysis tools. Our results imply that it is very difficult, if not impossible, to meaningfully compare previously published measures of cell migration since previous results have been obtained using different image analysis techniques and the details of these techniques are not always reported. Using a mathematical model, we provide a physical interpretation of our edge detection results. The physical interpretation is important since edge detection algorithms alone do not specify any physical measure, or physical definition, of the leading edge of the spreading population. Our modeling indicates that variations in the image threshold parameter correspond to a consistent variation in the local cell density. This means that varying the threshold parameter is equivalent to varying the location of the leading edge in the range of approximately 1-5% of the maximum cell density.
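The threshold sensitivity described above can be reproduced in a few lines. The sketch below is a hypothetical intensity-threshold edge detector, not the specific automatic or manual tools used in the study: it thresholds a grayscale image, takes the outermost above-threshold pixel in each row as the leading edge, and shows how the measured edge position shifts as the threshold is varied on a synthetic image.

```python
import numpy as np

def leading_edge_position(image, threshold):
    """Mean column index of the outermost above-threshold pixel per row,
    taken as the position of the leading edge of the spreading population."""
    mask = image > threshold
    edges = [np.flatnonzero(row).max() for row in mask if row.any()]
    return float(np.mean(edges)) if edges else np.nan

if __name__ == "__main__":
    # Synthetic image: cell density (intensity) decays from left to right.
    rng = np.random.default_rng(0)
    img = np.clip(np.linspace(1.0, 0.0, 200)[None, :]
                  + 0.05 * rng.standard_normal((100, 200)), 0.0, 1.0)
    for t in (0.2, 0.4, 0.6):
        print(f"threshold {t:.1f}: edge at column {leading_edge_position(img, t):.1f}")
```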
Abstract:
This paper describes the content and delivery of a software internationalisation subject (ITN677) that was developed for Master of Information Technology (MIT) students in the Faculty of Information Technology at Queensland University of Technology. This elective subject introduces students to the strategies, technologies, techniques and current developments associated with this growing 'software development for the world' specialty area. Students learn what is involved in planning and managing a software internationalisation project, as well as designing, building and using a software internationalisation application. Students also learn how a software internationalisation project must fit into an overall product localisation and globalisation effort that may include culturalisation, tailored system architectures, and reliance upon industry standards. In addition, students are exposed to the different software development techniques used by organizations in this arena and to the perils and pitfalls of managing software internationalisation projects.
Abstract:
The contemporary working environment is being rapidly reshaped by technological, industrial and political forces. Increased global competitiveness and an emphasis on productivity have led to the appearance of alternative methods of employment, such as part-time, casual and itinerant work, allowing greater flexibility. This allows for the development of a core permanent staff and the simultaneous utilisation of casual staff according to business needs. Flexible workers across industries are generally referred to as the non-standard workforce, and full-time permanent workers as the standard workforce. Even though labour flexibility favours the employer, increased opportunity for flexible work has been embraced by women for many reasons, including the gender struggle for greater economic independence and social equality. Consequently, the largely female nursing industry, both nationally and internationally, has been caught up in this wave of change. This ageing workforce has been at the forefront of the push for flexibility, with recent figures showing almost half the nursing workforce employed in a non-standard capacity. In part, this has allowed women to fulfil caring roles outside their work, to ease off nearing retirement and to supplement the family income. More significantly, however, flexibility has developed as an economic management initiative, as a strategy for cost constraint. The result has been the development of a dual workforce and, as suggested by Pocock, Buchanan and Campbell (2004), associated deep-seated resentment and the marginalisation of part-time and casual workers by their full-time colleagues and managers. Additionally, as nursing currently faces serious recruitment and retention problems, there is an urgent need to understand the factors underlying present discontent in the nursing profession. There is an identified gap in nursing knowledge surrounding the issues relating to recruitment and retention. Communication involves speaking, listening, reading and writing, and is an interactive process central to the lives of humans. Workplace communication refers to human interaction, information technology, and multimedia and print. It is the means to relationship building between workers, management and their external environment, and is critical to organisational effectiveness. Communication and language are integral to nursing performance (Hall, 2005); in a twenty-four-hour service, however, increasing fragmentation due to part-time and casual work in the nursing industry means that effective communication management has become increasingly difficult. More broadly, it is known that disruption to communication systems impacts negatively on consumer outcomes. Because of this gap in understanding of how nurses view their contemporary nursing world, an interpretative ethnographic study, which progressed to a critical ethnographic study, based on the conceptual framework of constructionism and interpretivism, was used. The study site was a division within an acute health care facility, and the relationship between increasing casualisation of the nursing workforce and the experiences of communication of standard and non-standard nurses was explored. For this study, full-time standard nurses were those employed to work in a specific unit for forty hours per week. Non-standard nurses were those employed part-time in specific units or those employed as relief pool nurses to cover shift shortfalls where needed.
Nurses employed by external agencies but required to fill in for shifts at the facility were excluded from this research. This study involved an analysis of observational, interview and focus group data from standard and non-standard nurses within this facility. Three analytical findings (the organisation of nursing work, constructing the casual nurse as other, and the function of space) situate communication within a broader discussion about non-standard work and organisational culture. The study results suggest that a significant culture of marginalisation exists for nurses who work in a non-standard capacity, that this affects communication for nurses, and that it has implications for the quality of patient care. The discussion draws on the seven elements of marginalisation described by Hall, Stephen and Melius (1994). The arguments propose that these elements underpin a culture which supports remnants of the historically gendered stereotype of "the good nurse", and that these cultural values contribute to practices and behaviour which marginalise all nurses, particularly those who work less than full-time. Gender inequality is argued to be at the heart of marginalising practices because of the long-standing subordination of nurses by the powerful medical profession, paralleling the historical subordination of women in society. This has denied nurses adequate representation and voice in decision making. The new knowledge emanating from this study extends current knowledge of the factors surrounding recruitment and retention and, as such, contributes to an understanding of the current and complex nursing environment.
Abstract:
This study aims to redefine spaces of learning as places of learning through the direct engagement of local communities as a way to examine and learn from real-world issues in the city. This paper exemplifies Smart City Learning, where the key goal is to promote the generation and exchange of urban design ideas for the future development of South Bank, in Brisbane, Australia, informing the creation of new design policies responding to the needs of local citizens. Specific to this project was the implementation of urban informatics techniques and approaches to promote innovative engagement strategies. Architecture and Urban Design students were encouraged to review and appropriate the real-time, ubiquitous technology, social media, and mobile devices used by urban residents to augment and mediate the physical and digital layers of urban infrastructures. Our study’s experience found that urban informatics provide an innovative opportunity to enrich students’ place of learning within the city.
Abstract:
Objectives: The aim of this study was to evaluate the role of cardiac K+ channel gene variants in families with atrial fibrillation (AF). Background: The K+ channels play a major role in atrial repolarization, but single mutations in cardiac K+ channel genes are infrequently present in AF families. The collective effect of background K+ channel variants of varying prevalence and effect size on the atrial substrate for AF is largely unexplored. Methods: Genes encoding the major cardiac K+ channels were resequenced in 80 AF probands. Nonsynonymous coding sequence variants identified in AF probands were evaluated in 240 control subjects. Novel variants were characterized using patch-clamp techniques, and in silico modeling was performed using the Courtemanche atrial cell model. Results: Nineteen nonsynonymous variants in 9 genes were found, including 11 rare variants. Rare variants were more frequent in AF probands (18.8% vs. 4.2%, p < 0.001), and the mean number of variants was greater (0.21 vs. 0.04, p < 0.001). The majority of K+ channel variants individually had modest functional effects. Modeling simulations to evaluate combinations of K+ channel variants of varying population frequency indicated that simultaneous small perturbations of multiple current densities had nonlinear interactions and could result in substantial (>30 ms) shortening or lengthening of action potential duration as well as increased dispersion of repolarization. Conclusions: Families with AF show an excess of rare functional K+ channel gene variants of varying phenotypic effect size that may contribute to an atrial arrhythmogenic substrate. Atrial cell modeling is a useful tool to assess epistatic interactions between multiple variants.
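The rare-variant comparison reported above (18.8% of 80 probands vs. 4.2% of 240 controls) is the kind of carrier-burden comparison that can be checked with a Fisher exact test. The counts below are reconstructed from the reported percentages and are assumptions for illustration; the paper's own statistical procedure may differ.

```python
from scipy.stats import fisher_exact

# Carrier counts back-calculated from the quoted percentages (~15/80 and ~10/240).
carriers_af, n_af = 15, 80
carriers_ctrl, n_ctrl = 10, 240

table = [[carriers_af, n_af - carriers_af],
         [carriers_ctrl, n_ctrl - carriers_ctrl]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2g}")
```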
Abstract:
Damage detection using modal properties is a widely accepted method; however, quantifying such damage using modal properties is still not well established. With this in mind, a research project is presently underway to develop a procedure to detect, locate and quantify damage in structural components using variations in modal properties. A novel vibration-based parameter called the Vibration-based Damage Index is introduced into the damage assessment procedure. This paper presents the early part of the research project, which treats flexural members. The proposed procedure is validated using experimental data and/or theoretical techniques and illustrated through application. The outcomes of this research highlight the ability of the proposed procedure to successfully detect, locate and quantify damage in flexural structural components using the modal properties of the first few modes.
Abstract:
Since the late 1970s, there has been a significant expansion in techniques for using mediated interactions between offenders and those affected by their behaviour. This trend began with juvenile justice conferencing, family group conferencing and Indigenous sentencing circles. The umbrella term used to describe these techniques and processes is ‘restorative justice’ (‘RJ’ to its fans and practitioners). Two important catalysts for this expansion were an increased awareness of the marginalisation of victims in the criminal justice system, and concerns over climbing recidivism rates.
Abstract:
Fossils and sediments preserved in caves are an excellent source of information for investigating the impacts of past environmental changes on biodiversity. Until recently, studies have relied on morphology-based palaeontological approaches, but recent advances in molecular analytical methods offer excellent potential for extracting a greater array of biological information from these sites. This study presents a thorough assessment of DNA preservation from late Pleistocene–Holocene vertebrate fossils and sediments from Kelly Hill Cave, Kangaroo Island, South Australia. Using a combination of extraction techniques and sequencing technologies, ancient DNA was characterised from over 70 bones and 20 sediment samples from 15 stratigraphic layers ranging in age from >20 ka to ∼6.8 ka. A combination of primers targeting marsupial and placental mammals and reptiles, together with two universal plant primers, was used to reveal genetic biodiversity for comparison with the mainland and with the morphological fossil record for Kelly Hill Cave. We demonstrate that Kelly Hill Cave has excellent long-term DNA preservation, back to at least 20 ka. This contrasts with the majority of Australian cave sites thus far explored for ancient DNA preservation, and highlights the great promise Kangaroo Island caves hold for yielding the hitherto-elusive DNA of extinct Australian Pleistocene species.
Abstract:
The huge amount of CCTV footage available makes it very burdensome to process these videos manually through human operators, making automated processing of video footage through computer vision technologies necessary. During the past several years, there has been a large effort to detect abnormal activities through computer vision techniques. Typically, the problem is formulated as a novelty detection task where the system is trained on normal data and is required to detect events which do not fit the learned ‘normal’ model. There is no precise and exact definition of an abnormal activity; it depends on the context of the scene. Hence, different feature sets are required to detect different kinds of abnormal activities. In this work we evaluate the performance of different state-of-the-art features for detecting the presence of abnormal objects in the scene. These include optical flow vectors to detect motion-related anomalies, and textures of optical flow and image textures to detect the presence of abnormal objects. These extracted features, in different combinations, are modeled using state-of-the-art models such as the Gaussian Mixture Model (GMM) and the semi-2D Hidden Markov Model (HMM) to analyse their performance. Further, we apply perspective normalization to the extracted features to compensate for perspective distortion due to the distance between the camera and the objects under consideration. The proposed approach is evaluated using the publicly available UCSD datasets, and we demonstrate improved performance compared to other state-of-the-art methods.
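As a minimal sketch of the novelty-detection formulation described above, the code below fits a Gaussian mixture model to feature vectors extracted from 'normal' footage and flags test samples with low likelihood under that model. It uses scikit-learn's GaussianMixture as a stand-in and random vectors in place of the optical-flow and texture features; the semi-2D HMM variant and the perspective normalization step from the paper are not reproduced.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_normal_model(normal_features, n_components=8, seed=0):
    """Fit a GMM to feature vectors (e.g. optical-flow or texture descriptors)
    computed from footage known to contain only normal activity."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="diag", random_state=seed)
    gmm.fit(normal_features)
    return gmm

def anomaly_scores(gmm, test_features):
    """Negative log-likelihood per sample: high scores indicate events that
    do not fit the learned 'normal' model."""
    return -gmm.score_samples(test_features)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = rng.normal(0.0, 1.0, size=(500, 16))            # stand-in features
    test = np.vstack([rng.normal(0.0, 1.0, size=(50, 16)),
                      rng.normal(4.0, 1.0, size=(5, 16))])   # last 5 rows anomalous
    model = train_normal_model(normal)
    scores = anomaly_scores(model, test)
    print("flagged samples:", np.flatnonzero(scores > np.percentile(scores, 90)))
```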