952 results for Detection models
Abstract:
Structurally novel compounds able to block voltage-gated Ca2+ channels (VGCCs) are currently being sought for the development of new drugs directed at neurological disorders. Fluorescence techniques have recently been developed to facilitate the analysis of VGCC blockers in a multi-well format. By utilising the small cell lung carcinoma cell line NCI-H146, we were able to detect changes in intracellular Ca2+ concentration ([Ca2+]i) using a fluorescence microplate reader. NCI-H146 cells have characteristics resembling those of neuronal cells and express multiple VGCC subtypes, including those of the L-, N- and P-type. We found that K+-depolarisation of fluo-3-loaded NCI-H146 cells causes a rapid and transient increase in fluorescence, which was readily detected in a 96-well plate. Extracts of Australian plants, including those used traditionally as headache or pain treatments, were tested in this study to identify those affecting Ca2+ influx following membrane depolarisation of NCI-H146 cells. We found that E. bignoniiflora, A. symphyocarpa and E. vespertilio caused dose-dependent inhibition of K+-depolarised Ca2+ influx, with IC50 values calculated to be 234, 548 and 209 μg/ml, respectively. These data suggest an effect of these extracts on the function of VGCCs in these cells. Furthermore, we found similar effects using a fluorescence laser imaging plate reader (FLIPR), which allows simultaneous measurement of real-time fluorescence in a multi-well plate. Our results indicate that the dichloromethane extract of E. bignoniiflora and the methanolic extract of E. vespertilio show considerable promise as antagonists of neuronal VGCCs. Further analysis is required to characterise the function of the bioactive constituents in these extracts and to determine their selectivity for VGCC subtypes.
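The reported IC50 values can be illustrated with a minimal sketch of how a half-maximal inhibitory concentration is estimated from dose-response data. The concentrations and inhibition percentages below are invented for illustration and are not the extract data from the study.

```python
# Hypothetical sketch: estimating an IC50 by log-linear interpolation
# between the two tested concentrations that bracket 50% inhibition.
import math

def ic50(concentrations, inhibition):
    """Interpolate the concentration giving 50% inhibition (None if not bracketed)."""
    points = list(zip(concentrations, inhibition))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 < 50.0 <= i2:
            # interpolate on a log-concentration scale, as dose-response
            # curves are approximately sigmoidal in log(dose)
            frac = (50.0 - i1) / (i2 - i1)
            return 10 ** (math.log10(c1) + frac * (math.log10(c2) - math.log10(c1)))
    return None

doses = [10, 30, 100, 300, 1000]   # ug/ml, illustrative values only
resp = [5, 15, 32, 58, 85]         # % inhibition of Ca2+ influx, illustrative
print(round(ic50(doses, resp), 1))
```

A production analysis would instead fit a four-parameter log-logistic (Hill) curve to the full dose-response series rather than interpolating between two points.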
Abstract:
This paper presents a recursive strategy for online detection of actuator faults on an unmanned aerial system (UAS) subjected to accidental actuator faults. The proposed detection algorithm aims to provide a UAS with the capability of identifying and determining the characteristics of actuator faults, offering the flight information necessary for the design of fault-tolerant mechanisms to compensate for the resultant side-effects when faults occur. The proposed fault detection strategy consists of a bank of unscented Kalman filters (UKFs), with each one detecting a specific type of actuator fault and estimating the corresponding velocity and attitude information. The performance of the proposed method is evaluated using a typical nonlinear UAS model, and it is demonstrated in simulations that our method is able to detect representative faults with sufficient accuracy and acceptable time delay, and can be applied to the design of fault-tolerant flight control systems for UASs.
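The bank-of-filters idea can be sketched in miniature. The toy below replaces the paper's UKFs with scalar linear Kalman filters and a one-dimensional actuator model: each filter assumes a different actuator effectiveness, and the filter whose innovations are smallest indicates the most likely fault mode. All models, parameters and data here are invented for illustration.

```python
# Illustrative sketch (not the paper's UKF implementation): a bank of
# scalar Kalman filters, one per hypothesised actuator effectiveness.
import random

def kf_bank_detect(measurements, control, modes, q=0.01, r=0.1):
    """Return the assumed effectiveness whose filter best explains the data."""
    scores = {}
    for eff in modes:
        x, p = 0.0, 1.0     # state estimate and covariance
        sse = 0.0           # accumulated squared innovations
        for z, u in zip(measurements, control):
            x_pred = x + eff * u   # toy actuator model: state grows by eff*u
            p_pred = p + q
            innov = z - x_pred
            k = p_pred / (p_pred + r)
            x = x_pred + k * innov
            p = (1 - k) * p_pred
            sse += innov ** 2
        scores[eff] = sse
    return min(scores, key=scores.get)

random.seed(0)
u = [1.0] * 50
true_eff = 0.5   # simulate an actuator degraded to 50% effectiveness
z, x = [], 0.0
for ui in u:
    x += true_eff * ui
    z.append(x + random.gauss(0, 0.1))
print(kf_bank_detect(z, u, modes=[1.0, 0.75, 0.5, 0.25]))
```

The filter matching the true fault mode accumulates only measurement noise in its innovations, while mismatched filters accumulate a persistent bias, so the minimum-score rule identifies the fault.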
Abstract:
Mathematical models of mosquito-borne pathogen transmission originated in the early twentieth century to provide insights into how to most effectively combat malaria. The foundations of the Ross–Macdonald theory were established by 1970. Since then, there has been a growing interest in reducing the public health burden of mosquito-borne pathogens and an expanding use of models to guide their control. To assess how theory has changed to confront evolving public health challenges, we compiled a bibliography of 325 publications from 1970 through 2010 that included at least one mathematical model of mosquito-borne pathogen transmission and then used a 79-part questionnaire to classify each of 388 associated models according to its biological assumptions. As a composite measure to interpret the multidimensional results of our survey, we assigned a numerical value to each model that measured its similarity to 15 core assumptions of the Ross–Macdonald model. Although the analysis illustrated a growing acknowledgement of geographical, ecological and epidemiological complexities in modelling transmission, most models during the past 40 years closely resemble the Ross–Macdonald model. Modern theory would benefit from an expansion around the concepts of heterogeneous mosquito biting, poorly mixed mosquito-host encounters, spatial heterogeneity and temporal variation in the transmission process.
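The Ross–Macdonald model that the survey uses as its reference point is usually summarised by its basic reproduction number. A minimal sketch using the classic textbook formula, with illustrative parameter values not taken from any of the surveyed models:

```python
# Basic reproduction number under the classic Ross-Macdonald assumptions.
import math

def ross_macdonald_r0(m, a, b, c, p, n, r):
    """R0 = m * a^2 * b * c * p^n / (r * (-ln p)).

    m: mosquitoes per human          a: human biting rate (bites/mosquito/day)
    b: mosquito-to-human infectivity c: human-to-mosquito infectivity
    p: daily mosquito survival       n: extrinsic incubation period (days)
    r: human recovery rate (per day)
    """
    return (m * a ** 2 * b * c * p ** n) / (r * (-math.log(p)))

# Illustrative parameter values only
print(round(ross_macdonald_r0(m=2.0, a=0.3, b=0.5, c=0.5, p=0.9, n=10, r=0.01), 2))
```

The a² term is why mosquito biting enters twice (once to acquire and once to transmit infection), and the pⁿ/(−ln p) factor is the expected infectious lifespan of a mosquito surviving incubation — exactly the kinds of homogeneous-mixing assumptions the survey measures models against.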
Abstract:
The in situ reverse transcription-polymerase chain reaction (IS-RT-PCR) is a method that allows the direct localisation of gene expression. The method utilises the dual buffer-mediated activity of the enzyme rTth DNA polymerase, enabling both reverse transcription and DNA amplification. Labelled nucleoside triphosphates allow the site of expression to be labelled, rather than the PCR primers themselves, giving a more accurate localisation of transcript expression and lower background than standard in situ hybridisation (ISH) assays. The MDA-MB-231 human breast carcinoma (HBC) cell line was assayed via the IS-RT-PCR technique, using primers for MT-MMP (membrane-type matrix metalloproteinase) and human β-actin. Our results clearly indicate baseline expression of MT-MMP in the relatively invasive MDA-MB-231 cell line at a signal intensity similar to that of the housekeeping gene β-actin, and the results following induction with Concanavalin A (Con A) are consistent with our previous results obtained via Northern blotting.
Abstract:
Before the age of 75 years, approximately 10% of women will be diagnosed with breast cancer, one of the most common malignancies and a leading cause of death among women. The objective of this study was to determine whether expression of the nuclear receptor coactivators 1 and 3 (NCoA1 and NCoA3) varied across breast cancer grades. RNA was extracted from 25 breast tumours and transcribed into cDNA, which underwent semi-quantitative polymerase chain reaction, normalised using 18S. Analysis indicated an expression change for NCoA1 across cancer grades and in estrogen receptor alpha negative tissue (P = 0.028 and 0.001, respectively). NCoA1 expression increased in grade 3 and estrogen receptor alpha negative tumours, compared to controls. NCoA3 showed a similar, but not significant, trend across grades and a non-significant decrease in estrogen receptor alpha negative tissues. Expression of NCoA1 in late stage and estrogen receptor alpha negative breast tumours may have implications for breast cancer treatment, particularly in the area of manipulation of hormone signalling systems in advanced tumours.
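The normalisation step can be sketched as a simple ratio of band intensities against the 18S internal control; the densitometry values below are invented for illustration and are not the study's measurements.

```python
# Hypothetical semi-quantitative RT-PCR normalisation: relative expression
# is the target band intensity divided by the 18S band intensity per sample.
def normalise(target_bands, reference_bands):
    """Relative expression = target intensity / reference (18S) intensity."""
    return {s: target_bands[s] / reference_bands[s] for s in target_bands}

ncoa1 = {"grade1": 120.0, "grade3": 310.0}   # arbitrary densitometry units
s18 = {"grade1": 400.0, "grade3": 395.0}     # 18S loading control
rel = normalise(ncoa1, s18)
print(rel["grade3"] > rel["grade1"])   # higher relative NCoA1 in grade 3
```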
Abstract:
Crashes on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing crashes will help address congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists, and that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with traffic flow data from one hour prior to the crash using an incident detection algorithm. Traffic flow trends (traffic speed/occupancy time series) revealed that crashes could be clustered with regard to the dominant traffic flow pattern prior to the crash. Using the k-means clustering method allowed the crashes to be clustered based on their flow trends rather than their distance. Four major trends were found in the clustering results. Based on these findings, crash likelihood estimation algorithms can be fine-tuned to the monitored traffic flow conditions with a sliding window of 60 minutes to increase the accuracy of the results and minimise false alarms.
Abstract:
Crashes that occur on motorways contribute to a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing the frequency of crashes assists in addressing congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of a crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists. We compare these patterns with normal traffic trends and show that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that have been matched with the corresponding traffic flow data using an incident detection algorithm. Traffic trends (traffic speed time series) revealed that crashes can be clustered with regard to the dominant traffic patterns prior to the crash. Using the k-means clustering method with a Euclidean distance function allowed the crashes to be clustered. Then, normal-situation data were extracted based on the time distribution of crashes and were clustered for comparison with the “high risk” clusters. Five major trends were found in the clustering results for both high-risk and normal conditions. The study discovered differences in the speed trends between traffic regimes. Based on these findings, crash likelihood estimation models can be fine-tuned to the monitored traffic conditions with a sliding window of 30 minutes to increase the accuracy of the results and minimise false alarms.
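The clustering step described above can be sketched with a plain k-means over equal-length speed series. The synthetic series and deterministic initialisation below are for illustration only and do not reflect the Tokyo Metropolitan Expressway data.

```python
# Plain k-means with squared Euclidean distance on equal-length time series.
def kmeans(series, centroids, iters=10):
    """Cluster series around the given initial centroids; returns the clusters."""
    k = len(centroids)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for s in series:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(s, centroids[i])))
            clusters[j].append(s)
        # recompute centroids as per-dimension means (keep old one if empty)
        centroids = [[sum(v) / len(c) for v in zip(*c)] if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return clusters

# Synthetic 12-point pre-crash speed series: a free-flow regime (~80 km/h)
# and a congested regime (~30 km/h)
free = [[80 + i % 5 for _ in range(12)] for i in range(10)]
jam = [[30 + i % 5 for _ in range(12)] for i in range(10)]
series = free + jam
# deterministic initialisation for the illustration: one seed per regime
clusters = kmeans(series, centroids=[series[0], series[-1]])
print([len(c) for c in clusters])
```

In the study's setting, each series would be a monitored speed trend in the sliding window before a crash (or a matched normal period), and the resulting centroids are the "major trends" the abstract refers to.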
Abstract:
Wind power has become one of the most popular renewable resources worldwide and is anticipated to account for 12% of the total global electricity generation capacity by 2020. Given the harsh environments in which wind turbines operate, fault diagnosis and condition monitoring are important for wind turbine safety and reliability. This paper employs a systematic literature review to report the most recent developments in wind turbine fault diagnosis from 2005 to 2012. The frequent faults and failures in wind turbines are considered, and the different techniques which have been used by researchers are introduced, classified and discussed.
Abstract:
In this paper, a polynomial time algorithm is presented for solving the Eden problem for graph cellular automata. The algorithm is based on our neighborhood elimination operation, which removes local neighborhood configurations that cannot be used in a pre-image of a given configuration. This paper presents a detailed derivation of our algorithm from first principles, and a detailed complexity and accuracy analysis is also given. In the case of time complexity, it is shown that the average case time complexity of the algorithm is Θ(n²), and the best and worst cases are Ω(n) and O(n³) respectively. This represents a vast improvement in the upper bound over current methods, without compromising average case performance.
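For contrast with the polynomial-time algorithm, the Eden problem itself can be stated in a few lines as a brute-force pre-image search over a small binary graph cellular automaton — exponential in the number of cells, which is exactly the cost the paper's neighborhood elimination avoids. The graph and rule below are invented for illustration.

```python
# Brute-force illustration of the Eden problem: a configuration is a
# "Garden of Eden" iff no configuration maps to it under the global rule.
from itertools import product

def step(config, adj, rule):
    """One synchronous update of a binary graph CA."""
    return tuple(rule(config[v], tuple(config[u] for u in adj[v]))
                 for v in range(len(config)))

def is_garden_of_eden(target, adj, rule):
    """Exhaustively search all 2^n configurations for a pre-image of target."""
    n = len(target)
    return not any(step(c, adj, rule) == target
                   for c in product((0, 1), repeat=n))

# 4-cycle graph; each cell becomes the XOR of its two neighbours
adj = {0: (1, 3), 1: (0, 2), 2: (1, 3), 3: (0, 2)}
rule = lambda state, nbrs: nbrs[0] ^ nbrs[1]
# On this graph cells 0 and 2 always agree after a step, so (1,0,0,0)
# has no pre-image while (1,0,1,0) does.
print(is_garden_of_eden((1, 0, 0, 0), adj, rule))
```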
Abstract:
Business models to date have remained the creation of management; however, it is the belief of the authors that designers should be critically approaching, challenging and creating new business models as part of their practice. This belief portrays a new era in which business model constructs become the design brief of the future and fuel design and innovation to work together at the strategic level of an organisation. Innovation can no longer rely on technology and R&D alone but must incorporate business models. Business model innovation has become a strong form of competitive advantage as firms choose not to compete on price alone but through the delivery of a unique value proposition that engages customers and differentiates a company within a competitive market. The purpose of this paper is to explore and investigate business model design through various product and/or service deliveries, and to identify common drivers that are catalysts for business model innovation. Fifty companies spanning a diverse range of criteria were chosen to evaluate and compare commonalities and differences in the design of their business models. The analysis of these business cases uncovered commonalities among the key strategic drivers behind these innovative business models. Five Meta Models were derived from this content analysis: Customer Led, Cost Driven, Resource Led, Partnership Led and Price Led. These five key foci provide a designer with a focus from which quick prototypes of new business models can be created. Implications from this research suggest there is no ‘one right’ model; rather, through experimentation, the generation of many unique and diverse concepts can result in greater possibilities for future innovation and sustained competitive advantage.
Abstract:
This dissertation seeks to define and classify potential forms of Nonlinear structure and explore the possibilities they afford for the creation of new musical works. It provides the first comprehensive framework for the discussion of Nonlinear structure in musical works and provides a detailed overview of the rise of nonlinearity in music during the 20th century. Nonlinear events are shown to emerge through significant parametrical discontinuity at the boundaries between regions of relatively strong internal cohesion. The dissertation situates Nonlinear structures in relation to linear structures and unstructured sonic phenomena and provides a means of evaluating Nonlinearity in a musical structure through consideration of the degree to which the structure is integrated, contingent, compressible and determinate as a whole. It is proposed that Nonlinearity can be classified as a three-dimensional space described by three continua: the temporal continuum, encompassing sequential and multilinear forms of organization; the narrative continuum, encompassing processual, game structure and developmental narrative forms; and the referential continuum, encompassing stylistic allusion, adaptation and quotation. The use of spectrograms of recorded musical works is proposed as a means of evaluating Nonlinearity in a musical work through the visual representation of parametrical divergence in pitch, duration, timbre and dynamics over time. Spectral and structural analysis of repertoire works is undertaken as part of an exploration of musical nonlinearity and the compositional and performative features that characterize it. The contribution of cultural, ideological, scientific and technological shifts to the emergence of Nonlinearity in music is discussed, and a range of compositional factors that contributed to the emergence of musical Nonlinearity is examined.
The evolution of notational innovations from the mobile score to the screen score is plotted and a novel framework for the discussion of these forms of musical transmission is proposed. A computer coordinated performative model is discussed, in which a computer synchronises screening of notational information, provides temporal coordination of the performers through click-tracks or similar methods and synchronises the audio processing and synthesized elements of the work. It is proposed that such a model constitutes a highly effective means of realizing complex Nonlinear structures. A creative folio comprising 29 original works that explore nonlinearity is presented, discussed and categorised utilising the proposed classifications. Spectrograms of these works are employed where appropriate to illustrate the instantiation of parametrically divergent substructures and examples of structural openness through multiple versioning.
Abstract:
This project was a step forward in developing intrusion detection systems in distributed environments such as web services. It investigates a new approach to detection based on so-called "taint-marking" techniques and introduces a theoretical framework along with its implementation in the Linux kernel.
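A user-space sketch of the taint-marking idea (the project itself implements this at the Linux-kernel level, which this toy does not attempt): data entering from an untrusted source carries a mark that propagates through operations, and sensitive sinks reject marked data. All names here are hypothetical.

```python
# Minimal taint-propagation sketch, purely illustrative.
class Tainted(str):
    """A string value marked as originating from an untrusted source."""

def taint(value):
    """Mark a value as untrusted."""
    return Tainted(value)

def concat(a, b):
    # propagation rule: the result is tainted if any operand is tainted
    result = str(a) + str(b)
    return Tainted(result) if isinstance(a, Tainted) or isinstance(b, Tainted) else result

def sink(value):
    """A sensitive sink (e.g. a query executor) that refuses tainted input."""
    if isinstance(value, Tainted):
        raise ValueError("tainted data reached a sensitive sink")
    return value

user_input = taint("'; DROP TABLE users; --")
query = concat("SELECT * FROM t WHERE name = ", user_input)
try:
    sink(query)
except ValueError as e:
    print(e)
```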
Abstract:
Parallel interleaved converters are finding more applications every day; for example, they are frequently used for voltage regulator modules (VRMs) on PC mainboards, mainly to obtain better transient response. Parallel interleaved converters can have their inductances uncoupled, directly coupled or inversely coupled, all of which have different applications with associated advantages and disadvantages. Coupled systems offer more control over converter features, such as ripple currents, inductance volume and transient response. To gain an intuitive understanding of which type of parallel interleaved converter, what amount of coupling, what number of levels and how much inductance should be used for different applications, a simple equivalent model is needed. As all phases of an interleaved converter are assumed to be identical, the equivalent model is nothing more than a separate inductance which is common to all phases. Without this simplification, the design of a coupled system is quite daunting. Designing a coupled system involves solving for and understanding the RMS currents of the input, the individual phases (or cells) and the output. A procedure using this equivalent model and a small amount of modulo arithmetic is detailed.
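As an aside on the modulo arithmetic the abstract alludes to, the well-known ripple-cancellation behaviour of an N-phase interleaved buck converter can be written in terms of the integer and fractional parts of N·D. The sketch below uses the standard textbook cancellation factor and is not the procedure detailed in the thesis.

```python
# Normalised output current ripple of an N-phase interleaved buck converter
# relative to a single-phase converter with the same per-phase inductance.
import math

def ripple_cancellation(n_phases, duty):
    """Ripple factor frac*(1-frac) / (N*D*(1-D)), with frac = (N*D) mod 1.

    Zero whenever N*D is an integer: the phase ripples cancel completely.
    """
    if duty in (0.0, 1.0):
        return 0.0
    m = math.floor(n_phases * duty)    # integer part of N*D
    frac = n_phases * duty - m         # (N*D) mod 1
    return (frac * (1 - frac)) / (n_phases * duty * (1 - duty))

print(ripple_cancellation(2, 0.5))             # N*D integer -> complete cancellation
print(round(ripple_cancellation(4, 0.3), 3))   # partial cancellation
```

This is the kind of closed-form result the equivalent single-inductance model makes easy to reason about: the modulo arithmetic captures how many phases are conducting at any instant.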
Abstract:
This thesis concerns the mathematical model of moving fluid interfaces in a Hele-Shaw cell: an experimental device in which fluid flow is studied by sandwiching the fluid between two closely separated plates. Analytic and numerical methods are developed to gain new insights into interfacial stability and bubble evolution, and the influence of different boundary effects is examined. In particular, the properties of the velocity-dependent kinetic undercooling boundary condition are analysed, with regard to the selection of only discrete possible shapes of travelling fingers of fluid, the formation of corners on the interface, and the interaction of kinetic undercooling with the better known effect of surface tension. Explicit solutions to the problem of an expanding or contracting ring of fluid are also developed.
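For reference, the standard one-phase Hele-Shaw formulation with surface tension and kinetic undercooling on the moving interface can be written as below; the notation (σ for the surface tension coefficient, c for the kinetic undercooling coefficient) is assumed here, not taken from the thesis.

```latex
% Laplacian pressure field, kinematic condition, and a dynamic boundary
% condition combining surface tension and kinetic undercooling.
\begin{align}
  \nabla^2 p &= 0
    && \text{in the fluid region } \Omega(t), \\
  v_n &= -\frac{\partial p}{\partial n}
    && \text{on the interface } \partial\Omega(t), \\
  p &= \sigma\kappa + c\,v_n
    && \text{on } \partial\Omega(t),
\end{align}
```

Setting c = 0 recovers the classical surface-tension-only problem; the thesis's analysis concerns how the c v_n term changes finger selection and permits corner formation.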
Abstract:
This paper evaluates the efficiency of a number of popular corpus-based distributional models in performing literature-based discovery on very large document sets, including online collections. Literature-based discovery is the process of identifying previously unknown connections from text, often published literature, that could lead to the development of new techniques or technologies. Literature-based discovery has attracted growing research interest ever since Swanson's serendipitous discovery of the therapeutic effects of fish oil on Raynaud's disease in 1986. The successful application of distributional models in automating the identification of the indirect associations underpinning literature-based discovery has been demonstrated extensively in the medical domain. However, we wish to investigate the computational complexity of distributional models for literature-based discovery on much larger document collections, as they may provide computationally tractable solutions to tasks such as predicting future disruptive innovations. In this paper we perform a computational complexity analysis of four successful corpus-based distributional models to evaluate their fitness for such tasks. Our results indicate that corpus-based distributional models that store their representations in fixed dimensions provide superior efficiency on literature-based discovery tasks.
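The fixed-dimension property highlighted in the conclusion can be illustrated with a random-indexing-style model, in which every word accumulates context information into a vector of constant width d, so memory grows with the vocabulary rather than the corpus. The tiny corpus and all parameters below are invented (the example sentences are a nod to the fish-oil discovery mentioned above).

```python
# Sketch of a fixed-dimension distributional model (random-indexing style).
import random

def build_index_vectors(vocab, d=64, nonzeros=4, seed=42):
    """Assign each word a sparse random +-1 index vector of fixed width d."""
    rnd = random.Random(seed)
    index = {}
    for w in vocab:
        v = [0] * d
        for pos in rnd.sample(range(d), nonzeros):
            v[pos] = rnd.choice((-1, 1))
        index[w] = v
    return index

def train(corpus, d=64, window=2):
    """Accumulate each word's context vectors; memory is O(|vocab| * d)."""
    vocab = sorted({w for sent in corpus for w in sent})
    index = build_index_vectors(vocab, d)
    context = {w: [0] * d for w in vocab}
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    for k in range(d):
                        context[w][k] += index[sent[j]][k]
    return context

corpus = [["fish", "oil", "treats", "raynaud"],
          ["fish", "oil", "lowers", "viscosity"]]
vectors = train(corpus)
print(len(vectors["fish"]))   # dimensionality stays fixed regardless of corpus size
```

By contrast, models that store an explicit term-by-term co-occurrence matrix grow with the square of the vocabulary, which is the efficiency distinction the paper's analysis draws.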