288 results for Rule-based techniques


Relevance: 90.00%

Abstract:

Process compliance measurement is getting increasing attention in companies due to stricter legal requirements and market pressure for operational excellence. In order to judge the compliance of business processing, the degree of behavioural deviation of a case, i.e., an observed execution sequence, is quantified with respect to a process model (referred to as fitness, or recall). Recently, different compliance measures have been proposed. Still, nearly all of them are grounded in state-based techniques, and the trace equivalence criterion in particular. As a consequence, these approaches have to deal with the state explosion problem. In this paper, we argue that a behavioural abstraction may be leveraged to measure the compliance of a process log – a collection of cases. To this end, we utilise causal behavioural profiles that capture the behavioural characteristics of process models and cases, and can be computed efficiently. We propose different compliance measures based on these profiles, discuss the impact of noise in process logs on our measures, and show how diagnostic information on non-compliance is derived. As a validation, we report on findings of applying our approach in a case study with an international service provider.
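To illustrate the general idea of profile-based compliance checking, the following is a minimal Python sketch; the relation encoding ('->' strict order, '+' exclusiveness, '||' interleaving) and the scoring rule are simplifying assumptions made here for illustration, not the measures defined in the paper.

from itertools import combinations

def trace_compliance(trace, profile):
    """Fraction of constrained activity pairs in the trace whose observed
    ordering is consistent with the model's behavioural-profile relation."""
    consistent = total = 0
    for i, j in combinations(range(len(trace)), 2):
        a, b = trace[i], trace[j]            # a is observed before b
        if (a, b) in profile:
            rel, flipped = profile[(a, b)], False
        elif (b, a) in profile:
            rel, flipped = profile[(b, a)], True
        else:
            continue                         # pair unconstrained by the model
        total += 1
        if rel == '||':
            consistent += 1                  # interleaving: any order allowed
        elif rel == '->':
            consistent += not flipped        # strict order must be respected
        # rel == '+': exclusiveness is violated, since a and b co-occur
    return consistent / total if total else 1.0

# Toy model: A must precede B; B and C must never co-occur in one case
profile = {('A', 'B'): '->', ('B', 'C'): '+'}
print(trace_compliance(['A', 'B'], profile))       # 1.0 (fully compliant)
print(trace_compliance(['B', 'A', 'C'], profile))  # 0.0 (order and exclusiveness violated)

A log-level measure could then be obtained, for instance, by averaging the per-case scores over all cases in the log.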

Relevance: 90.00%

Abstract:

In attempting to build intelligent litigation support tools, we have moved beyond first generation, production rule legal expert systems. Our work integrates rule based and case based reasoning with intelligent information retrieval. When using the case based reasoning methodology, or in our case the specialisation of case based retrieval, we need to be aware of how to retrieve relevant experience. Our research, in the legal domain, specifies an approach to the retrieval problem which relies heavily on an extended object oriented/rule based system architecture that is supplemented with causal background information. We use a distributed agent architecture to help support the reasoning process of lawyers. Our approach to integrating rule based reasoning, case based reasoning and case based retrieval is contrasted to the CABARET and PROLEXS architectures which rely on a centralised blackboard architecture. We discuss in detail how our various cooperating agents interact, and provide examples of the system at work. The IKBALS system uses a specialised induction algorithm to induce rules from cases. These rules are then used as indices during the case based retrieval process. Because we aim to build legal support tools which can be modified to suit various domains rather than single purpose legal expert systems, we focus on principles behind developing legal knowledge based systems. The original domain chosen was the Accident Compensation Act 1989 (Victoria, Australia), which relates to the provision of benefits for employees injured at work. For various reasons, which are indicated in the paper, we changed our domain to that of the Credit Act 1984 (Victoria, Australia). This Act regulates the provision of loans by financial institutions. The rule based part of our system which provides advice on the Credit Act has been commercially developed in conjunction with a legal firm. We indicate how this work has led to the development of a methodology for constructing rule based legal knowledge based systems. We explain the process of integrating this existing commercial rule based system with the case based reasoning and retrieval architecture.
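The idea of using induced rules as indices for case based retrieval can be pictured with a small, purely hypothetical sketch (the attribute names, rule format and case identifiers below are invented for illustration and are not IKBALS's actual representation):

def retrieve(facts, rules):
    """facts: attribute -> value description of the new fact situation.
    rules: list of (conditions, case_ids) pairs, where each rule's conditions
    were induced from past cases and the rule indexes those cases."""
    hits = set()
    for conditions, case_ids in rules:
        if all(facts.get(attr) == value for attr, value in conditions.items()):
            hits.update(case_ids)
    return hits

# Hypothetical induced rules indexing cases from a case base
rules = [
    ({"loan_type": "consumer", "disclosure_given": False}, {"case_12", "case_31"}),
    ({"loan_type": "mortgage"}, {"case_07"}),
]
print(retrieve({"loan_type": "consumer", "disclosure_given": False}, rules))

Cases indexed by every rule whose conditions match the new fact situation are returned as candidates for the case based reasoning step.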

Relevance: 80.00%

Abstract:

Reinforced concrete structures are susceptible to a variety of deterioration mechanisms due to creep and shrinkage, alkali-silica reaction (ASR), carbonation, and corrosion of the reinforcement. The deterioration problems can affect the integrity and load carrying capacity of the structure. Substantial research has been dedicated to these various mechanisms, aiming to identify the causes, reactions, accelerants, retardants and consequences. This has improved our understanding of the long-term behaviour of reinforced concrete structures. However, the strengthening of reinforced concrete structures for durability has to date been mainly undertaken after expert assessment of field data, followed by the development of a scheme to both terminate continuing degradation, by separating the structure from the environment, and strengthen the structure. The process does not include any significant consideration of the residual load-bearing capacity of the structure and the highly variable nature of estimates of such remaining capacity. Development of performance curves for deteriorating bridge structures has not been attempted due to the difficulty in developing a model when the input parameters have an extremely large variability. This paper presents a framework developed for an asset management system which assesses residual capacity and identifies the most appropriate rehabilitation method for a given reinforced concrete structure exposed to aggressive environments. In developing the framework, several industry consultation sessions have been conducted to identify the input data required, the research methodology and the output knowledge base. Capturing expert opinion in a useable knowledge base requires development of a rule based formulation, which can subsequently be used to model the reliability of the performance curve of a reinforced concrete structure exposed to a given environment.
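As a purely illustrative sketch of what a rule based formulation of such expert knowledge could look like (the attributes, thresholds and recommendations below are invented for illustration and are not drawn from the framework or the industry consultations):

def recommend(structure, rules):
    """Return the action of the first rule whose condition the structure satisfies.
    structure: dict of condition-state attributes for the assessed element."""
    for condition, action in rules:
        if condition(structure):
            return action
    return "no rule fired: refer for expert assessment"

# Hypothetical expert rules (values are placeholders, not engineering guidance)
rules = [
    (lambda s: s["chloride_pct"] > 0.4 and s["cover_mm"] < 30, "patch repair and protective coating"),
    (lambda s: s["carbonation_depth_mm"] >= s["cover_mm"], "re-alkalisation treatment"),
]
print(recommend({"chloride_pct": 0.6, "cover_mm": 25, "carbonation_depth_mm": 5}, rules))

In a fuller formulation, each rule could also carry a confidence or reliability weight so that the fired rules can feed the probabilistic performance curve mentioned above.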

Relevance: 80.00%

Abstract:

Precise, up-to-date and increasingly detailed road maps are crucial for various advanced road applications, such as lane-level vehicle navigation and advanced driver assistance systems. Very high resolution (VHR) imagery from digital airborne sources can greatly facilitate data acquisition, collection and updating if the road details can be automatically extracted from the aerial images. In this paper, we propose an effective approach to detecting road lane information from aerial images using object-oriented image analysis. Our proposed algorithm starts with constructing the DSM and true orthophotos from the stereo images. The road lane details are then detected using an object-oriented rule based image classification approach. Because other objects with similar spectral and geometrical attributes interfere with the detection, the extracted road lanes are filtered using the road surface obtained by a progressive two-class decision classifier. The generated road network is evaluated using datasets provided by the Queensland Department of Main Roads. The evaluation shows completeness values that range between 76% and 98% and correctness values that range between 82% and 97%.
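The completeness and correctness figures quoted are the standard road-extraction quality measures; a minimal sketch of how they are computed from matched network lengths is shown below (the buffer matching that produces the matched lengths, and any tolerance settings, are assumptions left out here):

def completeness_correctness(matched_ref_len, total_ref_len,
                             matched_ext_len, total_ext_len):
    """Standard road-extraction evaluation measures:
    completeness = matched reference length / total reference length
    correctness  = matched extracted length / total extracted length
    The matched lengths are typically obtained by buffer matching the extracted
    network against the reference network (that step is not shown here)."""
    return matched_ref_len / total_ref_len, matched_ext_len / total_ext_len

# e.g. 9.8 km of a 10 km reference matched; 9.7 km of a 10 km extraction correct
print(completeness_correctness(9.8, 10.0, 9.7, 10.0))  # (0.98, 0.97)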

Relevance: 80.00%

Abstract:

The increasing diversity of the Internet has created a vast number of multilingual resources on the Web. A huge number of these documents are written in various languages other than English. Consequently, the demand for searching in non-English languages is growing exponentially. It is desirable that a search engine can search for information over collections of documents in other languages. This research investigates techniques for developing high-quality Chinese information retrieval systems. A distinctive feature of Chinese text is that a Chinese document is a sequence of Chinese characters with no space or boundary between Chinese words. This feature makes Chinese information retrieval more difficult, since a retrieved document which contains the query term as a sequence of Chinese characters may not be really relevant to the query, because the query term (as a sequence of Chinese characters) may not be a valid Chinese word in those documents. On the other hand, a document that is actually relevant may not be retrieved because it does not contain the query sequence but contains other relevant words. In this research, we propose two approaches to deal with these problems. In the first approach, we propose a hybrid Chinese information retrieval model by incorporating word-based techniques with the traditional character-based techniques. The aim of this approach is to investigate the influence of Chinese segmentation on the performance of Chinese information retrieval. Two ranking methods are proposed to rank retrieved documents based on the relevancy to the query calculated by combining character-based ranking and word-based ranking. Our experimental results show that Chinese segmentation can improve the performance of Chinese information retrieval, but the improvement is not significant if it incorporates only Chinese segmentation with the traditional character-based approach. In the second approach, we propose a novel query expansion method which applies text mining techniques in order to find the most relevant words to extend the query. Unlike most existing query expansion methods, which generally select highly frequent indexing terms from the retrieved documents to expand the query, our approach utilizes text mining techniques to find patterns in the retrieved documents that highly correlate with the query term and then uses the relevant words in the patterns to expand the original query. This research project develops and implements a Chinese information retrieval system for evaluating the proposed approaches. There are two stages in the experiments. The first stage is to investigate whether high accuracy segmentation can improve Chinese information retrieval. In the second stage, a text mining based query expansion approach is implemented, and a further experiment has been done to compare the performance of the proposed text mining based query expansion method with the standard Rocchio approach. The NTCIR5 Chinese collections are used in the experiments. The experimental results show that, by incorporating the text mining based query expansion with the hybrid model, significant improvement has been achieved in both precision and recall assessments.
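One simple way to picture a combined character/word ranking is the sketch below (the thesis proposes two specific ranking methods; the plain linear combination and the weight used here are only a generic illustration):

def hybrid_score(char_score, word_score, alpha=0.5):
    """Combine a character-based relevance score with a word-based (segmented)
    relevance score for one document. alpha is an illustrative mixing weight,
    not a value taken from the thesis."""
    return alpha * char_score + (1 - alpha) * word_score

def rank(doc_ids, char_scores, word_scores, alpha=0.5):
    """Rank document ids by the combined score, highest first."""
    return sorted(doc_ids, reverse=True,
                  key=lambda d: hybrid_score(char_scores[d], word_scores[d], alpha))

Query expansion would then re-run such a ranking after adding the pattern-derived terms to the original query.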

Relevance: 80.00%

Abstract:

Applied Theatre is an umbrella term for a range of drama-based techniques, all of which align with a lineage of pedagogical theory and practice (e.g. Freire, Moreno, Heathcote). It encompasses methods and forms including Drama Education (O’Neill); Forum Theatre (Boal); and Process Drama (Haseman, O’Toole). Applied theatre often occurs in non-theatrical settings (schools, hospitals, prisons) with the aim of helping participants address issues of local concern. Increasingly, Applied Theatre practices are utilised in the corporate environment. Applied Theatre adopts artistic principles in production, but posits a practical utility beyond simple entertainment.

Relevance: 80.00%

Abstract:

The band The Escalators, together with the music uniquely composed for it and a subsequent CD and DVDs, was the work that emerged from my period of research. The areas of interest that were investigated were sampling, minimalism, stasis, the work of David Lynch, as well as a desire to produce new and innovative music. The above concepts defined the band's composition and makeup. While each may be regarded as a discrete concept with its own boundaries, in my work they seamlessly intermingle, resulting in that which is unique to The Escalators' sound. The research methodology used for this work was practice-led research.

Relevance: 80.00%

Abstract:

The present rate of technological advance continues to place significant demands on data storage devices. The sheer amount of digital data being generated each year, along with consumer expectations, fuels these demands. At present, most digital data is stored magnetically, in the form of hard disk drives or on magnetic tape. The increase in areal density (AD) of magnetic hard disk drives over the past 50 years has been of the order of 100 million times, and current devices are storing data at ADs of the order of hundreds of gigabits per square inch. However, it has been known for some time that the progress in this form of data storage is approaching fundamental limits. The main limitation relates to the lower size limit that an individual bit can have for stable storage. Various techniques for overcoming these fundamental limits are currently the focus of considerable research effort. Most attempt to improve current data storage methods, or modify these slightly for higher density storage. Alternatively, three dimensional optical data storage is a promising field for the information storage needs of the future, offering very high density, high speed memory. There are two ways in which data may be recorded in a three dimensional optical medium: either bit-by-bit (similar in principle to an optical disc medium such as CD or DVD) or by using pages of bit data. Bit-by-bit techniques for three dimensional storage offer high density but are inherently slow due to the serial nature of data access. Page-based techniques, where a two-dimensional page of data bits is written in one write operation, can offer significantly higher data rates, due to their parallel nature. Holographic Data Storage (HDS) is one such page-oriented optical memory technique. This field of research has been active for several decades, but with few commercial products presently available. Another page-oriented optical memory technique involves recording pages of data as phase masks in a photorefractive medium. A photorefractive material is one in which the refractive index can be modified by light of the appropriate wavelength and intensity, and this property can be used to store information in these materials. In phase mask storage, two dimensional pages of data are recorded into a photorefractive crystal as refractive index changes in the medium. A low-intensity readout beam propagating through the medium will have its intensity profile modified by these refractive index changes, and a CCD camera can be used to monitor the readout beam and thus read the stored data. The main aim of this research was to investigate data storage using phase masks in the photorefractive crystal, lithium niobate (LiNbO3). Firstly, the experimental methods for storing the two dimensional pages of data (a set of vertical stripes of varying lengths) in the medium are presented. The laser beam used for writing, whose intensity profile is modified by an amplitude mask which contains a pattern of the information to be stored, illuminates the lithium niobate crystal, and the photorefractive effect causes the patterns to be stored as refractive index changes in the medium. These patterns are read out non-destructively using a low intensity probe beam and a CCD camera. A common complication of information storage in photorefractive crystals is the issue of destructive readout. This is a problem particularly for holographic data storage, where the readout beam should be at the same wavelength as the beam used for writing.
Since the charge carriers in the medium are still sensitive to the read light field, the readout beam erases the stored information. A method to avoid this is by using thermal fixing. Here the photorefractive medium is heated to temperatures above 150°C; this process forms an ionic grating in the medium. This ionic grating is insensitive to the readout beam and therefore the information is not erased during readout. A non-contact method for determining temperature change in a lithium niobate crystal is presented in this thesis. The temperature-dependent birefringent properties of the medium cause intensity oscillations to be observed for a beam propagating through the medium during a change in temperature. It is shown that each oscillation corresponds to a particular temperature change, and by counting the number of oscillations observed, the temperature change of the medium can be deduced. The presented technique for measuring temperature change could easily be applied to a situation where thermal fixing of data in a photorefractive medium is required. Furthermore, by using an expanded beam and monitoring the intensity oscillations over a wide region, it is shown that the temperature in various locations of the crystal can be monitored simultaneously. This technique could be used to deduce temperature gradients in the medium. It is shown that the three dimensional nature of the recording medium causes interesting degradation effects to occur when the patterns are written for a longer-than-optimal time. This degradation results in the splitting of the vertical stripes in the data pattern, and for long writing exposure times this process can result in the complete deterioration of the information in the medium. It is shown that simply by using incoherent illumination, the original pattern can be recovered from the degraded state. The reason for the recovery is that the refractive index changes causing the degradation are of a smaller magnitude, since they are induced by the write field components scattered from the written structures. During incoherent erasure, the lower magnitude refractive index changes are neutralised first, allowing the original pattern to be recovered. The degradation process is shown to be reversed during the recovery process, and a simple relationship is found relating the time at which particular features appear during degradation and recovery. A further outcome of this work is that a minimum stripe width of 30 µm is required for accurate storage and recovery of the information in the medium; any size smaller than this results in incomplete recovery. The degradation and recovery process could be applied to an application in image scrambling or cryptography for optical information storage. A two dimensional numerical model based on the finite-difference beam propagation method (FD-BPM) is presented and used to gain insight into the pattern storage process. The model shows that the degradation of the patterns is due to the complicated path taken by the write beam as it propagates through the crystal, and in particular the scattering of this beam from the induced refractive index structures in the medium. The model indicates that the highest quality pattern storage would be achieved with a thin 0.5 mm medium; however, this type of medium would also remove the degradation property of the patterns and the subsequent recovery process.
To overcome the simplistic treatment of the refractive index change in the FD-BPM model, a fully three dimensional photorefractive model developed by Devaux is presented. This model provides significant insight into the pattern storage, particularly for the degradation and recovery process, and confirms the theory that the recovery of the degraded patterns is possible since the refractive index changes responsible for the degradation are of a smaller magnitude. Finally, detailed analysis of the pattern formation and degradation dynamics for periodic patterns of various periodicities is presented. It is shown that stripe widths in the write beam of greater than 150 µm result in the formation of different types of refractive index changes, compared with the stripes of smaller widths. As a result, it is shown that the pattern storage method discussed in this thesis has an upper feature size limit of 150 µm for accurate and reliable pattern storage.
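The non-contact temperature measurement described above reduces to a very small computation: count the birefringence-induced intensity oscillations in the transmitted beam and multiply by a calibrated temperature change per oscillation. A minimal sketch, assuming the per-oscillation calibration value is known for the particular crystal, wavelength and geometry (the value itself is not taken from the thesis):

import numpy as np

def temperature_change(intensity, delta_t_per_oscillation):
    """Estimate the temperature change from a recorded transmitted-intensity trace
    by counting local maxima (full oscillations) and scaling by the calibrated
    temperature change per oscillation."""
    x = np.asarray(intensity, dtype=float)
    peaks = np.flatnonzero((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])) + 1
    return len(peaks) * delta_t_per_oscillation

# synthetic trace with 5 full oscillations and a (made-up) 2.0 degree calibration
t = np.linspace(0, 1, 500)
print(temperature_change(np.sin(2 * np.pi * 5 * t), delta_t_per_oscillation=2.0))  # 10.0

Monitoring several regions of an expanded beam with the same counting step would give the spatially resolved temperature changes mentioned above.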

Relevance: 80.00%

Abstract:

AC motors are widely used in modern systems, from household appliances to automated industry applications such as ventilation systems, fans, pumps, conveyors and machine tool drives. Inverters are widely used in industrial and commercial applications due to the growing need for speed control in ASD systems. Fast switching transients and the common mode voltage, in interaction with parasitic capacitive couplings, may cause many unwanted problems in ASD applications. These include shaft voltage and leakage currents. One of the inherent characteristics of Pulse Width Modulation (PWM) techniques is the generation of the common mode voltage, which is defined as the voltage between the electrical neutral of the inverter output and the ground. Shaft voltage can cause bearing currents when it exceeds the breakdown voltage level of the thin lubricant film between the inner and outer rings of the bearing. This phenomenon is the main reason for early bearing failures. Rapid development in power switch technology has led to a drastic reduction in switching rise and fall times. Because there is considerable capacitance between the stator windings and the frame, there can be a significant capacitive current (ground current escaping to earth through stray capacitors inside a motor) if the common mode voltage has high frequency components. This current leads to noise and Electromagnetic Interference (EMI) issues in motor drive systems. These problems have been dealt with using a variety of methods which have been reported in the literature. However, cost and maintenance issues have prevented these methods from being widely accepted. Extra cost or rating of the inverter switches is usually the price to pay for such approaches. Thus, the determination of cost-effective techniques for shaft and common mode voltage reduction in ASD systems, with the focus on the first step of the design process, is the targeted scope of this thesis. An introduction to this research – including a description of the research problem, the literature review and an account of the research progress linking the research papers – is presented in Chapter 1. Electrical power generation from renewable energy sources, such as wind energy systems, has become a crucial issue because of environmental problems and a predicted future shortage of traditional energy sources. Thus, Chapter 2 focuses on the shaft voltage analysis of stator-fed induction generators (IGs) and Doubly Fed Induction Generators (DFIGs) in wind turbine applications. This shaft voltage analysis includes topologies, high frequency modelling, calculation and mitigation techniques. A back-to-back AC-DC-AC converter is investigated in terms of shaft voltage generation in a DFIG. Different topologies of LC filter placement are analysed in an effort to eliminate the shaft voltage. Different capacitive couplings exist in the motor/generator structure and any change in design parameters affects these capacitive couplings. Thus, an appropriate design for AC motors should lead to the smallest possible shaft voltage. Calculation of the shaft voltage based on different capacitive couplings, and an investigation of the effects of different design parameters, are discussed in Chapter 3. This is achieved through 2-D and 3-D finite element simulation and experimental analysis. End-winding parameters of the motor are also effective factors in the calculation of the shaft voltage and have not been taken into account in previously reported studies.
Calculation of the end-winding capacitances is rather complex because of the diversity of end winding shapes and the complexity of their geometry. A comprehensive analysis of these capacitances has been carried out with 3-D finite element simulations and experimental studies to determine their effective design parameters. These are documented in Chapter 4. Results of this analysis show that, by choosing appropriate design parameters, it is possible to decrease the shaft voltage and resultant bearing current in the primary stage of generator/motor design without using any additional active or passive filter-based techniques. The common mode voltage is defined by the switching pattern and, by using an appropriate pattern, the common mode voltage level can be controlled. Therefore, any PWM pattern which eliminates or minimizes the common mode voltage will be an effective shaft voltage reduction technique. Thus, common mode voltage reduction of a three-phase AC motor supplied with a single-phase diode rectifier is the focus of Chapter 5. The proposed strategy is mainly based on proper utilization of the zero vectors. Multilevel inverters, which have more voltage levels and switching states, are also used in ASD systems and can provide more possibilities to reduce the common mode voltage. The common mode voltage of multilevel inverters is investigated in Chapter 6. Chapter 7 investigates techniques for eliminating the shaft voltage in a DFIG, based on the methods presented in the literature, by the use of simulation results. However, it could be shown that every solution to reduce the shaft voltage in DFIG systems has its own characteristics, and these have to be taken into account in determining the most effective strategy. Calculation of the capacitive coupling and electric fields between the outer and inner races and the balls at different motor speeds, in symmetrical and asymmetrical shaft and ball positions, is discussed in Chapter 8. The analysis is carried out using finite element simulations to determine the conditions which will increase the probability of high rates of bearing failure due to current discharges through the balls and races.
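The switching-state dependence of the common mode voltage can be illustrated with a short sketch for a two-level three-phase inverter, using the standard definition v_cm = (v_a + v_b + v_c)/3 with pole voltages referenced to the DC-link midpoint; this is a textbook simplification for illustration, not the thesis's analysis of the single-phase-rectifier-fed case:

from itertools import product

def common_mode_voltages(vdc=1.0):
    """Common mode voltage for each of the eight switching states of a two-level
    three-phase inverter: v_cm = (v_a + v_b + v_c) / 3, with each pole voltage
    at +vdc/2 (upper switch on) or -vdc/2 (lower switch on), referenced to the
    DC-link midpoint."""
    for state in product((0, 1), repeat=3):
        poles = [vdc / 2 if s else -vdc / 2 for s in state]
        print(state, round(sum(poles) / 3, 4))

common_mode_voltages()  # active vectors give +/-vdc/6; zero vectors give +/-vdc/2

The zero vectors producing the largest instantaneous common mode voltage is what makes their utilization in the modulation pattern such an effective lever for common mode voltage control.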

Relevance: 80.00%

Abstract:

Corneal-height data are typically measured with videokeratoscopes and modeled using a set of orthogonal Zernike polynomials. We address the estimation of the number of Zernike polynomials, which is formalized as a model-order selection problem in linear regression. Classical information-theoretic criteria tend to overestimate the order of the corneal surface model due to the weakness of their penalty functions, while bootstrap-based techniques tend to underestimate it or require extensive processing. In this paper, we propose to use the efficient detection criterion (EDC), which has the same general form as information-theoretic criteria, as an alternative for estimating the optimal number of Zernike polynomials. We first show, via simulations, that the EDC outperforms a large number of information-theoretic criteria and resampling-based techniques. We then illustrate that using the EDC for real corneas results in models that are in closer agreement with clinical expectations and provides a means for distinguishing normal corneal surfaces from astigmatic and keratoconic surfaces.
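A generic sketch of criterion-based model-order selection of this kind is shown below; the design columns would be the successive Zernike polynomial terms evaluated at the measurement points, and the penalty function shown in the usage comment is a BIC-like placeholder standing in for the EDC penalty, which is not reproduced here:

import numpy as np

def select_order(y, design_columns, penalty):
    """Fit least squares for each candidate order k and return the k minimising
    n*log(RSS/n) + penalty(k, n). The choice of penalty is what distinguishes
    AIC/BIC/EDC-style criteria."""
    n = len(y)
    best_k, best_score = None, np.inf
    for k in range(1, len(design_columns) + 1):
        X = np.column_stack(design_columns[:k])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        score = n * np.log(rss / n) + penalty(k, n)
        if score < best_score:
            best_k, best_score = k, score
    return best_k

# usage sketch: select_order(heights, zernike_columns, penalty=lambda k, n: k * np.log(n))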

Relevance: 80.00%

Abstract:

A distinctive feature of Chinese text is that a Chinese document is a sequence of Chinese characters with no space or boundary between Chinese words. This feature makes Chinese information retrieval more difficult, since a retrieved document which contains the query term as a sequence of Chinese characters may not be really relevant to the query, because the query term (as a sequence of Chinese characters) may not be a valid Chinese word in those documents. On the other hand, a document that is actually relevant may not be retrieved because it does not contain the query sequence but contains other relevant words. In this research, we propose a hybrid Chinese information retrieval model by incorporating word-based techniques with the traditional character-based techniques. The aim of this approach is to investigate the influence of Chinese segmentation on the performance of Chinese information retrieval. Two ranking methods are proposed to rank retrieved documents based on the relevancy to the query calculated by combining character-based ranking and word-based ranking. Our experimental results show that Chinese segmentation can improve the performance of Chinese information retrieval, but the improvement is not significant if it incorporates only Chinese segmentation with the traditional character-based approach.

Relevance: 80.00%

Abstract:

Having undergraduates work in teams can be a problematic endeavour, sometimes exacerbated for the student by poor prior experiences, a predisposition to an individual orientation of assessment, and simply the busyness that now typifies the life of a student. But effort in pedagogical design is worthwhile, as team work is often a prerequisite in terms of graduate capabilities, robust learning, increased motivation, and indeed in terms of equipping individuals for emergent knowledge-age work practice, often epitomised by collaborative effort in both blended and virtual contexts. Through an iterative approach, based extensively on the established literature, we have developed a successful scaffold which is workable with a large cohort group (n > 800), such that it affords students the lived experience of being a part of a learning network. Individuals within teams work together to develop individual components that are subsequently aggregated and reified into an overall team knowledge artefact. We describe our case and propose a pedagogical model of scaffolding based on three perspectives: conceptual, rule-based and community-driven. This model provides designers with guidelines for producing and refining assessment tasks for team-based learning.

Relevance: 80.00%

Abstract:

The tear film plays an important role in preserving the health of the ocular surface and maintaining the optimal refractive power of the cornea. Moreover, dry eye syndrome is one of the most commonly reported eye health problems. This syndrome is caused by abnormalities in the properties of the tear film. Current clinical tools to assess the tear film properties have shown certain limitations. The traditional invasive methods for the assessment of tear film quality, which are used by most clinicians, have been criticized for their lack of reliability and/or repeatability. A range of non-invasive methods of tear assessment have been investigated, but these also present limitations. Hence no “gold standard” test is currently available to assess the tear film integrity. Therefore, improving techniques for the assessment of tear film quality is of clinical significance and is the main motivation for the work described in this thesis. In this study, tear film surface quality (TFSQ) changes were investigated by means of high-speed videokeratoscopy (HSV). In this technique, a set of concentric rings formed in an illuminated cone or a bowl is projected on the anterior cornea and their reflection from the ocular surface is imaged on a charge-coupled device (CCD). The reflection of the light is produced in the outermost layer of the cornea, the tear film. Hence, when the tear film is smooth, the reflected image presents a well structured pattern. In contrast, when the tear film surface presents irregularities, the pattern also becomes irregular due to the scatter and deviation of the reflected light. The videokeratoscope provides an estimate of the corneal topography associated with each Placido disk image. Topographical estimates, which have been used in the past to quantify tear film changes, may not always be suitable for the evaluation of all the dynamic phases of the tear film. However, the Placido disk image itself, which contains the reflected pattern, may be more appropriate for assessing the tear film dynamics. A set of novel routines have been purposely developed to quantify the changes of the reflected pattern and to extract a time series estimate of the TFSQ from the video recording. The routine extracts from each frame of the video recording a maximized area of analysis. In this area a metric of the TFSQ is calculated. Initially, two metrics, based on Gabor filter and Gaussian gradient-based techniques, were used to quantify the consistency of the pattern’s local orientation as a measure of TFSQ. These metrics have helped to demonstrate the applicability of HSV to assess the tear film, and the influence of contact lens wear on TFSQ. The results suggest that the dynamic-area analysis method of HSV was able to distinguish and quantify the subtle but systematic degradation of tear film surface quality in the inter-blink interval in contact lens wear. It was also able to clearly show a difference between bare eye and contact lens wearing conditions. Thus, the HSV method appears to be a useful technique for quantitatively investigating the effects of contact lens wear on the TFSQ. Subsequently, a larger clinical study was conducted to compare HSV with two other non-invasive techniques, lateral shearing interferometry (LSI) and dynamic wavefront sensing (DWS). Of these non-invasive techniques, the HSV appeared to be the most precise method for measuring TFSQ, by virtue of its lower coefficient of variation.
The LSI, in turn, appeared to be the most sensitive method for analyzing the tear build-up time (TBUT). The capability of each of the non-invasive methods to discriminate dry eye from normal subjects was also investigated. The receiver operating characteristic (ROC) curves were calculated to assess the ability of each method to predict dry eye syndrome. The LSI technique gave the best results under both natural blinking conditions and suppressed blinking conditions, closely followed by HSV. The DWS did not perform as well as LSI or HSV. The main limitation of the HSV technique, identified during the aforementioned clinical study, was the lack of sensitivity to quantify the build-up/formation phase of the tear film cycle. For that reason an extra metric based on image transformation and block processing was proposed. In this metric, the area of analysis was transformed from Cartesian to polar coordinates, converting the concentric ring pattern into an image of quasi-straight lines from which a block statistic was extracted. This metric showed better sensitivity under low pattern disturbance and also improved the performance of the ROC curves. Additionally, a theoretical study, based on ray-tracing techniques and topographical models of the tear film, was proposed to fully comprehend the HSV measurement and the instrument’s potential limitations. Of special interest was the assessment of the instrument’s sensitivity to subtle topographic changes. The theoretical simulations have helped to provide some understanding of the tear film dynamics; for instance, the model extracted for the build-up phase has helped to provide some insight into the dynamics during this initial phase. Finally, some aspects of the mathematical modeling of TFSQ time series are reported in this thesis. Over the years, different functions have been used to model the time series as well as to extract the key clinical parameters (i.e., timing). Unfortunately, those techniques to model the tear film time series do not simultaneously consider the underlying physiological mechanism and the parameter extraction methods. A set of guidelines is proposed to meet both criteria. Special attention was given to a commonly used fit, the polynomial function, and to considerations for selecting the appropriate model order to ensure that the true derivative of the signal is accurately represented. The work described in this thesis has shown the potential of using high-speed videokeratoscopy to assess tear film surface quality. A set of novel image and signal processing techniques have been proposed to quantify different aspects of tear film assessment, analysis and modeling. The dynamic-area HSV has shown good performance in a broad range of conditions (i.e., contact lens, normal and dry eye subjects). As a result, this technique could be a useful clinical tool for assessing tear film surface quality in the future.
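The polar-unwrapping block metric described above can be sketched briefly (the grid sizes, the block statistic and the aggregation below are placeholders chosen for illustration, not the values or the exact statistic used in the thesis):

import numpy as np

def block_metric_polar(img, centre, r_max, n_r=64, n_theta=256, block=16):
    """Toy polar-unwrapping block metric: sample the area of analysis on an
    (r, theta) grid so that concentric rings become near-straight lines, then
    summarise local pattern regularity with a per-block statistic."""
    cy, cx = centre
    r = np.linspace(0, r_max, n_r)
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    rr, tt = np.meshgrid(r, theta, indexing='ij')
    ys = np.clip((cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
    xs = np.clip((cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
    polar = img[ys, xs]                       # rings unwrapped to rows
    stats = [polar[i:i + block, j:j + block].std()
             for i in range(0, n_r, block)
             for j in range(0, n_theta, block)]
    return float(np.mean(stats))              # one TFSQ value per frame

Applying this to every frame of the recording yields the TFSQ time series that is subsequently modelled.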

Relevance: 80.00%

Abstract:

Cell based therapies require cells capable of self renewal and differentiation, and a prerequisite is the ability to prepare an effective dose of ex vivo expanded cells for autologous transplants. The in vivo identification of a source of physiologically relevant cell types suitable for cell therapies is therefore an integral part of tissue engineering. Bone marrow is the most easily accessible source of mesenchymal stem cells (MSCs), and harbours two distinct populations of adult stem cells, namely hematopoietic stem cells (HSCs) and bone mesenchymal stem cells (BMSCs). Unlike for HSCs, there are as yet no rigorous criteria for characterizing BMSCs. Changing understanding about the pluripotency of BMSCs in recent studies has expanded their potential application; however, the underlying molecular pathways which impart the features distinctive to BMSCs remain elusive. Furthermore, the sparse in vivo distribution of these cells imposes a clear limitation on their in vitro study. Also, when BMSCs are cultured in vitro there is a loss of the in vivo microenvironment, which results in a progressive decline in proliferation potential and multipotentiality. This is further exacerbated with increased passage number, characterized by the onset of senescence related changes. Accordingly, establishing protocols for generating large numbers of BMSCs without affecting their differentiation potential is necessary. The principal aims of this thesis were to identify potential molecular factors for characterizing BMSCs from osteoarthritic patients, and also to attempt to establish culture protocols favourable for generating large numbers of BMSCs, while at the same time retaining their proliferation and differentiation potential. Previously published studies concerning clonal cells have demonstrated that BMSCs are heterogeneous populations of cells at various stages of growth. Some cells are higher in the hierarchy and represent the progenitors, while other cells occupy a lower position in the hierarchy and are therefore more committed to a particular lineage. This feature of BMSCs was made evident by the work of Mareddy et al., which involved generating clonal populations of BMSCs from the bone marrow of osteoarthritic patients by a single cell clonal culture method. Proliferation potential and differentiation capabilities were used to group cells into fast growing and slow growing clones. The study presented here is a continuation of the work of Mareddy et al. and employed immunological and array based techniques to identify the primary molecular factors involved in regulating the phenotypic characteristics exhibited by contrasting clonal populations. Subtractive immunization (SI) was used to generate novel antibodies against favourably expressed proteins in the fast growing clonal cell population. The difference between the clonal populations at the transcriptional level was determined using a Stem Cell RT2 Profiler™ PCR Array, which focuses on stem cell pathway gene expression. Monoclonal antibodies (mAbs) generated by SI were able to effectively highlight differentially expressed antigenic determinants, as was evident from Western blot analysis and confocal microscopy. Co-immunoprecipitation, followed by mass spectroscopy analysis, identified a favourably expressed protein as the cytoskeletal protein vimentin. The stem cell gene array highlighted genes that were highly upregulated in the fast growing clonal cell population.
Based on their functions, these genes were grouped into growth factors, cell fate determination, and maintenance of embryonic and neural stem cell renewal. Furthermore, on closer analysis it was established that the cytoskeletal protein vimentin and nine out of ten genes identified by the gene array were associated with chondrogenesis or cartilage repair, consistent with the potential role played by BMSCs in defect repair and maintaining tissue homeostasis, by modulating the gene expression pattern to compensate for degenerated cartilage in osteoarthritic tissues. The gene array also presented transcripts for embryonic lineage markers such as FOXA2 and Sox2, both of which were significantly over expressed in fast growing clonal populations. A recent groundbreaking study by Yamanaka et al. imparted embryonic stem cell (ESC)-like characteristics to somatic cells, in a process termed nuclear reprogramming, by the ectopic expression of the genes Sox2, cMyc and Oct4. The expression of embryonic lineage markers in adult stem cells may be a mechanism by which the favourable behaviour of fast growing clonal cells is determined and suggests a possible active phenomenon of spontaneous reprogramming in fast growing clonal cells. The expression pattern of these critical molecular markers could be indicative of the competence of BMSCs. For this reason, the expression pattern of Sox2, Oct4 and cMyc at various passages in heterogeneous BMSC populations and tissue derived cells (osteoblasts and chondrocytes) was investigated by real-time PCR and immunofluorescence staining. Strong nuclear staining was observed for Sox2, Oct4 and cMyc, which gradually weakened, accompanied by cytoplasmic translocation, after several passages. The mRNA and protein expression of Sox2, Oct4 and cMyc peaked at the third passage for osteoblasts, chondrocytes and BMSCs, and declined with each subsequent passage, indicating a possible mechanism of spontaneous reprogramming. This study proposes that the progressive decline in proliferation potential and multipotentiality associated with increased passaging of BMSCs in vitro might be a consequence of the loss of these pro-pluripotency factors. We therefore hypothesise that the expression of these master genes is not an intrinsic cell function, but rather an outcome of interaction of the cells with their microenvironment; this was evident from the fact that, when removed from their in vivo microenvironment, BMSCs undergo a rapid loss of stemness after only a few passages. One of the most interesting aspects of this study was the integration of factors in the culture conditions which, to some extent, mimicked the in vivo microenvironmental niche of the BMSCs. A number of studies have successfully established that the cellular niche is not an inert tissue component but is of prime importance. The total sum of stimuli from the microenvironment underpins the complex interplay of regulatory mechanisms which control multiple functions in stem cells, most importantly stem cell renewal. Therefore, well characterised factors which affect BMSC characteristics, such as fibronectin (FN) coating, and morphogens such as FGF2 and BMP4, were incorporated into the cell culture conditions. The experimental set-up was designed to provide insight into the expression pattern of the stem cell related transcription factors Sox2, cMyc and Oct4 in BMSCs with respect to passaging and changes in culture conditions.
Induction of these pluripotency markers in somatic cells by retroviral transfection has been shown to confer pluripotency and an ESC-like state. Our study demonstrated that all treatments could transiently induce the expression of Sox2, cMyc and Oct4, and favourably affect the proliferation potential of BMSCs. The combined effect of these treatments was able to induce and retain the endogenous nuclear expression of stem cell transcription factors in BMSCs over an extended number of in vitro passages. Our results therefore suggest that the transient induction and manipulation of endogenous expression of transcription factors critical for stemness can be achieved by modulating the culture conditions, the benefit of which is to circumvent the need for genetic manipulations. In summary, this study has explored the role of BMSCs in the diseased state of osteoarthritis, by employing transcriptional profiling along with SI. In particular, this study pioneered the use of primary cells for generating novel antibodies by SI. We established that somatic cells and BMSCs have a basal level of expression of pluripotency markers. Furthermore, our study indicates that intrinsic signalling mechanisms of BMSCs are intimately linked with extrinsic cues from the microenvironment and that these signals appear to be critical for retaining the expression of genes to maintain cell stemness in long-term in vitro culture. This project provides a basis for developing an “artificial niche” required for reversion of commitment and maintenance of BMSCs in their uncommitted homeostatic state.

Relevance: 80.00%

Abstract:

Gait recognition approaches continue to struggle with challenges including view-invariance, low-resolution data, robustness to unconstrained environments, and fluctuating gait patterns due to subjects carrying goods or wearing different clothes. Although computationally expensive, model based techniques offer promise over appearance based techniques for these challenges as they gather gait features and interpret gait dynamics in skeleton form. In this paper, we propose a fast 3D ellipsoidal-based gait recognition algorithm using a 3D voxel model derived from multi-view silhouette images. This approach directly solves the limitations of view dependency and self-occlusion in existing ellipse fitting model-based approaches. Voxel models are segmented into four components (left and right legs, above and below the knee), and ellipsoids are fitted to each region using eigenvalue decomposition. Features derived from the ellipsoid parameters are modeled using a Fourier representation to retain the temporal dynamic pattern for classification. We demonstrate the proposed approach using the CMU MoBo database and show that an improvement of 15-20% can be achieved over a 2D ellipse fitting baseline.
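The ellipsoid-fitting step can be sketched with a few lines of Python; this is a simplified stand-in for the paper's fitting of the four segmented voxel regions (the scaling of the semi-axes and the absence of any voxel weighting are assumptions made here):

import numpy as np

def fit_ellipsoid(voxels):
    """Fit an ellipsoid to one segment's voxel coordinates (an N x 3 array) by
    eigenvalue decomposition of the covariance matrix: the centroid gives the
    centre, the eigenvectors give the orientation, and the square roots of the
    eigenvalues give the relative semi-axis lengths."""
    pts = np.asarray(voxels, dtype=float)
    centre = pts.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov((pts - centre).T))
    return centre, np.sqrt(evals), evecs

Repeating the fit per segment and per frame gives the time-varying ellipsoid parameters that are then summarised with a Fourier representation for classification.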