787 results for Gradient-based approaches
Abstract:
Gait energy images (GEIs) and their variants form the basis of many recent appearance-based gait recognition systems. The GEI combines good recognition performance with a simple implementation, though it suffers from problems inherent to appearance-based approaches, such as being highly view-dependent. In this paper, we extend the concept of the GEI to 3D, creating what we call the gait energy volume, or GEV. A basic GEV implementation is tested on the CMU MoBo database, showing improvements over both the GEI baseline and a fused multi-view GEI approach. We also demonstrate the efficacy of this approach on partial volume reconstructions created from frontal depth images, which can be more practically acquired, for example, in biometric portals implemented with stereo cameras or other depth acquisition systems. Experiments on frontal depth images are evaluated on an in-house database captured using the Microsoft Kinect, and demonstrate the validity of the proposed approach.
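At their core, the GEI and its 3D extension described in this abstract are temporal averages of aligned binary silhouettes. A minimal sketch of both constructions (pure Python; function names are illustrative, not taken from the paper):

```python
def gait_energy_image(silhouettes):
    """Average a sequence of aligned, equally sized binary 2D silhouettes.

    Each output cell is the fraction of frames in which that pixel was
    part of the silhouette (0.0 = never, 1.0 = always)."""
    n = len(silhouettes)
    rows, cols = len(silhouettes[0]), len(silhouettes[0][0])
    return [[sum(f[r][c] for f in silhouettes) / n for c in range(cols)]
            for r in range(rows)]


def gait_energy_volume(voxel_frames):
    """The same averaging applied to aligned binary 3D voxel grids."""
    n = len(voxel_frames)
    xs, ys, zs = (len(voxel_frames[0]), len(voxel_frames[0][0]),
                  len(voxel_frames[0][0][0]))
    return [[[sum(f[x][y][z] for f in voxel_frames) / n for z in range(zs)]
             for y in range(ys)]
            for x in range(xs)]
```

The averaged image or volume then serves as the feature vector for a downstream classifier; the partial-volume case from frontal depth images simply restricts which voxels are ever occupied.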
Abstract:
Many current HCI, social networking, ubiquitous computing, and context-aware designs require access to, or collect, significant personal information about the user in order to function. This raises concerns about privacy and security in both the research community and the mainstream media. From a practical perspective, in the social world, secrecy and security are an ongoing accomplishment rather than something that is set up once and left alone. We explore how design can support privacy as practical action, and investigate the collective information practices and the privacy and security concerns of participants in a mobile social software system for ride sharing. This paper contributes an understanding of HCI security and privacy tensions, discovered while “designing in use” with a Reflective, Agile, Iterative Design (RAID) method.
Abstract:
It is a big challenge to guarantee the quality of discovered relevance features in text documents for describing user preferences, because of the large number of terms, patterns, and noise. Most existing popular text mining and classification methods have adopted term-based approaches. However, they have all suffered from the problems of polysemy and synonymy. Over the years, people have often held the hypothesis that pattern-based methods should perform better than term-based ones in describing user preferences, but many experiments do not support this hypothesis. This research presents a promising method, Relevance Feature Discovery (RFD), for solving this challenging issue. It discovers both positive and negative patterns in text documents as high-level features, in order to accurately weight low-level features (terms) based on their specificity and their distributions in the high-level features. This research also introduces an adaptive model (called ARFD) to enhance the flexibility of using RFD in adaptive environments. ARFD automatically updates the system's knowledge based on a sliding window over new incoming feedback documents. It can efficiently decide which incoming documents bring new knowledge into the system. Substantial experiments using the proposed models on Reuters Corpus Volume 1 and TREC topics show that the proposed models significantly outperform both state-of-the-art term-based methods underpinned by Okapi BM25, Rocchio or Support Vector Machines, and other pattern-based methods.
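As a reference point, the Okapi BM25 baseline named in this abstract is a standard term-based weighting scheme. A compact sketch (one common variant of the IDF term; the parameter defaults k1=1.2, b=0.75 are conventional, not from this research):

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.2, b=0.75):
    """Okapi BM25 score of each tokenised document against a tokenised query."""
    n_docs = len(docs)
    avgdl = sum(len(d) for d in docs) / n_docs
    df = Counter()                       # document frequency per term
    for d in docs:
        df.update(set(d))
    scores = []
    for d in docs:
        tf = Counter(d)                  # term frequency in this document
        s = 0.0
        for q in query:
            if tf[q] == 0:
                continue
            idf = math.log((n_docs - df[q] + 0.5) / (df[q] + 0.5) + 1.0)
            s += idf * tf[q] * (k1 + 1) / (
                tf[q] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores
```

Term-based scoring like this treats each query term independently, which is exactly the weakness (polysemy, synonymy) that the pattern-based RFD features aim to address.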
Abstract:
Information mismatch and overload are two fundamental issues influencing the effectiveness of information filtering systems. Although both term-based and pattern-based approaches have been proposed to address these issues, neither approach alone can provide a satisfactory decision for determining relevant information. This paper presents a novel two-stage decision model for solving the issues. The first stage is a novel rough analysis model that addresses the overload problem. The second stage is a pattern taxonomy mining model that addresses the mismatch problem. The experimental results on RCV1 and TREC filtering topics show that the proposed model significantly outperforms state-of-the-art filtering systems.
Abstract:
Curriculum developers and researchers have promoted context-based programmes to arrest waning student interest and participation in the enabling sciences at high school and university. Context-based programmes aim for connections between scientific discourse and real-world contexts to elevate curricular relevance without diminishing conceptual understanding. Literature relating to context-based approaches to learning will be reviewed in this chapter. In particular, international trends in curricular development and results from evaluations of major projects (e.g. PLON, Salters Advanced Chemistry, ChemCom) will be highlighted. Research projects that explore context-based interventions focusing on outcomes such as student interest, perceived relevance and conceptual understanding will also feature in the review. The chapter culminates with a discussion of current context-based research that interprets classroom actions from a dialectical socio-cultural framework, and identifies possible new directions for research.
Abstract:
In this paper, we apply a simulation-based approach to estimating transmission rates of nosocomial pathogens. In particular, the objective is to infer the transmission rate between colonised health-care practitioners and uncolonised patients (and vice versa) solely from routinely collected incidence data. The method, using approximate Bayesian computation, is substantially less computationally intensive and easier to implement than the likelihood-based approaches we refer to here. We find that, by replacing the likelihood with a comparison of an efficient summary statistic between observed and simulated data, little is lost in the precision of the estimated transmission rates. Furthermore, we investigate the impact of incorporating uncertainty in previously fixed parameters on the precision of the estimated transmission rates.
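The core idea, replacing likelihood evaluation with a comparison of summary statistics between observed and simulated data, can be illustrated with the simplest ABC rejection sampler. This is a generic toy sketch, not the authors' epidemic model:

```python
import random

def abc_rejection(observed_summary, prior_sample, simulate, summarize,
                  tolerance, n_draws=10000):
    """Approximate Bayesian computation by rejection sampling.

    Draw parameters from the prior, simulate data, and keep only the
    draws whose simulated summary statistic lies within `tolerance`
    of the observed summary. The accepted draws approximate the
    posterior without ever evaluating a likelihood."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        summary = summarize(simulate(theta))
        if abs(summary - observed_summary) <= tolerance:
            accepted.append(theta)
    return accepted
```

For example, with a uniform prior on a rate, a Bernoulli simulator standing in for the transmission process, and the sample mean as the summary statistic, the accepted draws cluster around the rate consistent with the observed data. Shrinking the tolerance trades acceptance rate for posterior accuracy.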
Abstract:
This study proposes a full Bayes (FB) hierarchical modeling approach to traffic crash hotspot identification. The FB approach is able to account for all uncertainties associated with crash risk and various risk factors by estimating a posterior distribution of site safety, on which various ranking criteria can be based. Moreover, through hierarchical model specification, the FB approach can flexibly take into account various heterogeneities of crash occurrence due to spatiotemporal effects on traffic safety. Using Singapore intersection crash data (1997-2006), an empirical evaluation was conducted to compare the proposed FB approach with state-of-the-art approaches. Results show that the Bayesian hierarchical models accommodating site-specific effects and serial correlation have better goodness-of-fit than non-hierarchical models. Furthermore, all model-based approaches perform significantly better in safety ranking than the naive approach using raw crash counts. The FB hierarchical models were found to significantly outperform the standard EB approach in correctly identifying hotspots.
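The gap between model-based ranking and ranking by raw crash counts can be seen even in a radically simplified stand-in for the FB model: a conjugate Gamma-Poisson posterior for each site's crash rate. This sketch is illustrative only; the study's hierarchical model with spatiotemporal effects is far richer:

```python
def posterior_rank(crash_counts, exposures, alpha=1.0, beta=1.0):
    """Rank sites by posterior mean crash rate under a Gamma(alpha, beta)
    prior and Poisson counts: posterior mean = (alpha + y) / (beta + e).

    Returns site indices, highest estimated risk first."""
    rates = [(alpha + y) / (beta + e)
             for y, e in zip(crash_counts, exposures)]
    return sorted(range(len(rates)), key=lambda i: rates[i], reverse=True)
```

With counts [10, 9] and exposures [10, 2] (say, site-years), ranking by raw counts puts site 0 first, while the posterior mean rate correctly puts the lightly exposed but crash-prone site 1 first.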
Abstract:
Pre-service teacher education is unfinished business. New social education teachers face the challenge of fluid policy environments in which curriculum content and pedagogy are continually changing. The evolving Australian curriculum is the most recent example of such fluidity with its emphasis on shifting the educational agenda to a focus on discipline-based approaches. This paper addresses the concerns of final year pre-service and early career social education teachers, in terms of their professional development needs, by drawing on the findings of a pilot study with students and recent graduates from a university in south-east Queensland. It concludes that social education curriculum units which embed links to professional practice and professional development in teaching, learning and assessment may provide the way forward for enhancing the transition to practice for beginning teachers and assist them in navigating constant change.
Abstract:
Articular cartilage is a highly resilient tissue located at the ends of long bones. It has a zonal structure, which has functional significance in load-bearing. Cartilage does not spontaneously heal itself when damaged, and untreated cartilage lesions or age-related wear often lead to osteoarthritis (OA). OA is a degenerative condition that is highly prevalent, age-associated, and significantly affects patient mobility and quality of life. There is no cure for OA, and patients usually resort to replacing the biological joint with an artificial prosthesis. An alternative approach is to dynamically regenerate damaged or diseased cartilage through cartilage tissue engineering, where cells, materials, and stimuli are combined to form new cartilage. However, despite extensive research, major limitations remain that have prevented the widespread application of tissue-engineered cartilage. Critically, there is a dearth of information on whether autologous chondrocytes obtained from OA patients can be used to successfully generate cartilage tissues with the structural hierarchy typically found in normal articular cartilage. I aim to address these limitations in this thesis by showing that chondrocyte subpopulations isolated from macroscopically normal areas of the cartilage can be used to engineer stratified cartilage tissues and that compressive loading plays an important role in zone-dependent biosynthesis of these chondrocytes. I first demonstrate that chondrocyte subpopulations from the superficial (S) and middle/deep (MD) zones of OA cartilage are responsive to compressive stimulation in vitro, and that the effect of compression on construct quality is zone-dependent. I also show that compressive stimulation can influence pericellular matrix production, matrix metalloproteinase secretion, and cytokine expression in zonal chondrocytes in an alginate hydrogel model.
Subsequently, I focus on recreating the zonal structure by forming layered constructs using the alginate-released chondrocyte (ARC) method either with or without polymeric scaffolds. Resulting zonal ARC constructs had hyaline morphology, and expressed cartilage matrix molecules such as proteoglycans and collagen type II in both scaffold-free and scaffold-based approaches. Overall, my findings demonstrate that chondrocyte subpopulations obtained from OA joints respond sensitively to compressive stimulation, and are able to form cartilaginous constructs with stratified organization similar to native cartilage using the scaffold-free and scaffold-based ARC technique. The ultimate goal in tissue engineering is to help provide improved treatment options for patients suffering from debilitating conditions such as OA. Further investigations in developing functional cartilage replacement tissues using autologous chondrocytes will bring us a step closer to improving the quality of life for millions of OA patients worldwide.
Abstract:
This chapter examines why policy decision-makers opt for command-and-control environmental regulation despite the availability of a plethora of market-based instruments that are more efficient and cost-effective. Interestingly, Sri Lanka has adopted a wholly command-and-control system, under both pre- and post-liberalisation economic policies. This chapter first examines the merits and demerits of command-and-control and market-based approaches, and then looks at Sri Lanka’s extensive environmental regulatory framework. The chapter then examines the likely reasons why the country has gone down the path of inflexible regulatory measures and has become entrenched in them. The various hypotheses are discussed and empirical evidence is provided. The chapter also discusses the consequences of an environmentally slack economy and the policy implications stemming from adopting a wholly regulatory approach. The chapter concludes with a discussion of the main results.
Abstract:
Regenerative medicine-based approaches for the repair of damaged cartilage rely on the ability to propagate cells while promoting their chondrogenic potential. Thus, conditions for cell expansion should be optimized through careful environmental control. Appropriate oxygen tension, cell expansion substrates, and controllable bioreactor systems are probably critical for expansion and subsequent tissue formation during chondrogenic differentiation. We therefore evaluated the effects of oxygen and microcarrier culture on the expansion and subsequent differentiation of human osteoarthritic chondrocytes. Freshly isolated chondrocytes were expanded on tissue culture plastic or CultiSpher-G microcarriers under hypoxic or normoxic conditions (5% or 20% oxygen partial pressure, respectively), followed by cell phenotype analysis with flow cytometry. Cells were redifferentiated in micromass pellet cultures over 4 weeks, under either hypoxia or normoxia. Chondrocytes cultured on tissue culture plastic proliferated faster, expressed higher levels of the cell surface markers CD44 and CD105, and demonstrated stronger staining for proteoglycans and collagen type II in pellet cultures compared with microcarrier-cultivated cells. Pellet wet weight, glycosaminoglycan content and expression of chondrogenic genes were significantly increased in cells differentiated under hypoxia. Hypoxia-inducible factor-3alpha mRNA was up-regulated in these cultures in response to low oxygen tension. These data confirm the beneficial influence of reduced oxygen on ex vivo chondrogenesis. However, hypoxia during cell expansion and microcarrier bioreactor culture does not enhance intrinsic chondrogenic potential. Further improvements in cell culture conditions are therefore required before chondrocytes from osteoarthritic and aged patients can become a useful cell source for cartilage regeneration.
Abstract:
Circulating tumour cells (CTCs) have attracted much recent interest in cancer research as a potential biomarker and as a means of studying the process of metastasis. It has long been understood that metastasis is a hallmark of malignancy, and nineteenth-century conceptual theories on the basis of metastasis foretold the existence of a tumour "seed" capable of establishing discrete tumours in the "soil" of distant organs. This prescient "seed and soil" hypothesis accurately predicted the existence of CTCs: microscopic tumour fragments in the blood, at least some of which are capable of forming metastases. However, it is only in recent years that reliable, reproducible methods of CTC detection and analysis have been developed. To date, the majority of studies have employed the CellSearch™ system (Veridex LLC), which is an immunomagnetic purification method. Other promising techniques include microfluidic filters, isolation of tumour cells by size using microporous polycarbonate filters, and flow cytometry-based approaches. While many challenges still exist, the detection of CTCs in blood is becoming increasingly feasible, giving rise to some tantalizing questions about the use of CTCs as a potential biomarker. CTC enumeration has been used to guide prognosis in patients with metastatic disease, and to act as a surrogate marker for disease response during therapy. Other possible uses for CTC detection include prognostication in early stage patients, identifying patients requiring adjuvant therapy, and surveillance for the detection of relapsing disease. Another exciting possible use for CTC detection assays is the molecular and genetic characterization of CTCs to act as a "liquid biopsy" representative of the primary tumour. Indeed, it has already been demonstrated that it is possible to detect HER2, KRAS and EGFR mutation status in breast, colon and lung cancer CTCs respectively.
In the course of this review, we shall discuss the biology of CTCs and their role in metastagenesis, the most commonly used techniques for their detection and the evidence to date of their clinical utility, with particular reference to lung cancer.
Abstract:
Entity-oriented search has become an essential component of modern search engines. It focuses on retrieving a list of entities, or information about specific entities, instead of documents. In this paper, we study the problem of finding entity-related information, referred to as attribute-value pairs, which plays a significant role in searching for target entities. We propose a novel decomposition framework combining reduced relations with a discriminative model, the Conditional Random Field (CRF), for automatically finding entity-related attribute-value pairs in free-text documents. This decomposition framework allows us to locate potential text fragments and identify the hidden semantics, in the form of attribute-value pairs, for user queries. Empirical analysis shows that the decomposition framework outperforms pattern-based approaches due to its effective integration of syntactic and semantic features.
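At decoding time, a linear-chain CRF labels each token (e.g. with attribute/value/other tags) by maximising a global score over label sequences with the Viterbi algorithm. A minimal sketch with a pluggable scoring function; the feature weights a trained CRF would supply are left abstract here:

```python
def viterbi(tokens, states, score):
    """Best label sequence under a pluggable local scoring function.

    score(tokens, i, prev_state, state) returns the combined
    emission-plus-transition score of labelling token i with `state`
    after `prev_state` (prev_state is None at position 0)."""
    # best[i][s] = (best score of any sequence ending in state s, that path)
    best = [{s: (score(tokens, 0, None, s), [s]) for s in states}]
    for i in range(1, len(tokens)):
        layer = {}
        for s in states:
            prev, (p_score, p_path) = max(
                ((ps, best[i - 1][ps]) for ps in states),
                key=lambda kv: kv[1][0] + score(tokens, i, kv[0], s))
            layer[s] = (p_score + score(tokens, i, prev, s), p_path + [s])
        best.append(layer)
    return max(best[-1].values(), key=lambda v: v[0])[1]
```

With a toy scorer that rewards labelling "price" as an attribute and digit tokens as values, the decoder recovers an attribute-value pair from a fragment like "price = 10".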
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the requirement of expensive devices, the risk of stolen bio-templates, etc. Moreover, existing approaches usually perform the authentication process only once, at the start of a session. Non-intrusive and continuous monitoring of user activities emerges as a promising solution for hardening the authentication process by extending the third category: iii-2) how someone behaves. In recent years, various keystroke-dynamics behavior-based approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static-text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free-text approaches that allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions. Unsolved problems include, for instance, scalability, high response times, and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and response times.
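A common building block in free-text keystroke dynamics is the digraph latency profile: the mean inter-key latency for each consecutive key pair, compared between an enrolled profile and a live session. A minimal sketch (names and the distance measure are illustrative; deployed systems use richer features and classifiers):

```python
def digraph_profile(key_times):
    """Build a typing profile from (key, timestamp_seconds) events:
    mean latency for each consecutive key pair (digraph)."""
    latencies = {}
    for (k1, t1), (k2, t2) in zip(key_times, key_times[1:]):
        latencies.setdefault((k1, k2), []).append(t2 - t1)
    return {pair: sum(v) / len(v) for pair, v in latencies.items()}


def profile_distance(a, b):
    """Mean absolute latency difference over digraphs shared by two
    profiles; infinity when there is no overlap to compare."""
    shared = set(a) & set(b)
    if not shared:
        return float("inf")
    return sum(abs(a[p] - b[p]) for p in shared) / len(shared)
```

Continuous verification then reduces to recomputing the live profile over a sliding window and flagging the session when the distance to the enrolled profile exceeds a threshold.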
Abstract:
We consider the problem of how to maximize the secure connectivity of multi-hop wireless ad hoc networks after deployment. Two approaches, based on graph augmentation problems with nonlinear edge costs, are formulated. The first is based on establishing a secret key using only the links that are already secured by secret keys. This problem is NP-hard and does not admit a polynomial-time approximation scheme (PTAS), since the minimum cutsets to be augmented do not admit constant costs. The second is based on increasing the power level between a pair of nodes that share a secret key to enable them to connect physically. This problem can be formulated as an optimal key establishment problem with interference constraints and two objectives: (i) maximizing the concurrent key-establishment flow, and (ii) minimizing the cost. We show that both problems are NP-hard and MAX-SNP-hard (i.e., it is NP-hard to approximate them within a factor of 1 + ε for some ε > 0) via a reduction from the MAX3SAT problem. Thus, we design and implement a fully distributed algorithm for authenticated key establishment in wireless sensor networks, where each sensor knows only its one-hop neighborhood. Our witness-based approaches find witnesses in the multi-hop neighborhood to authenticate key establishment between two sensor nodes that do not share a key and are not connected through a secure path.
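Two primitives from this setting can be sketched directly: finding a path that traverses only already-keyed links, and finding witnesses, i.e. nodes that already share a key with both endpoints and can vouch for a new key between them. A simplified sketch (one-hop witness search only, whereas the paper's algorithm searches the multi-hop neighborhood):

```python
from collections import deque

def _adjacency(secured_edges):
    """Undirected adjacency map over links secured by shared keys."""
    adj = {}
    for u, v in secured_edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    return adj


def secure_path(secured_edges, src, dst):
    """Breadth-first search over already-keyed links; returns a secure
    path from src to dst, or None if the keyed subgraph disconnects them."""
    adj = _adjacency(secured_edges)
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in adj.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None


def witnesses(secured_edges, a, b):
    """Nodes already sharing a key with both a and b: one-hop candidates
    to authenticate a new a-b key."""
    adj = _adjacency(secured_edges)
    return adj.get(a, set()) & adj.get(b, set())
```

The first graph-augmentation problem in the abstract asks which new keys to establish so that `secure_path` succeeds for more node pairs; the witness search is what lets a new key be authenticated without a trusted third party.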