911 results for CAPTURE THREADS
Abstract:
Recently, user tagging systems have grown in popularity on the web. The tagging process is quite simple for ordinary users, which contributes to its popularity. However, free-vocabulary tagging lacks standardization and suffers from semantic ambiguity. It is possible to capture the semantics of user tagging and represent them in the form of an ontology, but the application of the learned ontology to recommendation making has not flourished. In this paper we discuss our approach to learning a domain ontology from user tagging information and apply the extracted tag ontology in a pilot tag recommendation experiment. The initial results show that by using the tag ontology to re-rank the recommended tags, the accuracy of the tag recommendation can be improved.
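As a concrete illustration of the re-ranking step described above, here is a minimal sketch; the scoring scheme, tag names and `boost` parameter are illustrative assumptions, not details taken from the paper:

```python
# Hypothetical ontology-assisted re-ranking: a candidate tag's base score is
# boosted when the ontology relates it to tags the user has already applied.

def rerank(candidates, user_tags, ontology, boost=0.5):
    """candidates: {tag: base_score}; ontology: {tag: set of related tags}."""
    scores = {}
    for tag, base in candidates.items():
        related = ontology.get(tag, set())
        # Boost in proportion to overlap with the user's existing tags.
        scores[tag] = base + boost * len(related & set(user_tags))
    return sorted(scores, key=scores.get, reverse=True)

candidates = {"python": 0.9, "snake": 0.8, "coding": 0.6}
ontology = {"python": {"programming"}, "coding": {"programming"}, "snake": {"animal"}}
print(rerank(candidates, ["programming"], ontology))  # → ['python', 'coding', 'snake']
```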
Abstract:
We review accounting and finance research on corporate governance (CG). In the course of our review, we focus on a particularly vexing issue, namely endogeneity in the relationships between CG and other matters of concern to accounting and finance scholars, and suggest ways to deal with it. Given the advent of large commercial CG databases, we also stress the importance of how CG is measured and, in particular, the construction of CG indices, which should be sensitive to local institutional arrangements, and the need to capture both internal and external aspects of governance. The ‘stickiness’ of CG characteristics provides an additional challenge to CG scholars. Better theory is required, for example, to explain whether various CG practices substitute for each other or are complements. While a multidisciplinary approach to developing better theory is never without its difficulties, it could enrich the current body of knowledge in CG. Despite the vastness of the existing CG literature, these issues do suggest a number of avenues for future research.
Abstract:
Texture analysis and textural cues have been applied to image classification, segmentation and pattern recognition. Dominant texture descriptors include directionality, coarseness, line-likeness, etc. In this dissertation a class of textures known as particulate textures is defined, which are predominantly coarse or blob-like. The set of features that characterise particulate textures differs from those that characterise classical textures. These features are micro-texture, macro-texture, size, shape and compaction. Classical texture analysis techniques do not adequately capture particulate texture features. This gap is identified and new methods for analysing particulate textures are proposed. The levels of complexity in particulate textures are also presented, ranging from the simplest images, where blob-like particles are easily isolated from their background, to the more complex images, where the particles and the background are not easily separable or the particles are occluded. Simple particulate images can be analysed for particle shapes and sizes. Complex particulate texture images, on the other hand, often permit only the estimation of particle dimensions. Real-life applications of particulate textures are reviewed, including applications to sedimentology, granulometry and road surface texture analysis. A new framework for the computation of particulate shape is proposed. A granulometric approach for particle size estimation based on edge detection is developed, which can be adapted to the gray level of the images by varying its parameters. This study binds visual texture analysis and road surface macrotexture in a theoretical framework, thus making it possible to apply monocular imaging techniques to road surface texture analysis.
Results from the application of the developed algorithm to road surface macrotexture are compared with results based on Fourier spectra, the autocorrelation function and wavelet decomposition, indicating the superior performance of the proposed technique. The influence of image acquisition conditions such as illumination and camera angle on the results was systematically analysed. Experimental data were collected from over 5 km of road in Brisbane, and the estimated coarseness along the road was compared with laser profilometer measurements. A coefficient of determination (R²) exceeding 0.9 was obtained when correlating the proposed imaging technique with the state-of-the-art Sensor Measured Texture Depth (SMTD) obtained using laser profilometers.
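For reference, the coefficient of determination reported above is computed as follows; this is the standard formula, not code from the dissertation, and the toy data are illustrative:

```python
def r_squared(y, y_hat):
    """R^2 = 1 - SS_res / SS_tot between measurements y and predictions y_hat."""
    mean = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))  # residual sum of squares
    ss_tot = sum((a - mean) ** 2 for a in y)              # total sum of squares
    return 1.0 - ss_res / ss_tot

# A perfect prediction gives R^2 = 1.0:
print(r_squared([1.2, 0.9, 1.5], [1.2, 0.9, 1.5]))  # → 1.0
```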
Abstract:
This chapter will begin by considering some of the distinctive features of media as creative industries, including their assessment of risk and return on investment, team-based production, the management of creativity, the value chain of production, distribution and circulation, and the significance of intellectual property in their revenue strategies. It will then critically appraise three strategies to capture new markets and revenue streams in the context of the rise of the Internet, digital media and globally networked distribution. The three strategies to be considered are conglomeration, networking and globalization, and the focus will be on the media giants such as News Corporation, Disney and Time-Warner. It will be argued that all three present considerable challenges in their application, and that digital media technologies are weakening rather than strengthening these companies' capacity to control the global media environment. The chapter will conclude with consideration of some implications of this analysis for questions of media power.
Abstract:
The behaviour of ion channels within cardiac and neuronal cells is intrinsically stochastic in nature. When the number of channels is small, this stochastic noise is large and can have an impact on the dynamics of the system, which is potentially an issue when modelling small neurons and drug block in cardiac cells. While exact methods correctly capture the stochastic dynamics of a system, they are computationally expensive, restricting their inclusion in tissue-level models, and so approximations to exact methods are often used instead. The other issue in modelling ion channel dynamics is that the transition rates are voltage dependent, adding a level of complexity as the channel dynamics are coupled to the membrane potential. By assuming that such transition rates are constant over each time step, it is possible to derive a stochastic differential equation (SDE), in the same manner as for biochemical reaction networks, that describes the stochastic dynamics of ion channels. While such a model is more computationally efficient than exact methods, we show that there are analytical problems with the resulting SDE, as well as issues in using current numerical schemes to solve such an equation. We therefore make two contributions: we develop a different model to describe the stochastic ion channel dynamics that analytically behaves in the correct manner, and we discuss numerical methods that preserve the analytical properties of the model.
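As a rough sketch of the kind of SDE model discussed above, the following Euler-Maruyama loop integrates a chemical-Langevin-type equation for a two-state (closed/open) channel population, with the transition rates held constant over each step. The parameter values are illustrative assumptions, and the crude clamping to [0, n] papers over exactly the boundary problem the abstract identifies:

```python
import math
import random

def simulate_open_channels(n=100, alpha=0.5, beta=0.3, dt=0.01, steps=1000, seed=1):
    """Euler-Maruyama for dX = (alpha*(n-X) - beta*X) dt + sqrt(alpha*(n-X) + beta*X) dW."""
    rng = random.Random(seed)
    x = 0.0  # number of open channels, treated as a continuous variable
    for _ in range(steps):
        drift = alpha * (n - x) - beta * x
        diffusion = math.sqrt(max(alpha * (n - x) + beta * x, 0.0))
        x += drift * dt + diffusion * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x = min(max(x, 0.0), n)  # ad hoc clamp: naive SDE schemes can exit [0, n]
    return x
```

At equilibrium the mean open count is n·α/(α+β), i.e. 62.5 with these illustrative parameters, about which the simulated trajectory fluctuates.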
Abstract:
One of the fundamental motivations underlying computational cell biology is to gain insight into the complicated dynamical processes taking place, for example, on the plasma membrane or in the cytosol of a cell. These processes are often so complicated that purely temporal mathematical models cannot adequately capture the complex chemical kinetics and transport processes of, for example, proteins or vesicles. On the other hand, spatial models such as Monte Carlo approaches can have very large computational overheads. This chapter gives an overview of the state of the art in the development of stochastic simulation techniques for the spatial modelling of dynamic processes in a living cell.
Abstract:
Endocytosis is the process by which cells internalise molecules, including nutrient proteins, from the extracellular medium. In one form, macropinocytosis, the membrane at the cell surface ruffles and folds over to give rise to an internalised vesicle. Negatively charged phospholipids within the membrane called phosphoinositides then undergo a series of transformations that are critical for the correct trafficking of the vesicle within the cell, and which are often pirated by pathogens such as Salmonella. Advanced fluorescent video microscopy imaging now allows the detailed observation and quantification of these events in live cells over time. Here we use these observations as a basis for building differential equation models of the transformations. An initial investigation of these interactions was modelled with reaction rates proportional to the sum of the concentrations of the individual constituents. A first-order linear system for the concentrations results. The structure of the system enables analytical expressions to be obtained, and the problem becomes one of determining the reaction rates which generate the observed data plots. We present results with reaction rates which capture the general behaviour of the reactions, so that we now have a complete mathematical model of phosphoinositide transformations that fits the experimental observations. Some excellent fits are obtained with modulated exponential functions; however, these are not solutions of the linear system. The question arises as to how the model may be modified to obtain a system whose solution provides a more accurate fit.
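Schematically, rates proportional to the constituent concentrations give a first-order linear system of the form (generic notation, not symbols from the paper):

```latex
\frac{d\mathbf{c}}{dt} = A\,\mathbf{c}(t),
\qquad
\mathbf{c}(t) = e^{At}\,\mathbf{c}(0) = \sum_k \alpha_k\, e^{\lambda_k t}\,\mathbf{v}_k ,
```

where, for a diagonalizable rate matrix A, the λ_k and v_k are its eigenvalues and eigenvectors. Solutions of such a system are plain sums of exponentials, which is why modulated exponentials that fit the data well can fall outside its solution set.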
Abstract:
Plate elements are used in many engineering applications. In-plane loads and deformations have significant influence on the vibration characteristics of plate elements. Numerous methods have been developed to quantify the effects of in-plane loads and deformations of individual plate elements with different boundary conditions based on their natural frequencies. However, these developments cannot be applied to the plate elements in a structural system, as the natural frequency is a global parameter for the entire structure. This highlights the need for a method to quantify in-plane deformations of plate elements in structural framing systems. Motivated by this gap in knowledge, this research has developed a comprehensive vibration-based procedure to quantify in-plane deformation of plate elements in a structural framing system. This procedure, with its unique capabilities to capture the influence of load migration, boundary conditions and different tributary areas, is presented herein and illustrated through examples.
Abstract:
This paper reports the feasibility and methodological considerations of using the Short Message Service Experience Sampling (SMS-ES) Method, an experience sampling research method developed to assist researchers to collect repeated measures of consumers’ affective experiences. The method combines SMS with web-based technology in a simple yet effective way. It is described using a practical implementation study that collected consumers’ emotions in response to using mobile phones in everyday situations. The method is further evaluated in terms of the quality of data collected in the study, as well as against the methodological considerations for experience sampling studies. These two evaluations suggest that the SMS-ES Method is both a valid and reliable approach for collecting consumers’ affective experiences. Moreover, the method can be applied across a range of for-profit and not-for-profit contexts where researchers want to capture repeated measures of consumers’ affective experiences occurring over a period of time. The benefits of the method are discussed to assist researchers who wish to apply the SMS-ES Method in their own research designs.
Abstract:
There is worldwide interest in reducing aircraft emissions. The difficulty of reducing emissions, including water vapour, carbon dioxide (CO2) and oxides of nitrogen (NOx), stems mainly from the fact that a commercial aircraft is usually designed for a particular optimal cruise altitude but may be requested or required to operate and deviate at different altitudes and speeds to achieve a desired or commanded flight plan, resulting in increased emissions. This is a multidisciplinary problem with multiple trade-offs, such as optimising engine efficiency, minimising fuel burn and minimising emissions while maintaining aircraft separation and air safety. This project presents the coupling of an advanced optimisation technique with mathematical models and algorithms for aircraft emission reduction through flight optimisation. Numerical results show that the method is able to capture a set of useful trade-offs between aircraft range and NOx, and between mission fuel consumption and NOx. In addition, alternative cruise operating conditions, including Mach number and altitude, that produce minimum NOx and CO2 (minimum mission fuel weight) are suggested.
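The trade-off sets mentioned above are Pareto fronts. A toy sketch of extracting the non-dominated candidates, each scored as a (range, NOx) pair, with purely illustrative numbers:

```python
def pareto_front(points):
    """Keep (range, nox) points for which no other point has
    greater-or-equal range AND lower-or-equal NOx (assumes distinct points)."""
    front = []
    for p in points:
        # p is dominated if some other candidate is at least as good on both
        # objectives: more (or equal) range and less (or equal) NOx.
        dominated = any(q != p and q[0] >= p[0] and q[1] <= p[1] for q in points)
        if not dominated:
            front.append(p)
    return front

# (800 km, 6 kg NOx) is dominated by (1000 km, 5 kg NOx):
print(pareto_front([(1000, 5), (900, 4), (800, 6)]))  # → [(1000, 5), (900, 4)]
```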
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture the fundamentally discrete and stochastic nature of cellular biology - most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time stepping, so faster approximations such as the Poisson and Binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately than existing methods.
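For context, the exact SSA referred to above draws an exponential waiting time from the total propensity and fires one reaction at a time (τ-leap methods instead fire many per step). A minimal sketch for a toy degradation reaction X → ∅, which is illustrative and not one of the paper's test systems:

```python
import random

def ssa_decay(x0=50, k=0.1, t_end=100.0, seed=2):
    """Exact SSA for X -> 0 with propensity a(x) = k * x."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while x > 0:
        a = k * x                  # total propensity of the single reaction
        tau = rng.expovariate(a)   # waiting time to the next firing ~ Exp(a)
        if t + tau > t_end:
            break
        t, x = t + tau, x - 1      # the reaction fires: one molecule decays
    return x
```

Note the inefficiency the abstract mentions: the expected step size 1/(k·x) shrinks as propensities grow, so systems with many molecules or fast reactions force very many tiny steps.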
Abstract:
This paper proposes the use of eigenvoice modeling techniques with the Cross Likelihood Ratio (CLR) as a criterion for speaker clustering within a speaker diarization system. The CLR has previously been shown to be a robust decision criterion for speaker clustering using Gaussian Mixture Models. Recently, eigenvoice modeling techniques have become increasingly popular, due to their ability to adequately represent a speaker based on sparse training data, as well as their improved capture of differences in speaker characteristics. This paper hence proposes that it would be beneficial to capitalize on the advantages of eigenvoice modeling in a CLR framework. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show improved clustering performance, resulting in a 35.1% relative improvement in the overall Diarization Error Rate (DER) compared to the baseline system.
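For reference, a common form of the cross likelihood ratio between clusters i and j in the diarization literature is the following; this is the standard definition, and the paper's exact variant may differ:

```latex
\mathrm{CLR}(i, j) =
\frac{1}{N_i}\log\frac{p(X_i \mid \lambda_j)}{p(X_i \mid \lambda_{\mathrm{UBM}})}
+
\frac{1}{N_j}\log\frac{p(X_j \mid \lambda_i)}{p(X_j \mid \lambda_{\mathrm{UBM}})}
```

where X_i are the frames assigned to cluster i, N_i is the frame count, λ_i is the cluster's speaker model (here eigenvoice-adapted), and λ_UBM is a universal background model; the closest pair of clusters under this criterion is merged at each step.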
Abstract:
Experimental action potential (AP) recordings in isolated ventricular myocytes display significant temporal beat-to-beat variability in morphology and duration. Furthermore, significant cell-to-cell differences in AP also exist, even for isolated cells originating from the same region of the same heart. However, current mathematical models of the ventricular AP fail to replicate the temporal and cell-to-cell variability in AP observed experimentally. In this study, we propose a novel mathematical framework for the development of phenomenological AP models capable of capturing cell-to-cell and temporal variability in cardiac APs. A novel stochastic phenomenological model of the AP is developed, based on the deterministic Bueno-Orovio/Fenton model. Experimental recordings of AP are fit to the model to produce AP models of individual cells from the apex and the base of the guinea-pig ventricles. Our results show that the phenomenological model is able to capture the considerable differences in AP recorded from isolated cells originating from these two locations. We demonstrate the closeness of fit to the available experimental data which may be achieved using a phenomenological model, and also demonstrate the ability of the stochastic form of the model to capture the observed beat-to-beat variability in action potential duration.
Abstract:
In an effort to evaluate and improve their practices to ensure the future excellence of the Texas highway system, the Texas Department of Transportation (TxDOT) sought a forum in which experts from other state departments of transportation could share their expertise. Thus, the Peer State Review of TxDOT Maintenance Practices project was organized and conducted for TxDOT by the Center for Transportation Research (CTR) at The University of Texas at Austin. The goal of the project was to conduct a workshop at CTR and in the Austin District that would educate the visiting peers on TxDOT’s maintenance practices and invite their feedback. CTR and TxDOT arranged the participation of the following directors of maintenance: Steve Takigawa, CA; Roy Rissky, KS; Eric Pitts, GA; Jim Carney, MO; Jennifer Brandenburg, NC; and David Bierschbach, WA. One of the means used to capture the peer reviewers’ opinions was a carefully designed booklet of 15 questions. The peers provided TxDOT with written responses to these questions, and the oral comments made during the workshop were also captured. This information was then compiled and summarized in this report. An examination of the peers’ comments suggests that TxDOT should use a more holistic, statewide approach to funding and planning rather than funding and planning for each district separately. Additionally, the peers stressed the importance of allocating funds based on the actual conditions of the roadways instead of on inventory. The visiting directors of maintenance also recommended continuing and expanding programs that enhance communication, such as peer review workshops.