975 results for Information Acquisition


Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND INFORMATION: Evidence has shown that mesenchymal-epithelial transition (MET) and epithelial-mesenchymal transition (EMT) are linked to stem cell properties. We currently lack a model showing how the occurrence of MET and EMT in immortalised cells influences the maintenance of stem cell properties. Thus, we established a project aiming to investigate the roles of EMT and MET in the acquisition of stem cell properties in immortalised oral epithelial cells. RESULTS: In this study, a retroviral transfection vector (pLXSN-hTERT) was used to immortalise oral epithelial cells by insertion of the hTERT gene (hTERT(+)-oral mucosal epithelial cell line [OME]). The protein and RNA expression of EMT transcription factors (Snail, Slug and Twist), their downstream markers (E-cadherin and N-cadherin) and embryonic stem cell markers (OCT4, Nanog and Sox2) was studied by reverse transcription PCR and Western blotting in these cells. Some EMT markers were detected at both the mRNA and protein levels. Adipocytes and bone cells were noted in the multi-differentiation assay, showing that the immortalised cells had undergone EMT. The differentiation assay for hTERT(+)-OME cells revealed the recovery of epithelial phenotypes, implicating the presence of MET. The stem cell properties were confirmed by the detection of appropriate markers. Altered expression of alpha-tubulin and gamma-tubulin in both two-dimensionally cultured (without serum) and three-dimensionally cultured hTERT(+)-OME spheroids indicated a reprogramming of cytoskeletal proteins that is attributed to MET processes in hTERT(+)-OME cells. CONCLUSIONS: EMT and MET are essential for hTERT-immortalised cells to maintain their epithelial stem cell properties.

Relevance:

30.00%

Publisher:

Abstract:

Expert searchers engage with information in a variety of professional settings, as information brokers, reference librarians, information architects and faculty who teach advanced searching. As my recent research shows, the expert searcher’s information experience is defined by profound discernment of critical concepts about information, and a fluid ability to apply this knowledge to their engagement with the information environment. The information experience of the expert searcher means active and intentional participation with the processes and players that created that information environment. Expert searchers become an integral and seamless part of their information environment and also play a role in facilitating the information experiences of others. In this chapter, after discussing my understanding of the concept of information experience, I outline how I used threshold concept theory to explore the information experience of expert searchers. Through the findings, I identify four threshold concepts in the acquisition of search expertise that provide new perspectives on the information experience of the expert searcher. These new perspectives have implications for search engine design and how advanced search skills are taught. Finally, I consider how the fresh insights about the expert searcher’s experiences contribute to wider understanding about information experience.

Relevance:

30.00%

Publisher:

Abstract:

This technical report describes a Light Detection and Ranging (LiDAR)-augmented optimal path-planning methodology for low-level flight by remote sensing and sampling Unmanned Aerial Vehicles (UAVs). The UAV is used to perform remote air sampling and data acquisition from a network of sensors on the ground. The terrain data, in the form of 3D point-cloud maps, are processed by the algorithms to find an optimal path. The results show that the method and algorithm are able to use the LiDAR data to avoid obstacles when planning a path from a start point to a target point. The report compares the performance of the method as the resolution of the LiDAR map is increased and when a Digital Elevation Model (DEM) is included. From a practical point of view, the optimal path plan can be loaded into, and works seamlessly with, the UAV ground station, and the report also shows the UAV ground station software augmented with the more accurate LiDAR data.
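To make the path-planning idea concrete, here is a minimal, hedged sketch (not the report's actual algorithm) of one common approach: rasterise the LiDAR point cloud into an occupancy grid and run A* between a start and a target cell. The cell size, obstacle-height threshold and function names (occupancy_grid, astar) are assumptions for illustration only.

```python
# Minimal sketch (not the report's algorithm): rasterise a LiDAR point cloud
# into a 2D occupancy grid and run A* to find an obstacle-avoiding path.
# Grid resolution, obstacle height threshold and all names are assumptions.
import heapq
import numpy as np

def occupancy_grid(points, cell=1.0, z_max=2.0):
    """Mark a cell occupied if any LiDAR return above z_max falls inside it."""
    xy = np.floor(points[:, :2] / cell).astype(int)
    xy -= xy.min(axis=0)
    grid = np.zeros(xy.max(axis=0) + 1, dtype=bool)
    high = points[:, 2] > z_max
    grid[xy[high, 0], xy[high, 1]] = True
    return grid

def astar(grid, start, goal):
    """4-connected A* over the occupancy grid; returns a list of cells or None."""
    open_set = [(0.0, start)]
    came_from, g = {}, {start: 0.0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dx, cur[1] + dy)
            if not (0 <= nxt[0] < grid.shape[0] and 0 <= nxt[1] < grid.shape[1]):
                continue
            if grid[nxt]:                      # occupied cell -> obstacle
                continue
            tentative = g[cur] + 1.0
            if tentative < g.get(nxt, float("inf")):
                g[nxt], came_from[nxt] = tentative, cur
                h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])
                heapq.heappush(open_set, (tentative + h, nxt))
    return None                                # no path found
```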

Relevance:

30.00%

Publisher:

Abstract:

The aim was to analyse the growth and compositional development of the receptive and expressive lexicons between the ages of 0;9 and 2;0 in full-term (FT) and very-low-birth-weight (VLBW) children acquiring Finnish. The associations between the expressive lexicon and grammar at 1;6 and 2;0 in the FT children were also studied. In addition, the language skills of the VLBW children at 2;0 were analysed, as well as the predictive value of the early lexicon for later language performance. Four groups took part in the studies: longitudinal (N = 35) and cross-sectional (N = 146) samples of FT children, and longitudinal (N = 32) and cross-sectional (N = 66) samples of VLBW children. The data were gathered by applying a structured parental rating method (the Finnish version of the Communicative Development Inventory), through analysis of the children's spontaneous speech and by administering a formal test (Reynell Developmental Language Scales). The FT children acquired their receptive lexicons earlier, at a faster rate and with larger individual variation than their expressive lexicons. The acquisition rate of the expressive lexicon increased from slow to faster in most children (91%). Highly parallel developmental paths for lexical semantic categories were detected in the receptive and expressive lexicons of the Finnish children when analysed in relation to the growth of lexicon size, as described in the literature for children acquiring other languages. The emergence of grammar was closely associated with expressive lexical growth. The VLBW children acquired their receptive lexicons at a slower rate and had weaker language skills at 2;0 than the full-term children. The compositional development of both lexicons proceeded at a slower rate in the VLBW children than in the FT controls. However, when the compositional development was analysed in relation to the growth of lexicon size, it occurred qualitatively in a nearly parallel manner in the VLBW and FT children. Early receptive and expressive lexicon sizes were significantly associated with later language skills in both groups. The effect of the background variables (gender, length of the mother's basic education, birth weight) on language development differed between the FT and the VLBW children. The results provide new information on early language acquisition by Finnish FT and VLBW children. The results support the view that the early acquisition of semantic lexical categories is related to lexicon growth. The current findings also suggest that early grammatical acquisition is closely related to the growth of expressive vocabulary size. The language development of VLBW children should be followed in clinical work.

Relevance:

30.00%

Publisher:

Abstract:

Service researchers have repeatedly claimed that firms should acquire customer information in order to develop services that fit customer needs. Despite this, studies that concentrate on the actual use of customer information in service development are lacking. The present study fills this research gap by investigating information use during a service development process. It demonstrates that use is not a straightforward task that automatically follows the acquisition of customer information. In fact, out of the six identified types of use, four represent non-usage of customer information. Hence, the study demonstrates that the acquisition of customer information does not guarantee that the information will actually be used in development. The current study used an ethnographic approach. Consequently, the study was conducted in the field in real time over an extensive period of 13 months. Participant observation allowed direct access to the investigated phenomenon, i.e. the different types of use by the observed development project members were captured as they emerged. In addition, interviews, informal discussions and internal documents were used to gather data. The development process of a bank's website constituted the empirical context of the investigation. This ethnography brings novel insights to both academia and practice. It critically questions the traditional focus on the firm's acquisition of customer information and suggests that this focus ought to be expanded to the actual use of customer information. What is the point in acquiring costly customer information if it is not used in development? Based on the findings of this study, a holistic view of customer information, "information in use", is generated. This view extends the traditional view of customer information in three ways: the source, timing and form of data collection. First, the study showed that customer information can come explicitly from the customer, from speculation among the developers, or it can already exist implicitly. Prior research has mainly focused on the customer as the information provider and as the explicit source to turn to for information. Second, the study identified that the used and non-used customer information was acquired previously, currently (within the time frame of the focal development process) or potentially in the future. Prior research has primarily focused on currently acquired customer information, i.e. information acquired within the time frame of the development process. Third, the used and non-used customer information was both formally and informally acquired. In prior research, a large number of sophisticated formal methods have been suggested for the acquisition of customer information to be used in development. By focusing on "information in use", new knowledge on the types of customer information that are actually used was generated. For example, the findings show that formal customer information acquired during the development process is used less than customer information already existing within the firm. With this knowledge at hand, better methods to capture this more usable customer information can be developed. Moreover, the thesis suggests that by focusing more strongly on the use of customer information, service development processes can be restructured to facilitate the information that is actually used.

Relevance:

30.00%

Publisher:

Abstract:

In past decades, agricultural work first became heavily mechanised and automation has since been introduced. Today, increasing machine size no longer yields significant productivity gains; instead, work must be made more efficient by making better use of existing resources. This study examines the self-propelled forage harvester chain in grass silage harvesting. The intensity of silage harvesting and the large number of machine units are a demanding combination from the point of view of work management. The aim of the study was to identify the requirements for an information management system to be developed in support of agricultural contracting. A total of 12 contractors or cooperating farmers were interviewed for the study. Based on the study, contractors have a need for information systems; naturally, the scope and organisation of the contracting business affect this. The study indicates that the most central requirements for information management are: as extensive, detailed and automatic data collection on the work performed as possible; a map-based interface and guidance of drivers to the work sites; a customer register and electronic ordering of work; templates for quotations and price calculators; reliability and persistence of data; applicability to many kinds of work; and compatibility with other systems. Based on the study, the system to be developed should thus contain the following components: an easy-to-use planning/customer register tool, functions for machine monitoring, guidance and management, data collection during work, and functions for processing the collected data. Not all users need all of the functions, however, so the contractor must be able to select the parts they need and possibly add functions later. Contractors operating within tight financial and time constraints are demanding customers whose technology must be functional and reliable. On the other hand, even experienced operators make human mistakes, so a good information system makes the work easier and more efficient.
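As a purely illustrative sketch of the kind of data model these requirements point towards (customer register, electronic work orders, map-based job sites, automatically collected work records), the Python classes below are one possible starting point; the field names and structure are assumptions, not a design taken from the study.

```python
# Illustrative data-model sketch only; names and fields are assumptions.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Customer:
    name: str
    contact: str

@dataclass
class JobSite:
    name: str
    lat: float            # map-based guidance of drivers to sites
    lon: float

@dataclass
class WorkOrder:
    customer: Customer
    site: JobSite
    task: str             # e.g. "grass silage harvesting"
    records: list = field(default_factory=list)   # automatically collected data

    def log(self, machine_id: str, quantity: float, unit: str):
        """Append one automatically collected work record."""
        self.records.append((datetime.now(), machine_id, quantity, unit))
```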

Relevance:

30.00%

Publisher:

Abstract:

Distributed compressed sensing exploits the information redundancy inherent in multi-signal ensembles with inter- as well as intra-signal correlations to reconstruct undersampled signals. In this paper we revisit this problem from a different perspective: streaming data from several correlated sources are taken as input to a real-time system which, without any a priori information, incrementally learns and admits each source into the system.
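As a hedged illustration of the underlying idea (not the streaming, incremental scheme proposed in the paper), the sketch below recovers an ensemble of signals that share a sparse support from a common set of random measurements using simultaneous orthogonal matching pursuit, a standard baseline model in distributed compressed sensing. Matrix sizes and names are assumptions.

```python
# Simultaneous-OMP baseline for a jointly sparse ensemble (generic illustration).
import numpy as np

def simultaneous_omp(Phi, Y, k):
    """Recover X (n x s) from Y = Phi @ X (m x s), assuming the s signals
    share the same k-sparse support in the identity basis."""
    m, n = Phi.shape
    residual, support = Y.copy(), []
    for _ in range(k):
        # Pick the atom most correlated with the residuals of *all* signals.
        corr = np.abs(Phi.T @ residual).sum(axis=1)
        support.append(int(np.argmax(corr)))
        sub = Phi[:, support]
        coef, *_ = np.linalg.lstsq(sub, Y, rcond=None)
        residual = Y - sub @ coef
    X = np.zeros((n, Y.shape[1]))
    X[support, :] = coef
    return X

# Toy usage: 3 correlated sources, 5-sparse on a shared support, 40 of 128 samples.
rng = np.random.default_rng(0)
n, m, s, k = 128, 40, 3, 5
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
supp = rng.choice(n, k, replace=False)
X = np.zeros((n, s)); X[supp] = rng.standard_normal((k, s))
X_hat = simultaneous_omp(Phi, Phi @ X, k)
print("max recovery error:", np.max(np.abs(X - X_hat)))
```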

Relevance:

30.00%

Publisher:

Abstract:

In animal populations, the constraints of energy and time can cause intraspecific variation in foraging behaviour. The proximate developmental mediators of such variation are often the mechanisms underlying perception and associative learning. Here, experience-dependent changes in foraging behaviour and their consequences were investigated in an urban population of free-ranging dogs, Canis familiaris, by continually challenging them with the task of extracting food from specially crafted packets. Typically, males and pregnant/lactating (PL) females extracted food using the sophisticated 'gap widening' technique, whereas non-pregnant/non-lactating (NPNL) females used the relatively underdeveloped 'rip opening' technique. In contrast to most males and PL females (and a few NPNL females), which repeatedly used the gap-widening technique and improved their performance in food extraction with experience, most NPNL females (and a few males and PL females) used the two extraction techniques non-preferentially and did not improve over successive trials. Furthermore, the dogs' ability to extract food with the more sophisticated technique was positively related to their ability to improve their performance with experience. Collectively, these findings demonstrate that factors such as sex and physiological state can cause differences among individuals in the likelihood of learning new information and hence in the rate of resource acquisition and monopolization.

Relevance:

30.00%

Publisher:

Abstract:

Following rising demand for GPS positioning, low-cost receivers are becoming widely available, but their energy demands are still too high. For energy-efficient GPS sensing in delay-tolerant applications, the possibility of offloading a few milliseconds of raw signal samples and leveraging the greater processing power of the cloud to obtain a position fix is being actively investigated. In an attempt to reduce the energy cost of this data-offloading operation, we propose Sparse-GPS: a new computing framework for GPS acquisition via sparse approximation. Within the framework, GPS signals can be efficiently compressed by random ensembles. The sparse acquisition information, pertaining to the visible satellites embedded within these limited measurements, can subsequently be recovered by our proposed representation dictionary. Through extensive empirical evaluations, we demonstrate the acquisition quality and energy gains of Sparse-GPS. We show that it is twice as energy efficient as offloading uncompressed data, and has 5-10 times lower energy costs than standalone GPS, with a median positioning accuracy of 40 m.
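A rough sketch of the general mechanism, under stated assumptions rather than the actual Sparse-GPS implementation: raw samples are compressed by a random ±1 ensemble on the device, and the cloud scores candidate satellites by correlating the compressed measurement against equally compressed code replicas. Real acquisition must also search over Doppler and code phase, which is omitted here, and the stand-in codes below are random rather than true PRN sequences.

```python
# Hedged sketch: compress raw GPS samples with a random ensemble, then detect
# visible satellites from the compressed measurement (assumptions throughout).
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_sats, m = 4096, 8, 512          # raw length, satellites, measurements

codes = rng.choice([-1.0, 1.0], size=(n_sats, n_samples))   # stand-in PRN codes
visible = [2, 5]
signal = sum(codes[i] for i in visible) + 0.5 * rng.standard_normal(n_samples)

Phi = rng.choice([-1.0, 1.0], size=(m, n_samples)) / np.sqrt(m)  # random ensemble
y = Phi @ signal                              # what would be offloaded to the cloud

# Cloud side: correlate the compressed measurement against compressed replicas.
scores = np.abs((Phi @ codes.T).T @ y) / n_samples
print({sat: round(float(score), 2) for sat, score in enumerate(scores)})
```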

Relevance:

30.00%

Publisher:

Abstract:

The NMR-based approach to metabolomics typically involves the collection of two-dimensional (2D) heteronuclear correlation spectra for the identification and assignment of metabolites. In the case of spectral overlap, a 3D spectrum becomes necessary, which is hampered by the slow data acquisition needed to achieve sufficient resolution. We describe here a method to simultaneously acquire three spectra (one 3D and two 2D) in a single data set, based on a combination of different fast data acquisition techniques: G-matrix Fourier transform (GFT) NMR spectroscopy, parallel data acquisition and non-uniform sampling. The following spectra are acquired simultaneously: (1) C-13 multiplicity-edited GFT (3,2)D HSQC-TOCSY, (2) 2D [H-1, H-1] TOCSY and (3) 2D [C-13, H-1] HETCOR. The spectra are obtained at high resolution and provide high-dimensional spectral information for resolving ambiguities. While the GFT spectrum has been shown previously to provide good resolution, the editing of spin systems based on their CH multiplicities further resolves ambiguities in resonance assignment. The experiment is demonstrated on a mixture of 21 metabolites commonly observed in metabolomics. The spectra were acquired at natural abundance of C-13. This is the first application of a combination of three fast NMR methods for small molecules and opens up new avenues for high-throughput approaches to NMR-based metabolomics.

Relevance:

30.00%

Publisher:

Abstract:

Contributed to: Fusion of Cultures. XXXVIII Annual Conference on Computer Applications and Quantitative Methods in Archaeology – CAA2010 (Granada, Spain, Apr 6-9, 2010)

Relevance:

30.00%

Publisher:

Abstract:

The Laser Interferometer Gravitational-Wave Observatory (LIGO) consists of two complex large-scale laser interferometers designed for the direct detection of gravitational waves from distant astrophysical sources in the frequency range 10 Hz - 5 kHz. Direct detection of space-time ripples will support Einstein's general theory of relativity and provide invaluable information and new insight into the physics of the Universe.

The initial phase of LIGO started in 2002, and since then data have been collected during six science runs. Instrument sensitivity improved from run to run thanks to the efforts of the commissioning team. Initial LIGO reached its design sensitivity during the last science run, which ended in October 2010.

In parallel with commissioning and data analysis with the initial detector, the LIGO group worked on research and development of the next generation of detectors. The major instrument upgrade from initial to Advanced LIGO started in 2010 and lasted until 2014.

This thesis describes the results of commissioning work done at the LIGO Livingston site from 2013 until 2015, in parallel with and after the installation of the instrument. This thesis also discusses new techniques and tools developed at the 40m prototype, including adaptive filtering, estimation of quantization noise in digital filters and the design of isolation kits for ground seismometers.

The first part of this thesis is devoted to the description of methods for bringing the interferometer to the linear regime, where collection of data becomes possible. The states of the longitudinal and angular controls of the interferometer degrees of freedom during the lock acquisition process and in the low-noise configuration are discussed in detail.

Once the interferometer is locked and transitioned to the low-noise regime, the instrument produces astrophysics data that must be calibrated to units of meters or strain. The second part of this thesis describes the online calibration technique set up in both observatories to monitor the quality of the collected data in real time. A sensitivity analysis was done to understand and eliminate noise sources of the instrument.

The coupling of noise sources to the gravitational-wave channel can be reduced if robust feedforward and optimal feedback control loops are implemented. The last part of this thesis describes static and adaptive feedforward noise cancellation techniques applied to the Advanced LIGO interferometers and tested at the 40m prototype. Applications of optimal time-domain feedback control techniques and estimators to aLIGO control loops are also discussed.
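As an illustration of the adaptive feedforward idea, the sketch below is a textbook normalised-LMS filter, not the aLIGO or 40m implementation: it learns the coupling from a witness channel to a target channel and subtracts the prediction. The filter length, step size and coupling model are assumptions.

```python
# Adaptive feedforward cancellation with a normalised LMS filter (illustrative).
import numpy as np

rng = np.random.default_rng(2)
N, taps, mu = 20000, 32, 0.01

witness = rng.standard_normal(N)                       # witness sensor channel
coupling = np.array([0.0, 0.5, 0.3, -0.2])             # unknown coupling path
target = np.convolve(witness, coupling, mode="full")[:N]
target += 0.05 * rng.standard_normal(N)                # target (science) channel

w = np.zeros(taps)                                     # adaptive FIR coefficients
cleaned = np.zeros(N)
for n in range(taps, N):
    x = witness[n - taps:n][::-1]                      # most recent samples first
    prediction = w @ x
    error = target[n] - prediction                     # residual after subtraction
    w += 2 * mu * error * x / (x @ x + 1e-12)          # normalised LMS update
    cleaned[n] = error

print("residual RMS before:", np.std(target[taps:]))
print("residual RMS after :", np.std(cleaned[taps:]))
```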

Commissioning work is still ongoing at the sites. The first science run of Advanced LIGO is planned for September 2015 and will last for 3-4 months. This run will be followed by a set of small instrument upgrades that will be installed on a time scale of a few months. The second science run will start in spring 2016 and last for about 6 months. Since the current sensitivity of Advanced LIGO is already more than a factor of 3 higher than that of the initial detectors and keeps improving on a monthly basis, the upcoming science runs have a good chance of making the first direct detection of gravitational waves.

Relevance:

30.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is to develop a framework of the total acquisition cost of overseas outsourcing/sourcing in the manufacturing industry. This framework contains categorized cost items that may occur during the overseas outsourcing/sourcing process. The framework was tested in a case study to establish both its feasibility and usability. Design/methodology/approach - First, interviews were carried out with practitioners who have experience of overseas outsourcing/sourcing in order to obtain inputs from industry. The framework was then built up based on combined inputs from the literature and from practitioners. Finally, the framework was tested in a case study at a multinational high-tech manufacturer to establish both its feasibility and usability. Findings - A practical barrier to implementing this framework is a shortage of information. The predictability of the cost items in the framework varies. How to deal with the trade-off between accuracy and applicability is a problem that needs to be solved in future research. Originality/value - There are always limitations to the generalizations that can be made from just one case. However, despite these limitations, this case study is believed to have shown the general requirement of modeling the uncertainty and dealing with the dilemma between accuracy and applicability in practice. © Emerald Group Publishing Limited.
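One minimal way to picture such a framework in use, without reproducing the paper's actual cost categories, is to carry a low/best/high estimate for each cost item so that poorly predictable items can still enter the total as a range. The sketch below is an assumption-laden illustration, not the authors' model; the category names and numbers are invented.

```python
# Illustrative total-acquisition-cost aggregation with uncertainty ranges.
from dataclasses import dataclass

@dataclass
class CostItem:
    category: str          # hypothetical categories, e.g. "logistics"
    low: float
    best: float
    high: float

def total_acquisition_cost(items):
    """Return (low, best, high) totals across all cost items."""
    return (sum(i.low for i in items),
            sum(i.best for i in items),
            sum(i.high for i in items))

items = [
    CostItem("unit price", 100.0, 100.0, 100.0),       # well known
    CostItem("logistics", 8.0, 10.0, 14.0),            # moderately predictable
    CostItem("quality failures", 0.0, 5.0, 20.0),      # hard to predict
    CostItem("coordination/travel", 2.0, 4.0, 9.0),
]
print(total_acquisition_cost(items))   # -> (110.0, 119.0, 143.0)
```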

Relevance:

30.00%

Publisher:

Abstract:

The study is a cross-linguistic, cross-sectional investigation of the impact of learning contexts on the acquisition of sociopragmatic variation patterns and the subsequent enactment of compound identities. The informants are 20 non-native-speaker teachers of English from 10 European countries. They are all primarily mono-contextual foreign-language learners/users of English; however, they differ with respect to the length of time accumulated in a target-language environment. This allows three groups to be established: those who have accumulated 60 days or less; those with between 90 days and one year; and the final group, all of whom have accumulated in excess of one year. In order to foster the dismantling of the monolith of learning context, both learning contexts under consideration, i.e. the foreign-language context and the submersion context, are broken down into micro-contexts which I refer to as loci of learning. For the purpose of this study, two loci are considered: the institutional and the conversational locus. In order to correlate the impact of learning contexts and loci of learning with the acquisition of sociopragmatic variation patterns, a two-fold study is conducted. The first stage is the completion of a highly detailed language contact profile (LCP) questionnaire. This provides extensive biographical information regarding language learning history and is a powerful tool in illuminating the intensity of contact with the L2 that learners experience in both contexts, as well as shedding light on the loci of learning to which learners are exposed in both contexts. Following the completion of the LCP, the informants take part in two role plays which require the enactment of differential identities when engaged in the speech event of asking for advice. The enactment of identities then undergoes a strategic and linguistic analysis in order to investigate if and how differences in the enactment of compound identities are indexed in language. Results indicate that learning context has a considerable impact not only on how identity is indexed in language, but also on the nature of the identities enacted. Informants with very low levels of cross-contextuality index identity through strategic means, i.e. levels of directness and conventionality; however, greater degrees of cross-contextuality give rise to the indexing of differential identities linguistically by means of speaker/hearer orientation and (non-)solidary moves. When it comes to the nature of identity enacted, it seems that more time spent in intense contact with native speakers in a range of loci of learning allows learners to enact their core identity, whereas low levels of contact with over-exposure to the institutional locus of learning foster the enactment of generic identities.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: X-ray computed tomography (CT) is widely used, both clinically and preclinically, for fast, high-resolution anatomic imaging; however, compelling opportunities exist to expand its use in functional imaging applications. For instance, spectral information combined with nanoparticle contrast agents enables quantification of tissue perfusion levels, while temporal information details cardiac and respiratory dynamics. The authors propose and demonstrate a projection acquisition and reconstruction strategy for 5D CT (3D + dual energy + time) which recovers spectral and temporal information without substantially increasing radiation dose or sampling time relative to anatomic imaging protocols. METHODS: The authors approach the 5D reconstruction problem within the framework of low-rank and sparse matrix decomposition. Unlike previous work on rank-sparsity constrained CT reconstruction, the authors establish an explicit rank-sparse signal model to describe the spectral and temporal dimensions. The spectral dimension is represented as a well-sampled time- and energy-averaged image plus regularly undersampled principal components describing the spectral contrast. The temporal dimension is represented as the same time- and energy-averaged reconstruction plus contiguous, spatially sparse, and irregularly sampled temporal contrast images. Using a nonlinear, image-domain filtration approach, which the authors refer to as rank-sparse kernel regression, they transfer image structure from the well-sampled time- and energy-averaged reconstruction to the spectral and temporal contrast images. This regularization strategy strictly constrains the reconstruction problem while approximately separating the temporal and spectral dimensions. Separability results in a highly compressed representation for the 5D data in which projections are shared between the temporal and spectral reconstruction subproblems, enabling substantial undersampling. The authors solved the 5D reconstruction problem using the split Bregman method and GPU-based implementations of backprojection, reprojection, and kernel regression. Using a preclinical mouse model, the authors apply the proposed algorithm to study myocardial injury following radiation treatment of breast cancer. RESULTS: Quantitative 5D simulations are performed using the MOBY mouse phantom. Twenty data sets (ten cardiac phases, two energies) are reconstructed with 88 μm isotropic voxels from 450 total projections acquired over a single 360° rotation. In vivo 5D myocardial injury data sets acquired in two mice injected with gold and iodine nanoparticles are also reconstructed, with 20 data sets per mouse using the same acquisition parameters (dose: ∼60 mGy). For both the simulations and the in vivo data, the reconstruction quality is sufficient to perform material decomposition into gold and iodine maps, to localize the extent of myocardial injury (gold accumulation) and to measure cardiac functional metrics (vascular iodine). The 5D CT imaging protocol represents a 95% reduction in radiation dose per cardiac phase and energy and a 40-fold decrease in projection sampling time relative to the authors' standard imaging protocol. CONCLUSIONS: The 5D CT data acquisition and reconstruction protocol efficiently exploits the rank-sparse nature of spectral and temporal CT data to provide high-fidelity reconstruction results without increased radiation dose or sampling time.
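For readers unfamiliar with the signal model, the following hedged sketch shows the generic low-rank plus sparse decomposition idea in image space (robust-PCA-style alternating singular-value and soft thresholding). It is not the authors' 5D reconstruction algorithm, which additionally enforces projection-domain data fidelity and uses rank-sparse kernel regression with the split Bregman method; the thresholds, iteration count and toy data are arbitrary assumptions.

```python
# Generic low-rank (shared structure) + sparse (contrast) decomposition sketch.
import numpy as np

def svt(M, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Elementwise soft thresholding: proximal operator of the l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def low_rank_plus_sparse(Y, tau_l=1.0, tau_s=0.1, n_iter=100):
    """Split Y (pixels x frames) into a low-rank part L and a sparse part S
    by block coordinate descent on 0.5||Y-L-S||^2 + tau_l||L||_* + tau_s||S||_1."""
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    for _ in range(n_iter):
        L = svt(Y - S, tau_l)      # shared (time/energy-averaged) structure
        S = soft(Y - L, tau_s)     # sparse temporal/spectral contrast
    return L, S

# Toy usage: a rank-1 background plus a few isolated "contrast" pixels.
rng = np.random.default_rng(3)
background = np.outer(rng.random(500), np.ones(20))
sparse = np.zeros((500, 20))
sparse[rng.choice(500, 30), rng.integers(0, 20, 30)] = 1.0
L, S = low_rank_plus_sparse(background + sparse)
print("rank of L:", np.linalg.matrix_rank(L), " nonzeros in S:", int((S != 0).sum()))
```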