Abstract:
Text is the main method of communicating information in the digital age. Messages, blogs, news articles, reviews, and opinionated information abound on the Internet. People commonly purchase products online and post their opinions about purchased items. This feedback is displayed publicly to assist others with their purchasing decisions, creating the need for a mechanism with which to extract and summarize useful information for enhancing the decision-making process. Our contribution is to improve the accuracy of extraction by combining techniques from three major areas, namely Data Mining, Natural Language Processing, and Ontologies. The proposed framework sequentially mines product aspects and users' opinions, groups representative aspects by similarity, and generates an output summary. This paper focuses on the task of extracting product aspects and users' opinions by extracting all possible aspects and opinions from reviews using natural language, ontology, and frequent "tag" sets. The proposed framework, when compared with an existing baseline model, yielded promising results.
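The frequent-term idea behind aspect mining can be sketched in a few lines; this is a toy illustration only, where a simple frequent-noun-style counting heuristic, the stop-word list, and the sample reviews are all invented for the example and stand in for the paper's combined NLP/ontology/frequent-tag-set pipeline:

```python
from collections import Counter
import re

def frequent_aspects(reviews, min_support=2):
    """Collect candidate aspect terms: tokens that recur across reviews.
    Each review contributes a token at most once, so support counts
    the number of reviews mentioning the term."""
    counts = Counter()
    for review in reviews:
        tokens = set(re.findall(r"[a-z]+", review.lower()))
        counts.update(tokens)
    stop = {"the", "a", "is", "and", "it", "i", "of", "this", "but", "very"}
    return {t: c for t, c in counts.items() if c >= min_support and t not in stop}

reviews = [
    "The battery life is great but the screen is dim",
    "Battery drains fast and the screen scratches easily",
    "Great screen, poor battery",
]
aspects = frequent_aspects(reviews)  # e.g. 'battery' and 'screen' survive
```

A real system would replace the token heuristic with part-of-speech tagging and ontology lookup, but the support-counting step has the same shape.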
Abstract:
BACKGROUND: The use of salivary diagnostics is increasing because of its noninvasiveness, ease of sampling, and the relatively low risk of contracting infectious organisms. Saliva has been used as a biological fluid to identify and validate RNA targets in head and neck cancer patients. The goal of this study was to develop a robust, easy, and cost-effective method for isolating high yields of total RNA from saliva for downstream expression studies. METHODS: Oral whole saliva (200 μL) was collected from healthy controls (n = 6) and from patients with head and neck cancer (n = 8). The method developed in-house used QIAzol lysis reagent (Qiagen) to extract RNA from saliva (both cell-free supernatants and cell pellets), followed by isopropyl alcohol precipitation, cDNA synthesis, and real-time PCR analyses for the genes encoding beta-actin ("housekeeping" gene) and histatin (a salivary gland-specific gene). RESULTS: The in-house QIAzol lysis reagent produced a high yield of total RNA (0.89–7.1 μg) from saliva (cell-free saliva and cell pellet) after DNase treatment. The ratio of the absorbance measured at 260 nm to that at 280 nm ranged from 1.6 to 1.9. The commercial kit produced a 10-fold lower RNA yield. Using our method with the QIAzol lysis reagent, we were also able to isolate RNA from archived saliva samples that had been stored without RNase inhibitors at −80 °C for >2 years. CONCLUSIONS: Our in-house QIAzol method is robust and simple, provides high RNA yields, and can be implemented to allow saliva transcriptomic studies to be translated into a clinical setting.
Abstract:
Purpose: Older adults have increased visual impairment, including refractive blur from presbyopic multifocal spectacle corrections, and are less able to extract visual information from the environment to plan and execute appropriate stepping actions; these factors may collectively contribute to their higher risk of falls. The aim of this study was to examine the effect of refractive blur and target visibility on the stepping accuracy and visuomotor stepping strategies of older adults during a precision stepping task. Methods: Ten healthy, visually normal older adults (mean age 69.4 ± 5.2 years) walked up and down a 20 m indoor corridor, stepping onto selected high- and low-contrast targets while viewing under three visual conditions: best-corrected vision, +2.00 DS blur, and +3.00 DS blur; the order of blur conditions was randomised between participants. Stepping accuracy and gaze behaviours were recorded using an eye tracker and a secondary hand-held camera. Results: Older adults made significantly more stepping errors with increasing levels of blur, particularly exhibiting under-stepping (stepping more posteriorly) onto the targets (p < 0.05), while visuomotor stepping strategies did not significantly alter. Stepping errors were also significantly greater for the low- compared to the high-contrast targets, and differences in visuomotor stepping strategies were found, including increased duration of gaze and an increased interval between gaze onset and initiation of the leg swing when stepping onto the low-contrast targets. Conclusions: These findings highlight that stepping accuracy is reduced for low-visibility targets and for high levels of refractive blur at levels typically present in multifocal spectacle corrections, despite significant changes in some of the visuomotor stepping strategies.
These findings highlight the importance of maximising the contrast of objects in the environment, and may help explain why older adults wearing multifocal spectacle corrections exhibit an increased risk of falling.
Abstract:
This Australian Indigenous creative work and its Treatise promote ways of thinking about practice and research that extend well beyond the current discourse. It invites re-thinking on how research can be practice-led in new ways, and what that might mean for future students. When discussing the challenges of today, this work signifies how "Western Style" thinking and theory is wanting in so many ways. It engages a new dynamic and innovative way of theorising, encouraging future students to apply their full capacity of energy and wisdom. (Extract from examiners' reports.)
Abstract:
The upstream oil & gas industry has been contending with massive data sets and monolithic files for many years, but “Big Data”—that is, the ability to apply more sophisticated types of analytical tools to information in a way that extracts new insights or creates new forms of value—is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value that is being realized by Big Data technologies in other parts of the marketplace, however, much of the data collected within the oil & gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This paper examines existing data management practices in the upstream oil & gas industry, and compares them to practices and philosophies that have emerged in organizations that are leading the Big Data revolution. The comparison shows that, in companies that are leading the Big Data revolution, data is regarded as a valuable asset. The presented evidence also shows, however, that this is usually not true within the oil & gas industry insofar as data is frequently regarded there as descriptive information about a physical asset rather than something that is valuable in and of itself. The paper then discusses how upstream oil & gas companies could potentially extract more value from data, and concludes with a series of specific technical and management-related recommendations to this end.
Abstract:
Contemporary cities no longer offer the same types of permanent environments that we planned for in the latter part of the twentieth century. Our public spaces are increasingly temporary, transient, and ephemeral. The theories, principles and tactics with which we designed these spaces in the past are no longer appropriate. We need a new theory for understanding the creation, use, and reuse of temporary public space. More than a theory, we need new architectural tactics or strategies that can be reliably employed to create successful temporary public spaces. This paper will present ongoing research that starts that process through critical review and technical analysis of existing and historic temporary public spaces. Through the analysis of a number of public spaces that were either designed for temporary use or became temporary through changing social conditions, this research identifies the tactics and heuristics used in such projects. These tactics and heuristics are then analysed to extract some broader principles for the design of temporary public space. The theories of time-related building layers, a model of environmental sustainability, and the recycling of social meaning are all explored. The paper will go on to identify a number of key questions that need to be explored and addressed by a theory for such developments: How can we retain social meaning in the fabric of the city and its public spaces while we disassemble it and recycle it into new purposes? What role will preservation have in the rapidly changing future; will exemplary temporary spaces be preserved and thereby become no longer temporary? Does the environmental advantage of recycling materials, components and spaces outweigh the removal or social loss of temporary public space?
This research starts to identify the knowledge gaps and proposes a number of strategies for making public space in an age of temporary, recyclable, and repurposed urban infrastructure: a way of creating lighter, cheaper, quicker, and temporary interventions.
Abstract:
We report a more accurate method to determine the density of trap states in a polymer field-effect transistor. In the approach we describe in this letter, we take the sub-threshold behavior into consideration in the calculation of the density of trap states. This is very important since the sub-threshold regime of operation extends to fairly large gate voltages in these disordered-semiconductor-based transistors. We employ the sub-threshold drift-limited mobility model (for sub-threshold response) and the conventional linear mobility model for above-threshold response. The combined use of these two models allows us to extract the density of states from charge-transport data much more accurately. We demonstrate our approach by analyzing data from high-mobility diketopyrrolopyrrole-based co-polymer transistors. This approach will also work well for other disordered semiconductors in which sub-threshold conduction is important.
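For context, the simplest textbook route from sub-threshold data to a trap-density estimate uses the standard sub-threshold-swing relation; the sketch below illustrates only that generic relation, not the letter's drift-limited mobility model, and the gate capacitance and synthetic transfer curve are invented values:

```python
import numpy as np

# Physical constants (SI)
q = 1.602e-19   # elementary charge, C
kB = 1.381e-23  # Boltzmann constant, J/K

def subthreshold_swing(vg, id_current):
    """Sub-threshold swing SS = dVg/dlog10(Id) in V/decade,
    taken at the steepest point of the transfer curve."""
    slope = np.gradient(np.log10(id_current), vg)  # decades per volt
    return 1.0 / np.max(slope)

def trap_density(ss, c_i, T=300.0):
    """Textbook interface-trap density estimate (states cm^-2 eV^-1)
    from the swing: D_it = (C_i/q) * (q*SS / (kB*T*ln10) - 1),
    with c_i the gate capacitance per area in F/cm^2."""
    return (c_i / q) * (q * ss / (kB * T * np.log(10)) - 1.0)

# Synthetic exponential sub-threshold curve with SS = 0.5 V/decade
vg = np.linspace(0.0, 2.0, 201)
id_meas = 1e-12 * 10 ** (vg / 0.5)

ss = subthreshold_swing(vg, id_meas)   # recovers ~0.5 V/decade
dit = trap_density(ss, c_i=1.15e-8)    # assuming ~11.5 nF/cm^2 dielectric
```

The letter's contribution is precisely that a plain estimate like this is insufficient when sub-threshold conduction extends over a wide gate-voltage range; the sketch only fixes the notation.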
Abstract:
Accurate process model elicitation continues to be a time consuming task, requiring skill on the part of the interviewer to extract explicit and tacit process information from the interviewee. Many errors occur in this elicitation stage that would be avoided by better activity recall, more consistent specification methods and greater engagement in the elicitation process by interviewees. Metasonic GmbH has developed a process elicitation tool for their process suite. As part of a research engagement with Metasonic, staff from QUT, Australia have developed a 3D virtual world approach to the same problem, viz. eliciting process models from stakeholders in an intuitive manner. This book chapter tells the story of how QUT staff developed a 3D Virtual World tool for process elicitation, took the outcomes of their research project to Metasonic for evaluation, and finally, Metasonic’s response to the initial proof of concept.
Abstract:
Though popular, concepts such as Toffler's 'prosumer' (1970; 1980; 1990) are inherently limited in their ability to accurately describe the makeup and dynamics of current co-creative environments, from fundamentally non-profit initiatives like the Wikipedia to user-industry partnerships that engage in crowdsourcing and the development of collective intelligence. Instead, the success or failure of such projects can be understood best if the traditional producer/consumer divide is dissolved, allowing for the emergence of the produser (Bruns, 2008). A close investigation of leading spaces for produsage makes it possible to extract the key principles which underpin and guide such content co-creation, and to identify how innovative pro-am partnerships between commercial entities and user communities might be structured in order to maximise the benefits that both sides will be able to draw from such collaboration. This chapter will outline these principles, and point to successes and failures in applying them to pro-am initiatives.
Abstract:
The lateral amygdala (LA) receives information from auditory and visual sensory modalities, and uses this information to encode lasting memories that predict threat. One unresolved question about the amygdala is how multiple memories, derived from different sensory modalities, are organized at the level of neuronal ensembles. We previously showed that fear conditioning using an auditory conditioned stimulus (CS) was spatially allocated to a stable topography of neurons within the dorsolateral amygdala (LAd) (Bergstrom et al., 2011). Here, we asked how fear conditioning using a visual CS is topographically organized within the amygdala. To induce a lasting fear memory trace we paired either an auditory (2 kHz, 55 dB, 20 s) or visual (1 Hz, 0.5 s on/0.5 s off, 35 lux, 20 s) CS with a mild foot shock unconditioned stimulus (0.6 mA, 0.5 s). To detect learning-induced plasticity in amygdala neurons, we used immunohistochemistry with an antibody for phosphorylated mitogen-activated protein kinase (pMAPK). Using a principal components analysis-based approach to extract and visualize spatial patterns, we uncovered two unique spatial patterns of activated neurons in the LA that were associated with auditory and visual fear conditioning. The first spatial pattern was specific to auditory cued fear conditioning and consisted of activated neurons topographically organized throughout the LAd and ventrolateral nuclei (LAvl) of the LA. The second spatial pattern overlapped for auditory and visual fear conditioning and was comprised of activated neurons located mainly within the LAvl. Overall, the density of pMAPK-labeled cells throughout the LA was greatest in the auditory CS group, even though freezing in response to the visual and auditory CS was equivalent. There were no differences detected in the number of pMAPK-activated neurons within the basal amygdala nuclei.
Together, these results provide the first basic knowledge about the organizational structure of two different fear engrams within the amygdala and suggest they are dissociable at the level of neuronal ensembles within the LA.
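A principal components analysis of spatial activation maps of the kind described can be sketched with plain NumPy; the count matrix below (animals × spatial bins of pMAPK-labeled cells) is synthetic, and the SVD-based PCA is a generic stand-in rather than the authors' exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: rows = animals, columns = spatial bins of the LA,
# values = pMAPK-labelled cell counts per bin (synthetic here).
auditory = rng.poisson(8, size=(6, 20)).astype(float)
visual = rng.poisson(4, size=(6, 20)).astype(float)
counts = np.vstack([auditory, visual])          # 12 animals x 20 bins

# PCA via SVD of the mean-centred matrix: each right singular vector
# is a spatial pattern; the scores show how strongly each animal
# expresses that pattern, and S**2 gives the variance per component.
centred = counts - counts.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
spatial_patterns = Vt            # one spatial pattern per row
scores = U * S                   # animal-by-pattern loadings
explained = S**2 / np.sum(S**2)  # fraction of variance per pattern
```

Plotting the leading rows of `spatial_patterns` back onto the anatomical bins is what yields the kind of topographic maps the abstract describes.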
Abstract:
This is a comprehensive study of human kidney proximal tubular epithelial cells (PTEC), which are known to respond to and mediate the pathological process of a range of kidney diseases. It identifies various molecules expressed by PTEC and how these molecules participate in down-regulating the inflammatory process, thereby highlighting the clinical potential of these molecules to treat various kidney diseases. In the disease state, PTEC gain the ability to regulate the immune cell responses present within the interstitium. This down-regulation is a complex interaction of contact-dependent/independent mechanisms involving various immuno-regulatory molecules including PD-L1, sHLA-G and IDO. The overall outcome of this down-regulation is suppressed DC maturation, a decreased number of antibody-producing B cells and low T cell responses. These manifestations within a clinical setting are expected to dampen the ongoing inflammation, preventing the damage caused to the kidney tissue.
Abstract:
Deliberate firesetting costs our community in destruction to property and lives. Public concern heightens when similar fires occur in a series, raising the specter of copycat firesetting. Difficulties associated with researching copycat crimes in general mean that little is known about copycat firesetting. As an initial step toward filling this research gap, we explore connections between research on copycat crime and research into deliberate firesetting. The intention is to extract salient features from what is known about the phenomena of deliberate firesetting and copycat crime, map them together, and point out shared and unique characteristics. It is argued that a "copycat firesetter" is likely to exist as a distinct subgroup that potentially requires targeted interventions.
Abstract:
We propose the use of optical flow information as a method for detecting and describing changes in the environment, from the perspective of a mobile camera. We analyze the characteristics of the optical flow signal and demonstrate how robust flow vectors can be generated and used for the detection of depth discontinuities and appearance changes at key locations. To successfully achieve this task, a full discussion on camera positioning, distortion compensation, noise filtering, and parameter estimation is presented. We then extract statistical attributes from the flow signal to describe the location of the scene changes. We also employ clustering and dominant shape of vectors to increase the descriptiveness. Once a database of nodes (where a node is a detected scene change) and their corresponding flow features is created, matching can be performed whenever nodes are encountered, such that topological localization can be achieved. We retrieve the most likely node according to the Mahalanobis and Chi-square distances between the current frame and the database. The results illustrate the applicability of the technique for detecting and describing scene changes in diverse lighting conditions, considering indoor and outdoor environments and different robot platforms.
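Retrieval of the most likely node by Mahalanobis distance can be sketched as follows; the flow-feature values and the shared-covariance simplification are assumptions made for the illustration, not the paper's actual database or feature set:

```python
import numpy as np

def mahalanobis_match(query, node_means, node_cov):
    """Return the index of the best-matching node (smallest Mahalanobis
    distance between the query flow-feature vector and each node's mean)
    plus all distances, assuming one shared feature covariance."""
    inv_cov = np.linalg.inv(node_cov)
    diffs = node_means - query                       # one row per node
    d2 = np.einsum("ij,jk,ik->i", diffs, inv_cov, diffs)
    return int(np.argmin(d2)), np.sqrt(d2)

# Toy database: 3 nodes, 4 flow statistics each (hypothetical values)
node_means = np.array([[0.1, 0.9, 2.0, 0.3],
                       [1.2, 0.2, 0.5, 1.1],
                       [0.4, 1.5, 1.8, 0.2]])
node_cov = np.diag([0.2, 0.2, 0.5, 0.1])  # per-feature variances
query = np.array([0.35, 1.4, 1.9, 0.25])  # features of the current frame

best, dists = mahalanobis_match(query, node_means, node_cov)
```

Covariance weighting is what distinguishes this from plain Euclidean matching: features with high variance across visits contribute less to the distance.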
Abstract:
The upstream oil and gas industry has been contending with massive data sets and monolithic files for many years, but “Big Data” is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value that is being realized by Big Data technologies in other parts of the marketplace, however, much of the data collected within the oil and gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This viewpoint examines existing data management practices in the upstream oil and gas industry, and compares them to practices and philosophies that have emerged in organizations that are leading the way in Big Data. The comparison shows that, in companies that are widely considered to be leaders in Big Data analytics, data is regarded as a valuable asset—but this is usually not true within the oil and gas industry insofar as data is frequently regarded there as descriptive information about a physical asset rather than something that is valuable in and of itself. The paper then discusses how the industry could potentially extract more value from data, and concludes with a series of policy-related questions to this end.
Abstract:
This study focuses on trying to understand why the range of experience with respect to HIV infection is so diverse, especially with regard to the latency period. The challenge is to determine what assumptions can be made about the nature of the experience of antigenic invasion and diversity that can be modelled, tested and argued plausibly. To investigate this, an agent-based approach is used to extract high-level behaviour, which cannot be described analytically, from the set of interaction rules at the cellular level. A prototype model encompasses local variation in baseline properties contributing to the individual disease experience and is included in a network which mimics the chain of lymph nodes. Dealing with massively multi-agent systems requires considerable computational effort. However, parallelisation methods are a natural consequence and advantage of the multi-agent approach. These are implemented using the MPI library.
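The core idea of recovering aggregate behaviour from cellular-level interaction rules can be shown with a toy, serial (non-MPI) agent loop; the two-state rules, rates, and population size below are invented for the sketch and bear no relation to the prototype model's actual parameters:

```python
import random

random.seed(1)

# Minimal agent-based sketch: agents are healthy target cells ('T') or
# infected cells ('I'). Infection pressure scales with the infected
# fraction; clearance removes infection at a fixed per-step rate.
def step(agents, infect_p, clear_p):
    out = []
    infected_frac = sum(a == "I" for a in agents) / len(agents)
    for a in agents:
        if a == "T" and random.random() < infect_p * infected_frac:
            out.append("I")      # local infection event
        elif a == "I" and random.random() < clear_p:
            out.append("T")      # immune clearance
        else:
            out.append(a)        # unchanged
    return out

agents = ["I"] * 5 + ["T"] * 95
for _ in range(50):
    agents = step(agents, infect_p=0.8, clear_p=0.1)
infected = agents.count("I")     # emergent prevalence after 50 steps
```

In the massively multi-agent setting the abstract describes, each node of the lymphatic network would run such a loop on its own process, exchanging migrating agents via MPI messages.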