92 results for agent-based


Relevance: 40.00%

Abstract:

Purpose: Data from two randomized phase III trials were analyzed to evaluate prognostic factors and treatment selection in the first-line management of advanced non-small cell lung cancer patients with performance status (PS) 2. Patients and Methods: Patients randomized to combination chemotherapy (carboplatin and paclitaxel) in one trial and single-agent therapy (gemcitabine or vinorelbine) in the second were included in these analyses. Both studies had identical eligibility criteria and were conducted simultaneously. Comparison of efficacy and safety was performed between the two cohorts. A regression analysis identified prognostic factors and subgroups of patients that may benefit from combination or single-agent therapy. Results: Two hundred one patients were treated with combination and 190 with single-agent therapy. Objective response rates were 37 and 15%, respectively. Median time to progression was 4.6 months in the combination arm and 3.5 months in the single-agent arm (p < 0.001). Median survival times were 8.0 and 6.6 months, and 1-year survival rates were 31 and 26%, respectively. Albumin <3.5 g, extrathoracic metastases, lactate dehydrogenase ≥200 IU, and 2 comorbid conditions predicted outcome. Patients with 0-2 risk factors had similar outcomes independent of treatment, whereas patients with 3-4 factors had a nonsignificant improvement in median survival with combination chemotherapy. Conclusion: Our results show that PS 2 non-small cell lung cancer patients are a heterogeneous group who have significantly different outcomes. Patients treated with first-line combination chemotherapy had a higher response rate and longer time to progression, whereas overall survival did not appear significantly different. A prognostic model may be helpful in selecting PS 2 patients for either treatment strategy. © 2009 by the International Association for the Study of Lung Cancer.

Relevance: 30.00%

Abstract:

The New Zealand green-lipped mussel preparation Lyprinol is available without a prescription from a supermarket, a pharmacy or the Web. The Food and Drug Administration has recently warned Lyprinol USA about the extravagant anti-inflammatory claims for Lyprinol appearing on the Web. These claims are subjected to thorough review here. Lyprinol does have anti-inflammatory mechanisms and has anti-inflammatory effects in some animal models of inflammation. Lyprinol may have benefits in dogs with arthritis. There are design problems with the clinical trials of Lyprinol in humans as an anti-inflammatory agent in osteoarthritis and rheumatoid arthritis, making it difficult to give a definite answer as to how effective Lyprinol is in these conditions, but any benefit is small. Lyprinol also has a small benefit in atopic allergy. As anti-inflammatory agents, there is little to choose between Lyprinol and fish oil. No adverse effects have been reported with Lyprinol. Thus, although it is difficult to conclude whether Lyprinol does much good, it can be concluded that Lyprinol probably does no major harm.

Relevance: 30.00%

Abstract:

An adaptive agent improves its performance by learning from experience. This paper describes an approach to adaptation based on modelling dynamic elements of the environment in order to make predictions of the likely future state. This approach is akin to an elite sports player being able to “read the play”, allowing decisions to be made based on predictions of likely future outcomes. Modelling of the agent's likely future state is performed using Markov Chains and a technique called “Motion and Occupancy Grids”. The experiments in this paper compare the performance of the planning system with and without the use of this predictive model. The results of the study demonstrate a surprising decrease in performance when using the predictions of agent occupancy. The results are derived from statistical analysis of the agent's performance in a high-fidelity simulation of a world-leading real robot soccer team.
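To make the prediction step concrete, here is a minimal sketch of propagating an occupancy distribution forward with a Markov chain, in the spirit of the motion-and-occupancy-grid idea; the grid size, transition probabilities and function names are illustrative assumptions rather than details from the paper.

```python
import numpy as np

# Illustrative sketch: propagate an agent-occupancy distribution forward in time
# with a Markov chain. Grid layout and transition probabilities are assumptions.

N = 5  # 1-D grid with 5 cells (a real pitch would be a 2-D grid)

# Transition matrix T[i, j] = P(agent moves to cell j | agent currently in cell i):
# stay put with p = 0.5, otherwise move to a neighbouring cell.
T = np.zeros((N, N))
for i in range(N):
    T[i, i] = 0.5
    neighbours = [j for j in (i - 1, i + 1) if 0 <= j < N]
    for j in neighbours:
        T[i, j] = 0.5 / len(neighbours)

def predict_occupancy(current: np.ndarray, steps: int) -> np.ndarray:
    """Return the predicted occupancy distribution after `steps` time steps."""
    occupancy = current.copy()
    for _ in range(steps):
        occupancy = occupancy @ T
    return occupancy

# The opponent is currently known to be in cell 2.
now = np.zeros(N)
now[2] = 1.0
print(predict_occupancy(now, steps=3))  # distribution over likely future cells
```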

Relevance: 30.00%

Abstract:

Increasingly, large amounts of public and private money are being invested in education and, as a result, schools are becoming more accountable to stakeholders for this financial input. In terms of the curriculum, governments worldwide are frequently tying school funding to students' and schools' academic performances, which are monitored through high-stakes testing programs. To accommodate the resultant pressures from these testing initiatives, many principals are re-focussing their school's curriculum on the testing requirements. Such a re-focussing, which was examined critically in this thesis, constituted an externally facilitated rapid approach to curriculum change. In line with previously enacted change theories and recommendations from these, curriculum change in schools has tended to be a fairly slow, considered, collaborative process that is facilitated internally by a deputy-principal (curriculum). However, theoretically based research has shown that such a process has often proved to be difficult and very rarely successful. The present study reports and theorises the experiences of an externally facilitated process that emerged from a practitioner model of change. This case study of the development of the controlled rapid approach to curriculum change began by establishing the reasons three principals initiated curriculum change and why they then engaged an outsider to facilitate the process. It also examined this particular change process from the perspectives of the research participants. The investigation led to the revision of the practitioner model as used in the three schools and challenged current thinking about the process of school curriculum change. The thesis aims to offer principals and the wider education community an alternative model for consideration when undertaking curriculum change. Finally, the thesis warns that, in the longer term, the application of the study's revised model (the Controlled Rapid Approach to Curriculum Change [CRACC] Model) may have less than desirable educational consequences.

Relevance: 30.00%

Abstract:

"How do you film a punch?" This question can be posed by actors, make-up artists, directors and cameramen. Though they can all ask the same question, they are not all seeking the same answer. Within a given domain, based on the roles they play, agents of the domain have different perspectives and they want the answers to their question from their perspective. In this example, an actor wants to know how to act when filming a scene involving a punch. A make-up artist is interested in how to do the make-up of the actor to show bruises that may result from the punch. Likewise, a director wants to know how to direct such a scene and a cameraman is seeking guidance on how best to film such a scene. This role-based difference in perspective is the underpinning of the Loculus framework for information management for the Motion Picture Industry. The Loculus framework exploits the perspective of agent for information extraction and classification within a given domain. The framework uses the positioning of the agent’s role within the domain ontology and its relatedness to other concepts in the ontology to determine the perspective of the agent. Domain ontology had to be developed for the motion picture industry as the domain lacked one. A rule-based relatedness score was developed to calculate the relative relatedness of concepts with the ontology, which were then used in the Loculus system for information exploitation and classification. The evaluation undertaken to date have yielded promising results and have indicated that exploiting perspective can lead to novel methods of information extraction and classifications.

Relevance: 30.00%

Abstract:

Intelligent agents are an advanced technology utilized in Web Intelligence. When searching for information in a distributed Web environment, information is retrieved by multi-agents on the client site and fused on the broker site. Current information fusion techniques rely on the cooperation of agents to provide statistics. Such techniques are computationally expensive and unrealistic in the real world. In this paper, we introduce a model that uses a world ontology constructed from the Dewey Decimal Classification to acquire user profiles. By searching with specific and exhaustive user profiles, information fusion techniques no longer rely on the statistics provided by agents. The model has been successfully evaluated using the large INEX data set simulating the distributed Web environment.
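A minimal sketch of building a topic-based user profile over a subject hierarchy follows, assuming Dewey-style class labels and a simple weight-propagation scheme; the labels, the propagation factor and the function names are illustrative assumptions, not the paper's ontology or algorithm.

```python
# Illustrative sketch: accumulate a user profile as weights over subject classes,
# propagating part of each weight to the parent class so broader interests emerge.
# The class labels and the 0.5 propagation factor are assumptions.

PARENT = {
    "006.3 Artificial intelligence": "006 Special computer methods",
    "006 Special computer methods": "000 Computer science",
    "025.04 Information retrieval": "025 Library operations",
    "025 Library operations": "000 Computer science",
}

def build_profile(observed_subjects, decay: float = 0.5) -> dict:
    """Accumulate interest weights per subject, spreading weight up the hierarchy."""
    profile = {}
    for subject in observed_subjects:
        weight, node = 1.0, subject
        while node is not None:
            profile[node] = profile.get(node, 0.0) + weight
            weight *= decay
            node = PARENT.get(node)
    return profile

# Subjects of documents the user has read (hypothetical observations).
reads = ["006.3 Artificial intelligence", "006.3 Artificial intelligence",
         "025.04 Information retrieval"]
for subject, score in sorted(build_profile(reads).items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {subject}")
```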

Relevance: 30.00%

Abstract:

This thesis builds on the scholarship and practical know-how that have emerged from digital storytelling projects around the world with diverse groups of participants in a range of institutions. I have used the results of these projects to explore the opportunities Digital Storytelling workshop practice may hold for women’s participation in the public sphere in Turkey. Through theoretical discussion and practical experimentation, I examine the potential of Digital Storytelling workshop practice as a means to promote agency and self-expression in a feminist activist organisation, focusing in particular on whether Digital Storytelling can be used as a change agent – as a tool for challenging the idea of public sphere in ways that make it more inclusive of women’s participation. The thesis engages with feminist scholarship’s critiques of the public/private dichotomy, as well as the concept of gender, to seek connections with narrative identity in the light of the analysis of the Digital Storytelling workshops and the digital stories that were created in a feminist context. The study on which this thesis is based saw the introduction of Digital Storytelling to Turkey for the first time through workshops in Istanbul and Antakya, conducted in partnership with the feminist activist organisation Amargi Women’s Academy. Applying the principles of feminist post-structuralist discourse analysis as used by Judith Baxter (2003), I examine two sets of data collected in this project. First, I analyse the interactions during the Digital Storytelling workshops, where women from Amargi created their digital stories in a collaborative setting. This is done through participatory observation notes and in-depth interviews with the workshop participants and facilitators. Second, I seek to uncover the strategies that these women used to ‘speak back to power’ in their digital stories, reading these as texts. I conclude that women from the Amargi network used the workshops to create digital content in order to communicate their concerns about issues that can be classified as gender-specific matters. During this process, they also cooperated, established new connections, and at the end of the process even defined new ways of using, circulating and repurposing their digital stories for feminist activism in Turkey. My research thereby contributes equally to feminist discourse analysis, the study of new-media usage and uptake among non-professionals, and the study of media–public sphere interactions in a particular national setting: Turkey. My conclusion indicates that the process of production is as important as the product itself, and from that I am able to draw out some strategies for developing digitally equipped women’s activism in Turkey.

Relevance: 30.00%

Abstract:

Taking the smart grid paradigm as a promising backbone for future networks, this paper proposes a new coordination approach for the LV network based on a distributed control algorithm. This approach divides the LV network into hierarchical communities, where each community is controlled by a control agent. Different levels of communication are proposed for this structure to control the network in different operation modes.
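A very rough structural sketch of the hierarchical-community idea is given below, with one control agent per community and a coordinator that only sees aggregated figures; the class names, operation modes and balancing rule are illustrative assumptions and do not reproduce the paper's control algorithm.

```python
# Structural sketch of LV-network communities, each with a control agent,
# coordinated by a higher-level agent. Names, modes and the balancing rule
# are illustrative assumptions.

class CommunityAgent:
    def __init__(self, name: str, generation_kw: float, load_kw: float):
        self.name, self.generation_kw, self.load_kw = name, generation_kw, load_kw

    def surplus(self) -> float:
        """Local power balance: positive means the community can export."""
        return self.generation_kw - self.load_kw


class NetworkCoordinator:
    """Upper-level agent: only aggregated figures travel up; details stay local."""

    def __init__(self, communities):
        self.communities = communities

    def dispatch(self, mode: str) -> float:
        total = sum(c.surplus() for c in self.communities)
        if mode == "islanded" and total < 0:
            print(f"Islanded mode: requesting load shedding of {-total} kW")
        return total


grid = NetworkCoordinator([
    CommunityAgent("feeder-A", generation_kw=40.0, load_kw=55.0),
    CommunityAgent("feeder-B", generation_kw=70.0, load_kw=30.0),
])
print("Net surplus:", grid.dispatch(mode="grid-connected"), "kW")
```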

Relevance: 30.00%

Abstract:

We have explored the potential of deep Raman spectroscopy, specifically surface-enhanced spatially offset Raman spectroscopy (SESORS), for non-invasive detection from within animal tissue, by employing SERS-barcoded nanoparticle (NP) assemblies as the diagnostic agent. This concept has been experimentally verified in a clinically relevant backscattered Raman system with an excitation line of 785 nm under ex vivo conditions. We have shown that our SORS system, with a fixed offset of 2-3 mm, offered sensitive probing of injected QTH-barcoded NP assemblies through animal tissue containing both protein and lipid. In comparison to non-aggregated SERS-barcoded gold NPs, we have demonstrated that the tailored SERS-barcoded aggregated NP assemblies have significantly higher detection sensitivity. We report that these NP assemblies can be readily detected at depths of 7-8 mm within animal proteinaceous tissue with a high signal-to-noise (S/N) ratio. In addition, they could also be detected beneath 1-2 mm of animal tissue with high lipid content, which generally poses a challenge due to the high absorption of lipids in the near-infrared region. We have also shown that the signal intensity and S/N ratio at a particular depth are a function of the SERS tag concentration used and that our SORS system has a QTH detection limit of 10⁻⁶ M. Higher detection depths may possibly be obtained with optimization of the NP assemblies, along with improvements in the instrumentation. Such NP assemblies offer prospects for in vivo, non-invasive detection of tumours, along with scope for incorporation of drugs and their targeted and controlled release at tumour sites. These diagnostic agents combined with drug delivery systems could serve as a “theranostic agent”, an integration of diagnostics and therapeutics into a single platform.

Relevance: 30.00%

Abstract:

Objective: To evaluate the efficacy and toxicity of the Oxaliplatin and 5-Fluorouracil (5-FU)/Leucovorin (LV) combination in ovarian cancer relapsing within 2 years of prior platinum-based chemotherapy in a phase II trial. Methods: Eligible patients had at least one prior platinum-based chemotherapy regimen, elevated CA-125 ≥ 60 IU/l, radiological evidence of disease progression and adequate hepatic, renal and bone marrow function. Patients with raised CA-125 levels alone as a marker of disease relapse were not eligible. Oxaliplatin (85 mg/m²) was given on day 1, and 5-Fluorouracil (370 mg/m²) and Leucovorin (30 mg) were given on days 1 and 8 of a 14-day cycle. Results: Twenty-seven patients were enrolled. The median age was 57 years (range 42-74 years). The median platinum-free interval (PFI) was 5 months (range 0-17 months), with only 30% of patients being platinum sensitive (PFI > 6 months). Six patients (22%) had two prior regimens of chemotherapy. A total of 191 cycles were administered (median 7; range 2-12). All patients were evaluable for toxicity. The following grade 3/4 toxicities were noted: anemia 4%; neutropenia 15%; thrombocytopenia 11%; neurotoxicity 8%; lethargy 4%; diarrhea 4%; hypokalemia 11%; hypomagnesemia 11%. Among the 27 enrolled patients, 20 were evaluable for response by WHO criteria and 25 were evaluable by Rustin's CA-125 criteria. The overall response rate (RR) by WHO criteria was 30% (95% CI: 15-52) [three complete responses (CRs) and three partial responses (PRs)]. The CA-125 response rate was 56% (95% CI: 37-73). Significantly, a 25% (95% CI: 9-53) radiological and a 50% (95% CI: 28-72) CA-125 response rate were noted in platinum-resistant patients (PFI < 6 months). The median response duration was 4 months (range 3-12) and the median overall survival was 10 months. Conclusion: The Oxaliplatin and 5-Fluorouracil/Leucovorin combination has a good safety profile and is active in platinum-pretreated advanced epithelial ovarian cancer. © 2004 Elsevier Inc. All rights reserved.

Relevance: 30.00%

Abstract:

The rapid development of the World Wide Web has created massive amounts of information, leading to the information overload problem. Under these circumstances, personalization techniques have been developed to help users find content that meets their personalized interests or needs within this massively increasing information. User profiling techniques play the core role in this research. Traditionally, most user profiling techniques create user representations in a static way. However, in real-world applications user interests may change over time. In this research we develop algorithms for mining user interests by integrating time-decay mechanisms into topic-based user interest profiling. Time-forgetting functions are integrated into the calculation of topic interest measurements at an in-depth level. The experimental study shows that accounting for the temporal effects of user interests by integrating time-forgetting mechanisms yields better recommendation performance.
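A minimal sketch of a time-forgetting weighting applied to topic interest scores is shown below, assuming an exponential decay with a 30-day half-life; the half-life, topic names and data layout are illustrative assumptions rather than the algorithm developed in the paper.

```python
import math
from collections import defaultdict

# Illustrative sketch: weight each observed interaction with a topic by an
# exponential time-forgetting function, so recent interests count for more.
# The 30-day half-life and the sample interactions are assumptions.

HALF_LIFE_DAYS = 30.0

def forgetting(age_days: float) -> float:
    """Exponential time-decay weight: 1.0 now, 0.5 after one half-life."""
    return math.exp(-math.log(2) * age_days / HALF_LIFE_DAYS)

def topic_interest(interactions) -> dict:
    """interactions: iterable of (topic, age_in_days) pairs."""
    scores = defaultdict(float)
    for topic, age_days in interactions:
        scores[topic] += forgetting(age_days)
    return dict(scores)

history = [("photography", 2), ("photography", 5),   # recent
           ("gardening", 200), ("gardening", 250)]    # long ago
print(topic_interest(history))
# photography scores high; gardening has decayed to a small residual weight.
```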

Relevance: 30.00%

Abstract:

Different reputation models are used on the web to generate reputation values for products from users' review data. Most current reputation models use review ratings and neglect users' textual reviews, because text is more difficult to process. However, we argue that the overall reputation score for an item does not reflect the actual reputation of all of its features, which is why the use of users' textual reviews is necessary. In our work we introduce a new reputation model that defines a new aggregation method for users' opinions about product features extracted from review text. Our model uses a feature ontology to define the general features and sub-features of a product. It also reflects the frequencies of positive and negative opinions. We provide a case study to show how our results compare with other reputation models.
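A small sketch of aggregating extracted feature-level opinions into per-feature reputation scores follows, rolling sub-features up to their parent feature; the feature hierarchy, sample opinions and scoring formula are illustrative assumptions, not the model proposed here.

```python
# Illustrative sketch: per-feature reputation from counts of positive/negative
# opinions extracted from review text, with sub-features rolled up to parents.
# The feature ontology, sample opinions and score formula are assumptions.

FEATURE_PARENT = {"battery life": "battery", "charging speed": "battery",
                  "battery": None, "screen": None}

# (feature, is_positive) pairs as they might come out of an opinion extractor.
opinions = [("battery life", True), ("battery life", True),
            ("charging speed", False), ("screen", True), ("screen", False)]

def feature_reputation(opinions):
    pos, neg = {}, {}
    for feature, is_positive in opinions:
        counter = pos if is_positive else neg
        node = feature
        while node is not None:                 # count the feature and its ancestors
            counter[node] = counter.get(node, 0) + 1
            node = FEATURE_PARENT.get(node)
    scores = {}
    for feature in set(pos) | set(neg):
        p, n = pos.get(feature, 0), neg.get(feature, 0)
        scores[feature] = p / (p + n)           # share of positive opinions
    return scores

for feature, score in feature_reputation(opinions).items():
    print(f"{feature}: {score:.2f}")
```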

Relevance: 30.00%

Abstract:

Textual document sets have become an important and rapidly growing information source on the web. Text classification is one of the crucial technologies for information organisation and management; it has become more and more important and has attracted wide attention from researchers in different research fields. In this paper, many feature selection methods, implementation algorithms and applications of text classification are first introduced. However, because there is much noise in the knowledge extracted by current data-mining techniques for text classification, much uncertainty arises in the text classification process, from both knowledge extraction and knowledge usage; therefore, more innovative techniques and methods are needed to improve the performance of text classification. Further improving the process of knowledge extraction and the effective utilization of the extracted knowledge is a critical and challenging step. A Rough Set decision-making approach is proposed, using Rough Set decision techniques to more precisely classify the textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies, to demonstrate the Rough Set concepts and the decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision, to set up an innovative evaluation metric named CEI that is very effective for performance assessment in similar research, and to propose a promising research direction for addressing the challenging problems in text classification, text mining and other related fields.
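To make the Rough Set idea concrete, the sketch below computes lower and upper approximations of a target class from an indiscernibility (equivalence) partition; documents in the boundary region are the ones a classic classifier cannot separate cleanly. The equivalence classes and labels are made up for illustration and are not data from the paper.

```python
# Tiny illustration of Rough Set approximations. The equivalence classes
# (documents indistinguishable under the chosen features) and the target
# class are made-up examples.

# Documents grouped into equivalence classes by their feature representation.
equivalence_classes = [
    {"d1", "d2"},        # all look alike and are all "sport"
    {"d3", "d4", "d5"},  # mixed: some "sport", some not
    {"d6"},              # clearly not "sport"
]

target = {"d1", "d2", "d3"}  # documents actually labelled "sport"

lower = set().union(*(block for block in equivalence_classes if block <= target))
upper = set().union(*(block for block in equivalence_classes if block & target))

print("Lower approximation (certainly sport):", lower)   # {d1, d2}
print("Upper approximation (possibly sport):", upper)    # {d1, d2, d3, d4, d5}
print("Boundary region (uncertain):", upper - lower)     # {d3, d4, d5}
```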