Abstract:
Background: The expected benefit of transvaginal specimen extraction is reduced incision-related morbidity. Objectives: A systematic review of transvaginal specimen extraction in colorectal surgery was carried out to assess this expectation. Method: The following keywords, in various combinations, were searched: NOSE (natural orifice specimen extraction), colorectal, colon surgery, transvaginal, right hemicolectomy, left hemicolectomy, low anterior resection, sigmoidectomy, ileocaecal resection, proctocolectomy, colon cancer, sigmoid diverticulitis and inflammatory bowel diseases. Selection criteria were large bowel resection with transvaginal specimen extraction, a laparoscopic approach, human studies and the English language; exclusion criteria were experimental studies, a laparotomic approach and local excision. All articles published up to February 2011 were included. Results: Twenty-three articles (including a total of 130 patients) fulfilled the search criteria. The primary diagnosis was colorectal cancer in 51% (67) of patients, endometriosis in 46% (60) of patients and other conditions in the remaining patients. A concurrent gynaecological procedure was performed in 17% (22) of patients. One case of conversion to laparotomy was reported. In two patients, transvaginal extraction failed. In left- and right-sided resections, the rate of severe complications was 3.7% and 2%, respectively. Two significant complications, one pelvic seroma and one rectovaginal fistula, were likely to have been related to transvaginal extraction. The duration of follow-up was specified in only one study. Adequate lymph node harvest and negative margins were reported in 70% of oncological cases. Conclusion: Vaginal extraction of a colorectal surgery specimen shows potential benefit, particularly when combined with a gynaecological procedure. Data from prospective randomized trials are needed to support the routine use of this technique.
Abstract:
A criminal investigation requires searching for and interpreting the vestiges of a past criminal act. The forensic investigator acts in this context as a critical reader of the investigation scene, in search of physical traces that should enable her to tell a story of the offence/crime which allegedly occurred. The challenge of any investigator is to detect and recognise relevant physical traces in order to provide forensic clues for investigation and intelligence purposes. Inspired by this observation, the current research focuses on the following questions: What is a relevant physical trace? And how does the forensic investigator know she is facing one? The interest of such questions lies in providing a definition of a dimension often used in forensic science but never studied in its implications and operations. This doctoral research investigates scientific paths seldom explored in forensic science, using semiotic and sociological tools combined with statistical data analysis. The results are presented along a semiotic track, strongly influenced by Peirce's studies, and a second, empirical track, in which investigation data were analysed and forensic investigators were interviewed about their work practices in the field. The semiotic track gives a macroscopic view of the signification process running from the physical trace discovered at the scene to what the investigator evaluates as relevant. The physical trace is perceived in the form of several signs whose meaning is culturally codified. The reasoning consists of three main steps: 1- What kind of source does the discovered physical trace refer to? 2- What cause/activity is at the origin of this source in the specific context of the case? 3- What story can be told from these observations? Stage 3 requires reasoning by creating hypotheses that explain the presence of the discovered trace as resulting from an activity; specifically, the activity related to the investigated case. Validating these hypotheses depends on their ability to satisfy a rule of relevancy. The last step is the symbolisation of the relevancy. The rule consists of two points: the recognition of factual/circumstantial relevancy (Is the link between the trace and the case recognised with the formulated hypothesis?) and of appropriate relevancy (What investment is required to collect and analyse the discovered trace, considering the expected outcome at the investigation/intelligence level?). This process of meaning is based on observations and conjectural reasoning subject to many influences. In this study, relevancy in forensic science is presented as a conventional dimension that is symbolised and conditioned by the context, the forensic investigator's practice and her workplace environment (the culture of the place). In short, the current research states that relevancy results from the interactions between parameters of situational, structural (or organisational) and individual orders. The detection, collection and analysis of relevant physical traces at scenes depend on the knowledge and culture mastered by the forensic investigator. In studying the relation between the relevant trace and the forensic investigator, this research introduces the KEE model as a conceptual map that illustrates three major areas of forensic knowledge and culture acquisition involved in the search for and evaluation of the relevant physical trace.
Through the analysis of the investigation data and the interviews, the relationship between these three parameters and relevancy was highlighted. K, for knowing, embodies a relationship to immediate knowledge that provides an overview of reality at a specific moment; an important point, since relevancy is signified in a context. E, for education, is considered through its relationship with relevancy via a culture that tends to become institutionalised; it represents theoretical knowledge. The second E, for experience, relates to relevancy through the adjustment of intervention strategies (i.e. practical knowledge) by each practitioner, who modulates her work in light of successes and setbacks, case after case. The two E parameters constitute the library of resources for the semiotic recognition process, and the K parameter ensures the contextualisation required to set up the reasoning and to formulate explanatory hypotheses for the discovered physical traces questioned in their relevancy. This research demonstrates that relevancy is not absolute. It is temporal and contextual; it is a conventional and relative dimension that must be discussed. This is where the whole issue of the meaning of what is relevant to each stakeholder of the investigation process rests. By proposing a step-by-step approach to the meaning process from the physical trace to the forensic clue, this study aims to provide a more advanced understanding of the reasoning and its operation, in order to strengthen forensic investigators' training. This doctoral research presents a set of tools of both pedagogical and practical value for crime scene management, while identifying key influences of individual, structural and situational dimensions.
Abstract:
A large variety of cancer vaccines have undergone extensive testing in early-phase clinical trials. A limited number have also been tested in randomized phase II clinical trials. Encouraging trends toward increased survival in the vaccine arms have recently been observed for 2 vaccine candidates in patients with non-small-cell lung cancer. These have provided the impetus for the initiation of phase III trials in large groups of patients with lung cancer. These vaccines target 2 antigens widely expressed in lung carcinomas: melanoma-associated antigen 3, a cancer testis antigen; and mucin 1, an antigen overexpressed in a largely deglycosylated form in advanced tumors. Therapeutic cancer vaccines aim at inducing strong CD8 and CD4 T-cell responses. The majority of vaccines recently tested in phase I clinical trials show efficacy in terms of inducing specific tumor antigen immunity; clinical efficacy, however, remains to be determined and appears limited. Efforts are thus aimed at understanding the basis for this apparent lack of effect on tumors. Two major factors are involved. On the one hand, current vaccines are suboptimal: strong adjuvant agents and appropriate tumor antigens are needed, and dose, route, and schedule also need optimization. On the other hand, it is now clear that large tumors often present a tolerogenic microenvironment that hampers effective antitumor immunity. The partial understanding of the molecular pathways leading to functional inactivation of T cells at tumor sites has provided new targets for intervention. In this regard, blockade of cytotoxic T-lymphocyte antigen-4 and programmed death-1 with humanized monoclonal antibodies has reached the clinical testing stage. In the future, more potent cancer vaccines will benefit from intense research into antigen discovery and adjuvant agents. Furthermore, it is likely that vaccines will need to be combined with compounds that reverse the major tolerogenic pathways constitutively active at the tumor site. Developing these combined approaches to vaccination in cancer promises new, exciting findings and, at the same time, poses important challenges to academic research institutions and the pharmaceutical industry.
Abstract:
BACKGROUND: Elderly patients are emerging as a population at high risk for infective endocarditis (IE). However, adequately sized prospective studies on the features of IE in elderly patients are lacking. METHODS: In this multinational, prospective, observational cohort study within the International Collaboration on Endocarditis, 2759 consecutive patients were enrolled from June 15, 2000, to December 1, 2005; 1056 patients with IE aged 65 years or older were compared with 1703 patients younger than 65 years. Risk factors, predisposing conditions, origin, clinical features, course, and outcome of IE were comprehensively analyzed. RESULTS: Elderly patients more frequently reported a hospitalization or an invasive procedure before IE onset. Diabetes mellitus and genitourinary and gastrointestinal cancer were the major predisposing conditions. Blood culture yield was higher among elderly patients with IE. The leading causative organism was Staphylococcus aureus, with a higher rate of methicillin resistance. Streptococcus bovis and enterococci were also significantly more prevalent. The clinical presentation of elderly patients with IE was remarkable for lower rates of embolism, immune-mediated phenomena, and septic complications. At both echocardiography and surgery, fewer vegetations and more abscesses were found, and the gain in diagnostic yield from transesophageal echocardiography was significantly larger. Significantly fewer elderly patients underwent cardiac surgery (38.9% vs 53.5%; P < .001). Elderly patients with IE showed a higher rate of in-hospital death (24.9% vs 12.8%; P < .001), and age older than 65 years was an independent predictor of mortality. CONCLUSIONS: In this large prospective study, increasing age emerges as a major determinant of the clinical characteristics of IE. Lower rates of surgical treatment and high mortality are the most prominent features of elderly patients with IE. Efforts should be made to prevent health care-associated acquisition and to improve outcomes in this major subgroup of patients with IE.
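As a purely illustrative aside, the surgery-rate comparison quoted above (38.9% vs 53.5%; P < .001) can be reproduced approximately with a chi-square test on a contingency table rebuilt from the reported group sizes; this is a minimal sketch, and the cell counts are approximations derived from the percentages, not the study's raw data.

```python
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported percentages
# (elderly: 38.9% of 1056 underwent surgery; younger: 53.5% of 1703).
surgery_elderly, no_surgery_elderly = 411, 1056 - 411
surgery_younger, no_surgery_younger = 911, 1703 - 911

table = [[surgery_elderly, no_surgery_elderly],
         [surgery_younger, no_surgery_younger]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")  # p falls far below .001, consistent with the report
```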
Abstract:
Precise MEG estimates of neuronal current flow are undermined by uncertain knowledge of the head location with respect to the MEG sensors, due either to head movements within the scanning session or to systematic errors in co-registration to anatomy. Here we show how such errors can be minimized using subject-specific head-casts produced using 3D printing technology. The casts fit the scalp of the subject internally and the inside of the MEG dewar externally, reducing within-session and between-session head movements. Systematic errors in matching to the MRI coordinate system are also reduced through the use of MRI-visible fiducial markers placed on the same cast. Bootstrap estimates of absolute co-registration error were of the order of 1 mm. Estimates of relative co-registration error were <1.5 mm between sessions. We corroborated these scalp-based estimates by examining the MEG data recorded over a 6-month period. We found that the between-session sensor variability of the subject's evoked response was of the order of the within-session noise, showing no appreciable noise due to between-session movement. Simulations suggest that the between-session sensor-level amplitude SNR improved by a factor of 5 over conventional strategies. We show that at this level of co-registration accuracy there is strong evidence for anatomical models based on the individual rather than canonical anatomy, but that this advantage disappears for errors greater than 5 mm. This work paves the way for source reconstruction methods that can exploit very high SNR signals and accurate anatomical models, and it also significantly increases the sensitivity of longitudinal studies with MEG.
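To make the bootstrap idea concrete, here is a minimal sketch of how an absolute co-registration error "of the order of 1 mm" might be estimated by resampling repeated fiducial digitizations; the array layout, noise level and resampling scheme are assumptions for illustration, not the study's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 20 repeated digitizations of one fiducial position (x, y, z in mm).
# The spread across rows stands in for co-registration uncertainty.
fiducials = rng.normal(loc=[[10.0, 75.0, 40.0]], scale=0.5, size=(20, 3))

def bootstrap_coreg_error(points: np.ndarray, n_boot: int = 2000) -> tuple[float, float]:
    """Bootstrap the distance of resampled mean positions from the grand mean."""
    grand_mean = points.mean(axis=0)
    errors = np.empty(n_boot)
    for i in range(n_boot):
        # Resample digitizations with replacement
        sample = points[rng.integers(0, len(points), size=len(points))]
        # Euclidean distance of the resampled mean from the grand mean
        errors[i] = np.linalg.norm(sample.mean(axis=0) - grand_mean)
    return errors.mean(), errors.std()

mean_err, sd_err = bootstrap_coreg_error(fiducials)
print(f"bootstrap co-registration error: {mean_err:.2f} +/- {sd_err:.2f} mm")
```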
Abstract:
Owl pellets contain a good skeletal record of the small mammals consumed, corresponding to the undigested portions of prey that are regurgitated. These pellets are easy to find at the roosting sites of owls. As it has been demonstrated that amplifiable DNA can be isolated from ancient bone remains, the possibility of using owl pellets as a source of DNA for small mammal genetic studies via the polymerase chain reaction has been investigated. The main uncertainties when isolating DNA from such material are, firstly, that the extracted DNA may be too degraded by digestion in the owl's stomach and, secondly, that extensive cross-contamination could occur among the different prey consumed. The results obtained clearly demonstrate that cross-contamination does not occur and that mitochondrial and nuclear DNA can be amplified using the skulls of small mammals found in owl pellets as a source of DNA. The relative efficiency of two methods of DNA extraction is estimated and discussed. Thus, owl pellets represent a non-invasive sampling technique providing a valuable source of DNA for studying the population genetics of small mammals.
Abstract:
Understanding the extent of genomic transcription and its functional relevance is a central goal in genomics research. However, detailed genome-wide investigations of transcriptome complexity in major mammalian organs have been scarce. Here, using extensive RNA-seq data, we show that transcription of the genome is substantially more widespread in the testis than in other organs across representative mammals. Furthermore, we reveal that meiotic spermatocytes and especially postmeiotic round spermatids have remarkably diverse transcriptomes, which explains the high transcriptome complexity of the testis as a whole. The widespread transcriptional activity in spermatocytes and spermatids encompasses protein-coding and long noncoding RNA genes but also poorly conserved intergenic sequences, suggesting that it may not be of immediate functional relevance. Rather, our analyses of genome-wide epigenetic data suggest that this prevalent transcription, which most likely promoted the birth of new genes during evolution, is facilitated by an overall permissive chromatin state in these germ cells that results from extensive chromatin remodeling.
Abstract:
Allergen-specific immunotherapy is the only immunomodulatory and etiological therapy for allergy and asthma. Conventional specific immunotherapy (SIT) with whole-allergen extract is antigen-specific, effective in multiple organs, efficient against asthma under defined conditions, provides long-lasting protection and is cost-effective. Moreover, SIT is able to prevent the progression of rhinitis to asthma. SIT has its drawbacks: the long duration of treatment, the unsatisfactory standardization of allergen extracts and a questionable safety level. Novel approaches aim at drastically reducing adverse anaphylactic events, shortening the duration of therapy and improving its efficacy. Promising new formulations are based on a limited set of recombinant allergens or chimeric molecules, as well as on hypoallergenic allergen fragments or peptides. The simultaneous use of adjuvants with immunomodulatory properties may contribute to improving both the safety and the efficacy of allergen SIT for allergy and asthma.
Abstract:
Community-level patterns of functional traits relate to community assembly and ecosystem functioning. By modelling the changes of different indices describing such patterns - trait means, extremes and diversity in communities - as a function of abiotic gradients, we can understand their drivers and build projections of the impact of global change on the functional components of biodiversity. We used five plant functional traits (vegetative height, specific leaf area, leaf dry matter content, leaf nitrogen content and seed mass) and non-woody vegetation plots to model several indices depicting community-level patterns of functional traits from a set of abiotic environmental variables (topographic, climatic and edaphic) over contrasting environmental conditions in a mountainous landscape. We performed a variation partitioning analysis to assess the relative importance of these variables for predicting patterns of functional traits in communities, and projected the best models under several climate change scenarios to examine potential future changes in vegetation functional properties. Not all indices of trait patterns within communities could be modelled with the same level of accuracy: the models for mean and extreme values of functional traits provided substantially better predictive accuracy than the models calibrated for diversity indices. Topographic and climatic factors were more important predictors of functional trait patterns within communities than edaphic predictors. Overall, model projections forecast an increase in mean vegetation height and in mean specific leaf area following climate warming. This trend was most pronounced at mid elevations, particularly between 1000 and 2000 m a.s.l. With this study we showed that topographic, climatic and edaphic variables can successfully model descriptors of community-level patterns of plant functional traits such as mean and extreme trait values. However, which factors determine the diversity of functional traits in plant communities remains unclear and requires further investigation.
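To illustrate one of the community-level indices modelled above, the sketch below computes community-weighted mean (CWM) trait values, i.e. abundance-weighted trait means per plot; the species, trait columns and abundance values are hypothetical, assumed only for the example, and the study's own index definitions may differ.

```python
import pandas as pd

# Hypothetical species-by-trait table (vegetative height in cm, SLA in mm^2/mg)
traits = pd.DataFrame(
    {"height_cm": [12.0, 85.0, 30.0], "sla_mm2_mg": [22.5, 14.1, 18.7]},
    index=["Poa alpina", "Festuca rubra", "Trifolium repens"],
)

# Hypothetical plot-by-species relative abundances (each row sums to 1)
abund = pd.DataFrame(
    {"Poa alpina": [0.5, 0.1], "Festuca rubra": [0.3, 0.7], "Trifolium repens": [0.2, 0.2]},
    index=["plot_1", "plot_2"],
)

def community_weighted_mean(abund: pd.DataFrame, traits: pd.DataFrame) -> pd.DataFrame:
    """CWM_j = sum_i p_ij * t_i: abundance-weighted mean of each trait per plot."""
    p = abund[traits.index].to_numpy()  # align species order with the trait table
    return pd.DataFrame(p @ traits.to_numpy(), index=abund.index, columns=traits.columns)

print(community_weighted_mean(abund, traits))
```

An extreme-value index (e.g. the community maximum of a trait) follows the same pattern, replacing the weighted mean with a maximum over the species present in each plot.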
Abstract:
The treatment of writer's cramp, a task-specific focal hand dystonia, needs new approaches. A deficiency of inhibition in the motor cortex might cause writer's cramp. Transcranial direct current stimulation modulates cortical excitability and may provide a therapeutic alternative. In this randomized, double-blind, sham-controlled study, we investigated the efficacy of cathodal stimulation of the contralateral motor cortex in 3 sessions in 1 week. Assessment over a 2-week period included clinical scales, subjective ratings, kinematic handwriting analysis, and neurophysiological evaluation. Twelve patients with unilateral dystonic writer's cramp were investigated; 6 received transcranial direct current and 6 sham stimulation. Cathodal transcranial direct current stimulation had no favorable effects on clinical scales and failed to restore normal handwriting kinematics and cortical inhibition. Subjective worsening remained unexplained, leading to premature study termination. Repeated sessions of cathodal transcranial direct current stimulation of the motor cortex yielded no favorable results supporting a therapeutic potential in writer's cramp.
Abstract:
Locally advanced prostate cancer (LAPC) is a heterogeneous entity usually embracing T3-4 and/or pelvic lymph-node-positive disease in the absence of established metastases. Outcomes for LAPC with single therapies have traditionally been poor, leading to the investigation of adjuvant therapies. Prostate cancer is a hormonally sensitive tumour, which usually responds to pharmacological manipulation of the androgen receptor or its testosterone-related ligands. As such, androgen deprivation therapy (ADT) has become an important adjuvant strategy for the treatment of LAPC, particularly for patients managed primarily with radiotherapy. Such results have generally not been replicated in surgical patients. With increased use of ADT has come improved awareness of the numerous toxicities associated with long-term use of these agents, as well as the development of strategies for minimizing ADT exposure and actively managing adverse effects. Several trials are exploring agents to enhance radiation cell sensitivity as well as the application of adjuvant docetaxel, an agent with proven efficacy in the metastatic, castrate-resistant setting. The recent work showing activity of cabazitaxel, sipuleucel-T and abiraterone for castrate-resistant disease in the post-docetaxel setting will see these agents investigated in conjunction with definitive surgery and radiotherapy.
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communication networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will, in due course, face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition among the incumbents, firstly through research on cost-competitiveness efforts in the captive outsourcing of engineering, research and development, and secondly through research on process re-engineering in the case of global software support for complex systems. Thirdly, we investigate the views of the industry's actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
During the past decades, anticancer immunotherapy has evolved from a promising therapeutic option to a robust clinical reality. Many immunotherapeutic regimens are now approved by the US Food and Drug Administration and the European Medicines Agency for use in cancer patients, and many others are being investigated as standalone therapeutic interventions or combined with conventional treatments in clinical studies. Immunotherapies may be subdivided into "passive" and "active" based on their ability to engage the host immune system against cancer. Since the anticancer activity of most passive immunotherapeutics (including tumor-targeting monoclonal antibodies) also relies on the host immune system, this classification does not properly reflect the complexity of the drug-host-tumor interaction. Alternatively, anticancer immunotherapeutics can be classified according to their antigen specificity. While some immunotherapies specifically target one (or a few) defined tumor-associated antigen(s), others operate in a relatively non-specific manner and boost natural or therapy-elicited anticancer immune responses of unknown and often broad specificity. Here, we propose a critical, integrated classification of anticancer immunotherapies and discuss the clinical relevance of these approaches.