102 results for Combinatorial reasoning
Abstract:
Policies for reconciling work and family life in the EU member states form a plurality of measures, more or less complex, but rarely coherent. Combining objectives such as raising fertility, protecting mothers and children, promoting equality between women and men, fighting poverty among children and lone-parent families, and activating women in the labour market, these policies are part of national traditions of family, employment and tax policy and bear the heritage and tensions of each country's history. At a moment when a new global player, the European Union, intervenes increasingly explicitly in the debate on and the definition of work-family reconciliation policies, the question at the heart of this thesis is what influence the référentiels of European discourses have had, since the late 1990s, on Italian and French discourses and policies. Starting from a cognitive analysis of the Europeanization process, we show that the référentiels developed within the EU, owing to their abstract and vague nature, have so far had little influence in Italy and France. Crossing a historical and a discursive neo-institutionalist perspective, our research was based on two axes of reasoning.
First, we analysed, on the one hand, the evolution of the discourses of various European institutions (including the European Commission, the European Council and the European Social Fund) and, on the other hand, questioned how a consensus could emerge between countries and actors with very different traditions in social policy, family policy and gender conventions. Secondly, we examined whether and how a framework developed at Community level, as a kind of ideal to strive for, has influenced discourses and policies at the national level.
Abstract:
A criminal investigation requires searching for and interpreting the vestiges of a past criminal act. The forensic investigator acts in this context as a critical reader of the investigation scene, in search of physical traces that should enable her to tell the story of the offence/crime which allegedly occurred. The challenge for any investigator is to detect and recognise relevant physical traces in order to provide forensic clues for investigation and intelligence purposes. Inspired by this observation, the current research focuses on the following questions: What is a relevant physical trace? And how does the forensic investigator know she is facing one? The interest of such questions lies in providing a definition of a dimension often used in forensic science but never studied in its implications and operations. This doctoral research investigates scientific paths that are rarely explored in forensic science, using semiotic and sociological tools combined with statistical data analysis. The results are presented along a semiotic track, strongly influenced by Peirce's studies, and a second, empirical track, in which investigation data were analysed and forensic investigators were interviewed about their work practices in the field. The semiotic track gives a macroscopic view of a signification process running from the physical trace discovered at the scene to what the investigator evaluates as relevant. The physical trace is perceived in the form of several signs whose meaning is culturally codified. The reasoning consists of three main steps: 1- What kind of source does the discovered physical trace refer to? 2- What cause/activity is at the origin of this source in the specific context of the case? 3- What story can be told from these observations? Step 3 requires reasoning by creating hypotheses that should explain the presence of the discovered trace as the result of an activity related to the investigated case. Validating these hypotheses depends on their ability to satisfy a rule of relevancy. The last step is the symbolisation of relevancy. The rule consists of two points: the recognition of factual/circumstantial relevancy (Is the link between the trace and the case recognised by the formulated hypothesis?) and appropriate relevancy (What investment is required to collect and analyse the discovered trace considering the expected outcome at the investigation/intelligence level?). This process of meaning is based on observations and on conjectural reasoning subject to many influences. In this study, relevancy in forensic science is presented as a conventional dimension that is symbolised and conditioned by the context, the forensic investigator's practice and her workplace environment (the culture of the place). In short, the current research states that relevancy results from the interactions between parameters of situational, structural (or organisational) and individual order. The detection, collection and analysis of relevant physical traces at scenes depend on the knowledge and culture mastered by the forensic investigator. In studying the relationship between the relevant trace and the forensic investigator, this research introduces the KEE model as a conceptual map that illustrates three major areas of forensic knowledge and culture acquisition involved in the search for and evaluation of the relevant physical trace.
Through the analysis of the investigation data and interviews, the relationship between these three parameters and relevancy was highlighted. K, for knowing, embodies a relationship to immediate knowledge, allowing an overview of reality at a specific moment; an important point, since relevancy is signified in a context. E, for education, is considered through its relationship with relevancy via a culture that tends to become institutionalised; it represents theoretical knowledge. The second E, for experience, exists in its relation to relevancy through the adjustment of intervention strategies (i.e. practical knowledge) by each practitioner, who has modulated her work in the light of successes and setbacks case after case. The two E parameters constitute the library of resources for the semiotic recognition process, and the K parameter ensures the contextualisation required to set up the reasoning and to formulate explanatory hypotheses for the discovered physical traces questioned in their relevancy. This research demonstrates that relevancy is not absolute. It is temporal and contextual; it is a conventional and relative dimension that must be discussed. This is where the whole issue of the meaning of what is relevant to each stakeholder of the investigation process rests. By proposing a step-by-step approach to the meaning process from the physical trace to the forensic clue, this study aims to provide a more advanced understanding of the reasoning and its operation, in order to strengthen forensic investigators' training. This doctoral research presents a set of tools of both pedagogical and practical value for crime scene management, while identifying key influences of individual, structural and situational dimensions.
Abstract:
The current lack of general practitioners in Switzerland is the result of a health care policy which, in past years, aimed to reduce the number of medical students and of physicians in private practice. Furthermore, during the past decades the Swiss medical schools emphasized the transmission of medical care by specialists and neglected primary care medicine. The Faculty of Medicine at the University of Lausanne recently decided to renew its curriculum. The Department of Ambulatory Care and Community Medicine (Policlinique Médicale Universitaire) of Lausanne is committed to the elaboration of this change. The biomedical model, essential to the acquisition of clinical competence, is still taught to the students. Nevertheless, from the beginning to the end of the curriculum, the emphasis is now placed on clinical skills and clinical reasoning.
Abstract:
Aim and structure of the thesis: In the first article, I focus on the context in which the Homo Economicus was constructed - i.e., the conception of economic actors as fully rational, informed, egocentric, and profit-maximizing. I argue that the Homo Economicus theory was developed in a specific societal context with specific (partly tacit) values and norms. These norms have implicitly influenced the behavior of economic actors and have framed the interpretation of the Homo Economicus. Different factors, however, have weakened this implicit influence of the broader societal values and norms on economic actors. The result is an unbridled interpretation and application of the values and norms of the Homo Economicus in the business environment, and perhaps also in the broader society. In the second article, I show that the morality of many economic actors relies on isomorphism, i.e., the attempt to fit into the group by adopting the moral norms surrounding them. In consequence, if the norms prevailing in a specific group or context (such as a specific region or a specific industry) change, it can be expected that actors with an 'isomorphism morality' will also adapt their ethical thinking and their behavior - for the 'better' or for the 'worse'. The article further describes the process through which corporations could emancipate themselves from the ethical norms prevailing in the broader society, and therefore develop an institution with specific norms and values. These norms mainly rely on mainstream business theories praising the economic actor's self-interest and neglecting moral reasoning. Moreover, because of isomorphism morality, many economic actors have changed their perception of ethics, and have abandoned the values prevailing in the broader society in order to adopt those of economic theory. Finally, isomorphism morality also implies that these economic actors will change their morality again if the institutional context changes. The third article highlights the role and responsibility of business scholars in promoting a systematic reflection and self-critique of the business system, and develops alternative models to fill the moral void of the business institution and to address its inherent legitimacy crisis. Indeed, the current business institution relies on assumptions such as scientific neutrality and specialization, which seem at least partly challenged by two factors. First, self-fulfilling prophecy provides scholars with an important (even if sometimes undesired) normative influence over practical life. Second, the increasing complexity of today's (socio-political) world and the interactions between the different elements constituting our society question the strong specialization of science. For instance, economic theories are not unrelated to psychology or sociology, and economic actors influence socio-political structures and processes, e.g., through lobbying (Dobbs, 2006; Rondinelli, 2002), or through marketing, which changes not only the way we consume, but more generally tries to instill a specific lifestyle (Cova, 2004; M. K. Hogg & Michell, 1996; McCracken, 1988; Muniz & O'Guinn, 2001). In consequence, business scholars are key actors in shaping both tomorrow's economic world and its broader context. A greater awareness of this influence might be a first step toward an increased feeling of civic responsibility and accountability for the models and theories developed or taught in business schools.
Abstract:
This article presents a global vision of images in forensic science. The proliferation of perspectives on the use of images throughout criminal investigations and the increasing demand for research on this topic seem to demand a forensic science-based analysis. In this study, the definitions of and concepts related to material traces are revisited and applied to images, and a structured approach is used to persuade the scientific community to extend and improve the use of images as traces in criminal investigations. Current research efforts focus on technical issues and evidence assessment. This article provides a sound foundation for rationalising and explaining the processes involved in the production of clues from trace images. For example, the mechanisms through which these visual traces become clues of presence or action are described. An extensive literature review of forensic image analysis emphasises the existing guidelines and knowledge available for answering investigative questions (who, what, where, when and how). However, complementary developments are still necessary to demystify many aspects of image analysis in forensic science, including how to review and select images or use them to reconstruct an event or assist intelligence efforts. The hypothetico-deductive reasoning pathway used to discover unknown elements of an event or crime can also help scientists understand the underlying processes involved in their decision making. An analysis of a single image in an investigative or probative context is used to demonstrate the highly informative potential of images as traces and/or clues. Research efforts should be directed toward formalising the extraction and combination of clues from images. An appropriate methodology is key to expanding the use of images in forensic science.
Abstract:
Copy-number variants (CNVs) represent a significant interpretative challenge, given that each CNV typically affects the dosage of multiple genes. Here we report on five individuals with coloboma, microcephaly, developmental delay, short stature, and craniofacial, cardiac, and renal defects who harbor overlapping microdeletions on 8q24.3. Fine mapping localized a commonly deleted 78 kb region that contains three genes: SCRIB, NRBP2, and PUF60. In vivo dissection of the CNV showed discrete contributions of the planar cell polarity effector SCRIB and the splicing factor PUF60 to the syndromic phenotype, and the combinatorial suppression of both genes exacerbated some, but not all, phenotypic components. Consistent with these findings, we identified an individual with microcephaly, short stature, intellectual disability, and heart defects with a de novo c.505C>T variant leading to a p.His169Tyr change in PUF60. Functional testing of this allele in vivo and in vitro showed that the mutation perturbs the relative dosage of two PUF60 isoforms and, subsequently, the splicing efficiency of downstream PUF60 targets. These data inform the functions of two genes not associated previously with human genetic disease and demonstrate how CNVs can exhibit complex genetic architecture, with the phenotype being the amalgam of both discrete dosage dysfunction of single transcripts and also of binary genetic interactions.
Abstract:
In my thesis I present the findings of a multiple-case study on the CSR approach of three multinational companies, applying Basu and Palazzo's (2008) CSR-character as a process model of sensemaking, Suchman's (1995) framework on legitimation strategies, and Habermas's (1996) concept of deliberative democracy. The theoretical framework is based on the assumption of a postnational constellation (Habermas, 2001), which sends multinational companies into a process of sensemaking (Weick, 1995) with regard to their responsibilities in a globalizing world. The major reason is that mainstream CSR concepts are based on the assumption of a liberal market economy embedded in a nation state and therefore do not fit the changing conditions for the legitimation of corporate behavior in a globalizing world. For the purpose of this study, I primarily looked at two research questions: (i) How can the CSR approach of a multinational corporation be systematized empirically? (ii) What is the impact of the changing conditions in the postnational constellation on the CSR approach of the studied multinational corporations? For the analysis, I adopted a holistic approach (Patton, 1980), combining elements of a deductive and inductive theory-building methodology (Eisenhardt, 1989b; Eisenhardt & Graebner, 2007; Glaser & Strauss, 1967; Van de Ven, 1992) and rigorous qualitative data analysis. Primary data were collected through 90 semi-structured interviews, in two rounds, with executives and managers in three multinational companies and their respective stakeholders. Raw data originating from interview tapes, field notes, and contact sheets were processed, stored, and managed using the software program QSR NVIVO 7. In the analysis, I applied qualitative methods to strengthen the interpretative part as well as quantitative methods to identify dominating dimensions and patterns. I found three different coping behaviors that provide insights into the corporate mindset. The results suggest that multinational corporations increasingly turn towards relational approaches to CSR to achieve moral legitimacy in formalized dialogical exchanges with their stakeholders, since legitimacy can no longer be derived from a national framework alone. I also looked at the degree to which they have reacted to the postnational constellation by assuming former state duties, and at the underlying reasoning. The findings indicate that CSR approaches become increasingly comprehensive through integrating political strategies that reflect the growing (self-)perception of multinational companies as political actors. Based on the results, I developed a model which relates the different dimensions of corporate responsibility to the discussion on deliberative democracy, global governance and social innovation, in order to provide guidance for multinational companies in a postnational world. With my thesis, I contribute to management research by (i) delivering a comprehensive critique of the mainstream CSR literature and (ii) filling the gap in thorough qualitative research on CSR in a globalizing world, using the CSR-character as an empirical device, and (iii) to organizational studies by further advancing the deliberative view of the firm proposed by Scherer and Palazzo (2008).
Abstract:
A multicomponent indicator displacement assay (MIDA) based on an organometallic receptor and three dyes can be used for the identification and quantification of nucleotides in aqueous solution at neutral pH.
Abstract:
Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received some punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, as well as (iii) the contribution of each individual item within a given group of pieces of evidence, still represent fundamental areas of research. To some degree, this is remarkable, since both forensic science theory and practice, and indeed many daily inference tasks, require the consideration of multiple items if not masses of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in some viable way. Based on a review of the existing main contributions in this area, the article aims at presenting instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation, as well as their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy and directional change. These distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have a bearing on how multiple items of evidence are meaningfully and appropriately set into context.
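To make the kind of joint assessment described above concrete, the following sketch builds a minimal two-finding Bayesian network H -> E1, H -> E2 in plain Python and computes single-item and joint likelihood ratios by enumeration. The structure and all probabilities are hypothetical illustrations, not values from the paper's case studies.

```python
# Illustrative sketch only: a toy Bayesian network for jointly evaluating two
# items of evidence. The structure H -> E1, H -> E2 and all probabilities
# below are hypothetical.

# P(H): prior for the proposition H (e.g. "the suspect is the source")
p_H = {True: 0.5, False: 0.5}

# P(E1 | H) and P(E2 | H): probabilities of observing each finding
p_E1_given_H = {True: 0.8, False: 0.1}
p_E2_given_H = {True: 0.6, False: 0.2}

def joint(h, e1, e2):
    """Joint probability P(H=h, E1=e1, E2=e2) under the toy network."""
    p = p_H[h]
    p *= p_E1_given_H[h] if e1 else 1 - p_E1_given_H[h]
    p *= p_E2_given_H[h] if e2 else 1 - p_E2_given_H[h]
    return p

def posterior_H(e1, e2):
    """P(H=True | E1=e1, E2=e2) by brute-force enumeration."""
    num = joint(True, e1, e2)
    den = sum(joint(h, e1, e2) for h in (True, False))
    return num / den

# Likelihood ratios: each item alone, and both items jointly
lr_e1 = p_E1_given_H[True] / p_E1_given_H[False]                       # 8.0
lr_e2 = p_E2_given_H[True] / p_E2_given_H[False]                       # 3.0
lr_joint = (p_E1_given_H[True] * p_E2_given_H[True]) / (
    p_E1_given_H[False] * p_E2_given_H[False])                         # 24.0

print(f"LR(E1) = {lr_e1}, LR(E2) = {lr_e2}, joint LR = {lr_joint}")
print(f"P(H | E1, E2) = {posterior_H(True, True):.3f}")
```

Under the assumed conditional independence the joint likelihood ratio is simply the product of the individual ones; real case networks typically relax this assumption, which is precisely where effects such as redundancy or synergy between items of evidence become visible.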
Abstract:
At a time when disciplined inference and decision making under uncertainty represent common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and, we think, rightly so, but they go against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory and the view according to which this theory is normative.
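As a purely illustrative complement (the numbers below are invented, not drawn from the article), the normative point can be shown with a minimal expected-utility calculation: coherence requires choosing the action whose expected utility, given the probabilities of the states of nature, is largest, even when one state looks very probable.

```python
# Minimal, hypothetical illustration of the normative use of decision theory:
# a coherent decision maker picks the action with the highest expected
# utility. All numbers are invented for illustration.

p_state = {"source": 0.7, "not_source": 0.3}   # assumed posterior probabilities

# Utilities (arbitrary scale) for each decision under each state
utility = {
    "report_identification": {"source": 10, "not_source": -100},
    "report_inconclusive":   {"source": 0,  "not_source": 0},
}

def expected_utility(action):
    return sum(p_state[s] * utility[action][s] for s in p_state)

best = max(utility, key=expected_utility)
for action in utility:
    print(f"{action}: EU = {expected_utility(action):.1f}")
print("coherent choice:", best)
```

In this toy setting the high posterior probability of 'source' does not by itself justify reporting an identification, because the assumed loss attached to a false identification dominates the expected utility.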
Abstract:
MOTIVATION: The analysis of molecular coevolution provides information on the potential functional and structural implications of positions along DNA sequences, and several methods are available to identify coevolving positions using probabilistic or combinatorial approaches. The specific nucleotide or amino acid profile associated with the coevolution process is, however, not estimated; only known profiles, such as the Watson-Crick constraint, are usually considered a priori in current measures of coevolution. RESULTS: Here, we propose a new probabilistic model, Coev, to identify coevolving positions and their associated profile in DNA sequences while incorporating the underlying phylogenetic relationships. The process of coevolution is modeled by a 16 × 16 instantaneous rate matrix that includes rates of transition as well as a profile of coevolution. We used simulated, empirical and illustrative data to evaluate our model and to compare it with a model of 'independent' evolution using the Akaike Information Criterion. We showed that the Coev model is able to discriminate between coevolving and non-coevolving positions and provides better sensitivity and specificity than other available approaches. We further demonstrate that the identification of the profile of coevolution can shed new light on the process of dependent substitution during lineage evolution.
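The model-selection step mentioned above can be illustrated with a short sketch: each fitted model is scored with AIC = 2k - 2 ln L and the model with the lower score is preferred. The log-likelihoods and parameter counts below are placeholders, not values from the Coev study.

```python
# Sketch of comparing a coevolution model with an independent-evolution model
# via the Akaike Information Criterion. Numbers are placeholders.

def aic(log_likelihood, n_params):
    """AIC = 2k - 2*ln(L), using the maximised log-likelihood."""
    return 2 * n_params - 2 * log_likelihood

models = {
    # name: (maximised log-likelihood, number of free parameters)
    "independent": (-1520.4, 8),
    "coev":        (-1498.7, 10),   # e.g. extra rates for the coevolution profile
}

scores = {name: aic(ll, k) for name, (ll, k) in models.items()}
best = min(scores, key=scores.get)

for name, score in scores.items():
    print(f"{name}: AIC = {score:.1f}")
print("preferred model:", best,
      "| delta AIC =", round(abs(scores["independent"] - scores["coev"]), 1))
```

The lower-AIC model is favoured because the gain in likelihood outweighs the penalty for its additional parameters, which is the same trade-off used to decide between dependent and independent substitution at a pair of positions.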
Abstract:
BACKGROUND: Elucidating disease and developmental dysfunction requires understanding variation in phenotype. Single-species model organism anatomy ontologies (ssAOs) have been established to represent this variation. Multi-species anatomy ontologies (msAOs; vertebrate skeletal, vertebrate homologous, teleost, and amphibian AOs) have been developed to represent 'natural' phenotypic variation across species. Our aim has been to integrate ssAOs and msAOs for various purposes, including establishing links between phenotypic variation and candidate genes. RESULTS: Previously, msAOs contained a mixture of unique and overlapping content. This hampered integration and coordination due to the need to maintain cross-references or inter-ontology equivalence axioms to the ssAOs, or to perform large-scale obsolescence and modular import. Here we present the unification of anatomy ontologies into Uberon, a single ontology resource that enables interoperability among disparate data and research groups. As a consequence, independent development of TAO, VSAO, AAO, and vHOG has been discontinued. CONCLUSIONS: The newly broadened Uberon ontology is a unified cross-taxon resource for metazoans (animals) that has been substantially expanded to include a broad diversity of vertebrate anatomical structures, permitting reasoning across anatomical variation in extinct and extant taxa. Uberon is a core resource that supports single- and cross-species queries for candidate genes using annotations for phenotypes from the systematics, biodiversity, medical, and model organism communities, while also providing entities for logical definitions in the Cell and Gene Ontologies. The ontology release files associated with the ontology merge described in this manuscript are available at http://purl.obolibrary.org/obo/uberon/releases/2013-02-21/ and current ontology release files are always available at http://purl.obolibrary.org/obo/uberon/releases/
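As a small usage sketch, the ontology can be fetched and its [Term] stanzas enumerated with nothing more than the Python standard library. The exact file URL is an assumption (OBO Foundry conventions suggest http://purl.obolibrary.org/obo/uberon.obo for the OBO serialization); the layout of the dated release directory above may differ.

```python
# Minimal sketch: fetch an OBO-format Uberon release and count the [Term]
# stanzas. The URL below is an assumed OBO serialization, not confirmed by
# the abstract; the file is large, so expect a long download.

import urllib.request

URL = "http://purl.obolibrary.org/obo/uberon.obo"  # assumed location

def iter_terms(lines):
    """Yield (id, name) pairs for each [Term] stanza in an OBO file."""
    term_id, name, in_term = None, None, False
    for raw in lines:
        line = raw.strip()
        if line == "[Term]":
            in_term, term_id, name = True, None, None
        elif line.startswith("[") and line.endswith("]"):
            in_term = False                      # e.g. a [Typedef] stanza
        elif in_term and line.startswith("id: "):
            term_id = line[4:]
        elif in_term and line.startswith("name: "):
            name = line[6:]
        elif in_term and line == "" and term_id is not None:
            yield term_id, name                  # blank line ends the stanza
            in_term = False
    if in_term and term_id is not None:
        yield term_id, name                      # flush a final stanza

with urllib.request.urlopen(URL) as resp:
    text = resp.read().decode("utf-8", errors="replace")

terms = list(iter_terms(text.splitlines()))
print(f"{len(terms)} [Term] stanzas")
print("first few:", terms[:3])
```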
Abstract:
Biological scaling analyses employing the widely used bivariate allometric model are beset by at least four interacting problems: (1) choice of an appropriate best-fit line with due attention to the influence of outliers; (2) objective recognition of divergent subsets in the data (allometric grades); (3) potential restrictions on statistical independence resulting from phylogenetic inertia; and (4) the need for extreme caution in inferring causation from correlation. A new non-parametric line-fitting technique has been developed that eliminates requirements for normality of distribution, greatly reduces the influence of outliers and permits objective recognition of grade shifts in substantial datasets. This technique is applied in scaling analyses of mammalian gestation periods and of neonatal body mass in primates. These analyses feed into a re-examination, conducted with partial correlation analysis, of the maternal energy hypothesis relating to mammalian brain evolution, which suggests links between body size and brain size in neonates and adults, gestation period and basal metabolic rate. Much has been made of the potential problem of phylogenetic inertia as a confounding factor in scaling analyses. However, this problem may be less severe than previously suspected, because nested analyses of variance conducted on residual variation (rather than on raw values) reveal that there is considerable variance at low taxonomic levels. In fact, limited divergence in body size between closely related species is one of the prime examples of phylogenetic inertia. One common approach to eliminating perceived problems of phylogenetic inertia in allometric analyses has been the calculation of 'independent contrast values'. It is demonstrated that the reasoning behind this approach is flawed in several ways. Calculation of contrast values for closely related species of similar body size is, in fact, highly questionable, particularly when there are major deviations from the best-fit line for the scaling relationship under scrutiny.
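The study's own line-fitting technique is not reproduced here; as a generic illustration of an outlier-resistant, distribution-free fit, the sketch below applies a Theil-Sen estimator (median of pairwise slopes) to the log-log allometric model y = a * x^b on invented data.

```python
# Hedged sketch: a Theil-Sen fit of the allometric model y = a * x**b, i.e.
# log10(y) = log10(a) + b * log10(x), on invented body-mass/gestation-style
# data. This is a standard robust alternative, not the technique developed
# in the study above.

import math
from itertools import combinations
from statistics import median

# Hypothetical (body mass in g, gestation period in days) pairs
data = [(20, 21), (300, 40), (4_000, 65), (60_000, 150),
        (500_000, 280), (3_000_000, 400), (150, 300)]   # last point: outlier

log_x = [math.log10(x) for x, _ in data]
log_y = [math.log10(y) for _, y in data]

# Theil-Sen: slope = median of all pairwise slopes, robust to outliers
slopes = [(log_y[j] - log_y[i]) / (log_x[j] - log_x[i])
          for i, j in combinations(range(len(data)), 2)
          if log_x[j] != log_x[i]]
b = median(slopes)
log_a = median(ly - b * lx for lx, ly in zip(log_x, log_y))

print(f"allometric exponent b = {b:.3f}, coefficient a = {10 ** log_a:.3f}")
```

Because the slope is a median over all pairwise slopes, the single aberrant point shifts the fitted exponent far less than it would under ordinary least squares, which is the property that motivates robust line fitting in allometric analyses.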
Abstract:
INTRODUCTION: Dendritic cells (DCs) are the most important antigen-presenting cell population for activating antitumor T-cell responses; therefore, they offer a unique opportunity for specific targeting of tumors. AREAS COVERED: We will discuss the critical factors for the enhancement of DC vaccine efficacy: different DC subsets, types of in vitro DC manufacturing protocol, types of tumor antigen to be loaded and finally different adjuvants for activating them. We will cover potential combinatorial strategies with immunomodulatory therapies: depleting T-regulatory (Treg) cells, blocking VEGF and blocking inhibitory signals. Furthermore, recommendations to incorporate these criteria into DC-based tumor immunotherapy will be suggested. EXPERT OPINION: Monocyte-derived DCs are the most widely used DC subset in the clinic, whereas Langerhans cells and plasmacytoid DCs are two emerging DC subsets that are highly effective in eliciting cytotoxic T lymphocyte responses. Depending on the type of tumor antigens selected for loading DCs, it is important to optimize a protocol that will generate highly potent DCs. The future aim of DC-based immunotherapy is to combine it with one or more immunomodulatory therapies, for example, Treg cell depletion, VEGF blockage and T-cell checkpoint blockage, to elicit the most optimal antitumor immunity to induce long-term remission or even cure cancer patients.
Abstract:
Despite decades of research, the exact pathogenic mechanisms underlying acute mountain sickness (AMS) are still poorly understood. This fact frustrates the search for novel pharmacological prophylaxis for AMS. The prevailing view is that AMS results from an insufficient physiological response to hypoxia and that prophylaxis should aim at stimulating the response. Starting from the opposite hypothesis that AMS may be caused by an initial excessive response to hypoxia, we suggest that directly or indirectly blunting specific parts of the response might provide promising research alternatives. This reasoning is based on the observations that (i) humans, once acclimatized, can climb Mt Everest experiencing arterial partial oxygen pressures (PaO2) as low as 25 mmHg without AMS symptoms; (ii) paradoxically, AMS usually develops at much higher PaO2 levels; and (iii) several biomarkers, suggesting initial activation of specific pathways at such PaO2, are correlated with AMS. Apart from looking for substances that stimulate certain hypoxia-triggered effects, such as the ventilatory response to hypoxia, we suggest also investigating pharmacological means aimed at blunting certain other specific hypoxia-activated pathways, or at stimulating their agonists, in the quest for better pharmacological prophylaxis for AMS.