37 results for Semi-infinite and infinite programming



Relevance: 100.00%

Abstract:

In alkaline lavas, the chemical zoning of spinel megacrysts is due to cationic exchange between the crystals and the host lava. Applying Fick's law to the cationic diffusion profiles allows calculation of how long these crystals remained in the lava. Those in chemical equilibrium were in contact with the lava for 20 to 30 days, whereas megacrysts lacking this equilibrium were in contact for only 3 or 4 days. The duration of the rise of an ultrabasic nodule up the volcanic chimney was calculated by applying Stokes' law.
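As a rough numerical sketch of the Stokes'-law step, the terminal velocity of a rigid sphere in a viscous melt can be computed as below; the nodule radius, densities and melt viscosity used here are illustrative assumptions, not values from the study.

```python
# Stokes' law: terminal velocity of a sphere in a viscous fluid.
def stokes_velocity(radius, rho_sphere, rho_fluid, viscosity, g=9.81):
    """Velocity in m/s relative to the melt; positive = sinking,
    negative = rising (buoyant)."""
    return 2.0 * radius**2 * (rho_sphere - rho_fluid) * g / (9.0 * viscosity)

# Hypothetical ultrabasic nodule: radius 5 cm, density 3300 kg/m^3,
# in basaltic lava of density 2700 kg/m^3 and viscosity 100 Pa.s.
v = stokes_velocity(0.05, 3300.0, 2700.0, 100.0)
print(f"relative velocity: {v:.4f} m/s")   # ~0.033 m/s
```

Dividing a conduit length by this relative velocity (or by the net ascent rate of the magma carrying the nodule) then gives a time scale, which is the kind of estimate the abstract refers to.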

Relevance: 100.00%

Abstract:

There are conflicting reports about the effect of aging on movement preparation, and it is unclear to what extent cognitive and/or motor-related cerebral processes may be affected. This study examines the effects of age on electro-cortical oscillatory patterns during various motor programming tasks, in order to assess potential differences according to the mode of action selection. Twenty elderly (EP, 60-84 years) and 20 young (YP, 20-29 years) participants with normal cognition underwent 3 pre-cued response tasks (S1-S2 paradigm). S1 carried either complete information on response side (Full; stimulus-driven motor preparation), no information (None; general motor alertness), or required free selection of the response side (Free; internally driven motor preparation). The electroencephalogram (EEG) was recorded using 64 surface electrodes. Alpha (8-12 Hz) desynchronization (ERD)/synchronization (ERS) and motor-related amplitude asymmetries (MRAA) were analyzed during the S1-S2 interval. Reaction times (RTs) to S2 were slower in EP than in YP, and slower in None than in the other two tasks. There was an Age x Task interaction due to increased RTs in Free compared to Full in EP only. Central bilateral and midline activation (alpha ERD) was smaller in EP than in YP in None. In Full, just before S2, readiness to move was reflected by posterior midline inhibition (alpha ERS) in both groups. In Free, such inhibition was present only in YP. Moreover, MRAA showed lateralization of motor activity in both groups in Full, but only in YP in Free. The results indicate reduced recruitment of motor regions for motor alertness in the elderly. They further show less efficient cerebral processes underlying free selection of movement in the elderly, suggesting reduced capacity for internally driven action with age.

Relevance: 100.00%

Abstract:

High altitude constitutes an exciting natural laboratory for medical research. While the initial aim of high-altitude research was to understand the adaptation of the organism to hypoxia and to find treatments for altitude-related diseases, the scope of this research has broadened considerably over the past decade or so. Two important observations laid the foundation for this broadening of scientific scope. First, high-altitude pulmonary edema (HAPE) represents a unique model for studying fundamental mechanisms of pulmonary hypertension and lung edema in humans. Second, the ambient hypoxia associated with high-altitude exposure facilitates the detection of pulmonary and systemic vascular dysfunction at an early stage. Here, we review studies that, by capitalizing on these observations, have led to the description of novel mechanisms underpinning lung edema and pulmonary hypertension and to the first direct demonstration of fetal programming of vascular dysfunction in humans.

Relevance: 100.00%

Abstract:

BACKGROUND: Endurance athletes are advised to optimize their nutrition prior to races. Little is known about athletes' actual beliefs, knowledge and nutritional behaviour. We monitored the nutritional behaviour of amateur ski-mountaineering athletes during the 4 days prior to a major competition to compare it with official recommendations and with the athletes' beliefs. METHODS: Participants in the two routes of the 'Patrouille des Glaciers' were recruited (A: 26 km, ascent 1881 m, descent 2341 m, maximum altitude 3160 m; Z: 53 km, ascent 3994 m, descent 4090 m, maximum altitude 3650 m). Dietary intake diaries of 40 athletes (21 A, 19 Z) were analysed for energy, carbohydrate, fat, protein and fluid; ten athletes were interviewed about their pre-race nutritional beliefs and behaviour. RESULTS: Despite the belief that pre-race carbohydrate, energy and fluid intake should be increased, energy consumption was 2416 ± 696 (mean ± SD) kcal · day(-1), 83 ± 17 % of the recommended intake; carbohydrate intake was only 46 ± 13 % of the minimum recommended (10 g · kg(-1) · day(-1)) and fluid intake only 2.7 ± 1.0 l · day(-1). CONCLUSIONS: Our sample of endurance athletes did not comply with pre-race nutritional recommendations despite elementary knowledge and the belief that they were compliant. These athletes lacked a clear and reflective nutritional strategy. This suggests a potential for improving knowledge of, and compliance with, the recommendations. Alternatively, some recommendations may be unrealistic.

Relevance: 100.00%

Abstract:

Integration of the differential equation of Fick's second law, applied to the diffusion of chemical elements in a semi-infinite solid, made it possible to estimate the residence time of olivine megacrysts in contact with the host lava. The results of this research show the existence of two groups of olivine. The first remained in contact with the magmatic liquid for 19 to 22 days, while the second remained so for only 5 to 9 days. This distinction correlates with the one based on qualitative observation.
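The calculation described above rests on the error-function solution of Fick's second law in a semi-infinite solid, C(x, t) = Cs + (C0 − Cs)·erf(x / (2√(Dt))). A minimal sketch of inverting a measured profile point for the residence time follows; the diffusivity, depth and concentrations are purely hypothetical, not values from the study.

```python
import math

def concentration(x, t, D, C0, Cs):
    """Erf solution of Fick's second law in a semi-infinite solid:
    C(x, t) = Cs + (C0 - Cs) * erf(x / (2 * sqrt(D * t)))."""
    return Cs + (C0 - Cs) * math.erf(x / (2.0 * math.sqrt(D * t)))

def residence_time(x, C_meas, D, C0, Cs):
    """Invert the profile for t by bisection (C decreases with t when C0 > Cs)."""
    lo, hi = 1.0, 1e12   # seconds; bracket assumed to contain the answer
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if concentration(x, mid, D, C0, Cs) > C_meas:
            lo = mid     # profile not yet relaxed this far: more time needed
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical example: D = 1e-16 m^2/s, depth 20 um, core C0 = 1.0, rim Cs = 0.2.
C_obs = concentration(2e-5, 20 * 86400.0, 1e-16, 1.0, 0.2)  # synthetic "measurement"
t = residence_time(2e-5, C_obs, 1e-16, 1.0, 0.2)
print(f"{t / 86400.0:.1f} days")   # recovers ~20 days
```

Fitting the full measured profile rather than a single point, as the study presumably did, is the same inversion applied in a least-squares sense.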

Relevance: 100.00%

Abstract:

BACKGROUND: Up to 10% of the patients in whom suspected betalactam hypersensitivity (HS) has been excluded by skin and challenge tests report suspected allergic reactions during subsequent treatments with the same or very similar betalactams. It has been suggested that the reactions may result from a resensitization induced by the challenge performed at the time of the allergological work-up. However, most patients did not undergo a second allergological work-up to determine whether or not the reactions resulted from betalactam HS. OBJECTIVES: We aimed to determine whether children diagnosed as nonallergic to betalactams have tolerated subsequent treatments with the initially suspected and/or other betalactams and, in case of a reaction, whether the reaction resulted from betalactam HS. METHODS: We sent a questionnaire concerning the clinical history of their children to the parents of 256 children previously diagnosed as nonallergic to betalactams. A second allergological work-up was performed in the children reporting suspected allergic reactions during subsequent treatments with the same and/or other betalactams. Skin tests were performed with the soluble form of the suspected (or very similar) betalactams and other betalactams from the same and other classes. Skin test responses were assessed at 15-20 min (immediate), 6-8 h (semi-late) and 48-72 h (late). Oral challenge (OC) was performed in children with negative skin tests, either at the hospital (immediate and accelerated reactions) or at home (delayed reactions). RESULTS: A response was obtained for 141 children (55.3%). Forty-eight (34%) of those children had not been treated with the betalactams for which a diagnosis of allergy had previously been ruled out. Seven (7.5%) of the 93 children who had been treated again reported suspected allergic reactions. Skin tests and OC were performed in six of those children, and gave negative results in five. In one child previously diagnosed as nonallergic to amoxicillin combined with clavulanic acid, we diagnosed a delayed HS to clavulanic acid and a serum sickness-like disease to cefaclor. Thus, the frequency of reactions resulting from betalactam HS in children with negative skin and challenge tests is very low, and does not exceed 2.1% (2/93) even if we consider that the child who refused a second allergological work-up is truly allergic to betalactams. CONCLUSION: Our results in a very large number of children show that reactions presumed to result from betalactam HS are rare in children in whom the diagnosis of betalactam allergy has previously been ruled out. Moreover, they suggest that, as shown for the initial reactions, most of the reactions during subsequent treatments are a consequence of the infectious diseases for which betalactams had been prescribed rather than a result of betalactam HS. Finally, they suggest that the risk of resensitization by OC is very low, and do not support the notion that skin testing should be repeated in children diagnosed as nonallergic to betalactams.

Relevance: 100.00%

Abstract:

An increasing number of patients suffering from cardiovascular disease, especially coronary artery disease (CAD), are treated with aspirin and/or clopidogrel for the prevention of major adverse events. Unfortunately, there are no specific, widely accepted recommendations for the perioperative management of patients receiving antiplatelet therapy. Therefore, members of the Perioperative Haemostasis Group of the Society on Thrombosis and Haemostasis Research (GTH), the Perioperative Coagulation Group of the Austrian Society for Anesthesiology, Reanimation and Intensive Care (ÖGARI) and the Working Group Thrombosis of the European Society of Cardiology (ESC) have created this consensus position paper to provide clear recommendations on the perioperative use of antiplatelet agents (specifically in semi-urgent and urgent surgery), strongly supporting a multidisciplinary approach to optimize the treatment of individual patients with coronary artery disease who need major cardiac or non-cardiac surgery. With planned surgery, drug-eluting stents (DES) should not be used unless surgery can be delayed for ≥12 months after DES implantation. If surgery cannot be delayed, surgical revascularisation, bare-metal stents or plain balloon angioplasty should be considered. During ongoing antiplatelet therapy, elective surgery should be delayed for the recommended duration of treatment. In patients requiring semi-urgent surgery, the decision to prematurely stop one or both antiplatelet agents (at least 5 days pre-operatively) has to be taken after multidisciplinary consultation, evaluating the individual thrombotic and bleeding risks. Urgently needed surgery has to take place under full antiplatelet therapy despite the increased bleeding risk. A multidisciplinary approach to optimal antithrombotic and haemostatic patient management is thus mandatory.

Relevance: 100.00%

Abstract:

Methods like Event History Analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed through these interdependencies, is therefore a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the algorithm has been programmed, the agents are left to interact. A phenomenon of diffusion, derived from learning, then emerges, meaning that the choice made by an agent is conditional on the choices made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning not only that time is needed for a policy to deploy its effects, but also that it takes time for a country to find the best-suited policy.
To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with the theoretical expectations and the empirical evidence.
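The learning mechanism summarized above (adoption conditional on the choices of other agents, cumulating into an S-shaped adoption curve) can be illustrated with a deliberately simple toy simulation. This is not the thesis model or Braun and Gilardi's; population size, learning probability, number of initial adopters and seed are all arbitrary assumptions.

```python
import random

# Toy sketch of learning-driven policy diffusion: a non-adopter observes one
# random peer per step and adopts with some probability only if that peer
# has already adopted. Cumulative adoption then grows logistically.
def simulate(n=200, steps=100, p_learn=0.5, n_seed=5, seed=1):
    random.seed(seed)
    adopted = [i < n_seed for i in range(n)]   # a few initial adopters
    counts = []
    for _ in range(steps):
        nxt = adopted[:]
        for i in range(n):
            if not adopted[i]:
                peer = random.randrange(n)             # observe a random peer
                if adopted[peer] and random.random() < p_learn:
                    nxt[i] = True                      # adoption conditional on peer
        adopted = nxt
        counts.append(sum(adopted))
    return counts

counts = simulate()
# slow start, steep middle, slow saturation: the characteristic S-shaped curve
print(counts[9], counts[29], counts[-1])
```

Replacing the random peer with spatial neighbors on a grid reproduces the clustering effect the abstract describes: locally convergent, globally divergent regions of identical policy.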

Relevance: 100.00%

Abstract:

In the closing years of the 20th century, aluminium was the subject of many extravagant and contradictory pronouncements endorsed by authoritative scientists and organizations. In 1986, the PECHINEY company declared it perpetual, like perpetual motion: "Aluminium is eternal. It can be recycled indefinitely without any alteration of its properties", which irritated us at the time. Shortly afterwards, in 1990, an equally extravagant and irritating statement from a major environmental organization, the World Wide Fund for Nature, declared that "aluminium recycling is the worst threat to the environment. It must be abandoned." Then, from the late 1990s onwards, came the explosion of publications on "sustainable development", a poorly chosen name. To "development", a synonym of compulsory growth, we prefer "society" or "human organization", and to the French "durable", a poor rendering of the English "sustainable", we prefer "supportable": ideally we would have liked to speak of a sustainable society, but, to be understood by all, we restrict ourselves hereafter to speaking of supportable development. In essence, these publications acknowledge the very serious shortcomings of extractive aluminium metallurgy from ore, and also the extraordinary merits of aluminium recycling, since it requires less than 10% of the energy consumed by extractive metallurgy from ore (we will see that the same holds for pollution and capital). It is precisely on recycling that promotional campaigns for beverage packaging are based, in Switzerland in particular. However, the data on aluminium recycling published by the aluminium industry only partly reflect these merits. In the 1970s, the growth rate of recycled production became higher than that of electrolytic production.
By contrast, recycling rates, computed with the same indicator, are uniformly mediocre compared with other materials such as copper and iron. As a component of the aluminium industry, recycling enjoys a favourable image with the general public, demonstrating the success of the communication campaigns. Within the aluminium industry, by contrast, its image is poor. The opinions voiced by all the players (traders, technicians, executives), still gathered during this work, are as follows: a ragman's trade, a wretched trade, a trade with little technical content yet very difficult (did not one aluminium recycler say that his trade was a man's job whereas copper recycling was child's play?). In our view, these opinions belong to a bygone past, which they nevertheless faithfully reflect, for recycling is now recognized as a major contribution to the supportable development of aluminium. It is for this very reason that, in 2000, the world aluminium industry decided to abandon the qualifier "secondary", used until then to designate recycled metal. It is because of all these discordant and sometimes contradictory data that this work began, encouraged by many prominent figures. Our undertaking was unquestionably eased by our command of the requisite disciplines (metallurgy, economics, statistics) and above all by the experience acquired over a professional life spent worldwide in (research and development, production), for (research, development, marketing, strategy) and around (marketing, strategy for related products, the ferro-alloys, and competitors, iron) the aluminium industry.
Our objective is to establish the truth about the recycling of aluminium, a material that contributed very largely to shaping the 20th century, through a critical review embracing all aspects of this little-known activity; indeed, there is no history of aluminium recycling even though it is more than a century old. More than a mere compilation, this critical review was conducted as a scientific, technical, economic, historical and socio-ecological inquiry, bringing out the main facts that marked the evolution of aluminium recycling. It concludes on the true state of recycling, which proves globally satisfactory with its strengths and weaknesses, and, beyond recycling, on the fitness of aluminium for supportable development, a fitness that is largely insufficient. This is why it suggests topics of study for all those (scientists, technicians, historians, economists, jurists) concerned with an industry highly representative of our world in the making, a world in which the place of aluminium will depend on its ability to satisfy the criteria of supportable development.
ABSTRACT: Owing to recycling, the aluminium industry's overall energy and environmental footprints are much lower than those of its ore-based extractive metallurgy. Likewise, recycling will allow full use of the expected avalanche of old scrap resulting from the dramatic explosion of aluminium consumption since the 1950s. The state of recycling is characterized by: i) raw materials split into two groups: one, new scrap (internal and prompt), proportional to the quantities of semi-finished and finished products, of fairly good and consistent quality; the other, old scrap, proportional to the finished products reaching their end of life (about 22 years later on average), of variable quality depending on the collection mode; ii) a poor recycling rate, close to that of steel. The aluminium industry generates too much new internal scrap and does not collect all the available old scrap: about 50% of it is not recycled (whereas steel recycles about 70% of its old scrap flow); iii) recycling techniques, all based on melting, which are well mastered despite aluminium's affinity for oxygen and the practical impossibility of purifying aluminium of any impurity; sorting and initial collection are critical issues before melting; iv) products and markets of recycled aluminium: new scrap has always been recycled in the production lines it comes from (closed loop); old scrap, mainly mixed scrap, was first recycled in different production lines (open loop): steel deoxidation products, followed in the 1930s, with the development of foundry alloys, by castings whose main market is the automotive industry; in the 1980s, the commercial development of the beverage can in North America enabled the first closed-loop recycling of old scrap, which is still developing; v) an economy with low and erratic margins, because the electrolytic aluminium quotation fixes both the scrap purchase price and the recycled aluminium selling price; vi) an industrial organisation historically based on the scrap group and the loop mode: new scrap is recycled either by the transformation industry itself or by the remelters of the recycling industry, old scrap by the refiners, the other component of the recycling industry; the big companies, the "majors", are often involved in closed-loop recycling and very seldom in open-loop recycling. Today, the aluminium industry's overall energy and environmental footprints are too heavy and the sustainable development criteria are not fully met. The critical issues for the aluminium industry are to produce better, to consume better and to recycle better in order to become a genuinely sustainable industry.
Issues specific to recycling are a highly efficient recycling industry, a "sustainable development" economy, and complete old-scrap collection favouring the closed loop. Also, indirectly connected to recycling, are a highly efficient transformation industry generating much less new scrap and a finished-products industry delivering only products that fulfil sustainable development criteria.

Relevance: 100.00%

Abstract:

Exacerbations of COPD (ECOPD) represent a major burden for patients and health care systems. Innovative sampling techniques have led to the identification of several pulmonary biomarkers. Although some molecules are promising, their usefulness in clinical practice is not yet established. Medline and Highwire databases were used to identify studies evaluating pulmonary sampled biomarkers in ECOPD. We combined 3 terms for ECOPD, 3 for biomarkers and 6 for the sampling method. Seventy-nine studies were considered eligible for inclusion in the review and were analyzed further. Pulmonary biomarkers sampled with non-invasive, semi-invasive and invasive methods were evaluated for their potential to illustrate the disease's clinical course, to correlate with clinical variables and to predict clinical outcomes, ECOPD etiology and response to treatment. According to published data, several pulmonary biomarkers assessed in ECOPD have the potential to illustrate the natural history of the disease through the modification of their levels. Among the clinically relevant molecules, those that have been studied the most and appear promising are spontaneous and induced sputum biomarkers, for reflecting clinical severity and symptomatic recovery as well as for directing towards an etiological diagnosis. Current evidence on the clinical usefulness of exhaled breath condensate and bronchoalveolar lavage biomarkers in ECOPD is limited. In conclusion, pulmonary biomarkers have the potential to provide information on the mechanisms underlying ECOPD, and several correlate with clinical variables and outcomes. However, on the basis of published evidence, no single molecule is adequately validated for wide clinical use. Clinical trials that incorporate biomarkers in decisional algorithms are required.

Relevance: 100.00%

Abstract:

PURPOSE: The outer limiting membrane (OLM) is considered to play a role in maintaining the structure of the retina through mechanical strength. However, the observation of junction proteins located at the OLM and its barrier permeability properties suggest that the OLM may be part of the retinal barrier. MATERIALS AND METHODS: Normal and diabetic rat, monkey, and human retinas were used to analyze junction proteins at the OLM. Proteome analyses were performed using immunohistochemistry on sections and flat-mounted retinas, and western blotting on protein extracts obtained by laser microdissection of the photoreceptor layers. Semi-thin section and ultrastructural analyses are also reported. RESULTS: In the rat retina, zonula occludens-1 (ZO-1), junctional adhesion molecule (JAM) and an atypical protein kinase C are present in the subapical region, and the OLM shows dense labeling of occludin, JAM, and ZO-1. The presence of occludin was confirmed by western blot analysis of the microdissected OLM region. In diabetic rats, occludin expression is decreased and glial cell junctions are dissociated. In the monkey retina, occludin, JAM, and ZO-1 are also found in the OLM. Junction proteins have a specific distribution around cone photoreceptors and Müller glia. Ultrastructural analyses suggest that structures like tight junctions may exist between retinal Müller glial cells and photoreceptors. CONCLUSIONS: In the OLM, heterotypic junctions contain proteins from both adherens and tight junctions. Their structure suggests that tight junctions may exist in the OLM. Occludin is present in the OLM of the rat and monkey retina, and it is decreased in diabetes. The OLM should be considered part of the retinal barrier, which can be disrupted in pathological conditions, contributing to fluid accumulation in the macula.

Relevance: 100.00%

Abstract:

Background: The b-value is the parameter characterizing the intensity of the diffusion weighting during image acquisition. Data acquisition is usually performed with a low b-value (b ≈ 1000 s/mm2). Evidence shows that high b-values (b > 2000 s/mm2) are more sensitive to the slow diffusion compartment (SDC) and may be more sensitive in detecting white matter (WM) anomalies in schizophrenia. Methods: 12 male patients with schizophrenia (mean age 35 ± 3 years) and 16 healthy male controls matched for age were scanned with a low b-value (1000 s/mm2) and a high b-value (4000 s/mm2) protocol. The apparent diffusion coefficient (ADC) is a measure of the average diffusion distance of water molecules per unit time (mm2/s). ADC maps were generated for all individuals. Eight regions of interest (frontal and parietal regions bilaterally, centrum semiovale bilaterally, and anterior and posterior corpus callosum) were manually traced blind to diagnosis. Results: ADC measures acquired with high b-value imaging were more sensitive in detecting differences between schizophrenia patients and healthy controls than low b-value imaging, with a gain in significance by a factor of 20-100 despite the lower image signal-to-noise ratio (SNR). Increased ADC was identified in patients' WM (p = 0.00015), with major contributions from the left and right centrum semiovale and, to a lesser extent, the right parietal region. Conclusions: Our results may be related to the sensitivity of high b-value imaging to the SDC, believed to reflect mainly the intra-axonal and myelin-bound water pool. High b-value imaging might be more sensitive and specific to WM anomalies in schizophrenia than low b-value imaging.
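For reference, ADC maps like those described are derived from the mono-exponential signal model S(b) = S0·exp(−b·ADC). A two-point sketch follows; the signal values are made up for illustration, and this is the generic textbook computation, not the authors' pipeline.

```python
import math

# Two-point ADC estimate from the mono-exponential decay model
#   S(b) = S0 * exp(-b * ADC)   =>   ADC = ln(S_low / S_high) / (b_high - b_low)
def adc_two_point(s_low, s_high, b_low, b_high):
    """ADC in mm^2/s, given signals at two b-values (in s/mm^2)."""
    return math.log(s_low / s_high) / (b_high - b_low)

# Made-up white-matter-like signals at b = 0 and b = 1000 s/mm^2:
adc = adc_two_point(s_low=1000.0, s_high=450.0, b_low=0.0, b_high=1000.0)
print(f"ADC = {adc:.2e} mm^2/s")   # ~8.0e-04 mm^2/s
```

At high b-values (e.g. 4000 s/mm2) the mono-exponential assumption breaks down because the slow diffusion compartment dominates the signal, which is part of why the two protocols differ in sensitivity.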

Relevance: 100.00%

Abstract:

Flow structures above vegetation canopies have received much attention within terrestrial and aquatic literature. This research has led to a good process understanding of mean and turbulent canopy flow structure. However, much of this research has focused on rigid or semi-rigid vegetation with relatively simple morphology. Aquatic macrophytes differ from this form, exhibiting more complex morphologies, predominantly horizontal posture in the flow and a different force balance. While some recent studies have investigated such canopies, there is still the need to examine the relevance and applicability of general canopy layer theory to these types of vegetation. Here, we report on a range of numerical experiments, using both semi-rigid and highly flexible canopies. The results for the semi-rigid canopies support existing canopy layer theory. However, for the highly flexible vegetation, the flow pattern is much more complex and suggests that a new canopy model may be required.

Relevance: 60.00%

Abstract:

Natural selection may favor two very different types of social behaviors that have costs in vital rates (fecundity and/or survival) to the actor: helping behaviors, which increase the vital rates of recipients, and harming behaviors, which reduce the vital rates of recipients. Although social evolutionary theory has mainly dealt with helping behaviors, competition for limited resources creates ecological conditions in which an actor may benefit from expressing behaviors that reduce the vital rates of neighbors. This may occur if the reduction in vital rates decreases the intensity of competition experienced by the actor or that experienced by its offspring. Here, we explore the joint evolution of neutral recognition markers and marker-based costly conditional harming whereby actors express harming, conditional on actor and recipient bearing different conspicuous markers. We do so for two complementary demographic scenarios: finite panmictic and infinite structured populations. We find that marker-based conditional harming can evolve under a large range of recombination rates and group sizes under both finite panmictic and infinite structured populations. A direct comparison with results for the evolution of marker-based conditional helping reveals that, if everything else is equal, marker-based conditional harming is often more likely to evolve than marker-based conditional helping.