74 results for SimPly


Relevância: 10.00%

Resumo:

This volume is the result of a collective desire to pay homage to Neil Forsyth, whose work has significantly contributed to scholarship on Satan. This volume is "after" Satan in more ways than one, tracing the afterlife of both the satanic figure in literature and of Neil Forsyth's contribution to the field, particularly in his major books The Old Enemy: Satan and the Combat Myth (Princeton University Press, 1987, revised 1990) and The Satanic Epic (Princeton University Press, 2003). The essays in this volume draw on Forsyth's work as a focus for their analyses of literary encounters with evil or with the Devil himself, reflecting the richness and variety of contemporary approaches to the age-old question of how to represent evil. All the contributors acknowledge Neil Forsyth's influence in the study of both the Satan-figure and Milton's Paradise Lost. But beyond simply paying homage to Neil Forsyth, the articles collected here trace the lineage of the Satan figure through literary history, showing how evil can function as a necessary other against which a community may define itself. They chart the demonised other through biblical history and medieval chronicle, Shakespeare and Milton, to nineteenth-century fiction and the contemporary novel. Many of the contributors find that literary evil is mediated through the lens of the Satan of Paradise Lost, and their articles address the notion, raised by Neil Forsyth in The Satanic Epic, that the literary Devil-figures under consideration are particularly interested in linguistic ambivalence and the twisted texture of literary works themselves. The multiple responses to evil and the continuous reinvention of the devil figure through the centuries all reaffirm the textual presence of the Devil, his changing forms necessarily inscribed in the shifting history of western literary culture. These essays are a tribute to the work of Neil Forsyth, whose scholarship has illuminated and guided the study of the Devil in English and other literatures.

Relevância: 10.00%

Resumo:

Medicine counterfeiting is a serious worldwide issue, involving networks of manufacture and distribution that are an integral part of industrialized organized crime. Despite the potentially devastating health repercussions involved, legal sanctions are often inappropriate or simply not applied. The difficulty in agreeing on a definition of counterfeiting, the huge profits made by the counterfeiters and the complexity of the market are the other main reasons for the extent of the phenomenon. Above all, international cooperation is needed to thwart the spread of counterfeiting. Moreover, effort is urgently required at the legal, enforcement and scientific levels. Pharmaceutical companies and agencies have developed measures to protect medicines and to allow fast and reliable analysis of suspect products. Several methods, essentially based on chromatography and spectroscopy, are now at the disposal of analysts to distinguish genuine from counterfeit products. However, the determination of the components and the use of analytical data for forensic purposes still constitute a challenge. The aim of this review article is therefore to highlight the intricacy of medicine counterfeiting so that a better understanding can provide solutions to fight it more efficiently.

Relevância: 10.00%

Resumo:

Previous studies have shown that glucose increases glucose transporter (GLUT2) mRNA expression in the liver in vivo and in vitro. Here we report an analysis of the effects of glucose metabolism on GLUT2 gene expression. The glucose-induced accumulation of GLUT2 mRNA was not due to stabilization of the transcript but rather reflected a direct effect on gene transcription. A proximal fragment of the 5' regulatory region of the mouse GLUT2 gene linked to a reporter gene was transiently transfected into liver GLUT2-expressing cells. Glucose stimulated reporter gene expression in these cells, suggesting that glucose-responsive elements are included within the proximal region of the promoter. A dose-dependent effect of glucose on GLUT2 expression was observed at glucose concentrations above 10 mM, irrespective of the hexokinase isozyme (glucokinase, Km 16 mM; hexokinase I, Km 0.01 mM) present in the cell type used. This suggests that the correlation between extracellular glucose and GLUT2 mRNA concentrations is simply a reflection of an activation of glucose metabolism. The mediators and the mechanism responsible for this response remain to be determined. In conclusion, glucose metabolism is required for the proper induction of the GLUT2 gene in the liver and this effect is transcriptionally regulated.
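To make the Km values quoted above concrete, here is a minimal Michaelis-Menten sketch (not part of the study): an enzyme with a Km of 16 mM, like glucokinase, still responds to glucose in the 5-25 mM range, whereas hexokinase I (Km 0.01 mM) is already saturated. Only the rate law and the Km values come from the abstract; the Vmax values are arbitrary illustrative constants.

```python
# Michaelis-Menten sketch: glucokinase (Km ~16 mM) tracks extracellular
# glucose, while hexokinase I (Km ~0.01 mM) is saturated at these
# concentrations. Vmax values are arbitrary and purely illustrative.

def mm_rate(s_mM, vmax, km_mM):
    """Michaelis-Menten rate v = Vmax * [S] / (Km + [S])."""
    return vmax * s_mM / (km_mM + s_mM)

for glucose in (5.0, 10.0, 25.0):  # extracellular glucose, mM
    gk = mm_rate(glucose, vmax=1.0, km_mM=16.0)    # glucokinase
    hk1 = mm_rate(glucose, vmax=1.0, km_mM=0.01)   # hexokinase I
    print(f"{glucose:>5.1f} mM  glucokinase {gk:.2f}  hexokinase I {hk1:.2f}")
```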

Relevância: 10.00%

Resumo:

AIM: Inulin clearance (Cin) is the gold standard for assessing glomerular filtration rate (GFR). Other methods are based on the plasma creatinine concentration (Pcreat), creatinine clearance (Ccreat), the Haycock-Schwartz formula and the plasma concentration of cystatin C (PcysC), a 13 kDa basic protein produced at a constant rate by all nucleated cells. The present prospective study was thus designed to evaluate the reliability of PcysC as a marker of GFR in comparison with that of Pcreat, Ccreat and the Haycock-Schwartz formula, using Cin as the gold standard. METHODS: Ninety-nine children (51 male, 48 female) with a median age of 8.3 y (range 1.0-17.9) were studied. Using a cut-off for Cin of 100 ml/min per 1.73 m2, 54 children (54.5%) had impaired GFR. Children with impaired and normal GFR were comparable for age, height, weight and body mass index. RESULTS: Logistic regression, ROC analysis and linear regression all showed that Ccreat was the best parameter to discriminate between impaired and normal GFR, followed by the Haycock-Schwartz formula, PcysC and finally Pcreat, each one being significantly more predictive than the next. CONCLUSION: GFR is better assessed by the Haycock-Schwartz formula than by PcysC or Pcreat alone. When urine collection is not possible, simply measuring the child's Pcreat and height is therefore the best, easiest and cheapest way to assess GFR.
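For readers unfamiliar with height/creatinine estimates of GFR, here is a hedged sketch of a Schwartz-type formula. The constant k and the units follow a commonly cited form of the original Schwartz equation; the exact constants evaluated in this study are not stated in the abstract, so treat this purely as an illustration.

```python
# Hedged sketch of a Schwartz-type GFR estimate from height and plasma
# creatinine. k = 0.55 follows a commonly cited form for children; the exact
# formula used in the study is not given in the abstract.

def schwartz_gfr(height_cm: float, p_creat_mg_dl: float, k: float = 0.55) -> float:
    """Estimated GFR in mL/min per 1.73 m^2 (illustrative constants)."""
    return k * height_cm / p_creat_mg_dl

# Example: a child of 130 cm with a plasma creatinine of 0.6 mg/dL.
print(round(schwartz_gfr(130, 0.6)))  # ~119 mL/min per 1.73 m^2
```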

Relevância: 10.00%

Resumo:

It is essential for organizations to compress detailed sets of information into more comprehensible sets, thereby establishing sharp data compression and good decision-making. In chapter 1, I review and structure the literature on information aggregation in management accounting research. I outline the cost-benefit trade-off that management accountants need to consider when they decide on the optimal levels of information aggregation. Beyond the fundamental information content perspective, organizations also have to account for cognitive and behavioral perspectives. I elaborate on these aspects, differentiating between research in cost accounting, budgeting and planning, and performance measurement. In chapter 2, I focus on a specific bias that arises when probabilistic information is aggregated. In budgeting and planning, for example, organizations need to estimate mean costs and durations of projects, as the mean is the only measure of central tendency that is linear. Unlike the mean, measures such as the mode or median cannot simply be added up. Given the specific shape of cost and duration distributions, estimating mode or median values will result in underestimations of total project costs and durations. In two experiments, I find that participants tend to estimate mode values rather than mean values, resulting in large distortions of estimates for total project costs and durations. I also provide a strategy that partly mitigates this bias. In chapter 3, I conduct an experimental study to compare two approaches to time estimation for cost accounting, namely traditional activity-based costing (ABC) and time-driven ABC (TD-ABC). Contrary to claims made by proponents of TD-ABC, I find that TD-ABC is not necessarily suitable for capacity computations. However, I also provide evidence that TD-ABC seems better suited for cost allocations than traditional ABC.
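The aggregation bias described in chapter 2 can be illustrated with a short simulation: for a right-skewed (here lognormal) cost distribution, only means add up linearly, so summing per-task modes understates the expected total. The distribution parameters and the number of tasks below are illustrative and are not taken from the experiments.

```python
# Illustration of the aggregation bias: summing per-task modes of a
# right-skewed cost distribution understates the expected project total.
# Lognormal parameters are invented for illustration only.
import math

mu, sigma, n_tasks = math.log(10.0), 0.8, 20   # per-task cost distribution

mean_cost = math.exp(mu + sigma**2 / 2)        # E[X] of a lognormal
mode_cost = math.exp(mu - sigma**2)            # mode of a lognormal

print(f"expected total cost: {n_tasks * mean_cost:8.1f}")
print(f"sum of modal costs : {n_tasks * mode_cost:8.1f}")  # large underestimate
```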

Relevância: 10.00%

Resumo:

Among the types of remote sensing acquisitions, optical images are certainly one of the most widely relied upon data sources for Earth observation. They provide detailed measurements of the electromagnetic radiation reflected or emitted by each pixel in the scene. Through a process termed supervised land-cover classification, this makes it possible to automatically yet accurately distinguish objects at the surface of our planet. In this respect, when producing a land-cover map of the surveyed area, the availability of training examples representative of each thematic class is crucial for the success of the classification procedure. However, in real applications, due to several constraints on the sample collection process, labeled pixels are usually scarce. When analyzing an image for which those key samples are unavailable, a viable solution consists in resorting to the ground truth data of other previously acquired images. This option is attractive, but several factors such as atmospheric, ground and acquisition conditions can cause radiometric differences between the images, therefore hindering the transfer of knowledge from one image to another. The goal of this thesis is to supply remote sensing image analysts with suitable processing techniques to ensure a robust portability of the classification models across different images. The ultimate purpose is to map the land-cover classes over large spatial and temporal extents with minimal ground information. To overcome, or simply quantify, the observed shifts in the statistical distribution of the spectra of the materials, we study four approaches drawn from the field of machine learning. First, we propose a strategy to intelligently sample the image of interest so as to collect labels only for the most useful pixels. This iterative routine constantly evaluates how pertinent the initial training data, which actually belong to a different image, are to the new image. Second, an approach to reduce the radiometric differences among the images by projecting the respective pixels into a common new data space is presented. We analyze a kernel-based feature extraction framework suited for such problems, showing that, after this relative normalization, the cross-image generalization abilities of a classifier are greatly increased. Third, we test a new data-driven measure of distance between probability distributions to assess the distortions caused by differences in the acquisition geometry affecting series of multi-angle images. We also gauge the portability of classification models across the sequences. In both exercises, the efficacy of classic physically- and statistically-based normalization methods is discussed. Finally, we explore a new family of approaches based on sparse representations of the samples to reciprocally convert the data space of two images. The projection function bridging the images allows a synthesis of new pixels with more similar characteristics, ultimately facilitating the land-cover mapping across images.
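As a purely illustrative companion to the third approach mentioned above (a data-driven distance between probability distributions), the sketch below computes a standard RBF-kernel maximum mean discrepancy between the spectra of two hypothetical images. The thesis may use a different measure, and the data here are synthetic.

```python
# Generic sketch: a data-driven distance between the spectral distributions
# of two images, using a standard RBF-kernel maximum mean discrepancy (MMD).
# Not necessarily the measure studied in the thesis; data are synthetic.
import numpy as np

def rbf_mmd2(X, Y, gamma=1.0):
    """Biased estimate of squared MMD between samples X (n, d) and Y (m, d)."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(0)
img_a = rng.normal(0.0, 1.0, (200, 4))   # pixels of image A (4 spectral bands)
img_b = rng.normal(0.3, 1.2, (200, 4))   # radiometrically shifted image B
print(rbf_mmd2(img_a, img_b))            # > 0 indicates a distribution shift
```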

Relevância: 10.00%

Resumo:

The Adverse Outcome Pathway (AOP) framework provides a template that facilitates understanding of complex biological systems and the pathways of toxicity that result in adverse outcomes (AOs). An AOP starts with a molecular initiating event (MIE) in which a chemical interacts with one or more biological targets, followed by a sequential series of key events (KEs), which are cellular, anatomical, and/or functional changes in biological processes, that ultimately result in an AO manifest in individual organisms and populations. It has been developed as a tool for knowledge-based safety assessment that relies on understanding mechanisms of toxicity, rather than simply observing the adverse outcome. A large number of cellular and molecular processes are known to be crucial to the proper development and function of the central (CNS) and peripheral nervous systems (PNS). However, there are relatively few examples of well-documented pathways that include causally linked MIEs and KEs resulting in adverse outcomes in the CNS or PNS. As a first step in applying the AOP framework to adverse health outcomes associated with exposure to exogenous neurotoxic substances, the EU Reference Laboratory for Alternatives to Animal Testing (EURL ECVAM) organized a workshop (March 2013, Ispra, Italy) to identify potential AOPs relevant to neurotoxic and developmental neurotoxic outcomes. Although the AOPs outlined during the workshop are not fully described, they could serve as a basis for further, more detailed AOP development and evaluation that could be useful to support human health risk assessment in a variety of ways.
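As an illustration of the template described above, the toy structure below encodes an AOP as one MIE, an ordered chain of KEs and a resulting AO. The example pathway and the field names are invented for illustration and are not taken from the workshop.

```python
# Toy data structure mirroring the AOP template: one molecular initiating
# event, an ordered chain of key events, and the adverse outcome.
from dataclasses import dataclass, field
from typing import List

@dataclass
class AdverseOutcomePathway:
    mie: str                                               # molecular initiating event
    key_events: List[str] = field(default_factory=list)    # causally ordered KEs
    adverse_outcome: str = ""                               # AO at organism/population level

example = AdverseOutcomePathway(
    mie="chemical interacts with a biological target",
    key_events=["cellular change", "organ-level change"],
    adverse_outcome="adverse outcome in individuals and populations",
)
print(" -> ".join([example.mie, *example.key_events, example.adverse_outcome]))
```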

Relevância: 10.00%

Resumo:

Recent findings suggest that the visuo-spatial sketchpad (VSSP) may be divided into two sub-components processing dynamic or static visual information. This model may be useful to elucidate the confusion of data concerning the functioning of the VSSP in schizophrenia. The present study examined patients with schizophrenia and matched controls in a new working memory paradigm involving dynamic (the Ball Flight Task - BFT) or static (the Static Pattern Task - SPT) visual stimuli. In the BFT, the responses of the patients were apparently based on the retention of the last set of segments of the perceived trajectory, whereas control subjects relied on a more global strategy. We assume that the patients' performances are the result of a reduced capacity in chunking visual information since they relied mainly on the retention of the last set of segments. This assumption is confirmed by the poor performance of the patients in the static task (SPT), which requires a combination of stimulus components into object representations. We assume that the static/dynamic distinction may help us to understand the VSSP deficits in schizophrenia. This distinction also raises questions about the hypothesis that visuo-spatial working memory can simply be dissociated into visual and spatial sub-components.

Relevância: 10.00%

Resumo:

Summary: Ecotones are sensitive to change because they contain high numbers of species living at the margin of their environmental tolerance. This is equally true of tree-lines, which are determined by altitudinal or latitudinal temperature gradients. In the current context of climate change, they are expected to undergo modifications in position, tree biomass and possibly species composition. Altitudinal and latitudinal tree-lines differ mainly in the steepness of the underlying temperature gradient: distances are larger at latitudinal tree-lines, which could have an impact on the ability of tree species to migrate in response to climate change. Aside from temperature, tree-lines are also affected on a more local level by pressure from human activities. These are also changing as a consequence of modifications in our societies and may interact with the effects of climate change. Forest dynamics models are often used for climate change simulations because of their mechanistic processes. The spatially explicit model TreeMig was used as a base to develop a model specifically tuned for the northern European and Alpine tree-line ecotones. For the latter, a module for land-use change processes was also added. The temperature response parameters for the species in the model were first calibrated by means of tree-ring data from various species and sites at both tree-lines. This improved the growth response function in the model, but also led to the conclusion that regeneration is probably more important than growth for controlling tree-line position and species' distributions. The second step was to implement the module for abandonment of agricultural land in the Alps, based on an existing spatial statistical model. The sensitivity of its most important variables was tested and the model's performance compared to other modelling approaches. The probability that agricultural land would be abandoned was strongly influenced by the distance from the nearest forest and the slope, both of which are proxies for cultivation costs. When applied to a case study area, the resulting model, named TreeMig-LAb, gave the most realistic results. These were consistent with observed consequences of land abandonment such as the expansion of the existing forest and the closing up of gaps. This new model was then applied in two case study areas, one in the Swiss Alps and one in Finnish Lapland, under a variety of climate change scenarios. These were based on forecasts of temperature change over the next century by the IPCC and the HadCM3 climate model (ΔT: +1.3, +3.5 and +5.6 °C) and included a post-change stabilisation period of 300 years. The results showed radical disruptions at both tree-lines. With the most conservative climate change scenario, species' distributions simply shifted, but it took several centuries to reach a new equilibrium. With the more extreme scenarios, some species disappeared from our study areas (e.g. Pinus cembra in the Alps) or dwindled to very low numbers, as they ran out of land into which they could migrate. The most striking result was the lag in the response of most species, independently of the climate change scenario or tree-line type considered. Finally, a statistical model of the effect of reindeer (Rangifer tarandus) browsing on the growth of Pinus sylvestris was developed, as a first step towards implementing human impacts at the boreal tree-line.
The expected effect was an indirect one, as reindeer deplete the ground lichen cover, which is thought to protect the trees against adverse climate conditions. The model showed a small but significant effect of browsing, but as the link with the underlying climate variables was unclear and the model was not spatial, it was not usable as such. Developing the TreeMig-LAb model made it possible to: a) establish a method for deriving species' parameters for the growth equation from tree-rings, b) highlight the importance of regeneration in determining tree-line position and species' distributions and c) improve the integration of social sciences into landscape modelling. Applying the model at the Alpine and northern European tree-lines under different climate change scenarios showed that with most forecasted levels of temperature increase, tree-lines would suffer major disruptions, with shifts in distributions and potential extinction of some tree-line species. However, these responses showed strong lags, so these effects would not become apparent for decades and could take centuries to stabilise.
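As a hedged sketch of the kind of spatial statistical rule summarized above, the snippet below models the probability of agricultural land abandonment as a logistic function of distance to the nearest forest and slope. The coefficients, their signs and the example values are invented for illustration and are not those of the TreeMig-LAb module.

```python
# Illustrative logistic rule: abandonment probability as a function of
# distance to the nearest forest and slope. Coefficients are invented.
import math

def p_abandonment(dist_to_forest_m: float, slope_deg: float,
                  b0: float = -2.0, b_dist: float = -0.01, b_slope: float = 0.08) -> float:
    """Logistic probability that a parcel is abandoned (hypothetical model)."""
    z = b0 + b_dist * dist_to_forest_m + b_slope * slope_deg
    return 1.0 / (1.0 + math.exp(-z))

# With these invented coefficients, nearby steep parcels score higher than
# distant flat ones.
print(round(p_abandonment(dist_to_forest_m=50, slope_deg=25), 2))
print(round(p_abandonment(dist_to_forest_m=800, slope_deg=5), 4))
```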

Relevância: 10.00%

Resumo:

Trail pheromones do more than simply guide social insect workers from point A to point B. Recent research has revealed additional ways in which they help to regulate colony foraging, often via positive and negative feedback processes that influence the exploitation of the different resources a colony knows about. Trail pheromones are often complementary or synergistic with other information sources, such as individual memory. Pheromone trails can be composed of two or more pheromones with different functions, and information may be embedded in the trail network geometry. These findings indicate remarkable sophistication both in how trail pheromones are deployed at the individual level and in how they are used to regulate colony-level behavior.

Relevância: 10.00%

Resumo:

During an infection the antigen-nonspecific memory CD8 T cell compartment is not simply an inert pool of cells, but becomes activated and cytotoxic. It is unknown how these cells contribute to the clearance of an infection. We measured the strength of T cell receptor (TCR) signals that bystander-activated, cytotoxic CD8 T cells (BA-CTLs) receive in vivo and found evidence of limited TCR signaling. Given this marginal contribution of the TCR, we asked how BA-CTLs identify infected target cells. We show that target cells express NKG2D ligands following bacterial infection and demonstrate that BA-CTLs directly eliminate these target cells in an innate-like, NKG2D-dependent manner. Selective inhibition of BA-CTL-mediated killing led to a significant defect in pathogen clearance. Together, these data suggest an innate role for memory CD8 T cells in the early immune response before the onset of a de novo generated, antigen-specific CD8 T cell response.

Relevância: 10.00%

Resumo:

Summary: International comparisons in the area of victimization, particularly in the field of violence against women, are fraught with methodological problems that previous research has not systematically addressed, and for which there seems to be no agreed answer. For obvious logistic and financial reasons, international studies on violence against women (i.e. studies that administer the same instrument in different countries) are rare; therefore, researchers are bound to resort to secondary comparisons. Many studies simply juxtapose their results with those of previous work or with findings obtained in different contexts, in order to offer an allegedly comparative perspective to their conclusions. While researchers usually point out the methodological limitations of a direct comparison, these caveats rarely translate into concrete methodological controls. Yet many studies have shown the influence of surveys' methodological parameters on findings, listing recommendations for a «best practice» of research. Although violence against women surveys have become more and more similar over the past decades, tending towards a sort of uniformization that could be interpreted as a passive consensus, these instruments retain more or less subtle differences that may still influence the validity of a comparison. Only a small number of studies have directly worked on the comparability of violence against women data, striving to control the methodological parameters of the surveys in order to guarantee the validity of their comparisons. The goal of this work is to compare data from two national surveys on violence against women: the Swiss component of the International Violence Against Women Survey [CH-IVAWS] and the National Violence Against Women Survey [NVAWS] administered in the United States. The choice of these studies certainly ensues from the author's affiliations; however, it is far from trivial. Indeed, the criminological field currently gives American and Anglo-Saxon literature a predominant place, compelling researchers from other countries to perform a difficult balancing act when interpreting their results in the light of previous work or developing effective interventions in their own context. Turning to hypotheses or concepts developed in a specific framework inevitably raises the issue of their applicability to another context, here the Swiss or at least the European one. This issue is therefore of interest beyond the particular topic of violence against women, adding to its relevance. This work is structured around three axes. First, it shows the way survey characteristics influence estimates. The comparability of the nature of the CH-IVAWS and NVAWS, their sampling design and the characteristics of their administration is discussed. The definitions used, the operationalization of variables based on comparable items, the control of reference periods, as well as the nature of the victim-offender relationship are included among the controlled factors. This study establishes content validity within and across studies, presenting a systematic process designed to maximize the comparability of secondary data. Implications of the process are illustrated with the successive presentation of comparable and non-comparable operationalizations of computed variables. Measuring violence against
women in Switzerland and the United States, this work compares the prevalence of different forms (threats, physical violence and sexual violence) and types of violence (partner and nonpartner violence). Second, it endeavors to analyze the concepts of multivictimization (i.e. experiencing different forms of victimization), repeat victimization (i.e. experiencing the same form of violence more than once), and revictimization (i.e. the link between childhood and adulthood victimization) in a comparative, and comparable, approach. Third, aiming at understanding why partner violence appears higher in the United States, while victims of nonpartners are more frequent in Switzerland, as well as in other European countries, different victimization correlates are examined. This research contributes to a better understanding of the relevance of controlling methodological parameters in comparisons across studies, as it systematically illustrates the imposed controls and their implications for quantitative data. Moreover, it details how ignoring these parameters might lead to erroneous conclusions, statistically as well as theoretically. The conclusion of the study puts into a wider perspective the discussion of differences and similarities of violence against women in Switzerland and the United States, and integrates recommendations as to the relevance and validity of international comparisons, whatever field they are conducted in.

Relevância: 10.00%

Resumo:

By definition, obesity corresponds to the presence of a mass of fatty tissue that is excessive with respect to the body mass. Body fat can be calculated in terms of age and sex by measuring the skinfold thickness in several different places. During the MONICA project, the survey of cardiovascular risk factor prevalence enabled us to measure the thickness of four skinfolds (biceps, triceps, subscapular, suprailiac) in 263 inhabitants of Lausanne (125 men, 138 women). In men aged 25-34, 21 +/- 5% of the body mass was composed of fat, in women 29 +/- 4%. The proportion of fat increases to 31 +/- 7% in men and 41 +/- 6% in women aged 55-64. A robust regression allows body fat to be simply expressed in terms of the body mass index. This allows us to confirm the validity of this index for evaluating the degree of obesity during an epidemiological study.
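The robust regression mentioned above can be sketched as follows. The data are synthetic and the fitted coefficients are not those of the MONICA survey; this only illustrates the approach of expressing body fat as a function of the body mass index.

```python
# Hedged sketch: robust regression of body fat (% of body mass, from
# skinfolds) on body mass index. Synthetic data; coefficients are NOT those
# of the MONICA study.
import numpy as np
from sklearn.linear_model import HuberRegressor

rng = np.random.default_rng(1)
bmi = rng.uniform(19, 35, 80)                       # kg/m^2, synthetic sample
body_fat = 1.3 * bmi - 8 + rng.normal(0, 2.5, 80)   # invented relation + noise

model = HuberRegressor().fit(bmi.reshape(-1, 1), body_fat)
print(f"body fat % ~= {model.coef_[0]:.2f} * BMI {model.intercept_:+.2f}")
```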

Relevância: 10.00%

Resumo:

The safe and responsible development of engineered nanomaterials (ENMs), i.e. nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to risk assessment of ENMs, which encompass the key parameters to characterise ENMs, appropriate methods of analysis and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Due to the high batch variability of the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for selection of ENMs, such as relevance for mechanistic (scientific) studies or risk assessment-based studies, widespread availability (and thus high expected volumes of use) or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focuses on the validity of OECD tests; therefore, source material will be first in scope for testing. However, for risk assessment it is much more relevant to have toxicity data from the material as present in the products/matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods are only able to determine one single characteristic and some of them can be rather expensive. In practice, it is currently not feasible to fully characterise ENMs. Many techniques that are available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs). It was recommended that at least two complementary techniques should be employed to determine a metric of ENMs. The first great challenge is to prioritise metrics which are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to fully describe ENMs. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there currently is no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization should be initiated and that exchange of protocols should take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.

Relevância: 10.00%

Resumo:

Huntington's disease (HD) pathology is well understood at a histological level but a comprehensive molecular analysis of the effect of the disease in the human brain has not previously been available. To elucidate the molecular phenotype of HD on a genome-wide scale, we compared mRNA profiles from 44 human HD brains with those from 36 unaffected controls using microarray analysis. Four brain regions were analyzed: caudate nucleus, cerebellum, prefrontal association cortex [Brodmann's area 9 (BA9)] and motor cortex [Brodmann's area 4 (BA4)]. The greatest number and magnitude of differentially expressed mRNAs were detected in the caudate nucleus, followed by motor cortex, then cerebellum. Thus, the molecular phenotype of HD generally parallels established neuropathology. Surprisingly, no mRNA changes were detected in prefrontal association cortex, thereby revealing subtleties of pathology not previously disclosed by histological methods. To establish that the observed changes were not simply the result of cell loss, we examined mRNA levels in laser-capture microdissected neurons from Grade 1 HD caudate compared to control. These analyses confirmed changes in expression seen in tissue homogenates; we thus conclude that mRNA changes are not attributable to cell loss alone. These data from bona fide HD brains comprise an important reference for hypotheses related to HD and other neurodegenerative diseases.
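The region-wise comparison of HD and control expression profiles described above follows a generic analysis pattern (per-gene two-group tests with multiple-testing correction) that can be sketched as below. The expression matrix is random, and the code is only an illustration of the pattern, not the study's actual microarray pipeline.

```python
# Generic sketch of per-gene HD-vs-control testing with Benjamini-Hochberg
# correction. Data are random; 50 probes are given an artificial group effect.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
hd = rng.normal(0.0, 1.0, (44, 1000))     # 44 HD samples x 1000 probes
ctrl = rng.normal(0.0, 1.0, (36, 1000))   # 36 control samples
ctrl[:, :50] += 1.0                       # simulate 50 differential probes

pvals = ttest_ind(hd, ctrl, axis=0).pvalue

# Benjamini-Hochberg adjusted p-values (step-up procedure)
order = np.argsort(pvals)
ranked = pvals[order] * len(pvals) / (np.arange(len(pvals)) + 1)
adj = np.minimum.accumulate(ranked[::-1])[::-1]
print(int((adj < 0.05).sum()), "probes pass an adjusted p < 0.05 threshold")
```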