911 results for Nussbaum, Martha Craven, 1947- -- Contributions in philosophy


Relevance:

100.00%

Publisher:

Abstract:

Size-controlled, catalytically active PVP-stabilised Pd nanoparticles have been studied by operando liquid phase XAS during the Suzuki cross-coupling of iodoanisole and phenylboronic acid in MeOH-toluene using KOMe base. XAS reveals that the nanoparticles are stable to metal leaching throughout the reaction, with surface Pd defect sites directly implicated in the catalytic cycle. The efficacy of popular selective chemical and structural poisons for distinguishing heterogeneous and homogeneous contributions in Pd catalysed cross-couplings is also explored. © 2010 The Royal Society of Chemistry.

Relevance:

100.00%

Publisher:

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1 and 2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3] followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification analysis has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes even in very large proteins and even ribosomes to be investigated. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological function are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: The requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Relevance:

100.00%

Publisher:

Abstract:

AMS Subj. Classification: 62P10, 62H30, 68T01

Relevance:

100.00%

Publisher:

Abstract:

Welcome to the Second International Workshop on Multimedia Communications and Networking held in conjunction with IUCC-2012 during 25 June – 27 June 2012 in Liverpool, UK. MultiCom-2012 is dedicated to addressing the challenges of delivering multimedia content using modern communication and networking techniques. The multimedia & networking computing domain emerges from the integration of multimedia content, such as audio and video, with content distribution technologies. This workshop aims to cover contributions in both design and analysis aspects in the context of multimedia, wired/wireless/heterogeneous networks, and quality evaluation. It also intends to bring together researchers and practitioners from academia and industry to share their latest achievements in this field and establish new collaborations for future developments. All papers received were peer reviewed by three members of the Technical Programme Committee and assessed on their originality, technical quality, presentation and relevance to the theme of the workshop. Based on these criteria, four papers have been accepted for presentation at the workshop and will appear in the IUCC conference proceedings. We would like to take this opportunity to thank the IUCC-2012 Organizing Committee, the TPC members of MultiCom-2012 and the authors for their support, dedicated work and contributions. Finally, we look forward to meeting you at the workshop in Liverpool.

Relevance:

100.00%

Publisher:

Abstract:

Advance directives are one mechanism for preserving the rights of individuals to exercise some control over their health care when serious illness may prevent them from direct participation. Nurses, as the health care providers with the closest and most sustained contact with critically ill and dying patients, are positioned to assist patients to plan for future health care needs. Although a majority of nurses favor the concept of advance directives for their patients and for themselves, they have not played a significant role in facilitating advance health care planning with their patients, nor have they implemented advance health care planning for themselves. Research has also shown that differing forms of education and counseling increase the completion rates for advance directives in selected populations, mostly the elderly and seriously ill. Effective educational strategies to help nurses and nurse students make optimal contributions to their clients' planning for future health care decision-making have not yet been developed. This study sought to determine whether specific learning strategies (a) increased the involvement of nurses and nurse students in facilitating advance care planning with patients and (b) increased the percentage of the nurses' and nurse students' own personal advance care planning activities. The study compared two learning interventions and two populations, nurses and nurse students. The participants were randomly assigned to one of the two learning interventions, L1 or L2. Participants in L1 received a lecture, discussion and exploration of the forces impacting on advance directive behavior. Participants in L2 received the same intervention components with the additional component of group practice completing advance directives. Analysis of the data by chi-square and logistic regression did not support the hypotheses that the practice component would make a difference in the participants' facilitation of advance care planning with patients or in their own personal advance care planning activities. There were significant differences in post-intervention behavior between the nurse and nurse student groups. The nurses in the study did significantly more facilitation of advance care planning with patients and completed significantly more advance care documents than the nurse students post-intervention. However, the nurse students held more post-intervention family discussions than did the nurses.
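The analysis described above combines a chi-square test of the intervention effect with logistic regression on post-intervention behavior. The Python sketch below illustrates that kind of analysis on hypothetical data: the 2x2 counts, the covariate coding (intervention arm, nurse vs. nurse student) and the simulated outcome are illustrative assumptions, not the study's data.

import numpy as np
from scipy.stats import chi2_contingency
import statsmodels.api as sm

# Hypothetical 2x2 table: rows = intervention (L1 lecture only, L2 lecture + practice),
# columns = whether the participant later facilitated advance care planning (no, yes).
table = np.array([[38, 12],   # L1: counts are illustrative, not study data
                  [33, 17]])  # L2
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")

# Logistic regression of the same binary outcome on intervention arm and group
# (nurse vs. nurse student), mirroring the covariates described in the abstract.
rng = np.random.default_rng(0)
n = 100
arm = rng.integers(0, 2, n)       # 0 = L1, 1 = L2
group = rng.integers(0, 2, n)     # 0 = nurse student, 1 = nurse
outcome = rng.binomial(1, 0.25 + 0.2 * group)  # toy data: group matters, arm does not
X = sm.add_constant(np.column_stack([arm, group]))
print(sm.Logit(outcome, X).fit(disp=0).summary())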

Relevance:

100.00%

Publisher:

Abstract:

This study examines the performance of two geomagnetic index series, and of series synthesized from a semi-empirical model of magnetospheric currents, in explaining the geomagnetic activity observed at Northern Hemisphere mid-latitude ground-based stations. We analyse data for the 2007 to 2014 period from four magnetic observatories (Coimbra, Portugal; Panagyurishte, Bulgaria; Novosibirsk, Russia; and Boulder, USA) at geomagnetic latitudes between 40° and 50° N. The quiet daily (QD) variation is first removed from the time series of the geomagnetic horizontal component (H) using natural orthogonal components (NOC) tools. We compare the resulting series with series of the storm-time disturbance (Dst) and ring current (RC) indices and with H series synthesized from the Tsyganenko and Sitnov (2005, doi:10.1029/2004JA010798) (TS05) semi-empirical model of the storm-time geomagnetic field. In the analysis, we separate days with low and high local K-index values. Our results show that NOC models are as efficient as standard models of QD variation in preparing raw data to be compared with proxies, but with much less complexity. For the two stations in Europe, we obtain an indication that NOC models may be able to separate ionospheric and magnetospheric contributions. Dst and RC series explain the four observatory H series successfully, with mean significant correlation coefficients from 0.5 to 0.6 during low geomagnetic activity (K less than 4) and from 0.6 to 0.7 for geomagnetically active days (K greater than or equal to 4). With regard to the performance of TS05, our results show that the four observatories separate into two groups: Coimbra and Panagyurishte, in one group, for which the magnetospheric/ionospheric ratio in the QD variation is smaller, a dominantly ionospheric QD contribution can be removed and TS05 simulations are the best proxy; and Boulder and Novosibirsk, in the other group, for which the ionospheric and magnetospheric contributions to the QD variation cannot be differentiated and correlations with TS05 series cannot be improved. The main contributors to the magnetospheric QD signal are the Birkeland currents. The relatively good success of the TS05 model in explaining ground-based irregular geomagnetic activity at mid-latitudes makes it an effective tool to classify storms according to their main sources. For Coimbra and Panagyurishte in particular, where the ionospheric and magnetospheric daily contributions seem easier to separate, we can aspire to use the TS05 model for ensemble generation in space weather (SW) forecasting and for the interpretation of past SW events.
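As a rough illustration of the NOC step and the index comparison described above, the following Python sketch stacks daily curves of the horizontal component H into a matrix, takes the leading orthogonal modes as the quiet-daily variation, subtracts them and correlates the residual with an index series. The arrays, the number of retained modes and the hourly-to-minute resampling are hypothetical placeholders rather than the study's data or exact procedure.

import numpy as np

# Stack daily H curves into a (days x minutes-per-day) matrix, extract the leading
# orthogonal modes by SVD, treat their reconstruction as the quiet-daily (QD)
# variation and subtract it to obtain the irregular (disturbed) signal.
rng = np.random.default_rng(1)
n_days, n_min = 365, 1440
H_daily = rng.normal(size=(n_days, n_min))      # placeholder for measured H, in nT

H_mean = H_daily.mean(axis=0)
U, s, Vt = np.linalg.svd(H_daily - H_mean, full_matrices=False)
k = 2                                           # number of NOC modes kept as QD
qd = H_mean + (U[:, :k] * s[:k]) @ Vt[:k]       # reconstructed QD variation
residual = (H_daily - qd).reshape(-1)           # QD-removed H series

# Compare the residual series with a proxy such as the Dst index, here a random
# hourly series repeated to minute resolution purely for illustration.
dst_hourly = rng.normal(size=n_days * 24)
dst_min = np.repeat(dst_hourly, 60)
r = np.corrcoef(residual, dst_min)[0, 1]
print(f"correlation between QD-removed H and the Dst proxy: {r:.2f}")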

Relevance:

100.00%

Publisher:

Abstract:

Demand-side growth accounting studies the contributions of the aggregate demand components to Gross Domestic Product (GDP) growth. International and national organizations traditionally use the traditional method to calculate these contributions. However, this method does not take into account the imports induced by the various components of aggregate demand. Other studies present alternatives that consider this effect, such as the alternative method proposed by Lara (2013), the attribution method proposed by Kranendonk and Verbruggen (2005) and Hoekstra and van der Helm (2010), and the Sraffian supermultiplier method of Freitas and Dweck (2013). These methods are summarized, highlighting their similarities and differences. In addition, as a contribution to the study of the subject, the "method of distribution of imports" was developed, which distributes imports across the various components of aggregate demand using the information set forth in input-output matrices and supply and use tables. Contributions to the growth of macroeconomic aggregates in Brazil from 2001 to 2009 were computed using the distribution method and compared with the traditional method, explaining the reasons for the differences in the contributions. The contributions to growth of the aggregate demand components and of the domestic and external sectors calculated with all of the methods presented in this work were then compared. It was found that the methods existing in the literature are not sufficient to deal with this question and, given the alternatives presented throughout this work, it is argued that the distribution method provides the best estimates of the contributions by aggregate demand component. In particular, the main advantage of this method over the others is the breakdown of the contribution of imports by aggregate demand component, which allows the contribution of each component to GDP growth to be analysed. This type of analysis thus helps in studying the growth pattern of the Brazilian economy, not only from a theoretical point of view but also empirically, and provides a basis for economic policy decisions.
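To make the difference between the two accounting schemes concrete, the Python sketch below computes contributions to GDP growth with the traditional method, where total imports enter as a single negative term, and with imports distributed across the demand components. The figures and import shares are hypothetical, and the simple share-based allocation only stands in for the input-output based distribution described in the abstract.

import numpy as np

# Traditional method: contribution_i = change in X_i divided by GDP of the base year,
# with total imports entering as one negative contribution.
# Distribution idea: allocate imports to each component (shares m_i, in practice taken
# from input-output / supply and use tables) and use the import-adjusted levels X_i - M_i.
components = ["consumption", "investment", "government", "exports"]
X0 = np.array([600.0, 180.0, 200.0, 120.0])   # levels in year t-1 (hypothetical)
X1 = np.array([630.0, 190.0, 204.0, 126.0])   # levels in year t (hypothetical)
M0, M1 = 100.0, 110.0                         # total imports
gdp0, gdp1 = X0.sum() - M0, X1.sum() - M1

trad = (X1 - X0) / gdp0 * 100                 # traditional contributions, in pp
trad_imports = -(M1 - M0) / gdp0 * 100        # single aggregate import term

shares = np.array([0.45, 0.30, 0.05, 0.20])   # hypothetical import shares by component
D0, D1 = X0 - shares * M0, X1 - shares * M1
dist = (D1 - D0) / gdp0 * 100                 # import-adjusted contributions, in pp

for name, t, d in zip(components, trad, dist):
    print(f"{name:12s} traditional: {t:5.2f} pp   import-adjusted: {d:5.2f} pp")
print(f"{'imports':12s} traditional: {trad_imports:5.2f} pp")
print(f"GDP growth: {(gdp1 - gdp0) / gdp0 * 100:.2f} pp (both methods sum to this)")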

Relevance:

100.00%

Publisher:

Abstract:

This thesis traces and explores a method of reading Walter Benjamin's texts and examines his proposal for a "coming philosophy", highlighting his higher concept of "experience" (Erfahrung) as the foundation of a metaphysics of Now-Time. It first applies this method and analyzes what is to be rescued from the Kantian legacy, as indicated in Benjamin's early "Program for a Coming Philosophy" (1917-1918). Then, to discuss and evaluate Benjamin's project of a "coming philosophy", several early texts are compared with later texts that propose a new concept of experience as both the aesthetic and the historical basis for a philosophy of "Now-Time" (Jetztzeit). Finally, by analogy with Kant's critical attitude towards dogmatic metaphysics, the challenge of metaphysics today is discussed, especially Benjamin's proposal of a "Now-Time metaphysics" in the context of the crisis and bankruptcy of explanatory models of reality in philosophy.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we examine the extent to which the concept of emergence can be applied to questions about the nature and moral justification of territorial borders. Although the term is used with many different senses in philosophy, the concept of “weak emergence” - advocated by, for example, Sawyer (2002, 2005) and Bedau (1997) - is especially applicable, since it forces a distinction between prediction and explanation that connects with several issues in the discussion of territory. In particular, we argue, weak emergentism about borders allows us to distinguish between (a) using a theory of territory to say where a border should be drawn, and (b) looking at an existing border and saying whether or not it is justified (Miller, 2012; Nine, 2012; Stilz, 2011). Many authors conflate these two factors, or identify them by claiming that having one without the other is in some sense incoherent. But on our account - given the concept of emergence - one might unproblematically be able to have (b) without (a); at the very least, the distinction between these two issues is much more significant than has often been recognised, and more importantly gives us some reason to prefer “statist” as opposed to “cultural” theories of territorial borders. We conclude with some further reflections on related matters concerning, firstly, the apparent causal powers of borders, and secondly, the different ways in which borders are physically implemented (e.g., land vs. water).

Relevance:

100.00%

Publisher:

Abstract:

This presentation focuses on methods for the evaluation of complex policies. In particular, it focuses on evaluating interactions between policies and the extent to which two or more interacting policies mutually reinforce or hinder one another, in the area of environmental sustainability. Environmental sustainability is increasingly gaining recognition as a complex policy area, requiring a more systemic perspective and approach (e.g. European Commission, 2011). Current trends in human levels of resource consumption are unsustainable, and single solutions which target isolated issues independently of the broader context have so far fallen short. Instead there is a growing call among both academics and policy practitioners for systemic change which acknowledges and engages with the complex interactions, barriers and opportunities across the different actors, sectors, and drivers of production and consumption. Policy mixes, and the combination and ordering of policies within, therefore become an important focus for those aspiring to design and manage transitions to sustainability. To this end, we need a better understanding of the interactions, synergies and conflicts between policies (Cunningham et al., 2013; Geels, 2014). As a contribution to this emerging field of research and to inform its next steps, I present a review on what methods are available to try to quantify the impacts of complex policy interactions, since there is no established method among practitioners, and I explore the merits or value of such attempts. The presentation builds on key works in the field of complexity science (e.g. Anderson, 1972), revisiting and combining these with more recent contributions in the emerging field of policy and complex systems, and evaluation (e.g. Johnstone et al., 2010). With a coalition of UK Government departments, agencies and Research Councils soon to announce the launch of a new internationally-leading centre to pioneer, test and promote innovative and inclusive methods for policy evaluation across the energy-environment-food nexus, the contribution is particularly timely.

Relevance:

100.00%

Publisher:

Abstract:

Social media is changing the way we interact, present ideas and information, and judge the quality of content and contributions. In recent years hundreds of platforms have appeared for freely sharing all kinds of information and connecting across networks. These new tools generate activity statistics and records of interactions among users, such as mentions, retweets, conversations and comments on blogs or Facebook; reference managers that show the popularity of the references most shared by other researchers; and repositories that generate statistics on visits to or downloads of articles. This paper analyses the meaning and implications of altmetrics, their advantages and critical aspects, and the main platforms (Almetric.com, ImpactStory, Plos altmetrics, PlumX), and reports on progress and on the benefits for authors, publishers and librarians. It concludes that the value of alternative metrics as a complement to citation analysis is evident, although it is suggested that this issue should be explored in greater depth to unravel the meaning and the potential value of these indicators.

Relevance:

100.00%

Publisher:

Abstract:

Research on women prisoners and drug use is scarce in our context and needs theoretical tools to understand their life paths. In this article, I introduce an intersectional perspective on the experiences of women in prison, with particular focus on drug use. To illustrate this, I draw on the life story of one of the women interviewed in prison, in order to explore the axes of inequality in the lives of women in prison. These are usually presented as accumulated and articulated in complex and diverse ways. The theoretical tool of intersectionality allows us to gain an understanding of the phenomenon of women prisoners who have used drugs. This includes both the structural constraints in which they were embedded and the decisions they made, considering the circumstances of disadvantage in which they were immersed. This is a perspective which has already been intuitively present since the dawn of feminist criminology in the English-speaking world and can now be developed further due to new contributions in this field of gender studies.

Relevance:

100.00%

Publisher:

Abstract:

Clear and distinct perception is the element on which Descartes's metaphysical certainty rests. Nevertheless, the sceptical arguments raised against Cartesian methodical doubt have made evident the need to find a justification for the very criterion of clear and distinct perception. Against attempts based on the indubitability of perception or on the guarantee arising from divine goodness, an alternative pragmatist justification is defended.

Relevance:

100.00%

Publisher:

Abstract:

The article explores the putatively non-metaphysical – non-voluntarist, and even non-causal – concept of freedom outlined in Hegel's work and discusses its influential interpretation by Robert Pippin as an ‘essentially practical’ concept. I argue that Hegel's affirmation of freedom must be distinguished from that of Kant and Fichte, since it does not rely on a prior understanding of self-consciousness as an originally teleological relation, nor does it have the nature of a claim ‘from a practical point of view’.

Relevance:

100.00%

Publisher:

Abstract:

This study analyses the use of the diminutive suffix in an oral corpus of young speakers from the Dominican Republic. The material comes from the transcription of twenty oral interviews conducted in the 1990s in Santo Domingo. The study examines the documented occurrences, their morphology, the preferences shown in the word classes selected as bases for diminutive formation, and their possible semantic and communicative values; finally, it determines the frequency of diminutive use according to the speakers' sex.