862 results for Worth


Relevance: 10.00%

Abstract:

Self-control is a prerequisite for complex cognitive processes such as cooperation and planning. As such, comparative studies of self-control may help elucidate the evolutionary origin of these capacities. A variety of methods have been developed to test for self-control in non-human primates that include some variation of foregoing an immediate reward in order to gain a more favorable reward. We used a token exchange paradigm to test for self-control in capuchin monkeys (Cebus apella). Animals were trained that particular tokens could be exchanged for food items of different values. To test for self-control, a monkey was provided with a token that was associated with a lower-value food. When the monkey exchanged the token, the experimenter provided the monkey with a choice between the lower-value food item associated with the token or another token that was associated with a higher-value food. If the monkey chose the token, it could then exchange it for the higher-value food. Of seven monkeys trained to exchange tokens, five demonstrated that they attributed value to the tokens by differentially selecting tokens for higher-value foods over tokens for lower-value foods. When provided with a choice between a food item or a token for a higher-value food, two monkeys selected the token significantly more often than expected by chance. The ability of capuchin monkeys to forego an immediate food reward and select a token that could then be traded for a more preferred food demonstrated some degree of self-control. Thus, the results suggest that a token exchange paradigm could be a successful technique for assessing self-control in this New World species.
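A claim like "selected the token significantly more often than expected by chance" in a two-alternative task is typically supported by an exact binomial test against a 50% null. A minimal stdlib sketch; the trial counts below are hypothetical, not the study's data:

```python
from math import comb

def binomial_two_sided_p(k, n, p=0.5):
    """Exact two-sided binomial test: sum the probabilities of all
    outcomes no more likely than the observed count k."""
    probs = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
    observed = probs[k]
    return sum(q for q in probs if q <= observed + 1e-12)

# hypothetical session: the monkey picks the higher-value token on 18 of 20 trials
print(f"p = {binomial_two_sided_p(18, 20):.4f}")  # far below 0.05
```

With 10 token choices out of 20, the same test returns p = 1, i.e. exactly the chance expectation.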

Relevance: 10.00%

Abstract:

Peptide hormones of the glucagon-like peptide (GLP) family play an increasing clinical role, such as GLP-1 in diabetes therapy. Moreover, GLP receptors are overexpressed in various human tumor types and therefore represent molecular targets for important clinical applications. In particular, virtually all benign insulinomas highly overexpress GLP-1 receptors (GLP-1R). Targeting GLP-1R with the stable GLP-1 analogs (111)In-DOTA/DTPA-exendin-4 offers a new approach to successfully localize these small tumors. This non-invasive technique has the potential to replace the invasive localization of insulinomas by selective arterial stimulation and venous sampling. Malignant insulinomas, in contrast to their benign counterparts, express GLP-1R in only one-third of cases, while they more often express somatostatin type 2 receptors. Importantly, one of the two receptors appears to be always expressed in malignant insulinomas. The GLP-1R overexpression in selected cancers is worth keeping in mind with regard to the increasing use of GLP-1 analogs for diabetes therapy. While the functional role of GLP-1R in neoplasia is not yet known, it may be prudent to monitor patients undergoing GLP-1 therapy carefully.

Relevance: 10.00%

Abstract:

Twenty-five years have passed since the first randomised controlled trial of screening for abdominal aortic aneurysm (AAA) in men aged 65 and above began its recruitment. Since this and the other randomised trials, all launched in the late 1980s and 1990s, the epidemiologic profile of abdominal aortic aneurysm may have changed. The trials reported an AAA prevalence in the range of 4-7% for men aged 65 years or more. AAA-related mortality was significantly reduced by screening, and after 13 years the largest trial showed a benefit for all-cause mortality. Screening was also shown to be cost-effective. Today, there are studies showing a substantial decrease of AAA prevalence, sometimes to less than 2% in men aged ≥ 65 years, and there is evidence that the incidence of ruptured aneurysm and mortality from AAA are also declining. This decline preceded the implementation of screening programmes but may be due to a change in risk factor management: the prevalence of smoking has decreased, the control of hypertension has improved, and the use of statins for cardiovascular risk prevention is rising. Additionally, there is a shift of the burden to the older age group of ≥ 75 years. Such radical changes may influence screening policy, and it is worth reflecting on the optimum age for screening (it might be better to screen at ages >65 years) or on rescreening 5 to 10 years after the first screen.

Relevance: 10.00%

Abstract:

In this thesis, I will document and analyze historical aspects of the British debate over adopting a common currency with the European Community, primarily during the last half of the twentieth century until the present. More specifically, while on the surface such a decision would seem to turn on economic or political considerations, I will show that this historic British decision not to surrender the pound sterling in exchange for the euro was rooted in the nation's cultural identity. During this decades-long British debate over the euro, two opposing but strongly held positions developed: one side believed that Britain had a compelling interest in bonding with the rest of Europe economically as well as politically, while the other side believed that Britain's independent heritage was deeply rooted in many of its traditions, including maintaining control of its own monetary matters and keeping its pound sterling. As part of this thesis, I have conducted interviews with business leaders, economists, and social scientists, as well as researched public records, in order to assess many of the arguments favoring and opposing Britain's adoption of the euro. Many Britons strongly believed that it was time to join other Europeans, who were willing to sacrifice their sovereign currencies to a bold common currency experiment, while other Britons viewed the pound sterling as too integral a part of British heritage to abandon. Ultimately, British leaders and citizens had to determine whether such a currency tradeoff would be worth it to them as a nation. It was a gamble that twelve other nations (at the time of the euro's 2002 launch) were ready to take, optimistically calculating that easier credit and reduced exchange transaction costs would lead to greater economic prosperity. Many asserted that only with such a united European monetary coalition would Europe's nations be able to compete in trade with powerful economic nations like the United States and China.
My conclusion is that Britain's refusal to join the euro was a decision that had less to do with economic opportunity or political motivations and much more to do with how the British people viewed themselves culturally and their identity as an independent nation.

Relevance: 10.00%

Abstract:

The response of some Argentine workers to the 2001 crisis of neoliberalism gave rise to a movement of worker-recovered enterprises (empresas recuperadas por sus trabajadores, or ERTs). The ERTs emerged as former employees took over control of generally fraudulently bankrupt factories and enterprises. The analysis of the ERT movement within the neoliberal global capitalist order draws on William Robinson's (2004) neo-Gramscian concept of hegemony. This theoretical framework is used to expose the contradictions of capitalism on the global, national, organizational and individual scales and the effects they have on the ERT movement. The ERT movement has demonstrated a strong level of resilience, despite the numerous economic, social, political and cultural challenges and limitations it faces as a consequence of the global implementation of neoliberalism. ERTs have shown that through non-violent protests, democratic principles of management and social inclusion, it is possible to start constructing an alternative social order based on the cooperative principles of “honesty, openness, social responsibility and caring for others” (ICA 2007), as opposed to secrecy, exclusiveness, individualism and self-interestedness. In order to meet this “utopian” vision, it is essential to push the limits of the possible within the current social order and to broaden the alliance to include both organized members of the working class, such as members of trade unions, and the unorganized, such as the unemployed and underemployed. Though marginal in number and size, the members of ERTs have given rise to a model that is worth exploring in other countries and regions burdened by the contradictory workings of capitalism. Today, ERTs serve as living proof that workers, and not capitalists alone, are capable of successfully running businesses.

Relevance: 10.00%

Abstract:

The primary objective of this thesis is to demonstrate the pernicious impact that moral hierarchies have on our perception and subsequent treatment of non-human animals. Moral hierarchies in general are characterized by a dynamic in which one group is considered to be fundamentally superior to a lesser group. This thesis focuses specifically on the moral hierarchies that arise when humans are assumed to be superior to non-human animals in virtue of their advanced mental capabilities. The operative hypothesis of this thesis is essentially that moral hierarchies thwart the provision of justice to non-human animals in that they function as a justification for otherwise impermissible actions. When humans are assumed to be fundamentally superior to non-human animals, it becomes morally permissible for humans to kill non-human animals and utilize them as mere instrumentalities. This thesis is driven primarily by an in-depth analysis of the approaches to animal rights that are provided by Peter Singer, Tom Regan, and Gary Francione. Each of these thinkers claims to overcome anthropocentrism and to provide an approach that precludes the establishment of a moral hierarchy. One of the major findings of this thesis, however, is that Singer and Regan offer approaches that remain highly anthropocentric despite each thinker's claim to have overcome anthropocentrism. The anthropocentrism persists in these approaches in that each thinker gives preference to humans: Regan and Singer have different conceptions of the criteria required to afford a being moral worth, but both give preference to beings that have the cognitive ability to form desires regarding the future. As a result, a moral hierarchy emerges in which humans are regarded as fundamentally superior. Francione, however, provides an approach that does not foster a moral hierarchy.
Francione creates such an approach by applying the principle of equal consideration of interests in a consistent manner. Moreover, Francione argues that mere sentience is both a necessary and sufficient condition for being eligible for, and subsequently receiving, moral consideration. The upshot of this thesis is essentially that the moral treatment of animals is not compatible with the presence of a moral hierarchy. As a result, this thesis demonstrates that future approaches to animal rights must avoid the establishment of moral hierarchies. The research and analysis within this thesis demonstrate, however, that this is not possible unless all theories of justice that are to accommodate animals abandon the notion that cognition matters morally.

Relevance: 10.00%

Abstract:

In recent history, there has been a trend of increasing partisan polarization throughout most of the American political system. Some of the impacts of this polarization are obvious; however, there is reason to believe that we miss some of the indirect effects of polarization. Accompanying the trend of increased polarization has been an increase in the contentiousness of the Supreme Court confirmation process. I believe that these two trends are related. Furthermore, I argue that these trends have an impact on judicial behavior. This is an issue worth exploring, since the Supreme Court is the most isolated branch of the federal government. The Constitution structured the Supreme Court to ensure that it was as isolated as possible from short-term political pressures and interests. This study attempts to show how it may be possible that those goals are no longer being fully achieved. My first hypothesis in this study is that increases in partisan polarization are a direct cause of the increase in the level of contention during the confirmation process. I then hypothesize that the more contention a justice faces during his or her confirmation process, the more ideologically extreme that justice will then vote on the bench. This means that a nominee appointed by a Republican president will tend to vote even more conservatively than was anticipated following a contentious confirmation process, and vice versa for Democratic appointees. In order to test these hypotheses, I developed a data set for every Supreme Court nominee dating back to President Franklin D. Roosevelt's appointments (1937). With this data set, I ran a series of regression models to analyze these relationships. Statistically speaking, the results support my first hypothesis in a fairly robust manner. My regression results for my second hypothesis indicate that the trend I am looking for is present for Republican nominees. For Democratic nominees, the impacts are less robust.
Nonetheless, as the results will show, contention during the confirmation process does seem to have some impact on judicial behavior. Following my quantitative analysis, I analyze a series of case studies. These case studies serve to provide tangible examples of these statistical trends as well as to explore what else may be going on during the confirmation process and subsequent judicial decision-making. I use Justices Stevens, Rehnquist, and Alito as the subjects for these case studies. These cases will show that the trends described above do seem to be identifiable at the level of an individual case. These studies further help to indicate other potential impacts on judicial behavior. For example, following Justice Rehnquist's move from Associate to Chief Justice, we see a marked change in his behavior. Overall, this study serves as a means of analyzing some of the more indirect impacts of partisan polarization in modern politics. Further, the study offers a means of exploring some of the possible constraints (both conscious and subconscious) that Supreme Court justices may feel while they decide how to cast a vote in a particular case. Given the wide-reaching implications of Supreme Court decisions, it is important to try to grasp a full view of how these decisions are made.
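In their simplest bivariate form, the regressions behind the first hypothesis reduce to an ordinary least-squares fit of confirmation contention on polarization. The sketch below shows only the computation; all the numbers are invented for illustration and are not drawn from the study's data set:

```python
def ols_fit(x, y):
    """Ordinary least-squares slope and intercept for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

polarization = [0.2, 0.4, 0.5, 0.7, 0.9]  # hypothetical polarization index per nominee
no_votes = [3, 10, 14, 22, 31]            # hypothetical "no" votes at confirmation

slope, intercept = ols_fit(polarization, no_votes)
print(f"slope = {slope:.1f}")  # positive slope: more polarization, more contention
```

A positive, statistically significant slope in such a fit is the kind of evidence the study describes for the polarization-contention link.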

Relevance: 10.00%

Abstract:

Conflict has marked civilization from Biblical times to the present day. Each of us, with our different and competing interests, and our desires to pursue those interests, has over time wronged another person. Not surprisingly, then, forgiveness is a concern of individuals and of groups: communities, countries, religious groups, races. Yet it is a complex idea that philosophers, theologians, political scientists, and psychologists have grappled with. Some have argued that forgiveness is a therapeutic means for overcoming guilt, pain, and anger. Forgiveness is often portrayed as a coping mechanism: how often we hear the phrase "forgive and forget" as an arrangement to help two parties surmount the complications of disagreement. But forgiveness is not simply a modus vivendi; the ability to forgive, and conversely to ask for forgiveness, is counted as an admirable trait and virtue. This essay will explore the nature of forgiveness, which in Christian dogma is often posited as an unqualified virtue. The secular world has appropriated the Christian notion of forgiveness as such a virtue. But are there instances wherein offering forgiveness is morally inappropriate or dangerous? I will consider the situations in which forgiveness, understood in this essay as the overcoming of resentment, may not be a virtue: situations in which maintaining resentment is perhaps as virtuous, if not more virtuous, than forgiving. I will explain the various ethical frameworks involved in understanding forgiveness as a virtue, and the relationship between them. I will argue that within Divine Command Theory forgiveness is a virtue, and thus morally right, because God commands it. This ethical system has established forgiveness as unconditional, an idea which has been adopted into popular culture.
With virtue ethics in mind, which holds virtues to be those traits which benefit the person who possesses them, contributing to the good life, I will argue that unqualified forgiveness is not always a virtue, as it will not always benefit the victim. Because there is no way to avoid wrongdoing, humans are confronted with the question of forgiveness with every indiscretion. Its limits, its possibilities, its relationship to one's character: forgiveness is a concern of all people at some time, if for no other reason than the plain fact that the past cannot be undone. I will be evaluating the idea of forgiveness as a virtue, in contrast to its counterpart, resentment. How can forgiveness be a response to evil, a way to renounce resentment, and a means of creating a positive self-narrative? And what happens when a sense of moral responsibility is impossible to reconcile with the Christian (and now secularized) imperative of forgiveness? Is it ever not virtuous to forgive? In an attempt to answer that question I will argue that there are indeed times when forgiveness is not a virtue, specifically: when forgiveness compromises one's own self-respect; when it is not compatible with respect for the moral community; and when the offender is unapologetic. The kind of offense I have in mind is a dehumanizing one, one that intends to diminish another person's worth or humanity. These are moral injuries, to which I will argue resentment is a better response than forgiveness when the three qualifications cannot be met.

Relevance: 10.00%

Abstract:

Taking the three basic systems of Yes/No particles, the group looked at the relative deep and surface structures and asked what types of systems are present in the Georgian, Polish and Armenian languages. The choice of languages was of particular interest because the Caucasian and Indo-European languages usually have different question-answering systems, but Georgian (Caucasian) and Polish (Indo-European) in fact share the same system. The Armenian language is Indo-European, but the country is situated in the southern Caucasus, on Georgia's southern border, making it worth analysing Armenian in comparison with Georgian (from the point of view of language interference) and with Polish (as two related languages). The group identified two different deep structures, tracing the occurrence of these in different languages, and showed that one is more natural in the majority of languages. They found no correspondence between related languages and their question-answering systems and demonstrated that languages in the same typological class may show different systems, as with Georgian and the North Caucasian languages. It became clear that Georgian, Armenian and Polish all have an agree/disagree question-answering system defined by the same deep structure. From this they conclude that the lingual mentalities of Georgians, Armenians and Poles are more oriented to the communicative act. At the same time, the Yes/No system, in which a positive particle stands for a positive answer and a negative particle for a negative answer, also functions in these languages, indicating that the second deep structure identified functions alongside the first.

Relevance: 10.00%

Abstract:

Mr. Kubon's project was inspired by the growing need for an automatic syntactic analyser (parser) of Czech, which could be used in the syntactic processing of large amounts of text. Mr. Kubon notes that such a tool would be very useful, especially in the field of corpus linguistics, where creating a large-scale "tree bank" (a collection of syntactic representations of natural language sentences) is a very important step towards the investigation of the properties of a given language. The work involved in syntactically parsing a whole corpus in order to get a representative set of syntactic structures would be almost inconceivable without the help of some kind of robust (semi)automatic parser. The need for the automatic natural language parser to be robust increases with the size of the linguistic data in the corpus or in any other kind of text which is going to be parsed. Practical experience shows that apart from syntactically correct sentences, there are many sentences which contain a "real" grammatical error. These sentences may be corrected in small-scale texts, but not generally in the whole corpus. In order to complete the overall project, it was necessary to address a number of smaller problems. These were: 1. the adaptation of a suitable formalism able to describe the formal grammar of the system; 2. the definition of the structure of the system's dictionary containing all relevant lexico-syntactic information, and the development of a formal grammar able to robustly parse Czech sentences from the test suite; 3. filling the syntactic dictionary with sample data allowing the system to be tested and debugged during its development (about 1000 words); 4. the development of a set of sample sentences containing a reasonable amount of grammatical and ungrammatical phenomena covering some of the most typical syntactic constructions being used in Czech. Building the formal grammar was the main task of the project.
The grammar is of course far from complete (Mr. Kubon notes that it is debatable whether any formal grammar describing a natural language can ever be complete), but it covers the most frequent syntactic phenomena, allowing for the representation of the syntactic structure of simple clauses and also the structure of certain types of complex sentences. The stress was not so much on building a wide-coverage grammar as on the description and demonstration of a method. This method uses an approach similar to that of grammar-based grammar checking. The problem of reconstructing the "correct" form of the syntactic representation of a sentence is closely related to the problem of localisation and identification of syntactic errors: without a precise knowledge of the nature and location of syntactic errors it is not possible to build a reliable estimation of a "correct" syntactic tree. The incremental way of building the grammar used in this project is also an important methodological issue. Experience from previous projects showed that building a grammar by creating a huge block of metarules is more complicated than the incremental method, which begins with the metarules covering the most common syntactic phenomena and adds less important ones later; this is especially advantageous from the point of view of testing and debugging the grammar. The sample syntactic dictionary containing lexico-syntactic information now has slightly more than 1000 lexical items representing all classes of words. During the creation of the dictionary it turned out that the task of assigning complete and correct lexico-syntactic information to verbs is a very complicated and time-consuming process which would itself be worth a separate project. The final task undertaken in this project was the development of a method allowing effective testing and debugging of the grammar during the process of its development.
The problem of the consistency of new and modified rules of the formal grammar with the rules already existing is one of the crucial problems of every project aiming at the development of a large-scale formal grammar of a natural language. This method allows for the detection of any discrepancy or inconsistency of the grammar with respect to a test-bed of sentences containing all syntactic phenomena covered by the grammar. This is not only the first robust parser of Czech, but also one of the first robust parsers of a Slavic language. Since Slavic languages display a wide range of common features, it is reasonable to claim that this system may serve as a pattern for similar systems in other languages. To transfer the system into any other language it is only necessary to revise the grammar and to change the data contained in the dictionary (but not necessarily the structure of primary lexico-syntactic information). The formalism and methods used in this project can be used in other Slavic languages without substantial changes.
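The testing-and-debugging method described above (re-checking the entire test-bed of sentences after every change to the grammar) can be pictured with a toy harness. The "parser" and sentences below are invented stand-ins, not part of Mr. Kubon's system:

```python
def check_consistency(parse, suite):
    """Return every sentence whose parsability no longer matches the
    expectation recorded in the test-bed."""
    return [s for s, expected in suite.items() if parse(s) != expected]

# toy stand-in parser: a sentence "parses" if it contains a known verb
LEXICON_VERBS = {"runs", "sleeps"}

def toy_parse(sentence):
    return any(word in LEXICON_VERBS for word in sentence.split())

suite = {
    "the dog runs": True,
    "the dog sleeps": True,
    "the dog": False,  # deliberately ungrammatical: no verb
}

print(check_consistency(toy_parse, suite))  # prints [] -> grammar and test-bed agree
```

After any rule or dictionary change, a non-empty result pinpoints exactly which covered phenomena regressed, which is the discrepancy-detection idea the project relies on.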

Relevance: 10.00%

Abstract:

Recovering the architecture is the first step towards reengineering a software system. Many reverse engineering tools use top-down exploration as a way of providing a visual and interactive process for architecture recovery. During the exploration process, the user navigates through various views of the system by choosing from several exploration operations. Although some sequences of these operations lead to views which, from the architectural point of view, are more relevant than others, current tools do not provide a way of predicting which exploration paths are worth taking and which are not. In this article we propose a set of package patterns which are used to augment the exploration process with information about the worthiness of the various exploration paths. The patterns are defined based on the internal package structure and on the relationships between the package and the other packages in the system. To validate our approach, we verify the relevance of the proposed patterns for real-world systems by analyzing their frequency of occurrence in six open-source software projects.
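As a rough illustration of the idea (not the pattern catalogue actually defined in the article), a package can be classified from simple structural counts, and the resulting pattern can steer which exploration paths look worthwhile. All names and thresholds below are invented:

```python
def classify_package(n_classes, incoming_deps, outgoing_deps):
    """Assign a coarse structural pattern to a package (illustrative rules only)."""
    if incoming_deps == 0 and outgoing_deps == 0:
        return "isolated"  # nothing depends on it: a low-priority path
    if incoming_deps > outgoing_deps and n_classes > 10:
        return "core"      # substantial and widely depended upon: a promising path
    if outgoing_deps > incoming_deps:
        return "consumer"
    return "plain"

# hypothetical system: package -> (classes, incoming deps, outgoing deps)
packages = {
    "util": (15, 8, 1),
    "gui": (30, 1, 9),
    "scratch": (2, 0, 0),
}

summary = {name: classify_package(*metrics) for name, metrics in packages.items()}
print(summary)
```

Ranking views by the patterns of the packages they expose would then prioritize, say, "core" packages over "isolated" ones during top-down exploration.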

Relevance: 10.00%

Abstract:

The ability to measure gene expression on a genome-wide scale is one of the most promising accomplishments in molecular biology. Microarrays, the technology that first permitted this, were riddled with problems due to unwanted sources of variability. Many of these problems are now mitigated, after a decade’s worth of statistical methodology development. The recently developed RNA sequencing (RNA-seq) technology has generated much excitement in part due to claims of reduced variability in comparison to microarrays. However, we show RNA-seq data demonstrates unwanted and obscuring variability similar to what was first observed in microarrays. In particular, we find GC-content has a strong sample specific effect on gene expression measurements that, if left uncorrected, leads to false positives in downstream results. We also report on commonly observed data distortions that demonstrate the need for data normalization. Here we describe statistical methodology that improves precision by 42% without loss of accuracy. Our resulting conditional quantile normalization (CQN) algorithm combines robust generalized regression to remove systematic bias introduced by deterministic features such as GC-content, and quantile normalization to correct for global distortions.
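The two ingredients of CQN (robust removal of a sample-specific GC-content effect, then quantile normalization across samples) can be sketched in a much-simplified form. The binned-median adjustment below is only a crude stand-in for the robust generalized regression the actual method uses:

```python
from statistics import median

def remove_gc_bias(expr, gc, n_bins=4):
    """Center each GC-content bin of one sample's log-expression values
    on zero: a crude stand-in for a smooth GC-effect fit."""
    order = sorted(range(len(gc)), key=lambda i: gc[i])
    size = max(1, len(order) // n_bins)
    adjusted = list(expr)
    for start in range(0, len(order), size):
        bin_idx = order[start:start + size]
        m = median(expr[i] for i in bin_idx)
        for i in bin_idx:
            adjusted[i] = expr[i] - m
    return adjusted

def quantile_normalize(samples):
    """Force every sample onto the common mean-of-sorted-values distribution."""
    n = len(samples[0])
    reference = [sum(col) / len(samples) for col in zip(*(sorted(s) for s in samples))]
    normalized = []
    for s in samples:
        ranks = sorted(range(n), key=lambda i: s[i])
        out = [0.0] * n
        for r, i in enumerate(ranks):
            out[i] = reference[r]
        normalized.append(out)
    return normalized
```

After both steps every sample shares one distribution, so remaining between-sample differences in a gene's value reflect rank changes rather than global or GC-driven distortions.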

Relevance: 10.00%

Abstract:

Breast cancer occurring in women before the age of menopause continues to be a major medical and psychological challenge. Endocrine therapy has emerged as the mainstay of adjuvant treatment for women with estrogen receptor-positive tumours. Although the suppression of ovarian function (by oophorectomy, irradiation of the ovaries or gonadotropin-releasing factor analogues) is effective as adjuvant therapy if used alone, its value after chemotherapy has not been proven, presumably because of the frequent occurrence of chemotherapy-induced amenorrhoea. Tamoxifen reduces the risk of recurrence by approximately 40%, irrespective of age and the ovarian production of estrogens. The worth of ovarian function suppression in combination with tamoxifen is unproven and is being investigated in an intergroup randomised clinical trial (SOFT [Suppression of Ovarian Function Trial]). Aromatase inhibitors are more effective than tamoxifen in postmenopausal women but are still only under investigation in younger patients. The use of chemotherapies is identical in younger and older patients; however, at present the efficacy of chemotherapy in addition to ovarian function suppression plus tamoxifen is unknown in premenopausal patients with endocrine-responsive disease. 'Targeted' therapies such as monoclonal antibodies to human epidermal growth factor receptor (HER)-2, HER1 and vascular endothelial growth factor, 'small molecule' inhibitors of tyrosine kinases, and breast cancer vaccines are rapidly emerging. Their use depends on the function of the targeted pathways and is presently limited to clinical trials. Premenopausal patients are best treated in the framework of a clinical trial.

Relevance: 10.00%

Abstract:

While the 1913-1914 copper country miners’ strike undoubtedly plays an important role in the identity of the Keweenaw Peninsula, it is worth noting that the model of mining corporations employing large numbers of laborers was not a foregone conclusion in the history of American mining. Between 1807 and 1847, public mineral lands in Missouri, in the Upper Mississippi Valley, and along the southern shore of Lake Superior were reserved from sale and subject to administration by the nation’s executive branch. By decree of the federal government, miners in these regions were lessees, not landowners. Yet, in the Wisconsin lead region especially, federal authorities reserved for independent “diggers” the right to prospect virtually unencumbered. In doing so, they preserved a comparatively egalitarian system in which the ability to operate was determined as much by luck as by financial resources. A series of revolts against federal authority in the early nineteenth century gradually encouraged officers in Washington to build a system in the copper country in which only wealthy investors could marshal the resources to both obtain permits and actually commence mining operations. This paper will therefore explore the role of the federal government in establishing a leasing system for public mineral lands in the years prior to the California Gold Rush, highlighting the development of corporate mining which ultimately set the stage for the wave of miners’ strikes in the late nineteenth and early twentieth centuries.

Relevance: 10.00%

Abstract:

The smelting of complex lead ores is a difficult operation, especially when they contain considerable amounts of iron and zinc. When these ores are smelted, all of the zinc, which is valuable and well worth recovering, goes into the slag. With the advent of the flotation processes, and the ability of these processes to concentrate the lead and zinc minerals into separate products, the smelting of complex lead ores was to a great extent simplified.