914 results for Learn how to program
Abstract:
Over the last decades, research on narcissism was dominated by a focus on grandiose narcissism as measured by the NPI (Raskin & Terry, 1988). Recent discussions, however, emphasize the broad range of manifestations of narcissism, in particular its more vulnerable aspects. As a result, new questionnaires have been developed to cover the full range of these aspects. One example is the Pathological Narcissism Inventory (PNI; Pincus et al., 2009), a 52-item questionnaire with seven subscales covering both grandiose and vulnerable aspects. Validation studies show that narcissism as measured with the PNI differs substantially from narcissism as measured with the NPI. Moreover, a discussion concerning the composition of grandiose and vulnerable narcissism has evolved from these data. In our study we demonstrate how scores on narcissism and narcissism subtypes are associated with a broad variety of personality and clinical measures. In a sample of 1837 participants (1240 female, 597 male; mean age 26.8 years) we investigated the correlation patterns of both PNI and NPI subscales with constructs such as the FFM, aggression, emotions, clinical symptoms, and well-being. Results show that the assignment of subscales to grandiose and vulnerable subtypes is not unambiguous. We therefore conclude that the decision of how to measure narcissism needs further investigation.
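The correlational analysis described above can be sketched in a few lines of Python. The data below are randomly generated for illustration only (they are not the study's data), and the variable names are assumptions:

```python
import random

# Minimal sketch of a correlational analysis between a narcissism
# subscale score and a clinical measure, using Pearson's r.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

random.seed(0)
# hypothetical subscale scores (standardized, simulated)
pni_vulnerable = [random.gauss(0, 1) for _ in range(100)]
# a partially related score, mimicking a correlated clinical measure
symptoms = [0.6 * v + random.gauss(0, 0.8) for v in pni_vulnerable]

r = pearson(pni_vulnerable, symptoms)
assert -1.0 <= r <= 1.0  # a valid correlation coefficient
```

In a real analysis this computation would be repeated for every subscale-construct pair to build the correlation pattern the abstract refers to.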
Abstract:
BACKGROUND: Despite long-standing calls to disseminate evidence-based treatments for generalized anxiety disorder (GAD), only modest progress has been made in the study of how such treatments should be implemented. The primary objective of this study was to test three competing strategies for implementing a cognitive behavioral treatment (CBT) for out-patients with GAD (i.e., a comparison of one compensation vs. two capitalization models). METHODS: For our three-arm, single-blinded, randomized controlled trial (implementation of CBT for GAD [IMPLEMENT]), we recruited adults with GAD using advertisements in high-circulation newspapers to participate in a 14-session cognitive behavioral treatment (Mastery of your Anxiety and Worry, MAW packet). We randomly assigned eligible patients using a full randomization procedure (1:1:1) to three implementation conditions: adherence priming (compensation model), which had a systematized focus on patients' individual GAD symptoms and how to compensate for these symptoms within the MAW packet, and resource priming and supportive resource priming (capitalization models), which had systematized focuses on patients' strengths and abilities and how these strengths can be capitalized on within the same packet. In the intention-to-treat population, an outcome composite of primary and secondary symptom-related self-report questionnaires was analyzed based on a hierarchical linear growth model from intake to the 6-month follow-up assessment. This trial is registered at ClinicalTrials.gov (identifier: NCT02039193) and is closed to new participants. FINDINGS: From June 2012 to Nov. 2014, of 411 participants screened, 57 eligible participants were recruited and randomly assigned to the three conditions. Forty-nine patients (86%) provided outcome data at post-assessment (14% dropout rate). All three conditions showed a highly significant reduction of symptoms over time.
However, compared with the adherence priming condition, both resource priming conditions showed faster symptom reduction. Observer ratings of a sub-sample of recorded videos (n = 100) showed that therapists in the resource priming conditions conducted more strength-oriented interventions than in the adherence priming condition. No patients died or attempted suicide. INTERPRETATION: To our knowledge, this is the first trial that focuses on capitalization and compensation models during the implementation of a prescriptive treatment packet for GAD. We have shown that GAD-related symptoms were reduced significantly faster in the resource priming conditions, although the limitations of our study include a well-educated population. If replicated, our results suggest that therapists who implement a mental health treatment for GAD might profit from a systematized focus on capitalization models. FUNDING: Swiss National Science Foundation (SNSF-Nr. PZ00P1_136937/1) awarded to CF. KEYWORDS: Cognitive behavioral therapy; Evidence-based treatment; Implementation strategies; Randomized controlled trial
Abstract:
Cell competition is a conserved mechanism whereby slower-proliferating cells (so-called losers) are eliminated by faster-proliferating neighbors (so-called winners) through apoptosis.(1) It is an important process that prevents developmental malformations and maintains tissue fitness in aging adults.(2) Recently, we showed that the probability of elimination of loser cells correlates with the surface of contact between losers and winners in Myc-induced competition.(3) Moreover, we characterized an active mechanism that increases the surface of contact between losers and winners, hence accelerating the elimination of loser cells. This is the first indication that cell shape and mechanics can influence cell competition. Here, we discuss the consequences of the relationship between shape and competition, as well as the relevance of this model for other modes of competition.
Abstract:
What some view as overly generous funding of the Scottish parliament results from Scotland's credible threat to secede from the United Kingdom. Scotland is shown to benefit from a second-mover advantage in a non-cooperative sequential game over the allocation of public funds. Various reform proposals are criticized for not recognizing that reform of Scottish government finances must be consistent with Scotland's credible threat. Fiscal autonomy, in which the Scottish parliament finances a much greater proportion of its spending from Scottish-sourced taxes, is demonstrated to be a viable reform within the existing political context and, in some circumstances, could remove Scotland's second-mover advantage. We also use a cooperative bargaining game model to demonstrate that an Australian-style grants commission would not be a viable reform in the British context.
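The backward-induction logic behind such a second-mover advantage can be sketched in a few lines. All payoff numbers below are purely illustrative assumptions, not figures from the paper:

```python
# Toy two-stage game: the first mover (UK government) offers a grant g;
# the second mover (Scotland) then accepts the grant or secedes.

SECESSION_PAYOFF = 40   # Scotland's payoff from seceding (illustrative)
FAIR_SHARE = 30         # a needs-based allocation benchmark (illustrative)

def scotland_best_reply(g):
    # second mover: compare the grant against the outside option
    return "stay" if g >= SECESSION_PAYOFF else "secede"

def uk_optimal_grant():
    # first mover anticipates the reply: the cheapest grant that keeps
    # Scotland in the union equals Scotland's outside option
    for g in range(0, 101):
        if scotland_best_reply(g) == "stay":
            return g

g_star = uk_optimal_grant()
# the credible threat pushes equilibrium funding above the "fair" share
assert g_star == SECESSION_PAYOFF and g_star > FAIR_SHARE
```

The sketch shows why the equilibrium grant tracks the value of the outside option rather than any needs-based formula, which is the mechanism the abstract attributes to the credible secession threat.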
Abstract:
At the University of Connecticut, we have been enticing graduate students to join graduate student trainers to learn how to answer the following questions and improve the breadth of their research: Do you need to find articles published outside your primary discipline? What are some seminal articles in your field? Have you ever wanted to know who cited an article you wrote? We are participating in Elsevier's Student Ambassador Program (SAmP) in which graduate students train their peers on "citation searching" research using Scopus and Web of Science, two tremendous citation databases. We are in the fourth semester of these training programs, and they are wildly successful: We have offered more than 30 classes and taught more than 350 students from March 2007 through March 2008.
Abstract:
Transnational Corporations (TNCs) have played a vital role in fostering rapid industrialisation in many developing countries. The Philippines is a case in point. However, the country has lagged far behind other ASEAN members in economic performance. The present study examines this issue, focusing mainly on linkage formation between TNC affiliates and Philippine local suppliers. Three factors are proposed to determine the overall performance of linkage formation: the outsourcing strategies of TNCs' local affiliates, the local entrepreneurial response, and host government policies. An economic enclave structure is clearly identified in the Philippines, in which only a few locally owned suppliers have emerged. Extremely weak local entrepreneurship in the Philippines is identified as the explanation for the poor performance of linkage formation.
Abstract:
Global linear instability theory is concerned with the temporal or spatial development of small-amplitude perturbations superposed upon laminar steady or time-periodic three-dimensional flows, which are inhomogeneous in two (and periodic in one) or all three spatial directions.1 The theory addresses flows developing in complex geometries, in which the parallel or weakly nonparallel basic-flow approximation invoked by classic linear stability theory does not hold. As such, global linear theory is called upon to fill the gap in research into stability and transition in flows over or through complex geometries. Historically, global linear instability has been (and still is) concerned with the solution of multi-dimensional eigenvalue problems; the maturing of non-modal linear instability ideas in simple parallel flows during the last decade of the last century2–4 has given rise to the investigation of transient-growth scenarios in an ever-increasing variety of complex flows. After a brief exposition of the theory, connections are sought with established approaches for structure identification in flows, such as proper orthogonal decomposition and topology theory in the laminar regime, and the open areas for future research, mainly concerning turbulent and three-dimensional flows, are highlighted. Recent results obtained in our group are reported in both the time-stepping and the matrix-forming approaches to global linear theory. In the first context, progress has been made in implementing a Jacobian-free Newton-Krylov method into a standard finite-volume aerodynamic code, such that global linear instability results may now be obtained in compressible flows of aeronautical interest.
In the second context, a new stable very-high-order finite-difference method has been implemented for the spatial discretization of the operators describing the spatial BiGlobal EVP, PSE-3D, and the TriGlobal EVP; combined with sparse-matrix treatment, all these problems may now be solved on standard desktop computers.
Abstract:
Program specialization optimizes programs for known values of the input. It is often the case that the set of possible input values is unknown, or this set is infinite. However, a form of specialization can still be performed in such cases by means of abstract interpretation, specialization then being with respect to abstract values (substitutions) rather than concrete ones. We study the multiple specialization of logic programs based on abstract interpretation. This involves, in principle, and based on information from global analysis, generating several versions of a program predicate for different uses of that predicate, optimizing these versions, and, finally, producing a new, "multiply specialized" program. While multiple specialization has received theoretical attention, little previous evidence exists on its practicality. In this paper we report on the incorporation of multiple specialization in a parallelizing compiler and quantify its effects. A novel approach to the design and implementation of the specialization system is proposed. The implementation techniques presented yield specializations identical to those of the best previously proposed techniques but require little or no modification of some existing abstract interpreters. Our results show that, using the proposed techniques, the resulting "abstract multiple specialization" is indeed a relevant technique in practice. In particular, in the parallelizing compiler application, a good number of run-time tests are eliminated and invariants are extracted automatically from loops, resulting generally in lower overheads and, in several cases, in increased speedups.
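A loose Python analogue of the idea (the paper itself works on logic programs, and all names here are hypothetical): global analysis proves a property of the input at some call sites, and the specializer emits a version of the routine with the corresponding run-time tests removed, while other call sites keep the generic, fully guarded version.

```python
# Generic version: must test every element at run time.
def generic_sum(xs):
    total = 0
    for x in xs:
        if not isinstance(x, int):
            raise TypeError("non-integer element")
        total += x
    return total

# "Compiler" step: select or generate a version based on abstract
# information inferred by global analysis (here, a single boolean
# standing in for an abstract substitution).
def specialize(known_all_ints):
    if known_all_ints:
        def specialized_sum(xs):
            # run-time type tests eliminated: the invariant
            # "all elements are ints" was proven statically
            return sum(xs)
        return specialized_sum
    return generic_sum

fast_sum = specialize(known_all_ints=True)
# both versions agree wherever the proven invariant actually holds
assert fast_sum([1, 2, 3]) == generic_sum([1, 2, 3]) == 6
```

The multiply specialized program would contain both versions, with each call site wired to the most specialized version that its inferred abstract value permits.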
Abstract:
Program specialization optimizes programs for known values of the input. It is often the case that the set of possible input values is unknown, or this set is infinite. However, a form of specialization can still be performed in such cases by means of abstract interpretation, specialization then being with respect to abstract values (substitutions) rather than concrete ones. This paper reports on the application of abstract multiple specialization to automatic program parallelization in the &-Prolog compiler. Abstract executability, the main concept underlying abstract specialization, is formalized, the design of the specialization system is presented, and a non-trivial example of specialization in automatic parallelization is given.
Abstract:
Cognitive linguists have conscientiously pointed out the pervasiveness of the conceptual mappings, particularly conceptual blending and integration, that underlie language and are unconsciously used in everyday speech (Fauconnier 1997; Fauconnier & Turner 2002; Rohrer 2007; Grady, Oakley & Coulson 1999). Moreover, as a further development of this work, there is growing interest in research devoted to the conceptual mappings that make up specialized technical disciplines. Lakoff & Núñez (2000), for example, have produced a major breakthrough in the understanding of concepts in mathematics through conceptual metaphor, as a result not of purely abstract concepts but rather of embodiment. On the engineering and architecture front, analyses of the use of metaphor, blending, and categorization in English and Spanish have likewise appeared in recent times (Úbeda 2001; Roldán 1999; Caballero 2003a, 2003b; Roldán & Úbeda 2006; Roldán & Protasenia 2007). The present paper seeks to show a number of significant conceptual mappings underlying the language of architecture and civil engineering that seem to shape the way engineers and architects communicate. In order to work with a significant segment of linguistic expressions in this field, a corpus taken from a widely used technical Spanish engineering journal article was collected and analysed. Examination of the data obtained indicates that many tokens make direct reference to therapeutic conceptual mappings, highlighting medical domains such as diagnosing, treating, and curing. Hence, the paper illustrates how this notion is instantiated by the corresponding bodily conceptual integration. In addition, we wish to underline the function of visual metaphors in the world of modern architecture, evoking parts of human or animal anatomy, and how this is visibly noticeable in contemporary buildings and public works structures.
Abstract:
Over the last decade, Grid computing paved the way for a new level of large-scale distributed systems. This infrastructure made it possible to securely and reliably take advantage of widely separated computational resources belonging to several different organizations. Resources can be incorporated into the Grid, building a theoretical virtual supercomputer. In time, cloud computing emerged as a new type of large-scale distributed system, inheriting and expanding the expertise and knowledge obtained so far. Some of the main characteristics of Grids naturally evolved into clouds, others were modified and adapted, and others were simply discarded or postponed. Regardless of these technical specifics, Grids and clouds together can be considered one of the most important advances in large-scale distributed computing of the past ten years; however, this step in distributed computing has come along with a completely new level of complexity. Grid and cloud management mechanisms play a key role, and a correct analysis and understanding of system behavior are needed. Large-scale distributed systems must be able to self-manage, incorporating autonomic features capable of controlling and optimizing all resources and services. Traditional distributed computing management mechanisms analyze each resource separately and adjust specific parameters of each one. When trying to adapt the same procedures to Grid and cloud computing, the vast complexity of these systems can make this task extremely complicated. But the complexity of large-scale distributed systems could be merely a matter of perspective. It could be possible to understand Grid or cloud behavior as a single entity, instead of a set of resources. This abstraction could provide a different understanding of the system, describing large-scale behavior and global events that would probably not be detected by analyzing each resource separately.
In this work we define a theoretical framework that combines both ideas, multiple resources and a single entity, to develop large-scale distributed systems management techniques aimed at system performance optimization, increased dependability, and Quality of Service (QoS). The resulting synergy could be the key to addressing the most important difficulties of Grid and cloud management.
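The single-entity abstraction can be sketched as follows. All names (`Resource`, `GlobalView`, the threshold value) are illustrative assumptions, not the framework's actual API:

```python
# Sketch: aggregate per-resource metrics into one global state and take
# management decisions on that state, instead of tuning each resource
# separately.

class Resource:
    def __init__(self, name, cpu_load, queue_len):
        self.name = name
        self.cpu_load = cpu_load    # utilization fraction in [0, 1]
        self.queue_len = queue_len  # pending jobs on this resource

class GlobalView:
    """Treat the whole Grid/cloud as a single entity."""
    def __init__(self, resources):
        self.resources = resources

    def state(self):
        n = len(self.resources)
        return {
            "mean_load": sum(r.cpu_load for r in self.resources) / n,
            "total_queue": sum(r.queue_len for r in self.resources),
        }

    def decide(self, load_threshold=0.8):
        # detect a large-scale event (global saturation) that
        # per-resource analysis could miss if no single node
        # crosses its own local limit
        if self.state()["mean_load"] > load_threshold:
            return "scale_out"
        return "steady"

view = GlobalView([Resource("n1", 0.9, 4), Resource("n2", 0.85, 7)])
assert view.decide() == "scale_out"
```

The point of the abstraction is the `decide` step: it reacts to a property of the aggregate (mean load) rather than to any individual resource's parameters.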