995 results for Printz, Johan, 1592-1663.


Relevance: 10.00%

Abstract:

This chapter shows that, apart from changes at the systemic and institutional levels, successful reform implementation hinges on a gradual change in academic beliefs, attitudes and behaviours. Currently, the visions of the university held by the Polish academic community and those proposed by Polish reformers and policymakers (within the ongoing reforms) are worlds apart. I study the recent reforms in the context of the specific academic self-protective narratives produced over the last two decades (at the collective level of the academic profession) and in the context of the Ivory Tower university ideals predominant at the individual level (as studied comparatively through a large-scale European survey of the academic profession). Institutions change both swiftly and radically, and slowly and gradually. Until recently, the research literature on institutional change focused almost exclusively on the role of radical changes caused by external shocks, leading to radical institutional reconfigurations. Research literature on gradual, incremental institutional change has been emerging for about a decade and a half (Mahoney and Thelen 2010; Streeck and Thelen 2005, 2009; Thelen 2003). Polish higher education provides interesting empirical grounds on which to test institutional theories. Both types of transformation (radical and gradual) may lead to equally permanent changes in the functioning of institutions and to equally deep transformations of their fundamental rules, norms and operating procedures. Questions about institutional change are questions about the characteristics of the institutions undergoing change. Endogenous institutional change is as important as exogenous change (Mahoney and Thelen 2010: 3). The moments in which opportunities for deep institutional reform emerge are short (in Poland they occurred in 2009-2012), and between them lie long periods of institutional stasis and stability (Pierson 2004: 134-135).
The premises of theories of institutional change can be applied systematically to a system of higher education which shows an unprecedented rate of change and which is exposed to broad, fundamental reform programmes. There are many ways to discuss the Kudrycka reforms, and "constructing Polish universities as organizations" (rather than as traditional academic "institutions") is one of the more promising. In this account, Polish universities are under construction as organizations and under siege as institutions. They are being rationalized as organizations, following instrumental rather than institutional logics. Polish academics, in their views and attitudes, still follow an institutional logic, while the Polish reforms follow the new (New Public Management-led) instrumental logic. The two are on a collision course over basic values; reforms and reformees seem to be worlds apart. I discuss the two contrasting visions of the university and describe the Kudrycka reforms as the reinstitutionalization of the research mission of Polish universities. The core of the reforms is a new level of funding and governance, the intermediary one (no longer the state one), with four new peer-run institutions, with the KEJN, PKA and NCN in the lead. Since 2009 Poland has begun to follow the "global rules of the academic game". I also discuss two academic modes of self-protection against reforms: (Polish) "national academic traditions" and the "institutional exceptionalism" (of Polish higher education). Both discourses prevailed for two decades; neither seems socially (or politically) acceptable any more. Old myths do not seem to fit new realities. In this context I briefly discuss, drawing on the large-scale EUROAC/CAP micro-level data, the low connectedness of Polish higher education institutions to the outside world, the low influence of the government on higher education policies, and the low level of academic entrepreneurialism.
The conclusion is that the Kudrycka reforms are only an important first step: Poland is reforming too slowly, and the reforms are both underfunded and inconsistent. Poland is still accumulating disadvantages, as public funding and university reforms have not reached a critical point. Ever more effort leads to ever fewer results, as macro-level data show. Consequently, it may be useful to construct universities as organizations in Poland to a higher degree than elsewhere in Europe, especially Western Europe.

Relevance: 10.00%

Abstract:

Recent measurements of local-area and wide-area traffic have shown that network traffic exhibits variability at a wide range of scales (self-similarity). In this paper, we examine a mechanism that gives rise to self-similar network traffic and present some of its performance implications. The mechanism we study is the transfer of files or messages whose size is drawn from a heavy-tailed distribution. We examine its effects through detailed transport-level simulations of multiple TCP streams in an internetwork. First, we show that in a "realistic" client/server network environment (i.e., one with bounded resources and coupling among traffic sources competing for those resources), the degree to which file sizes are heavy-tailed can directly determine the degree of traffic self-similarity at the link level. We show that this causal relationship is not significantly affected by changes in network resources (bottleneck bandwidth and buffer capacity), network topology, the influence of cross-traffic, or the distribution of interarrival times. Second, we show that properties of the transport layer play an important role in preserving and modulating this relationship. In particular, the reliable transmission and flow control mechanisms of TCP (Reno, Tahoe, or Vegas) serve to maintain the long-range dependence structure induced by heavy-tailed file size distributions. In contrast, if a non-flow-controlled and unreliable (UDP-based) transport protocol is used, the resulting traffic shows little self-similarity: although still bursty at short time scales, it has little long-range dependence. If flow-controlled, unreliable transport is employed, the degree of traffic self-similarity is positively correlated with the degree of throttling at the source. Third, in exploring the relationship between file sizes, transport protocols, and self-similarity, we are also able to show some of the performance implications of self-similarity.
We present data on the relationship between traffic self-similarity and network performance, as captured by measures including packet loss rate, retransmission rate, and queueing delay. Increased self-similarity, as expected, results in degraded performance. Queueing delay, in particular, increases drastically with increasing self-similarity. Throughput-related measures such as packet loss and retransmission rate, however, increase only gradually with increasing traffic self-similarity as long as a reliable, flow-controlled transport protocol is used.
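The heavy-tailed file-size mechanism at the heart of this abstract can be illustrated with a short simulation (a toy sketch, not the paper's transport-level setup; the shape parameter and sample size are arbitrary choices): drawing sizes from a Pareto distribution with shape alpha below 2 produces occasional enormous transfers that dominate total traffic, which is the property that induces long-range dependence at the link level.

```python
import random

random.seed(7)

ALPHA = 1.2    # Pareto shape; alpha <= 2 means infinite variance (heavy tail)
N = 100_000    # number of simulated file transfers

# File sizes drawn from a Pareto distribution (minimum size 1 unit).
sizes = [random.paretovariate(ALPHA) for _ in range(N)]
sizes.sort()

total = sum(sizes)
top_1pct = sum(sizes[-N // 100:])  # bytes contributed by the largest 1% of files

print(f"largest transfer / median transfer: {sizes[-1] / sizes[N // 2]:.0f}x")
print(f"share of traffic from the top 1% of files: {top_1pct / total:.0%}")
```

With alpha = 1.2, roughly half the simulated bytes come from the largest 1% of transfers; an exponential distribution with the same mean shows no such concentration, which is why the tail of the file-size distribution, not its mean, drives the observed self-similarity.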

Relevance: 10.00%

Abstract:

As new multi-party edge services are deployed on the Internet, application-layer protocols with complex communication models and event dependencies are increasingly being specified and adopted. To ensure that such protocols (and compositions thereof with existing protocols) do not result in undesirable behaviors (e.g., livelocks) there needs to be a methodology for the automated checking of the "safety" of these protocols. In this paper, we present ingredients of such a methodology. Specifically, we show how SPIN, a tool from the formal systems verification community, can be used to quickly identify problematic behaviors of application-layer protocols with non-trivial communication models—such as HTTP with the addition of the "100 Continue" mechanism. As a case study, we examine several versions of the specification for the Continue mechanism; our experiments mechanically uncovered multi-version interoperability problems, including some which motivated revisions of HTTP/1.1 and some which persist even with the current version of the protocol. One such problem resembles a classic degradation-of-service attack, but can arise between well-meaning peers. We also discuss how the methods we employ can be used to make explicit the requirements for hardening a protocol's implementation against potentially malicious peers, and for verifying an implementation's interoperability with the full range of allowable peer behaviors.
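The kind of interoperability check described above can be sketched in miniature (in Python rather than SPIN's Promela, and with a deliberately simplified, hypothetical protocol model): exhaustively explore the joint state space of two peers and flag any reachable non-final state with no enabled transition. The model below caricatures the "100 Continue" mismatch: a client that waits for the interim response, paired with a server that may or may not send it.

```python
from collections import deque

def successors(state, server_sends_100):
    """Enabled transitions from a joint state.
    state = (client, server, to_client, to_server); channels are tuples."""
    client, server, to_client, to_server = state
    out = []
    # A spec-compliant server emits the interim "100 Continue"; an old one skips it.
    if server == "start":
        msgs = to_client + ("100",) if server_sends_100 else to_client
        out.append((client, "wait_body", msgs, to_server))
    # Client waits for "100 Continue" before sending the request body.
    if client == "wait_100" and "100" in to_client:
        rest = tuple(m for m in to_client if m != "100")
        out.append(("send_body", server, rest, to_server))
    if client == "send_body":
        out.append(("done", server, to_client, to_server + ("body",)))
    # Server waits for the body, then finishes.
    if server == "wait_body" and "body" in to_server:
        rest = tuple(m for m in to_server if m != "body")
        out.append((client, "done", to_client, rest))
    return out

def find_deadlock(server_sends_100):
    """BFS over the joint state space; return a deadlocked state, or None."""
    init = ("wait_100", "start", (), ())
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        nxt = successors(s, server_sends_100)
        if not nxt and (s[0], s[1]) != ("done", "done"):
            return s  # non-final state with no enabled transition
        for t in nxt:
            if t not in seen:
                seen.add(t)
                queue.append(t)
    return None

print(find_deadlock(server_sends_100=True))   # None: compliant peers finish
print(find_deadlock(server_sends_100=False))  # both sides wait forever
```

A real model checker like SPIN adds temporal-logic properties, partial-order reduction, and counterexample traces, but the core idea is the same: the mismatch is found mechanically, as a reachable state of the composed machines rather than by manual inspection of the specification.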

Relevance: 10.00%

Abstract:

It has been shown previously that female mice homozygous for an alpha-fetoprotein (AFP) null allele are sterile as a result of anovulation, probably due to a defect in the hypothalamic-pituitary axis. Here we show that these female mice exhibit specific anomalies in the expression of numerous genes in the pituitary, including genes involved in the gonadotropin-releasing hormone pathway, which are underexpressed. In the hypothalamus, the gonadotropin-releasing hormone gene, Gnrh1, was also found to be down-regulated. However, pituitary gene expression could be normalized and fertility could be rescued by blocking prenatal estrogen synthesis using an aromatase inhibitor. These results show that AFP protects the developing female brain from the adverse effects of prenatal estrogen exposure and clarify a long-running debate on the role of this fetal protein in brain sexual differentiation.

Relevance: 10.00%

Abstract:

To make adaptive choices, individuals must sometimes exhibit patience, forgoing immediate benefits to acquire more valuable future rewards [1-3]. Although humans account for future consequences when making temporal decisions [4], many animal species wait only a few seconds for delayed benefits [5-10]. Current research thus suggests a phylogenetic gap between patient humans and impulsive, present-oriented animals [9, 11], a distinction with implications for our understanding of economic decision making [12] and the origins of human cooperation [13]. On the basis of a series of experimental results, we reject this conclusion. First, bonobos (Pan paniscus) and chimpanzees (Pan troglodytes) exhibit a degree of patience not seen in other animals tested thus far. Second, humans are less willing to wait for food rewards than are chimpanzees. Third, humans are more willing to wait for monetary rewards than for food, and show the highest degree of patience only in response to decisions about money involving low opportunity costs. These findings suggest that core components of the capacity for future-oriented decisions evolved before the human lineage diverged from apes. Moreover, the different levels of patience that humans exhibit might be driven by fundamental differences in the mechanisms representing biological versus abstract rewards.

Relevance: 10.00%

Abstract:

The reminiscence bump is the tendency to recall more autobiographical memories from adolescence and early adulthood than from adjacent lifetime periods. In this online study, the robustness of the reminiscence bump was examined by looking at participants' judgements about the quality of football players. Dutch participants (N = 619) were asked who they thought the five best players of all time were. The participants could select the names from a list or enter the names when their favourite players were not on the list. Johan Cruijff, Pelé, and Diego Maradona were the three most often mentioned players. Participants frequently named football players who reached the midpoint of their career when the participants were adolescents (mode = 17). The results indicate that the reminiscence bump can also be identified outside the autobiographical memory domain.

Relevance: 10.00%

Abstract:

Computed tomography (CT) is one of the most valuable modalities for in vivo imaging because it is fast, high-resolution, cost-effective, and non-invasive. Moreover, CT is heavily used not only in the clinic (for both diagnostics and treatment planning) but also in preclinical research as micro-CT. Although CT is inherently effective for lung and bone imaging, soft tissue imaging requires the use of contrast agents. For small animal micro-CT, nanoparticle contrast agents are used in order to avoid rapid renal clearance. A variety of nanoparticles have been used for micro-CT imaging, but the majority of research has focused on the use of iodine-containing nanoparticles and gold nanoparticles. Both nanoparticle types can act as highly effective blood pool contrast agents or can be targeted using a wide variety of targeting mechanisms. CT imaging can be further enhanced by adding spectral capabilities to separate multiple co-injected nanoparticles in vivo. Spectral CT, using both energy-integrating and energy-resolving detectors, has been used with multiple contrast agents to enable functional and molecular imaging. This review focuses on new developments for in vivo small animal micro-CT using novel nanoparticle probes applied in preclinical research.

Relevance: 10.00%

Abstract:

All biological phenomena depend on molecular recognition, which is either intermolecular, as in ligand binding to a macromolecule, or intramolecular, as in protein folding. Understanding the relationship between the structure of proteins and the energetics of their stability and of their binding to other (bio)molecules is therefore of great interest in biochemistry and biotechnology, and is essential to the engineering of stable proteins and to the structure-based design of pharmaceutical ligands. The parameter generally used to characterize the stability of a system (for example, the folded and unfolded states of a protein) is the equilibrium constant (K) or the free energy (ΔG°), which is composed of enthalpic (ΔH°) and entropic (ΔS°) terms. These parameters are temperature dependent through the heat capacity change (ΔCp). The thermodynamic parameters ΔH° and ΔCp can be derived from spectroscopic experiments using the van't Hoff method, or measured directly by calorimetry. Along with isothermal titration calorimetry (ITC), differential scanning calorimetry (DSC) is a powerful method, less often described than ITC, for directly measuring the thermodynamic parameters that characterize biomolecules. In this article, we summarize the principal thermodynamic parameters, describe the DSC approach, and review some systems to which it has been applied. DSC is widely used to study the stability and folding of biomolecules, but it can also be applied to understanding biomolecular interactions and can thus be a valuable technique in the process of drug design.
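The thermodynamic relations summarized above take the standard form below (with T_0 a reference temperature, R the gas constant, and ΔCp assumed temperature independent over the range of interest):

```latex
\Delta G^{\circ}(T) = \Delta H^{\circ}(T) - T\,\Delta S^{\circ}(T),
\qquad
K(T) = \exp\!\left(-\frac{\Delta G^{\circ}(T)}{RT}\right)

\Delta H^{\circ}(T) = \Delta H^{\circ}(T_0) + \Delta C_p\,(T - T_0),
\qquad
\Delta S^{\circ}(T) = \Delta S^{\circ}(T_0) + \Delta C_p \ln\frac{T}{T_0}

\frac{d \ln K}{d(1/T)} = -\frac{\Delta H^{\circ}}{R}
\quad \text{(van't Hoff)}
```

The last relation is what the van't Hoff method exploits: ΔH° is obtained from the slope of ln K against 1/T measured spectroscopically, whereas DSC measures the excess heat capacity directly and yields ΔH° (and ΔCp) model-free from the area and baseline shift of the transition peak.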

Relevance: 10.00%

Abstract:

We present preliminary results of a qualitative study that seeks to identify the reasoning characteristics displayed by fifth-grade students when faced with variational situations; these characteristics are discussed in light of the conceptual framework for covariation proposed by Carlson, Jacobs, Coe, Larsen, and Hsu (2003). From the situations, some implications and recommendations for classroom implementation are derived, specifically for approaching notions such as function and rate of change, which lie at the foundations of covariational reasoning and can be addressed from the earliest school grades as a way of laying the groundwork for understanding the most important concepts of calculus.

Relevance: 10.00%

Abstract:

This study reports the implementation of a pedagogical proposal to support the teaching of calculus through problem solving at the pre-university level in Costa Rica. The project originates in the difficulties students have in understanding basic calculus concepts, specifically limits and derivatives. The experience was based on the design of a "problem situation" that provoked an intellectual conflict in the students, while the teacher acted as a mediator and drew on the students' discoveries to ground the different concepts theoretically after the proposal was applied. The results obtained are very positive and justify the need for a change in the methodological strategies used to teach calculus. However, teachers need to become familiar with problem-solving theory in order to apply this kind of activity successfully.

Relevance: 10.00%

Abstract:

This paper aims to show, through classroom experiences, that the problem-solving methodology proposed by Pólya (1965), Schoenfeld (1985) and Brousseau (1986) develops basic, generic and specific competencies. The results show that the problem-solving activities promoted reading comprehension, teamwork, the capacity for reasoning and for argumentation in front of peers, logical recognition skills, the discovery of patterns, the exploration of similar problems, the reformulation of problems, working backwards, the active participation of students, and the development of leaders (Espinoza et al., 2008).

Relevance: 10.00%

Abstract:

The mathematics curricula in Costa Rica propose problem solving in real contexts as the main methodological strategy and problem posing as one of the five mathematical processes. This study therefore analyses some of the elements involved in the teaching and learning of mathematical content using this strategy, and the role of problem posing as a complementary activity in that process. The results show the importance of the teacher's work as organizer and guide of the class, and of the student as the one responsible for solving the problem, as well as the great educational value of problem posing within the problem-solving process.

Relevance: 10.00%

Abstract:

When beginning the study of basic concepts of mathematical analysis, we frequently encounter difficulties and errors related to division by zero. The need to respond to this problem gives rise to the present work, which draws on the answers given by a group of secondary-school students; these answers constitute the evidence on which the research process, now in its first stage and whose partial results are presented here, is based. The work is framed in the socioepistemological perspective, investigating the origins and evolution of this knowledge, analysing the scope and effects of the school mathematical discourse in force in secondary education, and considering students' conceptions of zero and division as constructed in both school and out-of-school settings.

Relevance: 10.00%

Abstract:

The aim of integrating computational mechanics (FEA and CFD) with optimization tools is to dramatically speed up the design process in application areas concerning reliability in electronic packaging. Design engineers in the electronics manufacturing sector may use these tools to predict the key design parameters and configurations (i.e., material properties, product dimensions, design at the PCB level, etc.) that will guarantee the required product performance. In this paper a modeling strategy coupling computational mechanics techniques with numerical optimization is presented and demonstrated on two problems. The integrated modeling framework is obtained by coupling the multi-physics analysis tool PHYSICA with the numerical optimization package VisualDOC into a fully automated design tool for applications in electronic packaging. Thermo-mechanical simulations of solder creep deformation are presented to predict flip-chip reliability and lifetime under thermal cycling. A thermal management design based on multi-physics analysis with coupled thermal-flow-stress modeling is also discussed. The response surface modeling approach, in conjunction with Design of Experiments statistical tools, is demonstrated and used subsequently by the numerical optimization techniques as part of this modeling framework. Predictions for reliable electronic assemblies are achieved in an efficient and systematic manner.
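The response-surface step described above can be sketched in one variable (a toy illustration with made-up sample data, not the paper's PHYSICA/VisualDOC workflow): a quadratic surrogate y ≈ a + b·x + c·x² is fitted by least squares to a handful of Design-of-Experiments sample points, and the surrogate's analytic minimum then stands in for expensive simulation runs during optimization.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the 3x3 normal equations."""
    n = len(xs)
    s = lambda p: sum(x**p for x in xs)                   # power sums of x
    t = lambda p: sum(y * x**p for x, y in zip(xs, ys))   # cross sums
    # Normal equations M @ [a, b, c] = v
    M = [[n,    s(1), s(2)],
         [s(1), s(2), s(3)],
         [s(2), s(3), s(4)]]
    v = [t(0), t(1), t(2)]
    # Gaussian elimination on the small fixed-size system.
    for i in range(3):
        for j in range(i + 1, 3):
            f = M[j][i] / M[i][i]
            M[j] = [mj - f * mi for mj, mi in zip(M[j], M[i])]
            v[j] -= f * v[i]
    c = v[2] / M[2][2]
    b = (v[1] - M[1][2] * c) / M[1][1]
    a = (v[0] - M[0][1] * b - M[0][2] * c) / M[0][0]
    return a, b, c

# Hypothetical DOE sample: a response (e.g. peak solder strain) at five
# settings of one design variable; the values are invented for illustration.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [4.1, 2.3, 1.6, 2.2, 4.0]

a, b, c = fit_quadratic(xs, ys)
x_opt = -b / (2 * c)   # minimum of the surrogate (valid since c > 0 here)
print(f"surrogate minimum near x = {x_opt:.2f}")
```

In the multi-variable case the surrogate gains cross terms and the DOE points come from a designed experiment rather than a uniform grid, but the division of labour is the same: the simulations are spent on fitting the surrogate, and the optimizer then searches the cheap analytic surface.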