958 results for Paradigm vitalist


Relevance:

10.00%

Publisher:

Abstract:

Even though the concept of incentive has become very popular in Finnish welfare politics since the economic crisis of the 1990s, the content of this concept is not clear. Fundamentally, it is a matter of controlling the behaviour of individuals to accord with the authorities' objectives and interests in gaining cooperative benefits. As early as in Plato's Republic, citizens were encouraged to use their abilities and skills in a way most beneficial to the society. Similarly, in today's welfare society citizens are urged to produce common goods and distribute welfare to enable a better life for all through cooperation. The fundamental question is to what extent society can shape individuals' preferences with incentives, and encourage them without external coercion to choose actions beneficial for both the society and the individuals themselves. The objective of the incentive institution is to gain cooperative benefits, but there are different views on how it should be implemented. For example, the incentive system in the Finnish welfare society includes several economic and social conceptions which adjust the distribution of welfare. From an economic perspective, the objective of the incentive system is economic efficiency, while from a social perspective it is the securing of social rights and citizens' equality. The market mechanism, for example, can at best lead to economically efficient activity, but it might sacrifice fairness and equality. In this research, the idea of activation policy expands to cover normative and social incentives, in addition to the economic factors affecting human choice and social actions. Desirable co-living and meaningful cooperation have some prerequisites. We need the expanded idea of activation to study them, and to maintain them in society. The themes discussed in all ten chapters aim at evaluating the preconditions of a just society. This study provides tools to examine the changes in the welfare state, also from the viewpoint of normative ethics. This offers a morally and conceptually wider perspective than a normative viewpoint of economics alone. In terms of the values of our welfare society, it makes a difference how the relationship between the legalities of economics and citizens' well-being is understood. The research asks whether economic benefits to the society should be allowed to supersede the principles of human dignity. Keywords: incentives, activation policy, morality, social philosophy, social justice, policy paradigm

Relevance:

10.00%

Publisher:

Abstract:

Arts education research, as an interdisciplinary field, has developed in the shadows of a number of research traditions. However, amid all the methodological innovation, I believe there is one particular, distinctive and radical research strategy which arts educators have created to research the practice of arts education: namely arts-based research. For many, and Elliot Eisner from Stanford University was among the first, arts education needed a research approach which could deal with the complex dynamics of arts education in the classroom. What was needed was ‘an approach to the conduct of educational research that was rooted in the arts and that used aesthetically crafted forms to reveal aspects of practice that mattered educationally’ (Eisner 2006: 11). While arts education researchers were crafting the principles and practices of arts-based research, fellow artist/researchers in the creative arts were addressing similar needs and fashioning their own exacting research strategies. This chapter aligns arts-based research with the complementary research practices established in creative arts studios and identifies the shared and truly radical nature of these moves. Finally, and in a contemporary turn many will find surprising, I will discuss how the radical aspects of these methodologies are now being held up as core elements of what is being called the fourth paradigm of scientific research, known as eScience. Could it be that the radical dynamics of arts-based research pre-figured the needs of eScience researchers who are currently struggling to manage the ‘deluge of Big Data’ which is disrupting their well-established scientific methods?

Relevance:

10.00%

Publisher:

Abstract:

The TWIK-related K+ channel TREK1, a background leak K+ channel, has been strongly implicated as the target of several general and local anesthetics. Here, using the whole-cell and single-channel patch-clamp technique, we investigated the effect of lidocaine, a local anesthetic, on the human (h) TREK1 channel heterologously expressed in human embryonic kidney 293 cells by an adenoviral-mediated expression system. Lidocaine, at clinical concentrations, produced reversible, concentration-dependent inhibition of hTREK1 current, with an IC50 value of 180 μM, by reducing the single-channel open probability and stabilizing the closed state. We have identified a strategically placed unique aromatic couplet (Tyr352 and Phe355) in the vicinity of the protein kinase A phosphorylation site, Ser348, in the C-terminal domain (CTD) of hTREK1, that is critical for the action of lidocaine. Furthermore, the phosphorylation state of Ser348 was found to have a regulatory role in lidocaine-mediated inhibition of hTREK1. Interestingly, we observed strong intersubunit negative cooperativity (Hill coefficient = 0.49) and half-of-sites saturation binding stoichiometry (half-reaction order) for the binding of lidocaine to hTREK1. Studies with the heterodimer of wild-type (wt) hTREK1 and the Δ119 C-terminal deletion mutant (hTREK1wt-Δ119) revealed that a single CTD of hTREK1 was capable of mediating partial inhibition by lidocaine, but complete inhibition necessitates cooperative interaction between both CTDs upon binding of lidocaine. Based on our observations, we propose a model that explains the unique kinetics and provides a plausible paradigm for the inhibitory action of lidocaine on hTREK1.
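The reported concentration dependence (IC50 of 180 μM, Hill coefficient of 0.49) follows the standard Hill formalism. The sketch below is a minimal, hypothetical illustration of fitting such a dose-inhibition curve with SciPy; the data points are synthetic and this is not the authors' analysis pipeline.

```python
# Minimal sketch: fitting a Hill-type inhibition curve, as one might do for
# concentration-dependent block of hTREK1 current. Synthetic data only; the
# IC50 (180 uM) and Hill coefficient (0.49) are taken from the abstract as
# illustrative targets, not re-derived from real recordings.
import numpy as np
from scipy.optimize import curve_fit

def hill_inhibition(conc_uM, ic50_uM, n_hill):
    """Fraction of current remaining at a given blocker concentration."""
    return 1.0 / (1.0 + (conc_uM / ic50_uM) ** n_hill)

# Hypothetical concentrations (uM) and fractional currents with a little noise.
rng = np.random.default_rng(0)
conc = np.array([10, 30, 100, 180, 300, 1000, 3000], dtype=float)
frac_obs = hill_inhibition(conc, 180.0, 0.49) + rng.normal(0, 0.02, size=conc.size)

(ic50_fit, n_fit), _ = curve_fit(hill_inhibition, conc, frac_obs, p0=[100.0, 1.0])
print(f"fitted IC50 = {ic50_fit:.0f} uM, Hill coefficient = {n_fit:.2f}")
```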

Relevance:

10.00%

Publisher:

Abstract:

A vast amount of public services and goods are contracted through procurement auctions. Therefore it is very important to design these auctions in an optimal way. Typically, we are interested in two different objectives. The first objective is efficiency. Efficiency means that the contract is awarded to the bidder that values it the most, which in the procurement setting means the bidder that has the lowest cost of providing a service with a given quality. The second objective is to maximize public revenue. Maximizing public revenue means minimizing the costs of procurement. Both of these goals are important from the welfare point of view. In this thesis, I analyze field data from procurement auctions and show how empirical analysis can be used to help design the auctions to maximize public revenue. In particular, I concentrate on how competition, which means the number of bidders, should be taken into account in the design of auctions. In the first chapter, the main policy question is whether the auctioneer should spend resources to induce more competition. The information paradigm is essential in analyzing the effects of competition. We talk of a private values information paradigm when the bidders know their valuations exactly. In a common value information paradigm, the information about the value of the object is dispersed among the bidders. With private values more competition always increases the public revenue but with common values the effect of competition is uncertain. I study the effects of competition in the City of Helsinki bus transit market by conducting tests for common values. I also extend an existing test by allowing bidder asymmetry. The information paradigm seems to be that of common values. The bus companies that have garages close to the contracted routes are influenced more by the common value elements than those whose garages are further away. Therefore, attracting more bidders does not necessarily lower procurement costs, and thus the City should not implement costly policies to induce more competition. In the second chapter, I ask how the auctioneer can increase its revenue by changing contract characteristics like contract sizes and durations. I find that the City of Helsinki should shorten the contract duration in the bus transit auctions because that would decrease the importance of common value components and cheaply increase entry which now would have a more beneficial impact on the public revenue. Typically, cartels decrease the public revenue in a significant way. In the third chapter, I propose a new statistical method for detecting collusion and compare it with an existing test. I argue that my test is robust to unobserved heterogeneity unlike the existing test. I apply both methods to procurement auctions that contract snow removal in schools of Helsinki. According to these tests, the bidding behavior of two of the bidders seems consistent with a contract allocation scheme.
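As a deliberately simplified illustration of the reduced-form relationship between competition and winning bids that motivates such tests, the sketch below regresses simulated winning bids on the number of bidders; the data, variable names and specification are assumptions, and this is not the common-values test developed in the thesis.

```python
# Illustrative sketch, not the thesis's test: examine how the winning bid in a
# low-bid procurement auction moves with the number of bidders. Data are
# simulated; the simple OLS specification is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_auctions = 200
n_bidders = rng.integers(2, 9, size=n_auctions)          # competition per auction
cost_index = rng.normal(100.0, 5.0, size=n_auctions)     # observable cost shifter

# Simulate winning bids that fall with competition (a private-values pattern).
winning_bid = cost_index + 20.0 / n_bidders + rng.normal(0, 1.0, size=n_auctions)

# OLS of winning bid on an intercept, the number of bidders and the cost shifter.
X = np.column_stack([np.ones(n_auctions), n_bidders, cost_index])
beta, *_ = np.linalg.lstsq(X, winning_bid, rcond=None)
print(f"estimated effect of one extra bidder on the winning bid: {beta[1]:.3f}")
```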

Relevance:

10.00%

Publisher:

Abstract:

Many species inhabit fragmented landscapes, resulting either from anthropogenic or from natural processes. The ecological and evolutionary dynamics of spatially structured populations are affected by a complex interplay between endogenous and exogenous factors. The metapopulation approach, simplifying the landscape to a discrete set of patches of breeding habitat surrounded by unsuitable matrix, has become a widely applied paradigm for the study of species inhabiting highly fragmented landscapes. In this thesis, I focus on the construction of biologically realistic models and their parameterization with empirical data, with the general objective of understanding how the interactions between individuals and their spatially structured environment affect ecological and evolutionary processes in fragmented landscapes. I study two hierarchically structured model systems, which are the Glanville fritillary butterfly in the Åland Islands, and a system of two interacting aphid species in the Tvärminne archipelago, both being located in South-Western Finland. The interesting and challenging feature of both study systems is that the population dynamics occur over multiple spatial scales that are linked by various processes. My main emphasis is in the development of mathematical and statistical methodologies. For the Glanville fritillary case study, I first build a Bayesian framework for the estimation of death rates and capture probabilities from mark-recapture data, with the novelty of accounting for variation among individuals in capture probabilities and survival. I then characterize the dispersal phase of the butterflies by deriving a mathematical approximation of a diffusion-based movement model applied to a network of patches. I use the movement model as a building block to construct an individual-based evolutionary model for the Glanville fritillary butterfly metapopulation. I parameterize the evolutionary model using a pattern-oriented approach, and use it to study how the landscape structure affects the evolution of dispersal. For the aphid case study, I develop a Bayesian model of hierarchical multi-scale metapopulation dynamics, where the observed extinction and colonization rates are decomposed into intrinsic rates operating specifically at each spatial scale. In summary, I show how analytical approaches, hierarchical Bayesian methods and individual-based simulations can be used individually or in combination to tackle complex problems from many different viewpoints. In particular, hierarchical Bayesian methods provide a useful tool for decomposing ecological complexity into more tractable components.
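To make the metapopulation paradigm concrete, here is a minimal stochastic patch occupancy sketch with constant, invented extinction and colonisation parameters; it is far simpler than the hierarchical Bayesian, multi-scale models developed in the thesis.

```python
# Minimal stochastic patch occupancy model (SPOM) sketch: a network of habitat
# patches flips between occupied and empty through colonisation and extinction.
# Rates and patch numbers are invented for illustration only.
import numpy as np

rng = np.random.default_rng(2)
n_patches, n_years = 50, 100
p_extinction = 0.2          # yearly probability an occupied patch goes extinct
c_rate = 0.05               # per-occupied-patch contribution to colonisation

occupied = rng.random(n_patches) < 0.5          # initial occupancy
for year in range(n_years):
    n_occupied = occupied.sum()
    p_colonisation = 1.0 - np.exp(-c_rate * n_occupied)   # rises with occupancy
    survive = occupied & (rng.random(n_patches) >= p_extinction)
    colonise = (~occupied) & (rng.random(n_patches) < p_colonisation)
    occupied = survive | colonise

print(f"fraction of patches occupied after {n_years} years: {occupied.mean():.2f}")
```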

Relevance:

10.00%

Publisher:

Abstract:

The symbols, signs, and traces of copyright and related intellectual property laws that appear on everyday texts, objects, and artifacts have multiplied exponentially over the past 15 years. Digital spaces have revolutionized access to content and transformed the ways in which content is porous and malleable. In this volume, contributors focus on copyright as it relates to culture. The editors argue that what «counts» as property must be understood as shifting terrain deeply influenced by historical, economic, cultural, religious, and digital perspectives. Key themes addressed include issues of how:
• Culture is framed, defined, and/or identified in conversations about intellectual property;
• The humanities and other related disciplines are implicated in intellectual property issues;
• The humanities will continue to rub up against copyright (e.g., issues of authorship, authorial agency, ownership of texts);
• Different cultures and bodies of literature approach intellectual property, and how competing dynasties and marginalized voices exist beyond the dominant U.S. copyright paradigm.
Offering a transnational and interdisciplinary perspective, Cultures of Copyright offers readers – scholars, researchers, practitioners, theorists, and others – key considerations to contemplate in terms of how we understand copyright’s past and how we chart its futures.

Relevance:

10.00%

Publisher:

Abstract:

Multiresolution synthetic aperture radar (SAR) image formation has been proven to be beneficial in a variety of applications such as improved imaging and target detection as well as speckle reduction. SAR signal processing traditionally carried out in the Fourier domain has inherent limitations in the context of image formation at hierarchical scales. We present a generalized approach to the formation of multiresolution SAR images using biorthogonal shift-invariant discrete wavelet transform (SIDWT) in both range and azimuth directions. Particularly in azimuth, the inherent subband decomposition property of wavelet packet transform is introduced to produce multiscale complex matched filtering without involving any approximations. This generalized approach also includes the formulation of multilook processing within the discrete wavelet transform (DWT) paradigm. The efficiency of the algorithm in parallel form of execution to generate hierarchical scale SAR images is shown. Analytical results and sample imagery of diffuse backscatter are presented to validate the method.
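As a small, self-contained illustration of the shift-invariant wavelet machinery referred to above (not the paper's SAR image-formation algorithm), the sketch below applies PyWavelets' stationary (shift-invariant) wavelet transform with a biorthogonal wavelet to a synthetic chirp; the signal and decomposition depth are assumptions.

```python
# Minimal sketch of a shift-invariant (stationary) DWT with a biorthogonal
# wavelet, applied to a synthetic linear-FM chirp. Illustrates the subband
# decomposition idea behind multiresolution processing; not the paper's method.
import numpy as np
import pywt

fs = 1024.0                                            # samples per second (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
chirp = np.cos(2 * np.pi * (50 * t + 100 * t ** 2))    # toy "range line" signal

# Stationary wavelet transform: no decimation, so the result is shift-invariant.
coeffs = pywt.swt(chirp, wavelet="bior4.4", level=3)
for i, (approx, detail) in enumerate(coeffs):
    print(f"subband pair {i}: approx energy {np.sum(approx**2):.1f}, "
          f"detail energy {np.sum(detail**2):.1f}")
```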

Relevance:

10.00%

Publisher:

Abstract:

Background: A paradigm shift in educational policy to create problem solvers and critical thinkers produced the games concept approach (GCA) in Singapore's Revised Syllabus for Physical Education (1999). A pilot study (2001) conducted on 11 primary school student teachers (STs) using this approach identified time management and questioning as two of the major challenges faced by novice teachers.
Purpose: To examine the GCA from three perspectives: structure—lesson form in terms of teacher-time and pupil-time; product—how STs used those time fractions; and process—the nature of their questioning (type, timing, and target).
Participants and setting: Forty-nine STs from three different PETE cohorts (two-year diploma, four-year degree, two-year post-graduate diploma) volunteered to participate in the study, conducted during the penultimate week of their final practicum in public primary and secondary schools.
Intervention: Based on the findings of the pilot study, PETE increased the emphasis on GCA content-specific knowledge and pedagogical procedures. To further support STs learning to actualise the GCA, authentic micro-teaching experiences that were closely monitored by faculty were provided in nearby schools.
Research design: This is a descriptive study of the time-management and questioning strategies implemented by STs on practicum. Each lesson was segmented into a number of sub-categories of teacher-time (organisation, demonstration and closure) and pupil-time (practice time and game time). Questions were categorised as knowledge, technical, tactical or affective.
Data collection: Each ST was video-taped teaching a GCA lesson towards the end of their final practicum. The STs individually determined the timing of the data collection and the lesson to be observed.
Data analysis: Each lesson was segmented into a number of sub-categories of both teacher- and pupil-time. Duration recording using Noldus software (Observer 4.0) segmented the time management of different lesson components. Questioning was coded in terms of type, timing and target. Separate MANOVAs were used to measure the differences between programmes and levels (primary and secondary) in relation to time-management procedures and questioning strategies.
Findings: No differences emerged between the programmes or levels in their time-management or questioning strategies. Using the GCA, STs generated more pupil time (53%) than teacher time (47%). STs at the primary level provided more technical practice, and those in secondary schools more small-sided game play. Most questions (58%) were asked during play or practice but were predominantly low-order, involving knowledge or recall (76%); only 6.7% were open-ended or divergent and capable of developing tactical awareness.
Conclusions: Although STs are delivering more pupil time (practice and game) than teacher time, the lesson structure requires further fine-tuning to extend the practice task beyond technical drills. Many questions are being asked to generate knowledge about games but lack sufficient quality to enhance critical thinking and tactical awareness, as the GCA intends.
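A hypothetical sketch of how coded lesson segments could be aggregated into teacher-time and pupil-time fractions is given below; the segment labels and durations are invented, and this is not the Observer 4.0 workflow used in the study.

```python
# Hypothetical sketch: aggregate coded lesson segments into teacher-time and
# pupil-time fractions, mirroring the structure analysis described above.
# Labels, durations and the mapping to categories are invented for illustration.
teacher_codes = {"organisation", "demonstration", "closure"}
pupil_codes = {"practice", "game"}

# (segment label, duration in seconds) for one made-up lesson
segments = [("organisation", 180), ("demonstration", 120), ("practice", 420),
            ("game", 360), ("closure", 60)]

total = sum(d for _, d in segments)
teacher = sum(d for code, d in segments if code in teacher_codes)
pupil = sum(d for code, d in segments if code in pupil_codes)
print(f"teacher time: {100 * teacher / total:.0f}%, pupil time: {100 * pupil / total:.0f}%")
```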

Relevance:

10.00%

Publisher:

Abstract:

A large and growing body of literature has explored corporate environmental sustainability initiatives and their impacts locally, regionally and internationally. While these initiatives provide examples of environmental stewardship and cleaner production, a large proportion of the organisations considered in this literature have ‘sustainable practice’, ‘environmental stewardship’ or similar goals as add-ons to their core business strategy. Furthermore, there is limited evidence of organisations embracing and internalising sustainability principles throughout their activities, products or services. Many challenges and barriers impede whole-system design or holistic approaches to addressing environmental issues, with some evidence to suggest that targeted initiatives could be useful in making progress. ‘Lean management’ and other lean thinking strategies are often put forward as part of such targeted approaches. Within this context, the authors have drawn on the current literature to review lean thinking practices and how these influence sustainable business practice, considering the balance of the environmental and economic aspects of the triple bottom line in sustainability. The review methodology comprised first identifying the theoretical constructs to be studied, then developing criteria for categorising the literature, evaluating the findings within each category, and considering the implications of the findings for future research. The evaluation revealed two main areas of consideration: (a) lean manufacturing tools and environmental performance, and (b) integrated lean and green models and approaches. However, the review highlighted the ad hoc use of lean thinking within corporate sustainability initiatives, and established a knowledge gap: the lack of a system for considering different categories of environmental impacts in different industries and choosing the best lean tools or models for a particular problem in a way that ensures holistic exploration. The findings included a specific typology of lean tools for different environmental impacts, drawing from multiple case studies. Within this research context, this paper presents the findings of the review, namely the emerging consensus on the relationships between lean thinking and sustainable business practice. The paper begins with an overview of the current literature regarding lean thinking and its documented role in sustainable business practice. The paper then analyses lean and green paradigms in different industries, and describes the typology of lean tools used to reduce specific environmental impacts as well as integrated lean and green models and approaches. The paper intends to encourage industrial practitioners to consider the merits and potential risks of using specific lean tools to reduce context-specific environmental impacts. It also aims to highlight the potential for further investigation with regard to comparing different industries and conceptualising a generalisable system for ensuring that lean thinking initiatives build towards sustainable business practice.

Relevance:

10.00%

Publisher:

Abstract:

The modern diet has become highly sweetened, resulting in unprecedented levels of sugar consumption, particularly among adolescents. While chronic long-term sugar intake is known to contribute to the development of metabolic disorders including obesity and type II diabetes, little is known regarding the direct consequences of long-term, binge-like sugar consumption on the brain. Because sugar can cause the release of dopamine in the nucleus accumbens (NAc) similarly to drugs of abuse, we investigated changes in the morphology of neurons in this brain region following short- (4 weeks) and long-term (12 weeks) binge-like sucrose consumption using an intermittent two-bottle choice paradigm. We used Golgi-Cox staining to impregnate medium spiny neurons (MSNs) from the NAc core and shell of short- and long-term sucrose consuming rats and compared these to age-matched water controls. We show that prolonged binge-like sucrose consumption significantly decreased the total dendritic length of NAc shell MSNs compared to age-matched control rats. We also found that the restructuring of these neurons resulted primarily from reduced distal dendritic complexity. Conversely, we observed increased spine densities at the distal branch orders of NAc shell MSNs from long-term sucrose consuming rats. Combined, these results highlight the neuronal effects of prolonged binge-like intake of sucrose on NAc shell MSN morphology.

Relevance:

10.00%

Publisher:

Abstract:

Importance of the field: The shift in focus from ligand-based design approaches to target-based discovery over the last two to three decades has been a major milestone in drug discovery research. Currently, the field is witnessing another major paradigm shift by leaning towards holistic, systems-based approaches rather than the reductionist single-molecule-based methods. The effect of this new trend is likely to be felt strongly in terms of new strategies for therapeutic intervention, new targets individually and in combinations, and the design of specific and safer drugs. Computational modeling and simulation form important constituents of new-age biology because they are essential to comprehend the large-scale data generated by high-throughput experiments and to generate hypotheses, which are typically iterated with experimental validation.
Areas covered in this review: This review focuses on the repertoire of systems-level computational approaches currently available for target identification. The review starts with a discussion of levels of abstraction of biological systems and describes the different modeling methodologies that are available for this purpose. The review then focuses on how such modeling and simulations can be applied to drug target discovery. Finally, it discusses methods for studying other important issues such as understanding targetability, identifying target combinations and predicting drug resistance, and considering them during the target identification stage itself.
What the reader will gain: The reader will get an account of the various approaches for target discovery and the need for systems approaches, followed by an overview of the different modeling and simulation approaches that have been developed. An idea of the promise and limitations of the various approaches and perspectives for future development will also be obtained.
Take-home message: Systems thinking has now come of age, enabling a ‘bird's eye view’ of the biological systems under study, while at the same time allowing us to ‘zoom in’, where necessary, for a detailed description of individual components. A number of different methods available for computational modeling and simulation of biological systems can be used effectively for drug target discovery.
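As a toy example of the dynamic modeling surveyed in this review (not a method drawn from it), the sketch below integrates a small, invented two-species activation network with SciPy's ODE solver; the species names, kinetics and parameter values are all assumptions.

```python
# Toy dynamic model sketch: a two-node activation network solved as ODEs,
# illustrating the kind of simulation used in systems-level target studies.
# Species names, kinetics and parameter values are invented.
import numpy as np
from scipy.integrate import solve_ivp

def network(t, y, k_prod, k_act, k_deg):
    a, b = y
    da = k_prod - k_deg * a                    # A is produced and degraded
    db = k_act * a / (1.0 + a) - k_deg * b     # A activates B (saturating)
    return [da, db]

params = (1.0, 2.0, 0.5)                       # k_prod, k_act, k_deg (assumed)
sol = solve_ivp(network, t_span=(0.0, 20.0), y0=[0.0, 0.0], args=params,
                t_eval=np.linspace(0.0, 20.0, 5))
for t, (a, b) in zip(sol.t, sol.y.T):
    print(f"t={t:5.1f}  A={a:.3f}  B={b:.3f}")
```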

Relevance:

10.00%

Publisher:

Abstract:

The incorporation of DNA into nucleosomes and higher-order forms of chromatin in vivo creates difficulties with respect to its accessibility for cellular functions such as transcription, replication, repair and recombination. To understand the role of chromatin structure in the process of homologous recombination, we have studied the interaction of nucleoprotein filaments, comprised of RecA protein and ssDNA, with minichromosomes. Using this paradigm, we have addressed how chromatin structure affects the search for homologous DNA sequences, and attempted to distinguish between two mutually exclusive models of DNA-DNA pairing mechanisms. Paradoxically, we found that the search for homologous sequences, as monitored by unwinding of homologous or heterologous duplex DNA, was facilitated by nucleosomes, with no discernible effect on homologous pairing. More importantly, unwinding of minichromosomes required the interaction of nucleoprotein filaments and led to the accumulation of circular duplex DNA sensitive to nuclease P1. Competition experiments indicated that chromatin templates and naked DNA served as equally efficient targets for homologous pairing. These and other findings suggest that nucleosomes do not impede but rather facilitate the search for homologous sequences and establish, in accordance with one proposed model, that unwinding of duplex DNA precedes alignment of homologous sequences at the level of chromatin. The potential application of this model to investigate the role of chromosomal proteins in the alignment of homologous sequences in the context of cellular recombination is considered.

Relevance:

10.00%

Publisher:

Abstract:

This paper begins with the assertion that research grounded in creative practice constitutes a new paradigm. We argue both for and against the idea. We argue against the idea in terms of applying it to the idealised ‘lone artist’ engaged in the production of their art, whose focus of research is a self-reflection upon the art they produce, and whose art is also the findings of the research. Our position is that such an approach cannot be considered as anything other than a form of auto-phenomenography, that such efforts are part of qualitative research, and they are thus trivial in paradigmatic terms. However, we argue in the positive for understanding the artistic event – by which we mean any mass ecology of artistic practice – as being paradigmatically new in terms of research potentials and demands. Our exemplar for that argument is a practice-led, large-scale annual event called Indie 100 which has run for five years and has demonstrated a distinct paradigmatic ‘settling in’ over its duration while clearly pushing paradigmatic boundaries for research into creative practice.

Relevance:

10.00%

Publisher:

Abstract:

The project consisted of two long-term follow-up studies of preterm children addressing the question of whether intrauterine growth restriction affects the outcome. Assessment at 5 years of age of 203 children with a birth weight less than 1000 g born in Finland in 1996-1997 showed that 9% of the children had cognitive impairment, 14% cerebral palsy, and 4% needed a hearing aid. The intelligence quotient was lower (p<0.05) than the reference value. Thus, 20% exhibited major disabilities, 19% minor disabilities, and 61% had no functional abnormalities. Being small for gestational age (SGA) was associated with sub-optimal growth later. In children born before 27 gestational weeks, the SGA children had more neuropsychological disabilities than those born appropriate for gestational age (AGA). In another cohort with a birth weight less than 1500 g assessed at 5 years of age, echocardiography showed a thickened interventricular septum and a decreased left ventricular end-diastolic diameter in both SGA and AGA born children. They also had a higher systolic blood pressure than the reference. Laser-Doppler flowmetry showed different endothelium-dependent and -independent vasodilation responses in the AGA children compared to those of the controls. SGA was not associated with cardiovascular abnormalities. Auditory event-related potentials (AERPs) were recorded using an oddball paradigm with frequency deviants (standard tone 500 Hz and deviant 750 Hz with 10% probability). At term, the P350 was smaller in SGA and AGA infants than in controls. At 12 months, the automatic change detection peak (mismatch negativity, MMN) was observed in the controls. However, the preterm infants had a difference positivity that correlated with their neurodevelopment scores. At 5 years of age, the P1 deflection, which reflects primary auditory processing, was smaller, and the MMN larger, in the preterm than in the control children. Even with a challenging paradigm or a distraction paradigm, P1 was smaller in the preterm than in the control children. The SGA and AGA children showed similar AERP responses. Prematurity is a major risk factor for abnormal brain development. Preterm children showed signs of cardiovascular abnormality, suggesting that prematurity per se may carry a risk for later morbidity. The small positive amplitudes in AERPs suggest persistently altered auditory processing in preterm infants.
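The oddball design described above (500 Hz standards, 750 Hz deviants at 10% probability) can be sketched as a short stimulus-sequence generator; the sequence length and the rule against consecutive deviants are assumptions, not details taken from the study.

```python
# Sketch of an auditory oddball sequence: 500 Hz standards with 750 Hz deviants
# at 10% probability, as in the paradigm described above. Sequence length and
# the rule forbidding back-to-back deviants are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(3)
n_trials = 400
tones = np.full(n_trials, 500)                 # standard tone frequency (Hz)
deviant_idx = rng.choice(n_trials, size=int(0.10 * n_trials), replace=False)
tones[deviant_idx] = 750                       # deviant tone frequency (Hz)

# Reshuffle until no two deviants occur back to back (simple rejection loop).
while np.any(np.diff(np.flatnonzero(tones == 750)) == 1):
    rng.shuffle(tones)

print(f"{np.mean(tones == 750):.0%} deviants in {n_trials} trials")
```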

Relevance:

10.00%

Publisher:

Abstract:

The “distractor-frequency effect” refers to the finding that high-frequency (HF) distractor words slow picture naming less than low-frequency distractors in the picture–word interference paradigm. Rival input and output accounts of this effect have been proposed. The former attributes the effect to attentional selection mechanisms operating during distractor recognition, whereas the latter attributes it to monitoring/decision mechanisms operating on distractor and target responses in an articulatory buffer. Using high-density (128-channel) EEG, we tested hypotheses from these rival accounts. In addition to conducting stimulus- and response-locked whole-brain corrected analyses, we investigated the correct-related negativity, an ERP observed on correct trials at fronto-central electrodes proposed to reflect the involvement of domain-general monitoring. The whole-brain ERP analysis revealed a significant effect of distractor frequency at inferior right frontal and temporal sites between 100 and 300 msec post-stimulus onset, during which lexical access is thought to occur. Response-locked, region of interest (ROI) analyses of fronto-central electrodes revealed a correct-related negativity starting 121 msec before and peaking 125 msec after vocal onset on the grand averages. Slope analysis of this component revealed a significant difference between HF and low-frequency distractor words, with the former associated with a steeper slope in the time window spanning from 100 msec before to 100 msec after vocal onset. The finding of ERP effects in time windows and components corresponding to both lexical processing and monitoring suggests the distractor-frequency effect is most likely associated with more than one physiological mechanism.
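As a simplified illustration of the response-locked slope analysis described here (not the study's pipeline), the sketch below fits a straight line to a synthetic ERP trace in the window from 100 msec before to 100 msec after vocal onset; the sampling rate and amplitudes are assumptions.

```python
# Simplified sketch of a response-locked slope analysis: fit a line to an
# ERP trace in the -100 to +100 msec window around vocal onset. The trace is
# synthetic; sampling rate and amplitudes are assumptions.
import numpy as np

fs = 500                                            # Hz (assumed sampling rate)
times_ms = np.arange(-300, 302, 1000 / fs)          # epoch from -300 to +300 msec
rng = np.random.default_rng(4)
erp = -0.01 * times_ms + rng.normal(0, 0.5, size=times_ms.size)  # fake negativity

window = (times_ms >= -100) & (times_ms <= 100)
slope, intercept = np.polyfit(times_ms[window], erp[window], deg=1)
print(f"component slope in the -100..+100 msec window: {slope:.4f} (a.u./msec)")
```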