909 results for spårsökande paradigm
Abstract:
Multiresolution synthetic aperture radar (SAR) image formation has proven beneficial in a variety of applications, such as improved imaging and target detection as well as speckle reduction. SAR signal processing, traditionally carried out in the Fourier domain, has inherent limitations in the context of image formation at hierarchical scales. We present a generalized approach to the formation of multiresolution SAR images using the biorthogonal shift-invariant discrete wavelet transform (SIDWT) in both the range and azimuth directions. In azimuth in particular, the inherent subband decomposition property of the wavelet packet transform is introduced to produce multiscale complex matched filtering without involving any approximations. This generalized approach also includes the formulation of multilook processing within the discrete wavelet transform (DWT) paradigm. The efficiency of the algorithm when executed in parallel to generate hierarchical-scale SAR images is shown. Analytical results and sample imagery of diffuse backscatter are presented to validate the method.
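To make the shift-invariant decomposition step concrete, the following minimal sketch (assuming PyWavelets >= 1.1; the array `slc`, the padding scheme and the function name are illustrative, and the paper's wavelet-domain matched filtering and multilook formulation are not reproduced) forms a coarser-scale intensity image from a complex single-look image using a biorthogonal shift-invariant 2D wavelet transform:

```python
# A minimal sketch, assuming PyWavelets >= 1.1 (pywt). 'slc' is a hypothetical
# complex single-look SAR image; the paper's wavelet-domain matched filtering
# and multilook steps are not reproduced here.
import numpy as np
import pywt

def coarse_scale_intensity(slc, wavelet="bior4.4", levels=3):
    """Coarsest-scale intensity image via a shift-invariant 2D DWT."""
    # swt2 requires each axis length to be a multiple of 2**levels; pad by reflection.
    pad = [(0, (-s) % (2 ** levels)) for s in slc.shape]
    slc = np.pad(slc, pad, mode="reflect")

    # Decompose the real and imaginary channels separately so the complex
    # phase survives the real-valued biorthogonal filter bank.
    re = pywt.swt2(slc.real, wavelet, level=levels, trim_approx=True)
    im = pywt.swt2(slc.imag, wavelet, level=levels, trim_approx=True)

    # With trim_approx=True, element 0 is the single retained approximation
    # (the coarsest scale); the remaining elements are (cH, cV, cD) detail triples.
    coarse = re[0] + 1j * im[0]
    return np.abs(coarse) ** 2

# Hypothetical usage on random data standing in for a 512x512 single-look image.
slc = np.random.randn(512, 512) + 1j * np.random.randn(512, 512)
intensity_level3 = coarse_scale_intensity(slc, levels=3)
```

Because the transform is undecimated, every scale retains the full image grid, which is also what makes the per-scale subband computations naturally parallelizable.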
Abstract:
Background: A paradigm shift in educational policy towards creating problem solvers and critical thinkers produced the games concept approach (GCA) in Singapore's Revised Syllabus for Physical Education (1999). A pilot study (2001) conducted on 11 primary school student teachers (STs) using this approach identified time management and questioning as two of the major challenges faced by novice teachers. Purpose: To examine the GCA from three perspectives: structure (lesson form in terms of teacher-time and pupil-time); product (how STs used those time fractions); and process (the nature of their questioning: type, timing and target). Participants and setting: Forty-nine STs from three different PETE cohorts (two-year diploma, four-year degree, two-year post-graduate diploma) volunteered to participate in the study, conducted during the penultimate week of their final practicum in public primary and secondary schools. Intervention: Based on the findings of the pilot study, PETE increased the emphasis on GCA content-specific knowledge and pedagogical procedures. To further support STs learning to actualise the GCA, authentic micro-teaching experiences, closely monitored by faculty, were provided in nearby schools. Research design: This is a descriptive study of the time-management and questioning strategies implemented by STs on practicum. Each lesson was segmented into sub-categories of teacher-time (organisation, demonstration and closure) and pupil-time (practice time and game time). Questions were categorised as knowledge, technical, tactical or affective. Data collection: Each ST was videotaped teaching a GCA lesson towards the end of their final practicum. The STs individually determined the timing of the data collection and the lesson to be observed. Data analysis: Duration recording using Noldus software (Observer 4.0) segmented the time management of the different lesson components. Questioning was coded in terms of type, timing and target. Separate MANOVAs were used to test for differences between programmes and levels (primary and secondary) in time-management procedures and questioning strategies. Findings: No differences emerged between the programmes or levels in their time-management or questioning strategies. Using the GCA, STs generated more pupil-time (53%) than teacher-time (47%). STs at the primary level provided more technical practice, and those in secondary schools more small-sided game play. Most questions (58%) were asked during play or practice, but these were predominantly low-order, involving knowledge or recall (76%); only 6.7% were open-ended or divergent and capable of developing tactical awareness. Conclusions: Although STs are delivering more pupil-time (practice and game) than teacher-time, the lesson structure requires further fine-tuning to extend the practice task beyond technical drills. Many questions are being asked to generate knowledge about games, but they lack sufficient quality to enhance critical thinking and tactical awareness, as the GCA intends.
Abstract:
A large and growing body of literature has explored corporate environmental sustainability initiatives and their impacts locally, regionally and internationally. While the initiatives provide examples of environmental stewardship and cleaner production, a large proportion of the organisations considered in this literature have ‘sustainable practice’, ‘environmental stewardship’ or similar goals as add-ons to their core business strategy. Furthermore, there is limited evidence of organisations embracing and internalising sustainability principles throughout their activities, products or services. Many challenges and barriers impede outcomes such as whole-system design or a holistic approach to addressing environmental issues, with some evidence to suggest that targeted initiatives could be useful in making progress. ‘Lean management’ and other lean thinking strategies are often put forward as part of such targeted approaches. Within this context, the authors have drawn on the current literature to undertake a review of lean thinking practices and how these influence sustainable business practice, considering the balance of the environmental and economic aspects of the triple bottom line in sustainability. The review methodology comprised firstly identifying the theoretical constructs to be studied, then developing criteria for categorising the literature, evaluating the findings within each category, and considering the implications of the findings for areas of future research. The evaluation revealed two main areas of consideration: (a) lean manufacturing tools and environmental performance; and (b) integrated lean and green models and approaches. However, the review highlighted the ad hoc use of lean thinking within corporate sustainability initiatives, and established a knowledge gap: the lack of a system for considering the different categories of environmental impacts in different industries and choosing the best lean tools or models for a particular problem in a way that ensures holistic exploration. The findings included a specific typology of lean tools for different environmental impacts, drawing from multiple case studies. Within this research context, this paper presents the findings of the review, namely the emerging consensus on the relationships between lean thinking and sustainable business practice. The paper begins with an overview of the current literature regarding lean thinking and its documented role in sustainable business practice. The paper then includes an analysis of lean and green paradigms in different industries, and describes the typology of lean tools used to reduce specific environmental impacts as well as integrated lean and green models and approaches. The paper intends to encourage industrial practitioners to consider the merits and potential risks of using specific lean tools to reduce context-specific environmental impacts. It also aims to highlight the potential for further investigation with regard to comparing different industries and conceptualising a generalisable system for ensuring lean thinking initiatives build towards sustainable business practice.
Abstract:
The modern diet has become highly sweetened, resulting in unprecedented levels of sugar consumption, particularly among adolescents. While chronic long-term sugar intake is known to contribute to the development of metabolic disorders including obesity and type II diabetes, little is known regarding the direct consequences of long-term, binge-like sugar consumption on the brain. Because sugar can cause the release of dopamine in the nucleus accumbens (NAc) similarly to drugs of abuse, we investigated changes in the morphology of neurons in this brain region following short- (4 weeks) and long-term (12 weeks) binge-like sucrose consumption using an intermittent two-bottle choice paradigm. We used Golgi-Cox staining to impregnate medium spiny neurons (MSNs) from the NAc core and shell of short- and long-term sucrose consuming rats and compared these to age-matched water controls. We show that prolonged binge-like sucrose consumption significantly decreased the total dendritic length of NAc shell MSNs compared to age-matched control rats. We also found that the restructuring of these neurons resulted primarily from reduced distal dendritic complexity. Conversely, we observed increased spine densities at the distal branch orders of NAc shell MSNs from long-term sucrose consuming rats. Combined, these results highlight the neuronal effects of prolonged binge-like intake of sucrose on NAc shell MSN morphology.
Abstract:
Importance of the field: The shift in focus from ligand-based design approaches to target-based discovery over the last two to three decades has been a major milestone in drug discovery research. Currently, the field is witnessing another major paradigm shift, leaning towards holistic systems-based approaches rather than reductionist single-molecule-based methods. The effect of this new trend is likely to be felt strongly in terms of new strategies for therapeutic intervention, new targets individually and in combinations, and the design of specific and safer drugs. Computational modeling and simulation form important constituents of new-age biology because they are essential to comprehend the large-scale data generated by high-throughput experiments and to generate hypotheses, which are typically iterated with experimental validation. Areas covered in this review: This review focuses on the repertoire of systems-level computational approaches currently available for target identification. The review starts with a discussion of the levels of abstraction of biological systems and describes the different modeling methodologies available for this purpose. The review then focuses on how such modeling and simulation can be applied to drug target discovery. Finally, it discusses methods for studying other important issues, such as understanding targetability, identifying target combinations and predicting drug resistance, and considering them during the target identification stage itself. What the reader will gain: The reader will get an account of the various approaches for target discovery and the need for systems approaches, followed by an overview of the different modeling and simulation approaches that have been developed. An idea of the promise and limitations of the various approaches, and perspectives for future development, will also be obtained. Take home message: Systems thinking has now come of age, enabling a 'bird's-eye view' of the biological systems under study while at the same time allowing us to 'zoom in', where necessary, for a detailed description of individual components. A number of different methods available for computational modeling and simulation of biological systems can be used effectively for drug target discovery.
Abstract:
The incorporation of DNA into nucleosomes and higher-order forms of chromatin in vivo creates difficulties with respect to its accessibility for cellular functions such as transcription, replication, repair and recombination. To understand the role of chromatin structure in the process of homologous recombination, we have studied the interaction of nucleoprotein filaments, comprised of RecA protein and ssDNA, with minichromosomes. Using this paradigm, we have addressed how chromatin structure affects the search for homologous DNA sequences, and attempted to distinguish between two mutually exclusive models of DNA-DNA pairing mechanisms. Paradoxically, we found that the search for homologous sequences, as monitored by unwinding of homologous or heterologous duplex DNA, was facilitated by nucleosomes, with no discernible effect on homologous pairing. More importantly, unwinding of minichromosomes required the interaction of nucleoprotein filaments and led to the accumulation of circular duplex DNA sensitive to nuclease P1. Competition experiments indicated that chromatin templates and naked DNA served as equally efficient targets for homologous pairing. These and other findings suggest that nucleosomes do not impede but rather facilitate the search for homologous sequences and establish, in accordance with one proposed model, that unwinding of duplex DNA precedes alignment of homologous sequences at the level of chromatin. The potential application of this model to investigate the role of chromosomal proteins in the alignment of homologous sequences in the context of cellular recombination is considered.
Abstract:
This paper begins with the assertion that research grounded in creative practice constitutes a new paradigm. We argue both for and against the idea. We argue against the idea in terms of applying it to the idealised ‘lone artist’ engaged in the production of their art, whose focus of research is a self-reflection upon the art they produce, and whose art also constitutes the findings of the research. Our position is that such an approach cannot be considered anything other than a form of auto-phenomenography, that such efforts are part of qualitative research, and that they are thus trivial in paradigmatic terms. However, we argue in the positive for understanding the artistic event – by which we mean any mass ecology of artistic practice – as being paradigmatically new in terms of research potentials and demands. Our exemplar for that argument is a practice-led, large-scale annual event called Indie 100, which has run for five years and has demonstrated a distinct paradigmatic ‘settling in’ over its duration while clearly pushing paradigmatic boundaries for research into creative practice.
Abstract:
The project consisted of two long-term follow-up studies of preterm children addressing the question of whether intrauterine growth restriction affects the outcome. Assessment at 5 years of age of 203 children with a birth weight below 1000 g, born in Finland in 1996-1997, showed that 9% of the children had cognitive impairment, 14% cerebral palsy, and 4% needed a hearing aid. The intelligence quotient was lower (p<0.05) than the reference value. Thus, 20% exhibited major disabilities, 19% minor disabilities, and 61% had no functional abnormalities. Being born small for gestational age (SGA) was associated with sub-optimal growth later. Among children born before 27 gestational weeks, the SGA children had more neuropsychological disabilities than those born appropriate for gestational age (AGA). In another cohort, with birth weight less than 1500 g and assessed at 5 years of age, echocardiography showed a thickened interventricular septum and a decreased left ventricular end-diastolic diameter in both SGA- and AGA-born children. They also had higher systolic blood pressure than the reference. Laser-Doppler flowmetry showed different endothelium-dependent and -independent vasodilation responses in the AGA children compared to those of the controls. SGA was not associated with cardiovascular abnormalities. Auditory event-related potentials (AERPs) were recorded using an oddball paradigm with frequency deviants (standard tone 500 Hz, deviant tone 750 Hz with 10% probability). At term, the P350 was smaller in SGA and AGA infants than in controls. At 12 months, the automatic change-detection peak (mismatch negativity, MMN) was observed in the controls. The preterm infants, however, showed a difference positivity that correlated with their neurodevelopment scores. At 5 years of age, the P1 deflection, which reflects primary auditory processing, was smaller, and the MMN larger, in the preterm than in the control children. Even with a challenging paradigm or a distraction paradigm, P1 was smaller in the preterm than in the control children. The SGA and AGA children showed similar AERP responses. Prematurity is a major risk factor for abnormal brain development. Preterm children showed signs of cardiovascular abnormality, suggesting that prematurity per se may carry a risk for later morbidity. The small positive amplitudes in AERPs suggest persisting altered auditory processing in the preterm infants.
Abstract:
The “distractor-frequency effect” refers to the finding that high-frequency (HF) distractor words slow picture naming less than low-frequency (LF) distractors in the picture–word interference paradigm. Rival input and output accounts of this effect have been proposed. The former attributes the effect to attentional selection mechanisms operating during distractor recognition, whereas the latter attributes it to monitoring/decision mechanisms operating on distractor and target responses in an articulatory buffer. Using high-density (128-channel) EEG, we tested hypotheses from these rival accounts. In addition to conducting stimulus- and response-locked whole-brain corrected analyses, we investigated the correct-related negativity, an ERP observed on correct trials at fronto-central electrodes and proposed to reflect the involvement of domain-general monitoring. The whole-brain ERP analysis revealed a significant effect of distractor frequency at inferior right frontal and temporal sites between 100 and 300 msec post-stimulus onset, during which lexical access is thought to occur. Response-locked, region-of-interest (ROI) analyses of fronto-central electrodes revealed a correct-related negativity starting 121 msec before and peaking 125 msec after vocal onset in the grand averages. Slope analysis of this component revealed a significant difference between HF and LF distractor words, with the former associated with a steeper slope in the time window spanning from 100 msec before to 100 msec after vocal onset. The finding of ERP effects in time windows and components corresponding to both lexical processing and monitoring suggests that the distractor-frequency effect is most likely associated with more than one physiological mechanism.
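As a concrete illustration of the slope measure, the following minimal sketch (hypothetical arrays and function name; not the study's actual pipeline, which used corrected whole-brain and ROI analyses of 128-channel data) fits a line to a response-locked, ROI-averaged ERP in the window from 100 msec before to 100 msec after vocal onset:

```python
# A minimal sketch with hypothetical data: slope of a response-locked ERP
# component over the -100..+100 msec window around vocal onset.
import numpy as np

def crn_slope(epochs, times_ms):
    """epochs: (n_trials, n_samples) ROI-averaged, response-locked EEG; returns uV/msec."""
    erp = epochs.mean(axis=0)                      # average across trials
    win = (times_ms >= -100) & (times_ms <= 100)   # window spanning vocal onset
    slope, _ = np.polyfit(times_ms[win], erp[win], 1)
    return slope

# Hypothetical usage: compare slopes between HF and LF distractor conditions;
# a steeper HF slope would correspond to the reported effect.
times = np.arange(-300, 301)                       # 1 kHz sampling, msec relative to vocal onset
hf = crn_slope(np.random.randn(40, times.size), times)
lf = crn_slope(np.random.randn(40, times.size), times)
print(hf, lf)
```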
Abstract:
Previous research has shown that action tendencies to approach alcohol may be modified using a computerized Approach-Avoidance Task (AAT), and that this impacted on subsequent consumption. A recent paper in this journal (Becker, Jostman, Wiers, & Holland, 2015) failed to show significant training effects for food in three studies, nor did it find effects on subsequent consumption. However, avoidance training to high-calorie foods was tested against a control rather than against approach training. The present study used a paradigm more comparable to the alcohol studies. It randomly assigned 90 participants to ‘approach’ or ‘avoid’ chocolate images on the AAT, and then asked them to taste and rate chocolates. A significant interaction of condition and time showed that training to avoid chocolate resulted in faster avoidance responses to chocolate images, compared with training to approach it. Consistent with Becker et al.'s Study 3, no effect was found on the amount of chocolate consumed, although a newly published study in this journal (Schumacher, Kemps, & Tiggemann, 2016) did find such an effect. The collective evidence does not as yet provide a solid basis for the application of AAT training to the reduction of problematic food consumption, although clinical trials have yet to be conducted.
Abstract:
This workshop aims to discuss alternative approaches to resolving the problem of health information fragmentation, which partially results from the difficulty complex health systems have in interacting semantically at the information level. In principle, we challenge the current paradigm of keeping medical records where they were created and discuss an alternative approach in which an individual's health data can be maintained by new entities whose sole responsibility is the sustainability of individual-centric health records. In particular, we will discuss the unique characteristics of the European health information landscape. This workshop is also a business meeting of the IMIA Working Group on Health Record Banking.
Abstract:
Cosmological inflation is the dominant paradigm for explaining the origin of structure in the universe. According to the inflationary scenario, there was a period of nearly exponential expansion in the very early universe, long before nucleosynthesis. Inflation is commonly considered a consequence of some scalar field or fields whose energy density starts to dominate the universe. The inflationary expansion converts the quantum fluctuations of the fields into classical perturbations on superhorizon scales, and these primordial perturbations are the seeds of the structure in the universe. Moreover, inflation also naturally explains the high degree of homogeneity and spatial flatness of the early universe. The real challenge of inflationary cosmology lies in trying to establish a connection between the fields driving inflation and theories of particle physics. In this thesis we concentrate on inflationary models at scales well below the Planck scale. The low scale allows us to seek candidates for the inflationary matter within extensions of the Standard Model, but typically also implies fine-tuning problems. We discuss a low-scale model where inflation is driven by a flat direction of the Minimal Supersymmetric Standard Model. The relation between the potential along the flat direction and the underlying supergravity model is studied. The low inflationary scale requires an extremely flat potential, but we find that in this particular model the associated fine-tuning problems can be solved in a rather natural fashion in a class of supergravity models. For this class of models, the flatness is a consequence of the structure of the supergravity model and is insensitive to the vacuum expectation values of the fields that break supersymmetry. Another low-scale model considered in the thesis is the curvaton scenario, where the primordial perturbations originate from quantum fluctuations of a curvaton field, which is different from the fields driving inflation. The curvaton gives a negligible contribution to the total energy density during inflation, but its perturbations become significant in the post-inflationary epoch. The separation between the fields driving inflation and the fields giving rise to primordial perturbations opens up new possibilities for lowering the inflationary scale without introducing fine-tuning problems. The curvaton model typically gives rise to a relatively large level of non-gaussian features in the statistics of the primordial perturbations. We find that the level of non-gaussian effects is heavily dependent on the form of the curvaton potential. Future observations that provide more accurate information on non-gaussian statistics can therefore place constraining bounds on the curvaton interactions.
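For orientation, the standard sudden-decay estimate for a quadratic curvaton potential (a textbook baseline against which the potential-dependence studied in the thesis is measured; not a result specific to this work) relates the non-linearity parameter to the curvaton's energy share at decay:

```latex
% Standard sudden-decay estimate for a quadratic curvaton potential
% (baseline relation, not a result derived in this thesis).
\[
  r = \left.\frac{3\rho_\sigma}{3\rho_\sigma + 4\rho_\gamma}\right|_{\mathrm{decay}},
  \qquad
  f_{\mathrm{NL}} \simeq \frac{5}{4r} - \frac{5}{3} - \frac{5r}{6},
\]
% A curvaton that is subdominant at decay (r << 1) gives f_NL ~ 5/(4r),
% i.e. large non-gaussianity, while r -> 1 drives f_NL toward -5/4.
```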
Abstract:
As we enter the second phase of the creative industries there is a shift away from the early 1990s ideology of the arts as a creative content provider for the wealth-generating ‘knowledge’ economy to an expanded rhetoric encompassing ‘cultural capital’ and its symbolic value. A renewed focus on culture is examined through a regional scan of the creative industries, in which social engineering of the arts occurs through policy imperatives driven by ‘profit oriented conceptualisations of culture’ (Hornidge 2011, p. 263). In the push for artists to become ‘culturpreneurs’, a trend has emerged whereby demand for ‘embedded creatives’ (Cunningham 2013) sees an exodus from arts-based employment, with transferable skills carrying workers into areas outside the arts. For those who stay, within the performing arts in particular, employment remains project-based, sporadic, underpaid, self-initiated and often self-financed, requiring adaptive career paths. Artist entrepreneurs must balance the creation and performance of their art with increasing amounts of time spent on branding, compliance, fundraising and the logistical and commercial requirements of operating in a creative industries (CI) paradigm. The artists' key challenge thus becomes one of aligning core creative and aesthetic values with market and business considerations. There is also the perceived threat posed by the ‘prosumer’ phenomenon (Bruns 2008), in which digital online products are created and produced by those formerly seen as consumers of art or audiences for art. Despite the negative aspects of this scenario, a recent study (Steiner & Schneider 2013) reveals that artists are happier and more satisfied than other workers within and outside the creative industries. A lively hybridisation of creative practice is occurring through mobile and interactive technologies with dynamic connections to social media. Continued growth in arts festivals attracts participation in international and transdisciplinary collaborations, whilst cross-sectoral partnerships provide artists with opportunities beyond a socio-cultural setting into business, health, science and education. This is occurring alongside a renewed engagement with place through the rise of cultural precincts in ‘creative cities’ (Florida 2008, Landry 2000), providing revitalised spaces for artists to gather and work. Finally, a reconsideration of the specialist attributes and transferable skills that artists bring to the creative industries suggests ways to dance through both the challenges and opportunities occasioned by the current complexities of arts practice.
Abstract:
This study offers a reconstruction and critical evaluation of globalization theory, a perspective that has been central to sociology and cultural studies in recent decades, from the viewpoint of media and communications. As the study shows, sociological and cultural globalization theorists rely heavily on arguments concerning media and communications, especially the so-called new information and communication technologies, in the construction of their frameworks. Together with deepening the understanding of globalization theory, the study provides new critical knowledge of the problematic consequences that follow from such strong investment in media and communications in contemporary theory. The book is divided into four parts. The first part presents the research problem, the approach and the theoretical contexts of the study. Following the introduction in Chapter 1, I identify the core elements of globalization theory in Chapter 2. At the heart of globalization theory is the claim that recent decades have witnessed massive changes in the spatio-temporal constitution of society, caused by new media and communications in particular, and that these changes necessitate the rethinking of the foundations of social theory as a whole. Chapter 3 introduces three paradigms of media research (the political economy of media, cultural studies and medium theory), the discussion of which will make it easier to understand the key issues and controversies that emerge in academic globalization theorists' treatment of media and communications. The next two parts offer a close reading of four theorists whose works I use as entry points into academic debates on globalization. I argue that we can make sense of mainstream positions on globalization by dividing them into two paradigms: on the one hand, media-technological explanations of globalization and, on the other, cultural globalization theory. As examples of the former, I discuss the works of Manuel Castells (Chapter 4) and Scott Lash (Chapter 5). I maintain that their analyses of globalization processes are overtly media-centric and result in an unhistorical and uncritical understanding of social power in an era of capitalist globalization. A related evaluation of the second paradigm (cultural globalization theory), as exemplified by Arjun Appadurai and John Tomlinson, is presented in Chapter 6. I argue that, due to their rejection of the importance of nation states and the notion of cultural imperialism for cultural analysis, and their replacement with a framework of media-generated deterritorializations and flows, these theorists underplay the importance of the neoliberalization of cultures throughout the world. The fourth part (Chapter 7) presents a central research finding of this study, namely that the media-centrism of globalization theory can be understood in the context of the emergence of neoliberalism. I find it problematic that, at the same time as capitalist dynamics have been strengthened in social and cultural life, advocates of globalization theory have directed attention to media-technological changes and their sweeping socio-cultural consequences, instead of analyzing the powerful material forces that shape society and culture. I further argue that this shift serves not only analytical but also utopian functions, that is, a longing for a better world in times when such longing is otherwise considered impracticable.
Abstract:
In the field of psychiatry, the semi-structured interview is one of the central tools for assessing the psychiatric state of a patient. In a semi-structured interview the interviewer participates in the interaction both through the prepared interview questions and through his or her own, unstructured turns. It has been stated that, in the context of psychiatric assessment, interviewers' unstructured turns help to obtain focused information but may simultaneously weaken the reliability of the data. This study examines the practices by which semi-structured psychiatric interviews are conducted. The method for the study is conversation analysis, which is both a theory of interaction and a methodology for its empirical, detailed analysis. Using data from 80 video-recorded psychiatric interviews with 16 patients and five interviewers, it describes in detail both the structured and unstructured interviewing practices. In the analysis, psychotherapeutic concepts are also used to describe phenomena that are characteristic of therapeutic discourse. The data were obtained from the Helsinki Psychotherapy Study (HPS). HPS is a randomized clinical trial comparing the effectiveness of four forms of psychotherapy in the treatment of depressive and anxiety disorders. A total of 326 patients were randomly assigned to one of three treatment groups: solution-focused therapy, short-term psychodynamic psychotherapy, and long-term psychodynamic psychotherapy. The patients assigned to the long-term psychodynamic psychotherapy group and 41 patients self-selected for psychoanalysis were included in a quasi-experimental design. The primary outcome measures were depressive and anxiety symptoms, while secondary measures included work ability, need for treatment, personality functions, social functioning, and lifestyle. Cost-effectiveness was determined. The data were collected from interviews, questionnaires, psychological tests, and public health registers. The follow-up interviews were conducted five times during a 5-year follow-up. The study shows that interviewers pose elaborated questions that are formulated in a friendly and sensitive way and that invite patients' long, story-like responses. When receiving patients' answers, interviewers use a wide variety of interviewing practices by which they direct patients' talk or offer an understanding of the meaning of a patient's response. The results of the study are two-fold. Firstly, the study shows that understanding the meaning of mental experiences requires interaction between interviewer and patient; it is therefore argued that the semi-structured interview is both a relevant and a necessary method for collecting data in a psychotherapy outcome study. Secondly, the study suggests that conversation analysis, enriched with psychotherapeutic concepts, offers methodological possibilities for psychotherapy process research, especially for the process-outcome paradigm.