954 results for Emerging Paradigm Shift
Abstract:
This article is a case study of how English teachers in England have coped with the paradigm shift from print to digital literacy. It reviews a large-scale national initiative that was intended to upskill all teachers, considers its weak impact and explores the author's involvement in the evaluation of the project's direct value to English teachers. It explores how this evaluation revealed that best practice in English using ICT was developing unevenly. It then reports on a recent small-scale research project that investigated how very good teachers have integrated ICT successfully into their teaching. It focuses on how the English teachers studied in the project are developing a powerful new pedagogy situated in the life worlds of their students and suggests that this model may be of benefit to many teachers. The issues this article reports on have resonance in all English-speaking countries. This article is also a personal story of the author's close involvement with ICT and English over 20 years, and provides evidence for his conviction that digital technologies will eventually transform English teaching.
Abstract:
Recently major processor manufacturers have announced a dramatic shift in their paradigm to increase computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift for the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, but also clusters of workstations and even large-scale distributed computing infrastructures, provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high-performance computing systems, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism: The first contribution addresses the classic problem of distributed association rule mining, focusing on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared-memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed-memory systems.
This approach is based on a hierarchical communication topology to solve issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVM), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution focuses on very efficient feature selection, describing a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
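The thread-level parallelism mentioned above for decision tree construction can be illustrated with a minimal sketch: candidate splits for different features are scored concurrently, and the best split is then selected. The data, helper names, and scoring choice (weighted Gini impurity) are hypothetical illustrations, not code from any of the workshop papers.

```python
# Sketch: score decision-tree split candidates in parallel, one task per feature.
from concurrent.futures import ThreadPoolExecutor

def gini(labels):
    """Gini impurity of a collection of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split_for_feature(rows, labels, f):
    """Best threshold for feature index f by weighted Gini of the two halves."""
    best = (float("inf"), None)  # (score, threshold)
    for t in sorted({r[f] for r in rows}):
        left = [y for r, y in zip(rows, labels) if r[f] <= t]
        right = [y for r, y in zip(rows, labels) if r[f] > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(labels)
        best = min(best, (score, t))
    return f, best

# Hypothetical toy dataset: 4 samples, 2 numeric features, binary labels.
rows = [(2.0, 1.0), (3.0, 9.0), (8.0, 2.0), (9.0, 8.0)]
labels = [0, 0, 1, 1]

# Each feature's candidate thresholds are evaluated in a separate thread.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(lambda f: best_split_for_feature(rows, labels, f),
                            range(len(rows[0]))))

f, (score, t) = min(results, key=lambda r: r[1][0])
print(f"split on feature {f} at {t} (weighted Gini {score:.3f})")
```

In CPython the threads mainly illustrate the task decomposition; the shared-memory approaches discussed at the workshop would use native threads scoring disjoint feature subsets against the same in-memory dataset.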
Abstract:
This paper discusses the implications of the shifting cultural significance of public open space in urban areas. In particular, it focuses on the increasing dysfunction between people's expectations of that space and its actual provision and management. In doing so, the paper applies Lefebvre's ideas of spatiality to the evident paradigm shift from 'public' to 'private' culture, with its associated commodification of previously public space. While developing the construct of paradigm shift, the paper recognises that the former political notions inherent in the provision of public space remain in evidence. So whereas public parks were formerly seen as spaces of confrontation between the 'rationality' of public order and the 'irrationality' of individual leisure pursuits, they are now increasingly seen, particularly 'out of hours', as the domain of the dispossessed, to be defined and policed as 'dangerous'. Where once people were welcomed into public open spaces as a means of 'educating' them in good, acceptable, leisure practices, therefore, they are now increasingly excluded, but for the same ostensible reasons. Building on survey work undertaken in Reading, Berkshire, the paper illustrates how communities can become separated from 'their' space, leaving them with the overriding impression that they have been 'short-changed' in terms of both the provision and the management of urban open space. Rather than the intimacy of local space for local people, therefore, the paper argues that parks have become externalised places, increasingly responding to commercial definitions of culture and what is 'public'. Central urban open spaces are therefore increasingly becoming sites of stratification, signification of a consumer-constructed citizenship and valorisation of public life as a legitimate element of the market surface of town and city centres.
Abstract:
Since the Eighteenth Century the protection of public recreational access to private land has been maintained by the state through a mixture of legal rights of passage and the safeguarding of certain de facto access rights. While this situation has been modified in the last fifty years to facilitate some formalisation of access arrangements and landowner compensation in areas of high recreational pressure and low legal accessibility, recent policies indicate that a shift from public to private rights is underway. At the core of this paradigm shift are the new access payment schemes introduced as part of the restructuring of the European Common Agricultural Policy. Under these schemes landowners are now paid for 'supplying' recreational access, with the state, as the former upholder of citizen rights, now assuming the duplicitous position of further underwriting private property ownership through the effective commodification of access, while simultaneously proclaiming significant improvements in citizens' access rights.
Abstract:
Plant traits – the morphological, anatomical, physiological, biochemical and phenological characteristics of plants and their organs – determine how primary producers respond to environmental factors, affect other trophic levels, influence ecosystem processes and services and provide a link from species richness to ecosystem functional diversity. Trait data thus represent the raw material for a wide range of research from evolutionary biology, community and functional ecology to biogeography. Here we present the global database initiative named TRY, which has united a wide range of the plant trait research community worldwide and gained an unprecedented buy-in of trait data: so far 93 trait databases have been contributed. The data repository currently contains almost three million trait entries for 69 000 of the world's 300 000 plant species, with a focus on 52 groups of traits characterizing the vegetative and regeneration stages of the plant life cycle, including growth, dispersal, establishment and persistence. A first data analysis shows that most plant traits are approximately log-normally distributed, with widely differing ranges of variation across traits. Most trait variation is between species (interspecific), but significant intraspecific variation is also documented, up to 40% of the overall variation. Plant functional types (PFTs), as commonly used in vegetation models, capture a substantial fraction of the observed variation – but for several traits most variation occurs within PFTs, up to 75% of the overall variation. In the context of vegetation models these traits would be better represented by state variables rather than by fixed parameter values.
The improved availability of plant trait data in the unified global database is expected to support a paradigm shift from species to trait-based ecology, offer new opportunities for synthetic plant trait research and enable a more realistic and empirically grounded representation of terrestrial vegetation in Earth system models.
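The inter- versus intraspecific comparison described above is a one-way variance decomposition: total variation across all measurements splits into a between-species and a within-species component. A minimal sketch with hypothetical numbers (not TRY data; species names and values are invented for illustration):

```python
# Sketch: partition trait variance into between- and within-species components.
import statistics

# Hypothetical log10 trait measurements, a few replicates per species.
trait = {
    "sp_a": [1.10, 1.15, 1.05],
    "sp_b": [1.40, 1.38, 1.45],
    "sp_c": [0.90, 0.95, 0.88],
}

all_values = [v for vals in trait.values() for v in vals]
grand_mean = statistics.mean(all_values)

# Sums of squares, as in a one-way ANOVA decomposition.
ss_total = sum((v - grand_mean) ** 2 for v in all_values)
ss_within = sum(
    (v - statistics.mean(vals)) ** 2
    for vals in trait.values()
    for v in vals
)
ss_between = ss_total - ss_within

print(f"between-species share: {ss_between / ss_total:.1%}")
print(f"within-species share:  {ss_within / ss_total:.1%}")
```

With these invented replicates nearly all variation is interspecific; a trait with 40% intraspecific variation, as reported above, would show a within-species share of 0.40 in the same decomposition.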
Abstract:
Purpose – The purpose of this paper is to highlight the serious limitations of neo-liberal capitalism and urge a shift to socialized capital before further economic deterioration leads to a succession of global conflicts. Design/methodology/approach – This conceptual paper adopts a macro perspective in presenting an argument on how global financial market integration and capital flow liberalization have led to inadequate market and corporate governance measures. The argument is couched in selected literature and is preceded by a proposed solution – the requirement for socialized capital. An analysis of the nature of socialized capital is outlined, and the questions that require attention are identified if a paradigm shift from neo-liberal capitalism is to take place. Findings – The need to shift urgently to a new philosophy of capitalism is overwhelming. It is emphasized that capital needs to adopt a socialized identity, supported by investment horizons of 30 years or more. It is argued that non-market (e.g. state, NGOs, civil society) intervention is critical in setting appropriate frameworks within which socialized capital can operate. Research limitations/implications – This is a theoretical paper, in which questions are raised that require transparent, public debate. Originality/value – The paper presents the case for a fundamental reconsideration of present-day markets, the role of capital and the influence of elites in determining the public good.
Abstract:
Customers will not continue to pay for a service if it is perceived to be of poor quality and/or of no value. With a paradigm shift towards business dependence on service-orientated IS solutions [1], it is critical that alignment exists between service definition, delivery, and customer expectation if businesses are to ensure customer satisfaction. Services, and micro-service development, offer businesses a flexible structure for solution innovation; however, constant changes in technology and in business and societal expectations mean that an iterative analysis solution is required to i) determine whether provider services adequately meet customer segment needs and expectations, and ii) help guide business service innovation and development. In this paper, by incorporating multiple models, we propose a series of steps to help identify and prioritise service gaps. Moreover, the authors propose the Dual Semiosis Analysis Model, i.e. a tool that highlights where, within the symbiotic customer/provider semiosis process, requirements misinterpretation and/or service provision deficiencies occur. This paper offers the reader a powerful customer-centric tool, designed to help business managers highlight both what services are critical to customer quality perception and where future innovation should be focused.
Abstract:
Bacterial resistance to antibiotics poses a serious health threat. Since research into new antibiotics is not progressing at the same rate as the development of bacterial resistance, widespread calls for alternatives to antibiotics have been made. Phage therapy is an ideal alternative candidate to be investigated. However, the success of phage therapy may be hampered by a lack of investment from large pharmaceutical companies, owing to phages' narrower spectrum of activity compared with antibiotics, the very large costs associated with clinical trials of the variety of phages needed, and regulatory requirements that remain unclear. Intellectual property is difficult to secure for therapeutic phage products for a variety of reasons, and patenting procedures vary widely between the US and the EU. Consequently, companies are more likely to invest in phage products for decontamination or veterinary use, rather than clinical use in humans. Some still raise questions as to the safety of phage therapy overall, suggesting the possibility of cytotoxicity and immunogenicity, depending on the phage preparation and route. On the other hand, with patients dying because of infections untreatable with conventional antibiotics, the question arises as to whether it is ethical not to pursue phage therapy more diligently. A paradigm shift in how phage therapy is perceived is required, as well as more rigorous proof of efficacy in the form of clinical trials of existing medicinal phage products. Phage therapy's potential may be fulfilled in the meantime by allowing individual preparations to be used on a named-patient basis, with extensive monitoring and multidisciplinary team input. The National Health Service and academia have a role in carrying out clinical phage research, which would be beneficial to public health, but not necessarily financially rewarding.
Abstract:
Purpose of review Novel analyses of the relations between thyroid hormone receptor signaling and estrogen receptor-dependent mechanisms are timely for two sets of reasons. Clinically, both affect mood and foster neuronal growth and regeneration. Mechanistically, they overlap at the levels of DNA recognition elements, coactivators, and signal transduction systems. Crosstalk between thyroid hormone receptors and estrogen receptors is possibly important to integrate external signals to transcription within neurons. Recent findings It has been shown that reproductive functions, including behaviors, driven by estrogens can be antagonized by thyroid hormones, and it has been argued that such crosstalk is biologically adaptive to ensure optimal reproduction. Transcriptional facilitation in transient transfection studies shows that the interactions between thyroid receptor isoforms and estrogen receptor isoforms depend on cell type and promoter context. Overall, this pattern of interactions assures multiple and flexible means of transcriptional regulation. Surprisingly, in some brain areas, thyroid hormone actions can synergize with estrogenic effects, particularly when nongenomic modes of action are considered, such as kinase activation, which, as has been reported, affects later estrogen receptor-induced genomic events. Summary In summary, recent work with nerve cells has contributed to a paradigm shift in how the molecular and behavioral effects of hormones that act through nuclear receptors are viewed.
Abstract:
The aim of this qualitative respondent investigation is to delve into the various views that teachers have concerning the "One-to-One project", as well as the use of computers as an aid in teaching. One-to-One means that teachers and students are equipped with a laptop they can use at home and at school. This essay looks at how several factors have changed as a result. These factors are threefold: the role of the teacher, the teaching experience, and the students' learning process. To answer these questions, four interviews were conducted at two different high schools in southern Norrland. The theory used is the socio-cultural perspective. One result is that computers can simplify teaching in various ways: students have faster access to information, and a platform exists for further communication between teacher and student outside the classroom. However, there are also several negative aspects. One of these is that students spend time on non-school-related activities, such as interacting on social media. Results also show that the role of the teacher has, as a result of the "One-to-One project", gone from being structural to being interactional. The conclusion reached by the investigation is that today's schools are experiencing a paradigm shift: old teaching methods are being replaced by new ones, and an altered teaching practice has developed as a result of the presence of the computer in the classroom.
Abstract:
This study seeks to answer the question of whether the current organizational structure of the Senado Federal is adequate to the new demands arising from the promulgation of the 1988 Constitution and from the change in parameters and values brought by the paradigm emerging in public administration since 1990. The study sets out to correlate analytically the functions of a legislative house with the technical body responsible for carrying them out, prioritizing values related to serving the public interest and complying with the constitutional provisions set out in articles 48, 49, 50 and 52 of the Constitution. The analysis brings concepts and tools developed by management theorists into the institution's practice, providing data and up-to-date information with the aim of drawing attention to the problem and broadening its scope.
Abstract:
This work describes some of the solutions currently adopted by the federal courts for recording hearings, and points to the opportunity that the interactive environment of Brazilian Digital TV offers for a proposed electronic office document model that can support the output of hearing recordings (text, sound and image) and contribute to a paradigm shift in current case-management systems (software). The goal is to establish a standard grounded in public policy (the Brazilian Electronic Government and the Brazilian Digital Terrestrial TV System), free of commercial restrictions on the use of new information and communication technologies, at least to the minimum extent needed to favor social inclusion without loss of efficiency. The work comprises two kinds of content: a textual part and a digital part. The textual part contains the results of a survey conducted with the federal courts, presents the main points of the Brazilian Electronic Government and the Brazilian Digital Terrestrial TV System, and describes the structure assembled to produce the digital part. The digital part, in turn, gathers the material used to present prototypes (videos and sample applications) demonstrating the interactivity possibilities of Brazilian Digital TV and the benefits that litigants and legal practitioners would gain from the proposal.
Abstract:
Over one-third of global food production goes to waste while over 850 million people are fighting chronic hunger. The United States is the world's largest food waster: one-third of America's food, with an economic value of US$161 billion, is wasted, and less than 7% is recycled. American food waste ends up in landfills, creating powerful methane gas emissions. South Korea, on the other hand, has implemented the world's strictest food waste laws, and today diverts 93% of wasted food away from landfills, turning such waste into powerful economic opportunities. This Master's thesis investigates the reasons behind global food waste by comparing South Korea and the US. It explores what these two nations are doing to address their respective food waste problems, South Korea successfully, the US not. The paper looks at the two countries' respective policies and national characteristics, which impact decision-making and recycling processes. The effort concludes that South Korea has embarked on a necessary paradigm shift, turning food waste into a powerful economic driver and producing a sharp decline in food waste. In the US, food waste continues to be a major problem without a national strategy to remedy it. Any effort in the US, while laudable, is sporadic and local, and hence the US misses out on possibly important economic growth opportunities.
Abstract:
The aim of this study is to propose the implementation of a statistical model for volatility estimation that is little known in the Brazilian literature, the local scale model (LSM), presenting its advantages and disadvantages relative to the models usually employed for risk measurement. Parameters are estimated from daily Ibovespa quotes over the period from January 2009 to December 2014, and the models' empirical accuracy is assessed through out-of-sample tests comparing the VaR obtained for January through December 2014. Explanatory variables were introduced in an attempt to improve the models; the American counterpart of the Ibovespa, the Dow Jones index, was chosen because it exhibited properties such as high correlation, Granger causality and a significant log-likelihood ratio. One of the innovations of the local scale model is that it does not work directly with the variance but with its reciprocal, called the "precision" of the series, which follows a kind of multiplicative random walk. The LSM captured all the stylized facts of financial series, and the results favored its use; the model is therefore an efficient and parsimonious alternative specification for estimating and forecasting volatility, since it has only one parameter to estimate, which represents a paradigm shift relative to conditional heteroskedasticity models.
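The out-of-sample check described above can be sketched minimally: a volatility model produces one-day-ahead forecasts, each forecast is turned into a VaR figure, and realized returns are compared against it by counting breaches. The volatilities and returns below are hypothetical (not Ibovespa data), and a Gaussian quantile is used purely for illustration rather than any particular model's return distribution.

```python
# Sketch: counting one-day 99% VaR breaches in an out-of-sample window.
from statistics import NormalDist

z99 = NormalDist().inv_cdf(0.01)  # lower 1% quantile, about -2.326

# Hypothetical forecast volatilities and realized returns for 5 days.
sigma_forecast = [0.010, 0.012, 0.011, 0.015, 0.013]
realized = [-0.005, -0.030, 0.004, -0.010, 0.001]

# One-day 99% VaR per day: the loss threshold implied by each forecast.
var_99 = [z99 * s for s in sigma_forecast]

# A breach occurs when the realized return falls below the VaR threshold.
exceedances = sum(r < v for r, v in zip(realized, var_99))

print(f"VaR breaches: {exceedances} of {len(realized)} days")
```

In a backtest like the one the study performs, the observed breach frequency is compared with the nominal 1% rate; a model whose precision recursion tracks volatility well should produce a breach count close to that nominal level.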