232 results for obsolete
Abstract:
In-memory databases have become a mainstay of enterprise computing, offering significant performance and scalability boosts for online analytical and (to a lesser extent) transactional processing, as well as improved prospects for integration across different applications through an efficient shared database layer. Significant research and development has been undertaken over several years concerning the data management considerations of in-memory databases. However, limited insights are available into the impacts on applications and their supporting middleware platforms, and how these need to evolve to fully function through, and leverage, in-memory database capabilities. This paper provides a first, comprehensive exposition of how in-memory databases impact Business Process Management, as a mission-critical and exemplary model-driven integration and orchestration middleware. Through it, we argue that in-memory databases will render some prevalent uses of legacy BPM middleware obsolete, but also open up exciting possibilities for tighter application integration, better process automation performance and some entirely new BPM capabilities such as process-based application customization. To validate the feasibility of in-memory BPM, we develop a surprisingly simple BPM runtime embedded in SAP HANA that provides BPMN-based process automation capabilities.
Abstract:
Fossils provide the principal basis for temporal calibrations, which are critical to the accuracy of divergence dating analyses. Translating fossil data into minimum and maximum bounds for calibrations is the most important, and often least appreciated, step of divergence dating. Properly justified calibrations require the synthesis of phylogenetic, paleontological, and geological evidence and can be difficult for non-specialists to formulate. The dynamic nature of the fossil record (e.g., new discoveries, taxonomic revisions, updates of global or local stratigraphy) requires that calibration data be updated continually lest they become obsolete. Here, we announce the Fossil Calibration Database (http://fossilcalibrations.org), a new open-access resource providing vetted fossil calibrations to the scientific community. Calibrations accessioned into this database are based on individual fossil specimens and follow best practices for phylogenetic justification and geochronological constraint. The associated Fossil Calibration Series, a calibration-themed publication series at Palaeontologia Electronica, will serve as one key pipeline for peer-reviewed calibrations to enter the database.
Abstract:
The construction industry accounts for a significant portion of the material consumption of our industrialised societies. That material consumption comes at an environmental cost, and when buildings and infrastructure projects are demolished and discarded after their useful lifespan, that environmental cost remains largely unrecovered. The expected operational lifespan of modern buildings has become disturbingly short, as buildings are replaced for reasons of changing cultural expectations, style, serviceability, locational obsolescence and economic viability. The same buildings, however, are not always physically or structurally obsolete; the materials and components within them are very often still completely serviceable. While there is some activity in the area of recycling of selected construction materials, such as steel and concrete, this is almost always in the form of downcycling or reprocessing. Very little of this material and component resource is reused in a way that more effectively captures its potential. One significant impediment to such reuse is that buildings are not designed in a way that facilitates easy recovery of materials and components; they are designed and built for speed of construction and quick economic returns, with little or no consideration of the longer-term consequences of their physical matter. This research project explores the potential for the recovery of materials and components if buildings were designed for such future recovery: a strategy of design for disassembly. This is not a new design philosophy; design for disassembly is well understood in product design and industrial design. There are also some architectural examples of design for disassembly; however, these are specialist examples, and there has been no significant attempt to implement the strategy in the mainstream construction industry. This paper presents research into the analysis of the embodied energy in buildings, highlighting its significance in comparison with operational energy. Analysis at material, component, and whole-of-building levels shows the potential benefits of strategically designing buildings for future disassembly to recover this embodied energy. Careful consideration at the early design stage can result in the deconstruction of significant portions of buildings and the recovery of their potential through higher-order reuse and upcycling.
Abstract:
Similar to most other creative industries, the evolution of the music industry is heavily shaped by media technologies. This was equally true in 1999, when the global recorded music industry had experienced two decades of continuous growth, largely driven by the rapid transition from vinyl records to Compact Discs. The transition encouraged avid music listeners to purchase much of their music collections all over again in order to listen to their favourite music with ‘digital sound’. As a consequence of this successful product innovation, recorded music sales (in unit terms) more than doubled between the early 1980s and the end of the 1990s. It was against this backdrop that the first peer-to-peer file sharing service was developed and released to the mainstream music market in 1999 by the college student Shawn Fanning. The service was named Napster, and it marks the beginning of an era that is now a classic example of how an innovation is able to disrupt an entire industry and make large swathes of existing industry competences obsolete. File sharing services such as Napster, followed by a range of similar services in its path, reduced physical unit sales in the music industry to levels that had not been seen since the 1970s. The severe impact of the internet on physical sales shocked many music industry executives, who spent much of the 2000s vigorously trying to reverse the decline and make the disruptive technologies go away. In the end, they learned that their efforts were to no avail; the impact on the music industry proved to be transformative, irreversible and, to many music industry professionals, also devastating. Thousands of people lost their livelihoods, and large and small music companies have folded or been forced into mergers or acquisitions. But as always during periods of disruption, the past 15 years have also been very innovative, spurring a plethora of new music business models. These new business models have mainly emerged outside the music industry, and the innovators have often been required to be both persuasive and persistent in order to gain acceptance from the risk-averse and cash-poor music industry establishment. Apple was one such change agent: in 2003 it became the first company to open up a functioning and legal market for online music. iTunes Music Store was the first online retail outlet able to offer the music catalogues of all the major music companies; it used an entirely novel pricing model, and it allowed consumers to de-bundle the music album and buy only the songs that they actually liked. Songs had previously been bundled by physical necessity as discs or cassettes, but with iTunes Music Store, the institutionalized album bundle slowly started to fall apart. The consequences had an immediate impact on music retailing, and within just a few years many brick-and-mortar record stores were forced out of business in markets across the world. The transformation also had disruptive consequences beyond music retailing and redefined music companies’ organizational structures, work processes and routines, as well as professional roles. iTunes Music Store was in one sense a disruptive innovation, but it was at the same time relatively incremental, since the major labels’ positions and power structures remained largely unscathed. The rights holders still controlled their intellectual properties, and the structures governing the royalties paid per song sold were predictable, transparent and in line with established music industry practices.
Abstract:
This creative work is an original soundtrack for the multimedia performance adaptation of Oscar Wilde’s De Profundis, led by David Fenton and Brian Lucas and produced by Metro Arts. Intermediality offers unique challenges to the composer creating towards live performance. Given the text-based nature of the piece, and the prevalence of screen content, music had a distinct role to play in supporting the intermedial performance environment. Drawing from Oscar Wilde’s own writings in the initial stages [“...richer cadences…more curious effects”, “…the cry of Marsyas” and the “deferred resolution of Chopin”], the deliberately risky compositional process experimented with improvised location recordings and found sounds, random and fragmented assemblages of vintage recordings, rough methods and obsolete recording technology, and the sonic kinship of the hissing sibilances of the sea, theatrical applause and the crackle of antique recording devices (which had just been invented in Wilde’s time), all worked into wefts of sound. As the soundtrack emerged, it was clearly resistant to ‘concepts’ imposed from the outside, and as the field of possibilities expanded and engaged in dialogue with the other elements of the performance (live and projected), certain pieces were selected by the director and curated into the emerging work. Thus leitmotifs emerged, rather than being imposed from the outset, with a particular through-line holding: if it was too obviously like ‘music’ (which is usually used in theatre as emotional lubrication and narrative signpost), it didn’t work, and if it sounded like avant-garde sound-art, it was too grating and detracted from the primacy of the text. As a composer I worked the sweet spot between these two poles while also serving David Fenton’s curation: he determined which compositions to incorporate, reiterate and omit as part of the process of writing text, action and image, and the compositional process responded with organic elaborations and variations on these selections. Musical resolution was mostly deferred until the closing stages of the performance. The soundtrack was present for the duration of the show, and Artshub reviewed the musical component thus: “...the score by David Megarrity is a refined, understated ambient scaffolding.” It premiered at the Visy Theatre, Brisbane Powerhouse, on 22 April 2015.
Abstract:
The dissertation "From Conceptual to Corporeal, from Quotation to Site: Painting and History of Contemporary Art" explores the state of painting in contemporary art and art theory since the 1960s. The purpose of the study is to reconsider the dominant "end of painting" narrative in contemporary art history, which goes back to the modernist ideology of painting as a reductive, medium-specific form of art. Drawing on Michel Foucault's concepts of discursive formation and archive, as well as Jean-Luc Nancy's post-phenomenological philosophy of corporeality, I suggest that contemporary painting can be redefined as a discursive-sensuous practice. Instead of seeing painting as obsolete or over as an avant-garde art genre, I show that there have been alternative, neo-avant-gardist ways of defining painting since the end of the 1960s, such as the French artist Daniel Buren's early writings on painting as "theoretical practice". Consequently, the tendency of the canonical Anglo-American contemporary art narratives to underestimate the historical and institutional codes of art can be questioned. This tendency can be seen, for example, in Rosalind Krauss's influential theory of the index. The study also reflects on the relations between conceptual art and painting since the 1960s and maps recent theories of painting, which re-examine the genre's possibilities after the modernist rhetoric. Concepts of "flatbed", "painting in the extended field", "as painting" and so on are compared critically with the idea of painting as discursive practice. It is also shown that the issues in painting arise from the contemporary critical art debate, as the dematerialisation paradigm of conceptual art has dissolved. The study focuses on the corporeal-material-sensuous cluster of meanings attached to painting and searches for its avant-gardist possibilities as redefined by postfeminist and post-phenomenological discourse. The ideas of the hierarchy of the senses and synesthesia are developed within the framework of Jean-Luc Nancy's and Luce Irigaray's thought. The scope of the study is Finnish painting from 1990 to 2002. On the Finnish art scene there has been no "end of painting" ideology, strictly speaking. The mythology and medium-specificity of modernism have been deconstructed since the mid-1980s, but "the archive" of painting, such as themes of abstraction, formalism and synesthesia, has been reworked by the discursive practice of painting, for example in the works of Nina Roos, Tarja Pitkänen-Walter and Jussi Niva.
Abstract:
The attempt to refer meaningful reality as a whole to a unifying ultimate principle - the quest for the unity of Being - was one of the basic tendencies of Western philosophy from its beginnings in ancient Greece up to Hegel's absolute idealism. However, the different trends of contemporary philosophy tend to regard such a speculative metaphysical quest for unity as obsolete. This study addresses this contemporary situation on the basis of the work of Martin Heidegger (1889-1976). Its methodological framework is Heidegger's phenomenological and hermeneutical approach to the history of philosophy. It seeks to understand, in terms of the metaphysical quest for unity, Heidegger's contrast between the first (Greek) beginning or "onset" (Anfang) of philosophy and another onset of thinking. This other onset is a possibility inherent in the contemporary situation in which, according to Heidegger, the metaphysical tradition has developed to its utmost limits and thereby come to an end. Part I is a detailed interpretation of the surviving fragments of the Poem of Parmenides of Elea (fl. c. 500 BC), an outstanding representative of the first philosophical beginning in Heidegger's sense. It is argued that the Poem is not a simple denial of apparent plurality and difference ("mortal acceptances," doxai) in favor of an extreme monism. Parmenides' point is rather to show in what sense the different instances of Being can be reduced to an absolute level of truth or evidence (aletheia), which is the unity of Being as such (to eon). What in prephilosophical human experience is accepted as being is referred to the source of its acceptability: intelligibility as such, the simple and undifferentiated presence to thinking that ultimately excludes unpresence and otherness. Part II interprets selected key texts from different stages in Heidegger's thinking in terms of the unity of Being. It argues that one aspect of Heidegger's sustained and gradually deepening philosophical quest was to think the unity of Being as singularity, as the instantaneous, context-specific, and differential unity of a temporally meaningful situation. In Being and Time (1927) Heidegger articulates the temporal situatedness of the human awareness of meaningful presence. His later work moves on to study the situational correlation between presence and human awareness. Heidegger's "postmetaphysical" articulation seeks to show how presence becomes meaningful precisely as situated, in an event of differentiation from a multidimensional context of unpresence. In resigning itself to this irreducibly complicated and singular character of meaningful presence, philosophy also faces its own historically situated finitude. This resignation is an essential feature of Heidegger's "other onset" of thinking.
Abstract:
Regardless of your industry, the marketplace is continually evolving. The reason, increasingly, is disruptive technology. Disruptive technologies are enhanced or new technological innovations that essentially displace conventional and established technology, rendering it obsolete. They can create opportunities for new products, new markets, and new ways of conducting business. In 2016, business models will again change as businesses adapt. The enhancement of current technology and the development of new technological innovations will undeniably transform how new businesses are established and how existing businesses compete. For small and medium-sized firms, technology will also enable significant leaps forward in terms of innovation, efficiency and competitiveness. Adapting quickly will be essential, so here are the top six we think you should be prepared for.
Abstract:
Small firms are always vulnerable to complex technological change that may render their existing business model obsolete. This paper emphasises the need to understand how the Internet's ubiquitous World Wide Web is impacting on their operating environments. Consideration of evolutionary theory and the absorptive capacity construct provides the foundation for a discussion of how learning and discovery take place within individuals, firms and the environments they interact with. Small firms, we argue, face difficulties identifying which routines and competencies are best aligned with the seemingly invisible dominant designs that support the pursuit of new enterprise in web-impacted environments. We argue that such difficulties largely relate to an inability to acquire external knowledge and a subsequent reliance on existing internal selection processes that may reinforce the known at the expense of the unknown. The paper concludes with consideration of how managers can overcome the expected difficulties through the development of internal routines that support the continual search for, evaluation of, and acquisition of specific external knowledge.
Abstract:
The MIT Lincoln Laboratory IDS evaluation methodology is a practical approach to evaluating the performance of Intrusion Detection Systems, and it has contributed tremendously to research progress in that field. The DARPA IDS evaluation dataset has been criticized and considered by many to be a very outdated dataset, unable to accommodate the latest trends in attacks. The question then naturally arises as to whether detection systems have improved beyond detecting these older classes of attacks. If not, is it worth thinking of this dataset as obsolete? The paper presented here tries to provide supporting evidence for the use of the DARPA IDS evaluation dataset. Two commonly used signature-based IDSs, Snort and Cisco IDS, and two anomaly detectors, PHAD and ALAD, are used for this evaluation, and the results support the usefulness of the DARPA dataset for IDS evaluation.
Abstract:
My doctoral dissertation in sociology and Russian studies, Social Networks and Everyday Practices in Russia, employs a "micro" or "grassroots" perspective on the transition. The study is a collection of articles detailing social networks in five different contexts. The first article examines Russian birthdays from a network perspective. The second looks at health care to see whether networks have become obsolete in a sector that is still overwhelmingly public but increasingly monetarised. The third article investigates neighbourhood relations. The fourth details relationships at work, particularly from the vantage point of internal migration. The fifth explores housing and the role of networks and money in both the Soviet and post-Soviet eras. The study is based on qualitative social network and interview data gathered among three groups, teachers, doctors and factory workers, in St. Petersburg during 1993-2000. Methodologically it builds on a qualitative social network approach. The study adds a critical element to the discussion on networks in post-socialism. A considerable consensus exists that social networks were vital in state socialist societies and were used to bypass various difficulties caused by endemic shortages and bureaucratic rigidities, but their role in post-socialism has been a more debated issue. Some scholars have argued that the importance of networks has been dramatically reduced in the new market economy, whereas others have stressed their continuing importance. While a common denominator of both positions has been a focus on networks in relation to the past, a more overlooked aspect has been the question of inequality. To what extent is access to networks unequally distributed? What are the limits and consequences of networks, for those who have access, for those outside networks, and for society at large? My study provides some evidence about inequalities. It shows that some groups are privileged over others, for instance middle-class people in their informal access to health care. Moreover, analysing the formation of networks sheds additional light on inequalities, as it highlights the importance of migration as a mechanism of inequality, for example. The five articles focus on how networks are actually used in everyday life. The article on health care, for instance, shows that personal connections are still important and popular in post-Soviet Russia, despite the growing importance of money and the emergence of "fee for service" medicine. Fifteen of the twenty teachers were involved in informal medical exchange during a two-week study period, using their networks to bypass formal market mechanisms or official procedures. Medicines were obtained through personal connections because some were unavailable at local pharmacies or because these connections could provide medicines at a cheaper price or even for free. The article on neighbours shows that "mutual help" was the central feature of neighbouring: the exchange of goods, services and information accounted for almost half of the reported contacts with neighbours. Neighbours did not provide merely small-scale help but were often exchange partners because they possessed important professional qualities, had access to workplace resources, or knew somebody useful. The article on the Russian work collective details workplace-related relationships in a tractor factory and shows that interaction with and assistance from one's co-workers remains important. The most interesting finding was that co-workers were even more important to those who had migrated to the city than to those who were born there, which is explained by the specifics of Soviet migration. As a result, the workplace heavily shaped, or even absorbed, the contexts in which worker migrants established relationships, whereas many meeting places commonly available in Western countries were largely absent or at least did not function as trusted public venues for initiating relationships. More results can be found in my dissertation: Anna-Maria Salmi, Social Networks and Everyday Practices in Russia, Kikimora Publications, 2006; see www.kikimora-publications.com.
Abstract:
Soon after the Bolshevik Revolution of 1917, a three-year civil war broke out in Russia. As in many other civil wars, foreign powers intervened in the conflict. Britain played a leading role in this intervention and had a significant effect on the course of the war. Without this intervention on the White side, the Bolsheviks' superiority in manpower and weaponry would have quickly overwhelmed their opponents. The aim of this dissertation is to explain the nature and role of the British intervention on the southern, and most decisive, front of the Civil War. The political decision-making in London is studied as background, but the focus of the dissertation is on the actual implementation of British policy in Russia. The British military mission arrived in South Russia in late 1918 and started to provide General Denikin's White army with ample supplies. General Denikin would not have been able to build his army of more than 200,000 men or to mount his operation against Moscow without the British matériel. The British mission also organized the training and equipping of the Russian troops with British weapons. This made the material aid much more effective. Many of the British instructors took part in fighting the Bolsheviks despite the orders of their government. The study is based on primary sources produced by British departments of state and by members of the British mission and military units in South Russia. Primary sources from the Whites, including the personal collections of several key figures of the White movement and official records of the Armed Forces of South Russia, are also used to give a balanced picture of the course of events. It is possible to draw some general conclusions about the White movement and the reasons for its defeat from the study of the British intervention. In purely material terms, British aid placed Denikin's army in a far more favourable position than the Bolsheviks in 1919, but the other military defects of the White army were numerous. The White commanders were unimaginative, their military thinking was obsolete, and they were incapable of organizing the logistics of their army. There were also fundamental defects in the morale of the White troops. In addition to all the political mistakes of Denikin's movement and a general inability to adjust to the complex situation in Revolutionary Russia, the Whites suffered a clear military defeat. In South Russia the Whites were defeated not because of a lack of British aid, but rather in spite of it.
Abstract:
This work investigates the relationship between unemployment and technological progress. We briefly analyze the historical context in which the problem of unemployment emerged: the rise of a new economic system, capitalism. In our research we consider the contributions of two different economists, John Stuart Mill and Karl Marx, who offer two different perspectives on this subject. We also consider the contribution of David Ricardo, the first author to recognize the conflictual relationship between technical progress and the interests of workers. We also include an analysis of the problem using data collected from the Global Wage Report, published annually by the International Labour Organization (ILO). Finally, we analyze the phenomenon called planned obsolescence. The idea of producing low-quality goods in order to guarantee future demand (through the replacement of obsolete goods) has been considered a good option for avoiding unemployment. This work ends with some final comments and open questions.
Abstract:
Sovereignty emerged as a category of the modern State. From the beginning of modernity to the present day, this work shows how sovereignty arose and the vicissitudes it has undergone: from an absolutist conception, as a legitimating foundation for positive law, through the constitutionalist phenomenon and the emergence of the idea of the rule of law, to its apogee in the domestic and external spheres under totalitarianism, and its relativization in the second half of the twentieth century. With the establishment of the United Nations, modern sovereignty began a process of transmutation that was accelerated by contemporary globalization; today it is a category that has been resignified to fit the contemporary conception of the State, with new functions both internally and externally. This new sovereignty has entailed transformations in international law, which was classically based on the Westphalian system. The traditional theories that explain the relations between domestic and international law become insufficient in the face of the new configuration of international society, based on this new sovereignty, which is no longer a category opposed to law, but one that allows the various state legal orders and the international legal order to be integrated into a single whole, without any relation of hierarchy among them.
Abstract:
This work presents and analyses the Consórcio Intermunicipal do Leste Fluminense (CONLESTE), created in 2006 and composed of sixteen municipalities in the state of Rio de Janeiro, with the objective of seeking solutions to the impacts of the Petrobras Petrochemical Complex (Comperj). It discusses the crisis of regional planning and the absence of territorial governance arrangements that address the specificities of the Brazilian federative pact. It presents intermunicipal consortia as an alternative, used by many Brazilian cities, to the obsolete model of the Metropolitan Regions, without, however, dwelling on proof of their efficiency. Through a bibliographic review of classical and contemporary authors, it addresses the controversial concept of region as a method for understanding the regional function performed by the consortium. It debates the importance of intermediate scales amid the false global-local polarization. It reinforces the idea of the need for articulation among the different spheres of government as a way of minimizing the disruptions caused by the urban chaos in which the population of the CONLESTE area lives.