841 results for Outlines of Pyrrhonism
Abstract:
X-ray photoelectron spectroscopy (XPS) can play an important role in guiding the design of new materials, tailored to meet increasingly stringent constraints on device performance, by providing insight into their surface compositions and the fundamental interactions between the surfaces and the environment. This chapter outlines the principles and application of XPS as a versatile, chemically specific analytical tool for determining the electronic structures and (usually surface) compositions of constituent elements within diverse functional materials. Advances in detector electronics have opened the way for the development of photoelectron microscopes and instruments with XPS imaging capabilities. Advances in surface science instrumentation that enable time-resolved spectroscopic measurements offer exciting opportunities to quantitatively investigate the composition, structure and dynamics of working catalyst surfaces. Attempts to study the effects of material processing in realistic environments currently involve the use of high- or ambient-pressure XPS, in which samples can be exposed to reactive environments.
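The abstract stops short of the quantification step, but the standard route from measured XPS peak areas to a surface composition uses relative sensitivity factors, C_i = (I_i / S_i) / sum_j (I_j / S_j). A minimal Python sketch, with purely illustrative peak areas and sensitivity factors (not values from the chapter):

```python
# Minimal sketch of XPS quantification via relative sensitivity factors (RSFs).
# Peak areas and RSF values are illustrative placeholders, not measured data.

def atomic_fractions(peak_areas, sensitivity_factors):
    """Return atomic fractions C_i = (I_i / S_i) / sum_j (I_j / S_j)."""
    normalised = {el: area / sensitivity_factors[el] for el, area in peak_areas.items()}
    total = sum(normalised.values())
    return {el: value / total for el, value in normalised.items()}

if __name__ == "__main__":
    areas = {"C 1s": 12000.0, "O 1s": 30000.0, "Ti 2p": 18000.0}  # integrated peak areas (hypothetical)
    rsf = {"C 1s": 0.296, "O 1s": 0.711, "Ti 2p": 2.001}          # instrument-specific RSFs (assumed)
    for element, fraction in atomic_fractions(areas, rsf).items():
        print(f"{element}: {fraction:.1%}")
```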
Abstract:
Adult illiteracy rates are alarmingly high worldwide. The portability, affordability, and ease of use of mobile (or handheld) devices offer a realistic opportunity to provide novel, context-sensitive literacy resources to adults with limited literacy skills. To this end, we developed the concept of ALEX – a mobile Adult Literacy support application for EXperiential learning (Lumsden et al., 2005). On the basis of a medium-fidelity prototype of this application, we conducted an evaluation of ALEX using participants from our intended user group. This evaluation had two goals: (a) to assess the usefulness of the ALEX concept and the usability of its current design; and (b) to reflect on the appropriateness of our evaluation process given the literacy-related needs of our participants. This paper outlines our approach to this evaluation as well as the results we obtained and our reflections on the process.
Abstract:
The paper has been presented at the International Conference "Pioneers of Bulgarian Mathematics", dedicated to Nikola Obreshkov and Lubomir Tschakaloff, Sofia, July 2006.
Abstract:
Aristotle is well known to have taught that the brain was a mere coolant apparatus for overheated blood and to have located the hegemonikon in the heart. This teaching was hotly disputed by his immediate successors in the Alexandrian Museum, who showed that the brain played the central role in psychophysiology. This was accepted and developed by the last great biomedical figure of classical antiquity - Claudius Galen. However, Aristotle's cardiocentric theory did not entirely disappear and this article traces its influence through the Arabic physicians of the Islamic ascendancy, into the European Middle Ages where Albertus Magnus' attempt to reconcile cardiocentric and cerebrocentric physiology was particularly influential. It shows how cardiocentricity was sufficiently accepted to attract the attention of, and require refutation by, many of the great names of the Renaissance, including Vesalius, Fernel, and Descartes, and was still taken seriously by luminaries such as William Harvey in the mid-seventeenth century. The article, in rehearsing this history, shows the difficulty of separating the first-person perspective of introspective psychology and the third-person perspective of natural science. It also outlines an interesting case of conflict between philosophy and physiology. © 2013 Copyright Taylor & Francis Group, LLC.
Abstract:
In dimensional metrology, the largest source of measurement uncertainty is often thermal variation. Dimensional measurements are currently scaled linearly, using ambient temperature measurements and coefficients of thermal expansion, to the ideal metrology condition of 20 °C. This scaling is particularly difficult to implement with confidence in large volumes, as the temperature is unlikely to be uniform, resulting in thermal gradients. A number of well-established computational methods are used in the design phase of product development for the prediction of thermal and gravitational effects, and these could be used to a greater extent in metrology. This paper outlines the theory of how physical measurements of dimension and temperature can be combined more comprehensively throughout the product lifecycle, from design through to the manufacturing phase. The Hybrid Metrology concept is also introduced: an approach to metrology that promises to improve product and equipment integrity in future manufacturing environments. The Hybrid Metrology System combines various state-of-the-art physical dimensional and temperature measurement techniques with established computational methods to better predict thermal and gravitational effects.
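The linear scaling to the 20 °C reference condition described above amounts to L_20 = L_T / (1 + alpha * (T - 20)). A minimal sketch, assuming a typical expansion coefficient for steel and hypothetical measurement values:

```python
# Minimal sketch of the linear thermal-expansion scaling described in the abstract:
# a length measured at ambient temperature T is referred back to the 20 degC
# reference condition using the coefficient of thermal expansion. Values are illustrative.

REFERENCE_TEMP_C = 20.0  # standard reference temperature for dimensional metrology

def length_at_reference(measured_length_mm, temp_c, alpha_per_k):
    """L_20 = L_T / (1 + alpha * (T - 20))."""
    return measured_length_mm / (1.0 + alpha_per_k * (temp_c - REFERENCE_TEMP_C))

if __name__ == "__main__":
    alpha_steel = 11.5e-6   # 1/K, typical for steel (assumed)
    measured = 500.0123     # mm, measured length (hypothetical)
    ambient = 23.4          # degC, ambient temperature at the time of measurement
    print(f"Length at 20 degC: {length_at_reference(measured, ambient, alpha_steel):.4f} mm")
```

In a large measurement volume with thermal gradients, a single ambient reading and a single expansion coefficient are exactly the simplification the paper argues should be supplemented with computational prediction.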
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2014.
Abstract:
Advertising and other forms of communications are often used by government bodies, non-government organisations, and other institutions to try to influence the population either to a) reduce some form of harmful behaviour (e.g. smoking, drunk-driving) or b) increase some healthier behaviour (e.g. eating healthily). It is common for these messages to be predicated on the chances of some negative event occurring if the individual does not either a) stop the harmful behaviour, or b) start or increase the healthy behaviour. This design of communication is referred to by many names in the relevant literature but, for the purposes of this thesis, will be termed a 'threat appeal'. Despite their widespread use in the public sphere, and concerted academic interest since the 1950s, the effectiveness of threat appeals in delivering their objective remains unclear in many ways. A detailed, chronological and thematic examination of the literature uncovers two assumptions that have either been upheld despite little supporting evidence or have received scant attention: specifically, a) that threat appeal characteristics can be conflated with their intended responses, and b) that a threat appeal always and necessarily evokes a fear response in the subject. A detailed examination of these assumptions underpins this thesis. The intention is to take as a point of departure the equivocality of empirical results, and to deliver a novel approach with the objective of reducing the confusion that is evident in existing work. More specifically, the present thesis frames cognitive and emotional responses to threat appeals as part of a decision about future behaviour. To further develop theory, a conceptual framework is presented that outlines the role of anticipated and anticipatory emotions, alongside subjective probabilities, elaboration and immediate visceral emotions, resulting from manipulation of the intrinsic message characteristics of a threat appeal (namely, message direction, message frame and graphic image). In doing so, the spectrum of relevant literature is surveyed and used to develop a theoretical model that integrates key strands of theory into a coherent whole. In particular, the emotional and cognitive responses to the threat appeal manipulations are hypothesised to influence behavioural intentions and expectations pertaining to future behaviour. Using data from a randomised experiment with a sample of 681 participants, the conceptual model was tested using analysis of covariance. The results were encouraging overall, both for the conceptual framework and for the individual hypotheses. In particular, the empirical results showed clearly that emotional responses to the intrinsic message characteristics are not restricted to fear, and that different responses to threat appeals were clearly attributable to specific intrinsic message characteristics. In addition, the inclusion of anticipated emotions alongside cognitive appraisals in the framework generated interesting results. Specifically, immediate emotions did not influence key response variables related to future behaviour, supporting the thesis's questioning of the prominent role assigned to fear in the response process in the existing literature. The findings, theoretical and practical implications, limitations and directions for future research are discussed.
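The abstract reports testing the conceptual model with analysis of covariance. Purely as an illustration of that kind of test (the variable names, factors and data file are hypothetical, not the thesis dataset), an ANCOVA can be run with statsmodels:

```python
# Illustrative ANCOVA sketch: behavioural intention modelled from manipulated
# message characteristics, with emotional responses entered as covariates.
# Variable names and the data file are hypothetical, not the thesis data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("threat_appeal_experiment.csv")  # hypothetical experiment data

model = smf.ols(
    "intention ~ C(message_frame) * C(graphic_image) + fear + anticipated_regret",
    data=df,
).fit()
print(sm.stats.anova_lm(model, typ=2))  # Type II ANCOVA table
```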
Abstract:
The production of recombinant therapeutic proteins is an active area of research in drug development. These bio-therapeutic drugs target nearly 150 disease states and promise to bring better treatments to patients. However, if new bio-therapeutics are to be made more accessible and affordable, improvements in production performance and optimization of processes are necessary. A major challenge lies in controlling the effect of process conditions on the production of intact functional proteins. To achieve this, improved tools are needed for bio-processing. For example, process modeling and high-throughput technologies can be implemented to achieve quality by design, leading to improvements in productivity. Commercially, the most sought-after targets are secreted proteins, owing to the ease of handling in downstream procedures. This chapter outlines different approaches for the production and optimization of secreted proteins in the host Pichia pastoris. © 2012 Springer Science+Business Media, LLC.
Abstract:
This chapter introduces Native Language Identification (NLID) and considers its casework applications with regard to authorship analysis of online material. It presents findings from research identifying which linguistic features were the best indicators of native (L1) Persian speakers blogging in English, and analyses how well these features distinguish native influences from languages that are linguistically and culturally related. The first section outlines the area of Native Language Identification and demonstrates its potential for application through a discussion of relevant case history. The next section describes the development of a methodology for identifying influence from L1 Persian in an anonymous blog author, and presents findings. The third section discusses the application of these features to casework situations, shows how the identified features can form an easily applicable model, and demonstrates the application of this model to casework. The research presented in this chapter can be considered a case study for the wider potential application of NLID.
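The chapter's own feature set is not reproduced in the abstract; as a generic illustration of how L1-influence features can drive authorship classification (not the chapter's method), here is a character n-gram baseline with scikit-learn, using placeholder texts and labels:

```python
# Generic NLID baseline sketch (not the chapter's method): character n-gram
# features with a linear classifier to separate L1-Persian authors from others.
# The texts and labels are placeholders; real blog data would be needed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["first English blog post ...", "second English blog post ..."]  # placeholder corpus
labels = ["L1_Persian", "other"]                                         # author L1 labels (placeholder)

nlid_model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),  # sub-word patterns often carry L1 influence
    LogisticRegression(max_iter=1000),
)
nlid_model.fit(texts, labels)
print(nlid_model.predict(["a new anonymous blog post ..."]))
```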
Abstract:
Purpose: This paper aims to examine the influence of the culture of the service firm on its interpretation of the role of the brand and on the development and implementation of its brand values. Design/methodology/approach: A grounded theory approach was used. Interviews were conducted with 20 managers within two leading banking firms and two leading grocery retailers in Ireland. Findings: The development of the brand, and its role within the firm, is closely related to the firm's culture. The research shows the obstacles and opportunities created by the cultural context of firms wishing to disseminate and embed a set of brand values. The paper presents an "involvement model" of brand values implementation and outlines the changes required to implement brand values. Research limitations/implications: The study was limited by access to firms and by managers' availability. The authors sought an insight into the relationship between each firm's culture and its brands. They advocate quantitative research to further investigate the findings within these service sectors and to test the proposed antecedents (transformational leadership, employee involvement) and outcomes (employee-based brand equity and consumer-based brand equity) of values adoption. Practical implications: The paper identifies aspects of retail and banking cultures that support or detract from brand development. In particular, it presents the lessons from a successful brand values implementation in a clan culture, aspects of which are applicable across other cultures. Originality/value: The paper provides valuable insights into the role of the brand within the service firm and the positive and negative influences of context on brand values and their development and implementation. © Emerald Group Publishing Limited.
Abstract:
This paper outlines a novel elevation linear Fresnel reflector (ELFR) and presents and validates theoretical models defining its thermal performance. To validate the models, a series of experiments was carried out for receiver temperatures in the range of 30-100 °C to measure the heat loss coefficient, gain in heat transfer fluid (HTF) temperature, thermal efficiency, and stagnation temperature. The heat loss coefficient was underestimated because the model excludes collector end heat losses. The measured HTF temperature gains were found to correlate well with the model predictions (less than a 5% difference). In comparison to the model predictions for thermal efficiency and stagnation temperature, measured values differed by -39% to +31% and by 22-38%, respectively. The difference between the measured and predicted values was attributed to the low-temperature region used for the experiments. It was concluded that the theoretical models are suitable for examining linear Fresnel reflector (LFR) systems and can be adopted by other researchers.
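The three measured quantities follow standard collector relations: useful gain Q = m_dot * cp * dT, thermal efficiency eta = Q / (DNI * A_aperture), and heat loss Q_loss = U_L * A_receiver * (T_receiver - T_ambient). A minimal sketch with illustrative values (not the paper's data):

```python
# Minimal sketch of standard linear Fresnel collector performance quantities.
# All numerical values are illustrative, not the paper's measurements.

def htf_temperature_gain(t_out_c, t_in_c):
    """Gain in heat transfer fluid (HTF) temperature across the receiver, in K."""
    return t_out_c - t_in_c

def thermal_efficiency(mass_flow_kg_s, cp_j_kg_k, delta_t_k, dni_w_m2, aperture_m2):
    """eta = useful heat gain / beam irradiance incident on the collector aperture."""
    q_gain_w = mass_flow_kg_s * cp_j_kg_k * delta_t_k
    return q_gain_w / (dni_w_m2 * aperture_m2)

def heat_loss_coefficient(q_loss_w, receiver_area_m2, t_receiver_c, t_ambient_c):
    """U_L from Q_loss = U_L * A_receiver * (T_receiver - T_ambient)."""
    return q_loss_w / (receiver_area_m2 * (t_receiver_c - t_ambient_c))

if __name__ == "__main__":
    dT = htf_temperature_gain(68.0, 55.0)                    # K (hypothetical inlet/outlet temperatures)
    eta = thermal_efficiency(0.05, 4186.0, dT, 850.0, 10.0)  # water HTF, 10 m2 aperture (assumed)
    u_l = heat_loss_coefficient(450.0, 1.2, 80.0, 25.0)      # W, m2, degC (assumed)
    print(f"dT = {dT:.1f} K, efficiency = {eta:.2f}, U_L = {u_l:.1f} W/m2K")
```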
Abstract:
Sustainability has become a watchword and guiding principle for modern society, and with it has come a growing appreciation that anthropogenic 'waste', in all its manifold forms, can offer a valuable source of energy, construction materials, chemicals and high-value functional products. In the context of chemical transformations, waste materials not only provide alternative renewable feedstocks, but also a resource from which to create catalysts. Such waste-derived heterogeneous catalysts serve to improve the overall energy and atom efficiency of existing and novel chemical processes. This review outlines key chemical transformations for which waste-derived heterogeneous catalysts have been developed, spanning biomass conversion to environmental remediation, and their benefits and disadvantages relative to conventional catalytic technologies.
Abstract:
The youth movement Nashi (Ours) was founded in the spring of 2005 against the backdrop of Ukraine's 'Orange Revolution'. Its aim was to stabilise Russia's political system and take back the streets from opposition demonstrators. Personally loyal to Putin and taking its ideological orientation from Surkov's concept of 'sovereign democracy', Nashi has sought to turn the tide on 'defeatism' and develop Russian youth into a patriotic new elite that 'believes in the future of Russia' (p. 15). Combining a wealth of empirical detail with insights from discourse theory, Ivo Mijnssen analyses the organisation's development between 2005 and 2012. His analysis focuses on three key moments (the organisation's foundation, the apogee of its mobilisation around the Bronze Soldier dispute with Estonia, and the 2010 Seliger youth camp) to help understand Nashi's organisation, purpose and ideational outlook as well as the limitations and challenges it faces. As such, the book is insightful both for those with an interest in post-Soviet Russian youth culture and for scholars seeking a rounded understanding of the Kremlin's initiatives to return a sense of identity and purpose to Russian national life.

The first chapter, 'Background and Context', outlines the conceptual toolkit provided by Ernesto Laclau and Chantal Mouffe to help make sense of developments on the terrain of identity politics. In their terms, since the collapse of the Soviet Union, Russia has experienced acute dislocation of its identity. With the tangible loss of great power status, Russian realities have become unfixed from a discourse enabling national life to be constructed, albeit inherently contingently, as meaningful. The lack of a Gramscian hegemonic discourse to provide a unifying national idea was securitised as an existential threat demanding special measures. Accordingly, the identification of those who are 'not Us' has been a recurrent theme of Nashi's discourse and activity. With the victory in World War II held up as a foundational moment, a constitutive other is found in the notion of 'unusual fascists'. This notion includes not just neo-Nazis, but reflects a chain of equivalence that expands to include a range of perceived enemies of Putin's consolidation project, such as oligarchs and pro-Western liberals.

The empirical background is provided by the second chapter, 'Russia's Youth, the Orange Revolution, and Nashi', which traces the emergence of Nashi amid the climate of political instability of 2004 and 2005. A particularly noteworthy aspect of Mijnssen's work is the inclusion of citations from his interviews with Nashi commissars, the youth movement's cadres. Although relatively few in number, such insider conversations provide insight into the ethos of Nashi's organisation and the outlook of those who have pledged their involvement. Besides the discussion of Nashi's manifesto, the reader thus gains insight into the motivations of some participants and behind-the-scenes details of Nashi's activities in response to the perceived threat of anti-government protests. The third chapter, 'Nashi's Bronze Soldier', charts Nashi's role in elevating the removal of a World War II monument from downtown Tallinn into an international dispute over the interpretation of history. The events subsequent to this securitisation of memory are charted in detail, concluding that Nashi's activities were ultimately unsuccessful as their demands received little official support.

The fourth chapter, 'Seliger: The Foundry of Modernisation', presents a distinctive feature of Mijnssen's study, namely his ethnographic account as a participant observer in the Youth International Forum at Seliger. In the early years of the camp (2005–2007), Russian participants received extensive training, including master classes in 'methods of forestalling mass unrest' (p. 131), and the camp served to foster a sense of group identity and purpose among activists. After 2009 the event was no longer officially run as a Nashi camp, and its role became that of a forum for the exchange of ideas about innovation, although camp spirit remained a central feature. In 2010 the camp welcomed international attendees for the first time. As one of about 700 international participants that year, the author provides a fascinating account based on fieldwork diaries. Despite the polemical nature of the topic, Mijnssen's analysis remains even-handed, exemplified in his balanced assessment of the Seliger experience. While he details the frustrations and disappointments of the international participants with the unaccustomed strict camp discipline, organisational and communication failures, and the controlled format of many discussions, he does not neglect to note the camp's successes in generating a gratifying collective dynamic between the participants, even among the international attendees who spent only a week there.

In addition to the useful bibliography, the book closes with two appendices, which provide the reader with important Russian-language primary source materials. The first is Nashi's 'Unusual Fascism' (Neobyknovennyi fashizm) brochure, and the second is the booklet entitled 'Some Uncomfortable Questions to the Russian Authorities' (Neskol'ko neudobnykh voprosov rossiiskoi vlasti), which was provided to the Seliger 2010 instructors to guide them in responding to probing questions from foreign participants. Given that these are not readily publicly available even now, they constitute a useful resource from the historical perspective.
Abstract:
Private label branding strategies differ from those of manufacturers. The study aims to identify optimal private label branding strategies for (a) utilitarian products and (b) hedonistic products, considering the special factors reflected in consumer behavior related to private labels in Hungary. The House of Brands and Branded House strategies are discussed and evaluated in the light of retail business models. Focus group interviews and factor analysis of the survey data revealed differences in the branding strategies consumers prefer for the two product categories. The study also outlines a strong trend in possible private label development based on consumers' changing attitudes in favor of national products.
Abstract:
Savings and investments in the American money market by emerging countries, primarily China, financed the excessive consumption of the United States in the early 2000s, which indirectly led to a global financial crisis. The crisis started in the real estate mortgage market. Balance-disrupting processes began in the American financial market that contradicted all previously known equilibrium theories of every school of economics. Economics has yet to come up with models or empirical theories for this new disequilibrium, which is why the outbreak of the crisis could not be prevented or at least predicted. The question is to what extent existing market theories, calculation methods and the latest financial products can be held responsible for the new situation. This paper studies the influence of the efficient market hypothesis, modern portfolio theory and Li's copula function on the American investment market. Naturally, the issues of moral hazard and greed, credit ratings and shareholder control, and limited liability and market regulation are aspects that cannot be ignored. In summary, the author outlines potential alternative measures that could be applied to prevent a new crisis, defines new directions for economic research and draws conclusions for Hungarian economic policy.
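Li's copula function mentioned above joins marginal default probabilities into a joint default probability through a single correlation parameter. A minimal sketch of the bivariate Gaussian copula with scipy, using illustrative default probabilities and correlations:

```python
# Minimal sketch of Li's Gaussian copula for the joint default probability of two names.
# Marginal default probabilities and correlations are illustrative placeholders.
from scipy.stats import norm, multivariate_normal

def joint_default_probability(p_a, p_b, rho):
    """P(both default) = Phi_2(Phi^-1(p_a), Phi^-1(p_b); rho)."""
    z_a, z_b = norm.ppf(p_a), norm.ppf(p_b)
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([z_a, z_b])

if __name__ == "__main__":
    p_a, p_b = 0.05, 0.08          # one-year default probabilities (hypothetical)
    for rho in (0.0, 0.3, 0.7):    # correlation assumptions
        print(f"rho = {rho:.1f}: joint default probability = {joint_default_probability(p_a, p_b, rho):.4f}")
```

The sketch also illustrates a widely cited criticism relevant to the paper's question: the entire dependence structure is compressed into the single parameter rho, and the Gaussian copula has no tail dependence, so the probability of joint extreme losses tends to be understated.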